
Why 30FPS was once acceptable, BUT ISN'T ANYMORE!... mostly...

01011001

Banned
READ THE FUCKING POST BEFORE POSTING NON-ARGUMENTS PLEASE.

This thread is inspired by some of the comments I read in this thread:
where people talk about playing old games that ran at super low framerates, and having no issues with that...
And I too am one of them. I even just recently played through Disaster Report on XBSX2, a PS2 game that has a 20fps lock, and even drops below that lock in some scenes...
I had no issues playing it... but why do I hate 30fps in modern games, when I play 20fps retro games just fine?

The quick explanation would be:
1: Framerate is not the issue.
2: Input lag and visual noise are.

Modern games feel worse at low framerates than older games.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
and the reasons for this are mainly how modern engines work and how modern games are rendered.


But I will elaborate on these a bit, because otherwise what kind of an OP would this be? :D


First of all, input lag:
Many modern engines have an insufferable amount of input lag, and the games aren't designed with that in mind either.
Unreal Engine is especially bad with this, but it is by far not the only one... God of War's current engine has the same issue, as do CryEngine and many others.
They don't inherently have these issues, but the way almost all developers use them is the issue, as they rely on many of the default ways these engines read inputs, render images and handle Vsync.

Ocarina of Time on N64 had less input lag at 20fps, than almost any modern game has at 30fps... and for some even the 60fps modes have higher input lag than 20fps Ocarina of Time... this is not an exaggeration, it's absolutely true.

And this is not down to modern TVs either. In fact, if you have a modern Samsung TV from the last two production years, for example, and you play at 120fps on it, your screen's input lag is lower than the latency of a CRT... because yes, CRTs also have input lag. It comes from how they draw an image from top to bottom 60 times each second, which means one image takes 16.6ms to be drawn by a CRT. Input lag is usually measured at the center of the image, meaning a CRT by that standard has at least 8.3ms of input lag, while a modern 120Hz Samsung TV has ~5ms of input lag... and about 9ms at 60Hz.
The CRT input lag advantage has basically been matched now; we are so close to it even on TV screens that it's not a factor anymore, let alone on high-end PC monitors.
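To make that math concrete, here's a rough Python sketch of it. The half-a-refresh scan-out figure follows directly from the numbers above; the small processing overhead I give the modern TV is just an assumed value picked so the totals land near the ballpark figures quoted here.

[CODE]
# Display latency measured at the center of the image: half a refresh period of
# scan-out, plus whatever processing the display adds before it starts drawing.
# The 0.8 ms processing figure for the modern TV is an assumption for illustration.

def scanout_latency_ms(refresh_hz: float, processing_ms: float = 0.0) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return processing_ms + frame_time_ms / 2.0

print(f"CRT @ 60 Hz:        {scanout_latency_ms(60):.1f} ms")        # ~8.3 ms
print(f"Modern TV @ 120 Hz: {scanout_latency_ms(120, 0.8):.1f} ms")  # ~5.0 ms
print(f"Modern TV @ 60 Hz:  {scanout_latency_ms(60, 0.8):.1f} ms")   # ~9.1 ms
[/CODE]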

And this increase in latency isn't only obvious when comparing super old games to super new games.
New games still have a shitload of variation, with games like Call of Duty Modern Warfare 2/Warzone 2.0 having input lag so low, that they compete with old SNES games!
We are talking ~40ms of latency at 60fps, and below 30ms at 120fps, which is lower than most 8 and 16 bit games ever dreamt of reaching.
We could compare some Xbox 360 games that ran at 30fps to some modern games that run at 60fps, and Halo 3 would win... God of War Ragnarök at 60fps funnily enough has about the same latency as Halo 3 at 30fps, which is around 90ms for both (+/- 5ms)

So even tho there are modern games that absolutely crush it when it comes to latency, like Call of Duty, and there are of course old games like Killzone 2 that were infamous for their high latency, the overall pattern is sadly that latency is going up.
We are now in a spot where a bulk of modern titles have the same, or even more, input lag than Killzone 2 had, a game that was panned at the time for how awful the aiming felt due to its lag.
Coming back to God of War Ragnarök: that game, in its 30fps Graphics Mode, has an input latency of around 170ms. Killzone 2's latency was in the 150ms ballpark!!!

So let that sink in for a moment... during Gen 7, Killzone 2 got massive bad PR for having 150ms of latency... and a modern game like God of War Ragnarök easily exceeds that!
Meanwhile Halo 3 had 90ms of latency at 30fps on an Xbox 360, and Ragnarök has about the same amount of latency in its 60fps Performance Mode.
In God of War's case it seems to be mostly the Vsync that is the issue, since as soon as you use the unlocked VRR mode that deactivates Vsync, the latency shrinks down to 90ms, on par with Halo 3 and the 60fps Vsync mode of the game.

Why does Vsync introduce input lag? Because it pre-renders (buffers) frames. And some games take this to another level, usually to smooth out framerates because they are extremely demanding on the GPU: they will hold multiple frames in the back buffer before displaying a new one, which gives them a fallback frame under most circumstances and keeps the perceived framerate consistent, at the cost of input lag.
Sea of Thieves is in fact the only game I know of that actually lets the user choose how many frames it should pre-render.
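As a back-of-the-envelope illustration of why a deeper pre-render queue costs responsiveness, here's a simplified latency model. The two pipeline frames and the ~9ms of display lag are assumptions for the sake of the example; real engines and displays vary, but the pattern "each extra buffered frame adds roughly one frame-time of lag" is the point.

[CODE]
# Simplified model: input gets sampled, the CPU/GPU spend ~2 frames building the
# image (assumed), the finished frame waits behind any already-queued frames,
# and finally the display adds its own latency (assumed ~9 ms here).
def estimated_input_lag_ms(fps: float, queued_frames: int,
                           pipeline_frames: float = 2.0,
                           display_ms: float = 9.0) -> float:
    frame_time = 1000.0 / fps
    return (pipeline_frames + queued_frames) * frame_time + display_ms

for fps in (30, 60):
    for queue in (0, 1, 2):
        print(f"{fps} fps with {queue} pre-rendered frame(s): "
              f"~{estimated_input_lag_ms(fps, queue):.0f} ms")
[/CODE]

Under those assumed numbers, 30fps with two queued frames already sits well above 100ms, while the same pipeline at 60fps roughly halves it, which is the whole argument in miniature.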


Secondly, how games build their final image:
Modern games have tons of high frequency details, and modern games are expected to be played on high resolution displays.
These 2 factors are the reason many games now use aggressive TAA, and more and more games use one of the many forms of Upsampling to higher resolutions from a lower base resolution.

These 2 things both lead to the same issue, a muddy and hard to parse image in motion.

Then a third factor comes in, MOTION BLUR *scary thunder noises*
And Motion Blur adds yet more visual noise and muddies the final image even further.
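To put a number on that: camera motion blur is typically drawn along the distance the image moved during one frame, so at the same camera speed, halving the framerate roughly doubles the length of the smear. The pan speed below is a made-up example value.

[CODE]
# Hypothetical camera pan that moves the image 1200 pixels per second.
# The blur streak length scales with how far things move within a single frame.
pan_speed_px_per_s = 1200  # made-up pan speed, purely for illustration

for fps in (60, 30, 20):
    streak_px = pan_speed_px_per_s / fps
    print(f"{fps} fps -> ~{streak_px:.0f} px of smear per frame")
# 60 fps -> ~20 px, 30 fps -> ~40 px, 20 fps -> ~60 px
[/CODE]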

So a modern game will often look pristine and super clean when you hold your camera completely still, but as soon as your character, and especially the camera, moves at all... MUD AND SMEARING.
And funnily enough we have an old game as an example here again, one that at the time got a lot of flak for having the same issues that nowadays are so common that no one even really talks about them in reviews or anything anymore...
And that game is Halo Reach.

Halo Reach was an early example of a game that used a form of Temporal Anti Aliasing... TAA.
And TAA at the time Halo Reach was made, was still firmly in its infancy, exhibiting a lot of issues and visual noise. So if you stood completely still in Halo Reach, it looked GLORIOUSLY smooth and clean... move the camera and it was a jittery mess.

These days TAA got a lot better, but it still has issues with ghosting and clear fizzle and disocclusion artifacts the moment something on screen moves.
But of course, TAA isn't the shiny new thing anymore... we now have FSR2, TSR and many other proprietary methods by different developers...
And these upsampling methods basically bring back almost all the issues Halo Reach had, and then some!

When you play Ocarina of Time at 20fps on an N64 connected to a CRT, motion is clean! You can see motion clearly, there are no artifacts from anything moving too fast, no blur reducing the legibility of the action happening around you, and no motion blur trying to make it seem smoother than it is.

"but why is this a 30fps problem mister Y" is what you are now asking I bet!
WELL, TAA, FSR2, TSR, CBR, ADD UPSAMPLING METHOD HERE... and Motion Blur, all of these rely on image data from previous frames!
THE LESS IMAGE DATA, THE MUDDIER THE IMAGE IS IN MOTION!
So, at 30fps these modern upsampling and anti aliasing methods have less temporal data available to them to create their final image!
A game running at 1440p FSR2 Quality Mode will look sharper and have less artifacts when running at 60fps than it would have at the same settings but locked to 30fps.
So the lower the framerate, the more "smear", the more "mud" and the more "fizzle" there is.
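Here's a minimal sketch of that "less temporal data" point. TAA/FSR2-style accumulation blends each new frame into a running history, and under camera motion that history only stays valid for a short window of real time before reprojection and disocclusion break it. The blend weight and the 100ms window below are assumed ballpark values, not any specific implementation, but they show why 30fps converges on a noisier, softer result than 60fps.

[CODE]
# Simplified temporal accumulation: history = lerp(history, current_frame, ALPHA).
# Count how much of the fully converged result can accumulate before camera
# motion invalidates the history.
ALPHA = 0.1              # assumed per-frame blend weight (rough TAA ballpark)
HISTORY_WINDOW_MS = 100  # assumed time before reprojection/disocclusion breaks history

for fps in (60, 30):
    frames_in_window = round(HISTORY_WINDOW_MS / (1000 / fps))
    converged_fraction = 1 - (1 - ALPHA) ** frames_in_window
    print(f"{fps} fps: {frames_in_window} frames of usable history, "
          f"~{converged_fraction:.0%} of the converged image")
[/CODE]

With those made-up but plausible numbers, the 60fps game gets twice as many samples into each pixel before the history breaks, so the 30fps version is stuck with a blurrier, fizzlier resolve.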

So in the end, all of this adds up.
The motion blur, the image reconstruction, the anti-aliasing, all of it gets worse the lower the framerate is.

This is why Jedi Survivor in motion looks like this (I purposefully took a shot from a spot where the game locks to 60fps to give it the best chances):
starwarsjedisurvivorm7ifc.png


and F-Zero GX like this (I'm turning sharp left here to give it the worst chances with fast camera and vehicle movement):
fzeroscreend5cph.png



Normalized for same image size so you don't have to zoom as much on mobile (not zoomed in in any way, just cropped)
swscreenzoomxlfwx.png
fzeroscreenzoom7ocx7.png



And the lower the framerate, the worse all the weird grainy, blurry, fizzly artifacts you see in Jedi Survivor will get. Meanwhile if you ran F-Zero GX at 10fps, it would look literally the exact same in terms of image quality and artifacts (because it has no artifacts).
And we haven't even touched motion blur, which is off in my screenshot here, along with all the other image filters.

If a game like Jedi Survivor runs at a lower framerate, the amount of data the engine has to draw the image from one frame to the next is reduced, the image gets less readable and muddy, and you will see blur in motion, even tho motion blur is off.


Smaller issues that aren't technical but due to Art Design of games:
Older games have different considerations when it comes to art (textures, models, scale) than modern games.
Older consoles couldn't handle as much detail as newer consoles can. This means textures had less "visual noise", meaning there is less for your eyes to get hung up on, less to process for your brain.
In motion, when running through a level or turning the camera, this means the things that are important, or could be important, as well as your general environment, are easier to digest.

You will have an easier time knowing where enemies are, or where a button or rope is, when it's simpler in design, bigger in scale and stands out more. And all of these are features older graphics had!
A red button that is unrealistically big in scale, on a wall with a simple flat texture and no shadows or reflections, is easier to see than a realistically scaled red button on a wall with detailed textures, shadows of adjacent objects and other lighting applied.

So even if older games had the graphical issues I explained above, the simpler graphics would still make it way easier on your eyes, because there is not as much small detail that can get lost in all the blur, fizzle and breakup.

In short: Less Detail + Scale = Easier to read environments even at low framerates.



SO IN CONCLUSION:
Playing a game like Ocarina of Time will take some getting used to at first, but once your eyes are adjusted to the 20fps output, the image will look pretty decent and clean.
Playing Jedi Survivor at 60fps already makes the whole image look muddy and blurry due to the artifacting of the upsampling. At 30fps this will only get worse; add motion blur to negate the stutter and you'll see barely anything in motion.

Playing Ocarina of Time at 20fps on real hardware will feel... not amazingly responsive, but more responsive than many modern games like... Jedi Survivor... or Redfall... or God of War Ragnarök feel at 30fps, hell some of them even at 60fps!

So you can not use an old game and point the finger at it saying "LOOK! back then we had no issues with this! this also ran at a super low framerate!", while not taking into account how modern game engines work, how they react to player input, and how they construct their final output image.

Old games had simple graphics, they were easier to parse by your eyes because important stuff was exaggerated in scale and shape, they didn't have motion blur, no reconstruction artifacts, no TAA or screen space effects that lag behind and take time to accumulate data. They had simpler engines with less latency.
And that is why they look cleaner and play better at low framerates than new games.

And yes, there are modern games and old games that are outliers from this. Some modern ones, like From Software's games, have pretty low latency, and some old games, like Mortal Kombat on Game Boy, have massive amounts of lag.
And some old games had weird fake motion blur which made the image hard to read, while some modern games have clean images with barely anything to muddy them in motion.
The difference is that the bad examples are more common today, and the good examples were more common in the past.
 

Abriael_GN

RSI Employee of the Year
30 FPS is perfectly acceptable depending on the game. Some benefit from higher frame rate, and some simply don't.

For many games, stable 30 is actually better for most people than unstable 60.

Ultimately, framerate perception is very, very personal, and 60 FPS fanatics that bend over twice to try to demonstrate that one option is objectively better than the other simply don't understand this simple fact. 30 FPS may not be acceptable for you, but it certainly is for many other people, and how they feel is exactly as valid as how you feel.
 

01011001

Banned
30 FPS is perfectly acceptable depending on the game. Some benefit from higher frame rate, and some simply don't.

For most games, stable 30 is actually better for most people than unstable 60.

Ultimately, framerate perception is very, very personal, and posts that try to demonstrate that one option is objectively better than the other simply don't understand this simple fact. 60 FPS may not be acceptable for you, but it certainly is for many other people, and how they feel is exactly as valid as how you feel.

so you do not address anything I said in the OP,
almost no one reads the OP of a thread, I get this... but it's so fucking annoying...
 

Deerock71

Member
As long as the devs accounted for the framerate (EXAMPLES: Starfox SNES and Goldeneye N64), then there are no issues EXCEPT the people playing them.
 

Abriael_GN

RSI Employee of the Year
so you do not address anything I said in the OP,
almost no one reads the OP of a thread, I get this... but it's so fucking annoying...

I don't, because this is the usual thread that ignores the simple fact that frame rate perception is a very personal thing and replaces that fact with the self-centered approach that what is unacceptable for you must be unacceptable for everyone.

We've had a gazillion of these threads, here, and elsewhere. You're not making any new argument, and like everyone else who made a thread like this before, you fail to understand that this is a subjective matter that depends entirely on how each individual's brain is wired.
 

01011001

Banned
I don't, because this is the usual thread that ignores the simple fact that frame rate perception is a very personal thing and replaces that fact with the self-centered approach that what is unacceptable for you must be unacceptable for everyone.

We've had a gazillion of these threads, here, and elsewhere. You're not making any new argument, and like everyone else who made a thread like this before, you fail to understand that this is a subjective matter that depends entirely on how each individual's brain is wired.

then stop posting if you aren't interested in what I am saying. Just read it and argue it, or don't and ignore it.
and I have seen ZERO threads talking about what I am talking about here, which you would know IF YOU ACTUALLY READ IT!
 

Gaiff

SBI’s Resident Gaslighter
It is acceptable, just not ideal. Some of the greatest games released in the past 10 years were console-only games at 30fps. BOTW is like a 25-30fps game and it's still fantastic.

By getting a console and settling for convenience, you're accepting whatever the developer decides to give you and if it's 30fps, you don't have much of a choice.

30fps isn't a deal-breaker for me on console if the game is worth playing. A 30fps cap on PC on the other hand is a different story.
 

Abriael_GN

RSI Employee of the Year
then stop posting if you aren't interested in what I am saying. Just read it and argue it, or don't and ignore it.
and I have seen ZERO threads talking about what I am talking about here, which you would know IF YOU ACTUALLY READ IT!

The fact that I find what you said completely moot and unoriginal doesn't mean I haven't read it. That's your (wrong) assumption.

I argued it. And my argument is that your whole premise is misguided because again, you fail to consider the fact that frame rate perception is demonstrably very personal. Hence, what is acceptable or isn't, is also entirely subjective.
 

01011001

Banned
The fact that I find what you said completely moot and unoriginal doesn't mean I haven't read it. That's your (wrong) assumption.

I argued it. And my argument is that your whole premise is misguided because again, you fail to consider the fact that frame rate perception is demonstrably very personal. Hence, what is acceptable or isn't, is also entirely subjective.

So you don't think 30fps games NOW are worse than 30fps games in the 90s? because THAT is my actual argument, which you fail to address, because I bet you in fact didn't read the post.
30fps games now are for the most part objectively worse than 20fps games on N64 in many aspects. THAT is my argument
 

Abriael_GN

RSI Employee of the Year
So you don't think 30fps games NOW are worse than 30fps games in the 90s? because THAT is my actual argument, which you fail to address, because I bet you in fact didn't read the post.
30fps games now are for the most part objectively worse than 20fps games on N64

Worse or better is irrelevant to whether it's acceptable or not. Whether they're worse doesn't just depend on personal perception, but also on the kind of game.

One of the worst fallacies argumentative people on the internet fall prey to is believing that people who don't accept or agree with their argument must not have read or understood it.

I've read it. I've understood it. And as I went out of my way to explain, I consider it flawed at its core, no more and no less than the many, many people who have argued for an absolute solution to this problem before you. It's simple as that.

Incidentally, showing what a game looks like *in motion* with a still screenshot actually adds some comedic relief to the otherwise moot argument.
 

01011001

Banned
Worse or better is irrelevant to whether it's acceptable or not. Whether they're worse doesn't just depend on personal perception, but also on the kind of game.

it still is a fact that Jedi Survivor has a less clean and harder to read image in motion and worse input lag (normalized for TV lag) than Ocarina of Time, even tho Ocarina of Time ran at 20fps


One of the worst fallacies argumentative people on the internet fall prey to is that people who don't accept or agree with their argument must not have read or understood it.

well you still fail to address my arguments, so you clearly didn't read it, or act the same as a person who didn't.


I've read it. I've understood it. And as I went out of my way to explain, I consider it flawed at its core, no more and no less than the many, many people who have argued for an absolute solution to this problem before you. It's simple as that.

how is it flawed? I am literally arguing with objective facts about video games, how they behave, and how older games at lower framerates are objectively superior in terms of playability to most modern titles.
that's not a flawed argument, that is using facts to demonstrate why you can't use old games as examples of why low framerates are totally ok in the here and now.
 

StreetsofBeige

Gold Member
A bad fps game feels shit to play regardless of input lag.

A 30 fps PS1 game is just as stodgy as a modern day 30 fps game. In fact, a modern game will feel better and look nicer since you got motion blur to smooth things out (as long as it's not too much). Those old 30 fps games barely even were 30 fps. More like "up to 30 fps". The second the action heats up or you're driving a car and turn a corner, it tanks to the 20s and the camera pans chop city.
 

diffusionx

Gold Member
Didn't read but um.. Bloodborne is really hard for me to play bc of the framerate. There I said it
Bloodborne feels crappy to play because of the bad frame pacing, e.g. the 30 frames do not come out evenly. This is another potential issue.
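To illustrate the frame pacing point: a game can average a perfect 30fps while delivering its frames unevenly, and that uneven cadence is what feels awful. A toy example with made-up frame times:

[CODE]
# Two ways of delivering "30 fps": even 33.3 ms frames vs alternating 16.7/50 ms
# frames. The average framerate is identical; the second is bad frame pacing.
even   = [33.3] * 30
uneven = [16.7, 50.0] * 15

for name, frame_times in (("even pacing", even), ("bad pacing", uneven)):
    avg_fps = 1000 / (sum(frame_times) / len(frame_times))
    print(f"{name}: avg {avg_fps:.1f} fps, longest frame {max(frame_times):.1f} ms")
[/CODE]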

I also think another issue is input lag, you add all this extra processing and the game just takes forever to respond to your inputs, so even at a consistent 30fps the game just feels less responsive. I think as a general rule the 360/PS3 generation all had better input latency on 30fps than games today, the best example is NFS HP2010.

edit: Obviously you play an N64 game on a CRT vs a current one on a new TV and you're cutting 15-50ms of lag out entirely, which can make a huge difference.
 

01011001

Banned
A bad fps game feels shit to play regardless of input lag.

A 30 fps PS1 game is just as stodgy as a modern day 30 fps game. In fact, a modern game will feel better and look nicer since you got motion blur to smooth things out (as long as it's not too much). Those old 30 fps games barely even were 30 fps. More like "up to 30 fps". The second the action heats up or you're driving a car and turn a corner, it tanks to the 20s and the camera pans chop city.

I disagree, because motion blur only makes it easier to adjust to the low framerate. but you get used to the judder of a low framerate, and without motion blur you will have a way cleaner image that gives you superior readability at low framerates.

if you play Ocarina of Time for an hour, you won't notice the 20fps anymore. if it had motion blur, you would notice how hard of a time you have focusing on elements on screen due to the blur
 

Abriael_GN

RSI Employee of the Year
it still is a fact that Jedi Survivor has a less clean and harder to read image in motion and worse input lag (normalized for TV lag) than Ocarina of Time, even tho Ocarina of Time ran at 20fps

well you still fail to address my arguments, so you clearly didn't read it, or act the same as a person who didn't.

how is it flawed? I am literally arguing with objective facts about video games, how they behave, and how older games at lower framerates are objectively superior in terms of playability to most modern titles.
that's not a flawed argument, that is using facts to demonstrate why you can't use old games as examples of why low framerates are totally ok in the here and now.

You're not arguing with objective facts. You're mistaking your personal perception for objective facts.

Which, I'll give you, is what 90% of people arguing on the internet do, but it isn't any less misguided just because it's common.
 
Bloodborne feels crappy to play because of the bad frame pacing, e.g. the 30 frames do not come out evenly. This is another potential issue.

I also think another issue is input lag, you add all this extra processing and the game just takes forever to respond to your inputs, so even at a consistent 30fps the game just feels less responsive. I think as a general rule the 360/PS3 generation all had better input latency on 30fps than games today, the best example is NFS HP2010.
I completely agree. I've been playing on PS5 but I'm so tempted to mod my PS4 to use that 60 fps hack I've heard about. I do love FromSoft games so it seems worth it.
 

Punished Miku

Gold Member
Totally agree with your section about modern games having far too much visual noise. Personally I think that's been a problem for 10 years, regardless of framerate. The PS4 gen onwards has had frequent games that require highlighting things in neon outlines just to know they're there. The more detailed it is, often the less you actually "see" when the goal is interactivity and ease of play.
 

phant0m

Member
good post OP. As I posted in the Jedi Survivor OT, turning off Motion Blur and Film Grain does a LOT to clean up the performance mode image. Is it perfect? no, but boy does it help a lot in my eyes.

to your point, this does not work well for 30 fps mode -- eliminating the blur makes camera movement very stuttery. that said, leave it on, put your frame and pixel counters away, play the game. it's perfectly playable and very fun. is playing at a locked 60 fps better? sure is. does 30 fps mode feel bad and sluggish after switching from performance? yeah. does it still feel bad after playing for 15 minutes? not really

maybe i'm just fortunate that i've played games on so many systems across the decades that i can just adapt to various levels of latency/framerate and it doesn't bother me. i grew up playing split-screen Goldeneye dude, and i promise you variable 15-20 fps (or less) does not feel better than a modern game locked at 30. shit was messy AF and unpredictable. the best thing that era had going for responsiveness was a) CRT displays and b) all wired inputs

I am probably going to be another one who didn't read the OP but I really have to wonder how you 01011001 can enjoy any game, given how critical you are.

....enjoy games? where do you think you are??
 

01011001

Banned
You're not arguing with objective facts. You're mistaking your personal perception for objective facts.

Which, I'll give you, is what 90% of people arguing on the internet do, but it isn't any less misguided just because it's common.

so the OBJECTIVE FACT that Jedi Survivor at 30fps has more input lag than Ocarina of Time at 20fps is a personal perception?
or the OBJECTIVE FACT that modern engines have a less clean image resolve due to up-sampling, TAA and a variety of other image treatments?

those are personal perceptions? how?
 

radewagon

Member
Modern games feel worse at low framerates than older games.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
and the reasons for this are mainly how modern engines work and how modern games are rendered.

Ricky Gervais Lol GIF


Nah, OP. Just no. Played Ocarina back in the day and it was not a significantly better technical experience when compared with Jedi Survivor at 30fps. The only difference is that now people whine and complain about perfectly acceptable framerates because they've decided that 60fps is some sort of gold standard where anything below that number is now literally unplayable. I mean, just look at how much text you needed to use to justify an absolutely ridiculous idea.

Maybe this quote from your essay will help put things into perspective. I made some helpful corrections (shown in bold, the excised sections shown with strikethrough).

Playing a game like [S]Ocarina of Time[/S] [B]Jedi Survivor[/B] will need getting used to at first, but once your eyes are adjusted to the [S]20fps[/S] [B]30fps[/B] output, the image will look pretty decent and clean.
Playing [S]Jedi Survivor[/S] [B]Literally ANY N64 title (like OOT, for example)[/B] at [S]60fps[/S] [B]whatever wildly erratic fps the poor machine could churn out[/B] already makes the whole image look muddy and blurry due to the [S]artifacting of the upsamling[/S] [B]low resolution and smeary Vaseline-like garbage output that masqueraded as anti-aliasing[/B].


Our games look the best that they ever have. Image quality is amazing and performance is about the same as it always was.
 

01011001

Banned
good post OP. As I posted in the Jedi Survivor OT, turning off Motion Blur and Film Grain does a LOT to clean up the performance mode image. Is it perfect? no, but boy does it help a lot in my eyes.

to your point, this does not work well for 30 fps mode -- eliminating the blur makes camera movement very stuttery. that said, leave it on, put your frame and pixel counters away, play the game. it's perfectly playable and very fun. is playing at a locked 60 fps better? sure is. does 30 fps mode feel bad and sluggish after switching from performance? yeah. does it still feel bad after playing for 15 minutes? not really

maybe i'm just fortunate that i've played games on so many systems across the decades that i can just adapt to various levels of latency/framerate and it doesn't bother me. i grew up playing split-screen Goldeneye dude, and i promise you variable 15-20 fps (or less) does not feel better than a modern game locked at 30. shit was messy AF and unpredictable. the best thing that era had going for responsiveness was a) CRT displays and b) all wired inputs



....enjoy games? where do you think you are??

I am a strictly no motion blur guy. gimme that raw 30fps if you have to, but I want a clean image.
I played through Spider-Man on PS4 Pro with motion blur off, and it was way better than with motion blur.

like I said in another post above, you will get used to the judder of a raw 30fps, but the muddy image from motion blur will not go away no matter how used to it you get, and that's why no blur for me.
 

01011001

Banned
Ricky Gervais Lol GIF


Nah, OP. Just no. Played Ocarina back in the day and it was not a significantly better technical experience when compared with Jedi Survivor at 30fps. The only difference is that now people whine and complain about perfectly acceptable framerates because they've decided that 60fps is some sort of gold standard where anything below that number is now literally unplayable. I mean, just look at how much text you needed to use to justify an absolutely ridiculous idea.

Maybe this quote from your essay will help put things into perspective. I made some helpful corrections (shown in bold, the excised sections shown with strikethrough).

Playing a game like [S]Ocarina of Time[/S] [B]Jedi Survivor[/B] will need getting used to at first, but once your eyes are adjusted to the [S]20fps[/S] [B]30fps[/B] output, the image will look pretty decent and clean.
Playing [S]Jedi Survivor[/S] [B]Literally ANY N64 title (like OOT, for example)[/B] at [S]60fps[/S] [B]whatever wildly erratic fps the poor machine could churn out[/B] already makes the whole image look muddy and blurry due to the [S]artifacting of the upsamling[/S] [B]low resolution and smeary Vaseline-like garbage output that masqueraded as anti-aliasing[/B].


Our games look the best that they ever have. Image quality is amazing and performance is about the same as it always was.

so you think this looks good?
swscreenzoomxlfwx.png



also, Jedi Survivor has more latency than Ocarina of Time, so it is objectively worse to play than the older title.
 

feynoob

Banned
Courtesy of chatgpt.

It seems like you are discussing the issue of why some gamers do not have issues with playing older games with low frame rates but struggle with modern games running at 30 frames per second. The answer you provide revolves around input lag and visual noise. Modern games have more input lag than older games because of the way developers use modern engines. For example, the Unreal Engine has a lot of input lag, as do other engines like God of War's current engine and Cryengine.

Input lag has become more noticeable in modern games because of the decrease in frame rate, which is something that older games had as well. However, modern games are more demanding, and the engines used to run them rely on many default ways of reading inputs, rendering images, and Vsyncing games. Visual noise is another factor, and it occurs when a game is rendered in a way that creates a lot of motion blur or other visual effects that can make it harder to see what's happening on the screen.

The issue of input lag is not limited to modern games. For example, Ocarina of Time on the N64 had less input lag at 20 frames per second than almost any modern game has at 30 frames per second. Some modern TVs have a lower input lag than CRTs, which are known for their high input lag because of how they draw images from top to bottom 60 times per second.

Some modern games, like Call of Duty, have very low input lag, but others, like God of War Ragnarök, have high input lag, even exceeding that of older games like Killzone 2, which was infamous for its high latency. God of War Ragnarök's 30fps graphics mode has an input latency of around 170ms, while Halo 3 had 90ms of latency at 30fps on an Xbox 360. Vsync can also introduce input lag because it pre-renders frames, and some games hold multiple frames in the back buffer before displaying a new one.
 

lh032

I cry about Xbox and hate PlayStation.
tldw but i agree on the input lag.
and most of the time 30fps dont stay at 30fps.

So its shiet.
 

feynoob

Banned
Again chatgpt.

Q: is there a fix to that?
A:
Reducing input lag and visual noise in modern games is not a straightforward fix, as it depends on how the specific game is designed and implemented. However, there are a few things that players can do to mitigate these issues:

1. Play on a gaming monitor with a low input lag: Monitors designed for gaming typically have lower input lag than TVs, which can help reduce the delay between your input and the game's response.

2. Turn off Vsync: As mentioned in the original post, Vsync can introduce input lag by buffering frames. Disabling Vsync can reduce this lag, but may also cause screen tearing and other visual artifacts.

3. Adjust graphics settings: Lowering graphics settings can improve performance and reduce visual noise, although this may come at the cost of visual fidelity.

4. Use a wired connection: Playing on a wired connection rather than Wi-Fi can reduce latency and improve responsiveness.

5. Check for updates: Game updates and patches can sometimes improve performance and reduce input lag, so it's a good idea to keep your games up to date.

Ultimately, the best way to reduce input lag and visual noise will depend on the specific game and hardware setup being used. It may take some experimentation and tweaking to find the best settings for each game.

Q: what about from developers side?
A:
From the developer's side, they can implement various measures to mitigate the impact of memory leaks in their code. Here are a few things they can do:

1. Use automated tools: There are several automated tools available that can help detect memory leaks in the code. Developers can use these tools to catch memory leaks early in the development process.

2. Follow best practices: Developers should follow best practices for memory management such as using smart pointers, avoiding global variables, and freeing memory when it's no longer needed.

3. Conduct code reviews: Developers can conduct code reviews to catch memory leaks and other issues early in the development process.

4. Use a garbage collector: Garbage collectors are programs that automatically free memory that is no longer needed. Some programming languages, like Java, have built-in garbage collectors.

5. Monitor memory usage: Developers can use monitoring tools to track memory usage and identify memory leaks. They can also use profiling tools to analyze the performance of their code and identify bottlenecks.

By implementing these measures, developers can reduce the risk of memory leaks and ensure their code is more reliable and efficient.
 
Great thread OP! It actually took me until adulthood to figure out why long gaming sessions made me sick, and it was 30fps games specifically with bad frame pacing lol. Latency is definitely an issue as well; I was just playing Forbidden West complaining about how it's not as snappy as I want it to be (great game btw). Something Nintendo always seems to get right is snappy, low latency controls. It definitely feels as if latency has become a "feature" in some next-gen engines.
 

Pimpbaa

Member
Some games are fine at 30fps, like Insomniac's recent games. But others are almost seizure inducing, like Horizon Forbidden West (on an OLED). Hell, they got 30fps right in Redfall, and that game is a fucking mess. But yeah, you do lose a lot of detail in motion the lower the framerate is. I was playing GoW:R in its VRR performance mode and even tho the resolution was only like 1440p, the game looked detailed as fuck due to the 80 to 90fps framerate (things probably drop more than that during fights).
 

Alexios

Cores, shaders and BIOS oh my!
So based on this your problem isn't really 30 fps but motion blur, image reconstruction techniques and lower than native resolutions which make things look bad at 60fps as you've demonstrated. Why blame the 30 fps? Demand native high resolution and options for motion blur and similar effects.

Better fps is better for all the things it does in motion and for gameplay, not for stills. Chances are that to get even higher fps (so, say, 60 in all areas rather than just a few) in the game you demonstrate as ugly here, it would be even uglier in stills and in motion, with a potentially lower base resolution etc.

Although the comparison is not fair either, as in one game you have a low base resolution reconstructed, and in the old game you have its native resolution at its native size rather than stretched to fit a 4K pixel area; in that case it would probably look really ugly too, even without any motion blur.

Funnily enough, F-Zero GX was originally panned for its graphics often looking plain and blocky in such stills, when it actually looks pretty damn stylish; in its blazing fast motion you don't have time to notice that, sure, it's not the most geometrically complex GC game, but it looks sweet.

I guess you can demand we go back to GameCube graphics, which could probably run in native 4K with no issues nowadays, and give up on everything that makes things worse for you, or demand 4K is banned and replaced with 1440p for everyone until it can be done without reconstruction, or with better reconstruction 🤷‍♂️
 
OP is right to some degree, especially when it comes to visual noise. I think it's why Dreamcast games hold up better than PS2 games in motion, whatever the framerate is. All the detail just becomes a blur in Forbidden West at 30 but not so much with Ratchet, and I think that is due to visual noise. I do think modern displays have something to do with it though. CRTs were so much better at handling motion and still are. I can play the same game on a CRT, Plasma, OLED and LCD and the motion clarity gets worse progressively.
 

Chastten

Banned
Okay, but I can play modern games at 20-30 fps without any issues. And no, it's not because 'I don't know any better' because I can play most games on my PC at 60 or 120+ fps just fine. Maybe this is an issue if you're mostly into fast-paced pew pew games or whatever but I have no issues with 30 fps, period.

Sucks to be you if you can't handle 30 but yeah, this entire topic reads like you need to get out, smell some fresh air and find a new hobby.
 

CGNoire

Member
I don't know about all that. Warhawk @20fps on PS1 is damn near unbearable, and same for GTA3 @25fps on PS2.


Current image quality has been blurry as all hell for a while now for sure.
Temporal debt is gonna bite us all in the ass by the end of this gen.
 

Reallink

Member
Out of curiosity what was the input lag of Ocarina running on an N64 at 20fps?
 

NikuNashi

Member
If you are good with watching slideshows, great.

I demand more from next gen (current gen) experiences and that includes a rock solid 60fps.

If you can't hit 60fps then lower your CPU or GPU load: lower res, fewer polys, reduced shadows, LODs, etc. Because gameplay and immediacy of control are more important than anything else.
 