
Why 30FPS was once acceptable, BUT ISN'T ANYMORE!... mostly...

LordOfChaos

Member
Yess, that makes sense to me OP.

Some modern games feel abjectly horrible in 30fps in ways older games didn't. In Control, something about the lights got super smeary in 30 with RT for example.
 
READ THE FUCKING POST BEFORE POSTING NON-ARGUMENTS PLEASE.

This thread is inspired by some of the comments I read in this thread:
where people talk about playing old games that ran at super low framerates, and having no issues with that...
And I too am one of them; I even just recently played through Disaster Report on XBSX2, a PS2 game that is locked to 20fps and even drops below that lock in some scenes...
I had no issues playing it... but why do I hate 30fps in modern games, when I play 20fps retro games just fine?

The quick explanation would be:
1: Framerate is not the issue.
2: Input lag and visual noise are.

Modern games feel worse at low framerates than older games.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
And the reasons for this mainly come down to how modern engines work and how modern games are rendered.


But I will elaborate on these a bit, otherwise what kind of OP would this be? :D


First of all, input lag:
Many modern engines introduce an insufferable amount of input lag, and the games built on them aren't designed to compensate for it either.
Unreal Engine is especially bad here, but it's far from the only one... God of War's current engine has the same issue, as do CryEngine and many others.
These engines don't inherently have these problems, but the way almost all developers use them is the issue, relying on the default ways these engines read inputs, render images and handle Vsync.

Ocarina of Time on N64 had less input lag at 20fps than almost any modern game has at 30fps... and for some games, even the 60fps modes have higher input lag than 20fps Ocarina of Time... this is not an exaggeration, it's absolutely true.

And this is not down to modern TVs either. In fact, if you have a modern Samsung TV from the last two production years, for example, and you play at 120fps on it, your screen's input lag is lower than the latency of a CRT. Because yes, CRTs also have input lag: it comes from how they draw an image from top to bottom 60 times each second, which means one image takes 16.6ms to be drawn. Input lag is usually measured at the center of the image, so by that standard a CRT has at least 8.3ms of input latency. A modern 120Hz Samsung TV, meanwhile, has ~5ms of input lag... and about 9ms at 60Hz.
The CRT input lag problem has basically been solved now; we are so close to that even on TV screens that it's not a factor anymore, let alone on high-end PC monitors.
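For anyone who wants to check the math, here's a quick back-of-the-envelope sketch of the scan-out numbers above (mid-screen measurement assumed; real displays add their own processing on top):

```python
# Back-of-the-envelope scan-out latency (values from the paragraph above).
# A 60Hz CRT draws one full field top-to-bottom every 1/60 of a second.
def scanout_latency_ms(refresh_hz: float, measure_point: float = 0.5) -> float:
    """Latency until the measured screen position is drawn.
    measure_point = 0.5 means mid-screen, the usual measurement spot."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * measure_point

print(f"CRT @ 60Hz, mid-screen:  {scanout_latency_ms(60):.1f} ms")   # ~8.3 ms
print(f"120Hz panel, mid-screen: {scanout_latency_ms(120):.1f} ms")  # ~4.2 ms scan-out
# A modern 120Hz TV's ~5 ms total input lag is in the same ballpark as its
# scan-out time alone, which is why it can undercut a 60Hz CRT's ~8.3 ms.
```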

And this increase in latency isn't only obvious when comparing super old games to super new games.
New games still have a shitload of variation, with games like Call of Duty Modern Warfare 2/Warzone 2.0 having input lag so low that they compete with old SNES games!
We are talking ~40ms of latency at 60fps, and below 30ms at 120fps, which is lower than most 8 and 16 bit games ever dreamt of reaching.
We could compare some Xbox 360 games that ran at 30fps to some modern games that run at 60fps, and Halo 3 would win... God of War Ragnarök at 60fps funnily enough has about the same latency as Halo 3 at 30fps, which is around 90ms for both (+/- 5ms)

So even though there are modern games that absolutely crush it when it comes to latency, like Call of Duty, and there are of course old games like Killzone 2 that were infamous for their high latency, the sad pattern is that latency is going up.
We are now in a spot where the bulk of modern titles have the same or even more input lag than Killzone 2, a game that was panned at the time for how awful the aiming felt due to its lag.
Coming back to God of War Ragnarök: that game, in its 30fps Graphics Mode, has an input latency of around 170ms. Killzone 2's latency was in the 150ms ballpark!!!

So let that sink in for a moment... during Gen 7, Killzone 2 got a ton of bad PR for having 150ms of latency... and a modern game like God of War Ragnarök easily exceeds that!
Meanwhile Halo 3 had 90ms of latency at 30fps on an Xbox 360, and Ragnarök has about the same amount of latency in its 60fps Performance Mode.
In God of War's case it seems to be mostly the Vsync that is the issue, since as soon as you use the unlocked VRR mode that deactivates Vsync, the latency shrinks down to 90ms, on par with Halo 3 and the 60fps Vsync mode of the game.

Why does Vsync introduce input lag? Because it pre-renders (buffers) frames. And some games take this to another level, usually to smooth out framerates when they are extremely demanding on the GPU: they hold multiple frames in the back buffer before displaying a new one, which basically gives them a fallback frame under most circumstances and keeps the perceived framerate consistent at the cost of input lag.
Sea of Thieves is, in fact, the only game I know of that actually lets the user choose how many frames it should pre-render.
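As a rough illustration of why buffered frames hurt (a toy model, not how any particular engine actually schedules frames): every frame sitting in the queue adds one full frame time between your input and the result showing up on screen.

```python
# Toy model: input is sampled when a frame starts simulating, and shown when
# that frame finally leaves the queue. Each buffered frame adds a frame time.
def input_to_photon_ms(fps: float, queued_frames: int, display_lag_ms: float = 0.0) -> float:
    frame_time = 1000.0 / fps
    # 1 frame to simulate/render + N frames waiting in the queue + display lag
    return frame_time * (1 + queued_frames) + display_lag_ms

for fps in (30, 60):
    for queued in (1, 2, 3):
        print(f"{fps} fps, {queued} pre-rendered frame(s): "
              f"{input_to_photon_ms(fps, queued):.0f} ms before the display adds its own lag")
```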


Secondly, how games build their final image:
Modern games have tons of high-frequency detail, and they are expected to be played on high-resolution displays.
These 2 factors are the reason many games now use aggressive TAA, and more and more games use one of the many forms of Upsampling to higher resolutions from a lower base resolution.

These 2 things both lead to the same issue: a muddy, hard-to-parse image in motion.

Then a third factor comes in, MOTION BLUR *scary thunder noises*
And Motion Blur adds even more visual noise and muddies the final image even more.

So a modern game will often look pristine and super clean when you hold your camera completely still, but as soon as your character, and especially the camera, moves at all... MUD AND SMEARING.
And funnily enough, we again have an old game as an example here, one that at the time got a lot of flak for having the same issues that nowadays are so common no one even really talks about them in reviews anymore...
And that game is Halo Reach.

Halo Reach was an early example of a game that used a form of Temporal Anti Aliasing... TAA.
And TAA at the time Halo Reach was made, was still firmly in its infancy, exhibiting a lot of issues and visual noise. So if you stood completely still in Halo Reach, it looked GLORIOUSLY smooth and clean... move the camera and it was a jittery mess.

These days TAA has gotten a lot better, but it still has issues with ghosting, fizzle and disocclusion artifacts the moment something on screen moves.
But of course, TAA isn't the shiny new thing anymore... we now have FSR2, TSR and many other proprietary methods by different developers...
And these upsampling methods basically bring back almost all the issues Halo Reach had, and then some!

When you play Ocarina of Time at 20fps on an N64 connected to a CRT, motion is clean! You can see motion clearly, there are no artifacts from anything moving too fast, no blur to reduce the legibility of the action happening around you, and no motion blur trying to make it seem smoother than it is.

"but why is this a 30fps problem mister Y" is what you are now asking I bet!
WELL, TAA, FSR2, TSR, CBR, ADD UPSAMPLING METHOD HERE... and Motion Blur, all of these rely on image data from previous frames!
THE LESS IMAGE DATA, THE MUDDIER THE IMAGE IS IN MOTION!
So, at 30fps these modern upsampling and anti aliasing methods have less temporal data available to them to create their final image!
A game running at 1440p FSR2 Quality Mode will look sharper and have less artifacts when running at 60fps than it would have at the same settings but locked to 30fps.
So the lower the framerate, the more "smear" the more "mud" and the more "fizzle" there is.
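To make the "less temporal data" point a bit more concrete, here's a minimal sketch of the history accumulation idea behind TAA/FSR2-style reconstruction (the blend weight and convergence target are made-up illustrative values, not anything from an actual implementation):

```python
# Minimal exponential-history accumulation, the core idea behind TAA-style
# reconstruction: new_history = lerp(history, current_frame, alpha).
def frames_to_converge(alpha: float = 0.1, target: float = 0.9) -> int:
    """How many frames until the accumulated image is `target` converged
    after a disocclusion (i.e. the history gets thrown away)."""
    converged, frames = 0.0, 0
    while converged < target:
        converged += alpha * (1.0 - converged)
        frames += 1
    return frames

n = frames_to_converge()
print(f"~{n} frames to reach 90% convergence")
print(f"at 60 fps that's ~{n / 60 * 1000:.0f} ms of visible fizzle,")
print(f"at 30 fps it's ~{n / 30 * 1000:.0f} ms -- the same artifact stays on screen twice as long.")
```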

So in the end, all of this adds up.
The Motion Blur, the Image Reconstruction, the Antialiasing, all of it gets worse the lower the framerate is.

This is why Jedi Survivor in motion looks like this (I purposefully took a shot from a spot where the game locks to 60fps to give it the best chances):
[screenshot: starwarsjedisurvivorm7ifc.png — Star Wars Jedi: Survivor in motion]


and F-Zero GX like this (I'm turning sharp left here to give it the worst chances with fast camera and vehicle movement):
[screenshot: fzeroscreend5cph.png — F-Zero GX in motion]



Normalized for same image size so you don't have to zoom as much on mobile (not zoomed in in any way, just cropped)
[cropped screenshot: swscreenzoomxlfwx.png — Jedi Survivor]
[cropped screenshot: fzeroscreenzoom7ocx7.png — F-Zero GX]



And the lower the framerate, the worse all the weird grainy, blurry, fizzly artifacts you see in Jedi Survivor will get. Meanwhile, if you ran F-Zero GX at 10fps, it would look literally the exact same in terms of image quality and artifacts (because it has no artifacts).
And we haven't even touched motion blur, which is off in my screenshot here, along with all the other image filters.

If a game like Jedi Survivor runs at a lower framerate, the amount of data the engine has to construct the image from one frame to the next is reduced, the image gets muddier and less readable, and you will see blur in motion even though motion blur is off.


Smaller issues that aren't technical but due to Art Design of games:
Older games have different considerations when it comes to art (textures, models, scale) than modern games.
Older consoles couldn't handle as much detail as newer consoles can. This means textures had less "visual noise", meaning there is less for your eyes to get hung up on, less for your brain to process.
In motion, when running through a level or turning the camera, this means the things that are important, or could be important, as well as your general environment, are easier to digest.

You will have an easier time knowing where enemies are, or where a button or rope is, when it's simpler in design, bigger in scale and stands out more. And all of these features are features older graphics had!
A red button that is unrealistically big, sitting on a wall with a simple flat texture and no shadows or reflections, is easier to see than a realistically scaled red button on a wall with detailed textures, shadows cast by adjacent objects, and other lighting applied.

So even if older games had the graphical issues I explained above, the simpler graphics would still make it way easier on your eyes, because there is not as much small detail that can get lost in all the blur, fizzle and breakup.

In short: Less Detail + Scale = Easier to read environments even at low framerates.



SO IN CONCLUSION:
Playing a game like Ocarina of Time will take some getting used to at first, but once your eyes have adjusted to the 20fps output, the image will look pretty decent and clean.
Playing Jedi Survivor at 60fps already makes the whole image look muddy and blurry due to the artifacting of the upsampling. At 30fps this only gets worse; add Motion Blur to negate the stutter and you'll see barely anything in motion.

Playing Ocarina of Time at 20fps on real hardware will feel... not amazingly responsive, but more responsive than many modern games like... Jedi Survivor... or Redfall... or God of War Ragnarök feel at 30fps, hell some of them even at 60fps!

So you cannot point the finger at an old game and say "LOOK! Back then we had no issues with this! This also ran at a super low framerate!" while not taking into account how modern game engines work, how they react to player input, and how they construct their final output image.

Old games had simple graphics; they were easier for your eyes to parse because important stuff was exaggerated in scale and shape, and they had no motion blur, no reconstruction artifacts, no TAA or screen-space effects that lag behind and take time to accumulate data. They had simpler engines with less latency.
And that is why they look cleaner and play better at low framerates than new games.

And yes, there are modern and old games that are outliers from this. Some modern ones, like From Software's games, have pretty low latency, and some old games, like Mortal Kombat on Game Boy, have massive amounts of lag.
And some old games had weird fake motion blur which made the image hard to read, while some modern games have clean images with barely anything to muddy them in motion.
The difference is that the bad examples are more common today, and the good examples were more common in the past.
I’m crazy late to this thread, but this post is amazing man.
 

rofif

Can’t Git Gud
FSR2 is crap. I knew this tech would make games uglier in the long run. Can't even trust resolution anymore
 
Last edited:

ryan90k

Neo Member
I agree with OP and can give several examples of both great and bad implementations of 30 fps.

Take Spider-Man and its sequel, which have incredibly responsive controls even in the 30fps mode thanks to the engine itself having low lag (true for all Insomniac games). If you compare this to something like the Demon's Souls remake, the same 30fps mode feels extremely slow to respond and "heavy"; it's just not enjoyable to play, and as a result the 60fps mode becomes the default, as it has significantly lower input lag and is no longer distracting. This is also true for God of War and its sequel Ragnarök. A few other 30fps games with low input lag would be Zelda: Tears of the Kingdom, Persona 5, Xenoblade Chronicles 3 and the Alan Wake Remaster (I was actually convinced it was running at 60fps because it was so responsive).

There are also great methods to reduce the motion judder of 30fps without extreme motion blur; Digital Foundry covered this with Final Fantasy 7 Remake, which simulated a specific camera shutter speed within Unreal Engine 4 to produce smoother motion without as much blur.
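For context on what simulating a shutter speed means (this is generic film-camera math, not the actual UE4/FF7R implementation): the blurred portion of each frame is the shutter angle divided by 360°.

```python
# Film-style motion blur: a 180-degree shutter blurs motion for half of each
# frame's duration; smaller angles blur less but judder more.
def blur_window_ms(fps: float, shutter_angle_deg: float) -> float:
    return (shutter_angle_deg / 360.0) * (1000.0 / fps)

for angle in (90, 180, 270):
    print(f"30 fps, {angle} deg shutter: {blur_window_ms(30, angle):.1f} ms of blur per frame")
# Tuning this angle is how a 30fps game can smooth judder without smearing
# the whole image the way a heavy full-frame blur would.
```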

In my opinion, the perception that 30fps is bad comes from games with poor, uneven frame pacing, resulting in strange stuttering movement and high input lag. PC games often have terrible 30fps implementations as well, forcing the user to set framerate caps and Vsync settings through the control panel or RTSS.
 

ryan90k

Neo Member
I also remember not wanting to play Crash Team Racing: Nitro-Fueled because it was 30fps, even though I loved the original on PS1 and it was also 30fps. I thought there was no way the game could be good, as I had played Sonic & Sega All-Stars Racing Transformed on PS3, found it to be so unresponsive, and figured I must simply have gotten accustomed to 60fps racing. In reality, however, the 30fps in Crash Team Racing is excellent and has lower input lag than many 60fps games, making it a gem to play. The game actually made me realise how silly it was to exclude a game due to its frame rate alone.
 

Celcius

°Temp. member
Oh no.... I've finally been bitten by the 60 fps bug.

I wanted to play Star Ocean 5 and I held out for a long time hoping it would get ported to Steam like SO4 and SO6 but eventually I got tired of waiting and I bought the ps4 version on sale. The ps4 version runs at 60 fps and the whole time I'm playing through it I'm really liking how smooth the motion is.

Eventually I beat it and then I go back to other games on my ps5. 30 feels so sluggish and janky. I swap back and forth between resolution mode (30 fps) and performance mode (60 fps) in several games and the difference is night and day. The slightly lower resolution isn't that big of a deal but the motion difference is instantly noticeable and a huge deal.

How did I live with 30 fps for so long?

I feel like now I have no reason to resist OLED anymore since I've seen the light - if something looks stuttery at 30fps then it's not the display's fault... it's the source. Even on my 60 hz lcd monitor the difference is so clear. From here on out on console I'm choosing the 60 fps option.

Perhaps it's a good thing I've never played a game in 120+ hz/fps before... I don't know what I'm missing there.
 

ryan90k

Neo Member
Oh no.... I've finally been bitten by the 60 fps bug.

I wanted to play Star Ocean 5 and I held out for a long time hoping it would get ported to Steam like SO4 and SO6 but eventually I got tired of waiting and I bought the ps4 version on sale. The ps4 version runs at 60 fps and the whole time I'm playing through it I'm really liking how smooth the motion is.

Eventually I beat it and then I go back to other games on my ps5. 30 feels so sluggish and janky. I swap back and forth between resolution mode (30 fps) and performance mode (60 fps) in several games and the difference is night and day. The slightly lower resolution isn't that big of a deal but the motion difference is instantly noticeable and a huge deal.

How did I live with 30 fps for so long?

I feel like now I have no reason to resist OLED anymore since I've seen the light - if something looks stuttery at 30fps then it's not the display's fault... it's the source. Even on my 60 hz lcd monitor the difference is so clear. From here on out on console I'm choosing the 60 fps option.

Perhaps it's a good thing I've never played a game in 120+ hz/fps before... I don't know what I'm missing there.
I also prefer 60fps when the option is there, unless the resolution drop is terribly noticeable. Guardians of the Galaxy on PS5 was the only game I've played where the 60fps mode just looked that bad. The game has pristine image quality in resolution mode running at 4K, but the 60fps mode ran at 1080p with terrible temporal AA that smeared all the details and blurred the image; then, to make matters worse, hair seemed to be rendered at sub-native resolution and would turn into a smearing, soupy mess (technical term, I swear) in motion. I did, however, get used to the judder and worse input lag of the 30fps version and still found it to be a great game.
 

JackMcGunns

Member
Sony led the launch of the PS5 with a presentation of a demo running 30fps and 1440p output. I think we should've started raging since then. I wonder why the sudden issue? :goog_unsure:

Not every game needs to be 60fps. Racing games, Fighting games, twitch shooters... sure. But some adventure, RPGs, slower paced exploration games, etc., can do just fine at 30fps. Everyone is just spoiled that for the first time in console history (3D era), practically every game has been able to run 60fps, but eventually as the generation progresses, that's going to change unless you get the Pro consoles.
 
Last edited:

ryan90k

Neo Member
Sony led the launch of the PS5 with a presentation of a demo running 30fps and 1440p output. I think we should've started raging since then. I wonder why the sudden issue? :goog_unsure:

Not every game needs to be 60fps. Racing games, Fighting games, twitch shooters... sure. But some adventure, RPGs, slower paced exploration games, etc., can do just fine at 30fps. Everyone is just spoiled that for the first time in console history (3D era), practically every game has been able to run 60fps, but eventually as the generation progresses, that's going to change unless you get the Pro consoles.
To be honest, I'm fine with having them target a lower framerate as time goes on for RPG/adventure and slower paced games, I just hope they can get it right. The Medium on PS5, for instance, is a great example of a game where 30fps makes sense, but unfortunately it suffers from poor frame pacing at 30fps, a very common problem that results in stuttering movement which I unfortunately find distracting to the point it breaks immersion. Bloodborne has the same issue, which is always caused by bad Vsync. The other thing that's important is input lag, as 30fps naturally has 16.7 more milliseconds of lag per frame no matter how you slice it, and sometimes more depending on the engine's own lag and what you are asking it to do (more complex graphics/simulations at 30fps can also mean more engine lag on top of the delay from the lower framerate itself).
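A toy example of what bad frame pacing looks like in numbers (purely illustrative, not measurements from The Medium or Bloodborne):

```python
# Toy frame pacing comparison on a 60Hz display (16.7 ms per refresh).
refresh_ms = 1000.0 / 60.0

good_pacing = [2, 2, 2, 2, 2, 2]   # every frame held for exactly 2 refreshes
bad_pacing  = [1, 3, 1, 3, 1, 3]   # alternating hold times, same average

for name, pacing in (("good", good_pacing), ("bad", bad_pacing)):
    frame_times = [n * refresh_ms for n in pacing]
    avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
    print(f"{name} pacing: frame times {[round(t, 1) for t in frame_times]} ms, "
          f"average {avg_fps:.0f} fps")
# Both average 30 fps, but the 'bad' version swings between 16.7 ms and 50 ms
# frames, which is exactly the judder that Vsync-related pacing bugs produce.
```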

Some things that could help: allowing 60Hz content to run in a 120Hz container (this could be done at the PS5 firmware level with a toggle), as this has been shown to reduce lag since the screen quite literally allows faster updates, massively reducing Vsync lag. A wider VRR range implementation would also help, as it's already possible to get VRR working below the 48-120fps window on PS5 if framerate doubling (low framerate compensation) is enabled (which could be included in the same "run all content in 120Hz mode" firmware toggle), allowing anything down to 24fps to work. VRR acts like Vsync once the refresh rate ceiling is hit, meaning a 60Hz refresh with a 60fps game means Vsync input latency; 59fps, however, can actually have lower latency due to being within VRR range with no Vsync input lag. This is why running a 60fps game at a 120Hz refresh rate with VRR enabled would give the best results, automatically doubling any frames that fall below 48Hz and resulting in smooth framerates from 24-60fps. This could all be implemented in firmware; hopefully Sony do this at some point.
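The low framerate compensation logic described above boils down to something like this (a simplified sketch, not Sony's or any TV vendor's actual implementation):

```python
# Simplified low-framerate-compensation (LFC) logic: if the game's frame rate
# falls below the panel's minimum VRR refresh, repeat each frame enough times
# to push the effective refresh back inside the 48-120Hz window.
def lfc_refresh(game_fps: float, vrr_min: float = 48.0, vrr_max: float = 120.0):
    multiplier = 1
    while game_fps * multiplier < vrr_min:
        multiplier += 1
    refresh = game_fps * multiplier
    return (multiplier, refresh) if refresh <= vrr_max else (None, None)

for fps in (24, 30, 40, 59, 60):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {mult}x, panel runs at {hz:.0f} Hz")
```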

The other thing I hope is considered by 30fps developers is an uncapped framerate (with framerate doubling enabled to allow for VRR), as well as 40fps modes: 10 extra frames is much easier to hit than the 30 extra needed for 60fps, yet per-frame latency drops from ~33ms at 30fps to 25ms at 40fps.
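The frame-time math behind that, as a quick sanity check:

```python
# Frame time (the baseline latency floor) at common targets.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms.
# Going 30 -> 40 costs only 10 extra frames per second but already cuts
# per-frame latency by ~25%; 30 -> 60 needs 30 extra frames per second.
```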
 
Last edited:

vkbest

Member
Sony led the launch of the PS5 with a presentation of a demo running 30fps and 1440p output. I think we should've started raging since then. I wonder why the sudden issue? :goog_unsure:

Around the time the PS1 released, most people were playing on PCs with no GPU cards, running games on their Pentiums at 15-20fps and low resolutions. I wonder why people are raging now because their new games can't reach 120fps on their new PCs.
 
Hey "Y", the greatest flaw in your OP is that you are appealing to people who go by emotions and not logic and reasoning. Unless you're an established tech specialist à la Digital Foundry, these average Joes will just whiz by your post even with that big red notification you placed there.
 

Sophist

Member
Hey "Y", the greatest flaw in your OP is that you are appealing to people who go by emotions and not logic and reasoning. Unless you're an established tech specialist à la Digital Foundry, these average Joes will just whiz by your post even with that big red notification you placed there.
Dude wants to talk about input lag but did not say a single word about animation, which is the main reason for today's input lag. That's not serious.
 
Dude wants to talk about input lag but did not say a single word about animation, which is the main reason for today's input lag. That's not serious.
Judging by the responses in this thread, it wouldn't matter what he wrote. They are all either off-topic or saying "but I like 30fps because it doesn't bother me" in a thousand different ways
 

hussar16

Member
Around the time the PS1 released, most people were playing on PCs with no GPU cards, running games on their Pentiums at 15-20fps and low resolutions. I wonder why people are raging now because their new games can't reach 120fps on their new PCs.
I don't. As long as there's no stutter, I don't care
 

kuncol02

Banned
Around the time the PS1 released, most people were playing on PCs with no GPU cards, running games on their Pentiums at 15-20fps and low resolutions. I wonder why people are raging now because their new games can't reach 120fps on their new PCs.
On 14-inch CRT monitors. Screen size actually matters for how frame rate feels.
 

vkbest

Member
On 14-inch CRT monitors. Screen size actually matters for how frame rate feels.
My point is that standards change over time: 3 years ago people were used to 30fps; now, except for maybe 4 or 5 games, all games release at 60fps or with a 60fps mode.

Also, using your example: the fact that we now have OLED screens, where motion blur feels terrible and yet you need motion blur for fake smoothness at 30fps, is a reason to abandon 30fps.
 

StueyDuck

Member
It's not rocket science that 30 isn't as good as 60.

The only reason it isn't acceptable, however, is two-fold.

1) more games are 60 now than prior gen so more people are getting used to it.

2) stupid dumb arse suits (execs) at these companies said stupid shit like promising 60 for all games going forward.

Without those (number 2 mostly) the 30fps thing wouldn't be as big an issue in the general gaming populace as it is now.
 
Last edited:

Mr Reasonable

Completely Unreasonable
I can't stand 30fps on PS5 or Xbox on my LG CX. Feels like a flip book after 60fps.

Weirdly though, Tears Of The Kingdom feels absolutely fine at 30 fps. Have no idea why. In fact, Switch games in general feel ok.

I hated TOTK on my C9 so much I turned on the motion smoothing.

Tbh, it was fine once I'd done that. I'd never do it with any other game, obvs.
 

EekTheKat

Member
IMHO,
The rise of the 40fps mode in a 120Hz container has been one of this generation's best contributions to gaming. It's also one of the better compromises in the 30fps vs 60fps war.

Hell even the 40fps mode on the 40hz screen setting on the Steam Deck has been a blessing from the battery life gods.
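Part of why 40fps slots so neatly into a 120Hz container is simple divisibility; a quick illustrative check:

```python
# A frame rate paces evenly only if the display's refresh rate divides by it
# cleanly, so each frame is held for a whole number of refreshes.
for fps in (30, 40, 60):
    for hz in (60, 120):
        refreshes = hz / fps
        even = refreshes.is_integer()
        print(f"{fps} fps on a {hz} Hz display: {refreshes:.2f} refreshes/frame "
              f"({'even pacing' if even else 'uneven -> judder without VRR'})")
# 40 fps can't pace evenly on a 60 Hz screen (1.5 refreshes per frame), but it
# fits a 120 Hz container perfectly at 3 refreshes per frame.
```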
 

Zathalus

Member
Honestly I want more 30fps games that push the envelope. I can stomach 30fps if I have to but I'll probably just play it on PC at 100fps+. Cross-gen needs to die.
 

IAmRei

Member
I'm pretty fine with 30fps, as long as it's not janky to the point where it's broken. The last fast-paced game I played at 30fps was Final Fantasy XVI, and it was still okay and fine with me. Dragon's Dogma 2 is also 30 but still fine with me. On the contrary, if it's on PC, I don't like anything under 60 for 3D games. It's weird for me. I have PS, PC and Nintendo most of the time, and I can play locked or variable 30 or 60 without much complaint.

Maybe 30 will still be good as long as the game was intended to be like that.
 

rofif

Can’t Git Gud
On 14-inch CRT monitors. Screen size actually matters for how frame rate feels.
It doesn't matter that much. Went from a 27" to a 48" OLED on my desk and it's fine. Just don't disable motion blur in games, it's there for a reason. I play 30fps just fine and find it enjoyable. Depends on the game. FF16 has the best 30fps mode this gen
 