
40 FPS modes coming to Spider-Man and Miles Morales on PS5

JimboJones

Member
This has nothing to do with VRR.

40 FPS on a refresh rate it divides into evenly will be smooth, without judder.

By that logic, if you have a display that supports 90Hz, you can run a 45 FPS game on it and it will divide evenly and play even smoother without any judder. It just needs to divide evenly.
I think he means that with a VRR display you can lock your framerate to whatever arbitrary value you want.
I've played games locked at 77fps or 35fps, for example.
But yeah, with fixed refresh rate displays the framerate has to divide evenly into whatever refresh you're running at.
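
To illustrate the divisibility point, here's a minimal Python sketch (my own illustration, not from anyone in the thread) that checks which framerate/refresh combos pace cleanly on a fixed-refresh display:

```python
# Minimal sketch: a capped framerate paces cleanly on a fixed-refresh display
# only if the refresh rate is an integer multiple of the framerate.

def paces_cleanly(fps: int, refresh_hz: int) -> bool:
    """True if every frame can be held for the same whole number of refreshes."""
    return refresh_hz % fps == 0

for fps, hz in [(30, 60), (40, 60), (40, 120), (45, 60), (45, 90), (60, 120)]:
    if paces_cleanly(fps, hz):
        print(f"{fps} fps on {hz} Hz: even, each frame held for {hz // fps} refresh(es)")
    else:
        print(f"{fps} fps on {hz} Hz: uneven -> judder on a fixed-refresh display")
```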
 

DeepEnigma

Gold Member

YOU SUCKAS AT NAUGHTY DOG THOUGHT YOU HEARD THE LAST OF US!!??


WELL YOU'RE WRONG AGAIN, WE AT INSOMNIAC ARE WORKING ON THREE MAJOR GAMES


AND JUST FOR THE HECK OF IT WE PATCHED IN 40FPS FIDELITY MODES ON ALL OUR PS5 GAMES


AND WE DID THIS DURING A LUNCH BREAK! YOU'RE NOT THE TOP DOG ANYMORE!


NOW GO FETCH ME A BONE, CAUSE BONESAW IS READY!​
 

DeepEnigma

Gold Member
These extra modes are a big deal. Spider-Man feels so much better at 40fps; people who dismissed this as a Band-Aid are in for a rude awakening.
Hopefully we can benchmark just how high the framerate can go with VRR; if it's constantly in the fifties it will be a big bonus.
R&C has had that mode for a year now, and it felt buttery smooth in comparison to the 30. Literal night and day.
 

DJ12

Member
These extra modes are a big deal. Spider-Man feels so much better at 40fps; people who dismissed this as a Band-Aid are in for a rude awakening.
Hopefully we can benchmark just how high the framerate can go with VRR; if it's constantly in the fifties it will be a big bonus.
40 fps mode has got nothing to do with VRR.

It was only a band-aid for you when claiming victory with poorer performance in face-offs.

Don't confuse yourself.
 

Riky

$MSFT
40 fps mode has got nothing to do with VRR.

It was only a band-aid for you when claiming victory with poorer performance in face-offs.

Don't confuse yourself.

I'm getting over 40fps actually; don't be so hurt about it. You can enjoy the extra modes too right now, don't deny yourself.
 

SlimySnake

Flashless at the Golden Globes
So the 40 fps mode I tried out last night had some jitters and judders while moving the camera around at high speeds, but those judders are mostly gone. It feels much closer to 60 fps now. If there are drops, I don't notice them. I wonder if it's in the 50s now. Feels way smoother than the 40 fps Ratchet patch.

But the true star of the show is the Performance mode. Maybe it is the placebo effect, but going from the Fidelity RT 40+ fps mode to a Performance 60 fps mode feels incredible. I wouldn't be surprised if it's 120 fps.
 

01011001

Banned
So the 40 fps mode I tried out last night had some jitters and judders while moving the camera around at high speeds, but those judders are mostly gone. It feels much closer to 60 fps now. If there are drops, I don't notice them. I wonder if it's in the 50s now. Feels way smoother than the 40 fps Ratchet patch.

But the true star of the show is the Performance mode. Maybe it is the placebo effect, but going from the Fidelity RT 40+ fps mode to a Performance 60 fps mode feels incredible. I wouldn't be surprised if it's 120 fps.

Can you not display real-time refresh numbers on your TV?
If my TV supports the update I will test this later, but since my TV is HDMI 2.0 I'm not getting my hopes up that it works, even though it technically should... but Sony said they only support 2.1. I'm curious lol
 

Soodanim

Member
Honestly I never really understood that. I mean, I get it on paper, in theory, but in reality to me higher is always better. Like, for example, 40 fps on a 60Hz screen will feel much better than 30 fps, despite 30 fps being half the refresh.
No one really explained it (from what I saw). It's in the remainders of the division. Trying to display 45 frames across 60 refreshes means some of those frames have to be displayed for a different amount of time than the others, and that results in the video being less smooth. That's why we're only seeing 40fps on console now: 120Hz allows for the clean division needed to display 40fps smoothly. The uneven pacing is very noticeable, so 40fps was never an option before because it wouldn't have been worth it.
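
To put rough numbers on that remainder point, here's a small Python sketch (an illustration under the simple assumption that each frame is held for a whole number of refresh cycles):

```python
from fractions import Fraction

def frame_durations_ms(fps: int, refresh_hz: int, frames: int = 12):
    """Per-frame display times when vsync'd to a fixed refresh rate."""
    refresh_ms = 1000.0 / refresh_hz
    owed = Fraction(0)
    durations = []
    for _ in range(frames):
        owed += Fraction(refresh_hz, fps)  # refresh cycles "owed" to this frame
        held = int(owed)                   # whole refreshes this frame is shown for
        owed -= held
        durations.append(round(held * refresh_ms, 1))
    return durations

print("45 fps on 60 Hz :", frame_durations_ms(45, 60))   # mix of 16.7 and 33.3 ms -> judder
print("40 fps on 120 Hz:", frame_durations_ms(40, 120))  # steady 25.0 ms
print("30 fps on 60 Hz :", frame_durations_ms(30, 60))   # steady 33.3 ms
```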
I don't understand why they wrote "50% or more" if it's only "at times" above 60fps.
The same reason "Under $20" includes $19.99. Companies will say as much as they can get away with.
The why is explained many times in the thread, on the first page alone. You could have read the answer in the time it took to post even that low-effort drive-by post of yours.
 

yamaci17

Member
I hope 60 fps becomes the standard.

Would be fantastic at 120 fps, but I don't believe in santa so I wouldn't believe in that either.

60 fps is cool, but sometimes the sacrifice can be too heavy for some, especially in terms of resolution. I'd like to play RDR 2 at 4K @40 FPS instead of 1440p @60 FPS. I have nothing against 60 FPS, but 40 FPS modes can enable higher resolutions with a smoother experience than 30 FPS!
 

Soodanim

Member
I hope 60 fps becomes the standard.

Would be fantastic at 120 fps, but I don't believe in santa so I wouldn't believe in that either.
I almost never settle for <60fps (when on a 60Hz display), but let's be real: 60 is only the baseline because the choice was (and usually still is) 30 or 60. If we'd always had access to arbitrary refresh rates, 60 wouldn't be held in such high regard, nor would we settle for 30. That doesn't stop more being better, but it's a sliding scale from 30 to 60, and 40 is a healthy middle ground. If 40 can become the new 30 for those with 120Hz displays, that's a huge win.
 

DJ12

Member
I'm getting over 40fps actually; don't be so hurt about it. You can enjoy the extra modes too right now, don't deny yourself.
Fact is you implied the 40fps mode was a benefit of VRR and it's not. Take it on the chin and try not to double down on BS.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
60 fps is cool, but sometimes the sacrifice can be too heavy for some, especially in terms of resolution. I'd like to play RDR 2 at 4K @40 FPS instead of 1440p @60 FPS. I have nothing against 60 FPS, but 40 FPS modes can enable higher resolutions with a smoother experience than 30 FPS!
But still a worse gameplay experience than 120 fps. The input latency alone makes anything below that feel bad.
 

lukilladog

Member
Hmm, the actual games had a "37fps Mode"?
I'm vaguely aware that there was some kooky/adventurous work done in UK games (and modifications of existing games) to deal with PAL, but I've never heard of a 37FPS mode or really much that was specifically there to take advantage of over-60Hz monitors targeting 74Hz. (Granted, I'm not in Europe and am mostly a console gamer, so I'd be out of the loop... Google has nothing useful for 37FPS Mode or 74Hz Games, but Google sucks at most things from before Google or not pertaining to the US... are you talking about a PAL TV, or am I off track even there?)

For sure there were PC games that let you run at a higher framerate when PC monitors started going to 75/90/120Hz (and that roughly translated to framerate in the US and exactly translated to framerate in PAL regions), but I always assumed that was them taking advantage of unlocked framerates rather than specific framerates, and specifically targeting a half-framerate on a high-refresh monitor seems like unusual thinking to me... smart, but not the way they were thinking back then; they usually just cranked everything up as high as it could go and hoped your machine could handle it.

Also to consider: I believe we're talking about interlaced screens at that time? Over-60Hz displays were coming out before progressive scan came into regular practice, and an interlaced realtime graphic is a different challenge from a full-frame progressive scan picture.

But yes, running at 74Hz would look better, provided the game could keep up with it; you'd get a lot more frames per second, you'd have an increase in direct responsiveness, and it'd look different from a "regular game", so it would be a good break from convention for your eyes too.

Almost any LCD monitor could do 74Hz progressive, you just had to create a custom resolution with that refresh rate. As for games, there were no 37fps modes, and even 1/2 vsync was not a thing in graphics drivers, but you could use regular vsync and lock fps to 37 with Fraps. I think the RivaTuner guy saw people doing this and then implemented fps locks in his tools, and later asked Nvidia devs for things like vertical tear control and half vsync (neat new Xbox 360 features at the time).
 

ClosBSAS

Member
Lmao. 40fps. Ppl claiming that 40fps at 4K is better than 1440p at 60, haha. I guess to each their own, but damn. Demon's Souls looks amazing at 1440p 60fps and plays much better than 4K 30fps.
 

SlimySnake

Flashless at the Golden Globes
40fps mode, you were fine for 5 minutes, but 80-100fps in Performance RT gives me Peter Tingles, sorry.
It's incredible.

60 fps is cool, but sometimes the sacrifice can be too heavy for some, especially in terms of resolution. I'd like to play RDR 2 at 4K @40 FPS instead of 1440p @60 FPS. I have nothing against 60 FPS, but 40 FPS modes can enable higher resolutions with a smoother experience than 30 FPS!
I am playing RDR2 at native 4K 80 fps and it still feels sluggish lol.
 

yamaci17

Member
But still a worse gameplay experience than 120 fps. The input latency alone makes anything below that feel bad.
Input latency is specifically a huge problem when vsync is involved. With no vsync and no GPU pressure lag (<90% utilization), even 40 fps will feel snappy with minimal input lag.
 

yamaci17

Member
I want to provide some insight on what I'm saying:

Witcher 3, 60 fps + vsync:
72 ms of app latency, heavy input lag

Witcher 3, 60 fps cap only + no vsync + VRR enabled:
22 ms of app latency, smooth and snappy

Witcher 3, 40 fps cap only + no vsync + VRR enabled:
32 ms of app latency (frame time going from 16.6 ms to 25 ms), snappy; not as smooth as 60 fps but greatly snappier than 60 fps + vsync

And destruction comes upon you, 40 fps + 1/3 vsync...
110 ms of latency... enormous input lag. Literally unplayable.

All in all, vsync becomes proportionally worse the lower your framerate is. Having no vsync and only VRR at low framerates gets you minimal input lag. As a matter of fact, as shown above, you get less input lag and snappier gameplay at 40 fps with no vsync compared to 60 fps with vsync.
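
As a rough, back-of-the-envelope way to see why the numbers trend that way, here's a Python sketch (my own simplified model with an assumed 2.5 frames of vsync buffering, not how these figures were measured; only the trend is meant to match, not the exact values):

```python
# Rough model: an in-engine cap + VRR with the GPU not saturated shows a frame
# roughly one frame time after input, while vsync adds a queue of buffered frames
# whose cost grows with the frame time (i.e. it hurts more at lower framerates).

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def est_latency_ms(fps: float, vsync: bool, buffered_frames: float = 2.5) -> float:
    """Very rough input-to-photon estimate: one frame to render + any vsync queue."""
    base = frame_time_ms(fps)
    queue = buffered_frames * frame_time_ms(fps) if vsync else 0.0
    return round(base + queue, 1)

for label, fps, vsync in [("60 fps + vsync", 60, True),
                          ("60 fps cap + VRR, no vsync", 60, False),
                          ("40 fps cap + VRR, no vsync", 40, False),
                          ("40 fps + 1/3 vsync (120Hz)", 40, True)]:
    print(f"{label:30s} ~{est_latency_ms(fps, vsync)} ms")
```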
 

sn0man

Member
It's a big improvement over 30fps even though it's not as smooth as 60. Here's a great comparison from DF when R&C got the 40fps/120hz modes:

It's sort of shocking how close the end-to-end latency is between 40fps and the classic 60fps mode. I don't quite get why this is the case. Seems overrated a bit but maybe I'm not considering it well.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Input latency is specifically a huge problem when vsync is involved. With no vsync and no GPU pressure lag (<90% utilization), even 40 fps will feel snappy with minimal input lag.
But this would be a pretty bad gaming experience. You can't avoid screen tearing without vsync unless you use G-Sync or AMD FreeSync.

But having to debate and optimise and disable stuff to try to make 40 fps fluid shows there's a problem with this low framerate, no matter how you spin it.

No one talks about input latency at 100+ fps.

Also, why would you not use your GPU to the fullest? Why settle for 40 fps if you can squeeze more out of your GPU?
 

Ammogeddon

Member
Fantastic news. Not sure how much work has to go into it, but I'd love to see 40fps roll out to more games. Insomniac are setting a great example that others should follow imo.
 

TrebleShot

Member
The Fidelity mode feels pretty close to 60; I'd be surprised if it isn't.

60fps mode feels like 120fps.

I didn't believe VRR would do much, but if this is an early indication I am more than happy to be completely wrong.
 

DeepEnigma

Gold Member
The Fidelity mode feels pretty close to 60; I'd be surprised if it isn't.

60fps mode feels like 120fps.

I didn't believe VRR would do much, but if this is an early indication I am more than happy to be completely wrong.
If you play with VRR, you are unlocking the framerate, so I believe the 40 mode does hit 60 and mostly stays in the 50s.

The 60/120 mode reaches into the 80s-90s.

Without VRR, 40 will be locked at 40 and 60 at 60.
 

TrebleShot

Member
If you play with VRR, you are unlocking the framerate, so I believe the 40 mode does hit 60 and mostly stays in the 50s.

The 60/120 mode reaches into the 80s-90s.

Without VRR, 40 will be locked at 40 and 60 at 60.
Makes sense, which is incredible. How close were they to 60 at launch with RT at 4K? This is the kind of optimisation I love to see over raw power.

I really hope more devs implement unlocked framerates.
 

YCoCg

Member
I don’t quite get why this is the case. Seems overrated a bit but maybe I’m not considering it well.
The faster you run a display, the faster it can "sync" with devices outputting at the same Hz. E.g. having the game still locked at 30/60fps but having the console sync that at 120Hz reduces the display-latency side of things. That's why 40fps at 120Hz feels way better than it should: you're getting the exact in-between frame time (25ms) of 30 and 60 from the console BUT ALSO the faster syncing of the display, which also reduces latency.

It's something PC players do too, since a lot of games have specific framerate locks: have the game output at a high refresh rate (120/240/360Hz) but lock to 60fps, so you still get the benefit of severely reduced display latency.

And yes, in theory this could be done console-wide by the OS, forcing a 120Hz display mode at all times and keeping 60fps games as they are while still enjoying the benefit of faster display syncing.
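
A small sketch of the display-side half of that argument (illustrative numbers only, assuming a finished frame simply waits for the next refresh and then scans out over one refresh cycle):

```python
# Even if the game stays locked to 60fps, a 120Hz output halves the display-side
# contributions: a finished frame waits at most one refresh period for the next
# sync point, and scanout of the frame itself also takes one refresh period.

def display_side_latency_ms(refresh_hz: float) -> dict:
    refresh_ms = 1000.0 / refresh_hz
    return {
        "worst-case wait for next refresh": round(refresh_ms, 1),
        "average wait for next refresh": round(refresh_ms / 2, 1),
        "scanout of one frame": round(refresh_ms, 1),
    }

print("60 Hz :", display_side_latency_ms(60))    # ~16.7 ms per contribution
print("120 Hz:", display_side_latency_ms(120))   # ~8.3 ms per contribution
```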
 

Represent.

Represent(ative) of bad opinions
No. 60FPS does wonders.

This is dumb.

The baseline framerate should be 60FPS, even if it means sacrificing resolution to 1440p.
What a ridiculous post.

40fps is an amazing option. Does wonders when you don't have to sacrifice any resolution and the gameplay is still smooth.
I'll still be sticking to 30FPS, as I want the best possible visual fidelity.
 

DeepEnigma

Gold Member
What a ridiculous post.

40fps is an amazing option. Does wonders when you don't have to sacrifice any resolution and the gameplay is still smooth.
I'll still be sticking to 30FPS, as I want the best possible visual fidelity.
40's graphical fidelity is the same as 30's (it's literally the same mode now). It defaults to 40Hz in Insomniac games that have this if you are on a 120Hz set and select Fidelity mode.

Unless you disable 120Hz at the system level, or are on a 60Hz set, which won't output 40Hz anyhow, just 30.
 

Soodanim

Member
Given that a mode for 40fps is coming to games that already had a choice between 30 and 60, if you are in this thread comparing 40 and 60 with the aim of proving that 60 is better, you're an idiot.

Of course 60 is better, that's why the choice already exists. Compare 40 to 30, because that's the only actual change here. 60 never went anywhere and shouldn't be in the discussion at all.
 

xion4360

Member
If you play with VRR, you are unlocking the framerate, so I believe the 40 mode does hit 60 and mostly stays in the 50s.

The 60/120 mode reaches into the 80s-90s.

Without VRR, 40 will be locked at 40 and 60 at 60.
I don't think it does? I'm not sure, but it seemed to stay at 40fps... at least, if it was over 47 fps it would show up on the framerate counter, but it stays at 118Hz.

It seems the PS5 implementation of VRR is 47-120Hz, so unless the Fidelity mode was over 47fps it wouldn't help.
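
Taking that 47-120Hz window as given (it's the claim above, not something verified here), a quick sketch of when plain VRR can track a framerate and when frame repetition (LFC-style doubling) would be needed:

```python
# Assumed PS5 VRR window, taken from the post above (not independently verified).
VRR_MIN, VRR_MAX = 47, 120

def vrr_handling(fps: float) -> str:
    """Describe how a framerate relates to the assumed VRR window."""
    if VRR_MIN <= fps <= VRR_MAX:
        return "inside VRR window, refresh tracks framerate"
    factor, multiple = 1, fps
    while multiple < VRR_MIN:          # try frame doubling/tripling into the window
        factor += 1
        multiple = fps * factor
    if multiple <= VRR_MAX:
        return f"below window; needs {factor}x frame repetition ({multiple:.0f}Hz) to track"
    return "cannot be brought into the VRR window"

for fps in (30, 40, 47, 55, 60, 118):
    print(f"{fps:>3} fps: {vrr_handling(fps)}")
```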
 

DeepEnigma

Gold Member
I don't think it does? I'm not sure, but it seemed to stay at 40fps... at least, if it was over 47 fps it would show up on the framerate counter, but it stays at 118Hz.

It seems the PS5 implementation of VRR is 47-120Hz, so unless the Fidelity mode was over 47fps it wouldn't help.
I may have my wires crossed then, since some seemed to say it did with VRR.
 

xion4360

Member
NXGamer & VG Tech, since no one's TVs show framerates below 60 in 120Hz mode, can you give us the facts on these games with VRR enabled?

They noted in the patch announcement that they increased the dynamic resolution targets... with the removal of V-Sync that's a plus to performance, and with VRR handling any drops below 60 no problem, I say it's totally worth it.
 

RoadHazard

Gold Member
If you play with VRR, you are unlocking the framerate, so I believe the 40 mode does hit 60 and mostly stays in the 50s.

The 60/120 mode reaches into the 80s-90s.

Without VRR, 40 will be locked at 40 and 60 at 60.

It's pretty impressive that the PS5 can run that game at that resolution with such high quality RT at 50-60fps, I've gotta say. Few believed that before launch. But Insomniac are known to be wizards.
 

Tygeezy

Member
It's sort of shocking how close the end-to-end latency is between 40fps and the classic 60fps mode. I don't quite get why this is the case. Seems overrated a bit but maybe I'm not considering it well.
Because you get half the display latency at 120Hz, and you also aren't running into the buffered frames of vsync that cause added latency.
 
40 fps without motion blur is ridiculously smooth. People need to start turning off motion blur in anything above 30 fps.
My brother and I just tested this mode on his 120Hz PC monitor and it's so smooth that jumping from 60fps down to 40fps didn't give me any motion sickness, as opposed to jumping from 60fps down to 30fps.
 