
[NX Gamer] Horizon: Forbidden West - 120Hz VRR Patch 1.17 Patch Analysis - PS5

FUBARx89

Member
Not sure if you're being sarcastic, but have you seen Spider-man on performance mode in 120hz with VRR? Framerate reaches 100fps
Same with the fidelity 40fps mode; it can go up to 50fps...

I'm not being sarcastic at all. I know it's not really space magic; it's changes to the dynamic res scaler.

But my TV shows both the fidelity and 30fps modes going above 60Hz, and like you said, performance was 80-100.

Granted, I haven't played through either of them fully since the patch as I've played them both to death already.
 
This was really more a preview than an analysis from NXGamer. Apart from the framerate, there is only guesswork here, notably about resolution. From the pics it's obvious the resolution of the 40fps mode has been lowered.

I am a bit disappointed about this patch, too. The framerate is capped at 40fps (even with VRR). The only good thing is potentially a higher resolution in the performance mode, but again, no confirmation from NXGamer, only a "maybe".
 

Lysandros

Member
If the 40 FPS mode's resolution is indeed lowered from the 30 FPS one, contrary to the implementation in Ratchet and Clank, this would give a pretty strong argument to the detractors of 40 FPS modes, as it defeats their original purpose (win-win/no concessions) to some degree. Maybe the next analysis, with precise info on resolution/IQ, will shed more light on it.
 

Arioco

Member
The PS5 VRR range is 48-120 Hz. They'd have to have a floor of 48fps instead of 40 for that to work...

Locked 40 with balanced quality is a good option in such a case... and VRR can be used for 60 and 120 modes.

Not really. As long as it's in a 120 fps container the game can use low framerate compensation, and the floor would be 24 fps, so VRR is perfectly possible for a 40 fps game. In fact, both Spider-Man and Ratchet and Clank use VRR in their 40 fps modes to uncap those 40 fps, and it looks amazing despite the games being below 48 at times.
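If it helps, here's a rough sketch of the LFC idea in C++ (the general technique only, not Sony's or any TV vendor's actual implementation, and the function name is made up): when the frame rate drops below the VRR floor, the display just repeats each frame 2x, 3x, etc. until the refresh lands back inside the supported window.

```cpp
#include <cstdio>

// Hypothetical illustration of low framerate compensation (LFC): repeat each
// frame 2x, 3x, ... until the resulting refresh rate fits the VRR window.
bool lfcRefresh(double fps, double vrrMin, double vrrMax, double* outHz) {
    for (int repeats = 1; repeats <= 4; ++repeats) {
        double hz = fps * repeats;
        if (hz >= vrrMin && hz <= vrrMax) { *outHz = hz; return true; }
    }
    return false;  // no multiple fits, so VRR can't track this frame rate
}

int main() {
    double hz;
    if (lfcRefresh(40.0, 48.0, 120.0, &hz)) printf("40 fps -> %.0f Hz refresh\n", hz);  // 80 Hz
    if (lfcRefresh(24.0, 48.0, 120.0, &hz)) printf("24 fps -> %.0f Hz refresh\n", hz);  // 48 Hz, the floor
}
```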
 

Clear

CliffyB's Cock Holster
As you can see here with G-Sync [VRR module from Nvidia used in this case, but most VRR screens act the same way] we see 40 ms, but as soon as you turn on V-Sync it jumps to 59 ms of input delay at the same screen refresh rate of 180 Hz:

In both cases the same fps was maintained:



Care to explain this 47% increase in input delay simply by going from VRR to V-sync? Cheers in advance, matey.


No idea what the monitor is doing, or precisely what these tools are measuring, but I can tell you from actual experience as a programmer what Vsync does. It waits until the display's update has entered the vblank period to present the next frame. It's simply a safeguard to prevent the display buffer from getting over-written midway through its update cycle - a time period that we know the precise duration of, so it can be stated as a fact what the maximum delay has to be.

Side-note: an interesting question crops up when you disable vsync, and you end up with torn frames - basically a mixture of the "new" frame partially covering the "old". Does that count as being "good enough" to quantify as a new frame for the purposes of measuring lag or responsiveness?
Bear in mind that the pixel row at which the new frame data starts is at an entirely arbitrary position down the full height of the display, and hence the thing you are supposed to be reacting to may not in fact be visible yet.

This is the trade-off you make when implementing a soft vsync: you extend the window permitted to start displaying the new frame to a point where a few lines of the old frame may already have been shown. So you get that tearing where the old data is still onscreen, but only for a few rows at the top. It gives the code a bit of extra headroom to finish generating the frame-buffer at the cost of some visual integrity.

Obviously if a hard vsync is implemented, then even the tiniest deviation will force the program to hold off and keep that old frame onscreen. You keep pristine integrity because you can say definitively that each frame displayed is a "whole".
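If it helps to see it in code rather than words, here's a toy timing model of the hard vsync case (just a simulation I'm sketching for illustration, not real graphics API code): a frame that takes ~18 ms on a 60Hz display always misses its vblank, so it gets held to the next one and the game effectively runs at 30 fps.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double refreshMs = 1000.0 / 60.0;  // one refresh period on a 60 Hz display
    const auto start = clock::now();

    for (int frame = 0; frame < 5; ++frame) {
        // stand-in for rendering the frame: ~18 ms, slightly longer than one refresh
        std::this_thread::sleep_for(std::chrono::milliseconds(18));

        double doneMs = std::chrono::duration<double, std::milli>(clock::now() - start).count();
        // "wait for vblank": hold the finished frame until the next refresh boundary
        double presentMs = (static_cast<int>(doneMs / refreshMs) + 1) * refreshMs;
        std::this_thread::sleep_until(start + std::chrono::duration<double, std::milli>(presentMs));

        printf("frame %d ready at %5.1f ms, shown at %5.1f ms\n", frame, doneMs, presentMs);
    }
}
```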

How do these tools discriminate between these cases? Even a human analyzing these frames by eye needs to answer this.

And once again, there's a world of computational work happening between the inputs being read and the result being processed through the game code, passed onto the renderer to draw, and finally generated on-screen and perceived by the user.

What's more, if you switch inputs or turn the monitor off altogether, what happens to latency? That game code is still going to be spitting out display data as before even if you can't see it. Do you imagine it drops to zero? I'm not being facetious by saying this; there are plenty of circumstances in games that expect well-timed input to simultaneously presented visual and audible cues.
 
starting off with fh5

at 65 fps with 99% GPU load, we have 47.6 ms of input lag, which is actually really good; there are worse games with worse behaviours



when frames are capped to 55, indeed we have reduced input lag




god of war

so at 68 frames with max load, we get 50 ms input lag, which is not that bad, but can be better



reflex reduces the framerate, but also drastically reduces the input lag!



finally we have a manual 56 fps cap, which has similar lag to reflex



to my understanding, it's practically doing the capping based on GPU headroom on the fly, which is very useful

as you said however, steady input lag may be preferable. and then again, it may also be hard to attain specific performance levels at times, especially on PC where most games lack proper dynamic resolution scaling. on consoles i guess what you want is very much feasible with dynamic resolution
In my experience NV reflex always keeps me under 99% GPU usage. Quite certain that's all it does haha
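For what it's worth, a manual cap like those 55/56 fps ones is basically just this on the CPU side (a minimal sketch of the general "leave some GPU headroom" idea; I'm not claiming this is how Reflex is implemented internally, and the function names are made up):

```cpp
#include <chrono>
#include <thread>

// Minimal frame limiter sketch: never start the next frame before its time slot,
// so the GPU is fed slightly below what it can sustain and the render queue stays short.
void runCapped(double capFps, int frames) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / capFps));
    auto next = clock::now();
    for (int i = 0; i < frames; ++i) {
        // simulateAndRender() would go here; ~10 ms of pretend work as a stand-in
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
        next += budget;
        std::this_thread::sleep_until(next);  // wait out the rest of the frame budget
    }
}

int main() { runCapped(56.0, 120); }  // e.g. the 56 fps cap from the God of War test above
```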
 

saintjules

Member
This was really more a preview than an analysis from NXGamer. Apart from the framerate, there is only guesswork here, notably about resolution. From the pics it's obvious the resolution of the 40fps mode has been lowered.

I am a bit disappointed about this patch, too. The framerate is capped at 40fps (even with VRR). The only good thing is potentially a higher resolution in the performance mode, but again, no confirmation from NXGamer, only a "maybe".

HDTVTest/DF or bust.
 
1.17 is a great patch that adds welcome features to the game, but I wish the developers had been a bit clearer about how they labelled the various modes, as it is not clear at all which is which on a VRR-capable display such as my LG B9.

Favour Resolution is 4K and a locked 30 fps, so while VRR is enabled in this mode it is restricted to 48-60 Hz and doesn't actually work (I believe), as LFC isn't available because the game isn't running at 120 Hz.
Favour Performance is presumably the same 1800p checkerboard resolution at 60-120 fps, as VRR shows as 48-120 Hz on my TV. This is basically the same as the previous mode except the framerate now runs *above* 60 fps and up to 120 fps (if you look at the sky, presumably?).
Balanced seems to be 4K at a locked 40 fps and, again, VRR shows the range as 48-120 Hz on my TV, but LFC is presumably working because the game is running at 120 Hz?

The descriptions for each of these modes are vague, and I would have preferred it if they actually explained exactly what each one was doing rather than me having to guess or rely on YouTube videos for more information.

P.S. While testing all three modes last night, I actually found the Favour Resolution mode (30 fps) to feel smoother than the Balanced (40 fps) mode!?!
 
Isn't VRR a system-wide thing? I don't understand why they had to do anything to support it. I've never heard of a PC game not supporting VRR.

Normally, yes, it is, but for some reason Sony have done things differently. You CAN force VRR for all *unsupported* games (only PS5 ones, though), but for some bizarre reason it did not work with many of the first-party games such as Horizon: Forbidden West and Gran Turismo 7. Why, I don't know, as it means that instead of being able to use VRR straight away as a system feature, like on PC or Xbox Series X for example, you instead have to rely on a patch from the developer. That may mean the game never supports VRR at all. Has Gran Turismo 7 been patched yet to support VRR? Has Elden Ring (admittedly a third-party release with partial VRR support as long as the framerate stays north of 48 fps) been patched to support VRR at 120 Hz for LFC support?
 
No idea what the monitor is doing, or precisely what these tools are measuring, but I can tell you from actual experience as a programmer what Vsync does. It waits until the display's update has entered the vblank period to present the next frame. It's simply a safeguard to prevent the display buffer from getting over-written midway through its update cycle - a time period that we know the precise duration of, so it can be stated as a fact what the maximum delay has to be.

Side-note: an interesting question crops up when you disable vsync, and you end up with torn frames - basically a mixture of the "new" frame partially covering the "old". Does that count as being "good enough" to quantify as a new frame for the purposes of measuring lag or responsiveness?
Bear in mind that the pixel row at which the new frame data starts is at an entirely arbitrary position down the full height of the display, and hence the thing you are supposed to be reacting to may not in fact be visible yet.

This is the trade-off you make when implementing a soft vsync: you extend the window permitted to start displaying the new frame to a point where a few lines of the old frame may already have been shown. So you get that tearing where the old data is still onscreen, but only for a few rows at the top. It gives the code a bit of extra headroom to finish generating the frame-buffer at the cost of some visual integrity.

Obviously if a hard vsync is implemented, then even the tiniest deviation will force the program to hold off and keep that old frame onscreen. You keep pristine integrity because you can say definitively that each frame displayed is a "whole".

How do these tools discriminate between these cases? Even a human analyzing these frames by eye needs to answer this.

And once again, there's a world of computational work happening between the inputs being read and the result being processed through the game code, passed onto the renderer to draw, and finally generated on-screen and perceived by the user.

What's more, if you switch inputs or turn the monitor off altogether, what happens to latency? That game code is still going to be spitting out display data as before even if you can't see it. Do you imagine it drops to zero? I'm not being facetious by saying this; there are plenty of circumstances in games that expect well-timed input to simultaneously presented visual and audible cues.

I may not understand where you're going with this, but you get no torn frames with VRR [when you keep it in the range]. He has posted a video about his method of measuring the input and nobody has questioned it, not Nvidia nor AMD.

There's data in front of your eyes showing ~50% lower input delay with VRR when compared to V-sync at the exact same framerate, same in-game settings, monitor running at the exact same refresh. In both cases you get no tearing.

A locked 60 with VRR will give you way more responsiveness, no tearing, no stutter, will lower machine temperature [whether it's console or PC] and, even more importantly, will lead to lower noise levels.
 

Swift_Star

Banned
Just because Sony and MS advertised their consoles as being capable of 120fps doesn't mean they have to add it to games that aren't capable of it.
Leave these games at 60fps and leave the 120fps for games like Ori.
Why? These games will run on PS5 Pro or PS6. Isn't it better that the game has a mode that will take advantage of these future machines? Future-proofing and all…
 

Clear

CliffyB's Cock Holster
I may not understand where you're going with this, but you get no torn frames with VRR [when you keep it in the range]. He has posted a video about his method of measuring the input and nobody has questioned it, not Nvidia nor AMD.

Yes, VRR active = no Vsync, because the monitor will adjust its refresh (within limits) to match the incoming signal. As I wrote before, what do you think Vsync means, and exactly what is being synchronized to?

There's data in front of your eyes showing ~50% lower input delay with VRR when compared to V-sync at the exact same framerate, same in-game settings, monitor running at the exact same refresh. In both cases you get no tearing.

Data showing what exactly? The latency between which events? This is absolutely important. Is it the total delay or some element of that, and is that a product of monitor behaviour, driver behaviour, engine behaviour? What?

If that 50% increase is to an element that makes up 20% at worst of the total overall delay between button input and the result manifesting visually, how significant is it actually after all?

This is my point: any vsync must cost less than the time taken for a single refresh cycle, because that's what is being synced to. 16.7ms on a 60Hz display.
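To put actual numbers on that bound (straight arithmetic, nothing more; the 180 Hz figure is just the refresh rate from the screenshots quoted earlier):

```cpp
#include <cstdio>

// The most a vsync wait can add is one refresh period: 1000 ms / refresh rate.
int main() {
    printf(" 60 Hz: %.1f ms per refresh\n", 1000.0 / 60.0);   // ~16.7 ms
    printf("120 Hz: %.1f ms per refresh\n", 1000.0 / 120.0);  // ~8.3 ms
    printf("180 Hz: %.1f ms per refresh\n", 1000.0 / 180.0);  // ~5.6 ms
}
```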


A locked 60 with VRR will give you way more responsiveness, no tearing, no stutter, will lower machine temperature [whether it's console or PC] and, even more importantly, will lead to lower noise levels.

Frame-rate and stuttering are not the same thing. What we think of as "frame-rate" is an average frequency over a sample period; it does not reflect each frame individually. The cadence of delivery being even within the period is what connotes smoothness. VRR mitigates this by adjusting refresh rate dynamically (within ranges) to match, meaning that it doesn't ever need to resync to a vblank.
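A quick made-up example of that distinction (illustrative numbers only, nothing measured):

```cpp
#include <cstdio>
#include <numeric>
#include <vector>

// Two invented frame-time traces with the same average frame rate over the
// sample window but very different cadence: one feels smooth, the other judders.
double avgFps(const std::vector<double>& frameMs) {
    double totalMs = std::accumulate(frameMs.begin(), frameMs.end(), 0.0);
    return 1000.0 * frameMs.size() / totalMs;
}

int main() {
    std::vector<double> even   = {16.7, 16.7, 16.7, 16.7, 16.7, 16.7};
    std::vector<double> uneven = {8.3, 25.0, 8.3, 25.0, 8.3, 25.0};
    printf("even cadence:   %.1f fps average\n", avgFps(even));    // ~60 fps, smooth
    printf("uneven cadence: %.1f fps average\n", avgFps(uneven));  // ~60 fps, stutters
}
```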

I feel like you are conflating a bunch of related but not strictly consequential elements with a limited understanding of what the big picture actually entails.
 
Yes, VRR active = no Vsync, because the monitor will adjust its refresh (within limits) to match the incoming signal. As I wrote before, what do you think Vsync means, and exactly what is being synchronized to?



Data showing what exactly? The latency between which events? This is absolutely important. Is it the total delay or some element of that, and is that a product of monitor behaviour, driver behaviour, engine behaviour? What?

If that 50% increase is to an element that makes up 20% at worst of the total overall delay between button input and the result manifesting visually, how significant is it actually after all?

This is my point: any vsync must cost less than the time taken for a single refresh cycle, because that's what is being synced to. 16.7ms on a 60Hz display.




Frame-rate and stuttering are not the same thing. What we think of as "frame-rate" is an average frequency over a sample period; it does not reflect each frame individually. The cadence of delivery being even within the period is what connotes smoothness. VRR mitigates this by adjusting refresh rate dynamically (within ranges) to match, meaning that it doesn't ever need to resync to a vblank.

I feel like you are conflating a bunch of related but not strictly consequential elements with a limited understanding of what the big picture actually entails.

His testing method shows total system latency [from the input to the pixel displayed on screen], you know, the most relevant one, since again the system used is the same, the monitor, the settings, the refresh, everything is identical between tests, eliminating any discrepancies; it's only that V-sync has ~50% more lag than VRR. Are you dismissing his methodology, or what are you even trying to say here with your spins and needless attempts at education about things that most people in here know or can easily look up?

What's the point in picking parts of my post and suddenly claiming I don't know that "frame-rate and stuttering are not the same thing"? Are you trying to claim V-sync is better than VRR, or downplaying it even when there's data showing it's inferior in many ways? What's the point of your post?
 

Clear

CliffyB's Cock Holster
What's the point in picking parts of my post and suddenly claiming I don't know that "frame-rate and stuttering are not the same thing"? Are you trying to claim V-sync is better than VRR, or downplaying it even when there's data showing it's inferior in many ways? What's the point of your post?

Correcting misinformation by ill-informed amateurs who've never written a line of code in their lives, let alone a render-loop.

MightySquirrel Dude, I have pretty minimal respect for DF, so don't feel too sore about the "amateur" jibe. :D
 

Salz01

Member
I don't know if it's me, but 40 in this game feels slower and less smooth than the 40 in Ratchet and Clank.
 

yamaci17

Member
left one, 60
upper right one is most likely 30 fps
and lower right one is most likely 40 fps (i noticed some artifacts that may be related to CB)
 
Why? These games will run on PS5 Pro or PS6. Isn't it better that the game has a mode that will take advantage of these future machines? Future-proofing and all…
You can have the option in the code to open up the frame rate for future consoles without offering a 120fps mode on the PS5 or XSX.
 