
[Digital Foundry] Uncharted: Legacy of Thieves: New 120Hz VRR Patch Unlocks PS5 GPU Performance

Lunatic_Gamer

Gold Member



A recent patch for Naughty Dog's Uncharted: Legacy of Thieves Collection uses 120Hz displays and variable refresh rate support to offer an excellent 40fps fidelity mode and some very, very fast 1440p performance. With frame-rate caps removed, we get to see exactly what the PS5 GPU can do across a range of different performance modes. Oliver Mackenzie has the full low-down on both official and 'unofficial' system level VRR performance.




 

SlimySnake

Flashless at the Golden Globes
Interesting video, the PS4 can hit the PS5's 4K uncapped mode performance level though rendering 16x fewer pixels ;)

Yeah, that's the classic Jaguar CPU bottleneck. On my GTX 570, if I could run a game at 1080p 30fps, I could easily drop the resolution to 720p and get 60fps. The fact that U4 had to drop to 1/4 the resolution of 1080p is proof that those Jaguar CPUs were holding back the GPUs, and they were probably the reason why even the PS4 Pro and X1X couldn't do 60fps in most 30fps titles.

My GTX 570 was paired with a crappy $90 AMD 4-core CPU from 2010. It's a shame that Sony and MS couldn't even get that CPU into their SoCs. It held back not just that generation but also the first half of this generation.
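A rough way to picture it (a toy model with made-up numbers, not anything measured from a real game): GPU time scales with pixel count, CPU time doesn't, and the frame ships when the slower of the two finishes.

```python
# Toy frame-time model (illustrative numbers only, nothing measured from a real game).
def frame_time_ms(cpu_ms, gpu_ms_at_1080p, pixels, base_pixels=1920 * 1080):
    """A frame is ready when both the CPU and GPU work for it are done."""
    gpu_ms = gpu_ms_at_1080p * (pixels / base_pixels)  # GPU cost roughly tracks pixel count
    return max(cpu_ms, gpu_ms)                         # CPU cost doesn't shrink with resolution

# Pretend the Jaguar-era CPU needs ~28 ms per frame and the GPU ~30 ms at 1080p.
for w, h in [(1920, 1080), (1280, 720), (960, 540)]:
    t = frame_time_ms(cpu_ms=28, gpu_ms_at_1080p=30, pixels=w * h)
    print(f"{w}x{h}: ~{t:.1f} ms -> ~{1000 / t:.0f} fps")
```

Once you hit the CPU line, dropping resolution further buys you nothing, which is why those consoles stayed stuck no matter how far the GPU load was cut.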

VRR is good actually, no matter what's going on on this board. Very happy to see it utilized.
To be fair, before Insomniac started the 40fps and unlocked 40+ and 60+ fps trend, no one was using it for two years on the Xbox Series consoles, which had the feature from day one. I thought 40fps felt great until I played Spider-Man, Ratchet and Uncharted at 40-50fps with VRR.

On Xbox, it was just smoothing over some dropped frames in games that were 60fps 99% of the time. This, on the other hand, is transformative. It's a shame that Forza's native 4K 30fps mode never got an unlocked framerate mode. My 2080 was running it at 40-50fps at Extreme settings, so I'm sure the Xbox Series X could've hit 40-50fps if not more.
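For anyone wondering why 40fps specifically wants a 120Hz output, it's just the refresh math; a quick sketch, nothing game-specific:

```python
# Each game frame should cover a whole number of display refreshes for even pacing.
for fps in (30, 40, 60):
    for hz in (60, 120):
        refreshes = hz / fps  # display refreshes per game frame
        even = refreshes == int(refreshes)
        print(f"{fps} fps on {hz} Hz: {refreshes:g} refreshes per frame -> "
              f"{'even pacing' if even else 'uneven without VRR'}")
```

40fps only lands cleanly in the 120Hz container (exactly three refreshes per frame), which is why these modes are gated behind 120Hz output.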
 
Yeah, that's the classic Jaguar CPU bottleneck. [...] It's a shame that Sony and MS couldn't even get that CPU into their SoCs.
Sony initially wanted to use Bulldozer cores, but AMD talked them out of it.
 

ChiefDada

Gold Member
So I take it that fidelity unlocked with VRR would be the best way to play it now?

It will come down to preference. I love the fidelity mode; it's the best 4K mode and the one that comes closest to 60fps, but the performance mode is god-tier and often pushes 90-100fps. The AA solution works well enough that 4K vs 1440p isn't nearly as noticeable as the jump from 45-50fps to 90-100fps. The game is a looker. It just makes me pissed about TLOU Pt. 1's performance and what it could have and should have been.
 
I thought 40fps felt great until I played Spider-Man, Ratchet and Uncharted at 40-50fps with VRR. [...]
Uncharted is 45-65fps in NX Gamer's tests.
 

8BiTw0LF

Banned
Can you explain the difference?
The OS handles the output resolution; the game handles the rendering resolution.

Your TV will say it's 4K, but it's not.

For consoles this is pretty smart, because they always output the highest available resolution and games can do dynamic resolution without TVs having to be compatible with anything other than standard TV resolutions.
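A minimal sketch of the idea, with made-up numbers and function names rather than anything Sony actually ships: the output resolution is negotiated once with the display, and the game scales its internal resolution underneath it.

```python
# Hypothetical sketch: fixed output resolution, variable internal render resolution.
OUTPUT_RES = (3840, 2160)   # what the TV is told it's getting; never changes mid-game
TARGET_RES = (2560, 1440)   # the game's maximum internal resolution

def pick_render_res(last_gpu_ms, budget_ms=16.6, min_scale=0.7):
    """Shrink the internal resolution when the GPU ran over budget last frame."""
    # GPU cost roughly tracks pixel count (area), so scale each axis by the square root.
    axis_scale = min(1.0, max(min_scale, (budget_ms / last_gpu_ms) ** 0.5))
    w, h = TARGET_RES
    return int(w * axis_scale), int(h * axis_scale)

for gpu_ms in (14.0, 16.6, 20.0, 25.0):
    w, h = pick_render_res(gpu_ms)
    print(f"GPU took {gpu_ms:4.1f} ms -> render {w}x{h}, upscale to {OUTPUT_RES[0]}x{OUTPUT_RES[1]} for the TV")
```

The TV only ever negotiates the fixed 4K signal; the scaling underneath is invisible to it, which is exactly why the set reports 4K regardless of what the game actually renders.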
 

DenchDeckard

Moderated wildly
VRR continues to be this gen's golden goose. I flipping love it.

Well, this seals it: the PlayStation VR is getting packed away this weekend and my PS5 will finally be enjoying its VRR goodness. Looking forward to firing up Lost Legacy.
 

01011001

Banned
I thought the PS5 couldn't output at 1440p?

internal resolution =/= output resolution.
It's kinda sad that this has to be explained on a "gaming" forum tbh.

Also, it does support 1440p, just not VRR at 1440p, because Sony's shit TVs don't support 1440p VRR, so Sony won't let the PS5 support it either in order to not make their TVs look bad by lacking a feature the PS5 has.
 

Naru

Member
Interesting, is that table correct? The NX Gamer video said the 120-fps performance mode would render at 1080p and they also showed images that clearly were blurrier than the other modes. DF says it's 1440p.
 

Thief1987

Member
Interesting, is that table correct? The NX Gamer video said the 120-fps performance mode would render at 1080p and they also showed images that clearly were blurrier than the other modes. DF says it's 1440p.
Performance+ is 1080p; it's just DF being DF.
 

01011001

Banned
Interesting, is that table correct? The NX Gamer video said the 120-fps performance mode would render at 1080p and they also showed images that clearly were blurrier than the other modes. DF says it's 1440p.


DF doesn't say it's 1440p in the "Performance+" mode; they say it's 1440p in the unlocked VRR "Performance" mode, which is the 60fps mode without the framerate lock.



but here are VG Tech's non-VRR tests:

PS5 in the 60fps Performance Mode renders at a native resolution of 2560x1440.
PS5 in the 120fps Performance+ Mode renders at a native resolution of 1920x1080.
PS5 in the 30fps Fidelity Mode renders at a native resolution of 3840x2160.
 

01011001

Banned
I asked because the picture in the OP literally says exactly that.

Typo, most likely; the earlier mode list I screencapped in the video shows Performance+ as 1080p120, and since Performance+ isn't in any way affected by this patch (which he also says in the video), the second pic was clearly a typo.
 

Vick

Gold Member
Interesting video, the PS4 can hit the PS5's 4K uncapped mode performance level though rendering 16x fewer pixels ;)

Eh, the PS4 Pro version is at times more demanding than the same 1440p Performance mode on PS5. It wouldn't have been downgraded otherwise.

And after Part I it became obvious why they had to tone down the flashlight GI in this collection, which sucks, because personally, fuck 120fps. Just get a panel with better motion resolution than the jokes currently on the market. 300 resolved lines on 4K panels, give me a fucking break.

It just makes me pissed about TLOU Pt. 1 performance and what it could have and should have been.
Yeah, by downgrading the game's tech/visuals. No thanks. Uncharted 4 literally has some PS3-era direct shadows in this collection when they are like two generations above that in Part I.
 
I am not that impressed by the performance of these games on PS5. The PS4 games were already 1080p 30fps, so using the 4K 40fps mode the PS5 only outputs about 5.2x more than the PS4. That's really low compared to other exclusives like Death Stranding (8x, from 1080p 30fps to 4K 60fps). Another example: Spider-Man has a similar resolution/framerate improvement in both modes but adds RT reflections!

Like with Horizon, we can clearly see modest improvements compared to the PS4/Pro games. These games are clearly not optimized for PS5 the way Death Stranding or Spider-Man are, which makes sense, as they are very likely PC versions ported to PS5 and released on that console first in order to milk Sony payers.
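Those multipliers are just pixels-per-second ratios against the PS4's 1080p30, ignoring dynamic res and VRR; a quick sanity check of the arithmetic, using only the figures quoted above:

```python
# Raw pixel-throughput multiplier vs the PS4 baseline: (width * height * fps) ratio.
def pps(w, h, fps):
    return w * h * fps

ps4_base = pps(1920, 1080, 30)
modes = {
    "Legacy of Thieves, Fidelity 4K 40fps": pps(3840, 2160, 40),
    "Death Stranding DC, 4K 60fps":         pps(3840, 2160, 60),
}
for name, value in modes.items():
    print(f"{name}: {value / ps4_base:.1f}x the PS4 1080p30 output")
```

By raw pixel count the fidelity mode works out to roughly 5.3x rather than 5.2x, but the point stands: it's well short of the 8x implied by Death Stranding's jump.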
 

Synless

Member
Didn't Uncharted 4 see numerous downgrades compared to its PS4 counterpart? I swear there is a thread with thousands of examples of all the downgrades.

My point: I'm not overly impressed that they got some extra performance considering the downgrades.
 

AMSCD

Member
it's kinda sad that this has to be explained on a "gaming" forum tbh. [...]
Sorry to have disappointed you.
 



A recent patch for Naughty Dog's Uncharted: Legacy of Thieves Collection uses 120Hz displays and variable refresh rate support to offer an excellent 40fps fidelity mode and some very, very fast 1440p performance. With frame-rate caps removed, we get to see exactly what the PS5 GPU can do across a range of different performance modes. [...]

Assuming they are right that the fidelity mode was arbitrarily capped at 60, can we please let Naughty Dog know to uncap it up to 120, for ultimate future-proofing as well as maximizing performance?
 
Typo, most likely; the earlier mode list I screencapped in the video shows Performance+ as 1080p120, and since Performance+ isn't in any way affected by this patch (which he also says in the video), the second pic was clearly a typo.
Did they at least remove V-Sync from the Performance+ mode, like the other modes, when VRR is engaged?
 
I am not that impressed by the performance of these games on PS5. [...] That's really low compared to other exclusives like Death Stranding (8x, from 1080p 30fps to 4K 60fps). [...]
Death Stranding could go even higher if they added VRR support to uncap the framerate.
 
Never heard that one before. Do you have a reliable source for that?
Take this with an extreme grain of salt, but...

https://www.stuff.tv/news/sony-play...re-bulldozer-cpu-and-new-controller-revealed/

https://forums.anandtech.com/threads/amd-bulldozer-in-ps4-rumor-sufaces.2177927/

https://www.quora.com/Why-did-Xbox-...es-rather-than-offer-four-more-powerful-cores

It's unfortunate for many reasons that AMD ended up supplying Jaguar-based CPU cores for the One and PS4. I know things like power consumption and heat are important, but with the cooling solutions being used this gen, it would have been nice if Sony and MS had had that mindset at the start of last gen.

Think how much better that past gen would have been if everyone involved hadn't cheaped out and had potentially gone with separate CPU and GPU dies.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
NX Gamer has the most accurate VRR results cause he bought a VRR capture card
What's a VRR capture card?

It's just a capture card; once captured, you transfer the footage to a PC to do the framerate analysis.

A capture card will capture whatever you throw at it.
It doesn't actually care whether the framerate changes mid-capture or whatever.
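For anyone curious, the usual trick for turning fixed-rate capture footage into an fps graph is just spotting repeated frames; something like this rough sketch (not DF's or NXG's actual tooling):

```python
import numpy as np

def estimate_fps(frames, capture_hz=60, threshold=1.0):
    """Estimate game fps from fixed-rate capture by counting frames that aren't repeats."""
    new_frames = 1  # the first captured frame is always new
    for prev, cur in zip(frames, frames[1:]):
        diff = np.mean(np.abs(cur.astype(np.int16) - prev.astype(np.int16)))
        if diff > threshold:  # changed enough -> the game presented a new frame
            new_frames += 1
    return new_frames / (len(frames) / capture_hz)

# Fake one second of 60 Hz capture of a 30 fps game: every frame is shown twice.
rng = np.random.default_rng(0)
unique = [rng.integers(0, 255, (72, 128), dtype=np.uint8) for _ in range(30)]
capture = [f for f in unique for _ in range(2)]
print(f"Estimated game fps: {estimate_fps(capture):.0f}")  # ~30
```

The VRR wrinkle is that the capture runs at a fixed rate while the console's output cadence doesn't, so you either need a card that accepts a VRR signal or you fall back on the TV's own readout.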
 

Mr Moose

Member
What's a VRR capture card?

A capture card will capture whatever you throw at it. [...]
VRR passthrough. My Razer capture card doesn't support it (or HDR) so if I want to play with VRR on I have to plug the PS5 directly into the TV.
 

Calverz

Member
I've still to play Uncharted: Lost Legacy, but I'm not paying £10 to unlock a patch, so I'll play it at 30fps 720p or whatever it is lol
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
VRR passthrough. My Razer capture card doesn't support it (or HDR) so if I want to play with VRR on I have to plug the PS5 directly into the TV.
Ahhh, I got you, I got you.

But what's that got to do with having a more accurate fps count?
Are we assuming DF's captures are less accurate than NXG's because... because why again?
 

ChiefDada

Gold Member
Yeah, by downgrading the game's tech/visuals. No thanks. Uncharted 4 literally has some PS3-era direct shadows in this collection when they are like two generations above that in Part I.

Still doesn't excuse TLOU Pt. 1's performance. Having a VRR fidelity mode that consistently tanks well into the 30s is unacceptable. Optimization rarely requires a reduction in perceptible fidelity. Look at the support GG has given Forbidden West post-launch.
 

Mr Moose

Member
Are we assuming DF's captures are less accurate than NXG's because... because why again?
The guy who recorded this video had to use his TV's fps reading for the VRR modes, but it lines up with other tests.
I noticed something off with NXG's fps line on one of his recent videos, so I wouldn't say either is more accurate at the moment.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The guy who recorded this video had to use his TV's fps reading for the VRR modes, but it lines up with other tests. [...]
This is the newer guy, yeah?
I guess he doesn't have the OG members' capture card.
They should really give the whole team high-powered capture cards if they're gonna let team members do pieces on games that have VRR and/or go to 120.

The 4K60 Pro 2 or HD60 X aren't even that expensive for a channel like DF; they're about 250 dollars.
I'm sure this very video made that in the first hour.
 