
IGNxgame: Uncharted: Legacy of Thieves Collection PC vs. PS5 Performance Review

rofif

Banned
It's still there, of course; the radius was just cut in half. There are some examples in that infamous thread.
I blame the 120fps mode; as seen with TLOU Part I unlocked, this tech is extremely taxing.
One other thing I've noticed in the Madagascar level: it's much harder to drive up some slippery surfaces in the 120 Hz Performance+ mode.
 

yamaci17

Member
It's not just that he is using an old CPU.
In previous tests we could see his 2700X was underperforming a lot. The difference compared to other users was around 30%.
That makes his 2700X slower than a 1700X and closer in performance to a Bulldozer.
Yes, a stock 2700X clocks around 4-4.1 GHz.
Most 2700X owners pair it with a decent 3000 CL15 / 3200 CL16 / 3200 CL14 kit.
Considering he has some kind of janky 2800 MHz RAM, I question the stability of said kit. I wouldn't be surprised if he did a whack job of overclocking a 2666/2400 MHz kit to 2800 with completely random auto timings.
TBH, Zen/Zen+ users are a super niche in the gaming community. Most moved on to Zen 2; others stayed with their Intel CPUs. A simple 10th-gen i5 runs circles around the "2700X" while being cheaper, more available, and giving fewer headaches.
 
A 2700 at 3.8 GHz with some janky 2800 MHz memory is no match for the console CPUs.

Two-CCX design, enormous cross-CCX latency (~140 ns). That kind of latency is practically non-existent on consoles thanks to the specific core scheduling they use there. On PC you get the full penalty if the game doesn't account for it (and devs have no incentive to care: the majority of PC gamers have Intel or newer-gen AMD CPUs. Zen/Zen+ CPUs never saw wide adoption in the PC gaming space because of how obnoxious their performance was back in 2017. The top-dog 2700X paired with 3200 MHz RAM consistently lost to a puny 6-core/6-thread i5 from 2017 running 2666 MHz RAM. Zen/Zen+ CPUs were pathetic even then, and they aged horribly into this era. You will still get good enough frame rates, but the 1% lows will suffer, so you really need to be a patient person or simply have lower standards.)

The only way to avoid cross-CCX latency on PC is to not have a dual-CCX CPU. They were, to me, experimental designs that let AMD get higher yields on first- and second-gen Ryzen. Intel's consumer CPUs are fully monolithic and not subject to such weird latencies, and the same goes for Zen 3 and onward (unless you buy 16+ core parts and run into similar cross-CCD latency issues).
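For what it's worth, you can approximate that console-style scheduling on Windows by pinning the game to a single CCX so its threads never take the cross-CCX hop. A minimal, hypothetical sketch; it assumes logical processors 0-7 map to CCX0 on a 2700X, which you'd want to verify against the real topology (e.g. via GetLogicalProcessorInformationEx):

```cpp
#include <windows.h>
#include <cstdio>

int main()
{
    // Assumed mapping: logical processors 0-7 = CCX0 on a 2700X
    // (4 cores / 8 SMT threads per CCX). Verify before relying on it.
    const DWORD_PTR ccx0Mask = 0xFF;

    if (!SetProcessAffinityMask(GetCurrentProcess(), ccx0Mask))
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
    else
        std::printf("Pinned to CCX0; threads no longer pay the cross-CCX penalty.\n");

    // ... launch or continue the latency-sensitive workload here ...
    return 0;
}
```

The trade-off is obvious: you give up half the CPU, which only pays off in games that are latency-bound rather than thread-bound.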

15-20% lower IPC compared to Zen 2 (it varies from game to game: some games only 10%, some up to 25%. F1 2017: 17%, Watch Dogs Legion: 15%, Rainbow Six: 21%, etc.)


So the PS5's Zen 2 CPU running at 3.6 GHz can most likely match a Zen+ CPU running at 4.3 GHz (3.6 GHz × ~1.2 to offset the IPC deficit ≈ 4.3 GHz), and he runs his Zen+ CPU at only 3.8 GHz.

Combine these facts with consoles having lower overhead for CPU-bound workloads, and his CPU has no business "matching" a PS5.

The problematic part, as I said, is that Zen/Zen+ CPUs are POS products that most gamers ignored. Even a cheap, lowly i5-10400F trounces these CPUs.

That's nice in theory, but we actually have two benchmarks of the PS5 CPU showing it performs around a 1700X or a higher-clocked 2700. That's because it has only 4 MB of L3 cache per CCX, and we've seen with AMD's 3D V-Cache CPUs how important cache is for gaming. The PS5 (and Xbox Series) technically have a weak, mobile-class Zen 2 CPU.
 

rofif

Banned
To stay off, gladly.
Unless you have 240 Hz or more... then I can agree motion blur is redundant.
But below that it is genius, especially at a lowly 30 or 60 fps.
It really helps flesh out animations and the feeling of speed.
And it helps cover big frame gaps, in a sense.
If you only have 30 or 60 fps and you dive fast, make fast movements, or look around quickly, motion blur helps greatly to trick the brain into thinking there is in fact enough frame information to convey the full motion.
I don't like how stiff things can look without it.
Of course, I am talking about per-object/per-pixel motion blur. Not the cheap crap!
 
How do 10 or 12 GB hold them back?
My thoughts as well. I'd love to see how Returnal is going to run on PC next year. If properly optimized, something like a then-released 4050 or 4050 Ti should match a PS5.

I'm not sure Sony would love entry-level cards matching or outpacing their console that early in the console's life cycle, especially as the years go by and the gap vs. PC hardware grows larger. They will find a way to make stuff like Returnal struggle on a 4090.
Those cards should only match or beat the PS5 in RT, not in rasterization. I'd say that's more 4060 level.
 
A 2700 at 3.8 GHz with some janky 2800 MHz memory is no match for the console CPUs.

Two-CCX design, enormous cross-CCX latency (~140 ns). That kind of latency is practically non-existent on consoles thanks to the specific core scheduling they use there. On PC you get the full penalty if the game doesn't account for it (and devs have no incentive to care: the majority of PC gamers have Intel or newer-gen AMD CPUs. Zen/Zen+ CPUs never saw wide adoption in the PC gaming space because of how obnoxious their performance was back in 2017. The top-dog 2700X paired with 3200 MHz RAM consistently lost to a puny 6-core/6-thread i5 from 2017 running 2666 MHz RAM. Zen/Zen+ CPUs were pathetic even then, and they aged horribly into this era. You will still get good enough frame rates, but the 1% lows will suffer, so you really need to be a patient person or simply have lower standards.)

The only way to avoid cross-CCX latency on PC is to not have a dual-CCX CPU. They were, to me, experimental designs that let AMD get higher yields on first- and second-gen Ryzen. Intel's consumer CPUs are fully monolithic and not subject to such weird latencies, and the same goes for Zen 3 and onward (unless you buy 16+ core parts and run into similar cross-CCD latency issues).

15-20% lower IPC compared to Zen 2 (it varies from game to game: some games only 10%, some up to 25%. F1 2017: 17%, Watch Dogs Legion: 15%, Rainbow Six: 21%, etc.)


So the PS5's Zen 2 CPU running at 3.6 GHz can most likely match a Zen+ CPU running at 4.3 GHz (3.6 GHz × ~1.2 to offset the IPC deficit ≈ 4.3 GHz), and he runs his Zen+ CPU at only 3.8 GHz.

Combine these facts with consoles having lower overhead for CPU-bound workloads, and his CPU has no business "matching" a PS5.

The problematic part, as I said, is that Zen/Zen+ CPUs are POS products that most gamers ignored. Even a cheap, lowly i5-10400F trounces these CPUs.

Still a travesty that the consoles used Zen 2 instead of Zen 3. Hopefully the Pro models save us.
 

ACESHIGH

Banned
To add insult to injury, this stellar port requires AVX2 instructions, so capable CPUs like Sandy Bridge or Ivy Bridge can't run this PS4 game. Imagine having a 3770K or a 3930K and not being able to run a game made for 2013's 1.6 GHz tablet cores.

Scalability is one of the key factors behind a good PC port; I guess Iron Galaxy and Naughty Dog forgot about that.
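For reference, detecting AVX2 at startup so the game can fail gracefully instead of crashing is cheap. A minimal sketch using MSVC's CPUID intrinsics (purely illustrative, not what this port actually does; a production check would also confirm OS support for AVX state via XGETBV):

```cpp
#include <intrin.h>
#include <cstdio>

// CPUID leaf 7, sub-leaf 0, EBX bit 5 reports AVX2.
static bool HasAVX2()
{
    int regs[4] = {};
    __cpuid(regs, 0);                  // regs[0] = highest supported leaf
    if (regs[0] < 7)
        return false;                  // very old CPUs stop before leaf 7
    __cpuidex(regs, 7, 0);             // query leaf 7, sub-leaf 0
    return (regs[1] & (1 << 5)) != 0;  // EBX bit 5 = AVX2
}

int main()
{
    std::printf(HasAVX2()
        ? "AVX2 present, safe to run.\n"
        : "AVX2 missing: show a friendly message instead of crashing.\n");
}
```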
 

hollams

Gold Member
The game looks and runs great and it has HDR. It does feel odd playing the game with an Xbox Controller.
 
You are wrong. Motion blur MUST be included in games. It simply must, to make them look more realistic.
I like this example of why a low shutter speed/motion blur looks closer to reality:
They talk to you about what they think they see with their eyes, and you respond with an example made with a camera (it's a good example, though).

Now, the eyes don't resolve a lot of detail if you turn your head around fast either; we also barely resolve any detail outside our focus area, and if we look at a very bright light, its image may stay stuck there for a little while.

I don't think the camera in a game should strive to simulate all of these things, not when you're trying to spot baddies while turning the camera in the middle of a shootout.
 

winjer

Gold Member
To add insult to injury, this stellar port requires AVX2 instructions, so capable CPUs like Sandy Bridge or Ivy Bridge can't run this PS4 game. Imagine having a 3770K or a 3930K and not being able to run a game made for 2013's 1.6 GHz tablet cores.

Scalability is one of the key factors behind a good PC port; I guess Iron Galaxy and Naughty Dog forgot about that.

Those are CPUs from 11 years ago, from even before the launch of the PS4 and Xbox One.
No company has to maintain compatibility with hardware that old, not with modern games.
 

ACESHIGH

Banned
Those are CPUs from 11 years ago, from even before the launch of the PS4 and Xbox One.
No company has to maintain compatibility with hardware that old, not with modern games.
C'mon mate, don't jump on the developer defense force bandwagon. I'm not asking for compatibility with an AMD K6...

99% of 8th-gen games work just fine, even great, on those CPUs. And issues like these were always patched down the line to allow broader compatibility. Just look at how even friggin' Windows 11 was patched to let a wider range of users install it.

I'm afraid many gamers' wallets won't be compatible with this game either. Their loss.
 

winjer

Gold Member
C'mon mate, don't jump on the developer defense force bandwagon. I'm not asking for compatibility with an AMD K6...

99% of 8th-gen games work just fine, even great, on those CPUs. And issues like these were always patched down the line to allow broader compatibility. Just look at how even friggin' Windows 11 was patched to let a wider range of users install it.

I'm afraid many gamers' wallets won't be compatible with this game either. Their loss.

It's not a matter of defending a developer or not.
Expecting 10-year-old hardware to run modern games is a silly prospect.
That has never happened in PC gaming, and you can't expect devs to stop advancing with tech just because some people are stuck in 2011.
 

rofif

Banned
Seems I was full of shit, basing my opinion on review copies...
the launch version DOES HAVE HDR!!!
And it does have motion blur... but the motion blur is a bit broken.



It's OK, but this is what I dislike about some PC games... every option must have low-to-ultra settings despite not doing anything but costing performance... Why even have these options? Include only the ones that matter.
Most of the options look exactly the same... so how is the player supposed to know that setting something to ultra is a waste of performance without watching DF?
Like... shadows: low, medium, high, very high, ultra... and only low looks different. But shadows WHAT?! Distance? Resolution?
Settings for the sake of settings are crap.
 

sachos

Member
Seems I was full of shit, basing my opinion on review copies...
the launch version DOES HAVE HDR!!!
And it does have motion blur... but the motion blur is a bit broken.



It's OK, but this is what I dislike about some PC games... every option must have low-to-ultra settings despite not doing anything but costing performance... Why even have these options? Include only the ones that matter.
Most of the options look exactly the same... so how is the player supposed to know that setting something to ultra is a waste of performance without watching DF?
Like... shadows: low, medium, high, very high, ultra... and only low looks different. But shadows WHAT?! Distance? Resolution?
Settings for the sake of settings are crap.

Quite a disappointing video by Alex, really. He puts PC vs. PS5 in the title but doesn't compare the 4K or the unlocked-FPS modes. It would have been a great way to test the GPU power of the PS5, adding another data point, you know? I don't like that they've stopped doing those kinds of experiments lately.
 

01011001

Banned
Quite a disappointing video by Alex, really. He puts PC vs. PS5 in the title but doesn't compare the 4K or the unlocked-FPS modes. It would have been a great way to test the GPU power of the PS5, adding another data point, you know? I don't like that they've stopped doing those kinds of experiments lately.

He is apparently sick and had to redo the whole video because the pre-release code was garbage... so I don't hold it against him that this video is kind of phoned in... and the PC version isn't especially interesting either, tbh.
 
Seems like a below-average port so far.
It's disappointing how Sony's ports keep having technical issues.
They should give PC players more value for their money through high-quality ports.
 
Quite a disappointing video by Alex, really. He puts PC vs. PS5 in the title but doesn't compare the 4K or the unlocked-FPS modes. It would have been a great way to test the GPU power of the PS5, adding another data point, you know? I don't like that they've stopped doing those kinds of experiments lately.
It was expected, as they did the same thing with Spider-Man by refusing to compare PC against PS5 uncapped (when they could). The comparison is too favorable for the PS5, so they decreed the two couldn't be compared (say the guys who use a 5 GHz high-end PC CPU to make 'fair' GPU comparisons against the PS5 in other games :messenger_dizzy:).
 

YCoCg

Member
It needs exclusive fullscreen support. It's crazy that most Sony titles DON'T support it; so far only Spider-Man does (and even then you need to set it through the launcher first, as it won't show up in the launched game otherwise).
 
The actual reason is that DF never bothered to buy a VRR capture card.
No, the actual reason is that there is no point doing a PC vs. PS5 performance benchmark when the settings can't be matched. It's as simple as that. Check the Watch Dogs Legion video, where they found the EXACT settings via .ini files. That is a more accurate benchmark.
 

daninthemix

Member
It needs exclusive fullscreen support. It's crazy that most Sony titles DON'T support it; so far only Spider-Man does (and even then you need to set it through the launcher first, as it won't show up in the launched game otherwise).
Nope. This thinking is straight out of 2014.

The modern, best way is to use DX12 flip-model presentation for the lowest latency and best performance.
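For illustration, flip model isn't a separate fullscreen mode; it's just how the swap chain is created. A minimal, hypothetical D3D12 sketch (the DXGI factory, command queue, and window handle are assumed to already exist):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>

IDXGISwapChain1* CreateFlipModelSwapChain(IDXGIFactory2* factory,
                                          ID3D12CommandQueue* queue,
                                          HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;  // flip model disallows multisampled swap chains
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 2;  // flip model requires at least two buffers
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // the "flip model" part
    // Width/Height left at 0 = size to the window's client area.

    IDXGISwapChain1* swapChain = nullptr;
    // Note: for D3D12 the first argument is the command queue, not the device.
    factory->CreateSwapChainForHwnd(queue, hwnd, &desc, nullptr, nullptr, &swapChain);
    return swapChain;
}
```

With this path a borderless window can get promoted to independent flip by the compositor, which is where the low latency comes from without ever touching exclusive fullscreen.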
 

daninthemix

Member
It was expected, as they did the same thing with Spider-Man by refusing to compare PC against PS5 uncapped (when they could). The comparison is too favorable for the PS5, so they decreed the two couldn't be compared (say the guys who use a 5 GHz high-end PC CPU to make 'fair' GPU comparisons against the PS5 in other games :messenger_dizzy:).
Are you okay, friend?
 

YCoCg

Member
Nope. This thinking is straight out of 2014.
And yet it's the only way to use additional features from Nvidia and AMD. With borderless fullscreen you lose the ability to render games at a higher resolution than the desktop, and you lose the ability to downsample. Considering DLSS and FSR are mainstays of these ports, not being able to push these games beyond your display's resolution is just a stupid move. That's why Spider-Man looks the best so far: I'm rendering it at a resolution beyond my screen thanks to DLSS and then using Nvidia's DLDSR to supersample it down to native, and the game looks clean.
 

MMaRsu

Banned
It's not a matter of defending a developer or not.
Expecting 10-year-old hardware to run modern games is a silly prospect.
That has never happened in PC gaming, and you can't expect devs to stop advancing with tech just because some people are stuck in 2011.
This was the same BS I kept reading when I had a CPU missing a newer SSE instruction set: a bunch of games came out that wouldn't work on my Phenom II X6, not because of performance but because of the missing instructions.

Most, if not all, games that required that instruction set were patched, and they ran without any issues on that CPU. But people on the Steam and EA forums would say "nah, upgrade your CPU, it's ancient", blah blah blah.
 

winjer

Gold Member
This was the same BS I kept reading when I had a CPU missing a newer SSE instruction set: a bunch of games came out that wouldn't work on my Phenom II X6, not because of performance but because of the missing instructions.

Most, if not all, games that required that instruction set were patched, and they ran without any issues on that CPU. But people on the Steam and EA forums would say "nah, upgrade your CPU, it's ancient", blah blah blah.

The BS is expecting devs to support, today, hardware from the age of the PS3.
 
No, the actual reason is that there is no point doing a PC vs. PS5 performance benchmark when the settings can't be matched. It's as simple as that. Check the Watch Dogs Legion video, where they found the EXACT settings via .ini files. That is a more accurate benchmark.
Use settings a notch lower than the PS5's to cover discrepancies, like NX Gamer did (for example, if the PS5 is running a setting between high and ultra, NX Gamer will use high in the benchmark). This can show us a bare-minimum difference.
 

Stuart360

Member
It's not that bad of a port, to be honest. It's very heavy in terms of requirements, but I'm up to the pirate island part of U4, which I assume is near the end of the game, and I've had a near-locked 60 fps at max settings at 1080p on my 3700X / 32 GB RAM / GTX 1080 Ti PC. For what is essentially an upgraded PS4 game, I would have expected 1440p at least on my system, but at least I could max the game out. And all the Sony ports so far have needed much better PC parts than you would have expected.
I did have the odd loading stutter, but waiting for the shaders to compile eliminates most of it.
 
It's not that bad of a port, to be honest. It's very heavy in terms of requirements, but I'm up to the pirate island part of U4, which I assume is near the end of the game, and I've had a near-locked 60 fps at max settings at 1080p on my 3700X / 32 GB RAM / GTX 1080 Ti PC. For what is essentially an upgraded PS4 game, I would have expected 1440p at least on my system, but at least I could max the game out. And all the Sony ports so far have needed much better PC parts than you would have expected.
I did have the odd loading stutter, but waiting for the shaders to compile eliminates most of it.
I'm happy you're satisfied with it. Do you think you over-budgeted on the RAM, considering the rest of your system? Especially since you mentioned you don't care about RT.
 

Md Ray

Member
There is something seriously not right with the CPU performance in the PC version. My desktop Zen 2 CPU (Ryzen 7 3700X) falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's down to the hardware, the driver, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" option because... well, "Enhanced" causes even more frame-rate drops and hits the CPU harder.

[Attachments: eight pairs of captures comparing the PC version's settings/frame-rate screenshots against PS5 video frames in matching scenes.]
SlimySnake yamaci17
 
Use settings a notch lower than the PS5's to cover discrepancies, like NX Gamer did (for example, if the PS5 is running a setting between high and ultra, NX Gamer will use high in the benchmark). This can show us a bare-minimum difference.
That still leaves things up to guesswork, which defeats the purpose of a benchmark. And this is where NXG falls short, because his 2700X system seems to consistently bench lower than similarly equipped systems. That should be his starting point. His haphazard testing methodology is why most people take his videos with a grain of salt.

An absolutely like-for-like testing environment and consistency are key for benchmarking. Did you check out the Watch Dogs Legion DF comparison I mentioned earlier? What are your thoughts on it?
 
There is something seriously not right with the CPU performance in the PC version. My desktop Zen 2 CPU (Ryzen 7 3700X) falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's down to the hardware, the driver, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" option because... well, "Enhanced" causes even more frame-rate drops and hits the CPU harder.
Check how each CPU core is loaded during those scenes. I wouldn't be surprised if one or two cores are 100 percent utilized while the others aren't adequately used. DF noted this issue as the explanation for the notably long loading times.

Clearly not a lot of work has gone into porting this PS5 engine to PC. A rather poor level of effort by Iron Galaxy.
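If anyone wants to sanity-check the per-core loading themselves, Windows' PDH counters make it easy to log. A rough, illustrative sketch (error handling omitted for brevity):

```cpp
#include <windows.h>
#include <pdh.h>
#include <cstdio>
#include <cstdlib>
#include <cwchar>
#pragma comment(lib, "pdh.lib")

int main()
{
    PDH_HQUERY query = nullptr;
    PDH_HCOUNTER counter = nullptr;
    PdhOpenQueryW(nullptr, 0, &query);
    // The (*) wildcard returns one instance per logical processor, plus "_Total".
    PdhAddEnglishCounterW(query, L"\\Processor(*)\\% Processor Time", 0, &counter);

    PdhCollectQueryData(query);   // rate counters need two samples...
    Sleep(1000);
    PdhCollectQueryData(query);   // ...taken one second apart here

    DWORD bytes = 0, count = 0;
    PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE, &bytes, &count, nullptr);
    auto* items = static_cast<PDH_FMT_COUNTERVALUE_ITEM_W*>(std::malloc(bytes));
    PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE, &bytes, &count, items);

    // One or two cores near 100% while the rest idle = poor thread scaling.
    for (DWORD i = 0; i < count; ++i)
        std::wprintf(L"CPU %s: %5.1f%%\n", items[i].szName, items[i].FmtValue.doubleValue);

    std::free(items);
    PdhCloseQuery(query);
}
```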
 