
IGNxgame: Uncharted: Legacy of Thieves Collection PC vs. PS5 Performance Review

It's a bona fide garbage-bin port, probably the worst they've released yet. A 3080 struggles to hit 60fps in 4K while the PS5's new unlocked Fidelity mode runs at 45-50fps in 4K. A 3080 is comfortably 3-4x faster than a PS5. Even the 4090 (more than 10x a PS5) barely crests 100fps in 4K.
The 3080 in rasterization is usually 50-70% better than the PS5, not 2x, and definitely nowhere near 4x (the gap isn't that big even with ray tracing) - that's 4090 levels. You also ignored that VRAM holds the 3080 back.
 

SmokedMeat

Gamer™
There is nothing wrong with using the 2700 as long as you point out that the 2700 might be holding back the 2070 Super. You also can't use a different CPU to compare the performance of another GPU. The CPUs HAVE to be identical. Hell, the entire system needs to be identical.

If he wants to show the PS5 GPU is being held back by a 2700-caliber CPU then fine, say that. But he also needs to show performance metrics for GPUs that were not impacted by that CPU.

His comparison between the two platforms is meaningless. An old CPU is absolutely going to hold back performance.

No PC gamer uses this as a source of information.
 

JeloSWE

Member
You are wrong. Motion blur MUST be included in games. It simply must, to make them look more realistic.
I like this example of why a low shutter speed/motion blur looks closer to reality:


Wave your hand in front of your face fast. It will be blurred. The same goes for fast motion of your surroundings.
I understand not every motion blur implementation is good, but ND's blur is exceptionally good. Camera motion blur is just as needed too... and you can just use the slider in the console version to modify it.

I don't mind object motion blur; actually, it's rather nice. Camera motion blur, on the other hand, is a big NO for me. It's nauseating and terrible, and it just gets worse the lower the frame rate goes. Which is ironic, given that it's often implemented in 30fps titles to compensate for the low frame rate, yet it just makes the environment horribly indecipherable while moving the camera around.
 

ACESHIGH

Banned
The 3080 in rasterization is usually 50-70% better than the PS5, not 2x, and definitely nowhere near 4x (the gap isn't that big even with ray tracing) - that's 4090 levels. You also ignored that VRAM holds the 3080 back.

The RTX 3080's VRAM buffer is a scam. Can't believe we got 8GB VRAM mainstream cards back in 2016. AMD higher-ups must have been drunk back then.
 

SmokedMeat

Gamer™
We are gauging a PC GPU vs. a console GPU in an equivalent scenario and environment (so the same settings and resolution, as well as framerate target). The only part that should be different in the benches is, of course, the GPU. No one does GPU benchmarks on PC when all the other parts of the PC are different.

No, the video is titled PC Performance. PC vs PS5.
If you’re going to use old PC hardware, then don’t give it a misleading title.

I’ll wait for PC centric sources.
 

Reallink

Member
Nah, the 3080 is only 2x faster at best in standard rasterization. The PS5 is basically a 2080 in AMD-supported games, and a 3080 is roughly 65-80% better depending on the game. Ray tracing performance is obviously another story, but this game doesn't support it. So I'd expect 1.2-2x better performance at best.

We are not seeing that here, but we don't know the PS5-equivalent preset settings in this game. It's just a guessing game.

It's possible the game is unoptimized like GOW and holding back the GPUs, but I wouldn't expect 3-4x more performance from a 3080.

These are PS4 games, made for decade-old 1.8TF hardware. The PS5 is basically an RX 5700 XT, with somewhere between 9 and 10 TFLOPS of performance depending on CPU load. 3080s boost full-time to around 2GHz, making them nearly 35-teraflop cards in practice, with nearly double the memory bandwidth of the PS5. PS4 games are absolutely not running into a 10GB VRAM limit. Meanwhile, 4090s boost full-time to 2900MHz+, making them nearly 100TF cards with nearly triple the memory bandwidth of the PS5. There is no reality where a 3080 should be running 10FPS better than a PS5, nor one where a 4090 struggles to merely double a PS5. This is a God-awful port, full stop.
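For reference, here is the napkin math behind those teraflop figures (a rough sketch using the standard 2-ops-per-clock-per-shader formula and publicly listed core counts; the sustained clocks are the ones claimed above, not measurements):

```python
# Rough sketch of the TFLOPS figures quoted above, using the same
# FMA-based formula the marketing numbers use (2 FP32 ops/clock per shader core).
def fp32_tflops(shader_cores: int, clock_ghz: float) -> float:
    return 2 * shader_cores * clock_ghz / 1000

print(round(fp32_tflops(2304, 2.23), 1))   # PS5 (36 CUs) at its 2.23 GHz cap: ~10.3
print(round(fp32_tflops(8704, 2.0), 1))    # RTX 3080 at the ~2 GHz sustained boost claimed: ~34.8
print(round(fp32_tflops(16384, 2.9), 1))   # RTX 4090 at the ~2.9 GHz claimed: ~95.0
```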
 
Last edited:

ACESHIGH

Banned
These are PS4 games, made for decade-old 1.8TF hardware. The PS5 is basically an RX 5700 XT, with somewhere between 9 and 10 TFLOPS of performance depending on CPU load. 3080s boost full-time to around 2GHz, making them nearly 35-teraflop cards in practice, with nearly double the memory bandwidth of the PS5. PS4 games are absolutely not running into a 10GB VRAM limit. Meanwhile, 4090s boost full-time to 2900MHz+, making them nearly 100TF cards with nearly triple the memory bandwidth of the PS5. There is no reality where a 3080 should be running 10FPS better than a PS5, nor one where a 4090 struggles to merely double a PS5. This is a God-awful port, full stop.

Even if NVIDIA's TF figures mean little for real-world gaming performance, I agree that the 2080 and up should be stomping the PS5 in rasterization. They do with DLSS on, but that's another story.

And cards from the Maxwell era and up (hell, even GCN) should have no issues running games matching the PS4 at the very least, just like cards such as the 8800 GT ran 7th-gen-era games like a hot knife through butter.
Devs are not putting in the work, that's all there is to it.
 

SlimySnake

Flashless at the Golden Globes
These are PS4 games, made for decade-old 1.8TF hardware. The PS5 is basically an RX 5700 XT, with somewhere between 9 and 10 TFLOPS of performance depending on CPU load. 3080s boost full-time to around 2GHz, making them nearly 35-teraflop cards in practice, with nearly double the memory bandwidth of the PS5. PS4 games are absolutely not running into a 10GB VRAM limit. Meanwhile, 4090s boost full-time to 2900MHz+, making them nearly 100TF cards with nearly triple the memory bandwidth of the PS5. There is no reality where a 3080 should be running 10FPS better than a PS5, nor one where a 4090 struggles to merely double a PS5. This is a God-awful port, full stop.
This is a bad port, no disagreements there.

But those 35 TFLOPS figures NVIDIA put out are fake. There are more knowledgeable PC posters who can explain why that 35 TFLOPS GPU is gimped and is effectively a 20 TFLOPS GPU, but the gist of it is that NVIDIA is fudging the numbers, because the performance does not match the TFLOPS gains over the 2080.

Notice how it's pretty much equivalent to a 20 TFLOPS 6800 XT despite boasting 35 TFLOPS of power.


[Image: benchmark chart comparing the RTX 3080 and RX 6800 XT]


This also matches user benchmarks, which put it at just 60% faster than the 2080.

Now let's take a look at the 6600 XT, which is AMD's 10.7 TFLOPS card. It's only 4% slower than the 2080, which on my system ran at 1950MHz consistently, giving us 11.4 TFLOPS. The math adds up here. However, look at how it stacks up against the 20 TFLOPS 6800 XT and the 35 TFLOPS 3080. Same thing we saw above.

[Image: benchmark chart comparing the RX 6600 XT, RX 6800 XT and RTX 3080]


This basically gives us a 1.75x figure if we assume the PS5 is somewhere around a 9.9 TFLOPS 2070 Super, or 1.64x if we go by the 11.4 TFLOPS 2080. Either way, it is nowhere near the 3-4x the claimed numbers would have you believe. Of course, ray tracing changes all of this, and looking at how NVIDIA's 30-series cards are roughly 50-70% better at ray tracing, I would assume traditional ray tracing would put it around 2.5x. However, The Matrix demo didn't seem to give my 2080 or 3080 a big edge over AMD cards, so it might differ by implementation.
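Spelling that arithmetic out (a back-of-envelope sketch; the ~7% gap between a 2070 Super and a 2080 is my own assumption, the other figures are the ones quoted above):

```python
# Back-of-envelope version of the ratios above (2 FP32 ops/clock per shader core).
def fp32_tflops(shader_cores, clock_ghz):
    return 2 * shader_cores * clock_ghz / 1000

rtx2080 = fp32_tflops(2944, 1.95)            # ~11.5 TFLOPS at the 1950 MHz mentioned above
print(round(rtx2080, 1))

rtx3080_vs_2080 = 1.64                       # "just 60% faster than the 2080" per the benchmarks
rtx3080_vs_2070s = rtx3080_vs_2080 * 1.07    # assuming a 2070 Super is ~7% slower than a 2080
print(round(rtx3080_vs_2070s, 2))            # ~1.75x -- nowhere near a 3-4x rasterization gap
```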
 

Pimpbaa

Member
Probably already said, but these games were made for the PS4, which had CPU cores intended for cheap Windows tablets plus a now very old GPU. The PS5 versions were barely an upgrade compared to something like Spider-Man. They were lazy ports; this is even worse. This reminds me of when Coleco made subpar ports of Donkey Kong to other consoles to make their own console (the ColecoVision) look better.
 

ACESHIGH

Banned
This reminds me of when Coleco made subpar ports of Donkey Kong to other consoles to make their own console (the ColecoVision) look better.

My thoughts as well. I'd love to see how Returnal is going to run on PC next year. If properly optimized, something like a then-released 4050 or 4050 Ti should match a PS5.

I'm not sure Sony would love entry-level cards matching or outpacing their console that early in the console lifecycle, especially as years go by and the gap vs. PC hardware grows larger. They will find a way to make stuff like Returnal struggle on a 4090.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Probably already said, but these games were made for the PS4, which had CPU cores intended for cheap Windows tablets plus a now very old GPU. The PS5 versions were barely an upgrade compared to something like Spider-Man. They were lazy ports; this is even worse. This reminds me of when Coleco made subpar ports of Donkey Kong to other consoles to make their own console (the ColecoVision) look better.
Yeah, unlike what some other posters said, these games were literally built on the PS4, not the PC. And just like GOW, the games were built around the shackles of the Jaguar CPUs, while other third-party games from Ubisoft, Activision and EA were built on the PC and downported to the consoles, which is why they scale much better: way better multithreading, and they scale rather well with higher clocks.

The GOW porting studio specifically said that they didn't have the time or resources to rewrite the CPU instructions to be more multithreaded, which is why you saw such poor performance on GPUs like the 580. My 2080 had a lot of problems running HZD at 120fps no matter how low I went with the resolution. It drove me nuts. It scaled fine up until maybe 80fps and then just stopped scaling linearly. I could run the game at 4K 60fps with DLSS Quality, but 1440p DLSS Quality didn't boost my framerate to 120fps like it does in other games. Better GPUs like the 3080 can brute-force it, just like GOW, but there was definitely a bottleneck in the code there.
 

SlimySnake

Flashless at the Golden Globes
My thoughts as well. I'd love to see how Returnal is going to run on PC next year. If properly optimized, something like a then-released 4050 or 4050 Ti should match a PS5.

I'm not sure Sony would love entry-level cards matching or outpacing their console that early in the console lifecycle, especially as years go by and the gap vs. PC hardware grows larger. They will find a way to make stuff like Returnal struggle on a 4090.
Returnal showed up in the Steam Deck database. A 1.6 TFLOPS GPU. I don't think Sony gives two fucks about any optics anymore.

It also supports full ray-traced reflections, shadows and maybe even GI, so yes, it will bring the 4090 to its knees. Let's not forget that it runs at 1080p 60fps on the PS5, checkerboarded to 1440p and then temporally upscaled to 4K. So yeah, it's also a very poorly optimized game on the PS5, given how Demon's Souls, with way better graphics, runs at 1440p 60fps and Ratchet runs at native 4K 50fps with ray-traced reflections.
 

ACESHIGH

Banned
The GOW porting studio specifically said that they didn't have the time or resources to rewrite the CPU instructions to be more multithreaded, which is why you saw such poor performance on GPUs like the 580. My 2080 had a lot of problems running HZD at 120fps no matter how low I went with the resolution. It drove me nuts. It scaled fine up until maybe 80fps and then just stopped scaling linearly. I could run the game at 4K 60fps with DLSS Quality, but 1440p DLSS Quality didn't boost my framerate to 120fps like it does in other games. Better GPUs like the 3080 can brute-force it, just like GOW, but there was definitely a bottleneck in the code there.

Yep, the "superb" (according to Digital Foundry) God of War PC port was DX11-only. A 2022 high-profile release. Go figure...

I can understand and somewhat tolerate growing pains in the first batch of PC ports of these Sony games, because they were never intended to be released on PC, but I expect the next releases on those engines, like Forbidden West, GOW Ragnarok and The Last of Us Part I, to be much more polished and optimized.

I also expect all upcoming PS5 games to be developed in tandem with PC versions and not as an afterthought. You can really tell when this is the case (the Death Stranding PC version was stellar because the game was meant to be released on PC from day one).

Even if they delay the PC release for commercial reasons, like Rockstar does with their PC games.
 
Last edited:

Klosshufvud

Member
I generally dislike how the quality of a port is judged on high-end hardware. Great PC ports are games like MGSV and DMC4 that scale really well to lower-end components but can also be made to look great on high-end hardware. Like others mentioned, I feel critics are way too lenient on recent Sony PC ports. A game that runs well on a 1.6GHz Jaguar CPU and a 7870M-class GPU should be no match for any PC of this decade. A game like GoW ought to be really lightweight on performance considering its baseline.
 

Reallink

Member
This is a bad port, no disagreements there.

But those 35 TFLOPS figures NVIDIA put out are fake. There are more knowledgeable PC posters who can explain why that 35 TFLOPS GPU is gimped and is effectively a 20 TFLOPS GPU, but the gist of it is that NVIDIA is fudging the numbers, because the performance does not match the TFLOPS gains over the 2080.

Notice how it's pretty much equivalent to a 20 TFLOPS 6800 XT despite boasting 35 TFLOPS of power.


This also matches user benchmarks, which put it at just 60% faster than the 2080.

Now let's take a look at the 6600 XT, which is AMD's 10.7 TFLOPS card. It's only 4% slower than the 2080, which on my system ran at 1950MHz consistently, giving us 11.4 TFLOPS. The math adds up here. However, look at how it stacks up against the 20 TFLOPS 6800 XT and the 35 TFLOPS 3080. Same thing we saw above.


This basically gives us a 1.75x figure if we assume the PS5 is somewhere around a 9.9 TFLOPS 2070 Super, or 1.64x if we go by the 11.4 TFLOPS 2080. Either way, it is nowhere near the 3-4x the claimed numbers would have you believe. Of course, ray tracing changes all of this, and looking at how NVIDIA's 30-series cards are roughly 50-70% better at ray tracing, I would assume traditional ray tracing would put it around 2.5x. However, The Matrix demo didn't seem to give my 2080 or 3080 a big edge over AMD cards, so it might differ by implementation.

I'm aware of the difference; the higher NVIDIA numbers represent FP32 FLOPS. Starting with the 30-series, half the CUDA cores are INT32 cores that can also perform FP32 calculations. So the high teraflop value is real if a game uses mostly FP32 math, not real if it uses mostly INT32 math, and somewhere in between if it uses a mix of both. A 3080 should by all accounts approximately double a PS5 in an "average" raw rasterization workload, more than triple it in RT, and be approaching or exceeding 4x with DLSS. Here it's seeing a 10-15FPS lift over the PS5. A quality port should see a 3080 approaching or exceeding 100FPS and a 4090 potentially exceeding 200FPS.
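A toy model to put numbers on that FP32/INT32 sharing (my own back-of-envelope, not an official NVIDIA formula; the 36 INT ops per 100 FP ops mix is an assumption in the ballpark of NVIDIA's published gaming-workload estimates):

```python
# Toy model of Ampere's shared FP32/INT32 datapath: half the "CUDA cores" are
# dual-purpose, so every INT32 instruction issued displaces one FP32 slot.
def effective_fp32_tflops(peak_tflops, int_per_100_fp=36):
    # Valid for FP-heavy mixes (INT <= FP): per 100 FP32 ops, the SM also has
    # to issue int_per_100_fp INT32 ops through the shared half.
    return peak_tflops * 100 / (100 + int_per_100_fp)

rtx3080_peak = 2 * 8704 * 2.0 / 1000          # ~34.8 "marketing" TFLOPS at ~2 GHz
print(round(effective_fp32_tflops(rtx3080_peak), 1))
# ~25.6 effective TFLOPS with a 36:100 INT:FP mix -- between the 35 TFLOPS
# headline figure and the "effectively ~20 TFLOPS" argued earlier in the thread.
```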
 
Last edited:

yamaci17

Member
It's just that his CPU's single-core performance is maybe too low to compare in these tests.
A 2700 at 3.8GHz with some janky 2800MHz memory is no match for the console CPUs.

It's a two-CCX design with enormous cross-CCX latency (~140ns). That kind of latency is practically non-existent on consoles due to the specific core scheduling they'd use there. On PC you get the full penalty if the game does not care about it (and developers have no incentive to care about it; the majority of PC gamers have Intel or newer-gen AMD CPUs. Zen/Zen+ CPUs did not see wide adoption in the PC gaming space due to how obnoxious their performance WAS back in 2017. The top-dog 2700X paired with 3200MHz RAM CONSISTENTLY lost to a puny 6-core, 6-thread i5 from 2017 running 2666MHz RAM. Zen/Zen+ CPUs were pathetic even then, and aged super horridly into this era. You will still get good enough framerates, but 1% lows will suffer, and you really need to be a patient person or simply have lower standards.)

The only way to avoid cross-CCX latency on PC is not to have a dual-CCX CPU. They were, to me, experimental designs that allowed AMD to get higher yields on first and second gen Ryzen CPUs. All Intel CPUs are fully monolithic and are not subject to such weird latencies; the same goes for Zen 3 and onward (unless you buy 16+ core parts and run into weird CCD latency issues).

They also have 15-20% lower IPC compared to Zen 2 (it depends on the game: some games only 10%, some up to 25%; F1 2017 is 17%, WD Legion 15%, Rainbow Six 21%, etc.).


So the PS5's Zen 2 CPU running at 3.6GHz can most likely match a Zen+ CPU running at 4.3GHz, and he runs his Zen+ CPU at only 3.8GHz.

Combine these facts with consoles having lower overhead for CPU-bound workloads, and his CPU has no business "matching" the PS5.

The problematic part, as I said, is that Zen/Zen+ CPUs are POS products that most gamers ignored. Even a cheap, lowly i5 10400F trounces these CPUs.
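Putting rough numbers on the clock-equivalence claim (a napkin sketch; the 17% figure is just a midpoint of the 15-20% IPC deficit quoted above):

```python
# Napkin math for the "3.6 GHz Zen 2 ~= 4.3 GHz Zen+" comparison above.
ps5_cpu_clock_ghz = 3.6        # PS5's Zen 2 cores
zen_plus_ipc_deficit = 0.17    # assumed midpoint of the 15-20% deficit vs Zen 2

# Clock a Zen+ part would need to match the same per-core throughput:
equivalent_zen_plus_clock = ps5_cpu_clock_ghz / (1 - zen_plus_ipc_deficit)
print(round(equivalent_zen_plus_clock, 2))   # ~4.34 GHz; the reviewer's 2700 runs at 3.8 GHz,
# well short of that, before even counting cross-CCX latency, slower RAM and OS/API overhead.
```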
 
You will surely just use DLSS anyhow as it can't be worse than the in-game TAA even if it has a bit of ghosting.

These are from the 1440p mode on PS5. The image really breaks up when you move the camera and you can see the pixels in hair and vegetation as well as ghosting. Putting it on a GIF actually flatters it as it is even more obvious on a big OLED.

People have been complaining about DLSS3 where every other frame has artifacts but literally every frame in UC4 has artifacts when you move the camera and no-one seems that bothered for some reason.

[Screenshots and GIF: PS5 1440p mode showing image break-up and ghosting in hair and vegetation]
 

winjer

Gold Member
His comparison between the two platforms is meaningless. An old CPU is absolutely going to hold back performance.

No PC gamer uses this as a source of information.

It's not just that he is using an old CPU.
In previous tests we could see his 2700X was underperforming a lot. The difference compared to other users was around 30%.
That makes his 2700X slower than a 1700X and closer in performance to a Bulldozer.
 
These are PS4 games, made for decade-old 1.8TF hardware. The PS5 is basically an RX 5700 XT, with somewhere between 9 and 10 TFLOPS of performance depending on CPU load. 3080s boost full-time to around 2GHz, making them nearly 35-teraflop cards in practice, with nearly double the memory bandwidth of the PS5. PS4 games are absolutely not running into a 10GB VRAM limit. Meanwhile, 4090s boost full-time to 2900MHz+, making them nearly 100TF cards with nearly triple the memory bandwidth of the PS5. There is no reality where a 3080 should be running 10FPS better than a PS5, nor one where a 4090 struggles to merely double a PS5. This is a God-awful port, full stop.

Yes, the PS5 will drop its GPU clock (a little bit) when a certain scene requires a lot of CPU resources, but in such a scenario the CPU always becomes the bottleneck, so the GPU doesn't need to be running at max clock (it would simply be a waste of energy). The PS5's GPU should have no problem running at the full 10TF clock as long as the game keeps it at 99% GPU load.

The PS5 GPU has mesh shaders, so IDK if we can directly compare it to the 5700 XT. Also, the lack of HW RT puts the 5700 XT at a disadvantage in all games that use RT. The PS5 should run circles around the 5700 XT in games that have a mesh shader implementation or RT support.

But I agree there's something wrong with the "Uncharted: Legacy of Thieves" requirements, because the PS4 GPU was only 1.8TF, while the minimum requirements on PC list a 290X, a GPU that's 3x stronger (5.6TF), and that's just for 720p and minimum settings that look worse than the PS4 version.
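Just to spell out that gap (a quick check using the figures above):

```python
# Quick sanity check on the minimum-requirements gap mentioned above.
ps4_gpu_tflops = 1.8      # base PS4 GPU
r9_290x_tflops = 5.6      # listed minimum PC GPU
print(round(r9_290x_tflops / ps4_gpu_tflops, 1))   # ~3.1x the raw compute, for a 720p/low target
```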
 
Last edited:

rofif

Can’t Git Gud
You will surely just use DLSS anyhow as it can't be worse than the in-game TAA even if it has a bit of ghosting.

These are from the 1440p mode on PS5. The image really breaks up when you move the camera and you can see the pixels in hair and vegetation as well as ghosting. Putting it on a GIF actually flatters it as it is even more obvious on a big OLED.

People have been complaining about DLSS3 where every other frame has artifacts but literally every frame in UC4 has artifacts when you move the camera and no-one seems that bothered for some reason.

[Screenshots and GIF: PS5 1440p mode showing image break-up and ghosting in hair and vegetation]

EDIT: Sorry, I now noticed you are talking about artifacting when rotating around the character, not about motion blur itself. OK, I FULLY AGREE!!!

This motion blur is not made to be viewed paused.
I think it's good, as it helps to represent movement and velocity.
If you pause any movie, it will also be blurry... that's how motion blur works.

At 30fps, if you rotate fast, the shutter speed is longer to help fill the gaps.
At 60 it's faster.
On PC you can disable it at 120fps, why not.
But I think it looks stuttery without motion blur, especially on OLED, as the panel adds no blur of its own.
OLEDs are notoriously bad when it comes to panning/rotating shots, especially at 30fps, and motion blur here helps a TON. I replayed the remaster at 4K30 this year and it looked fantastic. Not at all stuttery like some other low-fps content.
 
Last edited:

DenchDeckard

Moderated wildly
You will surely just use DLSS anyhow as it can't be worse than the in-game TAA even if it has a bit of ghosting.

These are from the 1440p mode on PS5. The image really breaks up when you move the camera and you can see the pixels in hair and vegetation as well as ghosting. Putting it on a GIF actually flatters it as it is even more obvious on a big OLED.

People have been complaining about DLSS3 where every other frame has artifacts but literally every frame in UC4 has artifacts when you move the camera and no-one seems that bothered for some reason.

[Screenshots and GIF: PS5 1440p mode showing image break-up and ghosting in hair and vegetation]

Thank you, it looks like shit! But consoles can get away with this kind of stuff because they have to. PC is deffo looked at with more of a fine-toothed comb because of the cost of parts, but shit like what you shared is awful too; it looks terrible.
 

rofif

Can’t Git Gud
And worse LOD... and additional visual glitches.
So what are we doing here, Vick?
Are we going back to playing the PS4 Pro version on PS5?
I think that one is perfect, or are there any problems with backwards compatibility?
 

MidGenRefresh

*Refreshes biennially
Horizon was also in a sad state when it was released on PC, and it was quickly patched. Today PC is the best place to play it by a country mile.

Let's see if Sony patches this turd.
 

Vick

Gold Member
So what are we doing here, Vick?
Imagining how the games would look with RT shadows/PCSS at 4K/60fps, plus intact bounce lighting and all the little things here and there that disappeared or broke after the Pro version...





malcolm-x-angry.gif


Are we going back to playing the PS4 Pro version on PS5?
I think that one is perfect, or are there any problems with backwards compatibility?
No problem there, but I just can't go back to 30fps... especially with these games, as they feel fucking sublime to control at 60fps.

For a truly perfect and definitive edition, I'll have to wait for the PS4 version to be emulated perfectly, and for some good soul to bother implementing the improvements seen in the Legacy of Thieves versions on top of it. Or just fix the latter using the former.
 
Last edited:

ACESHIGH

Banned
But I agree there's something wrong with the "Uncharted: Legacy of Thieves" requirements, because the PS4 GPU was only 1.8TF, while the minimum requirements on PC list a 290X, a GPU that's 3x stronger (5.6TF), and that's just for 720p and minimum settings that look worse than the PS4 version.

That's another thing I can't believe we are still seeing in 2022: BS PC requirements. The Steam Deck can run this game easily at its native resolution and above 30fps with its iGPU.
There's no way an R9 290, an old but top-of-the-line GPU, only runs the game at 720p 30FPS on low settings.
 

rofif

Can’t Git Gud
Imagining how the games would look with RT shadows/PCSS at 4K/60fps, plus intact bounce lighting and all the little things here and there that disappeared or broke after the Pro version...





malcolm-x-angry.gif



No problem there, but I just can't go back to 30fps... especially with these games, as they feel fucking sublime to control at 60fps.

For a truly perfect and definitive edition, I'll have to wait for the PS4 version to be emulated perfectly, and for some good soul to bother implementing the improvements seen in the Legacy of Thieves versions on top of it. Or just fix the latter using the former.

I am good with 30fps, but 40-50 at 4K is much better.
I've tested this when playing the remaster and the bounce flashlight lighting is still there. Are some instances broken?
 

JackSparr0w

Banned
My only question is why they are using the 2070 when we have the perfect PS5 equivalent in the form of the AMD 6650 XT?
 
Last edited:

Midn1ght

Member
Better yet, without HDR and motion blur.
Wccftech posted a picture of the settings on their website and it has HDR and motion blur support.
Maybe it's a case of review copies not having all the settings yet. I know some of you are eager to shit on PC ports as quickly as you can, but let's wait for the launch version to release and see if the settings are in.

[Screenshot: Uncharted: Legacy of Thieves Collection PC settings menu showing HDR and motion blur options]

WCCFTECH LINK
 

Vick

Gold Member
I've tested this when playing the remaster and the bounce flashlight lighting is still there. Are some instances broken?
It's still there of course, just the radius was reduced by half. There are some examples in that infamous thread.
I blame the 120fps mode; as seen with TLOU Part I unlocked, this tech is extremely taxing.
 