
[Digital Foundry] Death Stranding Director's Cut: PC vs PS5 Graphics Breakdown

SlimySnake

Flashless at the Golden Globes
PS5 is outperforming the RTX 2080 here even with Alex using a 10900K. If the game is not CPU bound, why not use a less powerful CPU like a Ryzen 3600 or even the 2700 NX Gamer uses? The 10900K has 10 cores, 20 threads running at 5.2 GHz. The Ryzen 3600 has 6 cores and 12 threads and runs at 4.3 GHz. Since the PS5 reserves 1 core for its OS, the 3600 is probably the better CPU to use.

 

Arioco

Member
- An RTX 2060 Super is 71.2% of PS5's performance at the exact same settings.

- A 5700 is 75.4% of PS5's.

- A 5700 XT is 83.5%.

- An RTX 2070 Super is 88.5%.

- An RTX 2080 is 96.6% of PS5's performance.


So according to Alex, the PS5 performs somewhere between an RTX 2080 and an RTX 2080 Super.

Not bad for a machine that is $399, right? 🤷‍♂️
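Those percentages can also be flipped around to express how much faster the PS5 is than each card. A quick sketch of that arithmetic (the card names and fractions come from the list above; the script itself is purely illustrative):

```python
# Each card's performance as a fraction of PS5's, per DF's numbers.
cards = {
    "RTX 2060 Super": 0.712,
    "RX 5700": 0.754,
    "RX 5700 XT": 0.835,
    "RTX 2070 Super": 0.885,
    "RTX 2080": 0.966,
}

def ps5_lead(fraction: float) -> float:
    """How much faster the PS5 is than a given card, in percent."""
    return (1.0 / fraction - 1.0) * 100.0

for name, frac in cards.items():
    print(f"{name}: PS5 is {ps5_lead(frac):.1f}% faster")
```

So "96.6% of PS5" means the PS5 has only about a 3.5% lead over the 2080, while it leads the 2060 Super by roughly 40%.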
 

SlimySnake

Flashless at the Golden Globes
So according to Alex, the PS5 performs somewhere between an RTX 2080 and an RTX 2080 Super.
I don't think this test is that accurate. I upgraded my CPU last year and saw better performance in pretty much all games, even below 60 fps.

This test is also flawed because he's using GPUs with a dedicated 448 GB/s of bandwidth, whereas the PS5 has to share its 448 GB/s between the GPU and the CPU, so for all we know its GPU performance might be getting bottlenecked by VRAM bandwidth.

Lastly, with Vsync on he saw massive drops in framerate on the PC GPUs because the GPU wasn't being utilized 100%. He had to go in and force triple buffering. How do we know the PS5 GPU is being fully utilized? What if it is also at around 80% utilization? We saw the 2060 Super drop below 60 fps while at only 80% utilization, so why are we assuming the PS5 isn't suffering from the same issue?
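The vsync behaviour described here is worth spelling out: with double buffering on a 60 Hz display, a frame that misses the ~16.7 ms refresh deadline has to wait for the next vblank, so the effective framerate snaps down to 60/n rather than degrading smoothly. A minimal sketch of that quantization, assuming a simple double-buffered model (real drivers vary):

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh interval

def vsync_fps(frame_ms: float) -> float:
    """Effective framerate under double-buffered vsync: a frame that
    misses a refresh waits for the next one, so frame time rounds up
    to a whole number of refresh intervals."""
    intervals = math.ceil(frame_ms / VBLANK_MS)
    return REFRESH_HZ / intervals

# An 18 ms frame (~55 fps unlocked, GPU nowhere near 100% busy)
# gets quantized all the way down to 30 fps:
print(vsync_fps(16.0), vsync_fps(18.0))  # 60.0 30.0
```

Triple buffering sidesteps this by letting the GPU start on the next frame instead of idling until the vblank, which is why forcing it on recovered performance in the video.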
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
- An RTX 2060 Super is 71.2% of PS5's performance at the exact same settings.

- A 5700 is 75.4% of PS5's.

- A 5700 XT is 83.5%.

- An RTX 2070 Super is 88.5%.

- An RTX 2080 is 96.6% of PS5's performance.


So according to Alex, the PS5 performs somewhere between an RTX 2080 and an RTX 2080 Super.

That's pretty good. Do we have any idea how many gamers on Steam have an RTX 2080 or an RTX 2080 Super?
 

MikeM

Member
- An RTX 2060 Super is 71.2% of PS5's performance at the exact same settings.

- A 5700 is 75.4% of PS5's.

- A 5700 XT is 83.5%.

- An RTX 2070 Super is 88.5%.

- An RTX 2080 is 96.6% of PS5's performance.


So according to Alex, the PS5 performs somewhere between an RTX 2080 and an RTX 2080 Super.

Not bad for a machine that is $399, right? 🤷‍♂️
Proof to the haters that the PS5 can hold its own. Slightly above 2080 performance is awesome.
 
Kind of a disingenuous comparison if you ask me. It doesn't really touch on DLSS at all, which is what I'd be more interested in: how quality-mode DLSS compares to the PS5 in both resolution and performance. Really weird comparisons in that video.

Proof to the haters that the PS5 can hold its own. Slightly above 2080 performance is awesome.

I mean... it's a nearly 4-year-old GPU. Nothing to scoff at, but not exactly what I'd be leading the charge with as a talking point in the Silicon Wars.
 

Md Ray

Member
- An RTX 2060 Super is 71.2% of PS5's performance at the exact same settings.

- A 5700 is 75.4% of PS5's.

- A 5700 XT is 83.5%.

- An RTX 2070 Super is 88.5%.

- An RTX 2080 is 96.6% of PS5's performance.


So according to Alex, the PS5 performs somewhere between an RTX 2080 and an RTX 2080 Super.

Not bad for a machine that is $399, right? 🤷‍♂️
Yup! Pretty close to a stock 3060 Ti (in rasterization) if we compare PS5 GPU to the current Ampere architecture.
 

SlimySnake

Flashless at the Golden Globes
Yeah, I am not sure. What is your thinking on that as to why he didn't?
Laziness. I believe Richard has the 6600 XT and 6700 XT cards they received for reviews, and he didn't want to do the 30-second benchmarks Alex did for this test.

But as the PC expert on the panel, he cannot make the excuse that he doesn't have the cards. He should have all the cards. The 5700 and 2060 Super are so far behind the PS5 here that they shouldn't even be included.
 

DeepEnigma

Gold Member
PS5 is outperforming the RTX 2080 here even with Alex using a 10900K. If the game is not CPU bound, why not use a less powerful CPU like a Ryzen 3600 or even the 2700 NX Gamer uses? The 10900K has 10 cores, 20 threads running at 5.2 GHz. The Ryzen 3600 has 6 cores and 12 threads and runs at 4.3 GHz. Since the PS5 reserves 1 core for its OS, the 3600 is probably the better CPU to use.

You know why.
 

MikeM

Member
Kind of a disingenuous comparison if you ask me. It doesn't really touch on DLSS at all, which is what I'd be more interested in: how quality-mode DLSS compares to the PS5 in both resolution and performance. Really weird comparisons in that video.



I mean... it's a nearly 4-year-old GPU. Nothing to scoff at, but not exactly what I'd be leading the charge with as a talking point in the Silicon Wars.
Sure. But the PS5 is almost two years old now. And the PS5 runs an APU that has the graphics potential to beat a dedicated 2080 GPU. That's pretty awesome, imo.
 
Kind of a disingenuous comparison if you ask me. It doesn't really touch on DLSS at all, which is what I'd be more interested in: how quality-mode DLSS compares to the PS5 in both resolution and performance. Really weird comparisons in that video.



I mean... it's a nearly 4-year-old GPU. Nothing to scoff at, but not exactly what I'd be leading the charge with as a talking point in the Silicon Wars.

I think the holy grail has been to find a way to directly benchmark in a like-for-like scenario (GPU vs. GPU with no DLSS, etc.) at identical quality settings, which this video was able to do.
 
Sure. But the PS5 is almost two years old now. And the PS5 runs an APU that has the graphics potential to beat a dedicated 2080 GPU. That's pretty awesome, imo.
I'm not taking anything away from the PS5's APU. I like it; I'd just prefer a more thorough comparison. It's like he missed half the tech involved with the platform, which was a HUGE talking point in their own videos for the first PC release. Just weird that it's briefly mentioned and then not compared for both resolution and framerate. It's a great-looking game regardless of the platform you play it on.
 

Rea

Member
This guy still doesn't understand how the CPU and GPU clocks work in the PS5. Jesus. Cerny never said that less stress on the CPU means max clock on the GPU. Both can run at max frequency as long as the workload doesn't exceed the power budget. The power shift only happens when the CPU is running at max clock with some extra juice to spare, so that the GPU can squeeze out a few more pixels.
 

DeepEnigma

Gold Member
This guy still doesn't understand how the CPU and GPU clocks work in the PS5. Jesus. Cerny never said that less stress on the CPU means max clock on the GPU. Both can run at max frequency as long as the workload doesn't exceed the power budget. The power shift only happens when the CPU is running at max clock with some extra juice to spare, so that the GPU can squeeze out a few more pixels.
It's called, being either in denial, or intentionally obtuse.
 
Laziness. I believe Richard has the 6600 XT and 6700 XT cards they received for reviews, and he didn't want to do the 30-second benchmarks Alex did for this test.

But as the PC expert on the panel, he cannot make the excuse that he doesn't have the cards. He should have all the cards. The 5700 and 2060 Super are so far behind the PS5 here that they shouldn't even be included.

Yeah, that would have been cool if he pulled those cards too. I'd like to see him pit it against some RTX 3060-series cards as well, but maybe those would just run into the vsync limit during the scenes he was testing.
 

SlimySnake

Flashless at the Golden Globes
Yeah, that would have been cool if he pulled those cards too. I'd like to see him pit it against some RTX 3060-series cards as well, but maybe those would just run into the vsync limit during the scenes he was testing.
The 3060 Ti is roughly on par with the 2080, and I believe Alex did a comparison with the PS5 and found similar results. It might have been AC Valhalla.

3060Ti vs 2080.
 

Arioco

Member
This guy still doesn't understand how the CPU and GPU clocks work in the PS5. Jesus. Cerny never said that less stress on the CPU means max clock on the GPU. Both can run at max frequency as long as the workload doesn't exceed the power budget. The power shift only happens when the CPU is running at max clock with some extra juice to spare, so that the GPU can squeeze out a few more pixels.


It surprised me a bit too, because Cerny himself told DF (not any other website, DF) that it's not the case that you have to choose between running the CPU at full clock speed or running the GPU at full clock speed. The system is provided with enough power for both the CPU and GPU to potentially run at 3.5 and 2.23 GHz, but that actually depends on how power-hungry the instructions used may be. As long as you don't exceed the ~200 watts PS5's cooling solution can handle, you're good. If for some reason your code has a higher power consumption than that, the console will underclock to a level at which power consumption sits at those ~200 watts. I don't think it's so hard to understand, and I can't believe Alex always fails to get it. I mean, it's not like he doesn't have the knowledge or that Cerny hasn't explained it to him.
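The budget model described above can be sketched in a few lines. Everything here except the ~200 W figure and the 3.5 / 2.23 GHz caps from the posts is a made-up illustration; real silicon uses activity counters and can also shift power between the CPU and GPU rather than scaling both uniformly:

```python
# Illustrative model: both units run at max clocks unless their
# combined workload-dependent power draw exceeds the budget, in
# which case clocks scale down until the draw fits. The wattage
# inputs below are hypothetical examples, not measured figures.
POWER_BUDGET_W = 200.0
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23

def clocks(cpu_draw_w: float, gpu_draw_w: float):
    """cpu_draw_w / gpu_draw_w: power each unit would draw at max
    clocks for the current workload. Returns (cpu_ghz, gpu_ghz)."""
    total = cpu_draw_w + gpu_draw_w
    if total <= POWER_BUDGET_W:
        return CPU_MAX_GHZ, GPU_MAX_GHZ  # typical case: both at max
    scale = POWER_BUDGET_W / total       # pathological case: downclock
    return CPU_MAX_GHZ * scale, GPU_MAX_GHZ * scale

# A normal load fits the budget, so both run at full frequency:
print(clocks(50.0, 140.0))   # (3.5, 2.23)
# An unusually power-hungry load exceeds it, so both scale down:
print(clocks(70.0, 180.0))
```

The point of the model is that clocks depend on power draw, not on temperature or on some fixed CPU-vs-GPU trade-off, which matches what Cerny told DF.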
 

Buggy Loop

Member
Sure. But the PS5 is almost two years old now. And the PS5 runs an APU that has the graphics potential to beat a dedicated 2080 GPU. That's pretty awesome, imo.

But you're also comparing it against a 3-year-old GPU that has a third of its silicon area dedicated to that new tech.

It's like taking a McLaren P1 built for the Nürburgring to a drag race. I think we all know what happens in RT + DLSS comparisons, since that's the shift the tech was made for.

Cool for the PS5, I guess: you're rendering legacy techniques faster (if we assume the PC port is the best it can be?).
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
There are more 3000-series GPUs on the Steam hardware chart than 2000-series; they sold over 45 million GPUs in 2021, and the 2000 series ended production in mid-2020.

But it seems like only 13% in total have a 3000-series GPU on Steam.
 

ethomaz

Banned
First-party optimized engines pull ahead of generic third-party engines. That is why you see the PS5 beating GPUs way stronger than the one inside it.

Exclusives get more out of the hardware they run on.

People will say Death Stranding was on both PS and PC, but the Decima engine was developed exclusively for PS4 and only later ported to PC.
 