
Digital Foundry: Deathloop PC vs PS5, Optimised Settings, Performance Testing + More

ACESHIGH

Banned
What is this game bringing to the table that means it absolutely cannot run on a PS4 or Xbox One? It's not a CPU hog and it doesn't need the ultra-fast SSD for a core gameplay mechanic... not sure why this was released on next-gen consoles only. I am sure I could play this game at 1080p 30 FPS max settings on an FX 6300 + RX 580 8GB + 8GB RAM.

I think the old consoles could still run some of these cross-gen games at 720p 30 FPS. They were never truly pushed to the level the seventh-gen consoles were, except with Cyberpunk 2077.

FS 2020, Cyberpunk 2077 on PC and Ratchet & Clank are the only truly next-gen games so far.
 

adamosmaki

Member
Why is it that when the PS5 outperforms the 5700 XT it's bad PC optimization, but when the performance is the same it's the only truthful scenario? Based on what? GPU frequency is quite a bit higher on the PS5, but as always, only the specs that fit the favourite narrative get counted.
Based on everything we have seen so far, and based on the fact that the PS5 GPU and the 5700 XT are both about 10 TF and built on a similar architecture with some RT capabilities.
Do some of you PS5 fanboys really think you are getting an RTX 3080 or 6800 XT level of GPU?
 

assurdum

Banned
I would say it's mostly irrelevant, and a ~10 TF RDNA 1/2 card should be comparable in raster as long as it's not memory bandwidth limited or something.



This is also what Alex says:

[image: INsSPXK.png]


But I'm not saying the PS5 is worse than a 2060S in raster. You can see where the PS5 lands based on pure specs alone, and in most games I think it shows results similar to DL.



RDNA1 and RDNA2 cards can absolutely be compared to each other and to the PS5, it's the same fucking arch.
Are you kidding? The difference in frequency between 1.62 and 2.2 GHz is irrelevant for GPU performance? Really? Especially when it's tied to the power budget (watts)?
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I would say it's mostly irrelevant, and a ~10 TF RDNA 1/2 card should be comparable in raster as long as it's not memory bandwidth limited or something.



This is also what Alex says:

[image: INsSPXK.png]


But I'm not saying the PS5 is worse than a 2060S in raster. You can see where the PS5 lands based on pure specs alone, and in most games I think it shows results similar to DL.



RDNA1 and RDNA2 cards can absolutely be compared to each other and to the PS5, it's the same fucking arch.
Alex also said this:
Based on tests with a 2080 Ti, it looks like a 2080 Super or RTX 3060 Ti would be required to match or exceed PlayStation 5's output.

And that's my point. You can't take one game and say THAT IS IT. Only if the PC version is shit. MATH WINS. Fatality.
 

Armorian

Banned
Alex also said this:


And that's my point. You can't take one game and say THAT IS IT. Only if the PC version is shit. MATH WINS. Fatality.

What game was this?

There are examples like COD: BO where you can't match the PS5 graphics on PC (the console runs below PC's lowest settings in some aspects), so you can't properly measure GPU performance against the console version.

Why do you persist? Frequency is very important to AMD performance. You are literally avoiding the argument.

A 10 TF card is a 10 TF card (of the same arch, and as proven by HU they are comparable in raster), and 10 TF can be reached with different CU count and clock combinations. There may be some small differences in specific scenarios with more CUs vs higher clocks and vice versa, but nothing major.
 
Last edited:

Zathalus

Member
Are you kidding? The difference in frequency between 1.62 and 2.2 GHz is irrelevant for GPU performance? Really? Especially when it's tied to the power budget (watts)?
Most 5700 XT GPUs (not the ones with the shitty reference cooler) average around 2000-2050 MHz. That's around 10.2-10.5 TFLOPs, almost exactly a PS5 GPU. The PS5 is likely a little bit faster due to the higher clock speed and the lower number of CUs that need to be kept fed, but the 5700 XT doesn't have to share its memory bandwidth, so the difference will be minor.

Saying a PS5 is roughly equal to or slightly better than a 5700 XT in non-RT makes sense.
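
For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python (assuming the usual RDNA layout of 64 shaders per CU and 2 FLOPs per clock, 40 CUs for the 5700 XT at the ~2.0 GHz quoted above, and 36 CUs for the PS5 at its 2.23 GHz peak clock):

```
# Rough FP32 throughput: CUs * shaders per CU * FLOPs per clock * clock.
# CU and shader counts are the publicly listed ones; the clocks below are
# the rounded figures being thrown around in this thread, not measured values.
def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000

print(f"5700 XT @ 2.00 GHz: {tflops(40, 2.00):.2f} TF")  # ~10.24 TF
print(f"PS5     @ 2.23 GHz: {tflops(36, 2.23):.2f} TF")  # ~10.28 TF
```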
 

SlimySnake

Flashless at the Golden Globes
This game is broken on Nvidia.

With more games in the future we will have more samples, but you can check the GPU specs yourself and see where the PS5 GPU lands in the PC space.
Who is to say other games are not broken on AMD GPUs? Gears 5 and Forza are big AMD titles. Are they broken too?

There are always going to be games that perform better or worse on Nvidia or AMD GPUs. That's why you can't make definitive statements like the one you made earlier.
 

01011001

Banned
I never noticed the 59fps issue on PS5... and I have a 1080p TV, 1440p is enough for me....

Jesus Christ... why do PC gamers obsess over little details like "OMG it's 59fps, it's so trash...."?

Just play and enjoy the games...

I'm saying that even though I have a "better config than the PS5" (and probably better than 85% of PC players) with my 5800X/2060S.. but I'm playing Deathloop on PS5... guess why? DualSense, it's just a solo FPS, not a competitive FPS... and I just want to play peacefully on my sofa.

The PS5 version is awful though; the controls are ridiculously bad, the FOV is super low and there are no button mapping options. I literally stopped playing it on PS5 and started on PC, even though my PC is way worse and I have to use FSR to even hit a stable 60fps... still worth it. It was a pain to play on PS5, and the graphics are not the issue here. Sadly, many developers are still incompetent when it comes to making controller aiming feel good, and they still think FOV settings are not needed, even though it's the easiest accessibility feature you could ever implement, especially since it's already implemented in this game but simply locked out of this version by a bad choice from the developers.
 
Last edited:

Sosokrates

Report me if I continue to console war
Why is it that when the PS5 outperforms the 5700 XT it's bad PC optimization, but when the performance is the same it's the only truthful scenario? Based on what? GPU frequency is quite a bit higher on the PS5, but as always, only the specs that fit the favourite narrative get counted.

So far the PS5's higher frequency has yet to show a significant performance advantage over a lower-clocked but equal-TFLOPs GPU.
 

squarealex

Member
The PS5 version is awful though; the controls are ridiculously bad, the FOV is super low and there are no button mapping options. I literally stopped playing it on PS5 and started on PC, even though my PC is way worse and I have to use FSR to even hit a stable 60fps... still worth it. It was a pain to play on PS5, and the graphics are not the issue here. Sadly, many developers are still incompetent when it comes to making controller aiming feel good, and they still think FOV settings are not needed, even though it's the easiest accessibility feature you could ever implement, especially since it's already implemented in this game but simply locked out of this version by a bad choice from the developers.
My only complaint with the game is that the UI is very tiny on a TV, and maybe the loading is a bit longer on PS5..

I never play console games on a monitor.... and Deathloop is not a big fast FPS for me... so FOV is not an issue here..
 

Armorian

Banned
Who is to say other games are not broken on AMD GPUs? Gears 5 and Forza are big AMD titles. Are they broken too?

There are always going to be games that perform better or worse on Nvidia or AMD GPUs. That's why you can't make definitive statements like the one you made earlier.

From time to time there are broken ports on PC, AMD/Nvidia partnered or not, and those are not a good representation of how PC GPUs compare to consoles.
 

01011001

Banned
My only complaint with the game is that the UI is very tiny on a TV, and maybe the loading is a bit longer on PS5..

I never play console games on a monitor.... and Deathloop is not a big fast FPS for me... so FOV is not an issue here..

I got dizzy in some scenes, especially using double jumps and trying to get up onto roofs.

Also, the aiming is just so bad; it doesn't feel good to play on a controller, and I imagine it gets worse once all the abilities come in.
No joke, I literally ordered a new PC because this game reminded me that I can't live with only consoles, knowing that many devs are still completely oblivious to how to make console shooters feel good, and I need a decent alternative to the new consoles to fall back on.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
So far the PS5's higher frequency has yet to show a significant performance advantage over a lower-clocked but equal-TFLOPs GPU.
The 6600 XT with its 2.6 GHz clock speeds and the 5700 XT with its 1.98 GHz in-game clock speeds are a good comparison. That's roughly a 10.2 TFLOPs GPU vs a 10.6 TFLOPs GPU. The 6600 XT is bottlenecked by a 128-bit memory interface, but that mostly only comes into play at higher resolutions.

[image: lgczcsyx5lg71.jpg]


Time Spy benchmarks give it a lead of about 11%.

3DMark Time Spy provides a nice overview of how the cards stack up. The RX 6600 XT and RX 5700 XT were close with about an 11% difference in favor of the RX 6600 XT.

Typically, TFLOPs are TFLOPs, but I always found it interesting that Nvidia GPUs enjoyed a significant lead over AMD GPUs until RDNA, when AMD was finally able to push clock speeds up to 1.8-1.98 GHz and, all of a sudden, they started to match the rasterization performance of equivalent Nvidia cards. Nvidia GPUs from Pascal onwards had always clocked very high, around 1.7-2.0 GHz. My RTX 2080 has hit 2050 MHz even though the boost clock is supposed to max out at 1.7 GHz according to the specs.

Now the 6600 XT hits 2.6 GHz and is able to offer 11% more performance despite 8 fewer CUs and only a minor (~4%) increase in TFLOPs.
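
Running the same back-of-the-envelope TF math on those two cards (same assumptions as before: 64 shaders per CU, 2 FLOPs per clock, 32 CUs for the 6600 XT and 40 for the 5700 XT, with the clocks quoted above) puts the raw TF gap at roughly 4-5%, versus the ~11% measured in Time Spy:

```
# Same rough TF formula as before; exact gap depends on which clocks you plug in.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

rx_5700xt = tflops(40, 1.98)   # ~10.1 TF
rx_6600xt = tflops(32, 2.60)   # ~10.6 TF
print(f"Raw TF advantage of the 6600 XT: {rx_6600xt / rx_5700xt - 1:.1%}")  # ~5%
# Measured Time Spy advantage quoted above: ~11%
```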
 
Last edited:
I never noticed the 59fps issue on PS5... and I have a 1080p TV, 1440p is enough for me....

Jesus Christ... why do PC gamers obsess over little details like "OMG it's 59fps, it's so trash...."?

Just play and enjoy the games...

I'm saying that even though I have a "better config than the PS5" (and probably better than 85% of PC players) with my 5800X/2060S.. but I'm playing Deathloop on PS5... guess why? DualSense, it's just a solo FPS, not a competitive FPS... and I just want to play peacefully on my sofa.

The game is great... but videos like this can trigger me... like, who the fk checks the dynamic resolution and framerate while playing when it's a stable 60fps???

Even Switch owners hate 59fps. It just looks weird at that framerate, from what I understand.
 

Sosokrates

Report me if I continue to console war
The 6600 XT with its 2.6 GHz clock speeds and the 5700 XT with its 1.98 GHz in-game clock speeds are a good comparison. That's roughly a 10.2 TFLOPs GPU vs a 10.6 TFLOPs GPU. The 6600 XT is bottlenecked by a 128-bit memory interface, but that mostly only comes into play at higher resolutions.

[image: lgczcsyx5lg71.jpg]


Time Spy benchmarks give it a lead of about 11%.



Typically, TFLOPs are TFLOPs, but I always found it interesting that Nvidia GPUs enjoyed a significant lead over AMD GPUs until RDNA, when AMD was finally able to push clock speeds up to 1.8-1.98 GHz and, all of a sudden, they started to match the rasterization performance of equivalent Nvidia cards. Nvidia GPUs from Pascal onwards had always clocked very high, around 1.7-2.0 GHz. My RTX 2080 has hit 2050 MHz even though the boost clock is supposed to max out at 1.7 GHz according to the specs.

Now the 6600 XT hits 2.6 GHz and is able to offer 11% more performance despite 8 fewer CUs and only a minor (~4%) increase in TFLOPs.

Nice example. Cerny saying higher clock speeds give significant gains is pretty much a marketing lie.
 

Mr Moose

Member
Nice example. Cerny saying higher clock speeds give significant gains is pretty much a marketing lie.
PS5 16GB GDDR6 256 bit 448 GB/s
6600XT 8GB GDDR6 128 bit 256 GB/s
5700XT 8GB GDDR6 256 bit 448 GB/s
Almost like clock speed isn't the bottleneck in the 6600 XT's case.
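
Those bandwidth numbers fall straight out of bus width times memory data rate; the 14/16 Gbps rates below are the commonly listed ones for these parts, so treat them as assumptions rather than gospel:

```
# GDDR6 bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(f"PS5     256-bit @ 14 Gbps: {bandwidth_gbs(256, 14):.0f} GB/s")  # 448
print(f"5700 XT 256-bit @ 14 Gbps: {bandwidth_gbs(256, 14):.0f} GB/s")  # 448
print(f"6600 XT 128-bit @ 16 Gbps: {bandwidth_gbs(128, 16):.0f} GB/s")  # 256
```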
 
Last edited:

Sosokrates

Report me if I continue to console war
PS5 16GB GDDR6 256 bit 448 GB/s
6600XT 8GB GDDR6 128 bit 256 GB/s
5700XT 8GB GDDR6 256 bit 448 GB/s
Almost like clock speed isn't the bottleneck in the 6600 XT's case.

The 6600 XT's 32 MB "Infinity Cache", which runs at 832 GB/s, will make up for its GDDR6 bandwidth deficit. I imagine it works a bit like the Xbox One's eSRAM.

I bet both Sony and Microsoft were thinking about adding Infinity Cache, but it probably proved too expensive.
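
There's no public hit-rate figure for the 6600 XT's cache, but a crude blend shows why a big on-die cache can paper over a narrow bus. The hit rates below are made-up illustrative assumptions, and the hit rate drops as render resolution goes up, which is part of why the card falls off at higher resolutions:

```
# Toy effective-bandwidth model: blend cache and VRAM bandwidth by an assumed
# hit rate. 832 GB/s is the Infinity Cache figure quoted above; 256 GB/s is the
# 6600 XT's GDDR6 bandwidth. The hit rates are illustrative guesses only.
def effective_bw(hit_rate, cache_bw=832, vram_bw=256):
    return hit_rate * cache_bw + (1 - hit_rate) * vram_bw

for hit in (0.4, 0.5, 0.6):
    print(f"hit rate {hit:.0%}: ~{effective_bw(hit):.0f} GB/s effective")
```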
 
Last edited:

Gamer79

Predicts the worst decade for Sony starting 2022
Compare price vs performance and the PS5 wipes its ass with all of those cards. Those cards range from $500 on the low end to well over $1500 on the high end. Good luck finding most of those cards in stock as well (in most cases, a 3080 at retail price is like finding a leprechaun).
 
Last edited:

Sosokrates

Report me if I continue to console war
Xbox One's eSRAM (and ew, DDR3) was a mistake which was corrected by the One X's GDDR5.

Well, it's not as good as the PS4's GDDR5 solution; however, AMD's Infinity Cache system seems to be producing good results.
Point is, the 6600 XT is not bandwidth bottlenecked.
 

Mr Moose

Member
Point is, the 6600 XT is not bandwidth bottlenecked.
It is.

The 6600 XT’s rasterization performance is strong and it can clearly handle 1080p ray tracing today, but AMD’s rather conservative positioning for the card, combined with the evidence we found for memory bandwidth bottlenecks, leaves us uncertain about its long-term strengths. If you just bought a high-end 1080p monitor and you know you won’t be upgrading, buy in confidence. Everyone else may want to think it over.
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
Compare price vs performance and the PS5 wipes its ass with all of those cards. Those cards range from $500 on the low end to well over $1500 on the high end. Good luck finding most of those cards in stock as well (in most cases, a 3080 at retail price is like finding a leprechaun).
Thank Lord Gabe PC gaming is far from being only about performance, so the price difference vs consoles doesn't matter...
 