
Digital Foundry: Deathloop PC vs PS5, Optimised Settings, Performance Testing + More

Sosokrates

Report me if I continue to console war


  • at launch, stutter is from botched mouse movement
  • because of this, stutter is apparent with VRR
  • lock your framerate
  • on first launch, there's a consistent (actual) stutter; relaunching the game removes it
  • beta patch "juliashotme" fixes the mouse stutter, does not work on non-60/120fps refreshes yet
  • to maintain a target framerate, you need a lot of headroom on your gpu, recommended to use dynamic resolution
  • performance dynamic res aggressively scales the image to maintain framerate
  • optimized settings are essentially ps5 settings
  • PS5 uses balanced ambient occlusion
  • ps5 model detail is at high
  • ps5 water detail uses very high setting
  • ps5 using medium motion blur
  • shadows should be high or ultra
  • very high terrain
  • ultra decals (affects bullet holes)
  • optimized settings increase performance by 31% over ultra settings
  • in visual quality mode, PS5 performance lands between a 2060S and 2070S, ~10% higher than a 5700
  • PS5 RT is lower than the lowest PC setting
  • a 2060S can produce a locked 30fps and better, less blurry IQ than PS5's RT mode
  • 3080 and 6800XT have the same performance, rasterized
  • RT adds 7.2ms of render time for the 3080, 11.2ms for the 6800 (see the frame-time sketch after this list)
  • performance RTAO is the better option
  • PS5 uses High Textures
  • going over budget does not cause stuttering, it just causes low-res textures closer to the camera
  • performance scales with bandwidth
  • 4.5% performance loss v.high vs low on a 2070S
  • v.high textures are worth it
  • Ryzen 3600/2060S can do dynamic 4K/60 (raster only)
  • 3080 can do native 4K/~70 (raster only)
  • with RT, dynamic 1440/60
  • with RT, 3080 dynamic 4K/60
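To put those RT frame-time numbers in context, here's a minimal sketch of the arithmetic; the raster-only frame time is a made-up assumption, only the 7.2 ms / 11.2 ms RT deltas come from the video summary:

```python
# Rough frame-time math for the RT cost figures quoted above.
# base_ms is a hypothetical raster-only frame time, purely for illustration.
def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

rt_cost_ms = {"RTX 3080": 7.2, "RX 6800": 11.2}
base_ms = 14.0  # hypothetical raster-only frame time (~71 fps)

for gpu, extra in rt_cost_ms.items():
    total = base_ms + extra
    print(f"{gpu}: {fps(base_ms):.0f} fps raster -> {fps(total):.0f} fps with RT "
          f"({total:.1f} ms/frame vs the 16.7 ms budget for 60 fps)")
```

Once the combined frame time crosses the 16.7 ms budget, a fixed resolution can't hold 60 fps, which is why the video recommends leaving GPU headroom and letting dynamic resolution take up the slack.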
 
Last edited:
Whatever results he gets, as usual his comparison is flawed and pointless. He claims to fairly compare a PC GPU against the PS5 GPU, but he uses a much more powerful CPU on PC vs the mid-range CPU in the PS5. The worst part is that he often repeats that he uses such a powerful high-end CPU in order to take the CPU bottleneck out of the equation. But that can only work when comparing PC GPUs against each other, not against the PS5!

The only fair comparisons are done by NXGamer, because he uses a PC CPU that is a bit more powerful but still similar to the PS5 CPU, so he can then fairly compare the GPUs.

EDIT: apparently I was wrong, as he's using a modest 3600 for his comparisons.
 
Last edited:
Whatever results he gets, as usual his comparison is flawed and pointless. He claims to fairly compare a PC GPU against the PS5 GPU, but he uses a much more powerful CPU on PC vs the mid-range CPU in the PS5. The worst part is that he often repeats that he uses such a powerful high-end CPU in order to take the CPU bottleneck out of the equation. But that can only work when comparing PC GPUs against each other, not against the PS5!

The only fair comparisons are done by NXGamer, because he uses a PC CPU that is a bit more powerful but still similar to the PS5 CPU, so he can then fairly compare the GPUs.
"As usual his comparisons make my plastic box look bad so he's just a big ol' bully!"

He's using a Ryzen 3600 in the video and the game uses like 10-20% of it. At 60 fps the CPU is a complete non-factor in this game.
 
Last edited:

01011001

Banned
it is very surprising A: to not have an RT performance mode on PS5 and B: that RT is so low quality on PS5 as well.

I just turned on the sun shadows for the lulz on my 1070 and still got 30fps at ~1080p, so they don't seem that demanding tbh.
 

Zathalus

Member
Whatever results he gets, as usual his comparison is flawed and pointless. He claims to fairly compare a PC GPU against the PS5 GPU, but he uses a much more powerful CPU on PC vs the mid-range CPU in the PS5. The worst part is that he often repeats that he uses such a powerful high-end CPU in order to take the CPU bottleneck out of the equation. But that can only work when comparing PC GPUs against each other, not against the PS5!

The only fair comparisons are done by NXGamer, because he uses a PC CPU that is a bit more powerful but still similar to the PS5 CPU, so he can then fairly compare the GPUs.
A Ryzen 3600 is not much more powerful than the PS5 CPU. Not that it matters when the resolution scaler for this game is GPU dependent.
 

01011001

Banned
Whatever results he gets, as usual his comparison is flawed and pointless. He claims to fairly compare a PC GPU against the PS5 GPU, but he uses a much more powerful CPU on PC vs the mid-range CPU in the PS5. The worst part is that he often repeats that he uses such a powerful high-end CPU in order to take the CPU bottleneck out of the equation. But that can only work when comparing PC GPUs against each other, not against the PS5!

The only fair comparisons are done by NXGamer, because he uses a PC CPU that is a bit more powerful but still similar to the PS5 CPU, so he can then fairly compare the GPUs.

people like you, who evidently have no idea what they're talking about, should just stop posting in threads like this...

everything you said is complete nonsense. the game barely uses the CPU, and even a Zen 1 CPU would run this no problem, so even bringing up the CPU he used is fucking redundant, especially since the CPU was never limiting these tests at any point and didn't even come close to 50% utilisation

nothing in this video was in any way a wrong, misleading or unfair comparison
 
Last edited:

Md Ray

Member
It's interesting to see that you require an RTX 3090 to get a little over double the perf of PS5 at the same settings.

EyFmKOr.jpg


And on the AMD side, an RX 6800 XT, even with twice the CU count and a higher core clock, isn't enough to hit 2x the perf of PS5, but it's pretty close.

7dyclvb.jpg


So the PS5's raster perf in this game based upon the above result is around RTX 2070 non-Super level, which interestingly has the exact same specs in terms of total BW (although shared on PS5), ROPs, TMUs, SM, and shader/CUDA core count as the PS5's GPU right down to the amount of L2 cache, lol. Of course, the PS5 has higher throughput in every way possible in terms of FP32, pixel, texture rate, etc.

RTX 2070 (non-S)
pNYukqc.png

I think PS5 has 128 KB L1 cache per CU

PS5
FWF1MVK.png
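For anyone who wants to sanity-check the CU/clock comparison above, here's a rough back-of-the-envelope sketch using the standard peak-FP32 formula (shaders × 2 ops × clock); the clocks are the commonly quoted boost figures, so treat the results as napkin math rather than measured throughput:

```python
# Peak FP32 TFLOPs = shader cores x 2 (FMA) x clock (GHz) / 1000
# Clocks are the commonly quoted boost figures, not measured in-game clocks.
gpus = {
    "PS5":        (2304, 2.23),   # 36 CUs x 64 shaders, up to 2.23 GHz
    "RX 6800 XT": (4608, 2.25),   # 72 CUs, ~2.25 GHz boost
    "RTX 2070":   (2304, 1.62),   # 36 SMs x 64 CUDA cores, ~1.62 GHz boost
}

for name, (shaders, ghz) in gpus.items():
    tflops = shaders * 2 * ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPs peak FP32")
```

On paper the 6800 XT ends up at roughly 2x the PS5's peak FP32, which lines up with it landing just under 2x the measured performance here.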
 
Last edited:
people like you, who evidently have no idea what they're talking about, should just stop posting in threads like this...

everything you said is complete nonsense. the game barely uses the CPU, and even a Zen 1 CPU would run this no problem, so even bringing up the CPU he used is fucking redundant, especially since the CPU was never limiting these tests at any point and didn't even come close to 50% utilisation

nothing in this video was in any way a wrong, misleading or unfair comparison
Indeed I missed the Ryzen 3600 part. That's an improvement indeed compared to some of his other comparisons.
 

SlimySnake

Flashless at the Golden Globes
It's interesting to see that you require an RTX 3090 to get a little over double the perf of PS5 at the same settings.

EyFmKOr.jpg
36 tflops lol. 3.6x more tflops, only 2x more performance.

Hopefully Nvidia goes back to the drawing board with their next gen cards because just increasing shader processors isn't enough. DLSS 2.0 saved their asses, but even at 30 tflops, the 3080 was only 1.8x more powerful despite having almost 3x more tflops.

AMD's 6900xt when overclocked can already beat out a 3090 in some non-ray traced games. That's a $1,000 GPU beating out a $1,500 GPU.
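Just to illustrate the scaling the post is describing, a quick sketch of perf-per-paper-TFLOP; the performance multipliers are the poster's round numbers and the TFLOP figures are the usual quoted specs, not anything re-measured:

```python
# Perf-per-paper-TFLOP. The "perf vs PS5" multipliers are the poster's
# round numbers, and the TFLOP figures are the usual quoted specs.
cards = {
    #            paper TFLOPs, perf vs PS5
    "PS5":      (10.3, 1.0),
    "RTX 3080": (29.8, 1.8),
    "RTX 3090": (35.6, 2.1),
}

for name, (tf, perf) in cards.items():
    print(f"{name}: {perf / tf:.3f} relative perf per paper TFLOP")
```

Per paper TFLOP, the Ampere cards land well below the PS5, which is why the raw TF comparison looks so lopsided.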
 

Andodalf

Banned
Agreed. I've only played 2 console games where the ray-tracing blew me away: Ratchet and Clank Rift Apart (PS5) and Metro Exodus (PS5).

I'd rather developers drop the RT and just use all the horsepower they can to get native 4K/60fps. Heck, I'd settle for 1440p/60+fps (my gaming PC) on current gen consoles.

Uh, that’s what most every game is trying to do? Including this one?
 

01011001

Banned
Indeed I missed the Ryzen 3600 part. That's an improvement indeed compared to some of his other comparisons.

it wouldn't matter either way, there's barely a single game on the market that can't run at a locked 60fps with the CPU of the PS5.
it's usually only if you try high framerates like 120fps and above when the CPU really will limit you on PC. even using an old Zen 1 CPU you are usually absolutely fine running almost anything at 60fps, and Zen 1 is notoriously bad for gaming compared to Intel CPUs of the same time
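The frame-budget arithmetic behind that point, with a hypothetical per-frame CPU cost purely for illustration:

```python
# CPU frame budget at different target framerates.
# cpu_ms_per_frame is a hypothetical per-frame CPU cost, purely illustrative.
cpu_ms_per_frame = 9.0

for target_fps in (60, 120):
    budget_ms = 1000 / target_fps
    verdict = "fits" if cpu_ms_per_frame <= budget_ms else "does NOT fit"
    print(f"{target_fps} fps target: {budget_ms:.1f} ms budget -> "
          f"a {cpu_ms_per_frame} ms CPU frame {verdict}")
```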
 

rofif

Banned
That's my point! As usual there is a problem with the PC version.
Stutter every 600 frames, weird mouse movement... and some other issues.
I am sure it will be fixed asap, but c'mon... you meet the recommended specs, you spent 2k USD on the PC, you have updated drivers and everything.
If I have to spend 5 minutes troubleshooting, I am already annoyed... and I will spend way longer.
I am glad I did not get this on PC yet. That mouse stuff would drive me crazy and I would spend way too much time analyzing it.
Sure, 3080 can do 4k60 no problem still.

For real though - sorry for repeating myself on this in every pc topic. It's just years and years of the hobby and I would expect more at this point and price.

edit: And wtf, why do we need 5 settings for CACO ambient occlusion when they're 3 fps apart from low to ultra? Or model detail low vs ultra... 3 fps.
Like c'mon, settings this granular seem to be there just for the sake of "we need a lot of settings on PC".
Make settings reasonable. Allow us to REALLY turn down graphics to REALLY help with performance. As it is, maxed out, the game does not look too different from the PS5 version. Actually it looks identical aside from resolution and some slight loss of sharpness.
 
Last edited:

01011001

Banned
Agreed. I've only played 2 console games where the ray-tracing blew me away: Ratchet and Clank Rift Apart (PS5) and Metro Exodus (PS5).

I'd rather developers drop the RT and just use all the horsepower they can to get native 4K/60fps. Heck, I'd settle for 1440p/60+fps (my gaming PC) on current gen consoles.

uhm... that's what almost every dev does. there's almost always a dedicated RT mode and one or more other modes that either prioritize resolution or performance
 
36 tflops lol. 3.6x more tflops, only 2x more performance.

Hopefully Nvidia goes back to the drawing board with their next gen cards because just increasing shader processors isn't enough. DLSS 2.0 saved their asses, but even at 30 tflops, the 3080 was only 1.8x more powerful despite having almost 3x more tflops.

AMD's 6900xt when overclocked can already beat out a 3090 in some non-ray traced games. That's a $1,000 GPU beating out a $1,500 GPU.

TF count on Ampere cards is pointless and shouldn't be used to compare them with anything. Ampere doubles its paper FP32 number by counting the datapath that is shared with INT32 work, so real game throughput doesn't scale with it.

 
Last edited:

01011001

Banned
Hahaha. It only depends on how shitty the PC version is.

PS5 GPU is ~5700XT in pure raster, or something between 2070 and 2070 Super.

This is it and math backs it up.

yeah, but it helps a lot to not have Windows running and using a set hardware spec + specific APIs developed for said hardware.

so in real world performance it's usually better than that when properly targeted.

Arkane is not the best when it comes to console ports, as we know from Dishonored 2 and Prey, both of which were awful on most consoles: Prey was only barely playable on a One X, and all the other systems gave a terrible experience with stutters and crazy input latency.
meanwhile the PC version was among the best CryEngine titles in terms of optimisation that I ever played
 
Last edited:

Armorian

Banned
Why is it that when the PS5 outperforms the 5700XT it's because of bad PC optimization, but when it's the same performance it's the only trustworthy scenario? Based on what?

5700XT is ~10TF

We know that RDNA1 and RDNA2 GPUs have pretty much the same IPC despite AMD claiming otherwise:



So a ~10TF RDNA1 GPU should perform almost exactly like the PS5 GPU. This game and some others show this. Some other games don't, and that includes a turd like AC Valhalla.

When the PC port is competent, the PC version should perform the same as the PS5 version at the same settings on a 5700XT. There is no way around this, no matter if the PS5 GPU is RDNA1, 2 or maybe even 3 :messenger_tears_of_joy:
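A quick sketch of the math being referenced, under the post's assumption that RDNA1 and RDNA2 have the same IPC; the clocks are the quoted boost/cap figures, not sustained in-game clocks:

```python
# Under the post's assumption that RDNA1 and RDNA2 IPC is the same,
# peak FP32 (shaders x 2 x clock) is the main lever. Clocks are the
# quoted boost/cap figures, not sustained in-game clocks.
rdna = {
    "RX 5700 XT": (2560, 1.905),  # 40 CUs, ~1.905 GHz boost
    "PS5":        (2304, 2.23),   # 36 CUs, capped at 2.23 GHz
}

tf = {name: shaders * 2 * ghz / 1000 for name, (shaders, ghz) in rdna.items()}
for name, val in tf.items():
    print(f"{name}: {val:.1f} TFLOPs")
print(f"PS5 vs 5700 XT: {tf['PS5'] / tf['RX 5700 XT']:.2f}x")
```

On paper the two land within about 5% of each other, which is why a competent port should put a 5700 XT in the same ballpark as the PS5 at matched settings.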
 

assurdum

Banned
5700XT is ~10TF

We know that RDNA1 and RDNA2 GPUs have pretty much the same IPC despite AMD claiming otherwise:



So a ~10TF RDNA1 GPU should perform almost exactly like the PS5 GPU. This game and some others show this. Some other games don't, and that includes a turd like AC Valhalla.

When the PC port is competent, the PC version should perform the same as the PS5 version at the same settings on a 5700XT. There is no way around this, no matter if the PS5 GPU is RDNA1, 2 or maybe even 3 :messenger_tears_of_joy:

And what about the gpu frequency? Pixel fill rate and other specs tied to the GPU speed?
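Rough numbers for the fill-rate part of that question, assuming the commonly cited 64 ROPs on both parts and the usual quoted clocks (approximations, not verified specs):

```python
# Peak pixel fill rate = ROPs x clock (GHz), in Gpixels/s.
# ROP counts and clocks are the commonly cited figures, not verified specs.
gpus = {
    "RX 5700 XT": (64, 1.905),
    "PS5":        (64, 2.23),
}

for name, (rops, ghz) in gpus.items():
    print(f"{name}: {rops * ghz:.0f} Gpixels/s peak fill rate")
```

Same ROP count, so the PS5's higher clock is worth roughly 17% more peak pixel fill rate on paper.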
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Hahaha. It only depends on how shitty the PC version is.

PS5 GPU is ~5700XT in pure raster, or something between 2070 and 2070 Super.

This is it and math backs it up.
huh? I am not the one saying this. Alex is. His own tests back it up.

You can't take one test that backs up the results you like and then ignore the ones you don't.

Bizarre post.
 

assurdum

Banned
Comparing just the TFLOPs (theoretical peak VALU usage) count between GPUs with entirely different architectures & customizations & deciding which one's the most powerful based on that is one of the dumbest things I've heard of… and yet here we are.
IMO the PS5 GPU is a very particular GPU; counting just TF to measure its specs, with its much higher frequency, takes a lot of stuff out of the equation.
 
Last edited:
IMO the PS5 GPU is a very particular GPU; counting just TF to measure its specs, with its much higher frequency, takes a lot of stuff out of the equation.
This is true. Various DF and other outlet testing did show that the PS5 can hit 2080 levels of performance in certain cases (non-RT). On the other hand, when RT is in the picture, a 2060S can match it. Just comparing TF counts doesn't exactly show the big picture.
 

Armorian

Banned
And what about the gpu frequency? Pixel fill rate and other specs tied to the GPU speed?

I would say it's mostly irrelevant, and a ~10TF RDNA 1/2 card should be comparable in raster as long as it's not memory BW limited or something.

huh? I am not the one saying this. Alex is. His own tests back it up.

You can't take one test that backs up the results you like and then ignore the ones you don't.

Bizarre post.

This is also what Alex says:

INsSPXK.png


But I'm not saying that PS5 is worse than a 2060S in raster. You can see where PS5 lands based on pure specs alone, and in most games I think it shows results similar to DL.

Comparing just the TFLOPs (theoretical peak VALU usage) count between GPUs with entirely different architectures & customizations & deciding which one's the most powerful based on that is one of the dumbest things I've heard of… and yet here we are.

RDNA1 and RDNA2 cards can totally be compared to each other and to the PS5, it's the same fucking arch.
 
Last edited: