
Digital Foundry: Diablo 4 - Digital Foundry Tech Review - PS5 vs Xbox Series X/S vs PS4

Gaiff

SBI’s Resident Gaslighter
It's PoE, but it's the same kind of game as D4, and Diablo 4 has far fewer monsters. Consoles don't have an X3D CPU.
What has this got to do with Diablo 4? Oliver tested playing alone and didn't see dropped frames in large combat scenarios but did see dropped frames in the central city which is where Tom's Hardware tested. The implication is that the city is more demanding than even big fights, making it a good stress test.
 
Last edited:

GymWolf

Member
"Character models in D4 are far more complex with much higher material qualities than D3." D3 is ten years old at this point; I don't know why this is praiseworthy. IMO, D4 looks rather bad for a 2023 title.

People have strangely low standards when it comes to isometric games and beat 'em ups.

I guess they can't push the graphics too much with the number of mobs and effects on screen at any given moment, but it's kinda funny that these consoles can't even do native 4K at 60 with a game like this.
 

SolidQ

Member
Oliver tested playing alone and didn't see dropped frames in large combat scenarios but did see dropped frames in the central city
He tested solo, with low monster density. You need a max party, plus effects, plus a lot of monsters, on both PC and consoles.

where Tom's Hardware tested
The consoles have a similar CPU, but the video shows solo play with low density. That's why Blizzard locked it to 60fps.

What has this got to do with Diablo 4
It's a CPU test with similar games, and Blizzard can add more monsters for the endgame in the future.
 

Mr.Phoenix

Member
Either they were really careful with the res and left a lot of performance on the table, or there is a bottleneck somewhere else.

[GPU benchmark chart]


A 3050 offers similar performance at a higher resolution. Granted, the consoles probably don't drop to 51fps, but 1440p is also a bit higher than 1260p. A 2070 completely outclasses them, which shouldn't happen unless, again, there is a bottleneck somewhere else.
And that is exactly my point. And what you are doing is actually being a lot nicer than me, because I am flat-out calling BS.

There is just no way... NO WAY that an RX 5600 XT (a 7TF RDNA1 GPU) or an RTX 2060 (also a ~7TF GPU) performs better than a PS5/XSX, whose GPUs are roughly equivalent to an RX 6700 (an 11TF GPU). I mean, the consoles are reconstructing up from a native ~1200p, for crying out loud. And only barely hitting 60fps? I say barely because there are still drops.

Unless there's a major bottleneck elsewhere, which I honestly can't even wrap my head around given what I have seen first-party devs do, this kind of result comes down to poor optimization. On the surface it looks great because we are getting 60fps. But if you really look at it, you see how bad it is.
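To put the resolution gap in numbers, here's a quick back-of-envelope pixel-count comparison (my own arithmetic, assuming 16:9 for every resolution mentioned in the thread):

```python
# Back-of-envelope pixel counts for the resolutions discussed above,
# assuming a 16:9 aspect ratio for every figure.
def pixels(height: int) -> int:
    """Pixel count of a 16:9 frame with the given vertical resolution."""
    width = height * 16 // 9
    return width * height

console = pixels(1260)     # ~1260p internal on PS5/XSX
bench_1440 = pixels(1440)  # the PC benchmark resolution
native_4k = pixels(2160)

print(f"1260p: {console / 1e6:.2f} MP")
print(f"1440p is {bench_1440 / console:.2f}x the pixels of 1260p")
print(f"native 4K is {native_4k / console:.2f}x the pixels of 1260p")
```

1440p works out to about 1.3x the pixels of 1260p, and native 4K to nearly 3x, so the PC benchmark resolutions are meaningfully heavier than the console internal resolution even before FSR2's own cost is counted.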
 

Thirty7ven

Banned
People have strangely low standards when it comes to isometric games and beat 'em ups.

I guess they can't push the graphics too much with the number of mobs and effects on screen at any given moment, but it's kinda funny that these consoles can't even do native 4K at 60 with a game like this.

Doesn't seem that well optimized for consoles; looking at the PC benchmarks, they should be performing better.

Blizzard isn’t known for their tech anyway.
 

SolidQ

Member
Unless there's a major bottleneck elsewhere, which I honestly can't even wrap my head around being that I have seen what first-party devs are able to do
Remember, ARPGs are mostly CPU-limited games. D4 has a 400fps cap, and that test was with a 7800X3D; the rest of the CPUs fall below 400fps.
 

Kataploom

Gold Member
My concern with FSR-type tech and what I had feared since these consoles were announced, is that I believe it would make devs lazy. It gives them an out.

At the end of the day, we get a good-looking and performing game, but I can't help but feel devs will put in less work trying to get the most out of these consoles when they know they can always use stuff like FSR to bail themselves out.
If games run and look good, why does the effort put into them matter? In the end, the result is what counts.
 

adamsapple

Or is it just one of Phil's balls in my throat?
People have strangely low standards when it comes to isometric games and beat 'em ups.

I guess they can't push the graphics too much with the number of mobs and effects on screen at any given moment, but it's kinda funny that these consoles can't even do native 4K at 60 with a game like this.

Not sure why you think it's people's low standards. The DF review is pretty glowing in terms of the effects the game is using, and most importantly, this is a launch across 5 consoles (8 if you count the variations) plus PC that has come out without 90% of the issues almost all modern games are plagued with for weeks, if not months, after launch.

It's commendable.
 

Mr.Phoenix

Member
If games run and look good, why does the effort put into them matter? In the end, the result is what counts.
In a vacuum, this would be ok.

But when you can see how much better it could run, saying "well, at least it looks good at 60fps" is settling for far less than these consoles can deliver.
Remember, ARPGs are mostly CPU-limited games. D4 has a 400fps cap, and that test was with a 7800X3D; the rest of the CPUs fall below 400fps.
Understandable, but this game is not CPU-limited on a PS5/XSX. Six or seven single-threaded Jaguar cores clocked at 1.8GHz manage to run this at 30fps, on a 1.8TF GPU with a max bandwidth of 182GB/s.

6 Jaguar cores at 1.8 GHz.
 

DavidGzz

Member
No agenda here; I'll buy the game and enjoy it like I did with D1-3.

Just played Horizon; why a cross-gen, two-year-old, open-world game where you can fly sky-high looks better than an isometric game is a mystery to me.
D4 had every opportunity to be a graphics showcase: slow-moving, isometric view, no draw distance to speak of, etc. Particle/blood/gore effects should be crazy in this game, and they don't appear to be, going by all the content I've seen from it so far (and what I witnessed in the beta). Demon's Souls on PS5 seems to have more geometric detail than this. Why? :(

Diablo 4 is on PS4 and a WIDE selection of old-to-new PC setups. It can downscale enough to run on PCs weaker than a PS4 or stress a high-end PC, and they did an amazing job. Demon's is a showcase for the PS5; it was optimized for exactly one hardware specification. This is Blizzard's prettiest game yet, and it blows every other game in the genre away, but we're still bitching.

Another thing: how is D4 going to look as good as Demon's with dozens of monsters and 12 players shooting off spells? Make that happen in Demon's Souls and watch the PS5 explode, or the game would simply freeze up.
 
Last edited:

Bry0

Member
So what? He took a few minutes to show how things have changed since D3.

Yeah.....quite a controversy there. /s
Seriously. D2 didn't look that amazing in 2000, and D3 honestly looked like a 2008 game, not a 2012 game. D4 looks good for what it is; who cares? As long as it's smooth and plays well, which it does.
 
Last edited:

GymWolf

Member
Not sure why you think it's people's low standards. The DF review is pretty glowing in terms of the effects the game is using, and most importantly, this is a launch across 5 consoles (8 if you count the variations) plus PC that has come out without 90% of the issues almost all modern games are plagued with for weeks, if not months, after launch.

It's commendable.
Oh, for sure; the game looking good everywhere and not being riddled with bugs is great. I was not talking about that.

Although last night the lag and rubber-banding were kinda unbearable.
 

DenchDeckard

Moderated wildly
Maybe the game is more CPU-bound? That's where these consoles might suffer a touch.

It looks great on Xbox. Obviously not as good as my 4090 PC, but it's a great-performing version, and it's nice to jump on when I'm in bed.
 

I Master l

Banned
And that is exactly my point. And what you are doing is actually being a lot nicer than me, because I am flat-out calling BS.

There is just no way... NO WAY that an RX 5600 XT (a 7TF RDNA1 GPU) or an RTX 2060 (also a ~7TF GPU) performs better than a PS5/XSX, whose GPUs are roughly equivalent to an RX 6700 (an 11TF GPU). I mean, the consoles are reconstructing up from a native ~1200p, for crying out loud. And only barely hitting 60fps? I say barely because there are still drops.

Unless there's a major bottleneck elsewhere, which I honestly can't even wrap my head around given what I have seen first-party devs do, this kind of result comes down to poor optimization. On the surface it looks great because we are getting 60fps. But if you really look at it, you see how bad it is.

FSR2 is not free, so you need to add that to the equation.
 

Kataploom

Gold Member
In a vacuum, this would be ok.

But when you can see how much better it could run, saying "well, at least it looks good at 60fps" is settling for far less than these consoles can deliver.

Understandable, but this game is not CPU-limited on a PS5/XSX. Six or seven single-threaded Jaguar cores clocked at 1.8GHz manage to run this at 30fps, on a 1.8TF GPU with a max bandwidth of 182GB/s.

6 Jaguar cores at 1.8 GHz.
Well, it's not that they need to work less; they already work hard enough, which is why performance and polish fall by the wayside. The real bottleneck is the disparity between budget, human resources, and project complexity, though.

BTW, I've never played Diablo; I hope it comes to Game Pass so I can try it lol
 

GymWolf

Member
Doesn't seem that well optimized for consoles; looking at the PC benchmarks, they should be performing better.

Blizzard isn’t known for their tech anyway.
I only had network-related problems; the game runs pretty fine on console otherwise.
 

sankt-Antonio

:^)--?-<
Diablo 4 is on PS4 and a WIDE selection of old-to-new PC setups. It can downscale enough to run on PCs weaker than a PS4 or stress a high-end PC, and they did an amazing job. Demon's is a showcase for the PS5; it was optimized for exactly one hardware specification. This is Blizzard's prettiest game yet, and it blows every other game in the genre away, but we're still bitching.
Horizon is also on PS4 (while looking better than PS4 D4). I don't know why I shouldn't hold a billion-dollar company's effort to the same standard I hold PS Studios games to (just one example).

Every game can downscale, as we know, since the Series S is doing exactly that. It's completely beside the point. This game looks like it belongs in 2018, not 2023, purely from a technical standpoint; the art is great. That's all I'm saying.

Oh, and the only Diablo competition at the moment is a free-to-play game that's also about a decade old.

Edit: Spelling.
 
Last edited:

DavidGzz

Member
Horizon is also on PS4 (while looking better than PS4 D4). I don't know why I shouldn't hold a billion-dollar company's effort to the same standard I hold PS Studios games to (just one example).

Every game can downscale, as we know, since the Series S is doing exactly that. It's completely beside the point. This game looks like it belongs in 2018, not 2023, purely from a technical standpoint; the art is great. That's all I'm saying.

Oh, and the only Diablo competition at the moment is a free-to-play game that's also about a decade old.

Edit: Spelling.

Still, link me a video where the amount of stuff that can happen in an isometric ARPG happens in any of your graphical showcases. In Horizon you fight one large beast, maybe two, and it's single-player. In D4 it's a huge group with 11 other players shooting off spells. The same reason something like Monster Hunter won't look as good as Resident Evil 4 is why a Diablo-like will not look as good as a character action game. Use logic. Maybe buy and play the game and then judge it. It looks and plays amazingly for everything that can happen on screen. This argument has no legs.
 
Last edited:

sankt-Antonio

:^)--?-<
Because you should be comparing apples to apples, not apples to kumquats.
AAA to AAA is apples to apples. Genre doesn't define or restrict technical capability. D4 has the best-case scenario for being a graphics showcase with its isometric view (no pop-in because of the ultra-small draw distance, no sudden camera turns) and its play style (slow-paced and super controlled).

Why a 3D open-world game is expected to look better than this is beyond me. It makes absolutely no sense.
 

DavidGzz

Member
AAA to AAA is apples to apples. Genre doesn't define or restrict technical capability. D4 has the best-case scenario for being a graphics showcase with its isometric view (no pop-in because of the ultra-small draw distance, no sudden camera turns) and its play style (slow-paced and super controlled).

Why a 3D open-world game is expected to look better than this is beyond me. It makes absolutely no sense.

Also, another thing: have you seen how the graphics work in Horizon? Everything out of the camera is gone. As you rotate it, the graphics behind you aren't rendered. And yes, it is NOT apples to apples. They are going for different things and use different techniques. There is a reason graphics look so different, even from the same dev, depending on the genre. Jesus.
 

Mr.Phoenix

Member
Just to be clear... I am not saying this isn't good enough. It is. I'm just saying that they grossly underutilized these current-gen consoles, and pointing out that it's a growing trend among third-party devs.
FSR2 is not free, so you need to add that to the equation.
And I did... it doesn't account for the disparity either. If you want, I can also show you PC benchmarks where FSR was used... it paints an even worse picture.
Well, it's not that they need to work less; they already work hard enough, which is why performance and polish fall by the wayside. The real bottleneck is the disparity between budget, human resources, and project complexity, though.

BTW, I've never played Diablo; I hope it comes to Game Pass so I can try it lol
Maybe, and this kinda brings me to my point. They are choosing to aim for the lowest profile possible that hits 60fps on current-gen, so they don't spend too much time getting the best they can out of it.
I only had network-related problems; the game runs pretty fine on console otherwise.
Runs fine... but it should be running a lot better than it is, considering the hardware in these consoles. And the proof is right out there for all to see.
Look at the video above with the R3600, especially at the 3:55 timestamp, then add more monsters, a party, etc., and it would drop even more.
Again... a Jaguar CPU with 6 cores/threads, from 2013 and clocked at 1.8GHz, is running this game at 30fps.

If you know or understand anything about CPU loads and game logic... do you really think that going from that to a Zen 2 CPU with 7 cores/14 threads, clocked at 3.5GHz, translates to a CPU limitation?
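A rough sketch of that argument in numbers; the clocks are the published console figures, but the ~2x IPC ratio is my own illustrative assumption, not something from the thread:

```python
# Rough per-core throughput gain going from a last-gen Jaguar core
# to a console Zen 2 core. The clocks are the published console
# figures; the IPC ratio is an assumption for illustration only.
JAGUAR_CLOCK_GHZ = 1.6  # PS4 CPU clock
ZEN2_CLOCK_GHZ = 3.5    # PS5/XSX CPU clock (SMT enabled)
IPC_RATIO = 2.0         # assumed Zen 2 vs Jaguar instructions per clock

per_core_gain = (ZEN2_CLOCK_GHZ / JAGUAR_CLOCK_GHZ) * IPC_RATIO
print(f"~{per_core_gain:.1f}x per-core throughput")

# A CPU workload that holds 30fps needs only 2x to reach 60fps,
# so the leftover per-core headroom is roughly:
headroom = per_core_gain / 2
print(f"~{headroom:.1f}x headroom beyond 60fps")
```

Under those assumptions the Zen 2 cores bring roughly 4x the per-core throughput, so doubling a 30fps last-gen CPU workload to 60fps should still leave around half the budget spare, which is the thrust of the post above.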
 

Kataploom

Gold Member
Maybe, and this kinda brings me to my point. They are choosing to aim for the lowest profile possible that hits 60fps on current-gen, so they don't spend too much time getting the best they can out of it.
Well, yeah, they're developing for older consoles too, but IMO getting the most out of the performance is a great outcome as well. Maybe you're just referring to graphical complexity, which I'd understand, but I still prefer the focus on performance.
 

sankt-Antonio

:^)--?-<
Also, another thing: have you seen how the graphics work in Horizon? Everything out of the camera is gone. As you rotate it, the graphics behind you aren't rendered. And yes, it is NOT apples to apples. They are going for different things and use different techniques. There is a reason graphics look so different, even from the same dev, depending on the genre. Jesus.
Of course they do that; Diablo does it too. The thing is, the slower and more predictable your camera movement, the easier it is, because you have more time to stream in assets (of theoretically higher fidelity, or more unique assets in general).

You don't have to render a thousand trees at varying LOD stages, cloud simulations, and distant mountains; in an isometric game, all you render is a piece of rather flat terrain moving in a predictable way.

It just doesn't look next-gen. I don't know why this is such a hot take. Even DF said so in their video.
 

DavidGzz

Member
Of course they do that; Diablo does it too. The thing is, the slower and more predictable your camera movement, the easier it is, because you have more time to stream in assets (of theoretically higher fidelity, or more unique assets in general).

You don't have to render a thousand trees at varying LOD stages, cloud simulations, and distant mountains; in an isometric game, all you render is a piece of rather flat terrain moving in a predictable way.

It just doesn't look next-gen. I don't know why this is such a hot take. Even DF said so in their video.

Still not apples to apples, or all game genres would look the same. Take an RTS versus a racing game, for example: they look wildly different even if they come from the same dev, because they have different constraints. Plain and simple. Until you become a dev who successfully launches games in two different genres, you really don't have anything to stand on.
 

SolidQ

Member
that to a Zen 2 CPU with 7 cores/14 threads, clocked at 3.5GHz, translates to a CPU limitation?
They can go higher, but it's limited to 60fps for stability. We'd need an unlocked-framerate mod to know for sure.
There are also the PoE videos from the previous page. You can see more monsters/effects = FPS drops. Maybe Blizzard is testing new endgame content with higher monster density right now.

A Jaguar CPU with 6 cores/threads, from 2013 and clocked at 1.8GHz, is running this game at 30fps.
There's no test with a max party plus effects yet; we need to wait for that. PoE on PS4 with high monster density ran at a very low FPS.
 

Bojji

Member
Either they were really careful with the res and left a lot of performance on the table, or there is a bottleneck somewhere else.

[GPU benchmark chart]


A 3050 offers similar performance at a higher resolution. Granted, the consoles probably don't drop to 51fps, but 1440p is also a bit higher than 1260p. A 2070 completely outclasses them, which shouldn't happen unless, again, there is a bottleneck somewhere else.

And that is exactly my point. And what you are doing is actually being a lot nicer than me, because I am flat-out calling BS.

There is just no way... NO WAY that an RX 5600 XT (a 7TF RDNA1 GPU) or an RTX 2060 (also a ~7TF GPU) performs better than a PS5/XSX, whose GPUs are roughly equivalent to an RX 6700 (an 11TF GPU). I mean, the consoles are reconstructing up from a native ~1200p, for crying out loud. And only barely hitting 60fps? I say barely because there are still drops.

Unless there's a major bottleneck elsewhere, which I honestly can't even wrap my head around given what I have seen first-party devs do, this kind of result comes down to poor optimization. On the surface it looks great because we are getting 60fps. But if you really look at it, you see how bad it is.

You are missing something here: we don't know the average framerate of the console version because it's locked to 60FPS. We only know that it drops to 59FPS, and that's probably not even a GPU drop but something to do with the CPU or I/O. But even if it is a GPU drop, that puts the console versions around a 3060.

FSR2 reconstruction needs resources, so 4K reconstructed from the resolution used on consoles is probably more taxing than "naked" 1440p.

Without knowing the average framerate, we can't compare the consoles to PC parts in this game.
 

Mr Moose

Member
And that is exactly my point. And what you are doing is actually being a lot nicer than me, because I am flat-out calling BS.

There is just no way... NO WAY that an RX 5600 XT (a 7TF RDNA1 GPU) or an RTX 2060 (also a ~7TF GPU) performs better than a PS5/XSX, whose GPUs are roughly equivalent to an RX 6700 (an 11TF GPU). I mean, the consoles are reconstructing up from a native ~1200p, for crying out loud. And only barely hitting 60fps? I say barely because there are still drops.

Unless there's a major bottleneck elsewhere, which I honestly can't even wrap my head around given what I have seen first-party devs do, this kind of result comes down to poor optimization. On the surface it looks great because we are getting 60fps. But if you really look at it, you see how bad it is.
CPU, maybe. It's a bit shit that they're locked to 60fps too; they can clearly go above it.
 

intbal

Member
I did not realize there was a last-gen version. Was there a Cliffs Notes version of the difference in load times between current and last gen?

This channel always does loading-time tests, although in this case he only did one: the initial game load. 50 seconds on current gen, 3 minutes on last gen.

 

Gaiff

SBI’s Resident Gaslighter
You are missing something here: we don't know the average framerate of the console version because it's locked to 60FPS. We only know that it drops to 59FPS, and that's probably not even a GPU drop but something to do with the CPU or I/O. But even if it is a GPU drop, that puts the console versions around a 3060.

FSR2 reconstruction needs resources, so 4K reconstructed from the resolution used on consoles is probably more taxing than "naked" 1440p.

Without knowing the average framerate, we can't compare the consoles to PC parts in this game.
Sure, but then you look at the RX 6600 XT with an 88fps average and 75fps 1% lows. Even the 2070's 1% lows are 64fps, and that's at Ultra settings. Admittedly, the comparison is muddy because the same areas and settings aren't being tested, so it's hard to do a 1:1 comparison, but this game seems fairly light on hardware demands, and 1260p seems a bit low for what's on display. I honestly would have expected the consoles to manage something like 1620p/60fps.
 

SlimySnake

Flashless at the Golden Globes
You are missing something here: we don't know the average framerate of the console version because it's locked to 60FPS. We only know that it drops to 59FPS, and that's probably not even a GPU drop but something to do with the CPU or I/O. But even if it is a GPU drop, that puts the console versions around a 3060.

FSR2 reconstruction needs resources, so 4K reconstructed from the resolution used on consoles is probably more taxing than "naked" 1440p.

Without knowing the average framerate, we can't compare the consoles to PC parts in this game.
Sure, but then you look at the RX 6600 XT with an 88fps average and 75fps 1% lows. Even the 2070's 1% lows are 64fps, and that's at Ultra settings. Admittedly, the comparison is muddy because the same areas and settings aren't being tested, so it's hard to do a 1:1 comparison, but this game seems fairly light on hardware demands, and 1260p seems a bit low for what's on display. I honestly would have expected the consoles to manage something like 1620p/60fps.
It's also possible that the devs felt native 1440p/60fps looked worse than 1296p reconstructed to 4K with FSR Balanced. DLSS can definitely clean up the image sometimes, so you never know; FSR might be producing a better-looking image.

They should probably release a VRR mode and uncap the framerate. They probably looked at the worst-case scenario of four players in a crazy dungeon and said, OK, 1296p it is.
 
My concern with FSR-type tech and what I had feared since these consoles were announced, is that I believe it would make devs lazy. It gives them an out.

At the end of the day, we get a good-looking and performing game, but I can't help but feel devs will put in less work trying to get the most out of these consoles when they know they can always use stuff like FSR to bail themselves out.

I think you're right. In fact, D4 DOES have a soft look to it. Compare it to the image quality of D3 on the Xbox One X, which is pin-sharp, pristine native 4K.
 

sankt-Antonio

:^)--?-<
Still not apples to apples or all game genres would look the same. An RTS versus a racing game for example they look wildly different even if they came from the same Dev because they have different restrictions. Plain and simple. Until you become a Dev that successfully launches games in two different genres you really don't have anything to stand on.
So DF can't comment on a game's graphics? Because as far as I know, none of them have shipped a game...
 

DavidGzz

Member
So DF can't comment on a game's graphics? Because as far as I know, none of them have shipped a game...

They aren't comparing apples and oranges lol. They compared it to the last entry. Thanks for showing how dumb it is to expect different genres doing different things to be comparable.
 

DavidGzz

Member
Anyway, I just read about DLAA and turned it on; this game is beautiful. I hate jaggies, and DLAA wipes them away completely. The last thing that had them was my rogue's bowstring, no matter how high the graphics were. Gone.
 

Godfavor

Member
Either they were really careful with the res and left a lot of performance on the table, or there is a bottleneck somewhere else.

[GPU benchmark chart]


A 3050 offers similar performance at a higher resolution. Granted, the consoles probably don't drop to 51fps, but 1440p is also a bit higher than 1260p. A 2070 completely outclasses them, which shouldn't happen unless, again, there is a bottleneck somewhere else.
There is nothing wrong with the consoles here.

First, the consoles use FSR 2.0, which is taxing and comparable to a native 1440p load.

Second, the CPUs used in those PC comparisons are probably far more powerful than the console-equivalent CPU, which could add 10-20% better performance.

Third, the one-FPS drops (it's not even a 1% low, as this chart shows) in the console video from Digital Foundry are unrelated to a GPU or CPU bottleneck; as mentioned in the video, it's a traversal thing. So we don't know the average FPS of the console version, nor the minimum FPS (big fights didn't manage to drop the framerate).
 
Last edited:

Pimpbaa

Member
HDR? Good, bad, nonexistent? It's an important part of graphics for a lot of people (those playing on a good modern TV).
 

hlm666

Member
Understandable, but this game is not CPU limited on a PS5/XSX.
I think it might be. After looking at some other benchmarks, a 3600X with a 2070 does about 75fps at native 1440p, but the GPU only sits at 60-70% utilization. If that's the case, though, I still don't understand why they didn't bump the base resolution, unless it becomes GPU-bound in the endgame, although from what I've seen it still seems CPU-bound in the endgame on high-end rigs. Being CPU-bound is also why DLSS 3 frame generation gets such a big uplift in this game, I guess.

I don't think the console CPUs are quite as close to the 3700X as many of us thought; an old 6700K seems to outperform them in this scenario at least. A 6700K with a 1080 Ti ran the beta at 60-90fps at 1080p with 150% resolution scaling (1620p), and that kept GPU utilization between 95-99%. I'm not sure if there was a performance improvement for the final game, so the 6700K might be a bit better now as well. Anyway, I pretty much answered my own question and satisfied my curiosity. It might also go a long way toward answering SlimySnake's question about why some other co-op games were recently 30fps: something has to give when your power budget is ~200 watts for the whole system versus a system that uses more than that for the CPU alone.
 

CGNoire

Member
DF does this all the time. They don't know how to judge graphics.
Came here to post this.

Naysayers will claim they made the comparison because it's the same series, but those in the know know full well why DF chose to compare it against that 10+ year old, cartoony-ass game. DF is fucking shameless.

They did the same thing with DS and pretended the whole time as if it wasn't 11+ years older and two gens behind.
 
Last edited:

CGNoire

Member
I mean, they're just comparing the latest numbered game with the last one that came out lol.

Not sure why this is a controversial issue. When Uncharted 4 came out, DF compared its details with the previous games, as an example, regardless of whether they were on the last gen.
It's the way they phrase their compliments, with words like "impressive", as if any significant visual improvement is impressive for a game 10+ years later.

The word "impressive" basically has zero meaning when they say it.
 