
[DF] Doom Eternal PS5 vs Xbox Series S/X

longdi

Banned
AFAIK we don't know. Have Mesh Shaders and Sampler Feedback along with dedicated hardware support for them been confirmed on PS5? I genuinely don't know.

What we do know is that it is confirmed to not have hardware support for VRS, which means it is not "full" RDNA2. Whether that's an actual issue is debatable, but it is true, and you should probably stop arguing.

Machine learning too.

VRS, Mesh Shaders, 2x (IIRC) more efficient ROPs, SFS and ML. All nice complementary efficiency boosters of RDNA2 that Phil held out for. The raw numbers and the soft new features :messenger_smiling_with_eyes:

MS is going big on DirectML
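For readers on PC wondering what "hardware support" means here: these features are exposed as D3D12 capability tiers that an app can query. A minimal sketch, assuming an already-created `ID3D12Device*` (console SDKs expose equivalents differently):

```cpp
// Hedged sketch: querying the four DX12 Ultimate feature tiers on PC.
// Assumes `device` is an already-created ID3D12Device*.
#include <d3d12.h>
#include <cstdio>

void PrintDX12UltimateSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};  // ray tracing
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};  // variable rate shading
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};  // mesh shaders, sampler feedback

    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    printf("Raytracing tier:       %d\n", (int)o5.RaytracingTier);
    printf("VRS tier:              %d\n", (int)o6.VariableShadingRateTier);
    printf("Mesh shader tier:      %d\n", (int)o7.MeshShaderTier);
    printf("Sampler feedback tier: %d\n", (int)o7.SamplerFeedbackTier);
}
```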
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Like I mentioned earlier, please don’t do this. Infinity cache is for pc gpus that have shared system memory and gpu memory, why would a console with super fast gddr memory for its entire memory pool have infinity cache?
That's a good point. Do we know how the xsx compares against the 6700xt? I believe it's around 13 tflops with 96mb of infinity cache.

Should tell us if the infinity cache makes a big difference.
 

rnlval

Member
Infinity cache is part of RDNA2 architecture and Series X|S doesn't have it...
Infinity Cache is a workaround for PC RDNA 2 not exceeding a 256-bit GDDR6 bus. Infinity Cache doesn't change hardware feature support for DirectX12U.

NVIDIA's Ampere RTX GPUs do not have Infinity Cache and they still support DirectX12U, just like PC RDNA 2.
 

DenchDeckard

Moderated wildly
That's a good point. Do we know how the xsx compares against the 6700xt? I believe it's around 13 tflops with 96mb of infinity cache.

Should tell us if the infinity cache makes a big difference.

I don’t have numbers to back it up, but I would imagine a 6700xt at 1850mhz would be roughly in the same ballpark as the series x. Probably a bit better performance for the dedicated gpu due to the extra power draw available and the fact that 6700 xt cards are massive with large heat sinks haha.
 

SlimySnake

Flashless at the Golden Globes
I don’t have numbers to back it up, but I would imagine a 6700xt at 1850mhz would be roughly in the same ballpark as the series x. Probably a bit better performance for the dedicated gpu due to the extra power draw available and the fact that 6700 xt cards are massive with large heat sinks haha.
Found some benchmarks. Ray tracing only.
[ray tracing benchmark charts]


Ultra nightmare settings though so not a 1:1 comparison.
 
According to VG Tech it is dynamic 4K on both in balanced mode, and dynamic 1800p in RT mode. Both drop resolution; the SX renders 23% and 14% more pixels in these modes, but the figures are dynamic and depend on the scene. In 120fps mode the worst case is 1992x1120 for PS5 and 2266x1275 for SX, which is 29% more pixels.

If you look at the performance of the RX 5700, a 36 CU card boosting to 1.7 GHz (roughly an 8 TF console equivalent), these consoles are disappointing by comparison, the SX even more so because its specs are 52 CU at 1.8 GHz, 12 TF, with VRS.

I think there is one important parameter that needs to be taken into account: memory bandwidth. On PS5 you have 448 GB/s of memory bandwidth SHARED across the whole system, compared to the same bandwidth dedicated to the GPU alone on the 5700. Same problem for the SX. That could explain some of the limitations and resolution drops encountered.
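For what it's worth, the quoted worst-case pixel counts check out; a quick sketch of the arithmetic:

```cpp
// Sanity-checking the quoted worst-case 120fps resolutions from VG Tech.
#include <cstdio>

int main()
{
    const double ps5 = 1992.0 * 1120.0;  // ~2.23 million pixels
    const double sx  = 2266.0 * 1275.0;  // ~2.89 million pixels
    printf("SX worst case renders %.1f%% more pixels than PS5\n",
           (sx / ps5 - 1.0) * 100.0);    // prints ~29.5%
}
```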
 
Like I mentioned earlier, please don’t do this. Infinity cache is for pc gpus that have shared system memory and gpu memory, why would a console with super fast gddr memory for its entire memory pool have infinity cache?

It could help the memory bandwidth situation A LOT. It's not used because, from a die size point of view, it would have been a nightmare.
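To illustrate why a big on-die cache moves the needle on bandwidth at all: effective bandwidth is roughly a hit-rate weighted blend of cache and DRAM bandwidth. A toy model; the hit rate and the on-die cache bandwidth below are illustrative assumptions, not AMD's measured numbers:

```cpp
// Toy model: effective memory bandwidth with a last-level cache in front of DRAM.
#include <cstdio>

double effective_bw_gbps(double hit_rate, double cache_bw, double dram_bw)
{
    // Requests that hit the cache are served at cache speed, misses at DRAM speed.
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw;
}

int main()
{
    const double dram  = 384.0;   // GB/s, 6700 XT-class 192-bit GDDR6
    const double cache = 1500.0;  // GB/s, assumed on-die cache bandwidth
    printf("0%% hits:  %.0f GB/s\n", effective_bw_gbps(0.0, cache, dram));
    printf("50%% hits: %.0f GB/s\n", effective_bw_gbps(0.5, cache, dram));
}
```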
 
Last edited:

DenchDeckard

Moderated wildly
Found some benchmarks. Ray tracing only.
[ray tracing benchmark charts]


Ultra nightmare settings though so not a 1:1 comparison.

Yep, interesting. So with the DRS and a pretty much locked 120 FPS, with settings scaled back on console, I would imagine they are pretty close?
What do you think?
 

yamaci17

Member
textures really are weird on series x

i wonder if they just selected a texture option that works OK on series s and forgot to readjust the same setting for series x?
 

DenchDeckard

Moderated wildly
textures really are weird on series x

i wonder if they just selected a texture option that works OK on series s and forgot to readjust the same setting for series x?

I do think we have seen stuff like this before. Will be interesting to see if there is a patch to fix AF on consoles and if any other tweaks happen too.
 

HoofHearted

Member
textures really are weird on series x

i wonder if they just selected a texture option that works OK on series s and forgot to readjust the same setting for series x?
This has happened before with DiRT 5... and they ultimately fixed it.

It's either a bug or a configuration issue - looks to be a scenario where lower IQ textures for a different target system (e.g. XSS or X1X) are included with the XSX target build.
 
Last edited:

Armorian

Banned
[texture comparison screenshots: PS5 vs XSX]

Textures:
PS5 = PC
XSX = XB1 X


[zoomed crop of the blocky shading artifact]

If this is VRS, then the "you won't notice unless you zoom 400%" folks are... clearly wrong :messenger_grinning_smiling: But...

textures really are weird on series x

i wonder if they just selected a texture option that works OK on series s and forgot to readjust the same setting for series x?

I do think we have seen stuff like this before. Will be interesting to see if there is a patch to fix AF on consoles and if any other tweaks happen too.

...it looks more like a texture streaming issue to me
 

Neo_game

Member
Do you think the DRS is TOO effective at maintaining 120 fps, and that it's coming at the cost of a lower resolution? After all, they are targeting 1585p and 1800p as the base resolutions for these modes, so there was clearly more headroom available here.

[Doom Eternal PC benchmark chart]


The PS5 should be on par with the 5700xt but I see that there are some drops to 117 fps which means the DRS system has to adjust the resolution down to maintain the 120 fps.

The XSX is on par with the 2080 so it should be easily able to do 1800p at 120 fps but again due to those 1% drops the DRS kicks in and lowers the resolution.

i think they should've just gone with 1440p and an unlocked framerate with a 120 fps cap. this way you get a consistent image with some drops in those 1% instances that you won't even notice, because the average framerate is what matters.

I have been saying this for a while now, but DF's insistence on finding the worst case scenario and using that to judge the entire game's performance is simply not accurate. They should take average framerates across various different levels and report that. Finding a drop here and there does nothing to report the performance of the game other than to feed console wars. And now we are seeing these really aggressive DRS scalers that are downgrading the image quality to 1080p and sub-1080p in Metro Exodus because devs are too afraid to drop a single fucking frame.

I agree about the aggressive DRS scaling, because the 1% percentile is immaculate; even in 120fps mode it only drops to 116fps.

The big downside, as you mentioned, is that the pixel count is halved on all consoles to maintain that performance, which is a pretty significant downgrade in resolution. It is difficult to compare with PCs, which run at fixed resolutions. Nvidia cards seem to do well in this game. I would say the consoles are performing around an RX 5700 to 5700 XT. NXGamer said in an IGN video that only 4-5 graphics programmers were working on the Metro upgrade; it's a small team, and probably not many worked on DE either, because this is a free upgrade. So I guess we can give them the benefit of the doubt.
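As an aside, the "aggressive scaler" behaviour being described here is easy to picture in code. A minimal sketch of a DRS controller; the thresholds and step sizes are invented for illustration, and id's actual heuristic isn't public:

```cpp
// Hypothetical dynamic-resolution controller: drop resolution hard when the
// GPU misses its frame budget, claw it back slowly when there is headroom.
#include <algorithm>

struct DrsState { double scale = 1.0; };  // 1.0 = full target resolution

void UpdateDrs(DrsState& s, double gpu_ms, double budget_ms /* 8.33 for 120fps */)
{
    if (gpu_ms > budget_ms)
        s.scale *= 0.90;                  // big step down to protect the cap
    else if (gpu_ms < 0.90 * budget_ms)
        s.scale *= 1.02;                  // small step back up
    s.scale = std::clamp(s.scale, 0.5, 1.0);
}
// The asymmetry is the point: it holds 120fps almost perfectly (hence the
// "immaculate" 1% lows) but sheds a lot of pixels to do it.
```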
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Yep, interesting. So with the DRS and a pretty much locked 120 FPS, with settings scaled back on console, I would imagine they are pretty close?
What do you think?
If the 5700xt and 6700xt benchmark comparisons are any indication, I'd say yes.

The difference in tflops between the two cards is 36% and the difference in performance is 25% at 1080p, 32% at 1440p and 30% at 4k. More or less on par with the tflops difference.

The difference in tflops between the xsx and 6700xt is exactly 10%, so I suspect we are looking at roughly 8-10% more performance. If the 6700xt can average 120 fps at 1080p with RT enabled at ultra nightmare settings, then the xsx should be able to do 120 fps with RT on as well. Looking at the chart, when the game does drop, it drops big, and I suspect that is the reason both consoles are running DRS and leaving A LOT of performance on the table.

[Doom Eternal RT benchmark chart]


Anyway, going back to our original discussion on the infinity cache: it seems infinity cache is what lets performance scale 1:1 with tflops, something they couldn't do with Vega, which scaled very poorly past 10 tflops. If we can get Alex or NXGamer to do some 1:1 comparisons between the xsx and a 6700xt downclocked to 12 tflops, we could see just what kind of difference the infinity cache makes in RDNA 2.0 cards.

P.S. Another thing a test like that would prove or disprove is Cerny's contention that not all tflops are the same. The 6700xt is 13 tflops with a 40 CU, 2.3 ghz configuration, whereas the XSX is obviously using the wide and slow approach. If the performance between the two is the same, we can assume all RDNA tflops are the same.
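For reference, the tflops figures in this discussion fall straight out of CU count and clock: TF = CUs × 64 lanes × 2 ops (FMA) × GHz ÷ 1000, with peak/boost clocks assumed below.

```cpp
// Peak FP32 throughput for RDNA-style GPUs: CUs x 64 lanes x 2 ops (FMA) x clock.
#include <cstdio>

double tflops(int cus, double ghz) { return cus * 64 * 2 * ghz / 1000.0; }

int main()
{
    printf("XSX     (52 CU @ 1.825 GHz): %.2f TF\n", tflops(52, 1.825)); // ~12.1
    printf("6700 XT (40 CU @ 2.581 GHz): %.2f TF\n", tflops(40, 2.581)); // ~13.2
    printf("PS5     (36 CU @ 2.23  GHz): %.2f TF\n", tflops(36, 2.23));  // ~10.3
}
```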
 
Last edited:

DenchDeckard

Moderated wildly
If the 5700xt and 6700xt benchmark comparisons are any indication, I'd say yes.

The difference in tflops between the two cards is 36% and the difference in performance is 25% at 1080p, 32% at 1440p and 30% at 4k. More or less on par with the tflops difference.

The difference in tflops between the xsx and 6700xt is exactly 10%, so I suspect we are looking at roughly 8-10% more performance. If the 6700xt can average 120 fps at 1080p with RT enabled at ultra nightmare settings, then the xsx should be able to do 120 fps with RT on as well. Looking at the chart, when the game does drop, it drops big, and I suspect that is the reason both consoles are running DRS and leaving A LOT of performance on the table.

[Doom Eternal RT benchmark chart]


Anyway, going back to our original discussion on the infinity cache: it seems infinity cache is what lets performance scale 1:1 with tflops, something they couldn't do with Vega, which scaled very poorly past 10 tflops. If we can get Alex or NXGamer to do some 1:1 comparisons between the xsx and a 6700xt downclocked to 12 tflops, we could see just what kind of difference the infinity cache makes in RDNA 2.0 cards.

P.S. Another thing a test like that would prove or disprove is Cerny's contention that not all tflops are the same. The 6700xt is 13 tflops with a 40 CU, 2.3 ghz configuration, whereas the XSX is obviously using the wide and slow approach. If the performance between the two is the same, we can assume all RDNA tflops are the same.

Great post man, it would be great to get hard data on your suggestion of a downclocked 6700 XT.
 
Worse IQ in small segments at the periphery of the image that the user will never notice during gameplay. Unless I missed something?

Yes.

VRS, when employed with dynamic resolution scaling, can improve the overall IQ. The 10-15% of GPU power saved by VRS can be used to increase the resolution, which shows finer details better.

Coalition uses this in Gears 5 / Hivebusters and this video explains what VRS is and how it is used in Gears 5:
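For anyone curious what "using VRS" actually looks like to a developer, here is a minimal sketch of the simplest path (Tier 1, per-draw) on PC D3D12; this is illustrative, not id's or The Coalition's implementation:

```cpp
// Hedged sketch: per-draw (Tier 1) variable rate shading in D3D12.
// Assumes `cmd` is an ID3D12GraphicsCommandList5* and VRS support was verified.
#include <d3d12.h>

void DrawLowFrequencyPassCoarse(ID3D12GraphicsCommandList5* cmd)
{
    // Shade once per 2x2 pixel quad for everything drawn after this call,
    // e.g. a pass the eye is unlikely to scrutinize (particles, distant fog).
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... draw calls for the coarse pass ...

    // Restore full-rate shading before the passes where detail matters.
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```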

 

JackMcGunns

Member
Yes.

VRS, when employed with dynamic resolution scaling, can improve the overall IQ. The 10-15% of GPU power saved by VRS can be used to increase the resolution, which shows finer details better.

Coalition uses this in Gears 5 / Hivebusters and this video explains what VRS is and how it is used in Gears 5:




Haters gonna hate.

A next-gen feature that's not in the PS5 is naturally going to be downplayed and labeled "BAD" when it's actually really good when implemented correctly, like in Hivebusters, which looks A fucking Mazing! But no one will talk about that because it won't line up with the narrative.
 

Armorian

Banned
Haters gonna hate.

A next-gen feature that's not in the PS5 is naturally going to be downplayed and labeled "BAD" when it's actually really good when implemented correctly, like in Hivebusters, which looks A fucking Mazing! But no one will talk about that because it won't line up with the narrative.

This shit is on PC and any logical person just turns it off. You have no choice on Xbox...
 

Hoddi

Member
This shit is on PC and any logical person just turns it off. You have no choice on Xbox...
VRS was pretty bad the first time I saw it in Wolfenstein TNC. I didn't fully play through the game until much later but the VRS blocking was gone by that point. It's still not the first thing I enable but it's very hard to spot differences in later games like Wolfenstein Youngblood.

That aside, VRS artifacts don't look anything like those screenshots of Doom Eternal. That's something else entirely and is likely just a simple bug.
 

Armorian

Banned

This is how it looks

[zoomed comparison crop]


VRS was pretty bad the first time I saw it in Wolfenstein TNC. I didn't fully play through the game until much later but the VRS blocking was gone by that point. It's still not the first thing I enable but it's very hard to spot differences in later games like Wolfenstein Youngblood.

That aside, VRS artifacts don't look anything like those screenshots of Doom Eternal. That's something else entirely and is likely just a simple bug.

This looks like a texture streaming bug. But VRS = Cancer
 

Armorian

Banned
It can be pretty bad but I think it's largely down to the implementation. I was actually messing around with it the other day and made a few captures. I'm curious if anyone can spot the difference.

A
B

Way more aliasing on pic B - but I don't know if that's a VRS thing or a different AA setting

[zoomed crop showing the aliasing]


What was the performance difference?
 

Hoddi

Member
Way more aliasing on pic B - but I don't know if that's a VRS thing or a different AA setting

[zoomed crop showing the aliasing]


What was the performance difference?

I think those are just TAA artifacts. Truth be told, I don't even remember which is which because I can't tell the difference.

VRS artifacts otherwise don't look like that. Here's a typical example of VRS artifacting where you'll notice blockiness on the floor inside the bus. VRS shouldn't cause the type of aliasing that you mentioned because it's only supposed to work on uniform surfaces.

Edit:

The performance difference was something like 85fps vs 90fps. Not exactly massive but in the range promised. Also, this screenshot is in 'performance' mode and those blocks disappear in 'quality' mode.
 
Last edited:
Like I mentioned earlier, please don’t do this. Infinity cache is for pc gpus that have shared system memory and gpu memory, why would a console with super fast gddr memory for its entire memory pool have infinity cache?
Infinity cache is CACHE; it's on die, and as you say it's much faster (and its latency is much lower). GDDR memory is not even close in terms of performance. Cache is one of the most accessible ways to boost the performance of many computer components. It's also expensive because it's on die, so the consoles most certainly have a cut-down quantity of it, but if they had more they would be that much faster; same for the CPU.

PC GPUs have their own pool of very fast memory; it's not like any serious GPU accesses main RAM when it needs a texture or something.

Anyway, your post is probably one of the worst tech takes I have ever seen.
 

DenchDeckard

Moderated wildly
Infinity cache is CACHE; it's on die, and as you say it's much faster (and its latency is much lower). GDDR memory is not even close in terms of performance. Cache is one of the most accessible ways to boost the performance of many computer components. It's also expensive because it's on die, so the consoles most certainly have a cut-down quantity of it, but if they had more they would be that much faster; same for the CPU.

PC GPUs have their own pool of very fast memory; it's not like any serious GPU accesses main RAM when it needs a texture or something.

Anyway, your post is probably one of the worst tech takes I have ever seen.

yes, it’s basically the eDRAM from the 360 and Xbox One on a PC GPU, right? I’m not surprised that Microsoft or Sony didn’t touch that again. MS doesn’t need it as it isn’t limited to a 256-bit bus like the PC RDNA 2 GPUs.

neither of these consoles should waste valuable die space on a cache with how consoles are set up. It doesn’t make any sense.
 
Last edited:

yamaci17

Member
yes, it’s basically the eDRAM from the 360 and Xbox One on a PC GPU, right? I’m not surprised that Microsoft or Sony didn’t touch that again. MS doesn’t need it as it isn’t limited to a 256-bit bus like the PC RDNA 2 GPUs.

neither of these consoles should waste valuable die space on a cache with how consoles are set up. It doesn’t make any sense.
no, not a similar situation.

xbox one used ddr3 ram as a baseline... a horrific choice. esram was used to bolster ddr3 ram's bandwidth, and rightfully so; it seemed to work. xbox one managed to stay "competitive" in terms of performance. in the end, most of the difference came from 12 CU vs 18 CU (xbox one mostly targeted 900p whereas ps4 mostly targeted 1080p throughout the generation; i believe this difference comes from the compute unit gap)

this proves that the fast esram managed to cover for the slow ddr3 ram, so the differences between xbox one and ps4 mostly came down to computational power (in other words, both memory bandwidth configs managed to "feed" the gpu)

this also shows us that infinity cache probably allows rdna2 desktop gpus to be faster than their console counterparts. i don't know by how much, but definitely faster. we can see the 6700xt taking the ball and running away from the 5700xt in some cases, probably in bandwidth-starved situations. in some cases the difference between the two can be 20%, in others up to 60-70%. there are huge irregularities between them that can be partly explained by the infinity cache

in the case of desktop rdna 2, the vram is still gddr6, but bolstered by the cache

i do agree that they had to cut corners, and i have no quarrel with that; the base gddr6 bandwidth should be enough for the console. amd needed infinity cache to compete with nvidia in the desktop space
 
Last edited:

M1chl

Currently Gif and Meme Champion
Seems to me that VRS is applied to the whole picture (not taking into account the depth of the scene and how far things are from the player), which is not how this technique should be implemented. As far as I know, nobody saw any softness in Gears 5, which uses VRS Tier 2; this looks like Tier 1 without the Z-buffer info.
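If that read is right, the difference is visible in the API: Tier 2 lets you bind a screen-space shading-rate image so scene information decides the rate per tile, instead of one rate for a whole draw. A minimal sketch; `rateImage` is a hypothetical R8_UINT mask rebuilt each frame from depth/velocity data:

```cpp
// Hedged sketch: Tier 2 VRS with a screen-space shading-rate image in D3D12.
#include <d3d12.h>

void BindShadingRateImage(ID3D12GraphicsCommandList5* cmd, ID3D12Resource* rateImage)
{
    // Combiners decide how per-draw, per-primitive, and image rates merge;
    // here the image wins whenever it requests coarser shading.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // draw rate vs per-primitive
        D3D12_SHADING_RATE_COMBINER_MAX,          // previous result vs image
    };
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmd->RSSetShadingRateImage(rateImage);        // requires Tier 2 support
}
```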
 
yes, it’s basically the eDRAM from the 360 and Xbox One on a PC GPU, right? I’m not surprised that Microsoft or Sony didn’t touch that again. MS doesn’t need it as it isn’t limited to a 256-bit bus like the PC RDNA 2 GPUs.

neither of these consoles should waste valuable die space on a cache with how consoles are set up. It doesn’t make any sense.

I'm not sure if it's similar but the PS5 has SRAM in the I/O complex which is taking up space on the die.

[PS5 die shot]


Not sure how big it is on the die compared to the eDRAM though.
 
Machine learning too.

VRS, Mesh Shaders, 2x (IIRC) more efficient ROPs, SFS and ML. All nice complementary efficiency boosters of RDNA2 that Phil held out for. The raw numbers and the soft new features :messenger_smiling_with_eyes:

MS is going big on DirectML
You don't really believe all that, do you?
 

Riky

$MSFT
AFAIK we don't know. Have Mesh Shaders and Sampler Feedback along with dedicated hardware support for them been confirmed on PS5? I genuinely don't know.

What we do know is that it is confirmed to not have hardware support for VRS, which means it is not "full" RDNA2. Whether that's an actual issue is debatable, but it is true, and you should probably stop arguing.

It was all in the statement issued at the RDNA2 reveal. People tried to damage control by saying they were only talking about DX12U features, despite the statement specifically saying hardware support. Now we see the first of the features confirmed; the other two will follow shortly.
 

Shmunter

Member
It was all in the statement issued at the RDNA2 reveal. People tried to damage control by saying they were only talking about DX12U features, despite the statement specifically saying hardware support. Now we see the first of the features confirmed; the other two will follow shortly.
Let’s hope these other 2 features work in its favour and don’t sabotage it like the 1st one. 😂
 
Seems to me that VRS is applied to the whole picture (not taking into account the depth of the scene and how far things are from the player), which is not how this technique should be implemented. As far as I know, nobody saw any softness in Gears 5, which uses VRS Tier 2; this looks like Tier 1 without the Z-buffer info.
Because nobody could compare with the exact same game without VRS. We only know VRS's true effect on the whole game because of the VRS-less PS5 version.

And we never got a proper comparison from MS about their tech. Only one side of the story with a few selected comparison pics. Never a whole game.
 

M1chl

Currently Gif and Meme Champion
Because nobody could compare with the exact same game without VRS. We only know VRS's true effect on the whole game because of the VRS-less PS5 version.

And we never got a proper comparison from MS about their tech. Only one side of the story with a few selected comparison pics. Never a whole game.
I believe that's not the case. Can't you turn off VRS in the PC version? Should be possible... I might look into it.
 

Hoddi

Member
Because nobody could compare with the exact same game without VRS. We only know VRS's true effect on the whole game because of the VRS-less PS5 version.

And we never got a proper comparison from MS about their tech. Only one side of the story with a few selected comparison pics. Never a whole game.


The PC version of Doom doesn't support VRS. It's easy to compare in the Wolfenstein games though and I posted some screenshots above.

Here's a couple more for anyone curious.

A
B

And here's also a third that shows the calculated differences between VRS on vs off. Just note that TAA also has an effect on some of the edges.

C
 