
Digital Foundry about XSX teraflops advantage: It's kinda all blowing up in the face of Xbox Series X

Dick Jones

Gold Member
That very much is how it works.

Besides, I'm not the one calling folks liars, so I have nothing to work out. If this is the verdict, I guess he's acquitted due to a lack of evidence. :)

In the real world, Microsoft refuses to give solid, up-to-date numbers. So all anyone really has to go on is "leaked documents" and random updates from everyone but Microsoft. That's not on him, you, or me.
I've always preferred that we base the numbers on the last publicly disclosed figures; otherwise we hand those who give grey answers a reward for their behaviour. I'd love to see that implemented.
 

Killjoy-NL

Member
What? The leaked e-mails showed that Phil was confident that Xbox was the better piece of hardware compared to PS5:

Unredacted reaction to PS5's reveal: "We have a better product than Sony does" - Phil Spencer, 30% GPU advantage

The odd thing is how excited they were about the XSX's power advantage, and the earlier narrative about not being beaten on power like the previous gen; it shows they thought the XSX would be a big-selling console against the PS5, with the XSS adding more entry-level users. The Series X's share is woeful compared to the PS5, its main tech and price rival, and it should be Microsoft's lead console. It's very much on the back burner now after all that, like they've unwittingly boxed in their own console.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
DF flaming the console war


Pretty nuts to bring up the 18% die size advantage while completely ignoring the 22% clock advantage the PS5 GPU has.

XSX is probably clocked way too low. A 40 CU GPU clocked at 2.2 GHz, with tflops close to the XSX's, would've outperformed the PS5 every single time. 1.8 GHz is way too low for RDNA2 GPUs, which can hit up to 2.6 GHz.

Not a good look for DF. MS needs to ask their engineers why they designed an albatross, not punt the blame onto devs.

EDIT: Posting this again to show why the PS5 outperforms the XSX in some scenarios.

[Image: PS5 vs XSX GPU comparison chart]
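For reference, a rough sketch of where these teraflops figures come from, assuming RDNA2's 64 shaders per CU and 2 FP32 ops per shader per clock; the clocks and CU counts are the publicly known console specs, and the 40 CU part is the poster's hypothetical:

```python
# Back-of-the-envelope FP32 compute for an RDNA2-style GPU:
# TFLOPS = CUs * 64 shaders/CU * 2 ops per shader per clock * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"XSX (52 CU @ 1.825 GHz): {tflops(52, 1.825):.2f} TF")  # ~12.15 TF
print(f"PS5 (36 CU @ 2.23 GHz):  {tflops(36, 2.23):.2f} TF")   # ~10.28 TF
print(f"40 CU @ 2.2 GHz:         {tflops(40, 2.2):.2f} TF")    # ~11.26 TF
```

The hypothetical 40 CU part lands within about 1 TF of the XSX while carrying the PS5-style clock benefits to the front end, which is the trade-off being argued here.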
 
Last edited:

DenchDeckard

Moderated wildly
Pretty nuts to bring up the 18% die size advantage while completely ignoring the 22% clock advantage the PS5 GPU has.

XSX is probably clocked way too low. A 40 CU GPU clocked at 2.2 GHz, with tflops close to the XSX's, would've outperformed the PS5 every single time. 1.8 GHz is way too low for RDNA2 GPUs, which can hit up to 2.6 GHz.

Not a good look for DF. MS needs to ask their engineers why they designed an albatross, not punt the blame onto devs.

Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
 

HeisenbergFX4

Gold Member
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
I think the refresh idea has been scrapped, as even they knew they can't relaunch the same console at the same great price when Sony looks to be bringing out a Pro.
 

SlimySnake

Flashless at the Golden Globes
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
More CUs means that if they clock these things higher, it will raise the temps AND the power usage. Both the PS5 and XSX consume around 220 watts in their most intensive games. If MS were to increase the clocks by 22%, they would be increasing the TDP by at least 22%, which is roughly 40-50 watts. And RDNA draws disproportionately more power as clocks go above 2.0 GHz; it's not 1:1 linear.

Their revision is on 6nm, so they are likely saving ~10% in TDP there, but since the whole box is smaller, they likely chose to save on the cost of the vapor-chamber cooling instead of upping clocks like they did for the Xbox One S, which they took from 1.31 tflops to 1.4 tflops.
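To illustrate why the increase isn't 1:1, here is a toy model: dynamic power in CMOS scales roughly with frequency times voltage squared (P ∝ f·V²), and higher clocks usually require higher voltage. The voltage curve below is an illustrative assumption, not measured console data:

```python
# Illustrative only: dynamic power scales roughly as P ~ f * V^2, and voltage
# itself tends to rise with frequency, so power grows super-linearly with clock.
# The sqrt voltage curve is an assumption for illustration, not console data.
def estimated_power(base_watts: float, base_ghz: float, target_ghz: float) -> float:
    f_scale = target_ghz / base_ghz
    v_scale = f_scale ** 0.5              # assumed: voltage creeps up with clock
    return base_watts * f_scale * v_scale ** 2

print(f"{estimated_power(220, 1.825, 1.825):.0f} W at 1.825 GHz")  # 220 W baseline
print(f"{estimated_power(220, 1.825, 2.23):.0f} W at 2.23 GHz")    # ~328 W
```

Under these assumptions, a 22% clock bump costs roughly 50% more power, which is why "just clock it higher" runs straight into the cooling and PSU budget.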
 

Daneel Elijah

Gold Member
Pretty nuts to bring up the 18% die size advantage while completely ignoring the 22% clock advantage the PS5 GPU has.

XSX is probably clocked way too low. A 40 CU GPU clocked at 2.2 GHz, with tflops close to the XSX's, would've outperformed the PS5 every single time. 1.8 GHz is way too low for RDNA2 GPUs, which can hit up to 2.6 GHz.

Not a good look for DF. MS needs to ask their engineers why they designed an albatross, not punt the blame onto devs.

EDIT: Posting this again to show why the PS5 outperforms the XSX in some scenarios.

[Image: PS5 vs XSX GPU comparison chart]
For me it is a mix of marketing over practicality and a consequence of their choices. They wanted a 12 tf machine no matter what; as the One X was 6 tf, they needed double that so as not to be seen as inadequate. More importantly, they chose to use Series X consoles as servers for the cloud, so that did not help either. But it leaves the even lower Series S clock as an enigma.
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
They have proved that they can change a console's insides a lot for a redesign. The One S is proof that you can change the RAM and put in a stronger GPU while still being backward compatible with the old Xbox One. Xbox's software approach and hypervisor allow them a lot of latitude that Sony and their more hardware-based approach can't match. I think that between the Red Ring of Death and their fear of a repeat, plus their intense desire never to be seen as the weaker console again, plus their cloud aspirations, the Xbox executives had given the engineers a really hard task. On top of that, we have rumors that they built the console in a shorter timeframe than Sony did, as Xbox was not sure it would have a next-gen console at some point...
 
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
Adorably All Digital Stupid
 

Bogroll

Likes moldy games
For me it is a mix of marketing over practicality and a consequence of their choices. They wanted a 12 tf machine no matter what; as the One X was 6 tf, they needed double that so as not to be seen as inadequate. More importantly, they chose to use Series X consoles as servers for the cloud, so that did not help either. But it leaves the even lower Series S clock as an enigma.

They have proved that they can change a console's insides a lot for a redesign. The One S is proof that you can change the RAM and put in a stronger GPU while still being backward compatible with the old Xbox One. Xbox's software approach and hypervisor allow them a lot of latitude that Sony and their more hardware-based approach can't match. I think that between the Red Ring of Death and their fear of a repeat, plus their intense desire never to be seen as the weaker console again, plus their cloud aspirations, the Xbox executives had given the engineers a really hard task. On top of that, we have rumors that they built the console in a shorter timeframe than Sony did, as Xbox was not sure it would have a next-gen console at some point...
I think you mean the Xbox One X? I think they just upped the clocks on the One S.
 
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
Overclocking it won't change the inefficiencies of the architecture: the low cache-to-tflops ratio (notably the L1 cache) and the split memory architecture, with the memory problems devs can run into.

They need to design a better APU; basically, they are likely going to copy the PS5 and/or PS5 Pro architecture, like when they released the X1X, which was a super PS4.
 
Last edited:
I don't think it's right to put all the blame on the Xbox engineers; if the brand's offices and management wanted 12 tflops no matter what, I don't know what other options there were. Sony had to take its SoC to 2.23 GHz and use liquid metal for cooling, but it is true that when you see the performance of one with 52 CUs and another with 36 CUs, it is very clear which one did the better job.
 

Godfavor

Member
Average power consumption on multiplatform titles is about 15-20% lower on the Series X than on the 7nm PS5. That's one of the upsides of the slow-and-wide GPU approach, but it raises a question for me, as it suggests the GPU is not being utilized to its fullest. It would be interesting to know how much power the new Forza draws.

Another advantage of the Series X is the 560 GB/s memory bandwidth, which weirdly enough doesn't translate into a notable performance difference, assuming the fast 10 GB of RAM is used only for graphics purposes.

Not utilizing the GPU to its fullest means one or more of three things: a flawed hardware design with some kind of bottleneck; game engines not yet utilizing the new features, since most of them are 3+ years old; or the PS5 being much easier to develop for, so engine bottlenecks get resolved more easily by devs.
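As a quick sanity check on those bandwidth numbers, a sketch using the widely reported bus widths and 14 Gbps GDDR6 on both machines:

```python
# GDDR6 bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# XSX's split pool: all ten chips (320-bit) back the fast 10 GB, while the
# slower 6 GB lives only on the six 2 GB chips (192-bit).
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(f"XSX fast 10 GB: {bandwidth_gbs(320, 14):.0f} GB/s")  # 560 GB/s
print(f"XSX slow 6 GB:  {bandwidth_gbs(192, 14):.0f} GB/s")  # 336 GB/s
print(f"PS5 16 GB:      {bandwidth_gbs(256, 14):.0f} GB/s")  # 448 GB/s
```

So the 560 GB/s advantage only holds for allocations that fit in the fast 10 GB; anything spilling into the slow pool runs below the PS5's uniform 448 GB/s.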
 
Pretty nuts to bring up the 18% die size advantage while completely ignoring the 22% clock advantage the PS5 GPU has.

XSX is probably clocked way too low. A 40 CU GPU clocked at 2.2 GHz, with tflops close to the XSX's, would've outperformed the PS5 every single time. 1.8 GHz is way too low for RDNA2 GPUs, which can hit up to 2.6 GHz.

Not a good look for DF. MS needs to ask their engineers why they designed an albatross, not punt the blame onto devs.

EDIT: Posting this again to show why the PS5 outperforms the XSX in some scenarios.

[Image: PS5 vs XSX GPU comparison chart]
I think you have a good point, the clocks are very conservative, and considering the consoles seem really well designed cooling-wise, I'm sure they could push a firmware update, add 100 MHz, and not break a sweat. My Series S and X are nearly silent even when running the most demanding games.

Also, I'm not sure what Microsoft was thinking letting games like Ghostwire and Redfall release running how they did. Microsoft should be making sure their first-party studios are using all the tricks to get the maximum out of the consoles, using them to show third parties how it's done. It's clear that Bethesda were given too much freedom. I mean, the Redfall release was a disgrace and very embarrassing for Xbox. Can't believe no one checked the state of the game. The theme of the game would also have been perfect for a Halloween release; there are even loads of pumpkins outside people's houses. The game could have been alright with more time.
 
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
I did wonder if they could do a pro Series X quite easily by just clocking the GPU and CPU a bit higher and making all the RAM chips 2 GB, unlocking the full memory bandwidth and simplifying things for developers.
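A quick sketch of what that change would do to the memory map, assuming the known ten-chip, 320-bit layout; the "2 GB everywhere" configuration is the poster's hypothetical:

```python
# XSX today: six 2 GB chips + four 1 GB chips on a 320-bit bus.
# 10 GB interleaves across all ten chips (560 GB/s); the extra 6 GB sits
# only on the six bigger chips (192 bits -> 336 GB/s).
# Hypothetical revision: ten 2 GB chips -> 20 GB, all interleaved at 560 GB/s.
def pool(chips: int, gbps: float = 14.0) -> float:
    return chips * 32 / 8 * gbps  # one 32-bit channel per GDDR6 chip

print(f"Current fast pool: {pool(10):.0f} GB/s over 10 GB")
print(f"Current slow pool: {pool(6):.0f} GB/s over 6 GB")
print(f"All-2GB revision:  {pool(10):.0f} GB/s over a uniform 20 GB")
```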
 

SlimySnake

Flashless at the Golden Globes
I think you have a good point, the clocks are very conservative, and considering the consoles seem really well designed cooling-wise, I'm sure they could push a firmware update, add 100 MHz, and not break a sweat. My Series S and X are nearly silent even when running the most demanding games.

Also, I'm not sure what Microsoft was thinking letting games like Ghostwire and Redfall release running how they did. Microsoft should be making sure their first-party studios are using all the tricks to get the maximum out of the consoles, using them to show third parties how it's done. It's clear that Bethesda were given too much freedom. I mean, the Redfall release was a disgrace and very embarrassing for Xbox. Can't believe no one checked the state of the game. The theme of the game would also have been perfect for a Halloween release; there are even loads of pumpkins outside people's houses. The game could have been alright with more time.
MS knew Redfall was trash in September. But at that point they decided to fix up the bugs and release the game. I think Phil Spencer said this in one of the interviews. That's a game that should've been cancelled, but Phil didn't have the heart to do it.

And I mentioned this above, but cooling the chip isn't the only issue. The power consumption will also go up, and it might not actually be a 1:1 increase. Remember, MS increased the clocks on the base Xbox One, pushing it from 1.2 to 1.31 tflops, because they had the headroom back then. They don't have it right now because they are already pushing 220 watts.
 
Last edited:

sendit

Member
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
[GIF: Anthony Anderson "Death Wish" reaction]
 
MS knew Redfall was trash in September. But at that point they decided to fix up the bugs and release the game. I think Phil Spencer said this in one of the interviews. That's a game that should've been cancelled, but Phil didn't have the heart to do it.

And I mentioned this above, but cooling the chip isn't the only issue. The power consumption will also go up, and it might not actually be a 1:1 increase. Remember, MS increased the clocks on the base Xbox One, pushing it from 1.2 to 1.31 tflops, because they had the headroom back then. They don't have it right now because they are already pushing 220 watts.
Still, they didn't cancel it, and now they have to fix it up and release a DLC. It could have been a much nicer story releasing in a better state during Halloween week, as a game for people who aren't interested in Forza.
 

Dural

Member
I think you have a good point, the clocks are very conservative, and considering the consoles seem really well designed cooling-wise, I'm sure they could push a firmware update, add 100 MHz, and not break a sweat. My Series S and X are nearly silent even when running the most demanding games.

Also, I'm not sure what Microsoft was thinking letting games like Ghostwire and Redfall release running how they did. Microsoft should be making sure their first-party studios are using all the tricks to get the maximum out of the consoles, using them to show third parties how it's done. It's clear that Bethesda were given too much freedom. I mean, the Redfall release was a disgrace and very embarrassing for Xbox. Can't believe no one checked the state of the game. The theme of the game would also have been perfect for a Halloween release; there are even loads of pumpkins outside people's houses. The game could have been alright with more time.

Instead of just adding 100 MHz, they could add boost clocks like PC GPUs already have.

Average power consumption on multiplatform titles is about 15-20% lower on the Series X than on the 7nm PS5. That's one of the upsides of the slow-and-wide GPU approach, but it raises a question for me, as it suggests the GPU is not being utilized to its fullest. It would be interesting to know how much power the new Forza draws.

Another advantage of the Series X is the 560 GB/s memory bandwidth, which weirdly enough doesn't translate into a notable performance difference, assuming the fast 10 GB of RAM is used only for graphics purposes.

Not utilizing the GPU to its fullest means one or more of three things: a flawed hardware design with some kind of bottleneck; game engines not yet utilizing the new features, since most of them are 3+ years old; or the PS5 being much easier to develop for, so engine bottlenecks get resolved more easily by devs.

I did wonder if they could do a pro Series X quite easily by just clocking the GPU and CPU a bit higher and making all the RAM chips 2 GB, unlocking the full memory bandwidth and simplifying things for developers.

Yep, with the lower power consumption and the vapor-chamber cooler, I wouldn't doubt they could make a pro console on a smaller node fairly easily by just bumping up the clocks and fixing the RAM issue.
 

twilo99

Member
Redfall was in development way before Microsoft bought Bethesda, so stop with the false narrative. I have yet to see a PS5 game that has even come close to Flight Simulator on Series X.

If you don't have a developer who can specifically do a flight-simulation game, it's kind of hard to make that happen... it's not that the PS5 is limited in any way.
 
More CUs means that if they clock these things higher, it will raise the temps AND the power usage. Both the PS5 and XSX consume around 220 watts in their most intensive games. If MS were to increase the clocks by 22%, they would be increasing the TDP by at least 22%, which is roughly 40-50 watts. And RDNA draws disproportionately more power as clocks go above 2.0 GHz; it's not 1:1 linear.

Their revision is on 6nm, so they are likely saving ~10% in TDP there, but since the whole box is smaller, they likely chose to save on the cost of the vapor-chamber cooling instead of upping clocks like they did for the Xbox One S, which they took from 1.31 tflops to 1.4 tflops.

I don't think it's right to put all the blame on the Xbox engineers; if the brand's offices and management wanted 12 tflops no matter what, I don't know what other options there were. Sony had to take its SoC to 2.23 GHz and use liquid metal for cooling, but it is true that when you see the performance of one with 52 CUs and another with 36 CUs, it is very clear which one did the better job.

I think a lot of people here need to consider that this chip was not designed to be used only in one device (Series X); it was designed to also go into cloud servers. When you take that point into account, the design of this APU is clearly not bad, as power consumption is a key point in that environment (consumption can be 20% lower than the PS5's under the same game and the same 7nm technology). Doing two separate dies (plus the Series S one) would have been too expensive and would have increased overall complexity (two SKUs is already annoying).
So yes, this APU is less efficient from a cost point of view vs the PS5's when we consider only the console hardware. But you can't strictly say that one team did a better job than the other without taking the overall picture into account.
 
That week when we found out each console's specs was embarrassing. Looking back now, not even Digital Foundry knew what they were talking about.

Seriously, I hope Mark Cerny doesn't go anywhere for years to come.

Man, I have vivid memories of the next-gen speculation thread and the amount of meltdowns and hot takes we were seeing from both sides once the PS5 specs were revealed. The amount of flak Cerny was getting, too. It aged so damn poorly.

Pretty nuts to bring up the 18% die size advantage while completely ignoring the 22% clock advantage the PS5 GPU has.

XSX is probably clocked way too low. A 40 CU GPU clocked at 2.2 GHz, with tflops close to the XSX's, would've outperformed the PS5 every single time. 1.8 GHz is way too low for RDNA2 GPUs, which can hit up to 2.6 GHz.

Not a good look for DF. MS needs to ask their engineers why they designed an albatross, not punt the blame onto devs.

EDIT: Posting this again to show why the PS5 outperforms the XSX in some scenarios.

[Image: PS5 vs XSX GPU comparison chart]

Out of curiosity, how exactly do you calculate the triangle rasterisation rate of the consoles? I want to know how they get these numbers. I always knew the PS5 had an advantage in geometry throughput, but it's interesting to see exact figures.
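For what it's worth, the usual back-of-the-envelope method multiplies the primitive rate of the fixed-function front end by the clock; assuming the commonly cited 4 triangles per clock for RDNA2 (an assumption, since the thread doesn't give the exact figure), peak geometry throughput scales directly with frequency, which is where the PS5's clock advantage shows up:

```python
# Peak triangle throughput ~ primitives per clock * clock speed (GHz),
# giving billions of triangles per second. Assumes the commonly cited
# 4 tris/clock for the RDNA2 front end; real-world rates depend on culling,
# primitive size, and whether mesh/primitive shaders are in play.
def peak_gtris_per_sec(tris_per_clock: int, clock_ghz: float) -> float:
    return tris_per_clock * clock_ghz

print(f"PS5: {peak_gtris_per_sec(4, 2.23):.2f} Gtris/s")   # ~8.92
print(f"XSX: {peak_gtris_per_sec(4, 1.825):.2f} Gtris/s")  # ~7.30
```

Treat these as theoretical peaks; measured throughput varies heavily with workload.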
 

Vergil1992

Member
Honestly, I think the full story isn't being told in this thread. It's not as "common" for the PS5 version to be superior as people say. If we look at VG Tech or DF, the absolute majority of versions have no clear winner (stuttering, higher or lower framerates, higher or lower resolutions, different graphics settings...).

If we look at VG Tech, for example, there are many versions that are superior on XSX:

Call of Duty: Modern Warfare 2 (better performance)
The Quarry (higher resolution)
Dying Light 2 RT (higher resolution)
Fortnite (Lumen) (higher resolution)
Star Wars Jedi: Survivor (higher minimum resolution)
Dead Space (higher minimum resolution and better performance)
Resident Evil 2, 3 and 8 (generally run better)
Alan Wake Remastered (better performance)
Guardians of the Galaxy (better performance)
Doom Eternal (higher resolution)
The Witcher 3 Next Gen (higher resolution and better performance with RT)
Outriders (higher resolution)



There are also many mixed results, such as Cyberpunk (which has a higher resolution on XSX but worse performance), Need for Speed (the same), Immortals of Aveum, Metro Exodus, and many more.


Of course, there are many that run better on PS5 as well. But I'm not so sure it's as common as people say around here. I agree with DF that the advantage is not as consistent as it should be. But I think the problem falls on the side of the API or the developers; not because they are "inexperienced", but because, as Alex Battaglia says, the PS5 is a much better-selling and more successful console, so it is logical that they focus more effort on it. We have seen games get fixed through patches more quickly on PS5, and games that have come out with "strange" errors on XSX (Atomic Heart, or Callisto Protocol, whose RT did not work at first...). I think it's more of a platform-priority issue than a hardware difference. If we take "elanalistadebits" as a reference, there are many games with higher XSX resolutions that only he has tested (Exoprimal, Mortal Kombat 1...). We also have the example of Control with ray tracing and an unlocked framerate, where the XSX performed 15-20% better on average. There are definitely plenty of games that show higher resolutions and/or better performance on XSX, but there are also plenty that show the PS5 taking the lead, especially when it comes to CPU performance.




I also don't deny that the PS5 could have a better design in some areas, especially I/O. But I'm not sure the higher clock frequency makes it "22% better" any more than the wider GPU makes the XSX "18% better". The higher frequency may help, and may even make it outperform the XSX in some areas, but the XSX still has more "horsepower" ready to go, and the clock frequency is already factored into the teraflops.



I think the XSX is a slightly more powerful piece of hardware than the PS5. DF thinks so too. But it comes with more handicaps:

- Platform with fewer sales (less priority)
- Xbox Series S is extra work.
- Developers usually agree that working on PS5 is easier.
- PS5 has some advantages in terms of architecture.



But I think there is a lot of exaggeration about it. The XSX and PS5 are performing essentially (almost) the same; no one would notice the differences if they weren't pointed out to us. It should be noted that the XSX "should" be consistently better, but it isn't. I think the thing with the XSX is that it was expected to show a consistent advantage, and mostly what we are seeing are identical or mixed results (with advantages and disadvantages; we could generalize that it is more common for the PS5 to have the better framerate), and that has been disappointing.
 
Last edited:

BigLee74

Member
Consoles have similar cost, consoles have similar performance.

When all is said and done, however, the PlayStation 5 is still one ugly motherfucker. There is no glossing over that one. And ultimately, that was the deciding factor for me.

(Quietly hoping the PS5 pro is living room friendly again)
 

Vergil1992

Member
Checking my message, I forgot to mention Deathloop (better framerate on XSX) and The Outer Worlds. However, Ghostwire: Tokyo works better on PS5.

By the way, don't be too harsh with my messages; I'm learning English, and currently I'm just "defending myself". I hope I don't sound too strange and that everything comes across correctly. I'm not saying this for any reason in particular; it's just that, checking my message, I realize I make mistakes frequently, and I don't have much time to review the text. I appreciate the understanding. :)
 

Lysandros

Member
Not utilizing the GPU to its fullest means one or more of these three things: a flawed hardware design with some kind of bottleneck, game engines are still not utilizing new stuff as most of them are 3+ years old, PS5 is much easier to develop so game engine bottlenecks are easily resolved by devs.
PS5's GPU isn't utilized "to its fullest" either. The XSX GPU's slower throughput in some base metrics and its less robust cache subsystem are inherent to it; they aren't separate entities from the GPU. It's a natural consequence of having slightly different design goals compared to the PS5, meaning different concessions. Each system's GPU is slightly faster depending on the area; searching for "a" mysterious bottleneck or unforeseen design flaw is faulty logic. Yes, the PS5 is more efficient overall, but this doesn't mean the XSX GPU isn't properly utilized. While we're at it, I am very curious about your source for the statement that the PS5 is "much easier" to develop for. The XSX isn't something like the PS3 with its exotic architecture; both systems are based on the same AMD one. Furthermore, the XSX uses DirectX, which is the API developers are most familiar with. There is more of a learning curve with PS5's GNM.
 
Last edited:

Bogroll

Likes moldy games
more CUs means if they clock these things higher, it would raise the temps AND the power usage. Both the PS5 and XSX consume around 220 watts in their most intensive games. If MS were to increase the clocks by 22%, they would be increasing the tdp by 22% which is roughly 40-50 watts. And these RDNA clocks use more power as they go up above 2.0 ghz. its not 1:1 linear.

Their revision is on 6nm so they are likely saving 10% in tdp there but since the whole box is smaller, they likely chose to save on cost of vapor chamber cooling instead of upping clocks like they did for the xbox one s which they took from 1.31 tflops to 1.4 tflops.
Not sure the SX uses as much power as the PS5, certainly not the launch-model PS5. Check out the videos on this channel; not sure how accurate they are, but they're recent-ish games with power usage.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Not sure SX uses as much power as PS5, Certainly not launch model PS5. Check out the videos on this channel. Not sure how accurate they are but they're recent ish games with power usage.

Wow, that's a massive difference.

I do remember either DF or someone else doing a test at the start of the gen, and they were both around 220 watts. Can't remember the game.

But in Gears 5, I remember it was going up to 210 watts.

I guess developers or engines for games like Lies of P are not fully utilizing all the CUs? 160 watts is way too low.
 

Bogroll

Likes moldy games
Wow, that's a massive difference.

I do remember either DF or someone else doing a test at the start of the gen, and they were both around 220 watts. Can't remember the game.

But in Gears 5, I remember it was going up to 210 watts.

I guess developers or engines for games like Lies of P are not fully utilizing all the CUs? 160 watts is way too low.
Quite a few games on there are around that sort of wattage, or show around a 20-30 watt difference from the PS5.
 

Vergil1992

Member
PS5's GPU isn't utilized "to its fullest" either. The XSX GPU's slower throughput in some base metrics and its less robust cache subsystem are inherent to it; they aren't separate entities from the GPU. It's a natural consequence of having slightly different design goals compared to the PS5, meaning different concessions. Each system's GPU is slightly faster depending on the area; searching for "a" mysterious bottleneck or unforeseen design flaw is faulty logic. Yes, the PS5 is more efficient overall, but this doesn't mean the XSX GPU isn't properly utilized. While we're at it, I am very curious about your source for the statement that the PS5 is "much easier" to develop for. The XSX isn't something like the PS3 with its exotic architecture; both systems are based on the same AMD one. Furthermore, the XSX uses DirectX, which is the API developers are most familiar with. There is more of a learning curve with PS5's GNM.
I agree in part. I don't think there is any bottleneck in the Series X GPU, or anything that makes it "special/exotic". But games with dynamic resolution definitely have a tendency to favor the XSX versions (not always, but in most cases). The strange thing is that DRS normally kicks in when the framerate drops below target due to GPU load, yet we see cases where the dynamic resolution is higher on XSX while performance is nevertheless lower. Even in games where the resolution is the same, we see cases like RE4R where the target IQ on XSX was better than on PS5. If you have "worse" hardware, it makes no sense to aim for better IQ.

I think the graphics engines and their DRS do "detect" that the XSX's GPU is slightly more powerful, but something is holding it back. It could be a CPU and API performance issue.

The best example for me is The Witcher 3. When the next-gen version was released with DX12, the XSX had framerate drops in Novigrad (up to 10fps less than the PS5) while its resolution was significantly higher; in a pixel count, DF said it was 1400p vs 1800p. Coinciding with this, CPU performance on PC in Novigrad with DX12 was a disaster: the Ryzen 3600 couldn't sustain 60fps and had copious drops into the mid-50fps range (non-RT). Even some more powerful CPUs like the Ryzen 5700X/5800X had minor issues. Over time, with the arrival of updates, the game achieved a solid 60fps on a Ryzen 3600, performance and stability increased on more powerful CPUs like the 5800X, and at the same time the XSX and PS5 versions reached a technical tie in terms of framerate.

Although there are exceptions (for example, I think some of Cyberpunk's drops are down to the XSX's GPU and resolutions, at least in some examples we've seen), I think in most cases it's an API and underused-CPU issue. It's my theory, at least.
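To make the DRS argument concrete, here is a toy version of the feedback loop being described; the thresholds and step sizes are hypothetical, not taken from any real engine:

```python
# Toy dynamic-resolution-scaling controller: when measured GPU frame time
# nears the budget, drop the render scale; with headroom, creep back up.
TARGET_MS = 16.6  # frame budget for 60 fps

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    if gpu_frame_ms > TARGET_MS * 0.95:    # near/over budget: back off
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.80:  # comfortable headroom: push up
        scale += 0.02
    return max(0.5, min(1.0, scale))

# If the CPU or API is the real bottleneck, GPU frame time stays low, so this
# controller keeps resolution high even while delivered framerate drops --
# exactly the "higher resolution but worse performance" pattern described above.
```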
 

damidu

Member
Maybe it's Microsoft's plan, lmao. They underclocked the thing and are getting battered now for wasting silicon that can clock much higher. So, hypothetically, they drop a revision next year, improve the cooling, clock the thing at 2.2 GHz or more, and get 14-odd teraflops off the same design.

I am obviously a smooth-brain idiot and not sure if this is even feasible, but hey ho, you never know. MS has performance just sat there that they aren't utilising.

Now, obviously they can't just crank up the frequency and break the current Series X, but maybe they can do something with a redesign.
Just in time to go against the PS5 Pro, lol.
They totally fucked up with the power narrative, and it's about to get much worse.
 

Lysandros

Member
I agree in part. I don't think there is any bottleneck in the Series X GPU, or anything that makes it "special/exotic". But games with dynamic resolution definitely have a tendency to favor the XSX versions (not always, but in most cases). The strange thing is that DRS normally kicks in when the framerate drops below target due to GPU load, yet we see cases where the dynamic resolution is higher on XSX while performance is nevertheless lower. Even in games where the resolution is the same, we see cases like RE4R where the target IQ on XSX was better than on PS5. If you have "worse" hardware, it makes no sense to aim for better IQ.

I think the graphics engines and their DRS do "detect" that the XSX's GPU is slightly more powerful, but something is holding it back. It could be a CPU and API performance issue.

The best example for me is The Witcher 3. When the next-gen version was released with DX12, the XSX had framerate drops in Novigrad (up to 10fps less than the PS5) while its resolution was significantly higher; in a pixel count, DF said it was 1400p vs 1800p. Coinciding with this, CPU performance on PC in Novigrad with DX12 was a disaster: the Ryzen 3600 couldn't sustain 60fps and had copious drops into the mid-50fps range (non-RT). Even some more powerful CPUs like the Ryzen 5700X/5800X had minor issues. Over time, with the arrival of updates, the game achieved a solid 60fps on a Ryzen 3600, performance and stability increased on more powerful CPUs like the 5800X, and at the same time the XSX and PS5 versions reached a technical tie in terms of framerate.

Although there are exceptions (for example, I think some of Cyberpunk's drops are down to the XSX's GPU and resolutions, at least in some examples we've seen), I think in most cases it's an API and underused-CPU issue. It's my theory, at least.
Yep, that is definitely your theory, based on your assumption that the XSX GPU is slightly/meaningfully more powerful. That is the very point I disagree on, and I have countless posts laying out the reasons, like the one above; I won't repeat myself on this occasion. Just a reminder, though: in the RE4 remake, the PS5 actually had the higher resolution in performance mode. And generally, you seem to gloss over the cases where the PS5 has the resolution advantage over the XSX (which, contrary to vice versa, generally don't come with a performance penalty), a point which undermines a theory based solely on resolutions quite a bit.
 

ChiefDada

Gold Member
There are also many mixed results, such as Cyberpunk (which has a higher resolution on XSX but worse performance),

This is complicated/debatable territory due to the limitations of VRS.

Consoles have similar cost, consoles have similar performance.

When all is said and done, however, the PlayStation 5 is still one ugly motherfucker. There is no glossing over that one. And ultimately, that was the deciding factor for me.

(Quietly hoping the PS5 pro is living room friendly again)

Are the games so meh-looking on Xbox that you often find yourself staring at the console? We don't have that problem on our side, my guy. The likes of Forbidden West, Rift Apart, and FFXVI keep my eyes on the screen so much, I barely remember what my PS5 looks like. 😎
 