
Digital Foundry on the XSX teraflops advantage: it's kinda all blowing up in the face of Xbox Series X

This is complicated/debatable territory due to the limitations of VRS.



Are the games so meh-looking on Xbox that you often find yourself staring at the console? We don't have that problem on our side, my guy. The likes of Forbidden West, Rift Apart, and FFXVI keep my eyes on the screen so much, I barely remember what my PS5 looks like 😎

I’m genuinely baffled by people who base their console purchasing choice on the way the console looks. It’s one of the least relevant factors in purchasing it.
 

BigLee74

Member
This is complicated/debatable territory due to the limitations of VRS.



Are the games so meh-looking on Xbox that you often find yourself staring at the console? We don't have that problem on our side, my guy. The likes of Forbidden West, Rift Apart, and FFXVI keep my eyes on the screen so much, I barely remember what my PS5 looks like 😎

Probably for the best you barely remember what it looks like! 😀
 

Vergil1992

Member
Yep, that is definitely your theory, based on your assumption that the XSX GPU is slightly/meaningfully more powerful. That is the very point I disagree on, and I have countless posts laying out the reasons, like the one above, so I won't repeat myself on this occasion. Just as a reminder, though: in the RE4 remake, PS5 actually had the higher resolution in performance mode. And generally, you seem to gloss over the cases where PS5 has the resolution advantage over XSX (which, unlike the reverse case, generally don't come with a performance penalty), a point which undermines your resolution-based theory quite a bit.
Well... the story of RE4R wasn't exactly like that. On XSX the IQ was significantly better, probably due to temporal reconstruction, TAA, higher-quality alpha effects, or all of the above. Obviously this wasn't free, and the biggest proof is that when its quality was reduced to match PS5, the framerate increased by 5-10 fps. The resolution was slightly higher in performance mode at launch, but once IQ got worse on XSX they should have had room to increase it; I don't know if they did in an update. It's also fair to point out that all the RE games with PS5/XSX upgrades run better on XSX, especially RE3R (although I don't think there are any resolution differences).


Still, I'd bet that the percentage of games with higher resolution on XSX is greater than on PS5. In fact, I think DF showed as much. That's apart from the fact that, as a general rule, the framerate is more often stable on PS5. Obviously I haven't counted them, but I own both consoles and watch all the comparisons, and I've always had the feeling that XSX has the resolution advantage. DF often doesn't point these out, but VG Tech usually does.
 

SKYF@ll

Member
Honestly, I don't think the full story is being told in this thread. It's not as "common" for PS5 to be the superior version as people say. If we look at VG Tech or DF, the absolute majority of versions have no clear winner (stuttering, higher or lower framerate, higher or lower resolutions, different graphics settings...).

If we look at VG Tech for example, there are many versions that are superior in XSX.

Call of Duty: Modern Warfare 2 (better performance)
The Quarry (higher resolution)
Dying Light 2 RT (higher resolution)
Fortnite (Lumen) (higher resolution)
Star Wars Jedi: Survivor (higher minimum resolution)
Dead Space (higher minimum resolution and better performance)
Resident Evil 2, 3 and 8 (generally run better)
Alan Wake Remastered (better performance)
Guardians of the Galaxy (better performance)
Doom Eternal (higher resolution)
The Witcher 3 next-gen (higher resolution and better performance with RT)
Outriders (higher resolution)



Then there are many mixed results, such as Cyberpunk (higher resolution on XSX but worse performance), Need for Speed (the same), Immortals of Aveum, Metro Exodus and many more.


Of course, there are many that run better on PS5 as well. But I'm not so sure it's as common as people say around here. I agree with DF that the advantage is not as consistent as it should be, but I think the problem lies with the API or with the developers. Not because they are "inexperienced"; we simply have to recognize, as Alex Battaglia says, that PS5 is a much better-selling, more successful console, so it's logical that they focus more effort on it. We have seen games get fixed through patches more quickly on PS5, and games that shipped with "strange" bugs on XSX (Atomic Heart, Callisto Protocol with broken RT...). I think it's more a platform-priority issue than a hardware difference.

If we take ElAnalistaDeBits as a reference, there are many games with higher XSX resolutions that only he has tested (Exoprimal, Mortal Kombat 1...). We also have the example of Control with ray tracing and an unlocked framerate, where XSX performed 15-20% better on average. There are definitely plenty of games that show higher resolutions and/or better performance on XSX, but there are also plenty where the PS5 takes the lead, especially when it comes to CPU performance.




I also don't deny that PS5 could have a better design in some areas, especially I/O. But I'm not sure the higher clock frequency makes it "22% better" while XSX is only "18% better". The higher frequency may help, and may even let it outperform the XSX in some areas, but the XSX still has more raw "horsepower" available, and the clock frequency is already factored into the teraflop figures.
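On that last point, a quick sanity check: a GPU's theoretical FP32 throughput is just CUs × 64 shaders × 2 ops per clock × frequency, so the clock is indeed already baked into both consoles' teraflop numbers. The figures below use the publicly quoted specs (52 CUs at 1825 MHz for XSX, 36 CUs at 2233 MHz for PS5):

```python
def tflops(cus: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: CUs x 64 shaders x 2 ops/clock x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz / 1000  # result in teraflops

xsx = tflops(52, 1.825)  # ~12.15 TF
ps5 = tflops(36, 2.233)  # ~10.29 TF
gap = xsx / ps5          # ~1.18, the ~18% gap usually quoted
```

So the ~18% figure is a ratio of peak shader throughput, nothing more; it says nothing about cache, fill rate, or clock-sensitive front-end work.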



I think the XSX is a slightly more powerful piece of hardware than the PS5. DF thinks so too. But with more handicaps:

- Platform with fewer sales (less priority)
- Xbox Series S is extra work.
- Developers usually agree that working on PS5 is easier.
- PS5 has some advantages in terms of architecture.



But I think there is a lot of exaggeration about it. XSX and PS5 perform essentially (almost) the same; no one would notice the differences without being told. That said, XSX "should" be consistently better, and it isn't. I think the thing with XSX is that it was expected to have a consistent advantage, and what we are mostly seeing are identical or mixed results (with advantages and disadvantages; we could generalize that it is more common for PS5 to have the better framerate), and that has been disappointing.
In addition to resolution and frame rate, PS5 and Xbox Series X have slightly different graphics settings.
Call of Duty: Modern Warfare 2 & Alan Wake Remastered: PS5 has full V-sync.
Dying Light 2 RT: PS5 has full shading on trees.
Doom Eternal: PS5 has higher-quality textures.
Fortnite: PS5 has higher-quality Nanite.
There are many other small differences, such as shadows and draw distance.
PS5 and Xbox Series X have advantages and disadvantages depending on the game, but it is certain that they produce very similar benchmark results.
 
Last edited:

BlackTron

Member
The odd thing is how excited they were about the XSX power advantage, and the earlier narrative about not being beaten on power like the previous gen. It shows they thought XSX would sell strongly against PS5, with XSS adding more entry-level users. The Series X share is woeful compared to PS5, its main tech and price rival, and it should be Microsoft's lead console. After all that, it's very much on the back burner now, as if they've unwittingly boxed off their own console.
It was a bad strategy that lacked the full picture. MS simply does not have the industry experience/wisdom of Sony or Nintendo.
 

hepfom

Member
Isn't it more that Windows is less efficient than BSD or Linux, so it needs more hardware to accomplish the same thing?
 

Vergil1992

Member
In addition to resolution and frame rate, PS5 and Xbox Series X have slightly different graphics settings.
Call of Duty: Modern Warfare 2 & Alan Wake Remastered: PS5 has full V-sync.
Dying Light 2 RT: PS5 has full shading on trees.
Doom Eternal: PS5 has higher-quality textures.
Fortnite: PS5 has higher-quality Nanite.
There are many other small differences, such as shadows and draw distance.
PS5 and Xbox Series X have advantages and disadvantages depending on the game, but it is certain that they produce very similar benchmark results.
No offense at all, but most of the things you've said are not taken from real comparisons. In the case of Doom Eternal, for example, I have only seen one frame where the texture quality is better, and honestly it seemed more like an oversight than something related to performance. In Fortnite with Lumen, for example, DF says it was "visually identical."

PS5 was running at about 55% of 4K and XSX at about 59%. DF also doesn't say that Dying Light 2 looks any better on PS5; what they say is that it runs at 3200x1800 on PS5 and 3456x1944 on XSX, roughly the difference in power between the two GPUs. I think they are very evenly matched consoles and no one would notice real differences, but I also don't think PS5 has the advantage as often as I keep reading. Most cross-platform games have mixed results, and it is relatively common to find higher resolutions on XSX and more stable framerates on PS5. Beyond that, there is little to differentiate them.
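For what it's worth, the Dying Light 2 claim can be checked with trivial arithmetic: the XSX pixel count is about 17% higher, close to (though not exactly) the ~18% gap in the quoted teraflop figures.

```python
ps5_px = 3200 * 1800        # 5,760,000 pixels per frame on PS5
xsx_px = 3456 * 1944        # 6,718,464 pixels per frame on XSX

px_ratio = xsx_px / ps5_px  # ~1.166 -> ~17% more pixels rendered on XSX
tf_ratio = 12.15 / 10.28    # ~1.18  -> the quoted teraflop gap
```

So "roughly the difference in GPU power" holds to within about a point, at least for this one game.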

I would also like to point out that, according to ElAnalistaDeBits, the last 6-7 games released on XSX/PS5 have a higher average resolution on Xbox (Cyberpunk, Mortal Kombat 1, The Crew Motorfest (with better performance), Exoprimal, Jedi Survivor...). I don't know if he is wrong, but his results in two of them coincide with DF or VG Tech.
 
Last edited:

Godfavor

Member
PS5's GPU isn't utilized "to its fullest" either. The XSX GPU's slower throughput in some base metrics and less robust cache subsystem are inherent to it; they aren't separate entities from the GPU. It's a natural consequence of slightly different design goals compared to PS5, meaning different concessions. Each system's GPU is slightly faster depending on the area; searching for "a" mysterious bottleneck or unforeseen design flaw is faulty logic. Yes, PS5 is more efficient overall, but this doesn't mean the XSX GPU isn't properly utilized. While we're at it, I am very curious about your source for the statement that PS5 is "much easier" to develop for. XSX isn't something like PS3 with an exotic architecture; both systems are based on the same AMD one. Furthermore, XSX uses DirectX, which is the API developers are most familiar with. There is more of a learning curve with PS5's GNM.
The PS5 API is an evolution of the PS4 one, and many devs already know how to use it. The learning curve doesn't seem to be a factor in these results, since it's one API targeting one system.

I agree that DX has the highest familiarity among devs, but most of the DX12 Ultimate features aren't utilized either (mesh shading, the new mip-map loading, etc.), as developers have to unify all their work behind a single API across Xbox One, Series S, Series X and PC due to time constraints. Apart from such features going unused because the game engines don't support them yet, a lot is also lost in translation by using a single API for so many devices. As a result, many games show serious stuttering on the Xbox/PC side due to lack of optimization, and on occasion later updates fix the frame-rate or image-quality problems on those versions. Devs don't bother optimizing that many platforms to the same level as the PS5 (where sales are usually better as well).

P.S. Digital Foundry said that devs find PS5 easier to work with.
 

DeepEnigma

Gold Member
The way XSS sips power so efficiently is beautiful to watch.
("doubt it" reaction GIF)
 

Crayon

Member
Your PC can pull 80 watts with the XSS settings?

I don't know. Maybe. I suppose I could try. The card, definitely. That's a desktop, though. If you want to get really serious about it, look at something with components integrated onto the board, like a laptop.
 

SKYF@ll

Member
No offense at all, but most of the things you've said are not taken from real comparisons. In the case of Doom Eternal, for example, I have only seen one frame where the texture quality is better, and honestly it seemed more like an oversight than something related to performance. In Fortnite with Lumen, for example, DF says it was "visually identical."

PS5 was running at about 55% of 4K and XSX at about 59%. DF also doesn't say that Dying Light 2 looks any better on PS5; what they say is that it runs at 3200x1800 on PS5 and 3456x1944 on XSX, roughly the difference in power between the two GPUs. I think they are very evenly matched consoles and no one would notice real differences, but I also don't think PS5 has the advantage as often as I keep reading. Most cross-platform games have mixed results, and it is relatively common to find higher resolutions on XSX and more stable framerates on PS5. Beyond that, there is little to differentiate them.

I would also like to point out that, according to ElAnalistaDeBits, the last 6-7 games released on XSX/PS5 have a higher average resolution on Xbox (Cyberpunk, Mortal Kombat 1, The Crew Motorfest (with better performance), Exoprimal, Jedi Survivor...). I don't know if he is wrong, but his results in two of them coincide with DF or VG Tech.
Dying Light 2: DF mentioned the shading difference in the video (shown in the image).
The Xbox Series X versions of Alan Wake and COD use adaptive V-sync (tearing counts are shown in the image).
Please check the images to see the Nanite (LOD?) difference in Fortnite.
The texture differences in Doom Eternal are not limited to specific scenes; they can be seen in any scene.
This doesn't mean one is better than the other; each has its own strengths and weaknesses, and the results are similar.
Eq1hAMI.jpg
NEVPS88.jpg

5EDI2fl.jpg
CtWltMu.jpg
mlsBo0F.jpg
 

midnightAI

Member
Oh, I remember Alex/DF trying to figure out how the PS5 must have been oversold after the spec reveal. I think they are revered more for hosting console-war faceoffs than for technical merit.
They actively feed off the console war; without it, no one would watch them. They are like arms dealers selling to both sides: without the war they cannot profit, so they actively stoke it, and warriors lap it up.

A few FPS of difference makes zero difference; most people will play games on their primary console regardless.
 
Last edited:

TrebleShot

Member
They actively feed off the console war; without it, no one would watch them. They are like arms dealers selling to both sides: without the war they cannot profit, so they actively stoke it, and warriors lap it up.

A few FPS of difference makes zero difference; most people will play games on their primary console regardless.
No, I don't think so, man.
They are just enthusiasts who go down rabbit holes and get stuck looking at 400%-zoomed images.

I look forward to their comparisons but I don’t take them overly seriously, I don’t believe there is a bias.
 

Crayon

Member
They actively feed off the console war; without it, no one would watch them. They are like arms dealers selling to both sides: without the war they cannot profit, so they actively stoke it, and warriors lap it up.

A few FPS of difference makes zero difference; most people will play games on their primary console regardless.

Straight up. The retro ones are fun tho.
 

DenchDeckard

Moderated wildly
Wow, that's a massive difference.

I do remember either DF or someone else doing a test at the start of the gen where both were around 220 watts. Can't remember the game.

But in Gears 5, I remember it going up to 210 watts.

I guess developers or engines for games like Lies of P are not fully utilizing all the CUs? 160 watts is way too low.

Not feeding the APU enough power. Wowza
 
What is sure is that overall, this is the first gen where the competitors offer such similar, on-par performance. What will finally give one platform the advantage over the other is, more than ever, the software (plus, for some gamers, the ecosystem, such as Game Pass on Xbox), and on that level, PlayStation is ahead. I love my Series X (I'm more green than blue), but I have played more on my PS5 these last two years.
 
Wow, that's a massive difference.

I do remember either DF or someone else doing a test at the start of the gen where both were around 220 watts. Can't remember the game.

But in Gears 5, I remember it going up to 210 watts.

I guess developers or engines for games like Lies of P are not fully utilizing all the CUs? 160 watts is way too low.
The PS5's higher clocks largely explain the difference. Power consumption is not linear in clock speed; it grows faster than linearly, since dynamic power scales roughly with frequency times voltage squared, and voltage has to rise along with frequency.
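To put a rough number on that: with the textbook CMOS approximation P ∝ C·V²·f, and voltage assumed to scale with frequency near a chip's limit, power grows close to cubically with clock. This is a toy model of the general effect, not the consoles' real measured power curves:

```python
def relative_power(clock_ratio: float) -> float:
    """Toy CMOS dynamic-power model: P ~ C * V^2 * f.
    Assuming voltage scales proportionally with frequency,
    power scales roughly with the cube of the clock ratio."""
    return clock_ratio ** 3

# PS5 GPU (2233 MHz) vs XSX GPU (1825 MHz): a ~22% higher clock
# costs roughly 1.8x the power per compute unit under this model.
cost = relative_power(2233 / 1825)
```

In practice voltage/frequency curves are chip-specific and flatten at the low end, so treat the cube as an upper-bound intuition rather than a measurement.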
 
Last edited:

Crayon

Member
If the frame rate varies up and down a lot, you have to cap the framerate closer to the low end than the high end. So the GPU spends more time sitting against the cap when it could have been rendering faster, leaving a lot on the table, so to speak. You might still get torn frames even striking that balance. So it can look like a GPU is being pushed when it really isn't using its full power.
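A toy example of that effect (the frame times below are hypothetical, not measured data): if GPU render times swing between roughly 12 and 16 ms, you cap at 60 fps to cover the worst case, and the GPU then idles on every frame it finishes early:

```python
CAP_MS = 1000 / 60                  # 60 fps cap -> ~16.67 ms per frame budget
frame_times = [12, 14, 16, 15, 13]  # hypothetical GPU render times in ms

# Fraction of each capped frame interval the GPU actually spends rendering;
# the remainder is idle time spent waiting on the frame-rate cap.
utilization = sum(frame_times) / (len(frame_times) * CAP_MS)  # 0.84
```

So even though the worst frame nearly fills the budget, the GPU averages only ~84% busy, which matches the post's point: power draw can undersell how hard the chip is being pushed at its peaks.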
 

Vergil1992

Member
Dying Light 2: DF mentioned the difference in shading in the video. (Shown in image)
Xbox Series X ver. of Alan and COD, they use adaptive V-Sync. (Tearing numbers are shown in the image)
Please check the images to see the difference between Nanite (Lod?) in Fortnite.
Texture differences in Doom Eternal are not limited to specific scenes, but can be seen in any scene.
This doesn't mean one is better than the other; each has its own strengths and weaknesses, and the results are similar.
Eq1hAMI.jpg
NEVPS88.jpg

5EDI2fl.jpg
CtWltMu.jpg
mlsBo0F.jpg
- Different shading does not equal "better shading on PS5". Many games show differences even when the Xbox version is at the equivalent of Ultra.

- Adaptive v-sync vs. "standard" v-sync: adaptive v-sync doesn't make a game run more stably. We are no longer in the days of double-buffered v-sync. The difference is that, while both drop frames here and there, one will show tearing and the other won't (and the one without tearing gives you more microstutter when frames drop, plus slightly more input lag). Generally, adaptive v-sync is used when a game has reasonably stable performance: you keep lower input lag and less microstutter during frame drops, in exchange for more tearing.
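The trade-off described above can be sketched as a tiny decision rule. This is a deliberate simplification (real presentation queues are more involved), and the `render_ms`/`vblank_ms` values are illustrative, not taken from any real game:

```python
import math

def present(render_ms: float, vblank_ms: float = 16.7, adaptive: bool = True):
    """Return (presentation time in ms, whether the frame tears).
    Frames that finish within one refresh behave identically in both modes."""
    if render_ms <= vblank_ms:
        return vblank_ms, False   # made the refresh: no tear in either mode
    if adaptive:
        return render_ms, True    # late frame: show it now, accept a tear
    # standard v-sync: hold the late frame until the next refresh boundary,
    # trading the tear for added latency and a visible microstutter
    return math.ceil(render_ms / vblank_ms) * vblank_ms, False
```

A 20 ms frame presents at 20 ms with a tear under adaptive v-sync, but is held to 33.4 ms under standard v-sync; that held frame is exactly the microstutter the post describes.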

In this particular case it's not just the comparisons: I myself played Alan Wake Remastered at launch on Xbox Series X, and I recently played it on PS5 to "prepare" for AW2. I am completely sure that PS5 drops frames more frequently than XSX; I would even say the analyses don't reflect reality at all. There were many sections in the forests where, on PS5, turning the camera caused frame-rate drops I never observed on XSX.

- Regarding the Fortnite images: it reminds me of the PS4/Xbox One era, when PS4 won (which was most of the time) and some users tried to "deny" it with cherry-picked images, claiming the Xbox One was actually "more powerful". Neither DF nor Epic mentioned different settings; in the interview/analysis, the only difference Epic highlighted was resolution, and it would not make sense for the settings to be better on PS5, since XSX renders at approximately 15% higher resolution. Surely these are differences with no performance impact, perhaps due to different tools, the time of day, or the game having a system that varies its assets...
 

Vergil1992

Member
- Different shading does not equal "better shading on PS5". Many games show differences even when the Xbox version is at the equivalent of Ultra.

- Adaptive v-sync vs. "standard" v-sync: adaptive v-sync doesn't make a game run more stably. We are no longer in the days of double-buffered v-sync. The difference is that, while both drop frames here and there, one will show tearing and the other won't (and the one without tearing gives you more microstutter when frames drop, plus slightly more input lag). Generally, adaptive v-sync is used when a game has reasonably stable performance: you keep lower input lag and less microstutter during frame drops, in exchange for more tearing.

- In this particular case it's not just the comparisons: I myself played Alan Wake Remastered at launch on Xbox Series X, and I recently played it on PS5 to "prepare" for AW2. I am completely sure that PS5 drops frames more frequently than XSX; I would even say the analyses don't reflect reality at all. There were many sections in the forests where, on PS5, turning the camera caused frame-rate drops I never observed on XSX.

- Regarding the Fortnite images: it reminds me of the PS4/Xbox One era, when PS4 won (which was most of the time) and some users tried to "deny" it with cherry-picked images, claiming the Xbox One was actually "more powerful". Neither DF nor Epic mentioned different settings; in the interview/analysis, the only difference Epic highlighted was resolution, and it would not make sense for the settings to be better on PS5, since XSX renders at approximately 15% higher resolution. Surely these are differences with no performance impact, perhaps due to different tools, the time of day, or the game having a system that varies its assets...
Sorry to repeat; I'm at work and used a translator this time.


Thanks for understanding.
 

Lysandros

Member
Well... the story of RE4R wasn't exactly like that. On XSX the IQ was significantly better, probably due to temporal reconstruction, TAA, higher-quality alpha effects, or all of the above. Obviously this wasn't free, and the biggest proof is that when its quality was reduced to match PS5, the framerate increased by 5-10 fps. The resolution was slightly higher in performance mode at launch, but once IQ got worse on XSX they should have had room to increase it; I don't know if they did in an update. It's also fair to point out that all the RE games with PS5/XSX upgrades run better on XSX, especially RE3R (although I don't think there are any resolution differences).


Still, I'd bet that the percentage of games with higher resolution on XSX is greater than on PS5. In fact, I think DF showed as much. That's apart from the fact that, as a general rule, the framerate is more often stable on PS5. Obviously I haven't counted them, but I own both consoles and watch all the comparisons, and I've always had the feeling that XSX has the resolution advantage. DF often doesn't point these out, but VG Tech usually does.
Again, offering a slightly higher resolution at the expense of performance (and at times shader resolution/IQ) isn't a valid argument for an 'additional power' narrative at all. If XSX offered higher performance at the same resolution/IQ, or higher resolution at the same performance/fidelity, across games as a general rule, then that would be a valid benchmark. But that simply isn't the case. It's somewhat akin to increasing the resolution of a game on any PC GPU at the expense of performance and declaring the GPU more powerful because of it. Sounds quite silly, doesn't it?

By the way, DF tends to "miss" the cases where PS5 has the resolution advantage (spotted by NXGamer/IGN and VG Tech) quite often, and shouts from the rooftops when XSX has it. So maybe they aren't the best at 'exposing' anything, especially considering their affinity with a particular brand.
 
Last edited:

Mr Moose

Member


This above is a reply to the tweet. Did DF state that the Xbox Series X is equivalent to 25 teraflops? I can't believe that.

"Without hardware acceleration, this work could have been done in the shaders, but would have consumed over 13 TFLOPs alone," says Andrew Goossen. "For the Series X, this work is offloaded onto dedicated hardware and the shader can continue to run in parallel with full performance. In other words, Series X can effectively tap the equivalent of well over 25 TFLOPs of performance while ray tracing."
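Reading Goossen's quote loosely, the arithmetic behind the headline number seems to be simple addition: the ~13 TFLOPs of shader work the RT hardware offloads gets added to the GPU's ~12.15 TFLOPs of actual FP32 throughput. That makes "well over 25 TFLOPs" an effective-equivalence claim about one workload, not a raw hardware spec:

```python
shader_tflops = 12.15     # XSX raw FP32 throughput (quoted spec)
rt_offload_equiv = 13.0   # shader cost the RT hardware absorbs, per the quote

# "Effective" throughput while ray tracing, under the quote's framing:
# the shaders keep their full 12.15 TF while dedicated hardware does
# work that would otherwise have cost over 13 TF of shader time.
effective = shader_tflops + rt_offload_equiv  # ~25.15
```

So the figure only applies while ray tracing, and only by counting offloaded work as if it were shader FLOPs.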
 

Vergil1992

Member
Again, offering a slightly higher resolution at the expense of performance (and at times shader resolution/IQ) isn't a valid argument for an 'additional power' narrative at all. If XSX offered higher performance at the same resolution/IQ, or higher resolution at the same performance/fidelity, across games as a general rule, then that would be a valid benchmark. But that simply isn't the case. It's somewhat akin to increasing the resolution of a game on any PC GPU at the expense of performance and declaring the GPU more powerful because of it. Sounds quite silly, doesn't it?

By the way, DF tends to "miss" the cases where PS5 has the resolution advantage (spotted by NXGamer/IGN and VG Tech) quite often, and shouts from the rooftops when XSX has it. So maybe they aren't the best at 'exposing' anything, especially considering their affinity with a particular brand.
To clarify: I'm not saying that. What I'm trying to say is that if the resolution targets on XSX tend to be higher, there has to be a reason. On PC you decide whether you want higher resolution or performance, but developers usually opt for slightly higher resolutions on XSX. Not always, but it is more common for XSX to target higher resolutions than PS5 than the other way around. And I doubt there isn't a reason behind it when it has a slightly more powerful GPU.


The problem is that PS5 usually has the same or better performance whether there is a resolution difference or not. It's as if lowering the XSX's resolution to match the PS5's still wouldn't equalize performance. In any case, the differences are not always consistent; sometimes the PS5 version has the higher resolution, but that is honestly less common. What is extremely common is XSX having the advantage in resolution and PS5 in framerate.
 

Whitecrow

Banned
The problem isn't forced parity so much as Xbox having only Forza as a first-party showcase (until now).

Third-party games don't push PS5 either, but at least Sony's first-party studios do.
 
Last edited:

SKYF@ll

Member
- Different shading does not equal "better shading on PS5". Many games show differences even when the Xbox version is at the equivalent of Ultra.

- Adaptive v-sync vs. "standard" v-sync: adaptive v-sync doesn't make a game run more stably. We are no longer in the days of double-buffered v-sync. The difference is that, while both drop frames here and there, one will show tearing and the other won't (and the one without tearing gives you more microstutter when frames drop, plus slightly more input lag). Generally, adaptive v-sync is used when a game has reasonably stable performance: you keep lower input lag and less microstutter during frame drops, in exchange for more tearing.

- In this particular case it's not just the comparisons: I myself played Alan Wake Remastered at launch on Xbox Series X, and I recently played it on PS5 to "prepare" for AW2. I am completely sure that PS5 drops frames more frequently than XSX; I would even say the analyses don't reflect reality at all. There were many sections in the forests where, on PS5, turning the camera caused frame-rate drops I never observed on XSX.

- Regarding the Fortnite images: it reminds me of the PS4/Xbox One era, when PS4 won (which was most of the time) and some users tried to "deny" it with cherry-picked images, claiming the Xbox One was actually "more powerful". Neither DF nor Epic mentioned different settings; in the interview/analysis, the only difference Epic highlighted was resolution, and it would not make sense for the settings to be better on PS5, since XSX renders at approximately 15% higher resolution. Surely these are differences with no performance impact, perhaps due to different tools, the time of day, or the game having a system that varies its assets...
Both higher resolution and higher graphics settings have advantages.
Comparing 1600p/Ultra against 1800p/Low is meaningless as a benchmark.
The opinion that higher resolution matters more than anything else is a matter of personal preference.
In Fortnite, I myself have compared Nanite (LOD) in various places, and there is no doubt that the PS5 version is more accurate.
I'll send you another image sample; don't just take DF at face value, check it with your own eyes.
Pay attention to the curve of the octopus's legs and the roundness of its suckers.
sJyBjEk.jpg
eCphbka.jpg
WtUNfDD.jpg
 

Godfavor

Member
Both higher resolution and higher graphics settings have advantages.
Comparing 1600p/Ultra against 1800p/Low is meaningless as a benchmark.
The opinion that higher resolution matters more than anything else is a matter of personal preference.
In Fortnite, I myself have compared Nanite (LOD) in various places, and there is no doubt that the PS5 version is more accurate.
I'll send you another image sample; don't just take DF at face value, check it with your own eyes.
Pay attention to the curve of the octopus's legs and the roundness of its suckers.
sJyBjEk.jpg
eCphbka.jpg
WtUNfDD.jpg
There is definitely something not working correctly in the Xbox version here; it looks like the PS5 has 2-3x the polygon count.
 

Vergil1992

Member
Both higher resolution and higher graphics settings have advantages.
Comparing 1600p/Ultra against 1800p/Low is meaningless as a benchmark.
The opinion that higher resolution matters more than anything else is a matter of personal preference.
In Fortnite, I myself have compared Nanite (LOD) in various places, and there is no doubt that the PS5 version is more accurate.
I'll send you another image sample; don't just take DF at face value, check it with your own eyes.
Pay attention to the curve of the octopus's legs and the roundness of its suckers.
sJyBjEk.jpg
eCphbka.jpg
WtUNfDD.jpg
Assuming these images are real, can anyone really believe that a problem like that is due to performance? We're talking about a low polygon count visible to the naked eye; the tentacles have hard edges everywhere. It's clearly an LOD problem or a bug. I don't like Fortnite, but maybe I'll check it on PS5/XSX out of curiosity.

But I highly doubt this is an intentional downgrade. It reminds me a lot of the PS4/Xbox One era, when Xbox fans seized on any bug, anisotropic filtering issue, texture, or other anomaly to explain away a performance difference.
 