
DF: Control PS5 Vs Xbox Series X Raytracing Benchmark

Care to explain to us all how the XSX can display the same static scene at a higher frame rate than the PS5?

No need. On paper the TF difference is 18%, and the average here shows around 16%. But I appreciate gameplay benchmarks, where the whole system is used, not GPU-only tests of a very specific situation
So, care to explain why PS5 runs Control better than XSX in gameplay?
 
Last edited:

SatansReverence

Hipster Princess
No need. On paper the TF difference is 18%, and the average here shows around 16%. But I appreciate gameplay benchmarks, where the whole system is used, not GPU-only tests of a very specific situation
Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.

So you don't want an actual performance benchmark comparison at all.

Amazing that another comparison where the PS5 loses handily ends in another 1000+ reply thread :messenger_unamused:
 
Last edited:
Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.

So you don't want an actual performance benchmark comparison at all.

Amazing that another comparison where the PS5 loses handily ends in another 1000+ reply thread :messenger_unamused:

PS5 already lost the on-paper GPU comparison a long time ago. LOL.
But you can't play a game in photomode.

Anyway, as I said in this thread a couple of pages back:

From now on, for the sake of Xbox fans and their happiness, and as proof of the XSX's power, I hope that every game will only have photo mode and ray tracing. No joy of gameplay, no destruction, no NPCs, no shooting, no jumping, no changeable time of day... just pure ray tracing, reflections and photo mode, and the gaming community will be happy. I just want the future of gaming to be like that. Imagine games with over 100GB of ray tracing and photo mode. And of course, future benchmarks will be provided by Alex. Long live Dictator
 

SatansReverence

Hipster Princess
PS5 already lost the on-paper GPU comparison a long time ago. LOL.
But you can't play a game in photomode.
And you can rarely determine performance from capped framerates.

Anyway, as I said in this thread a couple of pages back:

From now on, for the sake of Xbox fans and their happiness, and as proof of the XSX's power, I hope that every game will only have photo mode and ray tracing. No joy of gameplay, no destruction, no NPCs, no shooting, no jumping, no changeable time of day... just pure ray tracing, reflections and photo mode, and the gaming community will be happy. I just want the future of gaming to be like that. Imagine games with over 100GB of ray tracing and photo mode. And of course, future benchmarks will be provided by Alex. Long live Dictator
Jesus titty fucking fanboying christ...

Imagine being this upset that the console that was known to be more powerful showed said power.
 

Panajev2001a

GAF's Pleasant Genius
Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.

So you don't want an actual performance benchmark comparison at all.

Amazing that another comparison where the PS5 loses handily ends in another 1000+ reply thread :messenger_unamused:
Ok, so if XSX does not win by as much as expected, or loses, it is poor optimisation or poor use of its potential by the devs; but if the situation is reversed, it is just how things are expected to be: PS5 was used to its full potential and is simply inferior.

Taking both systems' theoretical maximums into account, we will have situations where XSX should pull ahead, especially in synthetic benchmarks, which this photo mode "glitch" effectively is. In other cases you may have scenarios that take advantage of PS5's strengths (higher fillrate, higher geometry processing and rasterisation rates, albeit mesh shaders bring more of the CU delta into that phase), and the delta between the two consoles diminishes, or in very rare cases goes negative.

Both consoles seem to have spent parts of their R&D budget differently: one went for a higher-frequency GPU with some customisations (cache scrubbers, Geometry Engine customisations, etc.) to offset the relatively low number of CUs and lower peak memory bandwidth, and spent more on the I/O solution.

The net result is that first party games will make the tech choices of each console shine, but third parties will target the common baseline without really exploiting the unique features of each for a while... which is what we are seeing.
 

Fredrik

Member
That's the thing under discussion though. 'Superior' hardware should beat inferior hardware on a consistent basis. What we see instead is the 'inferior' hardware outperforming the 'superior' one on a consistent basis.

This may change in the future... or it may not.
In short, it seems like it's better in some scenarios because of a higher GPU clock. I would say there is nothing surprising about that at all, and I laugh every time Richard says he's "fascinated" that PS5 pushes ahead. I've overclocked my PC enough to know that clock speed matters; they should've tried that too. Some games will simply benefit from a faster clock. This battle will continue throughout the whole generation and the results will vary just as much. Don't trust that tools will change anything; Sony will improve their tools as well.

But what I don't understand is why MS clocked their GPU 0.4GHz lower. The cooling solution seems good, they use the same GPU architecture, and new RDNA2 GPUs on PC all run fast, at around a 2GHz base clock. Why so low? And could MS clock it up to 1.9 or 2GHz?
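For what it's worth, clock speed matters because the GPU's fixed-function rates scale with it regardless of CU count; pixel fillrate is the concrete example. A rough sketch, assuming 64 ROPs on both parts (an assumption based on RDNA 2-class desktop GPUs, not a confirmed console spec sheet):

```python
# Fixed-function GPU rates scale with clock, not with CU count.
# The ROP count of 64 for both consoles is an assumption, not a published spec.
def pixel_fill_gpix(rops, clock_ghz):
    """Peak pixel fillrate in Gpixels/s: pixels written per clock times clock."""
    return rops * clock_ghz

ps5_fill = pixel_fill_gpix(64, 2.233)  # ~142.9 Gpix/s at the 2.233 GHz cap
xsx_fill = pixel_fill_gpix(64, 1.825)  # ~116.8 Gpix/s at the fixed 1.825 GHz

# Despite having fewer CUs, the higher clock gives PS5 ~22% more peak
# pixel fillrate under these assumptions.
print(f"{ps5_fill:.1f} vs {xsx_fill:.1f} Gpix/s")
```

Under these assumed numbers, the clock deficit is exactly why "faster narrow GPU vs wider slow GPU" comparisons do not reduce to TFLOPS alone.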
 

Three

Member
Seeing that the GPU isn't the problem (because in photo mode it performs a lot better than on PS5), do you think it is the same CPU, clocked higher on XSX, that brings problems?
I keep thinking it is a problem of immature tools and, above all (the stuttering), a dev bug
These are assumptions about the CPU. In photo mode you literally have a static world: static meshes and geometry. Even as a GPU raytracing benchmark you have static buffers in photo mode, so it isn't benchmarking what the GPU would actually do in game: minimal I/O at the bottom level (mesh/geometry), and no compute shaders doing work to write out deformed geometry for raytracing. You are just looking at the rendering of a static scene, not a game. So while it's easy to assume a CPU bottleneck, immature tools or bugs on the XSX because photo mode performs better, it's a lot more complicated than that. Other things happen when not in photo mode: more reads and writes to buffers, more compute work even on the GPU, more data streaming. The causes of in-game stutters and fps amount to a lot more than "well, if it's not the GPU it must be the CPU".
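As a toy model of the point above (every millisecond figure here is invented purely for illustration, not measured from Control), photo mode measures only one term of the per-frame cost, while gameplay adds terms the photo mode benchmark never touches:

```python
# Toy frame-time model; all millisecond figures are illustrative, not measured.
def frame_time_ms(static_render_ms, photo_mode=True,
                  skinning_ms=1.5, bvh_refit_ms=2.0,
                  streaming_ms=1.0, buffer_io_ms=0.8):
    if photo_mode:
        # Static scene: the GPU just re-renders unchanging buffers.
        return static_render_ms
    # Gameplay adds deformation compute, BVH refits for raytracing,
    # asset streaming and extra buffer reads/writes.
    return (static_render_ms + skinning_ms + bvh_refit_ms
            + streaming_ms + buffer_io_ms)

photo = frame_time_ms(14.0)                    # 14.0 ms
game = frame_time_ms(14.0, photo_mode=False)   # 19.3 ms
print(1000 / photo, 1000 / game)               # roughly 71 fps vs 52 fps
```

The model is deliberately crude, but it shows why a lead in the first term alone says little about which machine wins once the other terms (and the CPU work driving them) come into play.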
 

SatansReverence

Hipster Princess
Even at capped framerates the whole system can struggle in some scenarios.

Yes yes, the XSX is such a gimped system it struggles to display text on a screen.
Well, I'm not upset. Just poking fun at the XSX GPU's supposed superiority



Awful lot of REEEEEEING going on from someone who isn't upset :messenger_unamused:
Fighting over fkn console bs 🤣. Join PC, it's very peaceful over here. 🥰😘😘
with AMD vs Intel vs Nvidia? :messenger_tears_of_joy:
 
Last edited:
I like Kingthrash's energy, but he really shouldn't have used the lower-settings argument in his video. It's usual to have differences like this due to bugs or dynamic lighting (and even if that were the case, those slight differences couldn't explain the performance difference). And in photo mode (or in the cutscenes in Hitman 3) it's normal for XSX to render 15% better, as it has a 15% more powerful GPU.

We know the XSX has worse performance during gameplay (even in Control and Hitman 3), when the CPU enters the equation. This is where he should have focused.
 

Kingthrash

Member
I like Kingthrash's energy, but he really shouldn't have used the lower-settings argument in his video. It's usual to have differences like this due to bugs or dynamic lighting (and even if that were the case, those slight differences couldn't explain the performance difference). And in photo mode (or in the cutscenes in Hitman 3) it's normal for XSX to render 15% better, as it has a 15% more powerful GPU.

We know the XSX has worse performance during gameplay (even in Control and Hitman 3), when the CPU enters the equation. This is where he should have focused.
It was DF who said they were "identical".
It was they who should have considered the differences due to bugs or dynamic lighting... no?
In my video I prove they are not identical, contrary to what they said... The burden of proof is on them
 
Last edited:

anothertech

Member
Holy shit! That's some fucking major-

*Photo mode

:🤦:

Edit: can you imagine if all of PS5's wins were in photo mode in every game? What a laughable benchmark.
 
Last edited:

MonarchJT

Banned
Yeah, in photo mode, where everything is static. But gameplay is another thing. So much for superiority
You still don't get it, eh?
What DF proved is that there is a scenario where XSX's GPU pulls ahead, which is a statement nobody ever disputed.
That is a benchmark of the GPUs... it isn't a specific scenario, man... stop acting deliberately obtuse
 

Panajev2001a

GAF's Pleasant Genius
You still don't get it, eh?
That is a benchmark of the GPUs... it isn't a specific scenario, man... stop acting deliberately obtuse

Look who is talking about being obtuse. These are particular scenes (static), a specific engine, with particular optimisations per platform, etc...

I might be wrong, but it seems you want wins badly (and project that onto PS fans). Actually, more than wins you seem to want obliterations, and while PS fans accepted that the consoles are very close, with XSX pulling ahead GPU-wise in several scenarios, some people are still stuck in "monster console obliterates the competition" mode.
 

phil_t98

#SonyToo
Look who is talking about being obtuse. These are particular scenes (static), a specific engine, with particular optimisations per platform, etc...

I might be wrong, but it seems you want wins badly (and project that onto PS fans). Actually, more than wins you seem to want obliterations, and while PS fans accepted that the consoles are very close, with XSX pulling ahead GPU-wise in several scenarios, some people are still stuck in "monster console obliterates the competition" mode.

I don't think it's about winning as such, but as proven there is much more overhead with the Series X
 

Panajev2001a

GAF's Pleasant Genius
I don't think it's about winning as such, but as proven there is much more overhead with the Series X
If you were becoming afraid there must be something seriously wrong, then yes, I can see how liberating seeing this is; but that is weird coming from XSX fans, as they had no reason to fear their console was crap or had no headroom.

Perhaps Xbox fans were sold a monster, and six months and more of gloating and calling the other system a rushed, last-minute overclocked solution, without seeing the massive power advantage reflected in games, did raise tensions; but you have MS's PR to blame for that.

System design and overall usable performance is a complex can of worms... you could probably produce synthetic demos that showed headroom in PS3's RSX in isolation, and in the CELL BE too (more easily), vs the competition, but what would it prove?
 
Last edited:
It was DF who said they were "identical".
It was they who should have considered the differences due to bugs or dynamic lighting... no?
In my video I prove they are not identical, contrary to what they said... The burden of proof is on them
Thanks for your answer. But my understanding is that they said it because the devs told them so. They probably didn't check for those differences, as they were looking for the worst-performing scenes on PS5, as usual proving to the world that the PlayStation is worse than their Xbox.

You are probably mostly right in your video showing those differences, but still, those differences are mostly very small and cannot explain the whole performance gap.

In my opinion their most obvious journalistic bias is when they use specific scenes (like photo mode) as if those scenes proved the XSX is the superior hardware, while ignoring all the data we have from normal scenes showing the PS5 actually having the edge over the XSX.

Why are they not doing like-for-like comparisons during gameplay in Control (and other big games like Destiny 2 or COD), for instance? It should be easy for them, as they have access to the games and their fps tools. Well, it's because the XSX is losing (or not winning) in those comparisons, and they are actively looking only for the worst cases on PS5 while hiding all the worst cases on XSX.
 

Riky

$MSFT
It was DF who said they were "identical".
It was they who should have considered the differences due to bugs or dynamic lighting... no?
In my video I prove they are not identical, contrary to what they said... The burden of proof is on them

You didn't mention the part where the actual developers told them they are identical; you know, the people who made the game.
 
Last edited:
Leviathan is reliable. Anyway, IIRC last year NXGamer talked about the XSX's CPU and possible bottlenecks. I think it was during the Valhalla analysis.
He hypothesised that something is causing issues with the Series X; I believe developers are simply finding it harder to work with early on than the PS5, nothing more. To insinuate some sort of hardware design flaw is insulting, in my opinion.
 

Fredrik

Member
It was DF who said they were "identical".
It was they who should have considered the differences due to bugs or dynamic lighting... no?
In my video I prove they are not identical, contrary to what they said... The burden of proof is on them
It was the developer who said the settings were identical, but I mean, you could've played the game yourself to notice how light, shadows, smoke, particles, reflective materials, randomizers etc. work dynamically in this game; it would answer most of your questions. Just try starting the game yourself and try to replicate a single one of the DF screens with identical details.

When the developer says it's settings parity, I think we need to research a bit better before we start claiming they're lying.
 

Panajev2001a

GAF's Pleasant Genius
I emphasized the Hz difference between the CPUs to make you understand that, in the test carried out by DF, the only thing missing is the AI and the game logic... so why should an identical but higher-clocked CPU have any bottlenecks? They are exactly the same CPU. And in this test the GPU proved not to be the problem. Let's say it is almost certainly about optimization: tool immaturity or something like that
I do not know, but the rumour is an improved caching system (unified L3 cache) to increase efficiency and put less pressure on the memory system (it would go along with the work they have done to reduce the GPU's pressure on RAM with the cache scrubbers... it seems they wanted a fully unified but not too expensive RAM solution, and this would help achieve it).

While it helps greatly with BC and with allowing general OS updates independently from game OS ones, the virtualised approach does have a non-zero impact... when using multi-threading/SMT the clock speed difference is even lower, and actually negative in the case of XSS, which you need to take into account as your minimum target for the game logic.
As titles stress disk I/O more and more, the impact on the CPU grows: it could be that the I/O processor complex Sony built around the SSD keeps the CPU overhead small in this case.

It could also be that, this being a console, the low-level graphics libraries still have a smaller overhead on PS5 (i.e. lower CPU cost); maybe not as big a gap as DX11 on Xbox One vs GNM on PS4, but while the Xbox stack is optimised and allows more direct HW access, it still tries to keep in line with the DX12U used on desktop:
09:35PM EDT - Q: Are you happy as DX12 as a low hardware API? A: DX12 is very versatile - we have some Xbox specific enhancements that power developers can use. But we try to have consistency between Xbox and PC. Divergence isn't that good. But we work with developers when designing these chips so that their needs are met. Not heard many complains so far (as a silicon person!). We have a SMASH driver model. The games on the binaries implement the hardware layed out data that the GPU eats directly - it's not a HAL layer abstraction. MS also re-writes the driver and smashes it together, we replace that and the firmware in the GPU. It's significantly more efficient than the PC.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Sort of makes sense. But with the previous generation things didn't really change that much as the generation went on. The PS4 was consistently ahead of the X1 by more or less the same amount. I doubt either current-gen system is suddenly going to open a huge delta over the other. Things will most likely remain similar towards the end.
According to a piece by John from DF, the gap actually started to grow relative to the base Xbox One, and then after a while to the One S too, as developers started to push the PS4 more and the PS4 Pro and Xbox One X entered the picture.
 
Last edited:

Mr Moose

Member
I like Kingthrash's energy, but he really shouldn't have used the lower-settings argument in his video. It's usual to have differences like this due to bugs or dynamic lighting (and even if that were the case, those slight differences couldn't explain the performance difference). And in photo mode (or in the cutscenes in Hitman 3) it's normal for XSX to render 15% better, as it has a 15% more powerful GPU.

We know the XSX has worse performance during gameplay (even in Control and Hitman 3), when the CPU enters the equation. This is where he should have focused.
18.17% more powerful (TF).
You didn't mention the part where the actual developers told them they are identical, you know the people who made the game.
He's right: just because the settings are identical, it doesn't mean they will produce visually identical results, due to bugs or whatever. We've seen it in Watch_Dogs: Legion with RT on PS5 (puddles) and AF on Series X.
Digital Foundry either didn't notice it or didn't feel like mentioning it.
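The TF figures being argued over all fall out of one formula: peak FP32 FLOPS = CUs × 64 shader lanes × 2 ops per clock (FMA) × clock. A quick sketch using the commonly quoted figures (PS5: 36 CUs at a 2.233 GHz variable-clock cap; XSX: 52 CUs fixed at 1.825 GHz); the paper delta lands a little above or below 18% depending on how the PS5 clock is rounded:

```python
# Peak FP32 throughput: CUs * 64 lanes * 2 ops per clock (FMA) * clock in GHz.
def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = peak_tflops(36, 2.233)   # ~10.29 TF (variable clock at its cap)
xsx = peak_tflops(52, 1.825)   # ~12.15 TF (fixed clock)
delta = (xsx / ps5 - 1) * 100  # ~18% paper advantage

print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF, delta {delta:.1f}%")
```

Note this is a peak-rate calculation only; it says nothing about how much of that peak either GPU sustains in a real workload, which is the whole argument of the thread.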
 
Last edited:

Riky

$MSFT
I emphasized the Hz difference between the CPUs to make you understand that, in the test carried out by DF, the only thing missing is the AI and the game logic... so why should an identical but higher-clocked CPU have any bottlenecks? They are exactly the same CPU. And in this test the GPU proved not to be the problem. Let's say it is almost certainly about optimization: tool immaturity or something like that

It is about optimization, and that includes timelines around the launch period and working from home; look at Dirt 5, where the devs didn't even know about the 120Hz settings bug on Xbox until DF told them.
It's also about last-gen engines and machines: all these games so far were started for last-gen consoles, and have been upgraded from the Pro and X1X versions to run on the new hardware. When you look at the structure, the number of compute units etc., and the change of developer environment from those last-gen refreshes to next gen, it becomes obvious that one transition will be a lot easier than the other.
 
Last edited:

Riky

$MSFT
18.17% more powerful (TF).

He's right, just because the settings are identical it doesn't mean they will produce identical results visually. due to bugs or whatever. We've seen it in Watch_Dogs: Legion with RT on PS5 (puddles) and AF on Series X.
Digital Foundry either didn't notice it or didn't feel like mentioning it.

Because the 16% advantage isn't about one specific scene, where the small differences would matter; it's over 20 samples, where that would be taken into account, and we see the difference go from nothing to over 30%. The average is 16%.
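For what it's worth, an average like that is just the mean of per-scene fps deltas. A sketch with invented sample values (not DF's actual data) shows why the headline number depends heavily on which scenes are sampled; a couple of 30%-class outliers can pull the mean well away from the typical scene:

```python
# Hypothetical per-scene fps pairs (ps5_fps, xsx_fps); NOT DF's real measurements.
samples = [(45, 45), (40, 47), (38, 50), (44, 46), (42, 49)]

# Per-scene advantage of XSX over PS5, in percent, then the mean.
deltas = [(xsx / ps5 - 1) * 100 for ps5, xsx in samples]
average = sum(deltas) / len(deltas)

print([round(d, 1) for d in deltas], round(average, 1))
# prints: [0.0, 17.5, 31.6, 4.5, 16.7] 14.1
```

This is exactly why Topher's question downthread (what values went into the 20 samples?) matters: the same "average" can describe very different distributions.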
 

phil_t98

#SonyToo
If you were becoming afraid there must be something seriously wrong yes, I can feel how liberating seeing this, but that is weird coming from XSX fans as they had no reason to fear their console was crap or had no headroom.

Perhaps, Xbox fans were sold a monster and 6 months and more of gloating and calling the other system a rushed last minute over-clocked solution without seeing the massive power advantage reflecting on games did raise tensions, but you have got MS’s PR to blame for that.

System design and overall usable performance is a complex can of worms... you could possibly produce synthetic demos that showed headroom in PS3’s RSX in isolation and in the CELL BE too (more easily) vs the competition, but what would it prove?
So where do you get fear from? Where in my post did I say I was fearful?

Again you turn it into console warrioring to try to turn your point around. It was a simple question.

As I pointed out, it isn't about winning, but what the photo mode comparison showed is that there is more overhead on the Xbox than on PS5. Is that right?
 
Last edited:

Clear

CliffyB's Cock Holster
As I pointed out, it isn't about winning, but what the photo mode comparison showed is that there is more overhead on the Xbox than on PS5. Is that right?

Yes, you are correct, but that overhead only appears to manifest in certain scenarios. Specifically, based on this result, and on other games where the SX creeps ahead during real-time cinematics, it appears the gap only shows itself when the CPU is at its most idle.

This could be a fault in the PS5's SmartShift implementation, or it could indicate that the fillrate advantage of the SX GPU gets stymied by bus bandwidth, or some other system bottleneck, when the whole APU is under load. To be honest, it could just be down to graphics API differences, as regardless of the code being run there's a significant layer between it and the actual hardware.
 

MonarchJT

Banned
Look who is talking about being obtuse. These are particular scenes (static), a specific engine, with particular optimisations per platform, etc...

I might be wrong, but it seems you want wins badly (and project that onto PS fans). Actually, more than wins you seem to want obliterations, and while PS fans accepted that the consoles are very close, with XSX pulling ahead GPU-wise in several scenarios, some people are still stuck in "monster console obliterates the competition" mode.
This post will age very, very badly... saved for future crow-eating
 

phil_t98

#SonyToo
Yes, you are correct, but that overhead only appears to manifest itself in certain scenarios. Specifically based on this result, and in other games where the SX creeps ahead during real-time cinematics, it appears that its only when the CPU is at its most idle does the gap show itself.

This could be a fault with the PS5's smart-shift implementation, or it could indicate that the fillrate advantage of the SX GPU gets stymied by bus bandwidth, or some other system bottleneck when the whole APU is under load. To be honest, it could just be down to graphics API differences as regardless of the code being run there's a significant layer between that and the actual hardware.
Yeah I didn’t say it was down to hardware, I just think it’s a poorly optimised game tbh
 

J_Gamer.exe

Member
Jesus Christ, they are making analysis videos of the analysis videos to try to get the win 🙈🙈🙈
Oh, it's "they" again. Yeah, all PlayStation fans across the globe clubbed together to make this video.

Let's ignore the fact that Kingthrash has been calling them out for years.

How about addressing the differences spotted and coming up with your own explanation?

As I pointed out previously, in the section of the video where DF mentioned bottlenecks, the Series X managed a few fps better performance, yet Sony fans presumed it was pointed at the X alone rather than at both consoles.
Damn, you'd have to be a new level of daft not to see why it would refer to the Xbox, given its lead in prior scenes. What sort of logic says "PS5 was behind and is now level, so maybe PS5 is bottlenecked in this scene", when it's the SX whose % lead is now lower than before? LOL. If it was aimed at both, you'd expect PS5 to drop by a similar % as in the other scenes, if the CPUs were equal.

 

German Hops

GAF's Nicest Lunch Thief
I do not know, but rumours are improved caching system (unified L3 cache) to increase efficiency and put less pressure on the memory system (it would go along with the work they have done to put less pressure on the RAM from the GPU with the cache scrubbers... it would seem that they wanted to go with a fully unified but not too expensive RAM solution and this would help achieve it.

While it helps greatly with BC and allowing general OS updates independently from game OS ones, the virtualised approach does have a non zero impact... when using multi threading / SMT the clocks speed difference is even lower and actually negative in the case of XSS which you need to take into account as your minimum target for the game logic.
As titles stress disk I/O more and more the impact on the CPU grows: it could be that in this case the I/O Processor complex Sony built around the SSD keeps the CPU overhead small.

It could also be that, being a console that the low level graphics libraries still have a smaller overhead on PS5 (i.e.: lower CPU cost) maybe not as big as with the DX11 Xbox One vs PS4 GNM, but while optimised and allowing more direct HW access it still tries to keep in line to DX12U used on desktop:
 

Gediminas

Banned
So many excuses from the Xbox side, so little substance.

No, but really, people measuring performance in photo mode? How brain-dead do you have to be to do that vs real gameplay, where you're actually playing?

The Xbox side is hitting the bottom of the barrel more and more every day. I pity you Xbox people. So sad.
 

Topher

Gold Member
Because the 16% advantage isn't about one specific scene, where the small differences would matter; it's over 20 samples, where that would be taken into account, and we see the difference go from nothing to over 30%. The average is 16%.

What are the values you've assigned to those 20 samples to come up with an average?
 
According to a piece by John from DF, the gap actually started to grow relative to the base Xbox One, and then after a while to the One S too, as developers started to push the PS4 more and the PS4 Pro and Xbox One X entered the picture.

I never noticed that from the comparisons to be honest. The gap between the versions seemed to remain the same. I do know that the X1 had some weird hardware quirks that put it quite a bit behind the PS4 when it came out. So far I'm not seeing anything similar with the PS5 and the XSX.
 

FranXico

Member
This is actually kinda interesting.

16% lead on average in a purely GPU related task. If you woulda told us this 6 months ago, there would be nothing but revolt.

I’m not surprised that Richard goes on a tangent about silicon investment return and “early days” and “parity”. He clearly wanted more ahahahaha so pathetic. And people say he’s neutral!

Complete disregard for draw calls doesn’t surprise me either!
Lots of bullshit numbers. With Hitman 3, it was the "44% more pixels" that "scales with compute units". Now it's "16% more frames" that "scales with TF".

Can't wait for 26% something that scales with the RAM bandwidth (and let's pretend it's the same for the entire RAM).

One day they care about one thing and ignore another, then do the opposite the next. Like looking for numbers that match spec ratios. It gives that impression really.
 
Last edited:
As I pointed out, it isn't about winning, but what the photo mode comparison showed is that there is more overhead on the Xbox than on PS5. Is that right?

But certainly not enough for it to be a big difference. I understand why the developers went with the same settings on both. Pretty sure many other games will do the same except for the ones that favor the XSX architecture more. That's what I've learned from this.
 

phil_t98

#SonyToo
16% is a fairly big difference tbh, and when you look at where the SX lands frame-rate-wise as they go round, it's big. If you take the PS5 hitting 45fps and add 16% on top of that, it's big enough
 

phil_t98

#SonyToo
Oh, it's "they" again. Yeah, all PlayStation fans across the globe clubbed together to make this video.

Let's ignore the fact that Kingthrash has been calling them out for years.

How about addressing the differences spotted and coming up with your own explanation?


Damn, you'd have to be a new level of daft not to see why it would refer to the Xbox, given its lead in prior scenes. What sort of logic says "PS5 was behind and is now level, so maybe PS5 is bottlenecked in this scene", when it's the SX whose % lead is now lower than before? LOL. If it was aimed at both, you'd expect PS5 to drop by a similar % as in the other scenes, if the CPUs were equal.

Daft is looking at the video and seeing only what you want to see, which is what you have done
 

Kingthrash

Member
It was the developer who said the settings were identical, but I mean, you could've played the game yourself to notice how light, shadows, smoke, particles, reflective materials, randomizers etc. work dynamically in this game; it would answer most of your questions. Just try starting the game yourself and try to replicate a single one of the DF screens with identical details.

When the developer says it's settings parity, I think we need to research a bit better before we start claiming they're lying.
Sooooo... why isn't it on DF, though? They are the only ones echoing the devs. Lol... I mean, even so, it still doesn't explain the missing textures and lower reflective effects throughout. The dev said identical, just like DF said... both of them lied.
 