
Digital Foundry: Ghostwire: Tokyo - Xbox Tech Review - No Improvement on PS5, Worse on Xbox Series X/S

Mr.Phoenix

Member
I don't really get this either. The times I've seen him in these videos talking about consoles he has seemed very reasonable. Maybe I missed some stuff in the past?
Here is my problem with him, though it's not bad enough where I have ever openly spoken against him.

I am just a gamer and hobbyist tech head, and even I could tell when these consoles were announced that looking at just a TF number doesn't really mean much, and that in any kinda hardware setup, even more so when they are as specialized as consoles, you have to take everything together and not just look at one part of the whole.

He is a professional tech head. This is his job; he is not doing us or anyone a favor. It's his livelihood and how he makes money. He should KNOW better.

But what is worse is that even now, going on 3 years of these consoles being out, no one on that DF team has thought it necessary to do an exposé addressing what these consoles truly are and what their real strengths are. Even now, whenever PS5 comes out on top it is dismissed as the Series X having a 'bad port', as if it's somehow supposed to be better by default, even with three years of evidence suggesting otherwise. Only a tech idiot would arrive at conclusions like that.

Kinda hard to take them seriously.
 
The controls are a stylistic choice. It's not meant to control like an FPS. You're supposed to repeatedly snap aim when needed with the L trigger. It's snappy as hell on the parts they wanted to be snappy, like rapid element switching, dodges or blocks. You're not supposed to have flawless aim, and you can tell because if you ever land a direct hit on the orbs it does way more damage than normal, akin to a critical hit. A few of the charge hits have large amounts of homing on them as well, or are designed to hit wide areas in close range and are extremely easy to aim. Half the time now I don't even have to shoot: I just counter-block them, knock them to the ground and instantly do a finisher. It doesn't control poorly, it's just stylistically different from almost anything out there (aside from like ... Breakdown, 20 years ago).

No way is the input lag "by design". There's nothing snappy about this game. I know lag when I see it.
 
While this is all facts, it's still a first party game and should be given the resources to get it up to scratch.
Seems to me that resources were spent working on the content update that also had to ship on PlayStation. In addition, resources were spent on Hi-Fi Rush. Since resources are not infinite, especially for a year-old game, I don't think they made a mistake with where they spent their time and money.
 
I think this is getting so much attention because it's so rare. I don't even remember a Digital Foundry thread this long, but when I see it's more than a page, I know the Xbox lost the comparison... hahaha.
 

SHA

Member
So the hype train for the Pro and the next Xbox has started. I despise listening to this stuff, especially in this period of time. Long story short, some people just can't have fun with what they have.
 

Punished Miku

Gold Member
No way is the input lag "by design". There's nothing snappy about this game. I know lag when I see it.
I don't want to keep repeating myself so agree to disagree. You can have the last word if you want it. I'm back home and playing it right now. I don't see what you're talking about. The first wind shot does have a slight delay, but it seems intentional as a start-up. Every chained shot, the 2nd and 3rd, hits exactly when I press the button. I just am not seeing it.
 

KXVXII9X

Member
I have no idea what is going on with the Xbox version, since it is running at 60fps, near max settings, full raytracing w/ DLSS Performance on my midrange Asus G14 laptop. Wtf? The Xbox Series X is supposed to be a bit more powerful too, along with the extra juice you can squeeze out of console hardware.
 

Hurahn7

Banned
This was a funny video. Why are there so many graphical settings? Just make 30, 60, and, if it runs correctly, 120 FPS modes. These companies KNOW their games run like ass and still put them out. 🤡😀 It is so ridiculous.
 

calistan

Member
I finally started playing this on Series X yesterday, expecting maybe a little bit of jankiness, but the performance is genuinely terrible. Got as far as the hospital scene, in 60 fps mode, and there's a part where every time you look down a particular corridor it stutters like crazy. This part:

[screenshot of the corridor in question]

(Screenshot is from the PC though - as soon as I got to that bit and the framerate went haywire, I installed it on the PC instead and it plays fine with everything maxed out.)
 

OmegaSupreme

advanced basic bitch
Seems to me that resources were spent working on the content update that also had to ship on PlayStation. In addition resources were spent on HiFi Rush. Since resources are not infinite especially for a year old game I don't think they made a mistake with where they spent their time and money.
This is a 2 trillion-dollar company. Resources are in fact infinite. They operate on cheat codes. This is a poor excuse.
 

DenchDeckard

Moderated wildly
Tbh, I do not think he really cares about the Xbox that much.

Listen to him when a PC port sucks or has shader compilation stutter: then you hear him truly heartbroken!

I think it's just such a shame, and it could be deflating for your job, to benchmark these games when developers cannot release them in a competent state on any platform.

Let's not pretend the PS5 version was perfect and the Xbox version is bad; it's just that the Xbox version is worse.
 

aries_71

Junior Member
Thing is, I've been playing the game on a PC with an i7 + RTX 2070 + DLSS Quality. The thing is capable of running the game at 60fps at 1440p with no problems, but... but there's always some stuttering here and there, due to traversal or shader compilation. It's really a sad state of affairs.
 

01011001

Banned
Thing is, I've been playing the game on a PC with an i7 + RTX 2070 + DLSS Quality. The thing is capable of running the game at 60fps at 1440p with no problems, but... but there's always some stuttering here and there, due to traversal or shader compilation. It's really a sad state of affairs.

yeah, but that's the only version that's actually playable imo.

I would also recommend using Flawless Widescreen to increase the ridiculously low FOV.
I set it all the way to +30.
 

Kataploom

Gold Member
No way is the input lag "by design". There's nothing snappy about this game. I know lag when I see it.
Was wondering what you all meant by "lag", and after playing around an hour yesterday, yes... it's shit. I got used to it though, but I had to lower the resolution so I can play at an average of 80fps or more, so that issue goes away for the most part (not 100% though). If it's not obvious, I'm playing on PC, so it clearly isn't a console-only issue... I'm talking about the camera panning's weird acceleration though; I don't know if that's what feels laggy to you all, but it definitely feels that way to me.
 
This is a 2 trillion-dollar company. Resources are in fact infinite. They operate on cheat codes. This is a poor excuse.
They'd rather spend their resources on acquiring new studios than on making a year-old game slightly better so forum people can cheer. The new studios will have more of an effect long term. Seeing this studio produce Hi-Fi Rush shows me this is hardly something to be concerned about. It's time to move forward.
 

M1chl

Currently Gif and Meme Champion
Genuinely disappointed with this game. Why would you do this trash instead of Evil Within? Like, if it were just junk, fair enough, but it just seems so bland and uninteresting. SAD
 

Topher

Gold Member
I think this is getting so much attention because it's so rare. I don't even remember a Digital Foundry thread this long, but when I see it's more than a page, I know the Xbox lost the comparison... hahaha.

Not true. The PS5 version of Lego Star Wars was significantly worse than the Xbox version and that thread was 8 pages long. Most games only have minor differences, and those are the ones that are only a page or two.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Genuinely disappointed with this game. Why would you do this trash instead of Evil Within? Like, if it were just junk, fair enough, but it just seems so bland and uninteresting. SAD

Keep in mind Tango *CHOSE* to skip over TEW3, as Ghostwire started life *AS* Evil Within 3 in concept. It must not have done well enough for their projections.

Hopefully they get a chance to revisit it again now that they're in a big first party safety net.
 

SlimySnake

Flashless at the Golden Globes
Here is my problem with him, though it's not bad enough where I have ever openly spoken against him.

I am just a gamer and hobbyist tech head, and even I could tell when these consoles were announced that looking at just a TF number doesn't really mean much, and that in any kinda hardware setup, even more so when they are as specialized as consoles, you have to take everything together and not just look at one part of the whole.

He is a professional tech head. This is his job; he is not doing us or anyone a favor. It's his livelihood and how he makes money. He should KNOW better.

But what is worse is that even now, going on 3 years of these consoles being out, no one on that DF team has thought it necessary to do an exposé addressing what these consoles truly are and what their real strengths are. Even now, whenever PS5 comes out on top it is dismissed as the Series X having a 'bad port', as if it's somehow supposed to be better by default, even with three years of evidence suggesting otherwise. Only a tech idiot would arrive at conclusions like that.

Kinda hard to take them seriously.
There was a DF direct recently where they discussed this in great detail.

IIRC, they had reached out to devs who themselves were perplexed by the PS5 outperforming the XSX. Some think it's the DirectX overhead, others think the PS5 gets more optimization time since it's the lead platform; DF had no clue.

I do agree that as tech journos they should be able to figure this shit out. You have nearly identical GPUs in the PC market that can be used for a comparison between an XSX-equivalent GPU and a PS5-equivalent GPU. If DirectX overhead is a thing, we should be able to see it in RDNA 2 GPUs like the 6600 XT and 6700 XT. If higher clocks on the PS5 are giving it an advantage like Cerny said they would, then again, it's fairly simple to limit clock speeds on those GPUs to test that out. We see that the XSX is only 3x more powerful than the XSS but offers at least 4x more performance, sometimes more. Why? Maybe because it's clocked 400 MHz higher, thus proving Cerny's hypothesis.

A lot of the PC ports recently have had very poor optimization in RT modes. I've seen the same stuttering issues present in Xbox games, so it's definitely possible that devs are struggling with DX12. But if anything, this would help DF narrow down the exact reasons behind the performance issues on the Series X. Hogwarts is a mess on PC due to poor memory management, but it runs flawlessly on the PS5 with RT on and off, so maybe the Cerny I/O is helping there. Maybe the game was indeed optimized around the PS5 architecture. That would support their hypothesis if they could prove it.

I just find it odd that they refuse to do any kind of deep dive on this. On PC, the 8GB VRAM issues prompted a lot of PC YouTubers to run benchmarks comparing similar GPUs from Nvidia and AMD with different VRAM configs, proving that the VRAM bottleneck IS indeed a thing. It took them less than a week. DF, on the other hand, has spent almost 3 years speculating about why a more powerful GPU is being outperformed by the PS5.
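For reference, here is the kind of napkin math being argued about. This is just a hobbyist sketch using the widely reported launch specs (active CU counts and GPU clocks); the "paper" FP32 TFLOPS figure is simply CUs × 64 shaders × 2 ops per clock × clock speed, and is obviously not a full performance model.

```python
# Back-of-the-envelope FP32 TFLOPS from the commonly cited console GPU specs.
# Paper figure only: CUs * 64 shaders/CU * 2 FLOP/cycle * clock (GHz).
# PS5's clock is the "up to" boost figure from the Road to PS5 presentation.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

gpus = {
    "Xbox Series S": (20, 1.565),
    "PlayStation 5": (36, 2.23),
    "Xbox Series X": (52, 1.825),
}

for name, (cus, clock) in gpus.items():
    print(f"{name}: {cus} CUs @ {clock} GHz -> {tflops(cus, clock):.2f} TFLOPS")

# The ratio the post refers to: the XSX has roughly 3x the paper TFLOPS of the
# XSS, even though it typically renders around 4x the pixels (e.g. 4K vs 1080p).
ratio = tflops(52, 1.825) / tflops(20, 1.565)
print(f"XSX / XSS paper TFLOPS ratio: {ratio:.2f}x")
```

Running this prints roughly 4.0, 10.3 and 12.1 TFLOPS, which is where the "3x the TFLOPS but more than 3x the performance" observation in the post comes from.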
 
There was a DF direct recently where they discussed this in great detail.

IIRC, they had reached out to devs who themselves were perplexed by the PS5 outperforming the XSX. Some think it's the DirectX overhead, others think the PS5 gets more optimization time since it's the lead platform; DF had no clue.

I do agree that as tech journos they should be able to figure this shit out. You have nearly identical GPUs in the PC market that can be used for a comparison between an XSX-equivalent GPU and a PS5-equivalent GPU. If DirectX overhead is a thing, we should be able to see it in RDNA 2 GPUs like the 6600 XT and 6700 XT. If higher clocks on the PS5 are giving it an advantage like Cerny said they would, then again, it's fairly simple to limit clock speeds on those GPUs to test that out. We see that the XSX is only 3x more powerful than the XSS but offers at least 4x more performance, sometimes more. Why? Maybe because it's clocked 400 MHz higher, thus proving Cerny's hypothesis.

A lot of the PC ports recently have had very poor optimization in RT modes. I've seen the same stuttering issues present in Xbox games, so it's definitely possible that devs are struggling with DX12. But if anything, this would help DF narrow down the exact reasons behind the performance issues on the Series X. Hogwarts is a mess on PC due to poor memory management, but it runs flawlessly on the PS5 with RT on and off, so maybe the Cerny I/O is helping there. Maybe the game was indeed optimized around the PS5 architecture. That would support their hypothesis if they could prove it.

I just find it odd that they refuse to do any kind of deep dive on this. On PC, the 8GB VRAM issues prompted a lot of PC YouTubers to run benchmarks comparing similar GPUs from Nvidia and AMD with different VRAM configs, proving that the VRAM bottleneck IS indeed a thing. It took them less than a week. DF, on the other hand, has spent almost 3 years speculating about why a more powerful GPU is being outperformed by the PS5.
No, there isn't actually, because no AMD RDNA 1/2 GPU (with more than 35 CUs) has the same structure as the XSX GPU. Most AMD RDNA 2 GPUs (with more than 35 CUs) have, like the PS5, 10 CUs per shader array. But the XSX has 14 CUs per shader array.

This is a big difference in architecture, a difference all the "analysts" are oddly overlooking. For AMD it's obvious that 10 CUs per shader array is the performance sweet spot for the RDNA 1 and 2 class of GPUs. There must be good reasons for it.

Usually, what happens when you go outside the sweet spot? You lose efficiency, here performance per CU compared to the PS5, at least for gaming applications. Maybe the XSX's 14 CUs per shader array are good for other kinds of applications (ones that don't need rasterization/geometry), like cloud computing, and we know that the XSX was actually designed for gaming and cloud computing. Years ago Phil Spencer bragged about that fact, but he has obviously stopped doing it.

On the other hand, the PS5 was designed purely for 60 and 120fps (VR) gaming, and it shows.
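To make the shader-array argument concrete, here is a rough sketch under the commonly cited figures: both GPUs have 4 shader arrays, the XSX enables 52 of its 56 CUs (13 active per array) and the PS5 enables 36 of 40 (9 active per array). Splitting paper TFLOPS evenly across arrays is a simplification of how the geometry/rasterizer front end actually behaves, so treat this as illustration only.

```python
# Rough per-shader-array comparison using commonly cited console figures.
# Simplification: paper TFLOPS divided evenly across the 4 shader arrays;
# real behaviour depends on front-end throughput, caches, clocks under load, etc.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

consoles = {
    # name: (active CUs, shader arrays, clock in GHz)
    "PS5": (36, 4, 2.23),
    "XSX": (52, 4, 1.825),
}

for name, (cus, arrays, clock) in consoles.items():
    total = tflops(cus, clock)
    print(f"{name}: {cus // arrays} active CUs per array, "
          f"{total:.2f} TFLOPS total, {total / arrays:.2f} TFLOPS per array")

# Each XSX shader array has to feed ~44% more CUs than a PS5 array does.
print(f"Active CUs per array, XSX vs PS5: {13 / 9:.2f}x")
```

The output (9 vs 13 active CUs per array, ~2.57 vs ~3.04 paper TFLOPS per array) is the disparity the post is pointing at; whether it actually costs the XSX efficiency in games is exactly the open question.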
 

01011001

Banned
Genuinely disappointed with this game. Why would you do this trash instead of Evil Within? Like, if it were just junk, fair enough, but it just seems so bland and uninteresting. SAD

after Evil Within 2 I personally didn't wanna see another one... they killed the IP with that crap
 

SlimySnake

Flashless at the Golden Globes
No, there isn't actually, because no AMD RDNA 1/2 GPU (with more than 35 CUs) has the same structure as the XSX GPU. Most AMD RDNA 2 GPUs (with more than 35 CUs) have, like the PS5, 10 CUs per shader array. But the XSX has 14 CUs per shader array.

This is a big difference in architecture, a difference all the "analysts" are oddly overlooking. For AMD it's obvious that 10 CUs per shader array is the performance sweet spot for the RDNA 1 and 2 class of GPUs. There must be good reasons for it.

Usually, what happens when you go outside the sweet spot? You lose efficiency, here performance per CU compared to the PS5, at least for gaming applications. Maybe the XSX's 14 CUs per shader array are good for other kinds of applications (ones that don't need rasterization/geometry), like cloud computing, and we know that the XSX was actually designed for gaming and cloud computing. Years ago Phil Spencer bragged about that fact, but he has obviously stopped doing it.

On the other hand, the PS5 was designed purely for 60 and 120fps (VR) gaming, and it shows.
But that's another thing we can point to and say: OK, a 12 TFLOPS RDNA 2 GPU on PC is outperforming the 12 TFLOPS XSX GPU because of the XSX's 14-CUs-per-shader-array layout. It would help explain the difference in performance. It would help prove the hypothesis that not all teraflops are the same.

Again, all of this is very easily proven/disproven. IIRC, the 6800 non-XT version has 56 CUs. I'm not sure how many CUs per shader array it has, but we should be able to determine whether that is indeed the bottleneck by simply reducing the clock speeds to XSX levels or below to match the TFLOPS count and running some benchmarks.
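As a sketch of the arithmetic behind that test: the RX 6800 actually ships with 60 active CUs, so matching the XSX's ~12.15 paper TFLOPS on it works out to running it at roughly 1.58 GHz. This only sets up the "downclock and benchmark" experiment; it does not prove anything about architecture by itself.

```python
# What clock would an RX 6800 need to match the XSX's paper TFLOPS?
# Sketch only: it frames the downclock-and-benchmark test, nothing more.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

def clock_for_target(cus: int, target_tflops: float) -> float:
    # Invert the paper-TFLOPS formula to solve for the required clock.
    return target_tflops * 1000.0 / (cus * 64 * 2)

xsx_tflops = tflops(52, 1.825)   # ~12.15 TFLOPS on paper
rx6800_cus = 60                  # RX 6800 has 60 active CUs

needed_clock = clock_for_target(rx6800_cus, xsx_tflops)
print(f"RX 6800 clock to match {xsx_tflops:.2f} TFLOPS: {needed_clock:.3f} GHz")
```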
 

Mr Moose

Member
after Evil Within 2 I personally didn't wanna see another one... they killed the IP with that crap
Is it shite? I got it free with Prime and only played a bit of it; if it's crap I might not bother finishing it.
I enjoyed my time with this game, can't find my disc to play the DLC though :messenger_weary:
 

Lysandros

Member
No, there isn't actually, because no AMD RDNA 1/2 GPU (with more than 35 CUs) has the same structure as the XSX GPU. Most AMD RDNA 2 GPUs (with more than 35 CUs) have, like the PS5, 10 CUs per shader array. But the XSX has 14 CUs per shader array.

This is a big difference in architecture, a difference all the "analysts" are oddly overlooking. For AMD it's obvious that 10 CUs per shader array is the performance sweet spot for the RDNA 1 and 2 class of GPUs. There must be good reasons for it.

Usually, what happens when you go outside the sweet spot? You lose efficiency, here performance per CU compared to the PS5, at least for gaming applications. Maybe the XSX's 14 CUs per shader array are good for other kinds of applications (ones that don't need rasterization/geometry), like cloud computing, and we know that the XSX was actually designed for gaming and cloud computing. Years ago Phil Spencer bragged about that fact, but he has obviously stopped doing it.

On the other hand, the PS5 was designed purely for 60 and 120fps (VR) gaming, and it shows.
What does it feel like to converse with granite?
 

01011001

Banned
Is it shite? I got it free with Prime and only played a bit of it; if it's crap I might not bother finishing it.
I enjoyed my time with this game, can't find my disc to play the DLC though :messenger_weary:

after the first game it was a huge disappointment.
they removed all the things that made the first game good, and replaced them with generic horror stealth gameplay and (at least on console) absolutely horrifically bad controls.

there was a lot of potential in the first game, which itself was basically Resident Evil 4-2,
but instead of fixing the issues the first game had they made it more mainstream friendly, aka generic.
 

aries_71

Junior Member
DF, on the other hand, has spent almost 3 years speculating about why a more powerful GPU is being outperformed by the PS5.
But are they really tech experts? I mean, sure, they know the lingo, they are aware of the 3D techniques available at a high level, and they count pixels and framerate averages, but they strike me as advanced amateurs rather than real experts. My only hope is that, given their established name and audience, they would interview real industry engineers and experts. I think that would add value to the channel, which right now is just light commentary.
 

adamsapple

Or is it just one of Phil's balls in my throat?
A lot of the PC ports recently have had very poor optimization in RT modes. I've seen the same stuttering issues present in Xbox games, so it's definitely possible that devs are struggling with DX12. But if anything, this would help DF narrow down the exact reasons behind the performance issues on the Series X. Hogwarts is a mess on PC due to poor memory management, but it runs flawlessly on the PS5 with RT on and off, so maybe the Cerny I/O is helping there. Maybe the game was indeed optimized around the PS5 architecture. That would support their hypothesis if they could prove it.

On the flip side, Witcher 3's RT mode runs better on SX in both the launch and 4.02 patch tests done by DF. There really is no one 'catch-all' solution or outcome, it generally varies per game.
 

Gaiff

Member
But are they really tech experts? I mean, sure, they know the lingo, they are aware of the 3D techniques available at a high level, and they count pixels and framerate averages, but they strike me as advanced amateurs rather than real experts. My only hope is that, given their established name and audience, they would interview real industry engineers and experts. I think that would add value to the channel, which right now is just light commentary.
They most definitely aren't "experts" in the sense of an engineer or an actual programmer.
 
But are they really tech experts? I mean, sure, they know the lingo, they are aware of the 3D techniques available at a high level, and they count pixels and framerate averages, but they strike me as advanced amateurs rather than real experts. My only hope is that, given their established name and audience, they would interview real industry engineers and experts. I think that would add value to the channel, which right now is just light commentary.
Yeah, we never get to hear/read analysis about LOD settings between games and engines, how they render stuff, why some games' lighting looks much better than others', how they make materials look better even if both are using PBR...

DF is a letdown; it's so superficial. Like, a game will release and they won't notice odd bugs or glitches relating to the graphics. The PS5 version of Witcher 3 has a unique LOD glitch where patches of grass will render, then disappear, then come back again at random times, and you can't even repeat it in the same area, but it happens every 50 meters or so.

Even CDPR has acknowledged this bug, yet DF has released 2 videos on the game and didn't notice anything.
 

Gaiff

Member
Yeah, we never get to hear/read analysis about LOD settings between games and engines, how they render stuff, why some games' lighting looks much better than others', how they make materials look better even if both are using PBR...

DF is a letdown; it's so superficial. Like, a game will release and they won't notice odd bugs or glitches relating to the graphics. The PS5 version of Witcher 3 has a unique LOD glitch where patches of grass will render, then disappear, then come back again at random times, and you can't even repeat it in the same area, but it happens every 50 meters or so.

Even CDPR has acknowledged this bug, yet DF has released 2 videos on the game and didn't notice anything.
In fairness, with how huge today's games are and how complex rendering techniques are, you can't really fault them for missing things. They have to go through hours upon hours of footage for just a single game, often on multiple platforms and using different settings.
 

Darsxx82

Member
On the flip side, Witcher 3's RT mode runs better on SX in both the launch and 4.02 patch tests done by DF. There really is no one 'catch-all' solution or outcome, it generally varies per game.
And the last patch for Hogwarts Legacy fixes the fps drops in transition zones, the town, and the academy on XSX, also in RT mode, versus the game at launch.

As you say, there are different results and situations depending on each game, and the differences usually have more to do with the specific degree of optimization on each platform than with the supposed hardware advantages or disadvantages of those platforms.

Take Callisto Protocol as an example. It went in just 2 weeks from running at 25fps with no RT on XSX to a locked 30fps with no frame-pacing issues and RT reflections (although blurrier). All without a resolution cut (which according to NXG is slightly higher on average than on PS5) and no cut to other details.
This situation only has one explanation 😉

PS: Do we remember Lego Builder's Journey? Well, another case that generated a lot of speculation from some here, where RT existed on PS5 but was absent on XSX and XSS. It was updated. Now it even seems that the performance on XSX is superior while maintaining native 4K.
Another case of specific, different causes explaining a certain situation, where the answer is not hardware defects or weaknesses.

 
And the last patch for Hogwarts Legacy fixes the fps drops in transition zones, the town, and the academy on XSX, also in RT mode, versus the game at launch.

As you say, there are different results and situations depending on each game, and the differences usually have more to do with the specific degree of optimization on each platform than with the supposed hardware advantages or disadvantages of those platforms.

Take Callisto Protocol as an example. It went in just 2 weeks from running at 25fps with no RT on XSX to a locked 30fps with no frame-pacing issues and RT reflections (although blurrier). All without a resolution cut (which according to NXG is slightly higher on average than on PS5) and no cut to other details.
This situation only has one explanation 😉

PS: Do we remember Lego Builder's Journey? Well, another case that generated a lot of speculation from some here, where RT existed on PS5 but was absent on XSX and XSS. It was updated. Now it even seems that the performance on XSX is superior while maintaining native 4K.
Another case of specific, different causes explaining a certain situation, where the answer is not hardware defects or weaknesses.


TW3 still runs a little better on PS5 in the perf mode though. Do you have sources for Hogwarts Legacy? Not many recent comparisons here.
 

Darsxx82

Member
TW3 still runs a little better on PS5 in the perf mode though. Do you have sources for
TW3 perf mode performs very slightly better (1-2fps), but the average resolution on XSX is higher in both modes (according to VGTech). This after the game performed significantly worse at launch.
What is the reason for the improvement?
Certainly not hardware drawbacks, as some speculate here.

It is clear that most of the time the differences between the PS5 and XSX versions (consoles that have been shown to be very, very similar in power) tend to have reasons other than the advantages or disadvantages of the hardware.

Hogwarts Legacy? Not many recent comparisons here.
Sadly, no comparisons post-launch. I hope DF revisits it and we can see how much better performance is now.

In the latest patch notes (March 8) there are XSX-specific notes. On Twitter people confirm it's much better now.

  • XSX
    • Performance and Stability
      • Improved frame rate performance issue when dismissing the contextual menu.
      • Improve fidelity mode performance for smooth 30 FPS experience.
    • Raytracing
      • Improve stability and performance after long playthroughs.
      • Improve VFX performance while raytracing.
      • Improve performance by batching and caching raytracing buffers.
      • Removed fog volumes for better BVH performance.
The interesting thing is that they practically didn't start the optimization improvements for the Xbox Series consoles until patch 3; the previous ones focused on PS5 and PC 🤔
 

01011001

Banned
There was a DF direct recently where they discussed this in great detail.

IIRC, they had reached out to devs who themselves were perplexed by the PS5 outperforming the XSX. Some think it's the DirectX overhead, others think the PS5 gets more optimization time since it's the lead platform; DF had no clue.

I do agree that as tech journos they should be able to figure this shit out. You have nearly identical GPUs in the PC market that can be used for a comparison between an XSX-equivalent GPU and a PS5-equivalent GPU. If DirectX overhead is a thing, we should be able to see it in RDNA 2 GPUs like the 6600 XT and 6700 XT. If higher clocks on the PS5 are giving it an advantage like Cerny said they would, then again, it's fairly simple to limit clock speeds on those GPUs to test that out. We see that the XSX is only 3x more powerful than the XSS but offers at least 4x more performance, sometimes more. Why? Maybe because it's clocked 400 MHz higher, thus proving Cerny's hypothesis.

A lot of the PC ports recently have had very poor optimization in RT modes. I've seen the same stuttering issues present in Xbox games, so it's definitely possible that devs are struggling with DX12. But if anything, this would help DF narrow down the exact reasons behind the performance issues on the Series X. Hogwarts is a mess on PC due to poor memory management, but it runs flawlessly on the PS5 with RT on and off, so maybe the Cerny I/O is helping there. Maybe the game was indeed optimized around the PS5 architecture. That would support their hypothesis if they could prove it.

I just find it odd that they refuse to do any kind of deep dive on this. On PC, the 8GB VRAM issues prompted a lot of PC YouTubers to run benchmarks comparing similar GPUs from Nvidia and AMD with different VRAM configs, proving that the VRAM bottleneck IS indeed a thing. It took them less than a week. DF, on the other hand, has spent almost 3 years speculating about why a more powerful GPU is being outperformed by the PS5.

games are always prioritising optimisation of the platform that, for the devs, sells the most copies.

it's really that easy.
we saw that last gen too, where the One X could brute force itself through some of the shit developers pulled, but you could tell it wasn't really given the same care as PS4 Pro in many games.

you had games where the One X ran the same settings as the base Xbox One, simply with an increased resolution, while PS4 Pro used higher settings like better draw distance or foliage density.

and the only explanation here is, the devs didn't give a shit 🤷 they just didn't care...
"increase the resolution, does it run ok? ok ship it!"
that's all that often happened.

or developers pushing the resolution way too high, resulting in a worse framerate on One X, because the devs only cared whether it ran smoothly on PS4 Pro; One X was an afterthought.


and in this specific game here, the VRR modes don't even support VRR. that's not a technical limitation, that's developers not giving even half a shit about this port. I bet no one at the studio even tested this; they just implemented the no-vsync modes and never tested them on a VRR screen to catch this issue.
in this case most likely because most of the studio worked on Hi-Fi Rush and just rushed (no pun intended) this port, of a game with already lackluster sales, out the door.
 
TW3 perf mode performs very slightly better (1-2fps), but the average resolution on XSX is higher in both modes (according to VGTech). This after the game performed significantly worse at launch.
What is the reason for the improvement?
Certainly not hardware drawbacks, as some speculate here.

It is clear that most of the time the differences between the PS5 and XSX versions (consoles that have been shown to be very, very similar in power) tend to have reasons other than the advantages or disadvantages of the hardware.


Sadly, no comparisons post-launch. I hope DF revisits it and we can see how much better performance is now.

In the latest patch notes (March 8) there are XSX-specific notes. On Twitter people confirm it's much better now.

  • XSX
    • Performance and Stability
      • Improved frame rate performance issue when dismissing the contextual menu.
      • Improve fidelity mode performance for smooth 30 FPS experience.
    • Raytracing
      • Improve stability and performance after long playthroughs.
      • Improve VFX performance while raytracing.
      • Improve performance by batching and caching raytracing buffers.
      • Removed fog volumes for better BVH performance.
The interesting thing is that they practically didn't start the optimization improvements for the Xbox Series consoles until patch 3; the previous ones focused on PS5 and PC 🤔
Could be good, but patch notes don't mean much without framerate analysis.
 