
Atomic Heart - DF Tech Review - PS5 vs Xbox Series X/S vs PC

Lysandros

Member
XSX is more powerful. PS5 is more efficient and easier to work with.

Only devs can say (or show) which of those two options they prefer.
XSX has a slightly larger GPU with more processing cores. PS5 has a GPU with faster fixed function units and caches. They end up being similarly powerful. I do not see efficiency and speed as being unrelated to power. The CPUs are about the same. The truly meaningful exception is the I/O side not counting the APIs.
 
Marketing rights? It's just on Game Pass day one; that has nothing to do with technical support.

Doesn't Microsoft usually do something to try making sure 3P games are running adequately on their systems (relative to competition) if they have marketing rights to them? And wouldn't they want to do that for a game in their service Day 1, that is multiplat and could help attract subscribers to their service?

If this isn't something Microsoft does, then they might want to change their stance. Because having the worst performing version of a multiplat that's in Game Pass Day 1 is just reinforcing negative perspectives about the service. That it's only good for lower-quality games, or that it negatively affects the quality of games across platforms, and if you want the better-performing version of the game, you better be ready to pay upfront for it.

MS kind of defeats the appeal of Game Pass, in that sense.

Same way Sony had marketing rights for Deathloop and yet the Series X version ran better.

Lol Deathloop came out on Xbox a whole year after it hit PS5. Also it's a 1P game and the team had over a year's worth of time to optimize the Xbox versions.

Microsoft had actual incentives to make sure the Xbox version ran better. Doesn't seem like they felt those incentives were here for Atomic Heart.

You're literally creating a scenario; there is no evidence whatsoever that MS had any input in the making of the game, so please provide it. To then pretend this is some sort of fault with the hardware is as sad as you bringing up what people said post-launch earlier in the thread; that's an old take that has been disproved several times. People were saying exactly that at the launch of Dirt 5, remember, and then it simply got patched.

I didn't outright say a problem with the hardware is why the game runs worse on Series X. I even said that, out of the two, I'm more inclined to believe it was Microsoft not providing any technical assistance to the developer.

What I want to know is why they didn't provide any technical assistance, knowing they secured the game for Day 1 in Game Pass. They nullified the impact of that by a lot due to the performance issues; you can't deny this. Do Microsoft only provide that type of support for games which are otherwise exclusive or timed exclusive? Or for 1P games coming later to Xbox so that they don't create bad optics of their own 1P games running worse on Xbox compared to PlayStation?

All I'm saying is, maybe for 3P games that they get Day 1 in Game Pass, they should actually provide technical support for the devs to make sure the Xbox version is at least on par with the PlayStation version at launch, especially if it's a smaller dev who may be strapped for resources and could use the extra resources on the Xbox version (if they're busy on the PlayStation and/or PS & PC versions beforehand).

You can find as many Series X multiplats, if not more, that run better; falling behind is a delusion. It's just not a big deal because it's expected of Series X.

How many are current-gen only? Even Plague Tale: Requiem got a port to the Switch, so that wouldn't count. If you have some examples, you could share them.

We've seen with Forza Horizon 5, a full 4K 60fps locked and open world, that they can outpace anyone. I'm sure Forza Motorsport will take that even further, as it's not just a linear last-gen game with some graphical bells and whistles added.

FH5 has terrible LOD pop-in issues probably due to how its engine handles asset streaming, and those problems exist on all versions. Also it doesn't have in-game RT. And Forza Motorsport's physics engine is a lot more involved than Forza Horizon 5's.
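To illustrate what I mean by streaming-driven pop-in (a toy sketch of the general mechanism, not FH5's actual engine code): the renderer picks a LOD from camera distance, but it can only draw whatever has actually finished streaming, so the mesh visibly jumps the moment the better LOD lands in memory.

```python
# Toy sketch of streaming-driven LOD pop-in (illustrative only, not FH5's engine).
# Distance picks the LOD we *want*; we can only draw the best LOD that is resident.

DESIRED = {0: 50.0, 1: 120.0, 2: 300.0}   # hypothetical max camera distance per LOD

def desired_lod(distance):
    for lod, max_dist in DESIRED.items():
        if distance <= max_dist:
            return lod
    return max(DESIRED)                    # beyond all ranges: coarsest LOD

def drawable_lod(distance, resident):
    """Desired LOD if it's streamed in, otherwise the closest coarser resident LOD."""
    for lod in range(desired_lod(distance), max(DESIRED) + 1):
        if lod in resident:
            return lod
    return None

resident = {2}                             # only the coarsest LOD is in memory
for frame, dist in enumerate([400, 200, 90, 40, 40]):
    if frame == 4:
        resident.add(0)                    # high-detail LOD finally finishes streaming
    print(frame, dist, "want", desired_lod(dist), "-> draw", drawable_lod(dist, resident))
# Frame 3 still draws LOD 2 at close range, frame 4 snaps to LOD 0: that snap is the pop-in.
```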

This is not a direct storage issue. It's not like you're seeing the game perform identically on PC and on the XS consoles.

Stutters on PC have to do with Unreal Engine and how it handles shader compilation on the PC.
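For anyone who hasn't seen it laid out, here's a toy model of why compile-on-first-use shows up as stutter (made-up numbers, nothing to do with UE4's actual internals): the first frame that needs a new shader variant eats the compile cost on top of its normal render cost, which is exactly what a startup pre-compilation step is meant to move out of gameplay.

```python
# Toy model of shader-compilation stutter (illustrative numbers, not UE4 internals).
RENDER_MS = 10.0    # hypothetical steady per-frame render cost
COMPILE_MS = 80.0   # hypothetical cost to compile one new shader variant

def frame_times(first_use_frames, precompiled):
    compiled = set(first_use_frames) if precompiled else set()
    times = []
    for frame in range(10):
        cost = RENDER_MS
        if frame in first_use_frames and frame not in compiled:
            cost += COMPILE_MS             # hitch: compile happens mid-gameplay
            compiled.add(frame)
        times.append(cost)
    return times

new_variants_at = {2, 6}                   # frames where a new material first appears
print("on-demand  :", frame_times(new_variants_at, precompiled=False))
print("precompiled:", frame_times(new_variants_at, precompiled=True))
# On-demand spikes to 90 ms on frames 2 and 6 (the stutter); precompiled stays flat at 10 ms.
```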

People seem to keep forgetting the convoluted memory setup MS went for with the series consoles. That shit probably complicates the shit out of developing anything on those machines.

The issues on the series consoles are clearly about memory management, not IO bandwidth. Don't forget this game is also on past-gen consoles.

Idk about identical... I like to think of it more like this.

XSX is more powerful. PS5 is more efficient and easier to work with.

Only devs can say (or show) which of those two options they prefer.

XSX is more powerful on paper; in practice it really comes down to how much a given game is going to benefit from the extra GPU bandwidth and compute-driven tasks. Otherwise I'd say PS5 beats it in most areas that are more critical to gaming (pixel fillrate, GPU cache bandwidth, geometry culling & rasterization, I/O, (assumed) lower OS overhead, more refined APIs, audio). Also the GPU cache scrubbers help reduce the amount of trips it needs to make out to system RAM, and it has dedicated hardware for enforcing cache coherency.

The cache scrubbers nullify some of the Series X's raw bandwidth advantage, and the cache coherency being offloaded from the CPU in PS5 also helps there, reducing Series X's (very slight) CPU advantage, and bandwidth advantage even more. For all intents and purposes the two systems are about equal in "power".

Only the GPU can see the memory difference; the slower memory is again partly reserved for the OS, and the balance on Series X is used for CPU and audio. I'm sure DF said that the Series X actually has more memory available for developers as it reserves less.

Wasn't that just Richard speculating, and not confirmed? Anyway the issue with Series X's memory is that if a dev needs more than 10 GB for graphics data, they have to either swap to using some of the other 3.5 GB to house the rest (and would need two reserves: one for the extra data and one to act as a scratchpad if they don't want to dump data in the 10 GB out altogether and re-fetch it from storage), or access the data from storage.

In the case of a texture miss, the Series X (and Series S) GPU has a mip-blending piece that's supposed to help fill in for the missing texture until it's fetched and placed in memory, but that's obviously a fallback feature, not something devs are meant to rely on. And in the case the game would need to access data for the GPU beyond what's in the 10 GB, even if it's in the other 3.5 GB, that ends up dropping effective system memory bandwidth for the GPU.
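Just to make the fallback idea concrete, a rough sketch of the behaviour as I understand it (my own illustration, not Microsoft's actual sampler feedback hardware): if the mip the sampler wants isn't resident, you sample the best coarser mip that is, and queue the missing one to be streamed in.

```python
# Illustrative sketch only (not the actual Series X/S hardware path):
# mip 0 is the most detailed; larger numbers are coarser.

def sample_mip(wanted_mip, resident_mips, pending_requests):
    """Pick the mip to sample this frame; request the wanted one if it's missing."""
    if wanted_mip in resident_mips:
        return wanted_mip
    pending_requests.add(wanted_mip)            # stream it in for later frames
    coarser = [m for m in resident_mips if m > wanted_mip]
    return min(coarser) if coarser else None    # best resident stand-in, if any

resident, pending = {3, 4, 5}, set()            # only low-detail mips are in memory
print(sample_mip(1, resident, pending))         # -> 3 (blurry stand-in), mip 1 queued
print(pending)                                  # -> {1}
```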

We don't know by how much; it would probably vary from game to game. But you wouldn't see the GPU utilizing its full theoretical bandwidth of 560 GB/s if it has to fetch data from the other 3.5 GB and/or storage that isn't in the 10 GB. I'm not even 100% sure the Series X CAN access the other 3.5 GB, but given MS calls the 10 GB "GPU optimized", I'm assuming that the GPU can. It's just up to the developer and how they want to manage access (keeping in mind the drawbacks).
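To put some rough numbers on that (a simplified model I'm assuming, treating accesses as served one pool at a time; the real memory controller arbitration is more complicated than this): if a fraction of the GPU's traffic ends up in the 336 GB/s region instead of the 560 GB/s region, the blended rate sits somewhere in between.

```python
# Assumed back-of-envelope model of the Series X split pool, not measured data:
# if accesses are served at each region's rate, the blended bandwidth is a
# traffic-weighted harmonic mean of the two. CPU/audio traffic would lower it further.

FAST_GBPS = 560.0   # 10 GB "GPU optimal" region
SLOW_GBPS = 336.0   # remaining region, shared with CPU/audio/OS

def blended_bandwidth(slow_fraction):
    """Effective GB/s when `slow_fraction` of the GPU's bytes come from the slow region."""
    return 1.0 / ((1.0 - slow_fraction) / FAST_GBPS + slow_fraction / SLOW_GBPS)

for f in (0.0, 0.1, 0.25, 0.5):
    print(f"{int(f * 100):>2}% slow-region traffic -> ~{blended_bandwidth(f):.0f} GB/s")
# ~560, ~525, ~480 and ~420 GB/s respectively.
```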

The Series S has the advantage of GDDR6 over the One X and normally outputs at much lower resolutions, so it doesn't need the same throughput. When resolution-matched with the One X in games like Ori, Halo Infinite at 60fps, etc., Series S always outperforms the One X.

Are you sure that's due to the bandwidth? Or the type of GDDR Series S uses? Don't forget the role of the GPU in all of this.

The GDDR type doesn't matter as much if, as in the case of Series S, it has factually lower bandwidth than the One X. The only thing it could possibly help with is latency, but newer GDDR chips tend to have slightly more latency than their previous versions. And as the chip makers keep trying to boost bandwidth by large amounts each generation, they trade away some latency to do so.

In other words Series S probably has GDDR6 with slightly more latency than the GDDR5 the One X used. Same with Series X and PS5, but the reason it hurts Series S is that it has both less bandwidth in total (224 GB/s vs ~326 GB/s) AND less memory capacity: 8 GB for the Series S GPU (maybe closer to 6-7 GB after taking out what the CPU & audio use) vs 9 GB for the One X GPU (closer to 7-8 GB after taking out CPU & audio use, though MS gave back some system memory a little later).

GDDR6 has nothing to do with it. In fact, the One X has a lot more memory bandwidth than the Series S: 326.4 GB/s vs 224.0 GB/s.
The difference is that RDNA2 is a tile-based rendering architecture, while the One X is a traditional renderer. This saves a lot of memory accesses.
The Series S also has more L2 cache, which also saves on memory accesses.
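A toy way to see why the tile-based part matters (my own simplification, not how RDNA2's binning rasterizer actually works): with overdraw, an immediate-mode renderer writes a covered pixel out to DRAM once per covering triangle, while a binned renderer resolves that overdraw in on-chip tile memory and flushes each pixel roughly once.

```python
# Toy comparison of external-memory colour writes (assumed simplification;
# real GPUs have ROP caches, compression, depth rejection, etc.).

WIDTH, HEIGHT = 1920, 1080
OVERDRAW = 3            # assume each pixel is covered by 3 triangles on average
BYTES_PER_PIXEL = 4     # 32-bit colour, ignoring depth for simplicity

pixels = WIDTH * HEIGHT
immediate_bytes = pixels * OVERDRAW * BYTES_PER_PIXEL   # every covering fragment hits DRAM
tiled_bytes = pixels * 1 * BYTES_PER_PIXEL              # overdraw resolved on-chip, one flush

print(f"immediate-mode colour writes: {immediate_bytes / 1e6:.1f} MB/frame")   # ~24.9 MB
print(f"tile-based colour writes:     {tiled_bytes / 1e6:.1f} MB/frame")       # ~8.3 MB
```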

I mean in practice this SHOULD be the case and manifest in the games, and I'm sure these same advantages exist with Series X.

But none of that seems to be helping Atomic Heart perform up to expectations on Series X, and I'd like to know why. It's either because MS didn't provide technical assistance (for whatever reason) or, as aries_71 said, the devs just didn't give a shit to QA the Xbox versions (all the more reason why Microsoft should have stepped in).

I doubt it's down to architecture issues but if people don't want to entertain MS not providing support, or even entertain the devs just pissing away QA for the Series X & S versions...that doesn't leave a lot of other possibilities.
 
Last edited:

Thirty7ven

Banned
DF backed themselves into a stupid ass corner with their takes and now they have to go around downplaying, justifying or quasi ignoring PS5 version advantages in these H2H.

Hell Alex Buggaga was on B3D just the other day acting like a prime fanboy crying that devs weren’t spending enough time on Series X versions.
 

DenchDeckard

Moderated wildly
I'm here for the crazy theories on how it's the hardware that is at fault for this poor performing port....don't let me down now.

Anticipation Popcorn GIF
 
Last edited:

winjer

Gold Member
I mean in practice this SHOULD be the case and manifest in the games, and I'm sure these same advantages exist with Series X.

But none of that seems to be helping Atomic Heart perform up to expectations on Series X, and I'd like to know why. It's either because MS didn't provide technical assistance (for whatever reason) or, as aries_71 said, the devs just didn't give a shit to QA the Xbox versions (all the more reason why Microsoft should have stepped in).

I doubt it's down to architecture issues but if people don't want to entertain MS not providing support, or even entertain the devs just pissing away QA for the Series X & S versions...that doesn't leave a lot of other possibilities.

I was only comparing the Series S to the One X, with that post.
But in the case of the PS5, Series S and Series X, they are all tile-based renderers, and they have similar amounts of cache relative to the number of CUs.

The problem with the Series S/X is probably software.
For example, some time ago I saw a comparison between games running on the Windows Store and Steam.
Games on Steam ran a few fps better because they don't have the extra overhead of the UWP system.
Another issue could be the MS SDK. Maybe it's just easier to get things running well on the Sony SDK, than on the MS SDK.
 

Lysandros

Member
DF backed themselves into a stupid ass corner with their takes and now they have to go around downplaying, justifying or quasi ignoring PS5 version advantages in these H2H.

Hell Alex Buggaga was on B3D just the other day acting like a prime fanboy crying that devs weren’t spending enough time on Series X versions.
One thing is sure: the PS5 version's screen time is getting pretty limited lately, barely a third of the Series consoles' in Hogwarts and this. No side-by-side performance showing either. As they wish, I guess; this is not so important. There are other (arguably more accurate) outlets in the field for further in-depth analysis.
 

Mr Moose

Member
One thing is sure: the PS5 version's screen time is getting pretty limited lately, barely a third of the Series consoles' in Hogwarts and this. No side-by-side performance showing either. As they wish, I guess; this is not so important. There are other (arguably more accurate) outlets in the field for further in-depth analysis.
There's not much to say about the PS5 version apart from the fact that it has a few dips but is mostly solid, so they give more time to showing the Xbox versions that aren't running as well. Side-by-sides are always welcome though; pity they didn't do any.
 

Riky

$MSFT
Doesn't Microsoft usually do something to try making sure 3P games are running adequately on their systems (relative to competition) if they have marketing rights to them? And wouldn't they want to do that for a game in their service Day 1, that is multiplat and could help attract subscribers to their service?

If this isn't something Microsoft does, then they might want to change their stance. Because having the worst performing version of a multiplat that's in Game Pass Day 1 is just reinforcing negative perspectives about the service. That it's only good for lower-quality games, or that it negatively affects the quality of games across platforms, and if you want the better-performing version of the game, you better be ready to pay upfront for it.

MS kind of defeats the appeal of Game Pass, in that sense.



Lol Deathloop came out on Xbox a whole year after it hit PS5. Also it's a 1P game and the team had over a year's worth of time to optimize the Xbox versions.

Microsoft had actual incentives to make sure the Xbox version ran better. Doesn't seem like they felt those incentives were here for Atomic Heart.



I didn't outright say a problem with the hardware is why the game runs worse on Series X. I even said that, out of the two, I'm more inclined to believe it was Microsoft not providing any technical assistance to the developer.

What I want to know is why they didn't provide any technical assistance, knowing they secured the game for Day 1 in Game Pass. They nullified the impact of that by a lot due to the performance issues; you can't deny this. Do Microsoft only provide that type of support for games which are otherwise exclusive or timed exclusive? Or for 1P games coming later to Xbox so that they don't create bad optics of their own 1P games running worse on Xbox compared to PlayStation?

All I'm saying is, maybe for 3P games that they get Day 1 in Game Pass, they should actually provide technical support for the devs to make sure the Xbox version is at least on par with the PlayStation version at launch, especially if it's a smaller dev who may be strapped for resources and could use the extra resources on the Xbox version (if they're busy on the PlayStation and/or PS & PC versions beforehand).



How many are current-gen only? Even Plague Tale: Requiem got a port to the Switch, so that wouldn't count. If you have some examples, you could share them.



FH5 has terrible LOD pop-in issues probably due to how its engine handles asset streaming, and those problems exist on all versions. Also it doesn't have in-game RT. And Forza Motorsport's physics engine is a lot more involved than Forza Horizon 5's.



XSX is more powerful on paper; in practice it really comes down to how much a given game is going to benefit from the extra GPU bandwidth and compute-driven tasks. Otherwise I'd say PS5 beats it in most areas that are more critical to gaming (pixel fillrate, GPU cache bandwidth, geometry culling & rasterization, I/O, (assumed) lower OS overhead, more refined APIs, audio). Also the GPU cache scrubbers help reduce the amount of trips it needs to make out to system RAM, and it has dedicated hardware for enforcing cache coherency.

The cache scrubbers nullify some of the Series X's raw bandwidth advantage, and the cache coherency being offloaded from the CPU in PS5 also helps there, reducing Series X's (very slight) CPU advantage, and bandwidth advantage even more. For all intents and purposes the two systems are about equal in "power".



Wasn't that just Richard speculating, and not confirmed? Anyway the issue with Series X's memory is that if a dev needs more than 10 GB for graphics data, they have to either swap to using some of the other 3.5 GB to house the rest (and would need two reserves: one for the extra data and one to act as a scratchpad if they don't want to dump data in the 10 GB out altogether and re-fetch it from storage), or access the data from storage.

In the case of a texture miss, the Series X (and Series S) GPU has a mip-blending piece that's supposed to help fill in for the missing texture until it's fetched and placed in memory, but that's obviously a fallback feature, not something devs are meant to rely on. And in the case the game would need to access data for the GPU beyond what's in the 10 GB, even if it's in the other 3.5 GB, that ends up dropping effective system memory bandwidth for the GPU.

We don't know by how much; it would probably vary from game to game. But you wouldn't see the GPU utilizing its full theoretical bandwidth of 560 GB/s if it has to fetch data from the other 3.5 GB and/or storage that isn't in the 10 GB. I'm not even 100% sure the Series X CAN access the other 3.5 GB, but given MS calls the 10 GB "GPU optimized", I'm assuming that the GPU can. It's just up to the developer and how they want to manage access (keeping in mind the drawbacks).



Are you sure that's due to the bandwidth? Or the type of GDDR Series S uses? Don't forget the role of the GPU in all of this.

The GDDR type doesn't matter as much if, as in the case of Series S, it has factually lower bandwidth than the One X. The only thing it could possibly help with is latency, but newer GDDR chips tend to have slightly more latency than their previous versions. And as the chip makers keep trying to boost bandwidth by large amounts each generation, they trade away some latency to do so.

In other words Series S probably has GDDR6 with slightly more latency than the GDDR5 the One X used. Same with Series X and PS5, but the reason it hurts Series S is that it has both less bandwidth in total (224 GB/s vs ~326 GB/s) AND less memory capacity: 8 GB for the Series S GPU (maybe closer to 6-7 GB after taking out what the CPU & audio use) vs 9 GB for the One X GPU (closer to 7-8 GB after taking out CPU & audio use, though MS gave back some system memory a little later).



I mean in practice this SHOULD be the case and manifest in the games, and I'm sure these same advantages exist with Series X.

But none of that seems to be helping Atomic Heart perform up to expectations on Series X, and I'd like to know why. It's either because MS didn't provide technical assistance (for whatever reason) or, as aries_71 said, the devs just didn't give a shit to QA the Xbox versions (all the more reason why Microsoft should have stepped in).

I doubt it's down to architecture issues but if people don't want to entertain MS not providing support, or even entertain the devs just pissing away QA for the Series X & S versions...that doesn't leave a lot of other possibilities.

Why would MS do anything unless a developer went to them for assistance? Do you know how many games are being made at any one time and what resources that would take? This game had two patches of over 85 GB in two days, so the developer is obviously aware of the issues anyway. The latest patch seems to run at 60fps during gameplay; I don't see the major issue people are talking about on my play tonight on the 1.02 patch. But like I've said, we've been through all this before, from Dirt 5 to Valhalla at launch, with all these wild theories, and the games were simply patched; they didn't patch the hardware, did they?

The "appeal" of Gamepass has nothing to do with a few framerate stutters, it's a value proposition and it applies to three other consoles and PC, it's not Series X specific.

The marketing deal prevented the release of Deathloop on Xbox; we have no idea if they continued to work on it for that year. The same performance modes were patched into the PS5 version and run worse, so you would think, by your logic, that Sony would help Bethesda get parity.

Again, what evidence do you have that the developer even approached Microsoft for assistance? Developers are not always even aware themselves of the issues at launch, as again shown with Dirt 5, where they shipped the Series X version with Series S settings.

This game isn't current gen only, I don't see your point.

You can find a drawback in every game and engine, but obviously, if the memory setup really had the flaws that people like to pretend it has, then my point is: how does this flaw not appear in FH5 at 4K and a locked 60fps? It doesn't drop frames at all, is open world, and is far more demanding than this game.

I don't know why Series S vs One X was introduced into this thread, as it's totally irrelevant. For this game Series S runs at twice the framerate with higher settings apart from resolution, and the One X is miles behind even with its memory bandwidth advantage. As for Series X, its 10 GB of optimised memory has the most bandwidth of any console created, so I doubt it's an issue.
 

PaintTinJr

Member



Summary:

- Powered by Unreal Engine 4
- DF are fond of the game's visual make-up but say you can spot flaws as well.

- Current gen consoles get IQ and frame rate boost over last gen
- PS5|SX use dynamic 2160p with Unreal TAA. Mostly stays near or at full 4K.
- PS5 and SX are like for like in terms of visual features.

- Series S runs at 1080p with some minor drops, but mostly holds 1080p.
- Series S also cuts foliage density and geometric detail, also no motion blur.

- PS5 seems to run the game at a steady 60 FPS with minor drops
- SX can drop frames and stutters and can drop to high 30s in the intro
- Series S doesn't stutter as much as SX but still prone to drop.
- The day 1 patch on the Xbox versions causes massive frame-time spikes and has other glitches, like missing videos on in-game projection screens

- PC version doesn't have RT despite it being promised earlier
- Shader compilation offered at launch of the game
- Minor cut-scene drops aside, stutter is far less than in most modern games
- General PC specs and comparison with consoles shows higher quality features

- PC supports DLSS2 and FSR 1.0
- DLSS3 frame generation also supported

Thanks adamsapple for the summary!

Never in a month of Sundays is the gameplay from the 15-minute to the 60-minute mark at the start of the game powered by Unreal 4. Unreal 1 or Unreal 2, maybe, but the gameplay - at that point - is a carbon copy of the old Valve Source Engine that originally powered Half-Life 2.

The opening 15-minute tech demo walk-and-talk could be Unreal 4, although I don't remember watching the loading screen at my friend's when playing on XSX to see if the Unreal Engine screen showed, but the tunnel-cutting drill beast is so, so faceted, it isn't even at Orange Box level of fidelity and looks so Source Engine.
 
Last edited:

01011001

Banned
Dx12 has been on Xbox since at least 2019. No chance this is still a tools issue.

I'm going to go with 10 gigs of faster ram vs 16 slower at this point

Less time spent optimising for the slower-selling system. That has always been an issue for as long as consoles have existed.

The Xbox version of San Andreas says hi. GTA 3 and VC looked almost next-gen compared to PS2, but San Andreas released almost simultaneously on both and its Xbox version has a variety of issues compared to PS2.
 

SlimySnake

Flashless at the Golden Globes
Dx12 has been on Xbox since at least 2019. No chance this is still a tools issue.

I'm going to go with 10 gigs of faster ram vs 16 slower at this point
It's more like DX12 stuff, which is harder to get good.
Shouldn't that also affect PC? I thought the PC version ran fine here unlike Gotham Knights and Hogwarts which run like shit on all Xbox consoles and PC.
 

Lysandros

Member
Dx12 has been on Xbox since at least 2019. No chance this is still a tools issue.

I'm going to go with 10 gigs of faster ram vs 16 slower at this point
Why does it have to be one single thing, particularly the VRAM architecture? It may well play a role, but these are systems with components working together. There are also other facets to consider, like the differences between the CPUs, GPUs or I/O on the hardware front. It may be an amalgamation of things with varying degrees of influence depending on the processes within differing engines.
 
Last edited:

M1chl

Currently Gif and Meme Champion
Shouldn't that also affect PC? I thought the PC version ran fine here unlike Gotham Knights and Hogwarts which run like shit on all Xbox consoles and PC.
Does it run fine? If so it could be a ton of things.

I ain't watching due to this conflict, I feel like not contributing to watching videos and stuff promoting the game. It's my own personal choice, not saying anyone should follow me.
 
Watch the DF video again. The Series X isn't just stuttering. It has prolonged periods of running at a much lower framerate. A stutter can be due to an access miss or something. Running at a lower FPS for prolonged durations of time speaks more to RAM or its availability. It means the entire engine is getting bottlenecked somehow. We know it's not a raster bottleneck, and it also isn't an IO bottleneck given that the game is on the previous-gen consoles too. That leaves memory as the only possible culprit. And what is the only difference between the memory on the PS5 and that of the XS consoles?
Reminds me of the XB1, which had such a problem (1-second freezes) in a Fallout game (an MS game!), but that was actually caused by really slow real-time I/O access on the HDD (because it could pretty much be resolved by using an SSD).

But without profiling it's really hard to know what's causing those pauses here. Could be I/O too. Imagine if they somehow use real-time super fast Kraken on PS5 and regular PC SSD on Xbox? Could cause this, but it's really only a vague theory (probably wrong).
 

PaintTinJr

Member
Shouldn't that also affect PC? I thought the PC version ran fine here unlike Gotham Knights and Hogwarts which run like shit on all Xbox consoles and PC.
Even though it seemed to run fine on my friend's Series X (VRR-capable OLED TV) when I tried it, if there is a big performance difference I would chalk it up to Valve's Source Engine game loop still being largely single-threaded from its idTech origins. PC CPUs with high clocks and massive caches - and being full desktop chips - should have no issues with a single-threaded game bottleneck, and the PlayStation 5 being more CPU cache latency efficient than the Series consoles gets it a little boost in comparison too - would be my guess.
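To put a rough, made-up number on that guess (an Amdahl-style sketch, not profiled data from this game): if most of the CPU frame sits on one thread, per-core speed sets the frame-rate ceiling and extra cores barely move it.

```python
# Rough Amdahl-style sketch with assumed numbers, not measurements from Atomic Heart.

def cpu_frame_ms(total_work_ms, serial_fraction, cores, per_core_speed):
    """total_work_ms is measured on a hypothetical 1.0-speed reference core."""
    serial = total_work_ms * serial_fraction
    parallel = total_work_ms * (1.0 - serial_fraction)
    return (serial + parallel / cores) / per_core_speed

WORK_MS = 20.0   # hypothetical CPU work per frame on the reference core
SERIAL = 0.8     # assume 80% of it is stuck on the main thread

for name, cores, speed in [("console-class CPU", 8, 1.0), ("fast desktop CPU", 8, 1.6)]:
    ms = cpu_frame_ms(WORK_MS, SERIAL, cores, speed)
    print(f"{name}: {ms:.1f} ms CPU frame -> ~{1000 / ms:.0f} fps ceiling")
# console-class CPU: 16.5 ms -> ~61 fps ceiling
# fast desktop CPU:  10.3 ms -> ~97 fps ceiling
```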
 

adamsapple

Or is it just one of Phil's balls in my throat?
Only issue I’ve experienced with the Series X version was the weird frame drops while riding in the elevator in the opening segment.

Yep. I've seen plenty of bugs and even posted about the TV screens not showing anything in the GAF OT, but I haven't observed big FPS drops like DF. Maybe VRR helps, maybe my version is a patch ahead of when the video was recorded. Can't say for sure.

Ah well.
 
Last edited:
I'm shelving this for the moment, pretty disgusted. Playing the latest 1.0.2.0 XSX patch and videos are not even playing on the wall monitors or TVs in the safe rooms. That's nothing to do with DirectX, hardware or other generally stupid warrior GAF takes. The dev must know these videos are not playing and still decided to release the game for Xbox in such a state. Their QA would not miss such a blatant issue. They just didn't dedicate enough time to iron out the issues on Xbox. If this is what MS gets with their deals, I'm afraid they are laughing in their faces.

edit: also, neuromodules are not dropping on the Xbox and Windows versions. Devs are working on a patch. Pretty sad, to be honest.
Oh, the devs saw this; the QA guys definitely let them know... It's just not a P0 issue, it's likely a late P1 or even a P2 because it's not "game breaking", so it's okay to ship. I just wonder why the devs weren't able to address it before launch. They must have been working on P0 issues for launch.
 

lucbr

Member
That's to be expected when every single PS5 GPU component is running 22% faster compared to XSX counterparts. Additional speed isn't free.

That's my point. This additional speed, characteristic of the architecture used by Sony, provides similar performance to the Xbox. However, the Xbox has lower power consumption, apart from the fact that it is smaller and quieter. If I consider that both have a similar cost to produce (which I don't know), then due to its efficiency it seems to me that the XSX is a very well-architected device, and its creators did a great job and should be applauded. Even more so considering that Sony is a master, with more than 70 years of building excellent consumer hardware products.
 
I bet you'll be saying this in 2035... lol. I wasn't saying it. There were also people saying that the PS5 would beat PCs for years to come thanks to its untouchable god-like IO. There's crazy on both sides; just laugh at it.

PC is destroying consoles now just like it has for every gen.
Sure, if you have a $1500+ video card...
 

Pimpbaa

Member
Pfft, who cares which version performs better. All versions fail due to lack of HDR. I know that is partly UE4‘s fault, but devs have added HDR to their UE4 powered games. Hell, Netherrealm Studios added HDR to their modified version of UE3 for their most recent games. I expect this shit from small indie devs making cheap games, not a dev making a full priced game. Glad I’m game passing this instead of buying.
 
Because those are generally just nice-to-haves and aren't things most gamers, and certainly developers, actually care about.

Also AFAIK both systems consume similar amounts of power running the same games, PS5 maybe a tad more so at points but that was also before the move to 6nm. I think you're conflating the "sometimes much less" part with the power saving settings features, which was somewhat incorrect even at the time the article got written.

Xbox systems do seem to default to a mode at rest where less power is consumed than PS5. But that's down to the platform holders and their preferences.
Why go through the trouble of typing all that out when you could've just said "Real gamers and developers don't care about anything Xbox does better... only about what PS5 does."

"What you meant to say is that they're really just totally equal in power consumption and any discrepancy is because Sony CHOSE for it to be that way."

Is this schtick your full time job?
 
Pfft, who cares which version performs better. All versions fail due to lack of HDR. I know that is partly UE4‘s fault, but devs have added HDR to their UE4 powered games. Hell, Netherrealm Studios added HDR to their modified version of UE3 for their most recent games. I expect this shit from small indie devs making cheap games, not a dev making a full priced game. Glad I’m game passing this instead of buying.
That's my takeaway. Trying to give any console a win here when the devs couldn't even nail something as basic as HDR kinda comes off as desperately trying to find one.
 

Lysandros

Member
Trying to give any console a win here when the devs couldn't even nail something as basic as HDR kinda comes off as desperately trying to find one.
Yeah, of course. This point tends to appear quite regularly when PS5 happens to outperform XSX in a game. Luckily there is always a 'something' that renders the result meaningless.
 
Last edited:

Little Chicken

Gold Member
Perhaps people who haven't actually played the game should stop calling the supposed power of any console into question; it's apparent that the game isn't pushing any hardware very hard, it's just unoptimised and buggy.

More importantly, it's still entirely playable and fun on all platforms.
 

Beer Baelly

Al Pachinko, Konami President
Never in a month of Sundays is the gameplay from the 15-minute to the 60-minute mark at the start of the game powered by Unreal 4. Unreal 1 or Unreal 2, maybe, but the gameplay - at that point - is a carbon copy of the old Valve Source Engine that originally powered Half-Life 2.

The opening 15-minute tech demo walk-and-talk could be Unreal 4, although I don't remember watching the loading screen at my friend's when playing on XSX to see if the Unreal Engine screen showed, but the tunnel-cutting drill beast is so, so faceted, it isn't even at Orange Box level of fidelity and looks so Source Engine.

Jim Carrey What GIF


This game doesn't look or feel like a Source game.
 

aries_71

Junior Member
Perhaps people who haven't actually played the game should stop calling the supposed power of any console into question; it's apparent that the game isn't pushing any hardware very hard, it's just unoptimised and buggy.

More importantly, it's still entirely playable and fun on all platforms.
Why would you want to inject common sense and reasoning into Gaf? Tell me why. WHY? 👍
 

DenchDeckard

Moderated wildly
Dx12 has been on Xbox since at least 2019. No chance this is still a tools issue.

I'm going to go with 10 gigs of faster ram vs 16 slower at this point

But then what's the reason when they patch the Xbox version? Still a hardware issue?

It just doesn't make sense to me. How is this game running so well on PCs with GPUs that have 8 GB of slower RAM? Think about that...
 
Last edited: