
Digital Foundry on the XSX teraflops advantage: It's kinda all blowing up in the face of Xbox Series X

How? The purpose of megascans is not to remodel them, but to slam them together, typically reusing a small number, to create unique-looking results from the natural scan complexity with mind-boggling polygon counts in the result, or to use them as individual models as-is in isolation.

The Coalition effectively conceded that using megascans dynamically in the former way is very limited on XsX, and multi-platform UE5 games do look limited like their demo. And, as I tried to discuss, the launch-era argument about geometry pre-culling via the custom geometry engine would both back up the claim and shed light on why the technical tweets of the time were opaque.
The Coalition conceded nothing. Megascans are simply a library that may not have been required to fit the intended vision of their offering. The underlying Nanite technology was certainly in use, as it is core to UE5.

It's possible that toolsets on the Series consoles weren't in the same place as PS5 initially, which makes sense considering Epic's partnership with Sony. The point is no one knows, as we haven't seen a game released at the same fidelity level seen in the UE5 tech demos running on current hardware.
 

shamoomoo

Member
I couldn't find it because the city was updated. Please check the images below instead. In this scene, the PS5 version was more rounded, and I could see more leaves.
The location is SLAPPY SHORES.
vDhyStM.jpg
4kSuFvM.jpg
d1rK911.jpg
Interesting, does anyone know why Fortnite is so dynamic? Also, the Series X appears to be missing the light shaft on the lamppost that the PS5 has.
 

Three

Member
I thought the Xbox 360's 250 GFLOPS GPU was very well balanced by its Xenon processor and 512 MB of unified VRAM, especially for a $299 console. It was the PS3 that was served a dud of a GPU with bottlenecks everywhere, and Kutaragi's ridiculous decision to split the VRAM.
It wasn't really Kutaragi's ridiculous decision to split VRAM. Split VRAM was the norm. It was MS' ingenious decision to unify it. The dud GPU was Nvidia's doing; MS made the right call to go ATI/AMD for the GPU.
 

Lysandros

Member
I couldn't find it because the city was updated. Please check the images below instead. In this scene, the PS5 version was more rounded, and I could see more leaves.
The location is SLAPPY SHORES.
vDhyStM.jpg
4kSuFvM.jpg
d1rK911.jpg
It's almost like XSX is lacking tessellation. Was this brought up in DF's analysis of the game?
 

kikkis

Member
The systems simply are what they are at this point. Anyone expecting a major shift 3 years in is just setting themselves up for disappointment.
I can't think of a single 3rd-party current-gen-only game made by a top-tier studio like Infinity Ward or id.
 

DaGwaphics

Member
I can't think of a single 3rd-party current-gen-only game made by a top-tier studio like Infinity Ward or id.

I was actually agreeing with you already. I agree that there will be games with a resolution boost on XSX throughout the generation.

My comment was more for those that think Sony is going to jump ahead because of this or that, though there are some on the XSX side that are similar. I'm confident the systems will continue to perform extremely similarly to each other throughout.
 

DenchDeckard

Moderated wildly
I couldn't find it because the city was updated. Please check the images below instead. In this scene, the PS5 version was more rounded, and I could see more leaves.
The location is SLAPPY SHORES.
vDhyStM.jpg
4kSuFvM.jpg
d1rK911.jpg
Hey thanks, that is interesting. I'm going to fire it up and check it out.

Both running the same settings? 60fps mode on both. Seems odd....
 

DaGwaphics

Member
Hey thanks, that is interesting. I'm going to fire it up and check it out.

Both running the same settings? 60fps mode on both. Seems odd....

Time of day is so different in the shots (light is coming from much lower and the opposite side in the first shot). Not that great of a comparison, tbh. In one the building is in shadow, in the other it isn't, etc.
 
Yeah, those Fortnite screenshots are eye-opening. What is going on 🤣 Was there an update that changed things up? Series X has hard edges and weird lighting; the street light isn't even casting light. It could be a slight TOD difference, I'm not sure.
 

Gaiff

SBI’s Resident Gaslighter
IMO the design and split pool all make sense when you visualise racks of XsX in Xcloud, where only the GPU is being used and the CPU cloud sessions are done on a high-core-count server with lots of RAM. The memory split just means they are only wasting the slower portion when two XsS sessions are running, or they are able to multiplex multiple XBLA sessions along with an XsS session, although I imagine the Xcloud XsX machines have 24GB of unified RAM.
They should have had a different SKU for their cloud models. As it stands, the split pool is utterly useless to the console and in fact detrimental.
 

DaGwaphics

Member
That shouldn't affect the number of leaves we can see, or missing textures/rounded shapes from objects.

It might not here in this case, but it absolutely can. Many times some elements in games are dynamic depending on what is going on, and one time of day might be heavier on the GPU than another.
 

DaGwaphics

Member

It's the truth. A lot of times the sunset hours in games with a day/night cycle run significantly worse than high noon. A lot more shadow work and things like that.

In this case though, everything is just lit from an entirely different angle, which makes it hard to compare. The building in the PS5 shot is illuminated so much more, so of course it looks sharper; on the XSX the building is in shadow. Same with the trees: looking at the shadow side vs the side being illuminated. A shot with similar lighting would be easier to compare.
 

DeepEnigma

Gold Member
It's the truth. A lot of times the sunset hours in games with a day/night cycle run significantly worse than high noon. A lot more shadow work and things like that.

In this case though, everything is just lit from an entirely different angle, which makes it hard to compare. The building in the PS5 shot is illuminated so much more, so of course it looks sharper; on the XSX the building is in shadow. Same with the trees: looking at the shadow side vs the side being illuminated. A shot with similar lighting would be easier to compare.
Bro. This game is the same no matter the TOD. And this TOD is just minutes apart since it's dynamic in real time.
 

DaGwaphics

Member
Bro. This game is the same no matter the TOD. And this TOD is just minutes apart since it's dynamic in real time.

The time of day in these shots would actually be about 12hrs apart in the real world. LOL One is in the morning, the other is in the evening; the source of light has changed completely from the right to the left.

Might not change what you are talking about in particular, but it's just so jarring a change that it makes the shots difficult to compare.

Untitled-1.jpg
 

PaintTinJr

Member
The Coalition conceded nothing. Megascans are simply a library that may not have been required to fit the intended vision of their offering. The underlying Nanite technology was certainly in use, as it is core to UE5.

It's possible that toolsets on the Series consoles weren't in the same place as PS5 initially, which makes sense considering Epic's partnership with Sony. The point is no one knows, as we haven't seen a game released at the same fidelity level seen in the UE5 tech demos running on current hardware.
You are arguing without reviewing the subject matter of the AlphaPoint demo, which is intended to show off similar results to the UE5 showcase on PS5. The walls and flooring are exactly the type of subject you'd model by kit-bashing in real time using megascans of a few rocks containing millions of polygons, if your hardware could handle it.

Even to achieve what they did with 100x less geometry, they fell back to fake 2D decal techniques for the undulating terrain floor/walls. These were techniques used heavily in the PC space in the time of the OG Xbox, and features that would only get used in PlayStation first-party games if they were aiming for 60fps on PS3, or if the technique was specifically to stylise.

We see games like the new Jedi Survivor using Unreal 5 and Nanite but no liberal use of megascans, and then we get PS4/X1 versions. I can't help but see a connection between the lack of kit-bashing and megascans in the AlphaPoint demo, the replacement techniques used, and the state of Nanite in released games with cross-gen versions.
 

SlimySnake

Flashless at the Golden Globes
We see games like the new Jedi Survivor using Unreal 5 and Nanite but no liberal use of megascans, and then we get PS4/X1 versions. I can't help but see a connection between the lack of kit-bashing and megascans in the AlphaPoint demo, the replacement techniques used, and the state of Nanite in released games with cross-gen versions.
Jedi Survivor doesn't use UE5 or Nanite. They made the decision early on not to use UE5 because the engine was still in development when they started dev.
 

PaintTinJr

Member
Jedi Survivor doesn't use UE5 or Nanite. They made the decision early on not to use UE5 because the engine was still in development when they started dev.
I was sure it was UE5 Nanite-enabled, but clearly I got that wrong. Like you, though, I look around and am struggling to find games using UE5 with Nanite that look better than any other engine without Nanite. IMO, without the ability to kit-bash with megascans, Nanite is just Kraken for geometry plus an auto-LoD system to save development time. Games really don't look impressive compared to last-gen, and certainly don't have the geometry on display that the first reveal of UE5 sold us all on.
 

SlimySnake

Flashless at the Golden Globes
The systems simply are what they are at this point. Anyone expecting a major shift 3 years in is just setting themselves up for disappointment.
I don't think what kikkis said was too outlandish. We have seen several games that have performed better on the XSX, in the 10-18% range. The problem is that it isn't consistent, and for some bizarre reason XSX ends up losing some other benchmarks by 10-18%. I do agree that we will continue to see similar results as the gen progresses: XSX wins some, PS5 wins others, especially when PS5 seems to be the lead platform.

What's interesting to me are the UE5 comparisons. I was sure that Matrix, Fortnite and Remnant 2 would present a clear 18% advantage, but we have not really seen it play out. Probably because of DRS masking some of these advantages, but it seems the engine is CPU-bound, and because they both have the same CPU, it doesn't seem to be helping the XSX.

Immortals and Lords of the Fallen 2 are next, and if they don't show that 15-25% advantage we saw in Doom and Metro, then I guess it's time to retire these arguments, since almost the entire industry seems to be moving to UE5.
 

DaGwaphics

Member
No, it's not. It moves super fast and both are night shots with the moonlight being from left to above.

That's why I said real-world: in the real world this kind of light change could never happen over the course of a few minutes. The light differences are quite drastic, as I pointed out in the image I posted. Comparing the shadow side of a surface to the lit side is a drastic change in my opinion.
 

PaintTinJr

Member
I don't think what kikkis said was too outlandish. We have seen several games that have performed better on the XSX, in the 10-18% range. The problem is that it isn't consistent, and for some bizarre reason XSX ends up losing some other benchmarks by 10-18%. I do agree that we will continue to see similar results as the gen progresses: XSX wins some, PS5 wins others, especially when PS5 seems to be the lead platform.

What's interesting to me are the UE5 comparisons. I was sure that Matrix, Fortnite and Remnant 2 would present a clear 18% advantage, but we have not really seen it play out. Probably because of DRS masking some of these advantages, but it seems the engine is CPU-bound, and because they both have the same CPU, it doesn't seem to be helping the XSX.

Immortals and Lords of the Fallen 2 are next, and if they don't show that 15-25% advantage we saw in Doom and Metro, then I guess it's time to retire these arguments, since almost the entire industry seems to be moving to UE5.
For a point of reference, the OG Xbox typically rendered games at far higher resolution with inferior fx and performance in non-memory-bound games like MGS2. At this point the small difference in XsX resolution is just eating up otherwise-lost CU utilisation that can't be put to better use, rather than being a measure of superiority in rendering IQ or performance. As noted with the upcoming Starfield, scaling back isn't getting a 60fps option, so with dynamic resolution and frame rate there's possibly zero cost to the XsX for hitting a higher dynamic res in lighter scenes by using the full width of all the CUs.
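For anyone unfamiliar with how dynamic resolution "eats up" spare GPU headroom, here's a toy Python sketch of a typical DRS controller. All the numbers (frame-time budget, resolution bounds, damping factors) are hypothetical, illustrative values, not anything from either console:

```python
def next_height(gpu_ms, target_ms, height, lo=1440, hi=2160):
    """Toy dynamic-resolution controller: scale render height by the
    square root of the frame-time headroom (cost scales with pixel
    area), damped and clamped so the image doesn't pump visibly."""
    scale = (target_ms / gpu_ms) ** 0.5   # >1 when under budget
    scale = min(max(scale, 0.9), 1.1)     # limit the per-frame step
    return max(lo, min(hi, round(height * scale)))

# Heavy scene: 20 ms GPU frame vs a 16.6 ms target -> drop resolution.
print(next_height(20.0, 16.6, 2160))
# Light scene: lots of headroom -> climb back toward the cap.
print(next_height(10.0, 16.6, 1800))
```

In a light scene the controller rides the upper bound, which is the "zero cost" case described above: the extra CUs render extra pixels that would otherwise sit idle.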
 
You are arguing without reviewing the subject matter of the AlphaPoint demo, which is intended to show off similar results to the UE5 showcase on PS5. The walls and flooring are exactly the type of subject you'd model by kit-bashing in real time using megascans of a few rocks containing millions of polygons, if your hardware could handle it. Even to achieve what they did, they fell back to fake 2D decal techniques for the undulating terrain, techniques used heavily in the PC space in the time of the OG Xbox, and features that would only get used in PlayStation first-party games if they were aiming for 60fps on PS3, or if the technique was specifically to stylise.

We see games like the new Jedi Survivor using Unreal 5 and Nanite but no liberal use of megascans, and then we get PS4/X1 versions. I can't help but see a connection between the lack of kit-bashing and megascans in the AlphaPoint demo, the replacement techniques used, and the state of Nanite in released games with cross-gen versions.

The megascans library is just that... a library of very high-quality assets. It's Nanite tech that is responsible for the high geometric detail visible in UE5. It's up to the developer to decide if they use existing assets or create new ones (or both).

As for the 'real-time' claims made: we have seen UE5 demos on PC running with megascan assets on much slower drives than what resides within the Series consoles. We have also seen those same PC demos running on older GPU tech than what resides within the current consoles. Heck, we had the Coalition co-developed Matrix Awakens UE5 demo running on both, but that clearly isn't enough for you.

Also, wasn’t it finally confirmed that Jedi Survivor runs UE4?

It seems you really want to believe that PS5 has something that the Series consoles don't. That there is this great visual experience on the horizon that only PlayStation can provide. It's funny, as Ninja Theory's Hellblade 2 and Project Mara, along with the Initiative's Perfect Dark and the Coalition's new game, are all set to raise the visual bar over the next couple of years. Hell, Xbox has more confirmed first-party UE5 titles in development than Sony. I'm just not seeing your overall point.
 

DenchDeckard

Moderated wildly
Those Fortnite shots look like the cutbacks you'd make to have the 120hz mode vs the 60hz mode. Clear favour of the PS5 there. I'm gonna fire it up and check.
 

Bogroll

Likes moldy games
Those Fortnite shots look like the cutbacks you'd make to have the 120hz mode vs the 60hz mode. Clear favour of the PS5 there. I'm gonna fire it up and check.
I've just been there and it's different. No arch to the right of the purple squid.


Just been there again (new game) and the arch is there
 

SlimySnake

Flashless at the Golden Globes
I was sure it was UE5 Nanite-enabled, but clearly I got that wrong. Like you, though, I look around and am struggling to find games using UE5 with Nanite that look better than any other engine without Nanite. IMO, without the ability to kit-bash with megascans, Nanite is just Kraken for geometry plus an auto-LoD system to save development time. Games really don't look impressive compared to last-gen, and certainly don't have the geometry on display that the first reveal of UE5 sold us all on.
Yeah, the first demo seems to be in a league of its own, but Matrix and Remnant 2 have impressed me. There is a clear upgrade in asset quality over what we've seen in Demon's Souls and Ratchet, two games with the best asset quality on PS5.
 

PaintTinJr

Member
The megascans library is just that... a library of very high-quality assets. It's Nanite tech that is responsible for the high geometric detail visible in UE5. It's up to the developer to decide if they use existing assets or create new ones (or both).

As for the 'real-time' claims made: we have seen UE5 demos on PC running with megascan assets on much slower drives than what resides within the Series consoles. We have also seen those same PC demos running on older GPU tech than what resides within the current consoles. Heck, we had the Coalition co-developed Matrix Awakens UE5 demo running on both, but that clearly isn't enough for you.

Also, wasn't it finally confirmed that Jedi Survivor runs UE4?

It seems you really want to believe that PS5 has something that the Series consoles don't. That there is this great visual experience on the horizon that only PlayStation can provide. It's funny, as Ninja Theory's Hellblade 2 and Project Mara, along with the Initiative's Perfect Dark and the Coalition's new game, are all set to raise the visual bar over the next couple of years. Hell, Xbox has more confirmed first-party UE5 titles in development than Sony. I'm just not seeing your overall point.
Rectangular buildings that in Nanite view don't produce pixel or sub-pixel clusters... megascans?

The Matrix demo is a demonstration of the AI-driven characters and the Lumen lighting system. It isn't kit-bashing high-quality megascans. The buildings with basic lighting are PS4-level Spiderman geometry+. I've run it on both consoles (XsX at a friend's) and on PC, along with the Valley of the Ancient, and the latter is far more impressive visually because of the megascans IMO.

Given how clunky the editor was at handling modest adjustments to the Valley demo on my 12-core/24-thread, 32GB workstation, I still don't think UE5 kit-bashes all that geometry in real time by default; it does it at compile time, which might be why the demos ran on much weaker hardware than I expected.

Overall the UE5 demos on PC have been impressive for the tech, but the first demo was the most visually arresting, even if the lighting in other demos looked superior on inferior assets.
 

Stooky

Member
PS: NX Gamer is the same exact way as DF. Just read a tweet from him about how he thinks Insomniac will "match or exceed" Spiderman 2's reveal trailer, except... we've seen gameplay now and it's not looking anywhere near as good, and it can't possibly improve that much in 2 months. He works for IGN now too, so that says it all. They're like extra mouthpieces for these companies and have been for a while.
Bad example. The last Spider-Man 2 trailer shows several upgrades, more like image refinements, and the devs confirmed it. I don't know what you are expecting.
 

DaGwaphics

Member
I have pictures from 7 months ago. Around the same time in the daytime.
Image before any updates.
Yv4aHoL.jpg
rx35GZo.jpg
usuoPxM.jpg

Notice how much more similar the general shading, trees, etc. look in shots with closer lighting.

Assuming that the XSX shot is running in the same mode as the PS5 shot, and that both console shots are actually from the consoles represented, there is a reduction in triangles between the two on some objects but not all, but certainly nothing that is really getting noticed without a heavy zoom or a side-by-side comparison.
 

rnlval

Member
IMO the 360 is as close as Xbox ever got to a balanced console, but the context of RRoD, caused by the GPU specs, undermines that.

It took 2 years of revisions to fix RRoD, and the launch specs weren't targeting 720p but 1024x768, based on the lack of HDMI for lossless video, audio or stereoscopic 3D, and the EDRAM amount fitting a double-buffered 1024x768. That meant superior 2nd-half-gen titles were sub-HD on 360, showing imbalance, and the absence of a HDD and HD-DVD in the base model led to other imbalances IMO. But the GPU probably looked pretty balanced because its TF and fill-rate were based around a popular PC GPU, the ATI 9700 Pro/9800.

By comparison, in the 2nd half of the gen the PS3 could demonstrate that the GPU/SPUs' TF/fill-rate, CPU, storage and RAM were all in balance, but it was PlayStation's least balanced console IMO... though it wasn't their intended design, which was supposed to be two Cell BEs plus a PS2-style GS and unified XDR memory. Problems beyond their control forced a redesign with Nvidia. The 360 was a new gen on a Pro-console timescale, landing at the most inopportune time, when the gulf of change in graphics over a 6-year period was the biggest we've seen.

As for the FPS games the PS2 didn't get: the 2-year-older console was only short of memory and a HDD, and Carmack even discussed the iconic Doom 3 "Carmack's reversal" shadow technique aligning well with the PS2's graphics capabilities in the Nvidia paper, a feature he couldn't do on the OG Xbox version because it lacked fill-rate and a z-buffer (although the technique was a reinvention of an older Creative Labs patent). The cost of the HDD was also not viable at the time for a base console box price because of the added shipping weight of a cheap 3.5-inch HDD; it had been subsidised heavily by Xbox for the OG Xbox and was removed from the 360 Arcade in the next gen. So the absence of such games from PS2 is pretty logical, and not indicative of unbalanced design in the time frame the PS2 launched, IMO.

I also think the Xbox One shared imbalance issues. The lack of extra ACEs and rapid packed math meant that by the end of the generation the PS4 using FP16 had twice the capability with async compute, Cyberpunk probably being the biggest example of that.

The ESRAM size was another poor choice that meant 900p was the only way to extract the full fill-rate. Compared with that, the PS4's unified GDDR5 let the CPU and GPU dereference RAM buffers between them instantly, and decompressing high-quality textures for games like Arkham Knight, MGSV, TLOU2, Death Stranding and GoT, to name just a few, while still running at Full HD locked to 30 or 60, showcased the PS4's balance and the X1's imbalance by its deficiencies IMO.
FYI, the PS4 Pro has FP16 rapid packed math; the baseline PS4 doesn't. The PC R7 265 delivers PS4-level performance despite having two ACE units.
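For anyone wondering what rapid packed math actually buys: two FP16 values fit in the storage of one FP32 value, so a 32-bit ALU lane can run one packed instruction on both halves at once, roughly doubling FP16 throughput. A minimal Python/numpy sketch of the packing (the ops-per-clock figure at the end is a made-up illustrative number, not a real GPU spec):

```python
import numpy as np

# Two half-precision floats occupy exactly the bytes of one single:
pair = np.array([1.5, -2.25], dtype=np.float16)
assert pair.nbytes == np.dtype(np.float32).itemsize  # 4 bytes either way

# Reinterpret the pair as one 32-bit word, as a packed register holds it:
packed = pair.view(np.uint32)[0]

# Unpack and confirm nothing was lost:
unpacked = np.array([packed], dtype=np.uint32).view(np.float16)
assert list(unpacked) == [1.5, -2.25]

# The throughput argument: one packed op does the work of two scalar ops.
fp32_ops_per_clock = 64                   # hypothetical per-CU FP32 rate
fp16_rpm_ops_per_clock = 2 * fp32_ops_per_clock
print(fp16_rpm_ops_per_clock)             # 128
```

This is why "FP16 = twice the capability" claims only apply to hardware with packed execution (PS4 Pro, Vega and later), not to GPUs that merely store FP16.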
 

rnlval

Member
Maybe the GPU can be rated even higher, as RDNA2 is better than GCN Polaris.

The problem is the RAM setup (size + bandwidth).
FYI, RDNA 2 is still "GCN" with lower-latency wave32 instruction set support. RDNA 1 and RDNA 2 support both GCN's wave64 and RDNA's wave32 instruction sets.

Half of Navi 31's CUs' stream processors don't have GCN wave64 instruction set support. It's only a matter of time before the GCN wave64 instruction set is dropped. RDNA's wave32 aligns with NVIDIA's CUDA 32-thread warp size.
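One practical reason wave width matters: on a divergent branch, every lane in a wave sits through every path that any lane in that wave takes, so narrower waves waste fewer lane-cycles when neighbouring threads disagree. A toy Python model of that cost (the 64-thread branch pattern is a contrived example, not real shader data):

```python
def divergence_lane_cycles(wave_size, takes_branch):
    """Count lane-cycles spent on an if/else across a thread group.
    A wave pays for a path if any of its lanes take it, and the whole
    wave is occupied for the duration of that path."""
    total = 0
    for i in range(0, len(takes_branch), wave_size):
        wave = takes_branch[i:i + wave_size]
        paths = int(any(wave)) + int(not all(wave))  # if-side + else-side
        total += paths * wave_size
    return total

# 64 threads: the first 32 take the if-side, the last 32 the else-side.
threads = [True] * 32 + [False] * 32
print(divergence_lane_cycles(64, threads))  # one wave64 runs both paths
print(divergence_lane_cycles(32, threads))  # two wave32s each run one path
```

With this pattern a single wave64 executes both sides (128 lane-cycles), while two wave32s each execute only their own side (64 lane-cycles total), which is part of why RDNA added the narrower mode.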
 

//DEVIL//

Member
I couldn't find it because the city was updated. Please check the images below instead. In this scene, the PS5 version was more rounded, and I could see more leaves.
The location is SLAPPY SHORES.
vDhyStM.jpg
4kSuFvM.jpg
d1rK911.jpg
If someone made me play the PS5 version then the Xbox Series X version, I wouldn't be able to tell the difference (keyword: made me play, because fuck this game).

but looking side by side, PS5 is a much superior version.
 