
XeSS (Intel ML Upscaling) supported by Series Consoles

Still, the headline might be a bit misleading ATM.

There haven't been any announcements from MS or Intel about including this in the Xbox GDK. However, if the Intel SDK and XeSS software stack are open source, or the license otherwise allows it, developers can add this support to their games easily, since all the required pieces already exist on Xbox Series S/X: DX12 with SM6.4 or SM6.6 and hardware-level support for DP4a. AFAIK XSS and XSX both support SM6.6.
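For what it's worth, the shader-model side of that is something a title can query at runtime. A minimal sketch, assuming a stock desktop D3D12 device and a recent Windows SDK (the console/GDK path differs, but the feature check is the same idea):

```cpp
// Sketch only: query the highest shader model the D3D12 runtime exposes.
// Link with d3d12.lib. Requires a Windows SDK new enough to define D3D_SHADER_MODEL_6_6.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a default D3D12 device (on console this would come from the GDK instead).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
        return 1;

    // Ask the runtime for the highest shader model it supports, starting the query at 6.6.
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_6 };
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))))
        sm.HighestShaderModel = D3D_SHADER_MODEL_5_1; // older runtime: retry lower in real code

    std::printf("Highest shader model: 0x%x\n", (unsigned)sm.HighestShaderModel);
    return 0;
}
```

SM6.4 (0x64) is where the packed INT8 dot-product intrinsics landed, which lines up with the SM6.4 / SM6.6 requirement mentioned above.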
 
What is the proof that it does not support INT8/INT4 quad- and octa-rate math (available in RDNA1 already, optional on that arch, default on RDNA2) the way it supports double-rate FP16? The old "if it did they would have said so" / "Sony Principal Eng tweet" kind of proof, right ;)?

I see the thread has already taken an "omg, this could mean a gulf in performance" trump-card kind of turn :).
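For anyone who hasn't seen the "rate" terminology before: it's just throughput relative to FP32. Rough illustrative arithmetic, using the commonly quoted ~12.15 TFLOPS FP32 figure for Series X (ballpark numbers, not an official spec sheet):

$$\text{FP16 (double rate)} = 2 \times 12.15 \approx 24.3\ \text{TFLOPS}, \qquad \text{INT8 (quad rate)} = 4 \times 12.15 \approx 48.6\ \text{TOPS}, \qquad \text{INT4 (octa rate)} = 8 \times 12.15 \approx 97.2\ \text{TOPS}$$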

Intel's own Tom Peterson did say in an interview that Intel's cards use their own internal programming language which they say is super optimised for XeSS.

So how well Microsoft's ML APIs can leverage XeSS is open to debate.

I wouldn't expect anything "staggering", but regardless XeSS is still very impressive and I'm curious to see how AMD responds with their next iteration of FSR.
 

Panajev2001a

GAF's Pleasant Genius
Intel's own Tom Peterson did say in an interview that Intel's cards use their own internal programming language which they say is super optimised for XeSS.

So how well Microsoft's ML APIs can leverage XeSS is open to debate.

I wouldn't expect anything "staggering", but regardless XeSS is still very impressive and I'm curious to see how AMD responds with their next iteration of FSR.

I am sure Intel put their best minds on it, but this is a bit orthogonal to the point I was making.
 

Buggy Loop

Member
Still makes me chuckle that Intel dropped a solution before AMD, Microsoft and Sony, when they don't even have a horse in this race yet.

It's like some dude who partied all night and seems not to have a care in the world dropping a perfect-score homework assignment on the desk, while the rest of the group worked all night and got nowhere.
 

FlyyGOD

Member
Microsoft also didn't wait for full RDNA2 any more than Sony did, considering all the consoles released the same week, but it'll take years before this kind of misinformation spread by Windows Central, XboxEra and their Discord armies gets dispelled.
Both Sony and MS timed their releases around game development and GPU chip production so they could build enough systems. MS has more advanced features in its system because it waited for the technology.
 

Panajev2001a

GAF's Pleasant Genius
They mark it as "no" but do have a ? on that line. 🤷‍♂️

Good to see that Xbox series has support. Whether it ever gets used on console or not is anyone's guess.
The table also says that the data in it might not be reliable and should be taken with a pinch of salt :D.

ehKJibd.jpg
 
Last edited:

elliot5

Member
Still makes me chuckle that Intel dropped a solution before AMD, Microsoft and Sony, when they don't even have a horse in this race yet.

It's like some dude who partied all night and seems not to have a care in the world dropping a perfect-score homework assignment on the desk, while the rest of the group worked all night and got nowhere.
Intel's R&D department is gigantic and they are the hardware makers (CPU, iGPU and now dedicated GPU); Sony and MS just use AMD hardware. It's more of a shame that AMD hasn't been ahead of things, but then again Intel has a wayyyy bigger budget than AMD.
 

DaGwaphics

Member
XBSX doesn't look enough like RDNA 2 to be called full RDNA 2.

WpypT2O.jpg

The way a chip is arranged has little to nothing to do with the instruction sets/capabilities present. Xbox Series supports all of the capabilities of RDNA2 (as stated by AMD themselves) even without Infinity Cache (which is probably not introducing any specific instructions, acting instead as a performance enhancer).
 

Sosokrates

Report me if I continue to console war
Especially this could enable more elaborate Ray Tracing effects while still maintaining 60 fps and 4K level quality.

Yes, according to Intel the render time required to reconstruct 1080p to 4K is small.

I wonder how the render time compares to UE5's super resolution, because that seems quite resource-heavy; their target for current-gen consoles with Nanite and Lumen enabled is 1080p native, super-resed to 4K at 30fps.
 

Closer

Member
LoL @ "Market Leader". Sir, this tech is an industry-wide paradigm shift that'll be open source.

Also Microsoft is surely going to implement this as they’ve been a force in ML development.

Tell me you know nothing about the industry without telling me.

lol btw Microsoft is the most valuable company in the world. I think that’s about as market leader as you can get

Microsoft is gonna use it alright, but I don't see third parties using it as it will remove parity between versions. Time will tell.
 

Closer

Member
Why wouldn't they? Plenty of games used CB rendering vs native rendering on PS4 Pro vs One X which is a difference in rendering/reconstruction that removes "parity".

You are right but that goes back to my original post. Unless Microsoft becomes market leader and this solution is "easy" to implement, I don't see third parties even bothering.
 

elliot5

Member
You are right but that goes back to my original post. Unless Microsoft becomes market leader and this solution is "easy" to implement, I don't see third parties even bothering.
Sure, but I think the biggest thing is being easy to implement / open source, not necessarily MS being market leader. When AMD's FSR came out as part of AMD's FidelityFX package it was pretty quickly adopted into a handful of titles, from indies to AAA third-party games, on PS, PC, and Xbox. Even with DLSS being closed source and limited to a subset of PC players, lots of titles use DLSS because it's relatively easy to adopt.
 
Last edited:

Sosokrates

Report me if I continue to console war
Here is an interesting bit of info


@21.42

Rich: So is there a limitation on which GPUs from other vendors will run it (XeSS)? I mean, I'm assuming there has to be some kind of machine learning acceleration involved, right?

Intel guy: That's really a question for other vendors, right? You've seen machine learning-style applications run on GPUs with none, right... there's no reason it has to have particular hardware, it's just a performance/quality kind of complexity trade-off.


So it seems GPUs without ML hardware (INT4) will run XeSS, but there will be some performance/quality/complexity trade-off.
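For reference, the "ML hardware" in question is essentially packed-dot-product support (DP4a / the SM6.4 dot intrinsics mentioned earlier in the thread). Here's a rough sketch, in plain C++ purely for illustration and not Intel's actual kernel code, of what one DP4a-style operation computes; hardware with the instruction does this in a single operation per lane, while hardware without it falls back to the equivalent unpack/multiply/add loop:

```cpp
#include <cstdint>
#include <cstdio>

// One DP4a-style operation: treat each 32-bit word as four packed signed 8-bit values,
// multiply them pairwise, sum the four products, and add an accumulator.
// GPUs with DP4a do all of this in a single instruction; without it, the same work
// decomposes into the scalar loop below.
int32_t dp4a(uint32_t a, uint32_t b, int32_t acc)
{
    for (int i = 0; i < 4; ++i)
    {
        int8_t ai = static_cast<int8_t>(a >> (8 * i));
        int8_t bi = static_cast<int8_t>(b >> (8 * i));
        acc += static_cast<int32_t>(ai) * static_cast<int32_t>(bi);
    }
    return acc;
}

int main()
{
    // Example: dot product of (1, 2, 3, 4) and (5, 6, 7, 8) = 70.
    uint32_t a = 0x04030201; // bytes 1, 2, 3, 4 packed little-endian
    uint32_t b = 0x08070605; // bytes 5, 6, 7, 8
    std::printf("%d\n", dp4a(a, b, 0)); // prints 70
    return 0;
}
```

That fallback cost is the "performance/quality kind of complexity trade-off" the Intel rep is alluding to: the network still runs, it just eats more of the frame time.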
 
Last edited:

Closer

Member
Sure, but I think the biggest thing is being easy to implement / open source, not necessarily MS being market leader. When AMD's FSR came out as part of AMD's FidelityFX package it was pretty quickly adopted into a handful of titles, from indies to AAA third-party games, on PS, PC, and Xbox. Even with DLSS being closed source and limited to a subset of PC players, lots of titles use DLSS because it's relatively easy to adopt.

I'm skeptical. It'd be better if they adopted available tech readily, but that's not how it goes, sadly.
 
Last edited:

oldergamer

Member
What is the proof that it does not support INT8/INT4 quad- and octa-rate math (available in RDNA1 already, optional on that arch, default on RDNA2) the way it supports double-rate FP16? The old "if it did they would have said so" / "Sony Principal Eng tweet" kind of proof, right ;)?

I see the thread has already taken an "omg, this could mean a gulf in performance" trump-card kind of turn :).
Proof? They don't list it under supported hardware, no?

You also made this argument about other features and were not correct, if memory serves?
 
Last edited:

DaGwaphics

Member
Here is an interesting bit of info


@21.42

Rich: So is there a limitation on which GPUs from other vendors will run it (XeSS)? I mean, I'm assuming there has to be some kind of machine learning acceleration involved, right?

Intel guy: That's really a question for other vendors, right? You've seen machine learning-style applications run on GPUs with none, right... there's no reason it has to have particular hardware, it's just a performance/quality kind of complexity trade-off.


So it seems GPUs without ML hardware (INT4) will run XeSS, but there will be some performance/quality/complexity trade-off.


One of those things where, while it is possible, it is likely untenable to run it on GPUs without hardware support. It would probably cost you more frames than you gain by rendering internally at the lower resolution to begin with.
 

elliot5

Member
Here is an interesting bit of info


@21.42

Rich: So is there a limitation on which GPUs from other vendors will run it (XeSS)? I mean, I'm assuming there has to be some kind of machine learning acceleration involved, right?

Intel guy: That's really a question for other vendors, right? You've seen machine learning-style applications run on GPUs with none, right... there's no reason it has to have particular hardware, it's just a performance/quality kind of complexity trade-off.


So it seems GPUs without ML hardware (INT4) will run XeSS, but there will be some performance/quality/complexity trade-off.


I think this would be like how Nvidia 10-series GPUs *can* ray trace, but run like shit doing so without the hardware acceleration. That's the performance vs. quality trade-off in action. So while it may be feasible, it may not be worth implementing over traditional solutions like CB or TAA stuff.

At first, I thought it was all marketing, but this thing with VRR missing from the PS5 for over a year now is starting to make me think that there is something more to this statement. Doesn't RDNA2 support VRR out of the box?
I don't think VRR is related to RDNA2 at all. VRR is an HDMI protocol thing?
 

SoraNoKuni

Member
I quite doubt that the PS5 doesn't support INT4/INT8 calculations; it's not something new that AMD had to implement, but a feature inherited from RDNA1.

Also, Sony was the first to introduce (CB) and heavily rely on reconstruction/upscaling features; they wouldn't be so naive about that.
 
Last edited:

hlm666

Member
Rich: So is there a limitation on which GPUs from other vendors will run it (XeSS)? I mean, I'm assuming there has to be some kind of machine learning acceleration involved, right?

Intel guy: That's really a question for other vendors, right? You've seen machine learning-style applications run on GPUs with none, right... there's no reason it has to have particular hardware, it's just a performance/quality kind of complexity trade-off.


So it seems GPUs without ML hardware (INT4) will run XeSS, but there will be some performance/quality/complexity trade-off.
Before that part he also pretty much jumps over the "will there be source code" question and beats around the bush trying not to say it's like Nvidia's DLSS approach. It sounds straight up like no source code: you get their binaries ("external engines" or whatever term he used to avoid saying DLLs) and an SDK to integrate it into your software, like Nvidia unless you're using UE/Unity. It doesn't sound like AMD/MS/NVIDIA will be able to modify and compile their own binaries to optimise it for their hardware; it sounds like how Nvidia would do DLSS on other hardware (like PhysX).
 
Last edited:

SF Kosmo

Al Jazeera Special Reporter
I wonder how performant it will be. People hate on nVidia for locking their stuff, but the other chipsets don't have tensor cores that accelerate the process.

I'm a big believer in this stuff though. I really don't think base rendering needs to be higher than 1080p, and AI upscaling is plenty good to get us the rest.
 

Fredrik

Member
I'm not the one reading two separate sentences in separate paragraphs - "waiting to get full RDNA2" and "we're the only ones with DX12 Ultimate features (duh)" - and thinking they waited longer than Sony but then released 2 days later.
Boy, those must have been some seriously productive 2 days over at AMD to make such progress.
Why would you think the release date has anything to do with when they finalized the hardware? They could’ve been years apart on the hardware side and still release the same day or have the exact same hardware and still be years apart on release.
 

Loxus

Member
The way a chip is arranged has little to nothing to do with the instruction sets/capabilities present. Xbox Series supports all of the capabilities of RDNA2 (as stated by AMD themselves) even without Infinity Cache (which is probably not introducing any specific instructions, acting instead as a performance enhancer).
Doesn’t matter what you say , it’s been confirmed officially By Microsoft and AMD directly not much speculation to be had. It’s literally written in black and white.
But would say the PS5 is RDNA 1 because it looks like Navi 10 and because of that, says it doesn't support INT4/8 even though the PS5 is confirmed to be based on RDNA 2.

Such hypocrites.
Stephen A Smith Eye Roll GIF by ESPN
 

DaGwaphics

Member
But you would say the PS5 is RDNA 1 because it looks like Navi 10 and, because of that, say it doesn't support INT4/8, even though the PS5 is confirmed to be based on RDNA 2.

Such hypocrites.
Stephen A Smith Eye Roll GIF by ESPN

I don't follow you. I have no idea if PS5 specifically supports acceleration of INT4/8. You can do ML inference without it, albeit at greater cost.

I'm certain that Xbox does because MS put it on a slide and showed it at a conference, and AMD concurred. If Sony makes an official statement saying this is a supported feature, I'd obviously believe it (why would they bother to lie?).
 

elliot5

Member
Bigger question: why is Intel so far ahead of AMD in this regard? Seems ass-backwards to me.
 

DaGwaphics

Member
Why would you think the release date has anything to do with when they finalized the hardware? They could’ve been years apart on the hardware side and still release the same day or have the exact same hardware and still be years apart on release.

Very true. Wasn't the Xbox 360 CPU a derivative of Sony's work with IBM, yet it still hit the market first?
 

Loxus

Member
I don't follow you. I have no idea if PS5 specifically supports acceleration of INT4/8. You can do ML inference without it, albeit at greater cost.

I'm certain that Xbox does because MS put it on a slide and showed it at a conference, and AMD concurred. If Sony makes an official statement saying this is a supported feature, I'd obviously believe it (why would they bother to lie?).
Sony says the GPU is RDNA 2-based; common sense would tell you it also has RDNA 2 features.

MTttAZJ.jpg
 

ethomaz

Banned
At first, I thought it was all marketing, but this thing with VRR missing from the PS5 for over a year now is starting to make me think that there is something more to this statement. Doesn't RDNA2 support VRR out of the box?
VRR has nothing to do with hardware lol

Maybe you are mistaking it for VRS.
 
Last edited:

Calverz

Member
But you would say the PS5 is RDNA 1 because it looks like Navi 10 and, because of that, say it doesn't support INT4/8, even though the PS5 is confirmed to be based on RDNA 2.

Such hypocrites.
Stephen A Smith Eye Roll GIF by ESPN
Pretty sure it's been confirmed it's a custom RDNA 2. Not the full suite of features like the Xbox Series S.
 

ethomaz

Banned
But you would say the PS5 is RDNA 1 because it looks like Navi 10 and, because of that, say it doesn't support INT4/8, even though the PS5 is confirmed to be based on RDNA 2.

Such hypocrites.
Stephen A Smith Eye Roll GIF by ESPN
Just to add something here about Navi 10.

It was designed with INT4/INT8 native support, but due to a bug AMD shipped it with these features disabled.

The bug was fixed in later RDNA revisions (Navi 12, Navi 14, etc.).
 
Last edited:

Loxus

Member
Pretty sure it's been confirmed it's a custom RDNA 2. Not the full suite of features like the Xbox Series S.
What RDNA 2 features does Xbox have that the PS5 doesn't?

Also, "custom" in regards to the PS5 means they did their own stuff, like adding cache scrubbers to the GPU.
 
Last edited:

DaGwaphics

Member
Sony saying the GPU is RDNA 2 based, common sense would tell you it also has RDNA 2 features.

MTttAZJ.jpg

I don't doubt it has RDNA 2 features. Sony hasn't ever stated (to my knowledge) that it has EVERY RDNA2 feature though. So, I still have no idea about INT 4/8 unless they say something official about it.
 

oldergamer

Member
Yup, I called it: switching to the software argument. Some here have a predictable response to anything that one console seems to support that the other likely doesn't.
 

Loxus

Member
I don't doubt it has RDNA 2 features. Sony hasn't ever stated (to my knowledge) that it has EVERY RDNA2 feature though. So, I still have no idea about INT 4/8 unless they say something official about it.
Then it wouldn't be called RDNA 2 if it didn't have all the features.
 