
XeSS (Intel ML Upscaling) supported by Series Consoles

 
Then it wouldn't be called RDNA 2 if it didn't have all the features.

Honestly, at this point it seems RDNA 2 has different definitions and no one can agree on one.

It's clear that PS5's GPU is based on RDNA 2 but customised for their own needs and purposes as stated by Mark Cerny himself, similar to their Zen 2 CPU.

Posters are arguing that it's not "full RDNA 2" because it lacks hardware VRS support as well as SFS.

It's pretty obvious such features were likely removed by Sony themselves for several reasons, but that's another discussion. The ML hardware for INT4/INT8 operations has been found in AMD GPUs up to and including RDNA 2, so similar to VRS and SFS, if the PS5 doesn't have hardware ML support then it was purposefully removed by Sony.
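For context on what "hardware ML support" means here: RDNA 2's instruction set includes packed dot-product instructions that process four INT8 (or eight INT4) values per 32-bit lane. A minimal Python sketch of the INT8 case (illustrative only; the function name and packing are mine, modelled on instructions like V_DOT4_I32_I8):

```python
# Illustrative sketch (not vendor code) of what a packed-INT8 dot-product
# instruction computes: four 8-bit multiplies plus an accumulate,
# all within a single 32-bit lane.
import struct

def dot4_i32_i8(a: int, b: int, acc: int = 0) -> int:
    """Unpack two 32-bit words into four signed 8-bit values each,
    multiply pairwise, and accumulate into a 32-bit integer."""
    a4 = struct.unpack("4b", a.to_bytes(4, "little"))
    b4 = struct.unpack("4b", b.to_bytes(4, "little"))
    return acc + sum(x * y for x, y in zip(a4, b4))

# Pack the vectors [1, 2, 3, 4] and [5, 6, 7, 8] into single 32-bit words.
a = int.from_bytes(struct.pack("4b", 1, 2, 3, 4), "little")
b = int.from_bytes(struct.pack("4b", 5, 6, 7, 8), "little")
print(dot4_i32_i8(a, b))  # 1*5 + 2*6 + 3*7 + 4*8 = 70
```

The point of packing is throughput: one instruction does four multiply-accumulates, which is why INT8/INT4 rates are quoted as multiples of the FP32 rate.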
 

Zathalus

Member
What RDNA 2 features Xbox has that PS5 doesn't?

Also custom in regards to the PS5 means they did their own stuff like adding Cache Scrubbers to the GPU.
The PS5 seems to lack Tier 2 VRS and Mesh shaders. No PS5 games have leveraged Tier 2 VRS, and Cerny clearly mentioned Primitive shaders, not Mesh shaders.

Sampler Feedback is unknown as well, as they have made no mention of it.

No I'm not claiming that the PS5 is not RDNA2, just that the feature set is obviously malleable.
 

ethomaz

Banned
The PS5 seems to lack Tier 2 VRS and Mesh shaders. No PS5 games have leveraged Tier 2 VRS, and Cerny clearly mentioned Primitive shaders, not Mesh shaders.

Sampler Feedback is unknown as well, as they have made no mention of it.

No I'm not claiming that the PS5 is not RDNA2, just that the feature set is obviously malleable.
These are feature names chosen by Microsoft… they are really the names these features take in the DirectX 12 API.

Sony consoles don't use DirectX APIs.
So they will never have these feature names, but they could have similar features under their own names.

Different but still the same situation? After all, there are several ways to reach the same results, no?
 
The PS5 seems to lack Tier 2 VRS and Mesh shaders. No PS5 games have leveraged Tier 2 VRS, and Cerny clearly mentioned Primitive shaders, not Mesh shaders.

Sampler Feedback is unknown as well, as they have made no mention of it.

No I'm not claiming that the PS5 is not RDNA2, just that the feature set is obviously malleable.

I would add that PS5 can support Mesh Shaders at a hardware level but Sony chose to go with Primitive Shaders.

The performance gains are the same but they function differently on a software/API level.
 

Zathalus

Member
These are feature names chosen by Microsoft… they are really the names these features take in the DirectX 12 API.

Sony consoles don't use DirectX APIs.
So they will never have these feature names, but they could have similar features under their own names.

Different but still the same situation? After all these are several ways to reach the same results, no?
No, Mesh shaders and VRS are terms that are not exclusive to DirectX.



If the PS5 had Mesh Shader support he likely would have mentioned that instead of Primitive Shaders. Not that I expect it to be a massive difference; Mesh Shaders are likely just a bit more programmatic, but overall they should provide similar functionality.

As for Tier 2 VRS, Doom Eternal had it on Series X but not PS5, and there has been zero mention of it anywhere. It might not be a great loss, as software VRS can be done quite well.

Sampler Feedback is a total unknown. Sampler Feedback Streaming is a Series exclusive feature, but PS5 might have the standard SF.
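On the software-VRS point above: the core idea is just to pick a coarser shading rate for screen tiles with little detail. A hypothetical minimal sketch (the tile format and threshold are invented for illustration, not from any engine):

```python
# Hypothetical sketch of the idea behind software VRS: choose a per-tile
# shading rate from local luminance contrast, so flat tiles are shaded
# once per 2x2 block instead of once per pixel.
def tile_shading_rate(tile_pixels, threshold=0.1):
    """Return a coarse shading rate (1 = full rate, 2 = half rate per
    axis) based on the tile's luminance contrast."""
    contrast = max(tile_pixels) - min(tile_pixels)
    return 1 if contrast > threshold else 2

# A flat sky tile vs. a detailed edge tile (luminance values in [0, 1]).
flat_tile = [0.80, 0.81, 0.80, 0.82]
edge_tile = [0.10, 0.95, 0.20, 0.90]
print(tile_shading_rate(flat_tile))  # 2 -> shade once per 2x2 block
print(tile_shading_rate(edge_tile))  # 1 -> shade every pixel
```

Hardware Tier 2 VRS does this per-primitive or via a screen-space rate image; a software version applies the same decision in the shaders themselves.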
 
The PS5 seems to lack Tier 2 VRS and Mesh shaders. No PS5 games have leveraged Tier 2 VRS, and Cerny clearly mentioned Primitive shaders, not Mesh shaders.

Sampler Feedback is unknown as well, as they have made no mention of it.

No I'm not claiming that the PS5 is not RDNA2, just that the feature set is obviously malleable.
We know for sure that the PS5 doesn't support Tier 2 VRS, because Doom Eternal devs said they would've implemented it if possible.
 

Corndog

Banned
Microsoft also didn't wait for full RDNA2 any more than Sony, considering all consoles released the same week, but it'll take years before this kind of misinformation spread by Windows Central, XboxEra and their Discord armies gets dispelled.
Xbox APU development was later than PS5's. That's a fact. What difference that made, I don't know.
 

ethomaz

Banned
It's software? I had no idea that any GPU could support it.
Any GPU can support it, since there is enough bandwidth in the HDMI port.
It is, after all, an HDMI 2.1 feature, but HDMI 2.0, 1.4, etc. devices can have it as long as the device maker implements it.

BTW, any AMD GPU that supports FreeSync can support VRR… that includes all GPUs after GCN 2 (I'm not sure if there is anything hardware-specific that prevents it in GCN 1, but GCN 2 is where AMD's support starts).
 

sncvsrtoip

Member
This can potentially be a big advantage for XSX. I can't understand how Sony thought better support for ML (INT8/4) is not that important.
 

01011001

Banned
I will wait for actual game comparisons before getting hyped for another upscaling method. FSR 1.0 was a joke, for example; barely even worth mentioning whether a game supports it or not. But Intel is using machine learning, so this could be good; we will see.

And since I see the old PS5 vs Xbox crap is starting up again... guys, we know the PS5 is missing RDNA2 features that the Xbox consoles have... why is this even up for discussion again? It's public knowledge... Also, the supported hardware mentions 2 custom AMD chips, and one of them is not supported; most likely that is the PS5. Let's wait and see if this Intel upscaler is even worth fanboy warring about... it could be just as shit as FSR, although I bet it won't be quite as useless.
 

ethomaz

Banned
No, Mesh shaders and VRS are terms that are not exclusive to DirectX.



If the PS5 had Mesh Shader support he likely would have mentioned that instead of Primitive Shaders. Not that I expect it to be a massive difference, Mesh Shaders are likely just a bit more programatic but overall they should provide similar functionality.

As for Tier2 VRS, Doom Eternal had it on Series X but not PS5. Also zero mention of it anywhere as well. Might not be a great loss as software VRS can be done quite well.

Sampler Feedback is a total unknown. Sampler Feedback Streaming is a Series exclusive feature, but PS5 might have the standard SF.
Your link is talking about Nvidia's VRS, which is not the same as MS VRS (which is based on AMD's VRS).
 

Loxus

Member
nvidia sells Turing GPUs both with and without RT and tensor cores. All of them are still Turing GPUs.
RDNA 2 does RT and ML via CUs.


The same CUs that are in the PS5 and have been confirmed many times to be RDNA 2 CUs.


"We built a GPU with 36CUs. Mind you RDNA2 CUs are large, each has 62% more transistors than the CUs we were using on PlayStation 4.

So if we compare transistor counts, 36 RDNA2 CUs equates to roughly 58 PlayStation 4 CUs."


Right there is how you know the PS5 supports INT4/8 instructions.
You don't have to be a brain surgeon to understand that.
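The arithmetic in the Cerny quote above checks out; "62% more transistors" per CU is exactly where the "roughly 58 PlayStation 4 CUs" figure comes from:

```python
# Checking the transistor-count comparison from the Cerny quote:
# 36 RDNA2 CUs, each with 62% more transistors than a PS4 CU.
rdna2_cus = 36
transistor_ratio = 1.62  # RDNA2 CU vs. PS4 (GCN) CU transistor count
ps4_cu_equivalent = rdna2_cus * transistor_ratio
print(round(ps4_cu_equivalent, 2))  # 58.32, i.e. "roughly 58 PS4 CUs"
```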
 

Riky

$MSFT
That is not what is being discussed.

Indeed VRS Tier 2 or anything DirectX related is not supported by PS5.

That doesn’t mean it doesn’t have its own similar features in its own API.

It's not a DirectX feature, it's a hardware feature; ask AMD.

Also,

"It's also interesting to note that Xbox Series consoles use the hardware-based tier two VRS feature of the RDNA2 hardware, which is not present on PlayStation 5. VRS stands for variable rate shading, adjusting the precision of pixel shading based on factors such as contrast and motion. Pre-launch there was plenty of discussion about whether PS5 had the feature or not and the truth is, it doesn't have any hardware-based VRS support at all"

Just accept it.
 

Neo_game

Member
Interesting, I wonder if they are going to use it. Because if I remember correctly, even the RTX 2060 is twice as fast as the SX in INT4/INT8 calculations. So I am not sure it means much. Surprised the GTX 10 series is supported; I am not sure it is going to be relevant on those cards either.
 

Andodalf

Banned
Interesting, I wonder if they are going to use it. Because if I remember correctly even rtx 2060 is twice as fast as SX in INT4, INT8 calculation. So I am not sure it means much. Surprised GTX 10 series is supported. I am not sure it is going to be relevant on these cards either.
SX supports rapid packed math for INT4 and INT8 though.
 

Mister Wolf

Gold Member
I can't wait for this stuff to bear fruit. I'm starting to get impatient since they've been talking about it for so long. This, Sampler Feedback, Mesh Shaders, DirectStorage, etc. Can we get at least a couple of games in 2022 using it?
 

DaGwaphics

Member
I can't wait for this stuff to bear fruit. I'm starting to get impatient since they've been talking about it for so long. This, Sampler Feedback, Mesh Shaders, DirectStorage, etc. Can we get at least a couple of games in 2022 using it?

I'm hoping that they just drop all support for X1. Now that X1 supports streaming it wouldn't really destroy GP for X1 players and would allow them to build directly against the next-gen features.
 

elliot5

Member
I'm hoping that they just drop all support for X1. Now that X1 supports streaming it wouldn't really destroy GP for X1 players and would allow them to build directly against the next-gen features.
Halo Infinite is the last X1 game from XGS, as far as what's been shown. I don't see any sign of or reason to continue releasing on Xbox One. Not including Grounded, which hasn't exited early access, ofc.
 

elliot5

Member
So the consoles have Infinity Cache? Or Infinity Cache isn't a feature of RDNA 2?
I think Infinity Cache isn't a "feature" so much as a rebrand of L3 cache? It was part of their Ryzen CPUs IIRC and those aren't "RDNA2". Idk if I'd call it a feature, but maybe some consider it as such
 

Schnozberry

Member
I think Infinity Cache isn't a "feature" so much as a rebrand of L3 cache? It was part of their Ryzen CPUs IIRC and those aren't "RDNA2". Idk if I'd call it a feature, but maybe some consider it as such
The size of the cache makes the difference. 128MB of on-die GPU cache increases the cache hit rate by quite a bit over 32 or 64MB. There is less dependence on the GDDR memory as a result.

Neither console has Infinity Cache. They are essentially custom APUs, and the die area required to include 128MB of cache would make cost and yields a serious problem.
 

FireFly

Member
I think Infinity Cache isn't a "feature" so much as a rebrand of L3 cache? It was part of their Ryzen CPUs IIRC and those aren't "RDNA2". Idk if I'd call it a feature, but maybe some consider it as such
Right, but those are CPUs, not GPUs, and it's the first time that such a large amount of cache has shipped in a PC GPU. I mean, AMD describes it as a feature of the RDNA 2 architecture on their page.


So either their own marketing is wrong, or what counts as being "RDNA 2" is somewhat flexible.
 

DaGwaphics

Member
Right, but those are CPUs, not GPUs, and it's the first time that such a large amount of cache has shipped in a PC GPU. I mean, AMD describes it as a feature of the RDNA 2 architecture on their page.


So either their own marketing is wrong, or what counts as being "RDNA 2" is somewhat flexible.

It's a bit of semantics, I think. It's a feature, sure, but I'm not sure it's a feature that developers are directly programming against. MS's wording regarding RDNA 2 comes across more as them saying there is no code that can run on a desktop RDNA 2 part that can't run on an Xbox Series console. The IC is giving the desktop parts a performance advantage on certain things (as AMD has demonstrated), but Xbox Series can run everything.
 

Fredrik

Member
Very true. Wasn't the Xbox 360 CPU a derivative of Sony's work with IBM, yet it still hit the market first?
No idea. I do have some insight into how things turn from paper spec into a final electronics product, but I don't know much about consoles. I can guess, though, and I'm guessing that AMD simply kept working on new iterations of RDNA2 after Sony said "Enough! We need an actual working chip now so we can hit our launch target!" MS got a later version, but both are still RDNA2.
 

FireFly

Member
It's a bit of semantics, I think. It's a feature, sure, but I'm not sure it's a feature that developers are directly programming against. MS's wording regarding RDNA 2 comes across more as them saying there is no code that can run on a desktop RDNA 2 part that can't run on an Xbox Series console. The IC is giving the desktop parts a performance advantage on certain things (as AMD has demonstrated), but Xbox Series can run everything.
I agree. My point is that what counts as being RDNA 2 is ultimately a question of semantics / marketing.
 

M1chl

Currently Gif and Meme Champion
That is not what is being discussed.

Indeed VRS Tier 2 or anything DirectX related is not supported by PS5.

That doesn’t mean it doesn’t have its own similar features in its own API.
Most of the software feature set is based on what you as a dev/manufacturer can do with the HW through SW. Things which are HW-based, like DLSS, RT acceleration and so on, need special HW; most other features do not, and can be done in some assembly injected into the driver.
 

Dream-Knife

Banned
At first, I thought it was all marketing, but this thing with VRR missing from the PS5 for over a year now is starting to make me think that there is something more to this statement. Doesn't RDNA2 support VRR out of the box?
Yes, but FreeSync should be usable on PS5.

Maybe something else is going on, or Sony doesn't care. Do any Sony displays even support FreeSync?
 
Yeah it makes no sense, but then again, big corporations make moves like that very often so no surprise.

Yea it’s like taking bets on what’s relevant to save on production costs. Sometimes the bets work sometimes they dont.

But what makes this even more odd is that the Xbox One X|S both have VRR, and those came out in 2017 (I think).
 

Hugare

Member
Is it good, tho?

Because FSR has been pretty shitty so far

DLSS stands king, far above anything else (being used) on the market today
 

Allandor

Member
Microsoft's specs say 97 TOPS for INT4 and 49 for INT8 for the SX, which is very low. I think the RTX 2060 is twice these numbers.
That's because Nvidia GPUs count the tensor cores toward that. The funny thing is that they are almost unused so far (other than for DLSS) for the tasks they are on board for. Still waiting for Nvidia to use them for RT; they still only use them for DLSS. I guess they are not capable enough to work on DLSS + RT (the parts that can be processed on the tensor cores) at the same time.
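For what it's worth, the quoted Series X figures line up with packed-math scaling of its widely reported 12.15 TFLOPS FP32 rate, with no separate tensor units involved:

```python
# Where the quoted Series X numbers come from: packed math multiplies
# the FP32 rate by 4 for INT8 and by 8 for INT4, since that many values
# fit in each 32-bit lane per cycle.
fp32_tflops = 12.15          # Series X peak FP32 rate (public figure)
int8_tops = fp32_tflops * 4  # 4 INT8 ops per FP32 lane per cycle
int4_tops = fp32_tflops * 8  # 8 INT4 ops per FP32 lane per cycle
print(f"INT8: {int8_tops:.1f} TOPS, INT4: {int4_tops:.1f} TOPS")
# -> INT8: 48.6 TOPS, INT4: 97.2 TOPS, matching the ~49/97 quoted above
```

Nvidia's tensor cores are dedicated matrix units on top of the shader ALUs, which is why their quoted INT8/INT4 rates are so much higher at a similar FP32 rate.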
 