
Intel shows more info about XeSS - up to 153% performance gain in Ultra-Performance

winjer

Gold Member

Intel XeSS offers five quality modes, up to 153% performance in Ultra-Performance


Intel software engineers behind XeSS technology mention that upscaling and anti-aliasing should be treated as a single problem. But the use of temporal super sampling may not always be the first choice for developers. Older games and older GPUs may see better results and quicker implementation with spatial upscalers. Games supporting ray tracing and other graphics-intensive technologies on the latest GPU hardware will undoubtedly see better performance and visuals with temporal upscalers.

Intel intends to support all GPU vendors with XeSS. The company reaffirms that all GPUs with support for Shader Model 6.4+ and DP4a instructions will support the technology. Still, XeSS will work best with their own Arc and other graphics architectures with Matrix Extensions (XMX) acceleration support. For those, a special version of XeSS will be available. Game developers should not worry, though, as XeSS will have a single API and the use of either XMX or DP4a models will be managed internally.
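
To make the "single API, backend chosen internally" idea concrete, here is a minimal Python sketch of that dispatch pattern. To be clear, none of these names come from the actual XeSS SDK; they are purely illustrative of one entry point routing to an XMX kernel on supporting hardware and a DP4a fallback everywhere else.

def _upscale_xmx(frame, motion_vectors):
    # Placeholder for the XMX (matrix engine) model path.
    return frame

def _upscale_dp4a(frame, motion_vectors):
    # Placeholder for the DP4a fallback model path.
    return frame

def select_backend(caps: dict) -> str:
    # Prefer XMX acceleration; otherwise require Shader Model 6.4+ and DP4a.
    if caps.get("xmx"):
        return "xmx"
    if caps.get("dp4a") and caps.get("shader_model", 0.0) >= 6.4:
        return "dp4a"
    raise RuntimeError("GPU does not meet the minimum requirements")

def upscale(frame, motion_vectors, caps: dict):
    # Single entry point a game calls every frame; the backend choice stays internal.
    if select_backend(caps) == "xmx":
        return _upscale_xmx(frame, motion_vectors)
    return _upscale_dp4a(frame, motion_vectors)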



XeSS might not be available yet, but the company is already working on overcoming the most common issues with temporal upscalers, such as ghosting or blurring. For those issues, Intel is introducing its own algorithms that will either eliminate the artifacts or reduce them to a minimum.
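
For context, the textbook way temporal upscalers fight ghosting in general is neighborhood clamping: the reprojected history sample is clamped to the colour range of the current frame's local neighborhood, so stale values get rejected. A minimal single-pixel sketch of that generic idea (not Intel's specific algorithm, which they haven't detailed):

def clamp_history(history: float, neighborhood: list[float]) -> float:
    # Clamp the reprojected history sample to the min/max of the current
    # frame's 3x3 neighborhood, rejecting stale (ghosting) values.
    lo, hi = min(neighborhood), max(neighborhood)
    return max(lo, min(history, hi))

# Example: a bright trail left behind by a moving object gets pulled back
# into the range of the current frame, suppressing the ghost.
current_3x3 = [0.10, 0.12, 0.11, 0.09, 0.10, 0.13, 0.11, 0.12, 0.10]
print(clamp_history(0.85, current_3x3))  # -> 0.13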



Unlike AMD FSR 2.0 and NVIDIA DLSS 2.3, Intel XeSS will feature five image quality modes, including an Ultra Quality mode with a 1.3X scaling factor. This scaling factor is not available with DLSS or FSR, although there have been rumors that they might add one at some point in the future.



Intel claims that their technology can achieve higher scaling ratios than other temporal and spatial upscaling techniques. The Ultra-Quality mode will improve performance by 21% to 27% at 1440p and 4K resolutions respectively, while Ultra-Performance will offer a 97% to 153% better framerate. Those numbers are based on the Rens demo powered by an Arc Alchemist GPU running at a fixed (undisclosed) frequency.
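
To put those numbers in concrete terms: the scaling factor applies per axis, so a 1.3X mode renders internally at roughly 59% of the target pixel count, and "153% better framerate" is a multiplier on top of native performance, i.e. about 2.5x the frames. A quick arithmetic sketch (the 1.3X factor and the 1440p/4K targets are from the article; the native framerate is just a made-up example):

def internal_resolution(target_w: int, target_h: int, scale: float):
    # Per-axis scaling factor -> internal render resolution.
    return round(target_w / scale), round(target_h / scale)

for target in [(2560, 1440), (3840, 2160)]:
    print(target, "->", internal_resolution(*target, scale=1.3))
# (2560, 1440) -> (1969, 1108)   (1/1.3)^2 is ~59% of the target pixel count
# (3840, 2160) -> (2954, 1662)

native_fps = 40                      # made-up native framerate
print(native_fps * (1 + 1.53))       # ~101 fps from a "153% better" result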



Intel does not yet have a launch date for XeSS or a full list of games that will support it. We might learn more soon, though, at the official introduction of Arc GPUs on March 30th.

 

Tripolygon

Banned
I love GDC, lots of interesting technology being talked about.

Those cheeky bastards. They used the same demo nvidia used to introduce DLSS. Lol
[Slide from Intel's XeSS GDC 2022 deep dive presentation]
 

hlm666

Member
Looking good. Interested to see FSR 2 and XeSS in motion, but I suspect they're all gonna be pretty close and you won't be able to pick them apart in general use.

Though one thing still annoys me: it's not really open source if the binaries the SDK/API is calling the functions from are not open source. As it is, if AMD puts some kind of AI hardware on its next-gen cards, they can't modify it to use their new hardware. It's a little more open than NVIDIA/DLSS, but it's not really open source; how it performs on every piece of hardware is dependent on Intel and what they see fit to support.
 

manfestival

Member
Stills can be made to look nice with any solution of this type. We need to see it in action and have it released. Unleash it into the wild!
 

Hugare

Member
Cool stuff

But what I really want to see is a driver-level, global solution for all games. That would be the endgame.

I don't know if that would be possible, but man, it would be a dream come true
 

Swift_Star

Banned
They'll still have visual features, performance and load times to compare :p
Those are arguably better comparisons. I honestly can't see the difference in resolution between the quality and perf modes on HFW. One doesn't look softer than the other, they both look sharp; the perf mode introduces some shimmering and aliasing but that's it to me. Pixel counts are useless for most people I guess.
 

//DEVIL//

Member
I love GDC, lots of interesting technology being talked about.

Those cheeky bastards. They used the same demo nvidia used to introduce DLSS. Lol
I hope someone posts a detailed video comparing this technology to DLSS. This is the best hands-on we'll be able to get until the tech is out
 

winjer

Gold Member
Soon enough, the majority of games will have one or more of these upscalers.
And considering that both FSR 2.0 and XeSS work on other companies' GPUs, there is a good chance we'll always have one to choose from.
 
FSR 2.0 Quality is a 1.5 scale.
On Reddit there were people complaining that they wanted an "Ultra Quality" option with a bit less scaling. Intel is offering just that, an "Ultra Quality" with a 1.3 scale.
 

ethomaz

Banned
I hope the image example is very compressed.
Because it is too blurry imo.

XeSS indeed removes the aliasing but loses clarity in everything else.
 

Dampf

Member
I love GDC, lots of interesting technology being talked about.

Those cheeky bastards. They used the same demo nvidia used to introduce DLSS. Lol
Kudos to them showing 720p to 1440p. That's a much, much bigger challenge for upscaling compared to 4K modes like what AMD have been showing.
 

winjer

Gold Member
Kudos to them showing 720p to 1440p. That's a much, much bigger challenge for upscaling compared to 4K modes like what AMD have been showing.

AMD showed FSR with a 50% render scale. 1080p to 4K
Besides, 2K is 1080p. So Intel upscaling 720p to 1080p is just a render scale of 68%.
 

winjer

Gold Member
720p is much less data to work with than 1080p, so differences in upscaling will be much more obvious at 1440p and especially 1080p.

Reconstructing an image from 1080p to 4K is a bigger jump than 720p to 1080p.

In the first case, we are creating 4 times the amount of pixels.
In the second case, we are creating 2.25 times the amount of pixels.

This means that in the first case, each new pixel has less information.
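
For reference, the ratios work out like this (a quick sketch using the standard resolutions):

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_ratio(src: str, dst: str) -> float:
    # How many output pixels per input pixel for a given jump.
    sw, sh = resolutions[src]
    dw, dh = resolutions[dst]
    return (dw * dh) / (sw * sh)

print(pixel_ratio("1080p", "4K"))    # 4.0
print(pixel_ratio("720p", "1080p"))  # 2.25
print(pixel_ratio("720p", "1440p"))  # 4.0 -- same ratio as 1080p -> 4K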
 

DaGwaphics

Member
AMD showed FSR with a 50% render scale. 1080p to 4K
Besides, 2K is 1080p. So Intel upscaling 720p to 1080p is just a render scale of 68%.

The example presented above is 720p to QHD, which is why the 720p original is 1/4 the size of the final image. In the real world, a 2K monitor is 2560 x 1440, regardless of what the specification was supposed to be.
 

winjer

Gold Member
The example presented above is 720p to QHD, which is why the 720p original is 1/4 the size of the final image. In the real world, a 2K monitor is 2560 x 1440, regardless of what the specification was supposed to be.

2K is not 1440p. That is not the standard.
 

MonarchJT

Banned
Those are arguably better comparisons. I honestly can't see the difference in resolution between the quality and perf modes on HFW. One doesn't look softer than the other, they both look sharp; the perf mode introduces some shimmering and aliasing but that's it to me. Pixel counts are useless for most people I guess.
Instead, 4s vs 7s loading times are a game changer for most people? ahahah
let's be real here... average gamers don't give a fuck about most of the things that users on forums like this one talk about
 

DaGwaphics

Member
2K is not 1440p. That is not the standard.

They've used 2K in the way the market uses the term. 2K is not used to label FHD displays. It is used to reference QHD and WQHD displays.

Just look at the monitors listed as 2K by Newegg, Bestbuy, or the display manufacturers themselves. Intel literally explains that this was demoing their most aggressive mode with a HD to QHD reconstruction.

 

winjer

Gold Member
They've used 2K in the way the market uses the term. 2K is not used to label FHD displays. It is used to reference QHD and WQHD displays.

Just look at the monitors listed as 2K by Newegg, Bestbuy, or the display manufacturers themselves. Intel literally explains that this was demoing their most aggressive mode with a HD to QHD reconstruction.

I'm not getting into another thread about the 2K standard. There have been too many here on NeoGAF.

Regardless, even if Intel is upscaling 720p to 1440p, it's the same information per pixel as AMD showing 1080p to 2160p.
 

DaGwaphics

Member
I'm not getting into another thread about the 2K standard. There have been too many here on NeoGAF.

Regardless, even if Intel is upscaling 720p to 1440p, it's the same information per pixel as AMD showing 1080p to 2160p.

As always, what the standard is on paper has absolutely no bearing or relevance to how the term is defined in the real world.

A 2K display is 1440p because the market decided to go with that term. If you have a problem with that, take it up with all the GPU makers, the display makers, and the retail chains. :messenger_winking_tongue:

It is a 4x increase in total pixels in both cases sure, who said that it wasn't? The lower the starting pixels the worse the results, so kudos to them for showing the more difficult 720p to 1440p reconstruction.
 

winjer

Gold Member
It is a 4x increase in total pixels in both cases sure, who said that it wasn't? The lower the starting pixels the worse the results, so kudos to them for showing the more difficult 720p to 1440p reconstruction.

These temporal solutions sample neighboring pixels, with a jittering pattern, across a number of frames, then accumulate the result in a weighted manner.
So the resolution doesn't matter, because we are not sampling the whole image, just the pixels that are nearby.
What matters is how many pixels are sampled. This means the resolution scale and the number of sampled pixels are the important factors.
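
As a rough single-pixel sketch of that process, generic jittered temporal accumulation looks something like this (the Halton sequence is just the usual choice for the jitter pattern; this is the common TAA-style core, not Intel's actual model):

def halton(index: int, base: int) -> float:
    # Low-discrepancy sequence commonly used to generate sub-pixel jitter.
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offset(frame_idx: int):
    # Sub-pixel offset in [-0.5, 0.5) applied to this frame's samples.
    return halton(frame_idx + 1, 2) - 0.5, halton(frame_idx + 1, 3) - 0.5

def accumulate(history: float, current_sample: float, alpha: float = 0.1) -> float:
    # Weighted (exponential) blend of the reprojected history with the new sample.
    return (1.0 - alpha) * history + alpha * current_sample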
 

TheDreadLord

Gold Member
Pretty soon native pixel counts will be a thing of the past.
I don't think so. Hardware will still evolve with time and become more capable of pushing even more pixels. But for sure, this tech is awesome from an efficiency perspective - you get "more" for less money.
 

DaGwaphics

Member
These temporal solutions sample neighboring pixels, with a jittering pattern, across a number of frames, then accumulate the result in a weighted manner.
So the resolution doesn't matter, because we are not sampling the whole image, just the pixels that are nearby.
What matters is how many pixels are sampled. This means the resolution scale and the number of sampled pixels are the important factors.

So far, the 720p to 1440p results have always looked less impressive than the 1080p to 4k results, whether it is DLSS, FSR, or TSR. I was just speaking as to how the final image looks, not the technical specifics behind it.
 

lukilladog

Member
All these companies competing to provide upscaling solutions is incredible for games.

But notice the stagnation and influx of turds when it comes to new video cards: 6600 XT, 6500 XT, 6600, RTX 3050, RTX 3060, and probably others. They are pathetic if you compare price ranges to previous generations; they are making upscaling part of the deal if you want acceptable performance and locking us to 1080p rendering for profit.

On the PC side, gone are the days of buying an affordable card and pushing some nice 2x supersampling anti-aliasing over console resolution; now we applaud them for making the inverse bearable... at a new ridiculous premium price tag. Bravo!
 

DaGwaphics

Member
But notice the stagnation and influx of turds when it comes to new video cards: 6600 XT, 6500 XT, 6600, RTX 3050, RTX 3060, and probably others. They are pathetic if you compare price ranges to previous generations; they are making upscaling part of the deal if you want acceptable performance and locking us to 1080p rendering for profit.

That's more about what happens when the companies making the products realize they don't need to compete for you. In a market where everything sells because it is there to be sold, don't expect much advancement. Hopefully Eth 2.0 will allow supply to even out, bringing back competition at the various price points. The market is normalizing a bit just from the work bomb associated with the transition.
 

winjer

Gold Member
So far, the 720p to 1440p results have always looked less impressive than the 1080p to 4k results, whether it is DLSS, FSR, or TSR. I was just speaking as to how the final image looks, not the technical specifics behind it.

Yet it's the same process, the same render scale, the same pixel sampling, etc.
Just the base resolution changes...
 
Reconstructing an image from 1080p to 4K is a bigger jump than 720p to 1080p.

In the first case, we are creating 4 times the amount of pixels.
In the second case, we are creating 2.25 times the amount of pixels.

This means that in the first case, each new pixel has less information.

Absolutely, but the sheer fact that we've gotten to this point from a mathematical and, more importantly, computationally implementable form is nothing short of astounding. The creativity of these engineers who are using algorithms from other disciplines is fantastic when you consider 20 years ago we were talking about things like quantization errors in a 16-bit framebuffer. How far we've come, can't wait for the future...
 

winjer

Gold Member
Absolutely, but the sheer fact that we've gotten to this point from a mathematical and, more importantly, computationally implementable form is nothing short of astounding. The creativity of these engineers who are using algorithms from other disciplines is fantastic when you consider 20 years ago we were talking about things like quantization errors in a 16-bit framebuffer. How far we've come, can't wait for the future...

I have to agree completely with you.
Because of the chip shortage and high GPU prices, I'm stuck with an RTX 2070S. And with games getting heavier, the 2070S is starting to show its age.
DLSS and TAAU have been lifesavers for recouping a nice chunk of performance, with a small hit to image quality.
And with XeSS and FSR 2.0, the list of games is sure to increase a lot.
 

lukilladog

Member
That's more about what happens when the companies making the products realize they don't need to compete for you. In a market where everything sells because it is there to be sold, don't expect much advancement. Hopefully Eth 2.0 will allow supply to even out, bringing back competition at the various price points. The market is normalizing a bit just from the work bomb associated with the transition.

They are competing to produce the worst turd from what I see. I mentioned it years ago: it was suspicious that even when AMD was not able to produce a high-end card and their market share was minimal, they were not willing to produce segment-disruptive stuff for some reason; they had to leave Nvidia's market segmentation in peace no matter what. Some price-fixing investigation for Jen-Hsun and his cousin over at AMD is overdue, especially now after TSMC's CEO mentioned they both were hoarding chips (which could be the very reason why prices rocketed like this). Hopefully Intel will bring back some form of competition from these two.
 

winjer

Gold Member
XeSS in Remnant 2 is miles better than FSR2; FSR2 looks like horse shit.

Better yet, you can replace the "libxess.dll" with the new 1.2.13 version.
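
If you try the DLL swap, something like this is all it takes (a sketch with placeholder paths; point them at wherever libxess.dll actually lives in your install, and keep a backup of the original):

import shutil
from pathlib import Path

game_dir = Path(r"C:\Path\To\Remnant2")      # placeholder: folder containing libxess.dll
new_dll = Path(r"C:\Downloads\libxess.dll")  # placeholder: the 1.2.13 version you downloaded

target = game_dir / "libxess.dll"
shutil.copy2(target, target.with_name("libxess.dll.bak"))  # back up the shipped DLL
shutil.copy2(new_dll, target)                              # drop in the newer one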

 