
FSR 2.0 to be announced soon

winjer

Gold Member

AMD FSR 2.0 might be announced soon, “impressive performance and image quality”


[screenshot of the teaser tweet]


The announcement of FSR 2.0 was not expected until at least the RDNA3 architecture release, now planned for Q3/Q4 based on leaks. However, with no requirement for AI acceleration, the technology might run on shader cores, possibly using similar instructions to Intel XeSS (DP4a).
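For context, DP4a is a packed dot-product instruction: it multiplies four signed 8-bit lanes from two 32-bit words and adds the result to a 32-bit accumulator in a single operation, which is why it suits low-precision ML-style math on ordinary shader cores. A rough Python sketch of the arithmetic (the function name and lane unpacking are illustrative, not AMD's or Intel's actual API):

```python
def dp4a(a: int, b: int, acc: int) -> int:
    """Sketch of a DP4a-style operation: dot product of four packed
    signed 8-bit lanes from two 32-bit words, added to an accumulator."""
    def lanes(x):
        # unpack four signed 8-bit lanes from a 32-bit word
        return [((x >> (8 * i)) & 0xFF) - 256 * (((x >> (8 * i)) & 0x80) >> 7)
                for i in range(4)]
    return acc + sum(ai * bi for ai, bi in zip(lanes(a), lanes(b)))
```

On real hardware this is one instruction per four multiply-accumulates, which is the throughput advantage being discussed.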

In two weeks at the Game Developers Conference 2022, AMD will be hosting a session called “Next-Generation Image Upscaling for Games”. Assuming the tweet and this GDC presentation are related, we might see the first demo of the technology.

However, one should note that GDC requires registration. According to AMD though, all GDC sessions will be available a day later, on March 24th at 4PM GMT, on the GPUOpen website.
 

Bojji

Member
I'm not saying it requires dedicated hardware, I'm just saying AI reconstruction might be needed to match or surpass native.

I doubt it will be better than DLSS 2.x or XeSS, but it will be much better than the 1.0 version and probably many devs will be using it. Temporal upsampling is the next best thing after AI reconstruction, and console games need this shit....
 
Considering the ultra quality & quality modes were generally good, it'd be interesting to see if these improvements help out the lower modes, as they were naff and best avoided.
 

SeraphJan

Member
I recently played Spider-Man Remastered and I'm shocked at how good it looks AND it has RT and 60FPS, same for Ratchet. Insomniac knows how to reconstruct image quality.
Exactly, native is a waste of resources. If we have tech such as FSR and checkerboarding, 60fps with proper upscaling/reconstruction is way better than native at 30fps.

Not everyone is going to use a 400% magnifying glass to nitpick the difference between Native and FSR/Checkerboarding.

But for 30fps vs 60fps, even my grandmother could tell the difference
 

Xdrive05

Member
Curious how they will do this temporally without AI to help, and it sounds like they will be using GPU compute to basically try to do the same thing that DLSS does without the AI acceleration. Competition rocks!

Tech like FSR, NIS and DLSS has been a godsend in these trying times. With prices still high, most gaming PCs will continue to be in the GTX 1650 / 1060 / 980 power class, which, while still great for 1080p at medium-high settings, benefits the most from the 20-30% performance boost of FSR and NIS.
 

winjer

Gold Member
Curious how they will do this temporally without AI to help, and it sounds like they will be using GPU compute to basically try to do the same thing that DLSS does without the AI acceleration. Competition rocks!

Tech like FSR, NIS and DLSS has been a godsend in these trying times. With prices still high, most gaming PCs will continue to be in the GTX 1650 / 1060 / 980 power class, which, while still great for 1080p at medium-high settings, benefits the most from the 20-30% performance boost of FSR and NIS.

It could be something similar to Epic's Temporal AA Upscaling.
That has been used in several games, including on consoles, with very good results.
Maybe AMD has some extra sauce on top of a temporal upscaler.
 

01011001

Banned
this sounds like they just made a form of TAA and I can't see that looking any better than established TAA solutions.

FSR 1.0 was already a gigantic disappointment that usually looked worse than the upsampling solutions most modern engines basically come with these days.

so if FSR 2.0 isn't some absolutely revolutionary take on TAAU, I can't see it making any sense for most developers
 
Hyped despite no AI, just because this might be coming to consoles too.

I ask again.
During the Hot Chips talk about the Series X architecture, didn't Microsoft mention they had custom ML hardware? What happened with that?
But even so, the consoles are more than capable of the low-precision math needed for this kind of upscaling. Having more only means being able to use a higher quality upscaling tier.
 
I ask again.
During the Hot Chips talk about the Series X architecture, didn't Microsoft mention they had custom ML hardware? What happened with that?
But even so, the consoles are more than capable of the low-precision math needed for this kind of upscaling. Having more only means being able to use a higher quality upscaling tier.
Just like DirectStorage, DirectML isn't being used yet. It's something they want to do in the future (afaik, they were actively experimenting with it last year).
 

01011001

Banned
I ask again.
During the Hot Chips talk about the Series X architecture, didn't Microsoft mention they had custom ML hardware? What happened with that?
But even so, the consoles are more than capable of the low-precision math needed for this kind of upscaling. Having more only means being able to use a higher quality upscaling tier.

no custom hardware no, only int4/int8 support.

Intel XeSS is basically confirmed to work on Series X|S already, and honestly, I am expecting way more from that than from FSR 2.0
 

TonyK

Member
Looking forward to seeing a solution for consoles equivalent to DLSS, but for now I prefer native. The upscaling in Horizon 2, for example, is pure shit and makes performance mode look worse than native 1080p.
 

01011001

Banned
Looking forward to seeing a solution for consoles equivalent to DLSS, but for now I prefer native. The upscaling in Horizon 2, for example, is pure shit and makes performance mode look worse than native 1080p.

that is an extreme example tho. I have never seen checkerboarding look that shit tbh. it's astonishingly bad... to the point where you have to wonder what went wrong there...

but other games like Dark Souls Remastered, or even older PS4 Pro titles like Deus Ex and Watch Dogs 2, looked very convincingly like 1800p back in the day
 

hlm666

Member
I was under the impression that the scenarios where DLSS looks better than native were down to the AI adding the missing pixels in fine line detail that could still crawl or be missing at native. Not saying it can't be as good as or better than XeSS/DLSS, but if it's not using AI, I'm interested to see how they manage to make the reconstruction better than native.
 

01011001

Banned
I was under the impression that the scenarios where DLSS looks better than native were down to the AI adding the missing pixels in fine line detail that could still crawl or be missing at native. Not saying it can't be as good as or better than XeSS/DLSS, but if it's not using AI, I'm interested to see how they manage to make the reconstruction better than native.

DLSS adding detail is a mix of the AI being trained to complete/interpret detail and accumulation over multiple sampled frames.

you could do that with accumulation alone, but I bet it won't be as accurate or sharp.
 

rofif

Can’t Git Gud
A rumor of a tease of a release... of anything nowadays. Even the most boring shit, like new anti-aliasing.
what a time to be alive
 

manfestival

Member
AMD now claiming theirs can also be as good as or better than native. barf
DLSS 1.0 and FSR 1.0 were not really worth using, but DLSS 2.0 was definitely better... not sure if it's worth using over native, but it has come a long way. Let's hope FSR 2.0 is similar in that sense.
 

OZ9000

Banned
I mean I'm still on Team DLSS, but it's nice that FSR is available on everything and works really nicely in VR games.
Also, Valve even added it to the Steam Deck.
DLSS looks absolutely awful sometimes

Granted it's useful if your monitor is 4K, but on my 1080p set image quality takes a nosedive and is full of artefacts.
 

GreatnessRD

Member
I am interested in seeing the strides they've made (if any) after FSR 1.0. The only thing I don't like is that now we'll have three options: DLSS (if you have an Nvidia GPU), FSR and XeSS. Then we'll have to hope all three are implemented in games by the devs, which I know isn't going to be a fun time.
 

FireFly

Member
DLSS adding detail is a mix of the AI being trained to complete/interpret detail and accumulation over multiple sampled frames.

you could do that with accumulation alone, but I bet it won't be as accurate or sharp.
Based on the Nvidia presentation, DLSS doesn't attempt to "guess" what detail should be in a given frame. The reason it doesn't do this is that it could end up "hallucinating" details and compromising the artistic vision. Rather, it more intelligently rejects data from previous frames, allowing more information to be retained without "ghosting". So essentially it is a better version of TAA upscaling. There's no reason in principle why a non-AI algorithm couldn't do the same, and indeed TSR doesn't look to be far off.

So I can see this being an open source version of Epic's TSR.
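The "smarter history rejection" described above can be done without any AI at all; a common hand-written version is neighborhood clamping, where the accumulated history color is clamped to the range of the current frame's local samples before blending. A minimal 1D sketch (illustrative only, not Epic's or Nvidia's actual code):

```python
import numpy as np

def temporal_resolve(history, current, alpha=0.1):
    """Sketch of TAA-style history rejection via neighborhood clamping:
    clamp the previous frame's accumulated color to the min/max of the
    current frame's 3-tap neighborhood, then blend exponentially."""
    lo = np.minimum.reduce([np.roll(current, -1), current, np.roll(current, 1)])
    hi = np.maximum.reduce([np.roll(current, -1), current, np.roll(current, 1)])
    clamped = np.clip(history, lo, hi)              # reject stale history
    return (1 - alpha) * clamped + alpha * current  # accumulate samples
```

When the scene changes, stale history falls outside the current neighborhood range and gets clamped away, which is what suppresses ghosting without any trained network.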
 
FSR 2.0 vs XeSS let's go!

I don't see DLSS being able to hang on if developers can simply use the other ones as an all-in-one solution. Unless DLSS goes open source and supports tensor-less hardware.
 

01011001

Banned
Based on the Nvidia presentation, DLSS doesn't attempt to "guess" what detail should be in a given frame. The reason it doesn't do this is that it could end up "hallucinating" details and compromising the artistic vision. Rather, it more intelligently rejects data from previous frames, allowing more information to be retained without "ghosting". So essentially it is a better version of TAA upscaling. There's no reason in principle why a non-AI algorithm couldn't do the same, and indeed TSR doesn't look to be far off.

So I can see this being an open source version of Epic's TSR.

Epic's TSR looks awful in comparison tho. the Matrix demo never looks convincingly high res, meanwhile even running DLSS in Performance mode on my 4K TV looks almost native at times.

so far no TAA upsampling method I know of comes even close to DLSS. even running at 4K with DLSS Ultra Performance in Death Stranding looks higher res than the Matrix demo using Epic's solution.
Ultra Performance at 4K renders at basically 720p internally, btw.
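The "basically 720p" figure checks out: DLSS quality tiers render at a fixed fraction of the output resolution per axis, and Ultra Performance uses a 3x per-axis factor. The helper below is hypothetical, just to show the arithmetic:

```python
def internal_resolution(out_w: int, out_h: int, scale: int) -> tuple:
    """Hypothetical helper: DLSS renders at output/scale per axis.
    Ultra Performance uses scale = 3, Performance uses scale = 2."""
    return out_w // scale, out_h // scale
```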
 
I'll look into this if I ever get into the 4K realm, but I'm keeping my PC gaming at 1440p for a long while. And at that resolution, my 6900 XT should be able to keep up with games with no issues for the next couple of years.
 

ethomaz

Banned
I find that "better than native" buzz weird… because it is not true at all.

Nvidia said the same about DLSS, but they compared DLSS, which auto-applies AA and other post-processing filters, against raw native resolution.

Native resolution will still look better after you apply the same render and post-render effects/processing.
 

01011001

Banned
I find that "better than native" buzz weird… because it is not true at all.

Nvidia said the same about DLSS, but they compared DLSS, which auto-applies AA and other post-processing filters, against raw native resolution.

Native resolution will still look better after you apply the same render and post-render effects/processing.

not true. if you've ever played Death Stranding on PC at native res with the game's TAA solution, you will instantly switch to DLSS to avoid the aliased and unstable look of native res.

there are definitely games that look better with DLSS
 

ethomaz

Banned
not true. if you've ever played Death Stranding on PC at native res with the game's TAA solution, you will instantly switch to DLSS to avoid the aliased and unstable look of native res.

there are definitely games that look better with DLSS
Because the AA implementation sucks.

Turn it off in-game and enable AA via the driver… you can do the same with the bad sharpening filter… turn it off and do sharpening via the driver.

Ohhh, and if you're already in the driver menus, why not force AF to 16x.

Edit - I don't even know why devs like TAA (that blurry thing), and this game got an implementation that doesn't even fix aliasing, just adds blur…
 

Boy bawang

Member
I wonder if this will be enough to run recent games on the upcoming Dell XPS 13 my work is supposed to buy me soon-ish, which is said to feature a decent integrated GPU in its 2022 iteration. I'd love to try PC gaming, but I don't want to invest in one.
 

01011001

Banned
Because the AA implementation sucks.

Turn it off in-game and enable AA via the driver… you can do the same with the bad sharpening filter… turn it off and do sharpening via the driver.

Ohhh, and if you're already in the driver menus, why not force AF to 16x.

Edit - I don't even know why devs like TAA (that blurry thing), and this game got an implementation that doesn't even fix aliasing, just adds blur…

the only thing I can do using Nvidia drivers is either use FXAA, which looks like shit... or use supersampling, which kills my performance.

so in the end, DLSS is by far the best option: it looks better than native + TAA, looks better than FXAA, and has decent performance compared to SSAA.

DLSS is the only viable option for Death Stranding unless you have a 3080 Ti or something, so that you can easily do SSAA.

and now that I think about it, I had DSR turned on in the Nvidia control panel and Death Stranding didn't give me any of the resolutions I had enabled, maxing out at native res, so SSAA might not even be easily possible.

DLDSR would maybe be an option performance-wise even on lower-end cards, but again, that requires the game to support it, and I've already run into a few that refused to recognise the DSR resolutions I enabled.
 