
Nvidia introduces DLDSR: AI-powered DSR driven by tensor cores.

BennyBlanco

aka IMurRIVAL69
[Image: NVIDIA DLDSR performance and image quality comparison]


NVIDIA dropped some unexpected news on a few graphics-enhancing features that will go live with the next Game Ready driver, starting with DLDSR. It's exactly what it sounds like: an AI-powered version of Dynamic Super Resolution (DSR), which has been available via the Control Panel for several years.

According to NVIDIA, DLDSR could be up to twice as efficient while maintaining similar quality. In the example image, we can see Prey running at nearly the same frame rate as native 1080p while actually rendering at 1620p for crisper definition.
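For the curious, here's the rough pixel math behind that claim, in Python (the resolutions are NVIDIA's; treating shading cost as proportional to pixel count is our simplification):

Code:
# Pixel counts for the resolutions in NVIDIA's Prey example.
# DLDSR 2.25x (1620p) is claimed to rival legacy DSR 4x (2160p) in quality.
base = 1920 * 1080
for name, (w, h) in {"1080p": (1920, 1080),
                     "1620p": (2880, 1620),
                     "2160p": (3840, 2160)}.items():
    print(f"{name}: {w * h:>10,} pixels ({w * h / base:.2f}x native 1080p)")
# 1620p is 2.25x the pixels of 1080p versus 4x for 2160p -- a bit under half
# the raw shading work, which lines up with the "twice as efficient" claim.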

By combining DLDSR with depth-based filters such as SSRTGI, you could obtain a 'remastered' game, as seen in the image of Prey below.



 

GymWolf

Member
What are the scenarios where this thing is useful? Is it like an inferior DLSS, but supported in any game?!

Explain it to me like you'd explain it to a toddler, please.

Does 4K/1800p with this thing ON look sharper and better than native 1440p on a 4K screen, with the same performance hit?!
 
Last edited:

elliot5

Member
What are the scenarios where this thing is useful? Is it like an inferior DLSS, but supported in any game?!

Explain it to me like you'd explain it to a toddler, please.
It's basically driver-level supersampling: render at a greater pixel count, then supersample down to your native resolution. The difference is that the AI now does the 'super' part, so you're still rendering at close to native-resolution cost.

Hence why 1620p in Prey gets basically the same frame rate as native 1080p, yet looks as good as, if not better than, 4K DSR.
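To make the 'super' part concrete, here's a minimal sketch of the classic, non-AI downsampling step in Python/NumPy (a plain box average; real DSR uses a smarter Gaussian-style filter, and DLDSR replaces this step with a neural network running on the tensor cores):

Code:
import numpy as np

def box_downsample(frame: np.ndarray, factor: int) -> np.ndarray:
    # Average each factor x factor block of pixels into one output pixel --
    # the naive version of DSR's downsampling. Assumes dimensions divide evenly.
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a frame rendered at 2160p
native = box_downsample(hi_res, 2)      # -> (1080, 1920, 3): a crisper 1080p image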
 

GymWolf

Member
It's basically driver-level supersampling: render at a greater pixel count, then supersample down to your native resolution. The difference is that the AI now does the 'super' part, so you're still rendering at close to native-resolution cost.

Hence why 1620p in Prey gets basically the same frame rate as native 1080p, yet looks as good as, if not better than, 4K DSR.
Your toddlers must be very smart, dude, be proud of them.

So let's talk real-life scenarios here: I have a 2070 Super and I'm forced to play at 1440p in most games to maintain 60 fps and good details. What can I do with this thing?! Like playing at fake 4K while maintaining the same settings/framerate?!
 
Last edited:

winjer

Gold Member
Remember when everyone thought Tensor cores were useless for gaming GPUs? Good times.

(To be fair, DLSS 1.0 was... bad.)

The tensor cores on my 2070 were sitting idle for almost 2 years.
If it weren't for GPU prices, I would have already upgraded to Ampere and never used the 2070's tensor cores at all.
I did try DLSS 1.0 on Metro Exodus and BFV, but it was appalling.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
Your toddlers must be very smart dude, be proud of them.

So let's talk real life scenarios here, i have a 2070super and i'm forced to play at 1440p with most games to maintain 60 and good details, what i can do with this thing?! Like playing at fake 4k while maintaining the same settings\framerate?!
I'm a dummy and I think this will let me play games at an internal resolution of 4K on my 1440p monitor; if that's the case, then it's fucking great.
 

ToTTenTranz

Banned
What are the scenarios where this thing is useful?
It's for the times when you have a lower-resolution monitor, like 1080p or 1440p, but your GPU can afford to render at a higher resolution than that.

It's called supersampling, and they already had this with DSR (or AMD's VSR, or any console since the PS4 Pro when connected to a 1080p TV).
The difference here is that they're using the tensor cores to downsample from the higher-resolution image to the lower-resolution one. Or they're using the tensor cores to decide how high the resolution can go.

Regardless, this has nothing to do with FSR / RSR that AMD announced earlier this month.


There are no motion vectors at play here, so don't expect anything like temporal AA or DLSS. It might compete with some in-engine / in-game MSAA implementations.


I wouldn't hold my breath, though. We know how their deep learning algorithms perform without motion vectors. It was called DLSS 1.0 and it wasn't pretty.
 
Last edited:

GymWolf

Member
It's for the times when you have a lower-resolution monitor, like 1080p or 1440p, but your GPU can afford to render at a higher resolution than that.

It's called supersampling, and they already had this with DSR (or AMD's VSR, or any console since the PS4 Pro when connected to a 1080p TV).
The difference here is that they're using the tensor cores to downsample from the higher-resolution image to the lower-resolution one. Or they're using the tensor cores to decide how high the resolution can go.

Regardless, this has nothing to do with FSR / RSR that AMD announced earlier this month.


There are no motion vectors at play here, so don't expect anything like temporal AA or DLSS. It might compete with some in-engine / in-game MSAA implementations.
OK, clear, so the opposite of what I need, lmao.

Well, good thing for people with old monitors and a 3080 under their ass, I guess.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What are the scenarios where this thing is useful? Is it like an inferior DLSS, but supported in any game?!

Explain it to me like you'd explain it to a toddler, please.

Does 4K/1800p with this thing ON look sharper and better than native 1440p on a 4K screen, with the same performance hit?!

It's DSR, but using the tensor cores to cut the load.
Think of it like supersampling for free... or rather, free hyper-quality antialiasing.

So, as per the example:

Your screen is 1080p.
With regular DSR you just activate the 2160p render resolution and the GPU renders the game at 2160p, then downsamples it to 1080p... giving you a much crisper image. But this was all done on the regular GPU cores... to the game and the GPU, the cost is the same as if you had a native 2160p screen and were rendering the game at 2160p, aka it was expensive.

With this solution:
You have a 1080p screen.
You decide to supersample, and for roughly the same cost as rendering the game at 1080p you get a supersampled image from 1620p down to 1080p.
Giving you an image nearly as crisp, yet far cheaper than supersampling from 2160p.

It's a DSR solution, so it's for rendering things beyond your display's resolution.

Explain like you're 5?
DLSS - making things look like native resolution from below native.
DLDSR - making things look better than native from beyond native resolution, without the cost of actually going beyond native.

On Topic
Holy shit, NVIDIA just ended the game.
We could already kinda hack our way into this with DLSS at native resolution, which led to a supersampled image.
But now I'm all in on not upgrading to a 4K screen till ultrawide 2160p screens are a thing and GPUs can actually handle said resolution without breaking a sweat.



But the real news is they've added Pascal Gilcher's SSRTGI, AO and DOF filters to GeForce Experience.
Fuck your lazy remasters, developers... GFE has got this covered.
ooooohhhhfffffff!
 

GymWolf

Member
It's DSR, but using the tensor cores to cut the load.
Think of it like supersampling for free... or rather, free hyper-quality antialiasing.

So, as per the example:

Your screen is 1080p.
With regular DSR you just activate the 2160p render resolution and the GPU renders the game at 2160p, then downsamples it to 1080p... giving you a much crisper image. But this was all done on the regular GPU cores... to the game and the GPU, the cost is the same as if you had a native 2160p screen and were rendering the game at 2160p, aka it was expensive.

With this solution:
You have a 1080p screen.
You decide to supersample, and for roughly the same cost as rendering the game at 1080p you get a supersampled image from 1620p down to 1080p.
Giving you an image nearly as crisp, yet far cheaper than supersampling from 2160p.

It's a DSR solution, so it's for rendering things beyond your display's resolution.

Explain like you're 5?
DLSS - making things look like native resolution from below native.
DLDSR - making things look better than native from beyond native resolution, without the cost of actually going beyond native.

On Topic
Holy shit, NVIDIA just ended the game.
We could already kinda hack our way into this with DLSS at native resolution, which led to a supersampled image.
But now I'm all in on not upgrading to a 4K screen till ultrawide 2160p screens are a thing and GPUs can actually handle said resolution without breaking a sweat.



But the real news is they've added Pascal Gilcher's SSRTGI, AO and DOF filters to GeForce Experience.
Fuck your lazy remasters, developers... GFE has got this covered.
ooooohhhhfffffff!
So it's a big deal for people without 4K displays, right?!

That ReShade mode integrated into the NVCP looks nice. I never used ReShade because it always looked a bit complicated, but if I can just turn a setting on in the NVCP, this could be nice.

I want to try Condemned 1 with that shit, imagine the atmosphere...
 
Last edited:

Larxia

Member
What are the scenarios where this thing is useful? Is it like an inferior DLSS, but supported in any game?!

Explain it to me like you'd explain it to a toddler, please.

Does 4K/1800p with this thing ON look sharper and better than native 1440p on a 4K screen, with the same performance hit?!
If I understand it correctly, it's a less demanding DSR.
It's for when you're playing a game at, let's say, 1080p or 1440p, but the game has shitty anti-aliasing, so to improve the image quality you decide to render it at a higher resolution, like 4K, while still displaying it on your lower-resolution monitor.

This is very common; it's also sometimes available directly in games as SSAA or render resolution scaling.
So DLDSR would basically use DLSS technology, not with the goal of displaying at a higher output resolution, but to increase the internal resolution and improve image quality.
 

GymWolf

Member
Wait a moment: if I set my TV resolution to 1440p, then upscale to 4K and return to 1440p with DLDSR, do I get a 1440p image that looks better than native 1440p??

I guess people with a 4K display and a powerful GPU can use this thing to upscale and downscale from 8K to get better 4K IQ??
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So it's a big deal for people without 4K displays, right?!

That ReShade mode integrated into the NVCP looks nice. I never used ReShade because it always looked a bit complicated, but if I can just turn a setting on in the NVCP, this could be nice.

I want to try Condemned 1 with that shit, imagine the atmosphere...

It's a big deal for anyone who can run their games at native res.
If your 3080 Ti can run your games at 4K on your 4K panel... why not run those games at 6K instead, for the same cost as running them at 4K.
For games like, say, Resident Evil 3 Remake, where seemingly no amount of AA fixes things, you can now just render the game from a much higher resolution at basically no cost, to try to get all the aliasing outta there.


And yeah, not needing to mess with ReShade is gonna make a lot more people understand why good AO and GI really are game-changing when it comes to making a game look good.
I was already a huge proponent of Freestyle, but add in global illumination and ambient occlusion... I think I might replay a bunch of older titles again.

Condemned and F.E.A.R. are some of the first games I'm gonna give a go.
 

GymWolf

Member
It's a big deal for anyone who can run their games at native res.
If your 3080 Ti can run your games at 4K on your 4K panel... why not run those games at 6K instead, for the same cost as running them at 4K.
For games like, say, Resident Evil 3 Remake, where seemingly no amount of AA fixes things, you can now just render the game from a much higher resolution at basically no cost, to try to get all the aliasing outta there.


And yeah, not needing to mess with ReShade is gonna make a lot more people understand why good AO and GI really are game-changing when it comes to making a game look good.
I was already a huge proponent of Freestyle, but add in global illumination and ambient occlusion... I think I might replay a bunch of older titles again.

Condemned and F.E.A.R. are some of the first games I'm gonna give a go.
Yeah, I'm understanding how this works for everyone else, because you can always downscale from 6K or 8K if you have a 4K display.

Can you answer my question in the other post, please? I'm not sure if this thing can be of any use in my specific case, with a 4K TV but a non-4K GPU.

I have both F.E.A.R. and Condemned on Steam, but controllers don't work with those games, so I can't really do anything with them... Dark Messiah is another game I want to try with this thing (no controller support either).
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wait a moment: if I set my TV resolution to 1440p, then upscale to 4K and return to 1440p with DLDSR, do I get a 1440p image that looks better than native 1440p??

I guess people with a 4K display and a powerful GPU can use this thing to upscale and downscale from 8K to get better 4K IQ??

That's an awfully convoluted way to make an image look good.
You'd be better off just using DLSS from 1440p to 4K.
Going down to go up to go back down doesn't make much sense and likely won't be worth the hassle.

And yes, as long as you can render at resolution X, you can make your image look better than resolution X with DLDSR.
So if your PC can do 4K on a 4K panel... then render at 6K.
If you have a 1440p panel... then render at 4K.
 

Kenpachii

Member
So it's big deal for people without 4k displays right?!

That reshade mode integrated into nvcp looks nice, i never used reshade because it always looked a bit complicated, but if i can just turn on a setting on nvcp this could be nice.

I want to try condemned 1 with that shit, imagine the atmosphere...

DSR (supersampling), i.e. rendering at a higher resolution on a lower-resolution screen, is used by everybody, even on 4K screens. However, you'll see fewer people use it at higher resolutions, because GPU performance straight up dies. This is why, a decade ago, people originally wanted SLI/CrossFire setups to DSR with.

I've got a 3440x1440 screen and I can tell you 6880x2880 looks a lot better than 3440x1440, even on such a screen. However, the main problem here is:

3440x1440 = 4,953,600 pixels
6880x2880 = 19,814,400 pixels

4K = 8,294,400 pixels
8K = 33,177,600 pixels

You need to render a gigantic number of extra pixels, which is absolutely not doable in most games, even older ones. You also need a gigantic amount of VRAM to go with it. Even in older games a 3080 ain't enough; in games like Horizon, textures will simply not load in.

So in order to solve this problem, Nvidia created DLSS, which was originally meant to be a solution for AA and supersampling. Aka supersampling without the performance loss, while still getting the crisp image and an AA solution.

Because they also had RT, and the 2000 series wasn't particularly good at it, they needed more performance and pushed DLSS in a different direction, to what it is now. So them moving DLSS back into DSR makes total sense.

However, I don't see much use for it at this point. DLSS is already so good, and so heavily used at higher resolutions, that I don't think many people care for this anymore. It's nice to have, though.

What I'm more interested in is whether this can also be used to lower the render resolution below native and DLDSR it back up to native, basically driver-level DLSS, and how that holds up. This could be huge, as it would basically make every game DLSS-capable.
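A quick Python sanity check on those numbers (the 4 bytes per pixel below assumes a single plain RGBA8 render target, which is a simplification; a real frame allocates several such buffers, hence the VRAM pressure):

Code:
# Pixel counts for the resolutions above, plus the size of one RGBA8 buffer each.
for name, (w, h) in {"3440x1440": (3440, 1440),
                     "6880x2880": (6880, 2880),
                     "4K":        (3840, 2160),
                     "8K":        (7680, 4320)}.items():
    px = w * h
    print(f"{name:>9}: {px:>11,} px, ~{px * 4 / 2**20:.0f} MiB per RGBA8 buffer")
# 6880x2880 is 4x the pixels of 3440x1440 and well past 4K in raw cost,
# which is why brute-force DSR at that scale buries the GPU.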
 

GymWolf

Member
That's an awfully convoluted way to make an image look good.
You'd be better off just using DLSS from 1440p to 4K.
Going down to go up to go back down doesn't make much sense and likely won't be worth the hassle.

And yes, as long as you can render at resolution X, you can make your image look better than resolution X with DLDSR.
So if your PC can do 4K on a 4K panel... then render at 6K.
If you have a 1440p panel... then render at 4K.
Lol, I know, but I have a peculiar situation with a 4K panel but a 1440p GPU, so I was trying to think of a method to use this thing to improve the image a bit.

Of course, if DLSS is available I'm gonna use that instead of this, because I can actually get gains from DLSS.

So in the end, this thing is not very useful to me until I upgrade my GPU and have performance to spare at 4K (if that's ever gonna happen) to try downscaling from 6K.
 
Last edited:
DSR (supersampling), i.e. rendering at a higher resolution on a lower-resolution screen, is used by everybody, even on 4K screens. However, you'll see fewer people use it at higher resolutions, because GPU performance straight up dies. This is why, a decade ago, people originally wanted SLI/CrossFire setups to DSR with.

I've got a 3440x1440 screen and I can tell you 6880x2880 looks a lot better than 3440x1440, even on such a screen. However, the main problem here is:

3440x1440 = 4,953,600 pixels
6880x2880 = 19,814,400 pixels

4K = 8,294,400 pixels
8K = 33,177,600 pixels

You need to render a gigantic number of extra pixels, which is absolutely not doable in most games, even older ones. You also need a gigantic amount of VRAM to go with it. Even in older games a 3080 ain't enough; in games like Horizon, textures will simply not load in.

So in order to solve this problem, Nvidia created DLSS, which was originally meant to be a solution for AA and supersampling. Aka supersampling without the performance loss, while still getting the crisp image and an AA solution.

Because they also had RT, and the 2000 series wasn't particularly good at it, they needed more performance and pushed DLSS in a different direction, to what it is now. So them moving DLSS back into DSR makes total sense.

However, I don't see much use for it at this point. DLSS is already so good, and so heavily used at higher resolutions, that I don't think many people care for this anymore. It's nice to have, though.

What I'm more interested in is whether this can also be used to lower the render resolution below native and DLDSR it back up to native, basically driver-level DLSS, and how that holds up. This could be huge, as it would basically make every game DLSS-capable.
This would be epic.
 

GymWolf

Member
DSR (supersampling), i.e. rendering at a higher resolution on a lower-resolution screen, is used by everybody, even on 4K screens. However, you'll see fewer people use it at higher resolutions, because GPU performance straight up dies. This is why, a decade ago, people originally wanted SLI/CrossFire setups to DSR with.

I've got a 3440x1440 screen and I can tell you 6880x2880 looks a lot better than 3440x1440, even on such a screen. However, the main problem here is:

3440x1440 = 4,953,600 pixels
6880x2880 = 19,814,400 pixels

4K = 8,294,400 pixels
8K = 33,177,600 pixels

You need to render a gigantic number of extra pixels, which is absolutely not doable in most games, even older ones. You also need a gigantic amount of VRAM to go with it. Even in older games a 3080 ain't enough; in games like Horizon, textures will simply not load in.

So in order to solve this problem, Nvidia created DLSS, which was originally meant to be a solution for AA and supersampling. Aka supersampling without the performance loss, while still getting the crisp image and an AA solution.

Because they also had RT, and the 2000 series wasn't particularly good at it, they needed more performance and pushed DLSS in a different direction, to what it is now. So them moving DLSS back into DSR makes total sense.

However, I don't see much use for it at this point. DLSS is already so good, and so heavily used at higher resolutions, that I don't think many people care for this anymore. It's nice to have, though.

What I'm more interested in is whether this can also be used to lower the render resolution below native and DLDSR it back up to native, basically driver-level DLSS, and how that holds up. This could be huge, as it would basically make every game DLSS-capable.
Sorry to get straight to the point, but how is DLDSR useful for people like me, with a 4K display but a 1440p GPU like a 2070 Super??

I can't sustain 4K, so downscaling from an even higher resolution would be pointless, because I just don't have enough horsepower, right?!

Sorry for dumbing down the discussion too much.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What I'm more interested in is whether this can also be used to lower the render resolution below native and DLDSR it back up to native, basically driver-level DLSS, and how that holds up. This could be huge, as it would basically make every game DLSS-capable.

The reason it might not work is that DLSS is for going up and DSR is for going down. (I'm betting it won't.)
Your weak spot in the go-down-then-back-up... is the going back up part.
You'd go down just fine, and basically for free... but once you tried to come back up to native resolution, you'd probably lose all the image quality you were hunting.

If you aren't using DLSS, whatever driver-level solution you're using to go back up to native resolution is likely worse than if you'd just stuck with native res.
You need some sort of temporal solution to gain back that lost data.
At which point you might as well have just used that temporal solution to go up from a lower resolution to native.
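A toy NumPy demo of why the up-leg is the weak spot (box downsample, then a naive upscale with no temporal data to draw on; purely illustrative):

Code:
import numpy as np

rng = np.random.default_rng(0)
native = rng.random((1080, 1920))  # stand-in for a detailed native-res frame

# Go down (cheap, lossy): average each 2x2 block into one pixel.
half = native.reshape(540, 2, 960, 2).mean(axis=(1, 3))
# Go back up naively: repeat each pixel 2x2 -- no motion vectors, no history.
back = np.repeat(np.repeat(half, 2, axis=0), 2, axis=1)

rms = np.sqrt(((native - back) ** 2).mean())
print(f"RMS error after the round trip: {rms:.3f}")  # clearly nonzero
# The discarded detail can't be recovered from a single frame, which is why a
# temporal solution like DLSS is needed to make the up-leg worthwhile.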
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's the SSRTGI and AO.
And yes, that's how light works... if there's no light bouncing off or directly hitting an object, why would it be lit?

I know video games have trained you to believe objects just get lit by magic... but that's not how the world works.
The occluded areas are more realistic and actually ground the objects in the world.
Without that occlusion, the objects look like they're floating in space.

Look at the TV or the bookshelves in the background... what's lighting them up?
Even the gold circle on the ground... look how much more detail you're getting from it, simply because it has more occlusion on it.
The side NOT facing the lights shouldn't be that bright.
 

sertopico

Member
I wonder if I'll be able to use this technique along with DLSS. I have a 1440p monitor; I'd use this to get to 4K and then activate DLSS in-game. The final image should be crisper than 1440p plus DLSS alone. Does it make sense? :p
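On paper it should work; it's the same trick people already pulled with plain DSR + DLSS. Rough math, using NVIDIA's published factors (DLDSR adds 2.25x total pixels; DLSS Quality renders at 2/3 scale per axis):

Code:
panel = (2560, 1440)
axis_scale = 2.25 ** 0.5              # 2.25x the pixels = 1.5x per axis
target = (int(panel[0] * axis_scale), int(panel[1] * axis_scale))  # DLDSR target
internal = (target[0] * 2 // 3, target[1] * 2 // 3)                # DLSS Quality
print(f"DLDSR target {target[0]}x{target[1]}, DLSS renders {internal[0]}x{internal[1]}")
# -> target 3840x2160, internal render 2560x1440: roughly native-res cost,
#    but you get a DLSS-reconstructed 4K image downsampled to the 1440p panel.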
 
Last edited:

ZywyPL

Banned
Wasn't this kind of supersampling already available by combining DSR with DLSS? Guess it's just a way to make it simpler, a one-click toggle option.

But that GI filter? Simply amazing (dunno what the main character from Back to the Future has to do with it, tho).
 

YCoCg

Member
(dunno what the main character from Back to the Future has to do with it, tho).
That's the modder's nickname; they've released various graphics-related things in the past, but this is one of the things that's been picked to be incorporated at the driver level.
 

Haggard

Banned
Well, it sounds great, but I'm gonna wait for actual tests of the tech before I start praising it.
The first iteration of DLSS was crap, too.
 
I can't sustain 4K, so downscaling from an even higher resolution would be pointless, because I just don't have enough horsepower, right?!
That's the point of the "DL" part. If your GPU has tensor cores, they take on the load of the scaling. Basically, you maintain whatever FPS you can get running at native, but with improved visual fidelity. It probably won't matter much if you can't sustain your native resolution, though.
 