
Nvidia introduces DLDSR: AI-powered DSR, accelerated by Tensor Cores.

Just tested it on Cyberpunk. No framerate improvement for DLDSR vs. regular DSR. In fact, I honestly think regular DSR was giving me one or two more frames.

A bit strange. Not sure what Nvidia thought they had here.
 
Just tested it on Cyberpunk. No framerate improvement for DLDSR vs. regular DSR. In fact, I honestly think regular DSR was giving me one or two more frames.

A bit strange. Not sure what Nvidia thought they had here.

I think what Nvidia meant was it runs better than 4x scaling, but looks the same.
After trying it, I'd say it does run better than 4x scaling (obviously, since it's a lower resolution), but it doesn't look nearly as good.
 

Jigsaah

Gold Member
Question. I have a 1440p monitor. Never wanted to go 4k because I just felt it wasn't worth the frame-rate hit.

I have a 3080 and a 5800x. With this new feature is it smart to get a 4k monitor expecting I can get similar performance to what I get at 1440p right now?

I can run Apex Legends 165 fps pretty easily at 1440p. Soooo this would mean I could run the game at 144-165 fps at 4K?
 

Dream-Knife

Banned
Question. I have a 1440p monitor. Never wanted to go 4k because I just felt it wasn't worth the frame-rate hit.

I have a 3080 and a 5800x. With this new feature is it smart to get a 4k monitor expecting I can get similar performance to what I get at 1440p right now?

I can run Apex Legends 165 fps pretty easily at 1440p. Soooo this would mean I could run the game at 144-165 fps at 4K?
This feature is basically supersampling. DLSS renders at a lower resolution and upscales the image to your native resolution; this instead renders at a higher-than-native resolution and then downsamples it back to your native resolution.

What you're looking for is DLSS.
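
If the directions are confusing, here's a quick Python sketch of the resolution math (my own illustration of the idea, not anything from Nvidia; the 1440p native is just an example):

```
# Toy illustration of which way each technique scales (not Nvidia code).
# DSR/DLDSR render ABOVE native and downscale; DLSS renders BELOW native and upscales.
NATIVE = (2560, 1440)  # example: a 1440p monitor

def scaled(res, pixel_factor):
    # DSR factors are total-pixel multipliers, so each axis scales by the square root
    axis = pixel_factor ** 0.5
    return (round(res[0] * axis), round(res[1] * axis))

print("DSR 4x renders at:      ", scaled(NATIVE, 4.0))     # (5120, 2880), downscaled to 1440p
print("DLDSR 2.25x renders at: ", scaled(NATIVE, 2.25))    # (3840, 2160), downscaled to 1440p
print("DLSS Quality renders at:", scaled(NATIVE, 1/2.25))  # (1707, 960), upscaled to 1440p
```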
 

Kenpachii

Member
Ah, found it out.

Press Alt+F3, then select "1" at the top, and in the drop-down box select the following:

[screenshot: filter selection in the GeForce Experience overlay]


You need GeForce Experience, by the way.

  • SSRTGI (Screen Space Ray Traced Global Illumination), commonly known as the “Ray Tracing ReShade Filter” enhances lighting and shadows of your favourite titles to create a greater sense of depth and realism.
  • SSAO (Screen Space Ambient Occlusion) emphasizes the appearance of shadows near the intersections of 3D objects, especially within dimly lit/indoor environments.
  • Dynamic DOF (Depth of Field) applies bokeh-style blur based on the proximity of objects within the scene, giving your game a more cinematic, suspenseful feel.
 
Last edited:

SatansReverence

Hipster Princess
So it's literally nothing.

Using the same DSR scaling provides the same performance and looks the same.

Even tried it in Prey, a game that doesn't allow fullscreen mode. So now it's blown up and I can't even get to the settings to reduce the resolution back down again. 😅
 

azertydu91

Hard to Kill
So does anybody have a before/after screenshot of the SSRTGI in action? I haven't re-downloaded any 3D games since my computer blew up, but I'm curious to see how it may look.
 

azertydu91

Hard to Kill
HOW DO I ACTIVATE the SSAO and SSRTGI filters IN ACC? I can't see them in the filters… I'm so pissed.
For games that support it, see this post:
Ah, found it out.

Press Alt+F3, then select "1" at the top, and in the drop-down box select the following:

[screenshot: filter selection in the GeForce Experience overlay]


You need GeForce Experience, by the way.

  • SSRTGI (Screen Space Ray Traced Global Illumination), commonly known as the “Ray Tracing ReShade Filter” enhances lighting and shadows of your favourite titles to create a greater sense of depth and realism.
  • SSAO (Screen Space Ambient Occlusion) emphasizes the appearance of shadows near the intersections of 3D objects, especially within dimly lit/indoor environments.
  • Dynamic DOF (Depth of Field) applies bokeh-style blur based on the proximity of objects within the scene, giving your game a more cinematic, suspenseful feel.

You need GeForce Experience and the latest driver.
 
Last edited:

Verchod

Member
This new update feels a little premature. I haven't noticed much greater performance with DLDSR over the standard version.
Also, the new filters only appear in certain games. I looked for them in Deus Ex: Human Revolution but they didn't appear. Loaded up the Dishonored remaster and three new filters were there. The GI one absolutely tanked my performance, even running at 1440p. That's mad crazy: as soon as I added the GI it dropped 30 fps, even on the Low quality setting. Also had some strange artifacts appear when it was activated; I'd get sections of the screen appearing darker, in large blocks. Odd.
 

CrustyBritches

Gold Member
Has anybody been able to get DLDSR to work? I've tried TW3, Prey, and Wolfenstein: Youngblood and there's no perf difference. It should be around the same as 1080p (native). Only in TW3 have I been able to get the SSRTGI to work, but without the DLDSR I can't get a playable frame rate (~35 fps on a 3060 laptop).

I've been going into NVCP -> Manage 3D Settings -> DSR Factors -> DL 2.25x (1620p). Then NVCP -> Change Resolution -> 1080p, 2880x1620, HD 2.25x. Using Fullscreen in TW3 and Prey.
 

Kenpachii

Member
This new update feels a little premature. I haven't noticed much greater performance with DLDSR over the standard version.
Also, the new filters only appear in certain games. I looked for them in Deus Ex: Human Revolution but they didn't appear. Loaded up the Dishonored remaster and three new filters were there. The GI one absolutely tanked my performance, even running at 1440p. That's mad crazy: as soon as I added the GI it dropped 30 fps, even on the Low quality setting. Also had some strange artifacts appear when it was activated; I'd get sections of the screen appearing darker, in large blocks. Odd.

Yeah, it's shit. DLDSR doesn't add performance, and the ray-tracing filter absolutely tanks it: from 110 fps down to 27 fps, while also having the typical screen-space problems.

In short: skip for now.
 

DarkestHour

Banned
I also see absolutely zero frame rate improvement over normal DSR. Seems like a crock of shit, to be honest. If a game supports DLSS, though, you can use this and then enable DLSS to get some frames back.
 
Last edited:

azertydu91

Hard to Kill
So I can't find it while playing Horizon Zero Dawn (GOG version). Do I need to activate the experimental features, or does the game just not support it?
 

Kenpachii

Member
So I can't find it while playing Horizon Zero Dawn (GOG version). Do I need to activate the experimental features, or does the game just not support it?

It doesn't seem to have the shader; I don't have it in the game either.
 
Last edited:

Kenpachii

Member
This is what Nvidia staff have to say about it as people are complaining:

[screenshot: Nvidia staff forum response]


I am honestly confused now.

This was their PR image:

[image: Nvidia's DLDSR PR slide comparing Prey at native 1080p, DSR 4x, and DLDSR 2.25x]
 

hlm666

Member
This is what Nvidia staff have to say about it as people are complaining:



I am honestly confused now.

This was their PR image:
That image shows what the Nvidia rep is saying. Old DSR is rendering at 4K and downscaling to 1080p. The new DSR is rendering at 1620p, downscaling to 1080p, and using AI to improve the image. The result is supposed to be that they look similar, and the performance uplift comes from the render resolution being only 1620p instead of 2160p. I haven't tested it, but judging by what someone said above, the two may not look as close as that image implies.
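
Rough numbers behind that claim, for a 1080p display (my own arithmetic, not Nvidia's):

```
# Pixel cost of old DSR 4x vs DLDSR 2.25x on a 1080p display (back-of-envelope)
old_dsr_4x = 3840 * 2160  # DSR 4x: renders full 4K, then downscales to 1080p
dldsr_225x = 2880 * 1620  # DLDSR 2.25x: renders 1620p, then downscales to 1080p
native     = 1920 * 1080

print(f"DSR 4x pixels:  {old_dsr_4x:,}")                    # 8,294,400
print(f"DLDSR 2.25x:    {dldsr_225x:,}")                    # 4,665,600
print(f"Savings vs 4x:  {1 - dldsr_225x / old_dsr_4x:.0%}") # ~44% fewer pixels shaded
print(f"Cost vs native: {dldsr_225x / native:.2f}x")        # still 2.25x native's pixels
```

So in cost it should sit between native 1080p and DSR 4x, which matches what people are measuring; the dodgy part is the slide implying it's nearly free.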
 

Kenpachii

Member
That image shows what the Nvidia rep is saying. Old DSR is rendering at 4K and downscaling to 1080p. The new DSR is rendering at 1620p, downscaling to 1080p, and using AI to improve the image. The result is supposed to be that they look similar, and the performance uplift comes from the render resolution being only 1620p instead of 2160p. I haven't tested it, but judging by what someone said above, the two may not look as close as that image implies.

Doesn't make sense to me.

So if it's 4x the quality with only 2.25x being rendered, so be it. But then why does the picture show practically the same framerate being maintained? That isn't the case at all.
 

hlm666

Member
Doesn't make sense to me.

So if it's 4x the quality with only 2.25x being rendered, so be it. But then why does the picture show practically the same framerate being maintained? That isn't the case at all.
Yeah, the difference in that image between 1080p and 1620p performance is horse shit. I wonder if it was a straight-up intentional lie, or if the wanker who made it thought DLSS was being used to upscale from 1080p to 1620p; either way it's wrong. Their Reflex example for God of War was dodgy as fuck as well, so it's hard to think it was unintentional.

I own Nvidia hardware as well, before anyone tries to call me an AMD shill or some shit. The marketing from all these asshats lately has been so far from reality that it's going from kinda funny to being pissed on and told it's raining.
 
Doesn't make sense to me.

So if it's 4x the quality with only 2.25x being rendered, so be it. But then why does the picture show practically the same framerate being maintained? That isn't the case at all.
Is it possible that Prey is locked at 145 fps internally? Even if it is, they should know damn well that the PC crowd loves to speculate and could come to the wrong conclusion from that image.

The added sharpening is so dumb too, way overboard. Super weird coming from leading graphics engineers. The downsampled image already looks naturally cleaner than native, so why add sharpening on top? A tiny bit, okay, but as it stands now it's just way too much.
 

hlm666

Member
Is it possible that Prey is locked at 145 fps internally? Even if it is, they should know damn well that the PC crowd loves to speculate and could come to the wrong conclusion from that image.

The added sharpening is so dumb too, way overboard. Super weird coming from leading graphics engineers. The downsampled image already looks naturally cleaner than native, so why add sharpening on top? A tiny bit, okay, but as it stands now it's just way too much.
Well caught, this didn't even cross my mind. After you said this I went and did a quick check, and it is indeed locked at 144 unless you edit the cfg file. That explains why such an odd game was chosen for this marketing slide: it's basically the perfect choice to make this look impressive, and probably why they didn't include the lower old-DSR levels.
 
This is what Nvidia staff have to say about it as people are complaining:

[screenshot: Nvidia staff forum response]


I am honestly confused now.

This was their PR image:

[image: Nvidia's DLDSR PR slide comparing Prey at native 1080p, DSR 4x, and DLDSR 2.25x]

So it seems there isn't really a performance improvement - 2.25x DLDSR is just supposed to give you the image equivalent of 4x DSR for the same performance as 2.25x DSR.

However, I doubt that's the case too, because in Cyberpunk I struggled to see any difference in image quality between DSR 2.25x and DLDSR 2.25x. Although it was a pretty quick check, and it was night time in-game, so maybe I just didn't notice the improvements.
 
Last edited:
Isn't it just meant to give you better quality for the same/similar performance?

With normal DSR you actually need to render at 4K, so you're taking a performance hit, but with this you're getting "4K" with 1080p performance?

Anyway, I can't get the fucking thing to work.
 

azertydu91

Hard to Kill
Yep, DLDSR is not great right now, and those SSRTGI filters tank performance really hard, but I do have hope they'll get way better, just like DLSS did. The first iteration of DLSS was not that great, if I remember correctly, but after a few updates it became amazing.
 
Too much oversharpening with this DLDSR implementation :( I'm not liking it so far.
Yup - just did some more testing, and while DLDSR definitely does look sharper than DSR at the same setting, the sharpness feels a bit artificial, as if it's the result of some kind of post-process sharpening.
 

Dream-Knife

Banned
I completely misread the description. This still has the performance hit of rendering at a higher resolution, apparently, just not as big a hit as regular DSR.

Interesting to note: playing older, less RTX-optimized games, the framerate stays basically the same (Insurgency: Sandstorm). It also, for whatever reason, doesn't work in MH Rise (only resolutions up to native show up).
``` Remaster Classic Games With AI-Powered DSR and Advanced Freestyle Filters ```
 
Last edited:

rofif

Can’t Git Gud
So DLDSR for 4K monitors/TVs is like DLSS but rendering at a higher res?
So a super-quality DLSS?
 

yamaci17

Member
I completely misread the description. This still has the performance hit of rendering at a higher resolution, apparently, just not as big a hit as regular DSR.

Interesting to note: playing older, less RTX-optimized games, the framerate stays basically the same (Insurgency: Sandstorm). It also, for whatever reason, doesn't work in MH Rise (only resolutions up to native show up).
``` Remaster Classic Games With AI-Powered DSR and Advanced Freestyle Filters ```

You must set the desktop to the DSR resolution for such games.
 

ZywyPL

Banned
So DLDSR for 4K monitors/TVs is like DLSS but rendering at a higher res?
So a super-quality DLSS?

Technically, if you have the horsepower to render at native 4K, DLDSR will mean an image downscaled from something like 5-6K (2.25x of 3840x2160 works out to 5760x3240).
 

yamaci17

Member
Somebody please try both implementations in Prey at this point and see if they get similar FPS, or whether Nvidia were lying their asses off.
The Prey 1080p picture is CPU-bound. It's highly misleading (I don't know whether that's intentional or not).

2.25x DLDSR will still cost you what 1620p rendering would cost you.

So they're probably not lying, but it's still misleading. I'm sure they tested it with an 11900K or something, and they probably hit CPU limitations at 145 fps. Therefore, it's an invalid comparison.

If there were, theoretically, a mildly faster CPU than an 11900K, you would, say, get 200+ fps at 1080p in Prey.
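
A quick sanity check of the CPU-bound claim (a back-of-envelope heuristic assuming fps scales with pixel count when GPU-bound, not a profiler result):

```
# If Prey at 1080p were GPU-bound, quadrupling the pixels (4K) should roughly quarter the fps
def expected_gpu_bound_fps(fps_at_base, base_pixels, target_pixels):
    return fps_at_base * base_pixels / target_pixels

p1080 = 1920 * 1080
p2160 = 3840 * 2160

# Nvidia's slide: 145 fps at 1080p, 103 fps at DSR 4x (a 4K render)
print(expected_gpu_bound_fps(145, p1080, p2160))  # ~36 fps expected if GPU-bound
# Losing only ~30% instead of ~75% means the 1080p number was CPU-capped
```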
 

sertopico

Member
Just tested it on Cyberpunk. No framerate improvement for DLDSR vs. regular DSR. In fact, I honestly think regular DSR was giving me one or two more frames.

A bit strange. Not sure what Nvidia thought they had here.
Same. CP2077 performed and looked horrible. I chose 4K starting from 1440p, then put DLSS Balanced on top of that. On my CX it runs decently this way, without huge slowdowns; on my 1440p monitor with DLDSR, instead, it was terrible. 1440p native + DLSS Quality gives me the best results in the end. TW3, on the other hand, looked good. I think the 10GB on my 3080 doesn't help either. I suspect the game performs worse because once you hit the VRAM limit, the FPS plummets.
 
Last edited:

yamaci17

Member
Native 1080p vs DSR 4K (89 fps to 49 fps). First alarming sign: it's impossible to only drop from 145 fps to 103 fps when you're going from 1080p to 4K. This is the first proof that the Prey 1080p picture is heavily CPU-bound (by a healthy 40-50% at least).


---

DLDSR 1620p vs 1080p (89 fps to 65 fps). Practically, it costs the same as going from 1080p to plain 1620p. Cost as expected.


---

DLDSR 1620p vs DSR 4K (49 fps to 65 fps). This is the part where it would need to be 2x as efficient (it's not); it's ~35% more efficient (just like in the Prey comparison), while DLDSR 1620p supposedly matches DSR 4K. This is at 0% smoothness, so you can see it's applying an overly aggressive sharpening filter. Some may say DLDSR 1620p looks better than DSR 4K; it depends on the smoothness level... and on how tolerant you are of the sharpening.


---

DLDSR 1620p (100% smoothness) vs DSR 4K


---

DLDSR 1620p (50% smoothness) vs DSR 4K


---

Bonus round: native 1080p vs DLDSR 1620p + DLSS Quality (50% smoothness)

 
Last edited:
yamaci17 oh that's right, you can negate the sharpening with the smoothness slider. Looking at those comparisons at 1080p, 4K DSR and 2.25x DLDSR are about the same IQ. But it would be better to see uncompressed BMP images side by side.
 

TheTurboFD

Member
Does anyone know if there's a way to edit what resolution DLDSR scales from? I have a Samsung G9, which is 5120x1440. Obviously I don't want to use 2.25x DL for that, because it comes out to like 7680x2160. I always play my games at 1440p in windowed mode on the left side of my screen, so I wanted to know if there's a way to set what resolution DSR scales from. I would like to do 2.25x of 1440p for my games.
 

Kenpachii

Member


In short:

Quality:
4x DSR > 2.25x DLDSR > 2.25x DSR

Performance:
2.25x DSR > 2.25x DLDSR > 4x DSR

The best way to apply DLDSR 2.25x is to use DLSS Quality with it to remove some of the framerate hit.

I tested it out a while ago in Witcher 3, and frankly it's not worth it, no matter how people spin it.

What Nvidia should have done is just create higher DLSS presets that replace DSR entirely while they're at it.

For example:

Ultra Quality = Quality DLSS at native resolution
2x Ultra Quality = 2.25x DSR with Quality DLSS
4x Ultra Quality = 4x DSR with Quality DLSS
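
For what it's worth, here's the internal render resolution each of those hypothetical presets would work out to on a 1080p display (my arithmetic; the preset names are just my suggestion, not a real Nvidia feature):

```
# Internal render resolution for each proposed preset (hypothetical, 1080p native)
DSR_AXIS = {"Ultra Quality": 1.0, "2x Ultra Quality": 1.5, "4x Ultra Quality": 2.0}
DLSS_QUALITY_AXIS = 2 / 3  # DLSS Quality renders at ~67% per axis

for preset, axis in DSR_AXIS.items():
    w = round(1920 * axis * DLSS_QUALITY_AXIS)
    h = round(1080 * axis * DLSS_QUALITY_AXIS)
    print(f"{preset}: internal render {w}x{h}")
# Ultra Quality:    1280x720  (plain DLSS Quality at native)
# 2x Ultra Quality: 1920x1080 (native-res input, downsampled output)
# 4x Ultra Quality: 2560x1440
```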


Does anyone know if there's a way to edit what resolution DLDSR scales from? I have a Samsung G9, which is 5120x1440. Obviously I don't want to use 2.25x DL for that, because it comes out to like 7680x2160. I always play my games at 1440p in windowed mode on the left side of my screen, so I wanted to know if there's a way to set what resolution DSR scales from. I would like to do 2.25x of 1440p for my games.

No clue if it's possible; I wouldn't advise it though. A lower source resolution would probably fuck up the image even worse than native.
 
Last edited:

yamaci17

Member


In short:

Quality:
4x DSR > 2.25x DLDSR > 2.25x DSR

Performance:
2.25x DSR > 2.25x DLDSR > 4x DSR

The best way to apply DLDSR 2.25x is to use DLSS Quality with it to remove some of the framerate hit.

I tested it out a while ago in Witcher 3, and frankly it's not worth it, no matter how people spin it.

What Nvidia should have done is just create higher DLSS presets that replace DSR entirely while they're at it.

For example:

Ultra Quality = Quality DLSS at native resolution
2x Ultra Quality = 2.25x DSR with Quality DLSS
4x Ultra Quality = 4x DSR with Quality DLSS




No clue if it's possible; I wouldn't advise it though. A lower source resolution would probably fuck up the image even worse than native.

Per usual, DF, or Alex in particular, is only interested in the methodical 400% zoom approach. He never mentions anything about how the games look in general.

If you have a 1080p screen, just do what I do: set your game to pristine native 4K at 0% smoothness with no bull***, set DLSS to Performance mode = benefit.

Games have special LODs/assets tailored for 4K when you set them to 4K; going back to 1080p internally with DLSS then doesn't matter much.

Here are some comparisons: DSR 4K + DLSS Performance versus native 1080p.

I've tried it, and even made my own comparisons of DLDSR 1620p + DLSS Quality versus DSR 4K + DLSS Performance. DSR 4K + DLSS PERFORMANCE looks WAAAY better. Like, out of this world. Simply put: a 1620p input, no matter whether you downscale it with AI or whatnot, has 1620p LODs/assets.

Getting those sweet 4K LODs is way better. It's just a nuisance to set up, as it is for DLSS games.

For non-DLSS games, it is a good, viable alternative. But if a game has DLSS, you'd better go with the DSR 4K + DLSS Performance combo. It's a literal game changer.
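
The arithmetic behind why that combo is nearly free on a 1080p screen (my own numbers; the LOD behavior is what I've observed, not a documented rule):

```
# DSR 4x on a 1080p display outputs 4K; DLSS Performance renders at 50% per axis
DLSS_PERF_AXIS = 0.5
out_w, out_h = 3840, 2160  # DSR 4x output on a 1080p monitor

internal = (int(out_w * DLSS_PERF_AXIS), int(out_h * DLSS_PERF_AXIS))
print(internal)  # (1920, 1080): the same pixel count as native 1080p,
                 # but the game selects LODs/effects as if targeting 4K
```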
 

Kenpachii

Member
Per usual, DF, or Alex in particular, is only interested in the methodical 400% zoom approach. He never mentions anything about how the games look in general.

If you have a 1080p screen, just do what I do: set your game to pristine native 4K at 0% smoothness with no bull***, set DLSS to Performance mode = benefit.

Games have special LODs/assets tailored for 4K when you set them to 4K; going back to 1080p internally with DLSS then doesn't matter much.

Here are some comparisons: DSR 4K + DLSS Performance versus native 1080p.

I've tried it, and even made my own comparisons of DLDSR 1620p + DLSS Quality versus DSR 4K + DLSS Performance. DSR 4K + DLSS PERFORMANCE looks WAAAY better. Like, out of this world. Simply put: a 1620p input, no matter whether you downscale it with AI or whatnot, has 1620p LODs/assets.

Getting those sweet 4K LODs is way better. It's just a nuisance to set up, as it is for DLSS games.

For non-DLSS games, it is a good, viable alternative. But if a game has DLSS, you'd better go with the DSR 4K + DLSS Performance combo. It's a literal game changer.


Actually, I never thought about that. You could obviously downsample and just use DLSS on top of it in games that support DLSS.

Also, I never really knew about LODs being better at 4K than at 1080p; I wonder why that is.

I actually should try this out today.
 

yamaci17

Member
Actually, I never thought about that. You could obviously downsample and just use DLSS on top of it in games that support DLSS.

Also, I never really knew about LODs being better at 4K than at 1080p; I wonder why that is.

I actually should try this out today.
Think of distance LODs as proportional. Most game engines use a fixed proportion to reduce the rendering load of certain stuff; let's say 50%:

at 1080p, you get 540p-like assets,
at 1440p, you get 720p-like assets,
and at 4K, you get actual 1080p-like assets.
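
A toy version of that fixed-proportion idea in Python (hypothetical numbers; real engines use per-asset distance curves, not one global ratio):

```
# Pretend certain buffers/LODs run at a fixed 50% of the render resolution
LOD_RATIO = 0.5

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"render {name}: ~{int(w * LOD_RATIO)}x{int(h * LOD_RATIO)} assets")
# render 1080p: ~960x540 assets
# render 1440p: ~1280x720 assets
# render 4K:    ~1920x1080 assets
```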

The biggest trick TSR/DLSS uses is to not REDUCE model/texture LODs for objects. You see, I discovered and understood this when I noticed that performance does not scale with the resolution the game renders at. At first, I thought it was DLSS overhead (which is, in fact, a factor). But no, there was something amiss.

In some games, native 1440p rendering versus 4K + DLSS Quality rendering (which renders internally at 1440p) had a whopping 30-35% FPS difference. That was huge, and it had nothing to do with DLSS overhead; DLSS overhead causes at most a 10-15% frame-time loss.

Then I understood. DLSS only scales certain "aspects" back to 1440p, not all of them. Crucial things such as LODs/assets/post-process effects are still rendered as if you were rendering native 4K. Hence you claw performance back with DLSS but retain critical assets at higher resolution, which is why DLSS can be miraculously good.

Here is another RDR2 comparison. People scoff at DSR and see it just as an alternative AA method. I disagree. Current-gen games do not really look like their native resolution anymore. The improvement in this picture cannot be explained by the anti-aliasing effect alone; there is clearly something different going on here.


[screenshots: RDR2, native 1080p vs DSR 4K + DLSS Performance]


See the native 1080p on the left? Stuff is not rendering at native 1080p, TRUST me! Look at those undersampling artifacts: it's TAA trying to "mold" together low-resolution assets. Only with the 4K + DLSS Performance combo does the game actually look like it should.

Extra note: this is why some games, RDR2 in particular, do not benefit greatly from DLSS: the developer chose not to scale everything back. That's fine; it gives you some performance and you still get a good image. Other games scale everything down, and you see enormous gains. This is why there is no fixed performance increase across DLSS modes; it all depends on the implementation.
 
Last edited: