
Cyberpunk 2077 with NVIDIA DLSS 3 & Ray Tracing: Overdrive

Reizo Ryuu

Member
(mind-blown reaction GIF)

Yeah, mind blown alright. Mind blown by how you think this supports your statement when it actually proves the exact opposite.
 

yamaci17

Member
You can get away with that on a 3080 Ti... No need to dish out those eddies there!
i don't think so
the new path tracing mode is most likely the most complex type of RT we've ever seen in video games. Ada is specifically equipped with SER and other new ray tracing features that give it roughly a 70% advantage in those path tracing scenarios: shader execution reordering (brings something like a 30% improvement in very complex ray tracing scenes), displaced micro-meshes, and opacity micromaps. these will all factor in when you play Cyberpunk's RT Overdrive mode on Ada. GPUs that don't have these features will be proportionally slower
so while an RTX 4090 at native 4K does 20-25 fps with RT Overdrive, a 3080 Ti will most likely do something like 10 fps, and at that point even DLSS Performance wouldn't save it.
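quick napkin math on those numbers (just a rough sketch: the 70% figure is the marketing claim above, and the raw 4090-vs-3080 Ti gap is my own assumption, not a benchmark):

```python
ada_4090_native_4k = 22.5   # midpoint of the claimed 20-25 fps at native 4K with RT Overdrive
ada_only_speedup = 1.70     # the ~70% Ada-specific path tracing advantage claimed above
raw_gap = 1.6               # assumed raw 4090 vs 3080 Ti gap without the Ada-only features (my guess)

ampere_3080ti_estimate = ada_4090_native_4k / (ada_only_speedup * raw_gap)
print(f"estimated 3080 Ti, native 4K RT Overdrive: ~{ampere_3080ti_estimate:.0f} fps")
# -> ~8 fps, same ballpark as the ~10 fps guess above
```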

i can see a 3080 Ti getting 30-35 frames at 1440p with DLSS Balanced/Performance maybe, but then the game mostly looks blurry, so yeah

you can bet they will push ray tracing in a manner that makes Ampere obsolete, even the top dogs. path tracing is no joke, and Ampere does not have the tools to handle it in real time in a game with the scale of Cyberpunk

in most RT scenarios an RTX 4080 will most likely match a 3080 Ti, but in the case of path tracing in Cyberpunk, you can bet it will take the ball and run away with it. this is how they market it, at least. their "racing" RTX demo literally won't work on Ampere.
 
Last edited:

yamaci17

Member
Is it really going to make the game look better? That's the question I have.
who knows, the game still has last-gen assets and textures. I mean, the trailer did not look that impressive for a supposed "path tracing" mode.

RTX GI in Cyberpunk was already a controversial topic. now all of a sudden you'll have elitists who used to defend RTX GI over native raster lighting shitting on RTX GI once they see DF doing 400% close-ups and showing differences here and there.

it happens

it will most likely keep happening lol

the reason Nvidia is so focused on Cyberpunk is that the game has last-gen assets and textures, which means they can push ray tracing without breaching the shitty 12 GB they put on their fake 4080 and the 10 GB they will put on their fake 4070.

imagine a PS5/SX multiplat next-gen game that pushes an entire 10 GB worth of data with RASTERIZATION alone, and then imagine putting all kinds of advanced path tracing and DirectGI on top of it. plausible? not even in the slightest.

this is why these are, at this point, troll products. unless you go for the full 16 GB 4080, you won't be getting DirectGI + next-gen textures on a 12 GB budget in 4K/upscaled scenarios.
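a rough VRAM budget sketch of what I mean (every number here is an assumption for illustration, not a measurement):

```python
budget_gb = 12.0          # the 12 GB "4080"

raster_assets_gb = 10.0   # what a PS5/SX-class game might already use for raster assets alone
rt_structures_gb = 1.5    # assumed: BVH / acceleration structures for path tracing
rt_buffers_gb = 1.0       # assumed: denoiser history, G-buffers, upscaler buffers at 4K

total_gb = raster_assets_gb + rt_structures_gb + rt_buffers_gb
print(f"~{total_gb:.1f} GB needed vs {budget_gb:.1f} GB available "
      f"-> over budget by {total_gb - budget_gb:.1f} GB")
```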

this is why Cyberpunk is important to them. I mean, besides having last-gen textures and assets, the game aggressively culls objects around you (clearly observable). then imagine the next-gen horror.

just like how they designed the 3060 Ti and 3070 with 8 GB of memory, the 12 GB 4080 can only save the day for so long. it has no apparent future for next-gen games. 12 GB is an insult for such a capable GPU, especially in regards to DirectGI
 
Last edited:

Roni

Gold Member
i don't think so
the new path tracing mode is most likely the most complex type of RT we've ever seen in video games. Ada is specifically equipped with SER and other new ray tracing features that give it roughly a 70% advantage in those path tracing scenarios: shader execution reordering (brings something like a 30% improvement in very complex ray tracing scenes), displaced micro-meshes, and opacity micromaps. these will all factor in when you play Cyberpunk's RT Overdrive mode on Ada. GPUs that don't have these features will be proportionally slower
so while an RTX 4090 at native 4K does 20-25 fps with RT Overdrive, a 3080 Ti will most likely do something like 10 fps, and at that point even DLSS Performance wouldn't save it.

i can see a 3080 Ti getting 30-35 frames at 1440p with DLSS Balanced/Performance maybe, but then the game mostly looks blurry, so yeah

you can bet they will push ray tracing in a manner that makes Ampere obsolete, even the top dogs. path tracing is no joke, and Ampere does not have the tools to handle it in real time in a game with the scale of Cyberpunk

in most RT scenarios an RTX 4080 will most likely match a 3080 Ti, but in the case of path tracing in Cyberpunk, you can bet it will take the ball and run away with it. this is how they market it, at least. their "racing" RTX demo literally won't work on Ampere.
He talked about using Psycho RT settings, I use Psycho RT settings on my 3080 Ti.
 

yamaci17

Member
He talked about using Psycho RT settings, I use Psycho RT settings on my 3080 Ti.
sorry, my bad then

yeah, Psycho is fine for current-gen cards. I don't know what the difference between Psycho and Ultra is, though. I think Medium was only 1 bounce and the others bounce more, or something

i thought you were referring to
"I probably won't use RT Overdrive mode"
 

Mister Wolf

Member
He talked about using Psycho RT settings, I use Psycho RT settings on my 3080 Ti.


I want a higher framerate when I use Psycho with my 3080 Ti. What resolution and framerate do you get in the game benchmark using Psycho lighting paired with RT Reflections? I have a 3080 Ti paired with a 5800X3D. With FOV set to 100, 4K DLSS Performance, RT Lighting Psycho, and RT Reflections I average 56.73 FPS. That's not nearly good enough.
 

Roni

Gold Member
I want a higher framerate when I use Psycho with my 3080 Ti. What resolution and framerate do you get in the game benchmark using Psycho lighting paired with RT Reflections? I have a 3080 Ti paired with a 5800X3D. With FOV set to 100, 4K DLSS Performance, RT Lighting Psycho, and RT Reflections I average 56.73 FPS. That's not nearly good enough.
90 FOV, DLSS set to Auto and 4k Resolution. Everything maxed out. I get 45-60 FPS.

But the drop to 45 is not GPU, it's either CPU or storage. Not quick enough to load the crowds at 60. Even at low settings I get drops to 45 in heavily crowded scenes.
 
Last edited:

yamaci17

Member
90 FOV, DLSS set to Auto and 4k Resolution. Everything maxed out. I get 45-60 FPS.

But the drop to 45 is not GPU, it's either CPU or storage. Not quick enough to load the crowds at 60. Even at low settings I get drops to 45 in heavily crowded scenes.
it's a CPU bottleneck situation



this should help a bit

setting crowd density to medium would also help, but first try the SMT fix (if you have an 8-core/16-thread CPU, that is)

tbh this game to me is enjoyable at 40-55 frames. VRR helps a bit too, and a bit of motion blur too. and the latest version (1.6) has some kind of Reflex implementation built in, so input lag at 99% GPU usage is also minimal (compared to older versions). but everyone has different needs and wants
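if you want to check the CPU vs GPU bottleneck yourself, something like this works as a rough sketch (it assumes a frametime log with a GPU-usage column, exported from whatever capture tool you use; the column names here are made up):

```python
import csv

def classify(path, gpu_bound_threshold=95.0):
    """Count GPU-bound vs CPU/storage-bound frames in a capture log.

    Assumes a CSV with hypothetical "frametime_ms" and "gpu_usage_pct" columns.
    """
    gpu_bound, other_bound = 0, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["gpu_usage_pct"]) >= gpu_bound_threshold:
                gpu_bound += 1       # GPU saturated -> GPU limited
            else:
                other_bound += 1     # GPU not saturated -> likely CPU/storage limited

    return gpu_bound, other_bound

gpu, other = classify("capture.csv")
print(f"GPU-bound frames: {gpu}, CPU/storage-bound frames: {other}")
```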
 
Last edited:

Mister Wolf

Member
90 FOV, DLSS set to Auto and 4k Resolution. Everything maxed out. I get 45-60 FPS.

But the drop to 45 is not GPU, it's either CPU or storage. Not quick enough to load the crowds at 60. Even at low settings I get drops to 45 in heavily crowded scenes.

I need more power, Roni. That DLSS 3.0 will provide a silky smooth experience. We are already hitting high enough framerates (50-60 fps) that latency is not a concern. When I'm playing I want the game to have the motion/camera smoothness of 90-120Hz.
 

Reallink

Member
In 2 of the 3 games tested, the latency nearly doubled with DLSS 3 vs DLSS 2, and that seems like a huge drawback. You can compare it to the game without any DLSS, but that could be running at 30-40 fps, so having better latency than that is hardly a revelation.

I don't know how I feel about gaming at 100+ fps with worse latency than 60 fps. To me it seems like Nvidia is rushing something that's not ready in order to justify the huge price increase of the new GPUs.

The comparison point was a 4090 running native 4K, so everything tested would be 60 or over, except perhaps the new Psycho Cyberpunk mode. The feature's not designed for titles where DLSS 2.0 can already hit 120Hz+; it's meant for future releases that cannot, for piling on improved RT effects (like Cyberpunk), or for increasing 60Hz titles to 120+. You don't lose anything in any of these scenarios because your only alternative is that they don't run at all, struggle to run between 30 and 60, or just run at vanilla 60. However, in all of them you'd gain a 2-4x+ increase in the perceptual framerate with 3.0.
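To make the trade-off concrete, here's a rough sketch of how frame generation changes displayed framerate versus latency (the doubling, the overhead figure, and the simple latency model are all illustrative assumptions, not measurements):

```python
# toy model: frame generation roughly doubles displayed fps,
# while input latency stays tied to the base (rendered) framerate plus some overhead
def frame_gen_estimate(base_fps, gen_overhead_ms=5.0):
    displayed_fps = base_fps * 2                     # one generated frame per rendered frame
    base_latency_ms = 1000.0 / base_fps              # crude stand-in: one rendered-frame interval
    latency_ms = base_latency_ms + gen_overhead_ms   # assumed extra cost of generating/queueing frames
    return displayed_fps, latency_ms

for base in (30, 45, 60):
    fps, lat = frame_gen_estimate(base)
    print(f"base {base} fps -> ~{fps} fps displayed, ~{lat:.0f} ms latency")
```

The point being: the smoothness scales with the displayed framerate, but the responsiveness stays roughly at the level of the base framerate.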
 
Last edited:

bbeach123

Member
RT looks better, but not that much. I go around the city checking all the locations and shit, and maybe in one of every 20 scenes the RT one is clearly better. Most of the time it's just different. Not worth the fps drop imo.

The non-RT lighting is already so good.

 
Last edited:

Crayon

Member
This honestly frustrates me. I've been PC gaming my whole life but I've always cruised by on budget PCs. I never really had the luxury to game the best looking shit on the highest settings at a good framerate. I finally decided to save save save and save to build a high end PC. Somehow and someway I was actually able to get an RTX 3080 last year. It felt good. It felt good that for once I got to be on top for a change. I know it wasn't the absolutely best GPU you could have, but it was almost the best you could have, and that was more than enough for me. Get that "PC Master Race" ego brewing. But you know what, I earned it.

Anyway that feeling lasted around 8 months give or take before Nvidia announced the 4xxx series. Bear in mind I never really kept up with PC tech until early 2020, so I don't know if this is just par for the course, but I couldn't believe it. It really felt like the 3xxx series just came out, with that magnum sized big dick energy. But now it's like "The RTX 3080? Pffff. That's sooooo 2020." And apparently I can't run DLSS 3.0 with it, for reasons.

But judging by literally everyone else's reactions, I'm not the only one who thinks it's bullshit. Not sure if it's for the same reasons.

Nvidia has chosen to make some really expensive, upmarket stuff here. It's pushing into a new tier. Imo it's getting fetishistic, but there's enough people who can buy that now so they made it.

Your computer is pretty sick right now and if you want to start saving again, you got a good 4 years till rtx60XX and your gear will still be chewing up games past then. You should be stoked that you put all that together.

Snap out of it!!

 
who knows, the game still has last-gen assets and textures. I mean, the trailer did not look that impressive for a supposed "path tracing" mode.

RTX GI in Cyberpunk was already a controversial topic. now all of a sudden you'll have elitists who used to defend RTX GI over native raster lighting shitting on RTX GI once they see DF doing 400% close-ups and showing differences here and there.

it happens

it will most likely keep happening lol

the reason Nvidia is so focused on Cyberpunk is that the game has last-gen assets and textures, which means they can push ray tracing without breaching the shitty 12 GB they put on their fake 4080 and the 10 GB they will put on their fake 4070.

imagine a PS5/SX multiplat next-gen game that pushes an entire 10 GB worth of data with RASTERIZATION alone, and then imagine putting all kinds of advanced path tracing and DirectGI on top of it. plausible? not even in the slightest.

this is why these are, at this point, troll products. unless you go for the full 16 GB 4080, you won't be getting DirectGI + next-gen textures on a 12 GB budget in 4K/upscaled scenarios.

this is why Cyberpunk is important to them. I mean, besides having last-gen textures and assets, the game aggressively culls objects around you (clearly observable). then imagine the next-gen horror.

just like how they designed the 3060 Ti and 3070 with 8 GB of memory, the 12 GB 4080 can only save the day for so long. it has no apparent future for next-gen games. 12 GB is an insult for such a capable GPU, especially in regards to DirectGI
It’s looking like not even the 5090ti will run this at 4k 60
 

FireFly

Member
I still remember when Alex was claiming that DLSS 2.0 on Control was better than native. And he showed that in the video.
Then I played the game and realized he was full of it. He is constantly lying to make Nvidia look better than it really is.
And it's paying off for him and DF.
You can judge for yourself how bad the artefacts are with this video showing only the generated frames:

 

01011001

Banned
I still remember when Alex was claiming that DLSS 2.0 on Control was better than native. And he showed that in the video.
Then I played the game and realized he was full of it. He is constantly lying to make Nvidia look better than it really is.
And it's paying off for him and DF.

Control looks better with DLSS, though. Softer? Sure, but fewer overall artifacts in hair etc. than with TAA.
 

winjer

Gold Member
Control looks better with DLSS, though. Softer? Sure, but fewer overall artifacts in hair etc. than with TAA.

No, it doesn't. Not only does it lose clarity and resolution when there is camera movement, because it doesn't have time to accumulate data from previous frames (since they are different), but there is also a lot of ghosting and artifacts.
Yes, the TAA in Control is lacking, but DLSS 2.0 doesn't even match it.
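For what it's worth, the "accumulate data from previous frames" part is basically an exponential history blend. A minimal sketch of the general idea (not the actual DLSS or TAA implementation; the blend weights and motion handling are made up for illustration):

```python
import numpy as np

# generic temporal accumulation: blend the current (noisy/low-res) frame into a history buffer.
# more camera motion -> more history gets rejected -> less accumulated detail,
# which is why movement costs clarity.
def accumulate(history, current, motion_amount, alpha_still=0.1, alpha_moving=0.6):
    # more motion -> lean more on the current frame, less on accumulated history
    alpha = alpha_still + (alpha_moving - alpha_still) * np.clip(motion_amount, 0.0, 1.0)
    return (1.0 - alpha) * history + alpha * current

history = np.zeros((4, 4))
for frame in range(8):
    current = np.random.rand(4, 4)   # stand-in for a newly rendered frame
    history = accumulate(history, current, motion_amount=0.0 if frame < 4 else 1.0)
```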
 
RT looks better, but not that much. I go around the city checking all the locations and shit, and maybe in one of every 20 scenes the RT one is clearly better. Most of the time it's just different. Not worth the fps drop imo.

The non-RT lighting is already so good.



Try this one. Open both and try switching between them. See if you spot the difference.

sameimage1.png

sameimage2.png
 
Nvidia has chosen to make some really expensive, upmarket stuff here. It's pushing into a new tier. Imo it's getting fetishistic, but there's enough people who can buy that now so they made it.

Your computer is pretty sick right now and if you want to start saving again, you got a good 4 years till rtx60XX and your gear will still be chewing up games past then. You should be stoked that you put all that together.

Snap out of it!!

For real... I'd kill to have his setup
 

mrMUR_96

Member
This honestly frustrates me. I've been PC gaming my whole life but I've always cruised by on budget PCs. I never really had the luxury to game the best looking shit on the highest settings at a good framerate. I finally decided to save save save and save to build a high end PC. Somehow and someway I was actually able to get an RTX 3080 last year. It felt good. It felt good that for once I got to be on top for a change. I know it wasn't the absolutely best GPU you could have, but it was almost the best you could have, and that was more than enough for me. Get that "PC Master Race" ego brewing. But you know what, I earned it.

Anyway that feeling lasted around 8 months give or take before Nvidia announced the 4xxx series. Bear in mind I never really kept up with PC tech until early 2020, so I don't know if this is just par for the course, but I couldn't believe it. It really felt like the 3xxx series just came out, with that magnum sized big dick energy. But now it's like "The RTX 3080? Pffff. That's sooooo 2020." And apparently I can't run DLSS 3.0 with it, for reasons.

But judging by literally everyone else's reactions, I'm not the only one who thinks it's bullshit. Not sure if it's for the same reasons.
Well, maybe you should have looked up when the new series was due to launch; the release dates of new cards follow a pretty even schedule. DLSS 2 will still be updated and supported; DLSS 3 just adds the AI-generated frames, which require new hardware on the 4000 series. The new prices are ridiculous, but that's a separate issue.
 

Madflavor

Member
Well, maybe you should have looked up when the new series was due to launch; the release dates of new cards follow a pretty even schedule. DLSS 2 will still be updated and supported; DLSS 3 just adds the AI-generated frames, which require new hardware on the 4000 series. The new prices are ridiculous, but that's a separate issue.

Well, hindsight is 20/20, and I mentioned in that post that I was fairly new to some of this.
 