
DLSS and Path Tracing could force console makers to go with NVIDIA in the future

Could sufficient advances in DLSS and Path Tracing support bring Sony to NVIDIA?

  • No, the competition eventually catches up and offers a similar current value.

  • No, the console market just doesn't care enough to afford the price.

  • Yes, they corner the market by subsidising the chip and outvalue the competition.

  • Yes, the difference will become even larger and consumers will pay for it in the end.



RoboFu

One of the green rats
I just had a thought about how GPUs are a finite market. Meaning there will be a point where they get so powerful and cheap that any manufacturer will be able to make their own, and Nvidia will probably not even exist anymore.
Of course I will be long dead by then, but some of you might see that day.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Agree with everything u said but the "FSR is sufficient" part lol.

I can tell the resolution is being upscaled when I use FSR on high quality at 4K. Like, you can see and notice that the image quality took a hit because it's being upscaled.

Can't say the same for DLSS on quality at 4K. Sometimes it even looks cleaner than native.

FSR isn't sufficient. I would be very generous if I said it's half as good as DLSS. But... for those with AMD cards... what other options do they have? Take whatever they can get, I guess.
FSR2+ and TSR are sufficient, if checkerboarding and TAAU were considered sufficient.
FSR2+ and TSR look better than checkerboarding and TAAU... so yeah, that's my reasoning.


To add, blind testing showed that DLSS looks slightly better than FSR2+ more often than not... so yeah, not denying DLSS is the better tech, but FSR2+ and TSR are certainly viable and "sufficient" alternatives; it all depends on what res you are starting from.
If we assume 4K is still the de facto resolution next gen, then we can expect games to use TSR/FSR from 1440-1800p up to 2160p. Without direct comparisons people will be hard pressed to actually pin down the starting resolution and say this looks bad, and if Digital Foundry don't tell people the starting resolution, they will be more than happy with the output.
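
Just to put rough numbers on that, here's a quick back-of-the-envelope Python sketch (the function name and the 1620p middle step are mine, purely illustrative):

    def upscale_factor(internal_height, target_height=2160):
        # per-axis scale factor and total pixel ratio for a 16:9 frame
        axis = target_height / internal_height
        return axis, axis * axis  # pixel count grows with the square

    for h in (1440, 1620, 1800):
        axis, pixels = upscale_factor(h)
        print(f"{h}p -> 2160p: {axis:.2f}x per axis, {pixels:.2f}x the pixels to reconstruct")

1440p to 2160p is 1.5x per axis (2.25x the pixels), while 1800p to 2160p is only 1.2x per axis (1.44x), which is why the output is so hard to call by eye without a direct comparison.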

[Image: AMD FSR vs DLSS comparison chart]
 

Silver Wattle

Gold Member
Let's face it, NVIDIA has been at least a step ahead of AMD for years now in terms of tech. AMD is still on a plan to build an open, stable standard, but that has yet to materialize in performance that reaches the cutting edge. Mind you, I'm not saying AMD is in trouble at all. Their leadership is very competent, and their strategy of pricing realistically for the performance they deliver and focusing on custom work for Sony and Microsoft has paid off. But Ray Tracing has been a sore spot in the race for a while now, and NVIDIA has the clear advantage. Now that Path Tracing on consumer GPUs is a reality, with technology like DLSS to support high FPS (something people ask for in every other console thread you can stumble into here), could Sony and Microsoft see a gap in image quality wide enough to actually use an NVIDIA chip for something like a PS6 Pro or the next Xbox refresh?

I think so... If progress continues as it is now, Microsoft and Sony can stretch this console generation a bit more, and by the time a refresh ought to be coming out for the next gen we could be seeing a pricey console offering image quality similar to Cyberpunk's Overdrive mode at 4K 30 FPS. Let's call that 7-8 years.

What do you think? Unless AMD can come up with some surprising breakthroughs soon it might be too late.
 

Rudius

Member
Next gen consoles will come out in 5 years. By then AMD should be better in the software department. I doubt Nvidia will be better in the price department.
 

digdug2

Member
If anything, it'll just motivate AMD to continue to get better. I mean, look at their video cards now. NVIDIA wiped the floor with them for years and years, but not anymore.
 

SolidQ

Member
NVIDIA has been at least a single step ahead of AMD for years now in terms of tech
The problem is price. You should also understand that consoles use a small-die CPU+GPU, which is why AMD is going MCM in the future; that's a massive tech improvement for consoles. NV's problem is that they still use a monolithic chip, and you can't keep making it bigger and bigger and bigger to fight AMD.
 

Schmendrick

Member
Even in Cyberpunk, I wouldn't say it's worth the performance hit. It's still best in Metro, and that game is 1800p 60 with full RT on consoles
CP2077 is just an enthusiast tech preview.
The moment hardware powerful enough to standardize stuff like RT GI becomes cheap enough, the old raster methods will be replaced completely. By the time the next console gen hits we'll have advanced another 2 GPU gens. Should be doable by then.
 

Goon_Bong

Member
Nvidia will no doubt have another big misstep like they had with the FX series in the 2000s. I still remember getting a 9700 back then and just marvelling at the performance.
 
Not happening.

Nvidia charges a lot of money (for great tech by the way) but MS and Sony aren't gonna be able to afford what Nvidia charges.


They still need to make their consoles cheap, and Nvidia will charge them an arm and a leg for anything competent, while AMD is more reasonable with their pricing.


Both companies will stick with AMD, and while AMD currently sucks at RT, by next gen they will probably catch up a bit in path tracing tech. Nvidia will still be far ahead of them, but at least they'll get somewhat decent performance by then.
 

Tsaki

Member
No they won't. AMD is one generation behind in RT performance, which is not a big enough gap to justify a huge switch like this, and they already have AI hardware in the latest Phoenix APUs thanks to Xilinx, if that is something Sony and MS want to explore. But of course the most important part is that they offer the best x86 CPU cores, which Nvidia doesn't.
 

iorek21

Member
I doubt either Sony or Microsoft would care enough to increase their costs (and the consumer's) for that.
Path tracing is going to be widely used when it's affordable and accessible. Until then, it's a glorified PhysX.

Then again, I think AI is going to be the next real deal for consoles and developers, not RT or PT.
 
You should take into account APUs and the size of these Nvidia GPUs. Not to mention cost.
You should take into account that a 10 TF machine should not be as huge as the PS5 is, but AMD is not really efficient when you consider their better node.


I guess the 4090 might actually be close to what we can expect from the PS6. With the then-current node and obviously undervolted, paired with a likewise undervolted 8+8 core CPU, which should result in a similar power draw and ginormous box size as we have now.


Anyone thinking that Chinese-engineered console parts could come with next gen? So far their manufacturing process is behind and their awesome heaters barely produce frames, but while western governments invest billions to make us a little less dependent on Taiwan and Korea, I would assume China invests even more to take over that market in the long term. Maybe not with the PS6, but I guess a true Made in China gaming machine is inevitable in the near future.
 

SF Kosmo

Al Jazeera Special Reporter
A new console generation is a looooong way off. 2027 at least. I think by then AMD will have made enough headway on machine learning to have viable DLSS-like features and the performance overhead to support more robust ray tracing.

Even if they never "catch up" to Nvidia in performance, if they're at least offering rough feature parity, that will be enough, since backwards compatibility is so essential now.
 

Braag

Member
Consoles will catch up eventually; AMD will get better with it too, and its cost will eventually drop.
 

LiquidMetal14

hide your water-based mammals
Part of me wants this, because Nvidia does offer smooth performance and top-tier feature sets. Then there's the cost, and how much that is versus what AMD offers, which works and also saves the manufacturer money.

On PC, though, Nvidia are leading the pack and I wouldn't go any other way today.
 

Buggy Loop

Member
Path tracing is not some ancient technology resurfaced with mysterious rituals. Competitors will catch up eventually.

If people think that Nvidia went with 80s Monte Carlo path tracing for Cyberpunk like they initially did for Quake RTX, it's because they didn't look at the tech behind RTXDI. Look up ReSTIR. It has basically shaken the industry. It's years and years ahead of where people thought real-time path tracing would be.
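
For anyone curious, the heart of ReSTIR is a weighted reservoir update over light candidates, which can then be reused across neighbouring pixels and across frames. A rough Python sketch of the idea (my own illustration, not NVIDIA's code):

    import random

    class Reservoir:
        # Minimal weighted-reservoir update as used in ReSTIR-style resampled
        # importance sampling. Illustrative sketch only, not NVIDIA's code.
        def __init__(self):
            self.sample = None   # currently selected candidate (e.g. a light sample)
            self.w_sum = 0.0     # running sum of resampling weights
            self.m = 0           # number of candidates seen so far

        def update(self, candidate, weight):
            self.w_sum += weight
            self.m += 1
            # keep the new candidate with probability weight / w_sum
            if self.w_sum > 0 and random.random() < weight / self.w_sum:
                self.sample = candidate

    def resample_lights(candidates, target_pdf, source_pdf):
        # Pick one light out of many cheap candidates; ReSTIR then reuses these
        # reservoirs spatially and temporally to get far better samples per ray.
        r = Reservoir()
        for c in candidates:
            r.update(c, target_pdf(c) / max(source_pdf(c), 1e-8))
        if r.sample is None:
            return None, 0.0
        # unbiased contribution weight for the surviving sample
        W = r.w_sum / (r.m * max(target_pdf(r.sample), 1e-8))
        return r.sample, W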

But Nvidia is not resting on their laurels; they're already on the next thing to improve path tracing performance and reduce image noise with neural radiance caching.

AMD provides fuck all in that field. They're barely involved with the universities or the people working on RT / ML in general. They both sat at the consortium with Microsoft and Vulkan for the API, years before it was implemented in hardware, yet it still feels like AMD is the dude in the back of the class who slept through the consortium, copied what he could from Turing at the last minute, still managed to fuck it up, and handed in half-assed homework.

Intel has more chances to actually be a 2nd competitor to Nvidia in tech than AMD.

Let’s not open the ML discussion because it’s an even more dire situation..
 

Cryio

Member
Games are starting to use FSR2 on consoles, and it will only improve over time. And FSR3 will be possible on consoles as well.

There's no worry here.
 

poppabk

Cheeks Spread for Digital Only Future
Whatever R&D AMD is putting into path tracing is likely going towards getting it onto an APU, as that is where they are more than competitive and where the market is likely going eventually.
 

acm2000

Member
At this point Intel is more likely than AMD; they are at least giving ray tracing hardware a real shot, unlike AMD, who are still competing with the GTX 10 series :LOL:
 

LordOfChaos

Member
I think people forget or never knew that one of the big reasons Nintendo got Nvidia was that Nvidia had a wafer agreement with TSMC for the TX1 but couldn't move enough units despite trying to put it in a few things. Nintendo capitalized on this with a deal for the Switch using an underclocked TX1 that appears pixel for pixel the same.

Nvidia never worked in consoles for more than one generation with a partner, because they always exerted more control than ATI/AMD/IBM and wanted higher margins. IIRC they even screwed Microsoft on die shrink schedules on the OG Xbox, MS left them, and then Sony had similar problems with them.

It doesn't really matter if Nvidia stays better at path tracing and super sampling. As long as AMD hangs in there, that's all that really matters, and console chips can be semi-customized to draw down features from the product pipeline.

Wildcard: AMD has been the only company offering good x86 core performance combined with good graphics in a single chip. Intel could well have competitive graphics by next gen. I don't really think anyone will be tempted to switch, for compatibility reasons (x86 aside, the GPU uarch is too different), but Intel may try to get a console deal again.
 

//DEVIL//

Member
FSR2+ and TSR are sufficient, if checkerboarding and TAAU were considered sufficient.
FSR2+ and TSR look better than checkerboarding and TAAU... so yeah, that's my reasoning.


To add, blind testing showed that DLSS looks slightly better than FSR2+ more often than not... so yeah, not denying DLSS is the better tech, but FSR2+ and TSR are certainly viable and "sufficient" alternatives; it all depends on what res you are starting from.
If we assume 4K is still the de facto resolution next gen, then we can expect games to use TSR/FSR from 1440-1800p up to 2160p. Without direct comparisons people will be hard pressed to actually pin down the starting resolution and say this looks bad, and if Digital Foundry don't tell people the starting resolution, they will be more than happy with the output.

[Image: AMD FSR vs DLSS comparison chart]
Casual gamers won't be able to tell, but most of us here can without Digital Foundry giving us numbers. I was able to tell that the 120 fps mode on GT7 made the game look like shit, for example. Aliasing and bad textures at a distance are hard for me not to notice.

That graph above kinda proves my point though. There are 9 games in which DLSS destroys FSR, and 12 where, according to them, it's flat-out better than FSR (even if not by much according to them; this is a subjective point).


Anyway, like I said, I agree with your post so we're good lol
 

DaGwaphics

Member
The console vendors don't necessarily need to rely on AMD to boost RT performance. Keep in mind that they could add their own custom blocks to act as their tensor cores, boosting the matrix math performance.
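
To illustrate what such a block would actually be accelerating, here's a toy sketch of the fused matrix multiply-accumulate pattern (shapes and dtypes are my assumptions for illustration, not any vendor's spec):

    import numpy as np

    # Low-precision input tiles, higher-precision accumulator: the D = A*B + C
    # pattern that dedicated matrix hardware is built around.
    A = np.random.rand(16, 16).astype(np.float16)   # low-precision input tile
    B = np.random.rand(16, 16).astype(np.float16)
    C = np.zeros((16, 16), dtype=np.float32)        # float32 accumulator

    D = A.astype(np.float32) @ B.astype(np.float32) + C

    # An ML upscaler spends nearly all of its time in exactly this kind of op,
    # which is why a dedicated matrix block matters for DLSS-like features.
    print(D.shape, D.dtype)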
 

SmokedMeat

Gamer™
No, PC versions will just start looking leagues ahead of consoles again unless AMD finds a way to fix their ray tracing performance.

Their ray tracing performance is fine.

Then there's Unreal 5 featuring Lumen, so ray tracing isn't a concern at all.
 
I doubt either Sony or Microsoft would care enough to increase their costs (and the consumer's) for that.
Path tracing is going to be widely used when it's affordable and accessible. Until then, it's a glorified PhysX.

Then again, I think AI is going to be the next real deal for consoles and developers, not RT or PT.

RT/PT are techniques that are part of a fundamental aspect of games though, so I don't see how they can't be the real deal. Not that ML won't be, but RT isn't a fad.
 

RoboFu

One of the green rats
30-50% behind NVIDIA is "fine".......

With a lighting system that heavily relies on RT "RT isn't a concern at all".

Huh, Lumen is the opposite. It can use RT, but it in no way heavily relies on it. It was made because RT was so performance heavy.
 

Schmendrick

Member
Huh, Lumen is the opposite. It can use RT, but it in no way heavily relies on it. It was made because RT was so performance heavy.
Wrong... Everything dynamic in Lumen relies on RT. Everything you've seen in demos so far used it with varying accuracy settings (distance-field geometric abstractions, for example).
Lumen offers scalability in that it uses a conglomerate of techniques, so you can fine-tune what to use where and with what accuracy, but it does not transcend technical limitations. You don't get accurate dynamic lighting without RT.
 

Boy bawang

Member
DLSS is a big equalizer. We know that the next Switch will have it, so I'm curious to see how it will compare to the Series S for instance.
 

RoboFu

One of the green rats
Wrong... Everything dynamic in Lumen relies on RT. Everything you've seen in demos so far used it with varying accuracy settings (distance-field geometric abstractions, for example).
Lumen offers scalability in that it uses a conglomerate of techniques, so you can fine-tune what to use where and with what accuracy, but it does not transcend technical limitations. You don't get accurate dynamic lighting without RT.
You are misunderstanding Lumen. Lumen uses, at its base, a form of SOFTWARE RT.
Really it's more a mix of ray casting and raster tricks. But it can use hardware RT.

It’s all in the documentation.
https://docs.unrealengine.com/5.0/en-US/lumen-technical-details-in-unreal-engine/
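
If it helps, the generic technique behind distance-field "software ray tracing" is sphere tracing against a signed distance field. A toy sketch of the idea (not Epic's code; the scene here is just a single sphere):

    import math

    def scene_sdf(p):
        # signed distance to the scene; here just a unit sphere at the origin
        x, y, z = p
        return math.sqrt(x * x + y * y + z * z) - 1.0

    def sphere_trace(origin, direction, max_steps=64, max_dist=100.0, eps=1e-3):
        # March along a ray, stepping by the distance-field value each time.
        t = 0.0
        for _ in range(max_steps):
            p = tuple(o + t * d for o, d in zip(origin, direction))
            d = scene_sdf(p)
            if d < eps:        # close enough to a surface: treat as a hit
                return t
            t += d             # safe step: nothing in the scene is closer than d
            if t > max_dist:
                break
        return None            # ray escaped without hitting anything

    print(sphere_trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))  # ~2.0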
 

MH3M3D

Member
Not unless Nvidia miraculously catches up with AMD's CPU performance and creates an SoC with both CPU and GPU on it that matches AMD's offerings AAAAND emulates the PS4 and PS5 perfectly.
 

Schmendrick

Member
You are misunderstanding Lumen. Lumen uses, at its base, a form of SOFTWARE RT.
Really it's more a mix of ray casting and raster tricks. But it can use hardware RT.

It’s all in the documentation.
https://docs.unrealengine.com/5.0/en-US/lumen-technical-details-in-unreal-engine/
You're the one misunderstanding things here.
Software RT was the first iteration and the only option before the full release, and while usable it is far from optimal, let alone performant, which is why Lumen now supports hardware ray tracing.
One way or another, it's RT based.
 

SmokedMeat

Gamer™
With a lighting system that heavily relies on RT "RT isn't a concern at all".


https://www.techspot.com/review/2599-radeon-7900-xtx-vs-geforce-rtx-4080/#

What's interesting here is that ray tracing performance in Fortnite, which is one of the best examples of RT effects in any game right now, sees the 7900 XTX and RTX 4080 neck and neck. In fact, the Radeon GPU actually does a better job here with RT Lumen and Nanite enabled, and that's a surprising result.

https://www.techspot.com/review/2642-radeon-7900-xt-vs-geforce-rtx-4070-ti/#Fortnite_RT

Now with hardware ray tracing enabled (along with Lumen and Nanite), we find that performance is very similar using either GPU.


This means that for one of the most impressive examples of ray tracing we have to date, RDNA 3 is able to match Ada Lovelace. Therefore it's going to be very interesting to see how these two architectures compare in future Unreal Engine 5 titles when using ray tracing.



 
The only way Nvidia would make an acceptable offer is if they as a whole were suffering financially, or if AMD cards took a big portion of their PC market share; that is the only way Nvidia would go into a deal with Sony or Microsoft. Nvidia is way too arrogant to accept a deal from console makers on their conditions. They would charge a premium for the GPUs.
 

Schmendrick

Member
What's interesting here is that ray tracing performance in Fortnite, which is one of the best examples of RT effects in any game right now, sees the 7900 XTX and RTX 4080 neck and neck. In fact, the Radeon GPU actually does a better job here with RT Lumen and Nanite enabled, and that's a surprising result.

https://www.techspot.com/review/2642-radeon-7900-xt-vs-geforce-rtx-4070-ti/#Fortnite_RT

Now with hardware ray tracing enabled (along with Lumen and Nanite), we find that performance is very similar using either GPU.


This means that for one of the most impressive examples of ray tracing we have to date, RDNA 3 is able to match Ada Lovelace. Therefore it's going to be very interesting to see how these two architectures compare in future Unreal Engine 5 titles when using ray tracing.
1/16th resolution, only GI and reflections, and the geometric complexity of Duplo.
"One of the best examples of RT effects"... uh-huh.

Congrats, AMD "only" loses by ~10% in light RT situations (hardware RT, btw). We knew that before. We also know what happens the moment bounces, resolution, etc. are increased.
 

SmokedMeat

Gamer™
1/16th resolution, only GI and reflections, and the geometric complexity of Duplo.
"One of the best examples of RT effects"... uh-huh.

Congrats, AMD "only" loses by ~10% in light RT situations (hardware RT, btw). We knew that before. We also know what happens the moment bounces, resolution, etc. are increased.

Real world benchmarks versus whatever you pulled out of your ass.

You really do sound like a Schmendrick. 😂
 

Schmendrick

Member
Real world benchmarks versus whatever you pulled out of your ass.

You really do sound like a Schmendrick. 😂
Right, because we didn't have 100s of benchmark comparisons with CP2077 just recently, or Portal before that, or the 1000s of benchmark charts when the cards were reviewed upon release...
Can you act any dumber?
 

SmokedMeat

Gamer™
Right because we didn't have 100s of benchmark comparisons with CP2077 just recently or Portal before that. We're gonna pretend those don't exist...
Can you please act any dumber?

Speaking of dumb, my comment was about Unreal 5 and Lumen. That’s what you replied to with your ass stats, before I posted real world benchmarks.

Cyberpunk isn’t Unreal 5 Lumen Schmendrick. Stop being such a Schmendrick, Schmendrick. The school bus will be pulling up any moment, and I don’t want you to be late. Don’t forget your lunch.
 

Schmendrick

Member
"Can you act any dumber?"

Speaking of dumb, my comment was about Unreal 5 and Lumen. That’s what you replied to with your ass stats, before I posted real world benchmarks.

Cyberpunk isn’t Unreal 5 Lumen Schmendrick. Stop being such a Schmendrick, Schmendrick. The school bus will be pulling up any moment, and I don’t want you to be late. Don’t forget your lunch.

Guess you can.
The discussion was about "RT" and "RT performance not mattering" according to you, with reference to Lumen... which ironically relies on hardware RT.
 