
Xbox Series X Won't Be Competitive With NVIDIA's DLSS Unless There's a MIRACLE ~ Digital Foundry

Thanks (y)
".....And to make up for the lack of information that comes from per-game networks, the company is making up for it by integrating real-time motion vector information from the game itself, a fundamental aspect of temporal anti-aliasing (TAA) and similar techniques. The net result is that DLSS 2.0 behaves a lot more like a temporal upscaling solution, which makes it dumber in some ways, but also smarter in others. "

"The single biggest change here is of course the new generic neural network. Looking to remove the expensive per-game training and the many (many) problems that non-deterministic games presented in training, NVIDIA has moved to a single generic network for all games "


At first glance, it seems like instead of having one supercomputer-trained network for each game separately, they created and are using a unified one that they feed and use for all games.
So if I understood correctly, the supercomputer is still a very crucial part for each and every game that offers DLSS, correct?
I probably read your comment wrong, and took it as if the supercomputer part was not needed anymore.
 
Except Horizon shows how bad an idea it is. 200-500k sales are not worth losing millions of console sales over. You're trading 500k sales for billions in revenue. It's the same mistake Microsoft made out of desperation to save the division from being shut down. Sony doesn't need this to save the division like Microsoft did.
HZD is a three-year-old game. It would never make the sales it did in 2017. I am sure Sony has metrics on that. Plus, it got the label of a "bad port".
The rest of your post, about the Xbox division being saved from shutting down, billions traded, etc., well, yeah, that's just your opinion, and it doesn't seem to correlate much with reality.
 
At first glance, it seems like instead of having one supercomputer-trained network for each game separately, they created and are using a unified one that they feed and use for all games.
So if I understood correctly, the supercomputer is still a very crucial part for each and every game that offers DLSS, correct?
I probably read your comment wrong, and took it as if the supercomputer part was not needed anymore.

If I understand it correctly, when a new game comes out, NVIDIA can implement DLSS without additional training on that specific game. So they don't have to do any additional supercomputing, which means that technically they don't need the supercomputer. But as they want to make the algorithm better, they will still frequently feed it new data.
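A rough sketch of that deployment model, with entirely hypothetical names (`GENERIC_WEIGHTS`, `load_model` are made up for illustration): one shared set of weights serves every title, and the supercomputer only re-enters the picture offline, when NVIDIA retrains and ships an updated checkpoint through the driver.

```python
# Hypothetical sketch of DLSS 2.0's deployment model (all names invented):
# one generic network for every game; per-game training is gone, and the
# supercomputer is only used offline to produce improved weight checkpoints.

GENERIC_WEIGHTS = "dlss_generic_v2.bin"  # shipped with the driver, not per game

def load_model(game_id: str) -> str:
    # DLSS 1.x style: a network trained per game (expensive, fragile).
    # per_game = f"dlss_{game_id}.bin"   # no longer needed in 2.0
    # DLSS 2.0 style: every game_id maps to the same generic checkpoint.
    return GENERIC_WEIGHTS

def offline_retrain(new_training_data) -> str:
    """Runs on NVIDIA's supercomputer, not on the player's GPU: fold new
    data into the generic network and emit the next driver checkpoint."""
    return "dlss_generic_v3.bin"

print(load_model("cyberpunk_2077"))   # dlss_generic_v2.bin
print(load_model("death_stranding"))  # dlss_generic_v2.bin -- same weights
```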
 

Papacheeks

Banned
Because it's not comparable. That marble demo would run at 240p on a PS5. It's that much more demanding compared to the UE5 demo.



If your entire point is that engines are so well optimized that you don't need DLSS, you picked the wrong example. I'd rather run the UE5 demo at 4K60 than 1440p30.

That was where I was going. And they are using movie assets, like 8K+ or something, in the UE5 demo. I know what NVIDIA showed was totally different. It would have been nice for it to be some kind of game engine demo, but I think they were trying to sell the RTX 3090 to prosumers as well, like CGI artists and possibly render farms.

My bad.

I was going off of the people I watch and talk to; a lot is going on behind the scenes in engines, and NVIDIA showed me nothing in terms of gaming relevancy in any of their demos. Just slides and a real-time render of something very impressive if you are a CGI artist or scene creator.
 
Except Horizon shows how bad an idea it is. 200-500k sales are not worth losing millions of console sales over. You're trading 500k sales for billions in revenue. It's the same mistake Microsoft made out of desperation to save the division from being shut down. Sony doesn't need this to save the division like Microsoft did.

They aren't losing any console sales by selling an old game to a non-console crowd, and by delivering a shitty port they committed the one cardinal sin of PC gaming, so of course the sales were mediocre at best.
 
and NVIDIA showed me nothing in terms of gaming relevancy in any of their demos.

How so? We got huge performance jumps all across the board. And DLSS further increases these jumps. As I said, that UE5 demo will likely run at 4K60 on a mid-tier graphics card with DLSS. How is that not relevant to your gaming experience?
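For a rough sense of the headroom DLSS creates (my arithmetic, using standard resolutions, not a benchmark): rendering internally at 1440p and reconstructing to 4K shades well under half the pixels per frame.

```python
# Back-of-the-envelope: shading work saved by rendering 1440p internally
# and reconstructing to 4K, DLSS-style. Plain pixel counts, not a benchmark.
native_4k = 3840 * 2160       # 8,294,400 pixels shaded per frame
internal_1440p = 2560 * 1440  # 3,686,400 pixels shaded per frame
print(f"{native_4k / internal_1440p:.2f}x fewer pixels shaded")  # 2.25x
```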
 
If I understand it correctly, when a new game comes out, NVIDIA can implement DLSS without additional training on that specific game. So they don't have to do any additional supercomputing, which means that technically they don't need the supercomputer. But as they want to make the algorithm better, they will still frequently feed it new data.
The first thing I quoted from your article, in bold, says differently; the second quote says "NVIDIA has moved to a single generic network for all games" (I forgot to bold that one)
:]
 

Papacheeks

Banned
How so? We got huge performance jumps all across the board. And DLSS further increases these jumps. As I said, that UE5 demo will likely run at 4K60 on a mid-tier graphics card with DLSS. How is that not relevant to your gaming experience?

I'm saying they didn't show FPS, just vague slides of the improvements in titles compared to Turing. They should have shown the UE5 demo or, if it was available, Frostbite's new engine running on a 3080/3090. They showed you stuff that's out now.

Not stuff people are buying these for, future-wise. Like, they should have shown the whole CYBERPUNK demo running on a 3080. They showed a trailer with ray tracing. Moore's Law was right: the drivers probably are not ready, or were not at the time of the taping of said NVIDIA presentation.
 
If you are primarily a PC gamer and want a console to complement your gaming selection, then a Switch or PS5 may be right for you. Some PC gamers, though, might find it more convenient to have an Xbox as well, because Xbox games persist across both ecosystems. Buy it once, own it on both, and play it on either. Clearly MS doesn't care where you play their titles.

However...

I think the vast majority of console gamers are just that, console-only gamers.
The choice between an Xbox or a PC isn't a dilemma for console-only gamers.
For console-only gamers, the existence of the PC is completely irrelevant.
I will, at no stage, consider purchasing a gaming PC.
I am not rare in the console gaming space.

Yes, OK, I agree that there is a sizeable contingent that are just console gamers. For these guys, Huang could pull out an RTX 3095 from under his leather jacket - freshly warmed up by the heat of his diddy body - and give it to 'em free, and they'd probably go home and try to plug it into the USB port on their Acer 13-inch laptop. A graphics card is useless to them.

My point was Ampere will take a chunk of gamers away from Series X especially, as PC has all its games and more.
 

quest

Not Banned from OT
Yes, OK, I agree that there is a sizeable contingent that are just console gamers. For these guys, Huang could pull out an RTX 3095 from under his leather jacket - freshly warmed up by the heat of his diddy body - and give it to 'em free, and they'd probably go home and try to plug it into the USB port on their Acer 13-inch laptop. A graphics card is useless to them.

My point was Ampere will take a chunk of gamers away from Series X especially, as PC has all its games and more.
Anyone who is skipping a Series X for Ampere was never getting an Xbox in the first place.
 

IntentionalPun

Ask me about my wife's perfect butthole
I don't see how a miracle can come close to the sheer horsepower of the hardware in these new NVIDIA cards.

Or hell, even the existing RTX cards. DLSS on your main cores is still worth it, but I don't think they'll be able to dedicate enough to even compete with the tensor cores on something like an RTX 2070 Super.

I'm just not expecting much from RT or these DLSS-like techs on next gen at all*

* that way if they knock it out of the park I'll be pleasantly surprised
 
I'm saying they didn't show FPS, just vague slides of the improvements in titles compared to Turing. They should have shown the UE5 demo or, if it was available, Frostbite's new engine running on a 3080/3090. They showed you stuff that's out now.

Not stuff people are buying these for, future-wise. Like, they should have shown the whole CYBERPUNK demo running on a 3080. They showed a trailer with ray tracing. Moore's Law was right: the drivers probably are not ready, or were not at the time of the taping of said NVIDIA presentation.

Yeah, drivers likely weren't ready. The thing is they don't need to show anything, just some benchmarks so we know how fast the cards are. They don't need to show the UE5 demo for us to know how it will run on a 3080. We can easily extrapolate that. They showed the marble demo because it's not possible to run it that way anywhere else. It was unique.
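The extrapolation being gestured at can be written out with the numbers this thread itself uses (UE5 demo at 1440p30 on PS5, PS5 roughly equal to a 2080, 3080 about 1.8x a 2080 in DF's benchmarks); every input here is a forum estimate, not a measurement.

```python
# Extrapolating the UE5 demo to a 3080, using only numbers cited in-thread.
# All inputs are forum estimates, not measurements.
ps5_fps_1440p = 30          # UE5 demo on PS5 (as discussed above)
ampere_vs_turing = 1.8      # 3080 vs 2080 in DF's benchmarks (per the post)
ps5_vs_2080 = 1.0           # "roughly equate a PS5 to a 2080"

fps_3080_1440p = ps5_fps_1440p * ampere_vs_turing / ps5_vs_2080
print(f"~{fps_3080_1440p:.0f} fps at native 1440p")  # ~54 fps
# With DLSS reconstructing 1440p -> 4K, the internal render cost stays
# near 1440p, which is how the "4K60 with DLSS" claim is reached.
```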
 

Papacheeks

Banned
Yeah, drivers likely weren't ready. The thing is they don't need to show anything, just some benchmarks so we know how fast the cards are. They don't need to show the UE5 demo for us to know how it will run on a 3080. We can easily extrapolate that. They showed the marble demo because it's not possible to run it that way anywhere else. It was unique.

Yeah, I mean if you're a 3D artist, that demo probably made the case for buying a 3090. My point is Ampere did nothing, outside of showing you numbers, on how it stacks up against next gen. And I would argue PlayStation has shown its feature set more clearly, with what's already in use.

Data streaming using the GPU cores on NVIDIA's side isn't even ready until next year, to go with DirectStorage from Microsoft's DirectX.

So you're going to see things on the consoles like instant game loading and instant game switching that still won't be present on PC for some time. Hence why I think they were trying to cover two different buying demographics in that presentation.
 
😂👌 Knew you would find an excuse.
What is "excuse"?
If you want 4K gaming, you buy a Series X and get the best IQ & performance on consoles. Period.
If you are gaming on a 1080p panel, we will see after launch how much difference there is between the PS5 and the Series S.
Oh, and
If you don't want your questions answered, then don't ask me ;]
 

Data Ghost

Member
As impressive as the 3090 is, I have no intention of switching to PC (and all of the annoying PC baggage that comes along with it). Also, for some reason gaming PC setups make me cringe.
 

IFireflyl

Gold Member
As impressive as the 3090 is, I have no intention of switching to PC (and all of the annoying PC baggage that comes along with it). Also, for some reason gaming PC setups make me cringe.

I'm sorry... what? What "baggage" does PC have?

Also, how do gaming PC setups make you cringe? A setup is whatever you want it to be. I have my desktop computer next to my 65" TV, and I use a wireless keyboard and mouse (or a wireless Xbox One controller when I'm playing games) from my recliner. How is that more cringe-worthy than having a console next to a monitor/TV and using a wired or wireless controller?
 

FranXico

Member
What is "excuse"?
If you want 4K gaming, you buy a Series X and get the best IQ & performance on consoles. Period.
If you are gaming on a 1080p panel, we will see after launch how much difference there is between the PS5 and the Series S.
Oh, and
If you don't want your questions answered, then don't ask me ;]
PS5 is a 4K console. If the Series S targets 1080p, there's no point in comparing the two.
 

martino

Member
I probably misunderstood, but isn't the Tegra X1 supposed to be able to upscale to 4K@60fps using DLSS?
If that's the case, we can deduce a very modest cost for using equivalent tech.
In that scenario, I think there is no reason for all of them not to use equivalent tech.
 
XSX is trying to sell itself on graphical performance; that was the pitch.

These cards are more damaging to that, IMO.
Oh dear god, this exact same thing is said every single time a new card is announced, and it's always wrong. People who play on console will get a console, and people who play on PC will buy a graphics card that suits their budget. They can both co-exist and serve a similar but different purpose.

Shit, how many consoles sold this generation at a whopping 1+ TF? If you combine the Xbox, PS4 & Switch, there are probably somewhere around 200,000,000 console players out there.

But do go on about how a graphics card will suddenly crumble a billion-dollar market.
 

FranXico

Member
The reality is that both the PS5 and Xbox Series X are very low-end compared to the RTX 3080 and 3090.

And yet some people are arrogantly pretending that the Series X comes close to them somehow. As if 2 TF is a huge difference, but 18 TF and 24 TF are not.
 

SF Kosmo

Al Jazeera Special Reporter
I share DF's wish that we get some kind of open or standardized version/alternative. DLSS is such a game-changer that its proprietary status is a real problem.
 

PaNaMa

Banned
Thread title rubs me the wrong way. Xbox Series X was never advertised as having DLSS, was it? Like, neither console was. Can we rename the thread "DF - Microsoft's new console doesn't have DLSS, the very thing no one ever said it was gonna have"? Then in the body we can talk about how, since the hardware wasn't built with NVIDIA's proprietary DLSS technology, it couldn't possibly have DLSS in it - unless there was a miracle whereby all Series Xs were impregnated by the Holy Spirit en route to their retail location.

Neither console is going to compete with my snowblower for clearing my driveway either, 'cause they won't ship with a frickin' auger - and I wasn't expecting them to. They know NVIDIA makes DLSS, not AMD. Is DF's point simply that the new consoles do not feature whatever AMD's version of DLSS is going to be? If so, why not just phrase it that way?
 
The reality is that both the PS5 and Xbox Series X are very low-end compared to the RTX 3080 and 3090.

And yet some people are arrogantly pretending that the Series X comes close to them somehow. As if 2 TF is a huge difference, but 18 TF and 24 TF are not.
Both the VW Golf and the VW Golf GTI are very low-end compared to a Porsche 718 Boxster.
When you compare the two VWs, though, the story is quite different.
 

SF Kosmo

Al Jazeera Special Reporter
Thread title rubs me the wrong way. Xbox Series X was never advertised as having DLSS, was it? Like, neither console was. Can we rename the thread "DF - Microsoft's new console doesn't have DLSS, the very thing no one ever said it was gonna have"? Then in the body we can talk about how, since the hardware wasn't built with NVIDIA's proprietary DLSS technology, it couldn't possibly have DLSS in it - unless there was a miracle whereby all Series Xs were impregnated by the Holy Spirit en route to their retail location.

Neither console is going to compete with my snowblower for clearing my driveway either, 'cause they won't ship with a frickin' auger - and I wasn't expecting them to. They know NVIDIA makes DLSS, not AMD. Is DF's point simply that the new consoles do not feature whatever AMD's version of DLSS is going to be? If so, why not just phrase it that way?
Their point is just that AMD and MS are working on their own solution, but it is not likely to be competitive in terms of image quality, because the consoles don't have the right hardware to accelerate it.

DLSS is definitely a killer feature, especially for things like ray tracing that are too expensive to do natively at 4K. And related technologies like AI denoising can be used to do RT more sparsely/cheaply than would otherwise be necessary.
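A crude illustration of the denoising point, with a box blur standing in for the AI denoiser (the scene, noise level, and filter here are all stand-ins of mine, not NVIDIA's denoiser): tracing one noisy sample per pixel and filtering gets close to a clean image for a fraction of the ray cost.

```python
# Why denoising makes RT cheaper: instead of tracing many rays per pixel,
# trace one noisy sample and let a filter reconstruct the clean image.
# A learned denoiser replaces this box blur; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
clean = np.linspace(0, 1, 256)[None, :].repeat(256, axis=0)  # stand-in scene

spp1 = clean + rng.normal(0, 0.3, clean.shape)  # 1 sample/pixel: very noisy

def box_denoise(img, r=4):
    """Crude spatial filter standing in for an AI denoiser."""
    pad = np.pad(img, r, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += pad[r + dy : r + dy + img.shape[0],
                       r + dx : r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

err_noisy = np.abs(spp1 - clean).mean()
err_denoised = np.abs(box_denoise(spp1) - clean).mean()
print(f"1 spp error: {err_noisy:.3f}, denoised: {err_denoised:.3f}")
```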
 
I really wish people would stop trying to cross-compare teraflops across architectures. It's useless if you are not comparing the same architecture. The 3080 is 30 teraflops, but it is not 3x the power of the PS5. It's 2x the power of the PS5 if you roughly equate a PS5 to a 2080.
 
Terrible analogy
 

Dane

Member
"PS5 is in a class of its own since you still need a PS5 console to play most of their games "

This is cope
 

FranXico

Member
Thread title rubs me the wrong way. Xbox Series X was never advertised as having DLSS, was it? Like, neither console was. Can we rename the thread "DF - Microsoft's new console doesn't have DLSS, the very thing no one ever said it was gonna have"? Then in the body we can talk about how, since the hardware wasn't built with NVIDIA's proprietary DLSS technology, it couldn't possibly have DLSS in it - unless there was a miracle whereby all Series Xs were impregnated by the Holy Spirit en route to their retail location.

Neither console is going to compete with my snowblower for clearing my driveway either, 'cause they won't ship with a frickin' auger - and I wasn't expecting them to. They know NVIDIA makes DLSS, not AMD. Is DF's point simply that the new consoles do not feature whatever AMD's version of DLSS is going to be? If so, why not just phrase it that way?
They've used the term "engineering miracle" before, and it was about BC. They clearly knew what was in the pipeline then, and they know what's in the pipeline now.

They're just setting the Series X equivalent of DLSS up for a moment of triumph.
 

SF Kosmo

Al Jazeera Special Reporter
I really wish people would stop trying to cross-compare teraflops across architectures. It's useless if you are not comparing the same architecture. The 3080 is 30 teraflops, but it is not 3x the power of the PS5. It's 2x the power of the PS5 if you roughly equate a PS5 to a 2080.
Well, it's one metric. It measures a kind of power. Practical performance is much more complicated. The first half of what you are saying is correct, but the second half is just as meaningless. What constitutes "power"?

The whole notion of practical performance is getting increasingly complicated when we're talking about essentially four totally different types of compute architectures in these systems.
 
Well, it's one metric. It measures a kind of power. Practical performance is much more complicated. The first half of what you are saying is correct, but the second half is just as meaningless. What constitutes "power"?

The whole notion of practical performance is getting increasingly complicated when we're talking about essentially four totally different types of compute architectures in these systems.

When I said power, I meant performance. The PS5 may or may not be as performant as a 2080, but it's hanging around there. The 3080 in DF benchmarks is 80 percent more performant than a 2080. So 2x a PS5 in performance sounds accurate to me, excluding ray tracing.
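Putting that post's own numbers side by side shows why the raw TF ratio misleads (the TF figures are the commonly quoted peak FP32 numbers; the 1.8x is the DF benchmark ratio cited above):

```python
# Why cross-architecture TF comparisons mislead, using the thread's numbers.
ps5_tf, rtx3080_tf = 10.3, 29.8  # commonly quoted peak FP32 figures
tf_ratio = rtx3080_tf / ps5_tf
measured_ratio = 1.8 * 1.0       # 3080 = 1.8x a 2080, PS5 ~= a 2080

print(f"TF ratio:       {tf_ratio:.1f}x")        # ~2.9x on paper
print(f"Measured ratio: {measured_ratio:.1f}x")  # ~1.8-2x in practice
# Ampere counts FP32 differently (doubled FP32 per SM), so a ~3x TF gap
# shows up as roughly a 2x gap in real frame rates.
```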
 

martino

Member
When I said power, I meant performance. The PS5 may or may not be as performant as a 2080, but it's hanging around there. The 3080 in DF benchmarks is 80 percent more performant than a 2080. So 2x a PS5 in performance sounds accurate to me, excluding ray tracing.
It's possible the gap between the 20-series and 30-series will widen once games use tech that current games don't (the gap vs the 10-series certainly will).
 
Well, I have spoken too, way before you spoke. Maybe you missed it?

Some facts:

Going by the pompous and impressive numbers NVIDIA released yesterday,
the new Xbox has ~55% of the combined GPU+RT capability that the 2080 Ti has.
Until yesterday, the 2080 Ti was the undisputed king of graphics and the undisputed king of RT,
and no matter how you look at it, the PS5 cannot touch the Xbox's numbers even in Mark Cerny's dreams.

It's not just the 19% difference in TF.
There is also a 44% difference in CUs,
and a further 25% difference in GPU bandwidth.*
These things add up, you know (the arithmetic is checked against the public specs below)...

*I hope most of you who praise the 3080 with its 10GB of RAM got the message that the Xbox's 560GB/s 10GB of RAM is enough.
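For reference, a quick check of those percentages against the commonly quoted specs (Series X: 12.15 TF, 52 CUs, 560 GB/s on the fast 10 GB pool; PS5: 10.28 TF, 36 CUs, 448 GB/s); the ~19% TF figure comes out closer to 18%:

```python
# Sanity-checking the quoted gaps against the publicly stated specs.
xsx = {"tf": 12.15, "cus": 52, "bw_gbps": 560}  # Series X (fast 10 GB pool)
ps5 = {"tf": 10.28, "cus": 36, "bw_gbps": 448}

for key in xsx:
    gap = (xsx[key] / ps5[key] - 1) * 100
    print(f"{key}: +{gap:.0f}%")
# tf: +18%   cus: +44%   bw_gbps: +25%
```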

The Xbox also has confirmed hardware machine learning in the box, with plenty of custom controllers to aid it,
and Microsoft has - a very crucial part of the DLSS 2.0 procedure - a vast array of offline compute for deep learning super sampling, or deep learning in general.

I don't want to write too much - I've learned that very rarely do people read or understand more than a couple of catchy lines - but anyway, these are some of the things that give the Xbox a great advantage over the PS5.


Also keep in mind that the Xbox will have at launch the IO-GPU advantages NVIDIA demonstrated yesterday,
while the RTX 3000 series will have to wait for Microsoft to deliver DirectStorage to them.

 