
[Digital Foundry] Nvidia Laptop RTX 3070 In-Depth: Can A Mobile PC Deliver Next-Gen Gaming Experiences? [DF]

GymWolf

Member
i got one... I will never use this shit in public..
to game on a boat..

i can max anything i play so far on 1080p at stable 144hz
not sure what serious gaming is.. but it's all the power i need for any game i tried so far.
I'm sorry what?
 

FStubbs

Member
Sure thing dude, i'm not gonna try changing your mind, but for the majority of people, buying a laptop for high end gaming is stupid as fuck, they are overpriced with worse performance and temps.

And casual gaming pretty much exists, unless you think that the guy who plays a couple of hours of FIFA every week is comparable to people who play every day, visit forums, open topics to discuss videogame-related stuff, buy pricey collector's editions, buy multiple games every month, etc.

Every hobby has different levels of commitment/money spent on it; it may sound cringe, but it is the reality.
Some of those people spend more time on the forums than they do actually playing the games LOL.

Casual gamer, serious gamer - the Wii era wants its lingo back.
 

CrustyBritches

Gold Member
What's the point in testing the 95w version when that's the worst value proposition out of all of the mobile GPU options? For the price the 3060 is a much better buy.
I bought a 3060 and 3070 laptop and ended up returning the 3070 laptop because it was only ~14% faster for 42% more money.
---
I don't think it's really worth buying a laptop as your primary gaming device; however, if you already need a laptop then it makes sense to get something with a decent GPU. With some headphones and a controller it can be a nice experience: 1080p (DLSS) + RT @ 120fps in DOOM Eternal looks and plays great. If you're into VR it's handy as well for pairing with a Quest 2 in Air Link/Virtual Desktop wireless mode and being able to play games from Steam or Viveport at a friend or family member's house.
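Price-to-performance math like that is easy to sanity check. A quick sketch (the prices below are hypothetical round numbers; only the 14%/42% figures come from the post above):

```python
# Compare perf-per-dollar for two laptops: one ~14% faster, ~42% pricier.
base_price = 1000.0               # hypothetical 3060 laptop price (USD)
base_perf = 100.0                 # normalized performance score

faster_price = base_price * 1.42  # 42% more money
faster_perf = base_perf * 1.14    # 14% faster

value_ratio = (faster_perf / faster_price) / (base_perf / base_price)
print(round(value_ratio, 2))      # 0.8: the pricier laptop delivers ~80%
                                  # of the cheaper one's perf-per-dollar
```

The hypothetical base price cancels out of the ratio, so the conclusion depends only on the two percentages.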
 

FStubbs

Member
I bought a 3060 and 3070 laptop and ended up returning the 3070 laptop because it was only ~14% faster for 42% more money.
---
I don't think it's really worth buying a laptop for your primary gaming device, however if you already need a laptop then it makes sense to get something with a decent GPU. With some headphones and a controller it can be a nice experience: 1080p(DLSS)+RT @120fps in DOOM Eternal looks and plays great. If you're into VR it's handy as well for pairing with a Quest 2 in Airlink/Virtual Desktop wireless mode and being able to play games from Steam or Viveport at friend or family member's house.
Yeah, that was why I passed on the 3070 mobile. There's a reason even nVidia is positioning the 3060 mobile as the "main" laptop card, though they were lying about it being 25% more powerful than a PS5.
 

Spukc

always chasing the next thrill
I'm sorry what?

when there is power available i can game :p
no need for next-gen consoles. Also they are way too clunky when traveling.

At home i just use my desktop. Or when i am really lazy as fuck i play some games in bed.
 

Kilau

Member
I really like my Asus with the 5900HX and 3070. It plays everything and is very quiet unless I crank up its profile. In the silent profile, the fans don't even turn on when I'm doing normal work and web browsing.

I still lean more towards playing on my series x or ps5 but I like the option for PC exclusives such as AoE 4 or my massive steam library.
 

CrustyBritches

Gold Member
Yeah, that was why I passed on the 3070 mobile. There's a reason even nVidia is positioning the 3060 mobile as the "main" laptop card, though they were lying about it being 25% more powerful than a PS5.
In laptop form-factor it's about as powerful as a desktop 2060 Super/2070. My 3060 (laptop) is locked, so you can only control the target temperature. Not that I'd want to overclock it, as it would probably ruin it. Mine sits at ~1450MHz core/14300MHz mem, while the desktop versions look like they can OC to 2000MHz core/17000MHz mem depending on the model (Guru3D).

In games with RT and/or DLSS it gives a very comparable experience to a PS5/XSX. I don't think consoles can even do 120fps+RT in DOOM Eternal. Overall though, it's not anywhere close to 25% more powerful. Probably the other way around.
 
Last edited:

KyoZz

Tag, you're it.
Crazy how many people think we are still in 2005.
Laptop gaming is just fine, I got one last year and I'm pretty happy with it.
i've not played minecraft so i might be wrong but i'm asking what exactly minecraft innovates? still got no answer.
So yeah, you don't know, and you shouldn't let the look of the game fool you. Have you ever heard about voxels? Google it, watch videos, etc...
Just to give you a hint, Minecraft has the biggest map ever in a game, it's even bigger than the Earth.
 

Darius87

Member
Just to give you a hint, Minecraft has the biggest map ever in a game, it's even bigger than the Earth.
a procedurally generated world isn't anything new. the players should get most of the credit for crafting the world, and that's down to minecraft's popularity, not anything innovative in the game itself.
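For readers following the argument, the property both posters are circling (a world that is a pure function of a seed, generated chunk by chunk on demand) can be shown in a toy sketch. The function name and hashing scheme here are made up for illustration; Minecraft's real generator uses layered noise, not this:

```python
import hashlib

def chunk_heightmap(seed: int, cx: int, cz: int, size: int = 4):
    """Derive a tiny heightmap for chunk (cx, cz) purely from the seed.

    Because the terrain is a deterministic function of (seed, cx, cz),
    nothing has to be stored up front: an effectively enormous world can
    be generated lazily, one chunk at a time, as the player reaches it."""
    heights = []
    for x in range(size):
        row = []
        for z in range(size):
            cell = f"{seed}:{cx * size + x}:{cz * size + z}".encode()
            row.append(hashlib.sha256(cell).digest()[0] % 32)  # height 0..31
        heights.append(row)
    return heights

# The same seed and chunk coordinates always reproduce the same terrain.
assert chunk_heightmap(12345, 0, 0) == chunk_heightmap(12345, 0, 0)
```

Whether lazily generating a seed-derived world counts as an "innovation" or just good engineering is exactly what the two posts above disagree on.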
 
I bought a lappy with a 3060, shit was louder than a jet. Not worth it.
Most gaming laptops are loud. Even the top high-end ones, since they have to be slim and sleek. Loud doesn't mean they overheat, unless you keep them on a bed or in a closed-off space.

Also, most people play with headphones, and then you don't notice it.


My brother got a Victus 3060 95w laptop. The laptop temperature never goes over 75 even after 3-4 hours, and it runs almost all games at a smooth 60 with max settings, and they all look better than the console versions.
 
Who buys a laptop for serious gaming?!
I know plenty of people using gaming laptops for serious gaming. Modern laptops are quite good with excellent thermals. You just have to do some research and buy a good one.


RTX 3060 and above laptops are beastly and can beat consoles in any game. They can run most modern games at ultra at 60fps.

RTX 3060 and 3070 laptops are mostly 1440p laptops, but thanks to DLSS they produce amazing image quality which beats consoles and delivers solid frame rates.
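To put numbers on that DLSS point: the "Quality" preset typically renders at about 2/3 scale per axis and reconstructs the rest, so the GPU shades far fewer pixels than native 1440p (the 2/3 figure is the commonly documented Quality-mode scale; exact ratios vary by preset):

```python
# Pixel budget at 1440p output with a 2/3-per-axis internal render scale.
out_w, out_h = 2560, 1440                    # 1440p output
in_w, in_h = int(out_w * 2 / 3), int(out_h * 2 / 3)

print(in_w, in_h)                            # 1706 960 internal resolution
saved = 1 - (in_w * in_h) / (out_w * out_h)
print(round(saved, 2))                       # 0.56 -> ~56% fewer pixels shaded
```

That shading saving is where the "1440p image quality at solid frame rates" claim comes from.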
 

Ozriel

M$FT
Sure thing dude, i'm not gonna try changing your mind, but for the majority of people, buying a laptop for high end gaming is stupid as fuck, they are overpriced with worse performance and temps.

People who want mobility/portability buy a Laptop. Basically, the reason why the form factor was invented in the first place. No sense in buying a gaming desktop if you’re often on the move.

I’m not sure why you needed this simple concept explained to you (should be obvious to anyone above 10 years of age), but this is probably the harsh reality of not implementing an age filter on this website.
 

Ozriel

M$FT
i've stopped watching when Richard said:

does it mean technology? because i don't see any PC game that innovates, most PC games are ports of console games; in general, gaming has become stagnant.
Sony made the PS5 with its I/O complex which could lead to gameplay innovations, windows has yet to release its counterpart, DirectStorage. how is PC at the forefront?

They’re talking about technology. And PC is usually at the forefront of innovation. VR gaming, DLSS/AI upscaling etc.
imagine going into a rage because you’re touting ‘revolutions’ that have yet to bear fruit.
 

GymWolf

Member
People who want mobility/portability buy a Laptop. Basically, the reason why the form factor was invented in the first place. No sense in buying a gaming desktop if you’re often on the move.

I’m not sure why you needed this simple concept explained to you (should be obvious to anyone above 10 years of age), but this is probably the harsh reality of not implementing an age filter on this website.
I know why laptops were invented and why people use them, i just think that the majority of people who are very invested in the videogame hobby still prefer playing big hyped games in their home with a proper pc or console, maybe on a big tv or a high-end monitor, with a good mechanical keyboard, etc.

I never, ever saw anyone playing more than fortnite or solitaire on a laptop. maybe it's something that americans do far more than europeans/italians like me, but i personally would never play a game that i waited maybe 5-10 years for on a fucking bus, or on a plane with other people close to me talking or making noise, or in a park with the sun reflecting on the screen, etc.


Call me a romantic or a nerd, but when i play i need some basic conditions to enjoy my play time.

I mean, it's not like buying laptops for heavy gaming is considered a great idea on any gaming forum in existence. 90% of the time you read people trying to discourage others from buying laptops for hardcore gaming, because unless you are literally never at home (hard to believe tbh) they are the worst choice: they cost more than a desktop with the same parts (or you can't even get the same parts to begin with), they have worse temps and worse integrated keyboards, and you can't even upgrade them like a regular desktop (unless i completely missed fully upgradable laptops, which are probably even more overpriced). their only selling point is literally being portable, stop.

no need for that age crap my dear, although i do like your abrasive style :lollipop_squinting:
 

Darius87

Member
They’re talking about technology. And PC is usually at the forefront of innovation. VR gaming, DLSS/AI upscaling etc.
imagine going into a rage because you’re touting ‘revolutions’ that have yet to bear fruit.
The first VR head-mounted display (HMD) system, The Sword of Damocles, was invented in 1968 by computer scientist Ivan Sutherland and his student Bob Sproull.
i could agree on DLSS for PC, what else?
sony invented: DVD (PS2), Cell Broadband Engine (PS3), Blu-ray (PS3), CB upscaling (PS4), I/O complex (PS5). another thing is consoles come out every 7-8 years while PC tech comes out more often.
 

winjer

Gold Member
i could agree on DLSS for PC what else?
sony invented: DVD(PS2), Cell BB engine(PS3), Blu-ray(PS3), CB upscaling(PS4), IO complex(PS5), another thing is consoles comes out every 7-8 years while PC tech comes out more often.

The PC introduced a lot of new technologies for consumers. Here are some recent examples:
Variable Rate Shading, Variable Refresh Rate (G-Sync and FreeSync), Mesh Shaders, RTX-IO, etc.
Even ray-tracing was just something for offline rendering, but now it's possible in real time, and it was introduced on the PC.

And let's remember that all those techs you mentioned were not invented by Sony alone. They were developed in conjunction with many other companies, some of which have strong connections to the PC, such as AMD and IBM.
 

Darius87

Member
The PC introduced a lot of new technologies, for consumers. Here's some recent examples.
Variable rate shading, Variable Refresh Rate (g-sync and free-sync), Mesh Shaders, RTX-IO, etc.
Even ray-tracing was just something for offline rendering, but now it's possible in real time, and it was introduced on the PC.
RTX-IO isn't an invention, same with RT; i agree on the other tech. The conclusion is that both console and PC manufacturers innovate and adopt each other's tech, but let's remember what Richard said:
and let's not forget as PC is forefront of innovation in gaming space always has been always will be.
And let's remember that all those techs you mentioned, were not invented just by Sony alone. They were developed in conjunction with many other companies, some of which have strong connections to the PC, such as AMD and IBM.
yes but sony was leading in most if not all of these innovations.
 

winjer

Gold Member
This in particular arrived on consoles first.

RTX-IO was already in Turing, almost 2 years before the new consoles were released.
The thing that was missing was Direct Storage from MS, for games to use it.

RTX-IO isn't invention same with RT, i agree on other tech. Conclusion is that both console and PC manufacturers innovates and adapt each other tech, but let's remember what Richard said:

yes but sony was leading in most if not all of these inovations.

Yes, RT is an old concept, and it has been used extensively in offline rendering.
But RT was never viable for real time rendering in games, until nVidia introduced RT cores with Turing.
This was the big innovation. And it was done on PCs.
Similar thing with RTX IO.

In some things Sony was leading, in others they were just a partner.
For example, the Cell CPU was done mostly by IBM.
 

Gamezone

Gold Member
I bought a lappy with a 3060, shit was louder than a jet. Not worth it.

I'm not a laptop guy, but in 2021 makers are still unable to replace air cooling with a silent alternative? I know these things are a lot more powerful than phones, but if you ignore iPhones, mobile cooling has gotten a lot better.
 
Last edited:

GymWolf

Member
Normally I would agree with you, and still think it's a shit proposition.

But seeing the prices of GPU's lately, and some "gaming laptops" are essentially giving you one for free. In some cases it "finally" makes sense to buy a laptop if you need a GPU.
I hardly think that people who play heavily on desktop are gonna buy a laptop just to have a nice gpu; you get one thing but give up many other things, and i don't know if people are okay with the tradeoff.

Unless they are in a super desperate situation and need a new pc asap.

I would never do that.
 

KyoZz

Tag, you're it.
procedurally generated world isn't anything new, players should get most merits for crafting the world that's because of minecraft popularity but not anything innovative in game.
The key word in my message was voxel.
The way Minecraft used this technology was revolutionary and opened the door for so many games. Please educate yourself, stop acting like a troll, and watch videos about the subject. Until then stop posting, because you clearly don't know what you're talking about and it's very cringe.
 

Md Ray

Member
RTX-IO was already in Turing, almost 2 years before the new consoles were released.
The thing that was missing was Direct Storage from MS, for games to use it.
I know, but they aren't the same. RTX IO uses GPU or more specifically the SMs (streaming multiprocessors) for decompression of the assets, on consoles (XS/PS5) they aren't using the CUs of the GPU, rather, there's dedicated silicon just for decompression purposes. Hardware-wise in this aspect consoles are ahead.

DirectStorage for Windows

[slide image]


Even software-wise PS5 already has its own DirectStorage-like new storage API in place and is already being utilized in many titles.


"No such solution for PC" meaning no dedicated decompression block like it does on the consoles.

[slide image]


But a purpose-built decompression unit akin to consoles is under development though... And it shall be implemented in the next-gen NVIDIA/AMD/Intel GPUs in the future.

[slide image]
 
Last edited:

winjer

Gold Member
I know, it isn't the same as what's available on the consoles. RTX IO uses GPU or more specifically the SMs (streaming multiprocessors) for decompression of the assets, on consoles (XS/PS5) they aren't using the GPU, there's dedicated silicon for decompression. Hardware-wise in this aspect consoles are ahead.

They are similar techs to move away from having the CPU do the heavy lifting in IO tasks.
The PC uses the GPU, consoles use a dedicated controller.
Yes, consoles, by having dedicated hardware, are more efficient.
But the reality is that the PC solution came first. Almost 2 years before consoles.
It was a shame that MS didn't have the software to take advantage. But the same could be said about Mesh Shaders, Sampler Feedback, HDR support, etc.
The hardware was ready, but the Windows team was lagging behind.

But maybe the next SSDs on PC will have something similar.
The PS5 just uses a Marvell controller. So maybe one day we will start to see SSDs with it.
 

Md Ray

Member
They are similar techs to move away from having the CPU do the heavy lifting in IO tasks.
The PC uses the GPU, consoles use a dedicated controller.
Yes, consoles, by having dedicated hardware, are more efficient.
But the reality is that the PC solution came first. Almost 2 years before consoles.
It was a shame that MS didn't have the software to take advantage. But the same could be said about Mesh Shaders, Sampler Feedback, HDR support, etc.
The hardware was ready, but the Windows team was lagging behind.

But maybe the next SSDs on PC will have something similar.
The PS5 just uses a Marvell controller. So maybe one day we will start to see SSDs with it.
It seems you haven't read or understood the rest of my post. The bespoke decompression hardware came on consoles first (yes, that includes the Xbox Series S), and it's built into the main SoC where the GPU+CPU are located. It's not a part of the SSD, nor is it a Marvell controller. And there's no such hardware that was ready on PC; NVIDIA is simply turning towards GPU SMs with RTX IO for the time being, until something similar to the consoles' solution is developed on the PC side (see the last slide from MS Game Stack in that post). And it is going to be incorporated into future GPUs, not in the SSDs.

Consoles are ahead in this regard.
 

winjer

Gold Member
It seems you haven't read or understood the rest of my post. The bespoke decompression hardware came on consoles first (yes, that includes Xbox Series S), and it's built into the main SoC where GPU+CPU are located. It's not a part of the SSD, nor is it a Marvel controller. And there's no such hardware that was ready on PC, NVIDIA is simply turning towards GPU SMs with RTX IO for the time being until something similar to console's solution is developed on the PC side (see the last slide from MS Game Stack in that post). And it is going to be incorporated into future GPUs, not in the SSDs.

Consoles are ahead in this regard.

And you didn't read my post.
What I said is that both techs have the same purpose of removing IO tasks from the CPU.
And like I said, yes, consoles are more efficient because they have dedicated hardware.
But the first solution of this type showed up on the PC. And then the consoles surpassed the PC solution.

And yes, the SSD controller is a part of the process. Be it on PC or consoles.
It's not just one specialized chip on the console. Everything is working together.
Who knows, maybe one day GPUs will have a dedicated unit added just for I/O, like they did for encoding, ray-tracing, AI, etc.
 
Last edited:

scalman

Member
If you have to have a laptop, having one that can play games is perfectly legitimate. You don't have to be rocking a 3090 desktop to be a PC gamer.

My 3060 mobile is great at 1080p gaming, and that's all I need.

Also, LOL at "serious gaming". There's no such thing, that's an oxymoron.
my gtx 1060 6gb laptop is still fine for gaming at 1080p. thats all i need. i'll never have a desktop again as i move around and take my laptop with me. so next will be a 3060 or 3070 then, but only when it's actually needed to even run games.
 

Kazza

Member
I recently bought a 3060 laptop, and I'm really happy with it. Like others have said, laptops are generally played with headphones, so the sound isn't an issue. For older games such as Yakuza 0 I can even play on silent mode and still easily get 60 fps. With so many newer games supporting DLSS, playing the latest and greatest on my living room 4k TV should be an option for the rest of the gen too. I guessed that my laptop was around PS5/XSX performance or higher, and it's great to get that pretty much confirmed in the video.

I have to admit I originally planned on building a desktop until the chip shortage hit, but now I'm really glad I opted for a laptop, as I needed one for other things too (business etc.).
 

Md Ray

Member
And you didn't read my post.
I did.
What I said is that both techs have the same purpose of removing IO tasks from the CPU.
And like I said, yes consoles are more efficient because they have dedicated hardware.
Yes.
But the first solution of this type, showed up on the PC.
No, that's where you're wrong. Here's a direct response from MS to that...

It can't get any clearer than this statement: "No such solution for PC"
[slide image]


The thing about hardware decompression is that it was announced first and showed up first on consoles. It was announced for both Sony & MS consoles in March of 2020, RTX IO was announced in Sept 2020, it still hasn't "showed up" until now and likely won't until DStorage is here.

Now, you might say Turing architecture has existed since 2018, but what you fail to understand is that there's no specialized, dedicated hardware for decompression on Turing or Ampere akin to consoles. RTX IO in its current form will simply make use of the normal GPU cores (SMs, the equivalent of AMD CUs), the SMs are a part of all NVIDIA cards (including GTX-based for over a decade, it's what contains CUDA cores, etc). For this reason, it doesn't make it "the first solution" on PC. The consoles had it first and PC will follow suit as they say here:

"possible future silicon implementations" basically means the console's silicon implementation for decompression will arrive sometime in the future on PC GPUs.
[slide image]
 

winjer

Gold Member
I did.

Yes.

No, that's where you're wrong. Here's a direct response from MS to that...

It can't get any clearer than this statement: "No such solution for PC"


The thing about hardware decompression is that it was announced first and showed up first on consoles. It was announced for both Sony & MS consoles in March of 2020, RTX IO was announced in Sept 2020, it still hasn't "showed up" until now and likely won't until DStorage is here.

Now, you might say Turing architecture has existed since 2018, but what you fail to understand is that there's no specialized, dedicated hardware for decompression on Turing or Ampere akin to consoles. RTX IO in its current form will simply make use of the normal GPU cores (SMs, the equivalent of AMD CUs), the SMs are a part of all NVIDIA cards (including GTX-based for over a decade, it's what contains CUDA cores, etc). For this reason, it doesn't make it "the first solution" on PC. The consoles had it first and PC will follow suit as they say here:

"possible future silicon implementations" basically means the console's silicon implementation for decompression will arrive sometime in the future on PC GPUs.

I'm not talking about HW compression and decompression.
I'm talking about I/O. These are two different techs.
Just because they affect storage, doesn't mean they are the same.

Look at my post, I clearly stated RTX IO from the very beginning.
And I have always been talking about I/O.

But if you want to talk about data compression in SSDs to improve data transfers, let me remind you of SandForce controllers, almost a decade ago.
 

Md Ray

Member
I'm not talking about HW compression and decompression.
I'm talking about I/O. These are two different techs.
Just because they affect storage, doesn't mean they are the same.

Look at my post, I clearly stated RTX IO, from the very beginning.
And have always been talking about I/O.

But if you want to talk about data compression in SSDs to improve data transfers, let me remind you of sandforce controllers, almost a decade ago.
I was also talking about I/O since the beginning, not SSD controllers. What do you think RTX IO really does? It is literally a tech that allows for asset decompression via GPU.

From NVIDIA Ampere whitepaper:
[slide image]
 

Md Ray

Member
Therefore, another innovation that started on PC.
Nope, you're wrong though.
The thing about hardware decompression is that it was announced first and showed up first on consoles. It was announced for both Sony & MS consoles in March of 2020, RTX IO was announced in Sept 2020, it still hasn't "showed up" until now and likely won't until DStorage is here.
 
Last edited:

Md Ray

Member
The thing is, RTX IO is supported in Ampere and Turing.
And Turing was released 2 years before.
Just because the announcement came later, means nothing.
I already touched that point before:
Now, you might say Turing architecture has existed since 2018, but what you fail to understand is that there's no specialized, dedicated hardware for decompression on Turing or Ampere akin to consoles. RTX IO in its current form will simply make use of the normal GPU cores (SMs, the equivalent of AMD CUs), the SMs are a part of all NVIDIA cards (including GTX-based for over a decade, it's what contains CUDA cores, etc). For this reason, it doesn't make it "the first solution" on PC. The consoles had it first and PC will follow suit as they say here:
GPU SM-based decompression is a quick-and-dirty solution due to the lack of console-like dedicated decompression units; silicon implementation takes time, which is touched upon in the MS Game Stack presentation slides (which you conveniently choose to ignore).

Consoles had them first, PC will have them at a later date. Nothing wrong with accepting that consoles are ahead of PC in this department atm.
[slide image]
 
Last edited:

winjer

Gold Member
I already touched that point before:

GPU SM-based decompression is a quick and dirty solution due to the lack of console-like dedicated decompression units as silicon implementation takes time which is touched upon in the MS Game Stack presentation slides (which you conveniently choose to ignore).

Consoles had them first, PC will have them at a later date. Nothing wrong with accepting that consoles are ahead of PC in this department atm.

Like I said before. RTX IO was the first of this type of tech.
But was surpassed by the solution on these new consoles, because they have dedicated hardware.

What was lacking on the PC was the software.
Just because MS has been more focused on making new icons for Windows than on expanding the OS feature set doesn't mean that PC wasn't there first with the tech.
It just means that the Windows team was slacking off.

If we go by that logic, then NGGP wasn't a PC tech that showed up first on PC, simply because DX12_2 wasn't ready yet.
Despite the fact that the adopted NGGP was made by nVidia, and was also featured on Turing 2 years prior.
 
Last edited:

OmegaSupreme

advanced basic bitch
So anyway. I have been looking at a Legion laptop with a 115 watt RTX 3060 and an 8-core AMD chip. I imagine it'll perform just as well as this one, if not slightly better?
 
I already touched that point before:

GPU SM-based decompression is a quick and dirty solution due to the lack of console-like dedicated decompression units as silicon implementation takes time which is touched upon in the MS Game Stack presentation slides (which you conveniently choose to ignore).

Consoles had them first, PC will have them at a later date. Nothing wrong with accepting that consoles are ahead of PC in this department atm.
[slide image]
Are you sure it's quick and dirty? After doing some reading it seems quite a competent solution.
 

FireFly

Member
Like I said before. RTX IO was the first of this type of tech.
But was surpassed by the solution on these new consoles, because they have dedicated hardware.
AFAIK Nvidia never spelled out what, if any, specialised hardware RTX IO requires. We know that DirectStorage will work on any DirectX 12 GPU.
 

Md Ray

Member
Are you sure it's quick and dirty? After doing some reading it seems quite a competent solution.
I'm sure it's competent. But it's still using precious GPU cycles for the loading and decompression of game assets; there is definitely going to be a cost involved, it won't be free in terms of perf. Otherwise, console manufacturers wouldn't have bothered to spend R&D money and valuable silicon real estate just to develop a separate decompression unit to take advantage of SSD speeds, they could have easily just used the RDNA 2 CUs instead.

PC is also heading in the same direction as the consoles though, using GPU SMs is just a stopgap solution until they move to dedicated HW just like consoles:

These slides summarize it perfectly:

-No such HW accelerated decompression solution for PC at present
-it takes time to build in the silicon, won't be available instantly
-you can't tell the industry to wait a few years until a similar HW arrives on PC (while it's already available on consoles) -> unacceptable
[slide image]


"MSFT + GPU vendor collaboration to innovate and create a GPU friendly compression solution that works on today's GPU HW".
Basically talks about RTX IO (and whatever is AMD's alternative) here which allows for GPU-based loading and decompression as a solution for the here and now.

"future silicon implementations" means an HW accelerated decompression solution is under development currently which will be similar to the console implementation.
[slide image]
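A rough back-of-envelope illustrates the cost being argued about here (every figure below is an illustrative assumption, not a measured console or drive spec):

```python
# How much decompression throughput an SSD-speed I/O pipeline must sustain.
ssd_raw_gbps = 5.5        # assumed raw SSD read rate, GB/s
compression_ratio = 2.0   # assumed average asset compression ratio
core_inflate_gbps = 1.0   # assumed software decompression rate per CPU core

effective_gbps = ssd_raw_gbps * compression_ratio
cores_equiv = effective_gbps / core_inflate_gbps

print(effective_gbps)     # 11.0 GB/s of decompressed data to deliver
print(cores_equiv)        # 11.0 -> roughly 11 cores' worth of work to absorb
```

Whichever unit does that work, the CPU, the GPU's SMs, or a fixed-function block, the budget has to come from somewhere; dedicated silicon just means it doesn't come out of the frame rate.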
 
Last edited:

GHG

Member
So anyway. I have been looking at a legion laptop with a 115 watt rtx 3060 and an 8 core amd chip. I imagine it'll perform just as well as this one if not slighty better?

Slightly better.

 

yamaci17

Member
I'm sure it's competent. But it's still using precious GPU cycles for loading and decompressing of game assets, there is definitely going to be a cost involved, it won't be free in terms of perf. Otherwise, console manufacturers wouldn't have bothered to spend R&D costs, using valuable silicon real estate just to develop a separate decompression unit to take advantage of the SSD speeds, they could have easily just used the RDNA 2 CUs instead.

PC is also heading in the same direction as the consoles though, using GPU SMs is just a stopgap solution until they move to dedicated HW just like consoles:

These slides summarize it perfectly:

-No such HW accelerated decompression solution for PC at present
-it takes time to build in the silicon, won't be available instantly
-you can't tell the industry to wait a few years until a similar HW arrives on PC (while it's already available on consoles) -> unacceptable
[slide image]


"MSFT + GPU vendor collaboration to innovate and create a GPU friendly compression solution that works on today's GPU HW".
Basically talks about RTX IO (and whatever is AMD's alternative) here which allows for GPU-based loading and decompression as a solution for the here and now.

"future silicon implementations" means an HW accelerated decompression solution is under development currently which will be similar to the console implementation.
[slide image]
it is likely that nvidia can offload this kind of work onto their tensor cores, but this is just speculation from me

maybe thats what they mean by "rtx io"... if not, well...

and then again, what will be the focus

1) real time decompressing of assets when you literally move your mouse?
or just
2) faster loading times or faster transitions

if its the 2, well, you can live without it. waiting 4-5 seconds for a teleport/fast travel instead of 1 second won't kill anybody

if its the 1, then it would render tons of hardware obsolete, since you cant spare gpu power for this decompression stuff and render the game itself at the same time. it would specifically render obsolete the whole rdna 2/rtx 2000/3000 gpu lines, which you would expect not to be obsolete for at least a healthy 4-5 years. i would say both sides will find a healthy solution that will make everyone happy tho

dunno how things will proceed tbh im very interested to see how stuff turns out
 
Last edited: