
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

Ellery

Member
I tend to react to hypocrisy.

Okay. Calm down and then please explain to me what exactly you mean, because I feel like you are misinterpreting what I said or confusing something I said with something someone else said.
If you think there is any fanboyism going on on my end then please reread my posts in here, because I shit on both Nvidia and AMD where they deserve it. I buy the products I think are best; my last card was an R9 290X, which I owned for many years and was extremely impressed by. I also owned Nvidia GPUs.

Sorry I can't figure out your angle or what makes you upset.

Would anyone recommend selling my 5700 XT right now and focusing on the new cards instead? What do you say?

Depends on what monitor/resolution you have and what you plan on playing in the future.

I think the cards we are about to hear about are going to be more expensive than the 5700 XT, unless they are releasing an RTX 3060 as well. If that is the case I'd expect the second-hand price of the 5700 XT to drop a bit, but I am reaching here. The RTX 3060 might also be further down the line.

The 5700 XT is a great card, but I think it will be dwarfed by the upcoming RTX 30 series and also the Big Navi products (however many there are) not only from a performance standpoint but probably a feature set standpoint as well.
 

YCoCg

Member
For the next few years 10GB is enough for 1440p.
We've got games already hitting close to that NOW. What the hell is going to happen with newer games that push even more?

Time was, you could get the 80-series card and be set for a generation; now you pay more for less? You can't even do 4K without running out of VRAM.
 
I have been thinking about GPU pricing (that /r/hardware post regarding Turing pushed me even further toward skipping this gen).

I don't like the 10GB idea, and it seems like the 3080 is basically a 2080 Ti. Honestly, my overclocked 2070 is doing just fine at 3440x1440@100Hz for World of Warcraft and Civ VI at the end of the day.

The PS5 coming with 16GB (14GB of it for games) at $499 is a better deal for me this year alone. I might think this over more, but it sounds like I'm going with the PS5 this fall, especially with the crazy games I wanna play.

I'll keep watching this thread, but this gen seems like SUPER bait for later in the year or the next one.
 

Ellery

Member
I have been thinking about GPU pricing (that /r/hardware post regarding Turing pushed me even further toward skipping this gen).

I don't like the 10GB idea, and it seems like the 3080 is basically a 2080 Ti. Honestly, my overclocked 2070 is doing just fine at 3440x1440@100Hz for World of Warcraft and Civ VI at the end of the day.

The PS5 coming with 16GB (14GB of it for games) at $499 is a better deal for me this year alone. I might think this over more, but it sounds like I'm going with the PS5 this fall, especially with the crazy games I wanna play.

I'll keep watching this thread, but this gen seems like SUPER bait for later in the year or the next one.

Well to be fair you can't compare the setup of a console like that to a graphics card, but I agree with you that you should keep your 2070 for those games you listed.

I will do the same. Buying the PS5 and keeping my current card for my 1440p 144hz monitor.

We've got games already hitting close to that NOW. What the hell is going to happen with newer games that push even more?

Time was, you could get the 80-series card and be set for a generation; now you pay more for less? You can't even do 4K without running out of VRAM.

Well, first, games allocate more VRAM than they actually need. What monitoring tools show is allocation, not actual usage, so it is not indicative of being "close to hitting the VRAM bottleneck".
My 8GB card outperforms the 11GB 1080 Ti in every 4K game I have seen so far.
Faster VRAM helps a lot and the jump for the 30 series is extremely big in that regard.

The time of the x80-series card being the flagship is long gone, and Nvidia has always been very stingy with VRAM, especially in comparison to AMD.
The 1.5GB GTX 580
The 2GB GTX 680
The 3GB GTX 780
Even the 3GB GTX 780 Ti

And once the reviews come out, the RTX 3080 10GB will completely outperform the 24GB Titan RTX in probably every single 4K game. But then again, I think the 3080 is the 1440p card for Nvidia and the 3090 is the card aimed at 4K gaming.
 

GymWolf

Member
  1. For the next few years 10GB is enough for 1440p.
  2. Good 4K monitors are expensive. Nvidia knows their audience, and they can milk 4K gamers for more money with the 3090. Those people have no alternative anyway.
  3. Having just 10GB is cheaper for Nvidia. VRAM is one of the most expensive things for them.
  4. A 10GB card puts people on a more frequent upgrade interval, because they sooner realize that the VRAM is a potential bottleneck.
  5. People look at performance first in reviews and then at the price. The price would go up with more VRAM, but the performance wouldn't.
  6. GPU reviews are a reflection of the current PC gaming landscape, with titles from the last few years; the 3080 10GB will be enough for those, and reviewers won't run into VRAM shortcomings.
  7. Based on that, people are going to see that 10GB is enough for them currently. What the future holds, nobody knows.
  8. The 10GB 3080 gives a very clear distinction from the 24GB 3090, and it makes the extremely expensive $1,500 RTX 3090 look like a reasonable buy because it flaunts so much VRAM. Basic consumer psychology.
  9. It leaves more room for future cards that may slot in between the 3080 and 3090 to be impressive. Nvidia can go with anything between 10 and 20GB then and people will be impressed.
Isn't the 3080 a 4K GPU? I mean, if it's 30% more powerful than a 2080 Ti, the horsepower is there... it's not like the 3090 is the only 4K-capable GPU...
 

Ellery

Member
Isn't the 3080 a 4K GPU? I mean, if it's 30% more powerful than a 2080 Ti, the horsepower is there... it's not like the 3090 is the only 4K-capable GPU...

Games are going to be more demanding as well. When the first Titan came out, people screamed from the rooftops that this greedy $1,000 product was the first 4K GPU, and today I wouldn't even use that piece of junk for 1080p.

Running native 4K is so extremely taxing that Nvidia saw the need to bring DLSS and they also see the need to release an absolute monster GPU with 350W.

In the end it is a little more complex, because many GPUs can easily run 4K if you know your way around graphical settings and the RTX 3080 will be great at 4K yeah.

But there is this unwritten law that in order to qualify as a 4K GPU you need to run native 4K with all settings maxed out and then be able to hold 60fps. (This is a bit of banter, but there is also truth to it)
 

GymWolf

Member
Games are going to be more demanding as well. When the first Titan came out, people screamed from the rooftops that this greedy $1,000 product was the first 4K GPU, and today I wouldn't even use that piece of junk for 1080p.

Running native 4K is so extremely taxing that Nvidia saw the need to bring DLSS and they also see the need to release an absolute monster GPU with 350W.

In the end it is a little more complex, because many GPUs can easily run 4K if you know your way around graphical settings and the RTX 3080 will be great at 4K yeah.

But there is this unwritten law that in order to qualify as a 4K GPU you need to run native 4K with all settings maxed out and then be able to hold 60fps. (This is a bit of banter, but there is also truth to it)
I have a question: when you use DLSS, is VRAM used in the same quantity as at native 4K? (I already suspect the answer, but better to ask anyway.)

I used DLSS a couple of times, but I'm not really a guy who plays with Afterburner stats on the side of the screen...
 
10GB is enough memory now, sure.

But on an $800-900 card, it takes the piss. 16GB should be the minimum on a card 20% faster than the 2080 Ti, something someone might use for 3-4 years.
 

Ellery

Member
I have a question: when you use DLSS, is VRAM used in the same quantity as at native 4K? (I already suspect the answer, but better to ask anyway.)

I used DLSS a couple of times, but I'm not really a guy who plays with Afterburner stats on the side of the screen...

DLSS at 4K would use less VRAM than native 4K, since the game renders internally at a lower resolution before upscaling.
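Rough back-of-envelope, if anyone wants numbers: DLSS renders internally at a fraction of the output resolution and upscales, so the big render targets shrink. The scale factors below are the commonly cited ones for DLSS 2.0's quality modes; actual VRAM savings vary per game, since textures still get loaded at full quality.

```python
# Back-of-envelope: internal render resolution per DLSS 2.0 quality mode,
# plus the rough size of a single RGBA16F render target at that resolution.
# Scale factors are the commonly cited per-axis ratios; treat them as approximate.

OUTPUT = (3840, 2160)   # native 4K output
BYTES_PER_PIXEL = 8     # RGBA16F: 4 channels * 2 bytes

MODES = {
    "Native":            1.0,
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 1 / 3,
}

for mode, scale in MODES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    mib = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{mode:>17}: {w}x{h}  (~{mib:.0f} MiB per RGBA16F target)")
```

So the frame buffers and most post-processing targets get a lot smaller, but the texture pool doesn't, which is why the saving is real but not proportional to the pixel count.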
 

Ellery

Member
Good, let's just hope that every game is gonna support this technology at launch in the near future. This is gonna be the most important thing in PC gaming, more than RTX.

I hope so too. What I have seen from DLSS 2.0 in Death Stranding, Control and F1 2020 is impressive and makes me even more excited for Cyberpunk 2077
 

GymWolf

Member
I hope so too. What I have seen from DLSS 2.0 in Death Stranding, Control and F1 2020 is impressive and makes me even more excited for Cyberpunk 2077
I only tried the previous version in Control and it was not that impressive. I never tried the 2.0 version, but people are ripping their balls out over it, so I guess it's noticeably better.

Do we know if Cyberpunk will support DLSS at launch? (And I want to put the emphasis on LAUNCH.)
 

Ellery

Member
I only tried the previous version in Control and it was not that impressive. I never tried the 2.0 version, but people are ripping their balls out over it, so I guess it's noticeably better.

Same. When I played Control and Metro Exodus it was still DLSS 1.0, but then I saw Control with DLSS 2.0 at a friend's place and the difference was night and day. It is so much better and sharper than the blurry mess that was DLSS 1.0.
 

carsar

Member
Well, first, games allocate more VRAM than they actually need. What monitoring tools show is allocation, not actual usage, so it is not indicative of being "close to hitting the VRAM bottleneck".
The easiest way to detect a VRAM bottleneck is to monitor the "bus interface load".
0-3% is OK: no VRAM bottleneck at all.
At 10-20%, you can lose about 10-20% of performance.

For example, if I run Assassin's Creed Origins on my 980 Ti at 4K ultra, I can play at 30+ fps with the bus loaded at 0-3%. After 10-30 minutes my frame rate drops by 10% in the same places, and the bus load increases to 10%. I can decrease the resolution to 1080p and then return to 4K; that brings the bus load back to 0-3% and restores my frame rate. Yes, that is bad memory management resulting in memory leaks, but no one is going to fix those problems for me; only adding VRAM could fix this.
And I get no freezes, no texture pop-in, only a 10% fps loss.
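If you don't want Afterburner or GPU-Z on screen, you can log roughly the same thing from a script with NVML. Here is a sketch using the pynvml package; note that GPU-Z's "Bus Interface Load" percentage is computed differently, so the percentage below (raw PCIe throughput against an assumed 3.0 x16 link) is only a rough proxy for spotting the pattern I described.

```python
# Rough VRAM / PCIe-bus logger via NVML (pip install pynvml).
# Rising PCIe traffic alongside a full VRAM pool is the tell-tale sign of
# the GPU spilling over into system memory.
import time
import pynvml

PCIE3_X16_BYTES_PER_SEC = 15.75e9  # assumed link config: theoretical one-way max

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # NVML reports PCIe throughput in KB/s, sampled over a short window
        rx = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_RX_BYTES)
        tx = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_TX_BYTES)
        bus_pct = (rx + tx) * 1000 / PCIE3_X16_BYTES_PER_SEC * 100
        print(f"VRAM {mem.used / 2**30:5.2f} / {mem.total / 2**30:.2f} GiB   "
              f"PCIe ~{bus_pct:4.1f}% of a 3.0 x16 link")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```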
 

YCoCg

Member
Well, first, games allocate more VRAM than they actually need. What monitoring tools show is allocation, not actual usage, so it is not indicative of being "close to hitting the VRAM bottleneck".
My 8GB card outperforms the 11GB 1080 Ti in every 4K game I have seen so far.
Faster VRAM helps a lot and the jump for the 30 series is extremely big in that regard.

The time of the x80-series card being the flagship is long gone, and Nvidia has always been very stingy with VRAM, especially in comparison to AMD.
The 1.5GB GTX 580
The 2GB GTX 680
The 3GB GTX 780
Even the 3GB GTX 780 Ti

And once the reviews come out, the RTX 3080 10GB will completely outperform the 24GB Titan RTX in probably every single 4K game. But then again, I think the 3080 is the 1440p card for Nvidia and the 3090 is the card aimed at 4K gaming.
But that's just admitting that people are paying MORE now and not getting what they used to. In what world is a 3080 meant to be a 1440p card when the 2080/Super/Ti was a base 4K card? And even then you've got the next-gen consoles hitting higher. If I want 1440p I can just grab a PS4 Pro for like $250, not spend $1,200+ on a GPU.
 

Ellery

Member
But that's just admitting that people are paying MORE now and not getting what they used to. In what world is a 3080 meant to be a 1440p card when the 2080/Super/Ti was a base 4K card? And even then you've got the next-gen consoles hitting higher. If I want 1440p I can just grab a PS4 Pro for like $250, not spend $1,200+ on a GPU.

Yep we are also paying more for other things. I hate it and I vote with my wallet as much as I can.

Nvidia dictates the price with their near monopoly, and the people who buy those cards help continue the disproportionate price hikes. (This is oversimplified, because R&D costs are skyrocketing and the cards are bigger nowadays.)

Also, games are getting more demanding. You won't be able to play the same games on the PS4 Pro as you will with a new GPU. Just like The Witcher 3 is a joke to run now but wasn't in 2015, we will have games in 2021-2025 that make the new 30-series cards look old and outdated, because newer games require more graphical performance. Maybe in 2023 the RTX 3080 won't even be enough for 1440p anymore. Who knows.

I don't like comparing consoles to PCs and especially graphics cards in that way as to measure their visual quality to resolution only. You can easily grab a 4.2 TF GPU for a PC and have games running at 1440p with roughly the same settings as the PS4 Pro. Like a 1650 Super or 5500 XT maybe.
 

Rikkori

Member
10GB is absolutely not enough for 4K gaming even today. People who dispute it are simply ignorant. Another thing about VRAM is that it's a free image quality improvement, like HDR. A lot of stupid people kept saying for years, "ah, but why do you need more VRAM? If you're running bla bla resolution, by the time you need more VRAM your GPU won't be fast enough," and other such nonsense. Except you can absolutely run games at high resolutions with less than high-end cards and enjoy superb IQ, PRECISELY because you can have VRAM do so much work in the form of more advanced textures or better streaming distance/details.
And NO, you can't use compression to overcome the VRAM size limit. The compression advances are there to help with bandwidth, not size. Screwing faster won't compensate for having a smaller dong. Same for compression vs. VRAM buffer.

And it's not like it's not been proven over & over again, through HBCC tests and the like. The examples keep piling up - and don't think you're safe at 1440p!

FC5 with the HD texture pack, even at 1440p, will run into and past the 8 GB limit, so you'll see hitches from time to time. At 4K it runs well into 10+ GB. Same story for similar games with HD packs, like FFXV, etc.
Doom Eternal at 4K absolutely hitches with only 8 GB.
Wolfenstein II needs more than 8 GB at 4K if you turn up the streaming.
Flight Sim 2020, do I even need to mention it? Beyond obvious example.
Greedfall, an AA game, but one with excellently detailed textures and good draw distances? 10 GB VRAM, 12 GB RAM.
Open-world games in general all start to run into VRAM requirements hardcore. HZD? 10 GB VRAM and 12 GB RAM. The Division 2? 10.5 GB VRAM and 12.5 GB RAM.
Ray tracing? Easily +1-2 GB added to anything.
Don't even get me started on modding, or we'd be here all day.

And on and on it goes. And what, is someone looking at the geometric detail of Far Cry 5 and telling me that that's the most we can expect of next-gen? When it looks so current-gen it seems like it's from an era ago? Let's be real. As soon as studios make the next-gen-only switch for their games, the VRAM requirements will skyrocket just to KEEP UP with consoles, let alone if you want to go a step above. And why would you pay $800+ on a GPU alone if you didn't?

Let's keep it real - Nvidia is selling it for "only $800" precisely because they gimp the shit out of the VRAM. They're offering a tier above the 2080 Ti, which currently goes for $1,200-ish, for less, exactly so that they can keep raising prices a tier higher again. So if you want a card that will perform well AND have longevity (through VRAM), then you have to pay extra. They sure as hell wouldn't want you to skip an upgrade cycle. That's just less money in their pockets, and they hate that.
 

Eliciel

Member
Okay. Calm down and then please explain to me what exactly you mean, because I feel like you are misinterpreting what I said or confusing something I said with something someone else said.
If you think there is any fanboyism going on on my end then please reread my posts in here, because I shit on both Nvidia and AMD where they deserve it. I buy the products I think are best; my last card was an R9 290X, which I owned for many years and was extremely impressed by. I also owned Nvidia GPUs.

Sorry I can't figure out your angle or what makes you upset.



Depends on what monitor/resolution you have and what you plan on playing in the future.

I think the cards we are about to hear about are going to be more expensive than the 5700 XT, unless they are releasing an RTX 3060 as well. If that is the case I'd expect the second-hand price of the 5700 XT to drop a bit, but I am reaching here. The RTX 3060 might also be further down the line.

The 5700 XT is a great card, but I think it will be dwarfed by the upcoming RTX 30 series and also the Big Navi products (however many there are) not only from a performance standpoint but probably a feature set standpoint as well.
4K or 1440p, but more regularly 4K now.
 

Evilms

Banned
[image attachment]
 

Ellery

Member
4K or 1440p, but more regularly 4K now.

Well 4K is a demanding beast for the 5700 XT and then it depends on whether you are happy with lower framerates and/or turning down settings.

It is a difficult situation right now with upscaling technologies like DLSS and FidelityFX, and the question of which games will support them.

I would expect the upcoming cards to perform significantly better (that includes the 3080, 3090 and the big card from AMD based on RDNA2 that is set to be revealed this year) at 4K, but all of those are going to be expensive.

If I were in your situation and had my gaming focus at 4K I would probably sit on it another year or two and upgrade then, but I think it is reasonable to sell it and buy a more potent product when you are already unhappy and have graphically demanding games like Cyberpunk 2077 you are looking forward to.
I agree the situation is not easy. For 1440p it would be different, but with 4K it depends more on each person's requirements and expectations. I expect many 4K people (those who play modern AAA PC games) to sell their current cards and upgrade to an RTX 3090.
 

Eliciel

Member
Well 4K is a demanding beast for the 5700 XT and then it depends on whether you are happy with lower framerates and/or turning down settings.

It is a difficult situation right now with upscaling technologies like DLSS and FidelityFX, and the question of which games will support them.

I would expect the upcoming cards to perform significantly better (that includes the 3080, 3090 and the big card from AMD based on RDNA2 that is set to be revealed this year) at 4K, but all of those are going to be expensive.

If I were in your situation and had my gaming focus at 4K I would probably sit on it another year or two and upgrade then, but I think it is reasonable to sell it and buy a more potent product when you are already unhappy and have graphically demanding games like Cyberpunk 2077 you are looking forward to.
I agree the situation is not easy. For 1440p it would be different, but with 4K it depends more on each person's requirements and expectations. I expect many 4K people (those who play modern AAA PC games) to sell their current cards and upgrade to an RTX 3090.
What would you recommend for 1440p situations?
 

Miggytronz

Member
What would you recommend for 1440p situations?
I'm going the 3090 route if the price is right; I want to put myself in position for the high-tier 4K monitor market in a year's time or less. I have a 2080 Super right now and play just about everything on MAX/HIGH settings at 1440p. ULTRA gets finicky depending on the game, but it's mostly OK.
 

Rikkori

Member
There are rumors that there will be two memory configs: https://www.tomshardware.com/news/geforce-rtx-3090-rtx-3080-rtx-3070-specifications-leaked

This is seemingly contradicted by the leak showing Zotac only has 3 models, though they could have 3x2, or only offer the double memory on the high-end cards.
It's not contradicted by the leak; that's just for launch. We've known about the double-VRAM variants for a long time, only we don't know when they will arrive (if at all) and how much extra they will cost. Both are crucial questions.
 

Siri

Banned
HDMI 2.1 removes the 4K 60 FPS cap on my LG C9, so I might upgrade to a 3080 from an RTX 2080 Ti just for that. I really want the 3090, but it's beginning to look like a 700-watt PSU won't cut it.
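For anyone wondering why HDMI 2.0 is the bottleneck here: the raw pixel data for 4K120 already exceeds its usable data rate before you even account for blanking. Back-of-envelope numbers only, ignoring FRL overhead details and DSC:

```python
# Raw (uncompressed, 4:4:4) pixel data rate vs. approximate HDMI usable data rates.
def pixel_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

HDMI_2_0 = 14.4   # Gbit/s usable out of 18 (8b/10b encoding)
HDMI_2_1 = 42.6   # Gbit/s usable out of 48 (FRL, 16b/18b encoding)

for fps, bpc in [(60, 8), (120, 8), (120, 10)]:
    need = pixel_gbps(3840, 2160, fps, bpc)
    print(f"4K{fps} {bpc}-bit: ~{need:4.1f} Gbit/s  "
          f"(HDMI 2.0: {'ok' if need < HDMI_2_0 else 'no'}, "
          f"HDMI 2.1: {'ok' if need < HDMI_2_1 else 'no'})")
```

4K60 fits within HDMI 2.0; 4K120 doesn't, even at 8-bit, which is why the C9's 120Hz modes need an HDMI 2.1 source.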
 

Jaxcellent

Member
Guys, I'm planning on buying a laptop with a good Nvidia card in it. Would you recommend searching for a deal on one now (2070 Super-ish), or waiting for a laptop with a 3070 in it? I plan on using it with VR (I want an HP Reverb G2). Also, a 10th-gen i7 and 16 GB RAM is fine, I guess? Thanks.
 

KungFucius

King Snowflake
It's not contradicted by the leak; that's just for launch. We've known about the double-VRAM variants for a long time, only we don't know when they will arrive (if at all) and how much extra they will cost. Both are crucial questions.

I don't want a card with gimped VRAM. I also don't want a card that is overkill for TV/console-like gaming and the handful of games I play on it each year. If the 3080 is underwhelming, I'll grab a PS5 and wait.
 

Rikkori

Member
Guys, I'm planning on buying a laptop with a good Nvidia card in it. Would you recommend searching for a deal on one now (2070 Super-ish), or waiting for a laptop with a 3070 in it? I plan on using it with VR (I want an HP Reverb G2). Also, a 10th-gen i7 and 16 GB RAM is fine, I guess? Thanks.
The Reverb G2 requires a lot of horsepower. If you want a laptop for it, I hope you're rich.
 

Reindeer

Member
Seems a weird jump from 10 to 24 GB; maybe Nvidia will push an 18GB card later on?
I have a feeling it only has that much RAM because they want to be sure that AMD doesn't beat them for the performance crown. Like you say, the massive jump in VRAM doesn't make a lot of sense otherwise.
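For what it's worth, the capacities mostly fall out of bus width and module density rather than marketing: each 32-bit channel of the bus takes one GDDR6X module, those modules come in 1GB (8Gb) now with 2GB (16Gb) parts expected later, and clamshell mode puts two modules on a channel. A quick sketch using the leaked bus widths, so treat the inputs as provisional:

```python
# Which capacities a given memory bus can support with GDDR6X.
# One module per 32-bit channel; 1 GB modules now, 2 GB modules later,
# and clamshell mode (two modules per channel) doubles the total.
LEAKED_BUSES = {"RTX 3080": 320, "RTX 3090": 384}  # bits, per the leaks

for card, bus_bits in LEAKED_BUSES.items():
    channels = bus_bits // 32
    for gb_per_module in (1, 2):
        normal = channels * gb_per_module
        print(f"{card}: {bus_bits}-bit -> {channels} x {gb_per_module} GB "
              f"= {normal} GB ({normal * 2} GB in clamshell)")
```

So 10/20 GB on a 320-bit bus and 12/24 GB on a 384-bit bus are the natural options; an 18GB card would need an unusual bus width or module size, which is why the rumored double-VRAM variant is 20GB rather than 18GB.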
 

Ellery

Member
What would you recommend for 1440p situations?

I would stay with the 5700 XT. My situation is not too different with the RTX 2080 at 1440p and I will probably wait another 2-3 years and upgrade to a new GPU with a new 4K monitor.
 

magaman

Banned
HDMI 2.1 removes the 4K 60 FPS cap on my LG C9, so I might upgrade to a 3080 from an RTX 2080 Ti just for that. I really want the 3090, but it's beginning to look like a 700-watt PSU won't cut it.

What could you possibly need that power for? A 2080 Ti is more than enough to handle any game out today.
 

Hydroxy

Member
Those who are happy with 1440p at 60/75Hz will be fine with a 3070 for a few years. For 1080p, and for 1440p at 60Hz, a 3060 should do the job.
 

RoadHazard

Gold Member
Great. I assume you wouldn't mind if I add $420 for that, because that is what the PS+ online subscription comes to over the course of the PS5's lifetime?

PS+ also gives me a ton of games, so I guess you then also have to subtract the value of those. Which will, in the end, mean you actually have to SUBTRACT from the PC build budget.
 

Ellery

Member
PS+ also gives me a ton of games, so I guess you then also have to subtract the value of those. Which will, in the end, mean you actually have to SUBTRACT from the PC build budget.

I could add all the free Epic Store games I pick up each month, but that would not really be fair to PS+, because then we would just keep adding extra budget to the PC side of things.
 

RoadHazard

Gold Member
I could add all the free Epic Store games I pick up each month, but that would not really be fair to PS+, because then we would just keep adding extra budget to the PC side of things.

Alright, so let's focus on the hardware then, and ignore these unknowns. Let's say I don't want to play online (I barely do).
 

Rbk_3

Member
Are you playing in slow motion? You're probably hitting 4K/120 on all competitive games already. Is there a need to play at 120fps in games that don't require laser focus?

Current cards can't do 4K/120 at all on OLEDs because they only have HDMI 2.0.
 

Denton

Member
So the 3080 has the same number of cores as the 2080 Ti and 1GB less memory, but higher bandwidth. I am really curious how much faster than the 2080 Ti it is going to be, and what its price is going to be.
 

Ellery

Member
Alright, so let's focus on the hardware then, and ignore these unknowns. Let's say I don't want to play online (I barely do).

I am still not sure if you want me to actually pick PC parts for you in November 2020 or if you are trying to make a point, because I am going to buy a PS5 in November, not a new graphics card.

So the 3080 has the same number of cores as the 2080 Ti and 1GB less memory, but higher bandwidth. I am really curious how much faster than the 2080 Ti it is going to be, and what its price is going to be.

Easily 25% above the 2080 Ti, and probably much more for ray tracing/DLSS.
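On the bandwidth part of the question, the arithmetic from the leaked numbers looks like this (the 2080 Ti figures are the known spec, the 3080 figures are the leaked GDDR6X ones, so treat them as provisional):

```python
# Memory bandwidth = data rate per pin (Gbit/s) * bus width (bits) / 8 bits per byte.
cards = {
    "RTX 2080 Ti": {"gbps_per_pin": 14, "bus_bits": 352},  # GDDR6, known spec
    "RTX 3080":    {"gbps_per_pin": 19, "bus_bits": 320},  # GDDR6X, leaked figures
}

bandwidth = {name: c["gbps_per_pin"] * c["bus_bits"] / 8 for name, c in cards.items()}
for name, gbs in bandwidth.items():
    print(f"{name}: {gbs:.0f} GB/s")

uplift = bandwidth["RTX 3080"] / bandwidth["RTX 2080 Ti"] - 1
print(f"Bandwidth uplift: ~{uplift:.0%}")   # roughly +23% despite the narrower bus
```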
 
I've heard a rumor that DLSS 3.0 is gonna be game-agnostic? Which would be an amazing achievement. DLSS 2.0 has been amazing, but the onus of sending the full game to Nvidia to run their AI algorithms on it falls on the devs, so few games use it. The ones that do are amazing examples of the tech, but more games need it. Hopefully 3.0 really can be done without a lot of dev input.
 

nemiroff

Gold Member
What could you possibly need that power for? A 2080 Ti is more than enough to handle any game out today.

Well... not any, actually. One good example: Microsoft Flight Simulator on a Reverb G2 VR headset (4K, 2160 x 2160 per eye @ 90Hz).


I've heard a rumor that DLSS 3.0 is gonna be game-agnostic? Which would be an amazing achievement. DLSS 2.0 has been amazing, but the onus of sending the full game to Nvidia to run their AI algorithms on it falls on the devs, so few games use it. The ones that do are amazing examples of the tech, but more games need it. Hopefully 3.0 really can be done without a lot of dev input.

I'm pretty sure that's already one of the key features of DLSS 2.0.

Edit:

However:
The catch to DLSS 2.0, however, is that this still requires game developer integration, and in a much different fashion. Because DLSS 2.0 relies on motion vectors to re-project the prior frame and best compute what the output image should look like, developers now need to provide those vectors to DLSS. As many developers are already doing some form of temporal AA in their games, this information is often available within the engine, and merely needs to be exposed to DLSS. None the less, it means that DLSS 2.0 still needs to be integrated on a per-game basis, even if the per-game training is gone. It is not a pure, end-of-chain post-processing solution like FXAA or combining image sharpening with upscaling.
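To make that quote concrete, the per-frame data a developer has to route into DLSS 2.0 looks conceptually like this. The names and structure below are made up purely for illustration; the real integration goes through Nvidia's NGX SDK and the game's rendering API, not Python.

```python
# Conceptual sketch (hypothetical names) of the per-frame inputs DLSS 2.0 needs.
from dataclasses import dataclass
from typing import Any, Tuple

@dataclass
class DLSSFrameInputs:
    color_low_res: Any                  # frame rendered at the internal (lower) resolution
    depth_low_res: Any                  # depth buffer at the same resolution
    motion_vectors: Any                 # per-pixel motion vectors, usually already produced for TAA
    jitter_offset: Tuple[float, float]  # sub-pixel camera jitter applied this frame
    exposure: float                     # scene exposure so the accumulated history is weighted correctly

# The upscaler re-projects the previous output using the motion vectors and blends it
# with the current low-res frame, which is why engines that already do temporal AA
# can expose this data with relatively little extra work.
```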
 