
Why are NVIDIA and AMD so stingy with VRAM?

64bitmodels

Reverse groomer.
The new AD106 GPU from Nvidia that was leaked (assumed to be the 4060) suggests performance similar to a 3070 Ti, which isn't a great deal to begin with considering the gap between the 3090 and 4090, but the thing that really cuckles my fuckles is the rumored VRAM: 8 GB. In 2023.

I don't get this shit. It feels like this card should be 12 GB at the very least, considering modern system requirements for many games. The 3060 was previously 12 GB, so why not extend that to the Ti version? Why are we regressing here? Is VRAM not cheap or something? I felt like this would be the generation where we truly jump up in VRAM and leave the days of single-digit VRAM behind us, yet everything below the 70 Ti is 8 GB.

Why can't we live in a world where the 4060 Ti is 12 GB?
 

LordOfChaos

Member
About 6 dollars a gigabyte


Jumping up to 12 GB would eat a fair bit into margins. You might think, well, I'd pay that difference anyway, but they're also competing with each other to offer X performance at Y price point, and 8 vs. 12 probably isn't moving the needle enough to justify eating the margin hit or jumping the MSRP up. There's also the validation and such.

*not saying this is the way it should be, but it's the way it do be
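Quick back-of-envelope on that $6/GB figure. This is a sketch only: the cost per GB is the rough number quoted above, and the MSRP is a made-up placeholder, not a real spec.

```python
# Incremental bill-of-materials cost of going from 8 GB to 12 GB of GDDR6.
# Assumptions: ~$6 per GB (the figure quoted above) and a hypothetical $399 MSRP
# used only for the percentage line. Neither number is an official spec.
COST_PER_GB = 6.0
EXTRA_GB = 12 - 8
HYPOTHETICAL_MSRP = 399.0

extra_bom = COST_PER_GB * EXTRA_GB
print(f"Extra memory cost: ${extra_bom:.2f}")                      # $24.00
print(f"As a share of MSRP: {extra_bom / HYPOTHETICAL_MSRP:.1%}")  # 6.0%
```

Small per card, but across millions of units, and before board, validation, and any wider-bus costs, it's exactly the kind of margin hit being described above.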
 

Gaiff

Member
That's strictly an NVIDIA problem, not an AMD one. The 6700, which sits three tiers below the 3080, has as much VRAM. The 6800, which is a tier below, has 60% more. AMD hasn't been stingy with VRAM in over 10 years. The same cannot be said for NVIDIA, especially when it comes to their upper-mid-range to high-end chips. Only their top SKUs have generous amounts of VRAM.
 

64bitmodels

Reverse groomer.
That's strictly an NVIDIA problem, not an AMD one. The 6700, which sits three tiers below the 3080, has as much VRAM. The 6800, which is a tier below, has 60% more. AMD hasn't been stingy with VRAM in over 10 years. The same cannot be said for NVIDIA, especially when it comes to their upper-mid-range to high-end chips. Only their top SKUs have generous amounts of VRAM.
Oh yeah, that's true.
AMD is fine with their VRAM, then.
Still, I wonder why Nvidia thinks we need less than we do. Most 8 GB cards are going to age like milk going into the mid-2020s, when games are optimizing for 10 GB thanks to the PS5 and Series X requirements.
 

64bitmodels

Reverse groomer.
Jumping up to 12 GB would eat a fair bit into margins. You might think, well, I'd pay that difference anyway, but they're also competing with each other to offer X performance at Y price point, and 8 vs. 12 probably isn't moving the needle enough to justify eating the margin hit or jumping the MSRP up. There's also the validation and such.
It's the difference between having a smooth gaming experience at higher settings and a complete and utter stutterfest. Four more GB makes a world of difference, even at 1440p.


DLSS 3's generated frames can help with the frame count, but they can't help with frametimes or stutters. Plus, the 4080 reportedly costs about 300 dollars to manufacture, which means they're making roughly 4x what they put into it. If the 4080 carries that kind of margin, surely they could sacrifice a bit of profit to better future-proof the far-cheaper-to-manufacture midrange, eh?
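For what it's worth, the "4x" is just the launch price divided by the claimed build cost; the $300 figure is the poster's estimate, not a confirmed number.

```python
# The "roughly 4x" figure, taking the claimed build cost at face value.
MSRP_4080 = 1199          # USD, RTX 4080 16 GB launch MSRP
CLAIMED_BUILD_COST = 300  # USD, unverified estimate quoted above

print(f"Price-to-cost ratio: {MSRP_4080 / CLAIMED_BUILD_COST:.1f}x")  # ~4.0x
```

(That's gross margin at best; it ignores R&D, software, and distribution.)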
 

StreetsofBeige

Gold Member
They want you to upgrade to the next GPU in 3 years. Make the current GPU too good at too low a price and gamers will milk it.

I wasn't a PC gamer at the time, but wasn't the 8800 GTX a fantastic card that PC gamers milked for ages? Nvidia doesn't want that to happen again.
 

64bitmodels

Reverse groomer.
I wasn't a PC gamer at the time, but wasn't the 8800 GTX a fantastic card that PC gamers milked for ages? Nvidia doesn't want that to happen again.
A more recent example would be the 1080 Ti. Pretty much everyone who owns a 1080 Ti won't need to upgrade for at least another 3 years... and that card came out in 2017.
 

LordOfChaos

Member
They want you to upgrade to the next GPU in 3 years. Make the current GPU too good at too low a price and gamers will milk it.

I wasn't a PC gamer at the time, but wasn't the 8800 GTX a fantastic card that PC gamers milked for ages? Nvidia doesn't want that to happen again.

I think that was also a case of launching within a month of the PS3 while being way more powerful than either console GPU, and that console generation went on for quite a long time, which held game scope to what those consoles could do. So it stayed usable for almost that whole 7-year period if you really wanted to stretch it.

Also, that console generation's memory was very limited: 512 MB, or worse, a 256+256 MB split.
 

Kataploom

Gold Member
How are 3080 users dealing with Forspoken? Doesn't that game check the amount of VRAM and limit the texture detail quite a lot?
 
GDDR6X isn't exactly cheap. People complain all day about how GPUs cost too much and then turn around and also complain there's not enough VRAM.

The 4090 has 24 GB of VRAM, there you go. With a price to match.
 
These cards have a specific performance target.

If you remove all bottlenecks, you could use a 4060 for 4K gaming.

Nvidia doesn't want that. They want you to get a 4080 or above for 4K gaming.

The 4060 will target the 1080p crowd. It will be fine for that.
 

Pagusas

Elden Member
Memory costs money, money cuts into margins, and the name of the game is to keep margins as high as possible.

As someone who does a lot of production work on my machine, I've found cards like the 4090 to be a godsend. Years ago, before the Titans/xx90s arrived, if you wanted a lot of memory for production use you were forced down Nvidia's Quadro line, which usually cost $5k+ for the performance of a consumer card with more memory and production-optimized drivers. Lately life has been good, and the savings incredible compared to what we used to have to do for our small studio. So I guess I see the current cards differently than most: the 3090 and 4090 were both incredibly cheap for what they offered for hybrid use, and they had a ton of VRAM.
 
It's the difference between having a smooth gaming experience at higher settings and a complete and utter stutterfest. Four more GB makes a world of difference, even at 1440p.


DLSS 3's generated frames can help with the frame count, but they can't help with frametimes or stutters. Plus, the 4080 reportedly costs about 300 dollars to manufacture, which means they're making roughly 4x what they put into it. If the 4080 carries that kind of margin, surely they could sacrifice a bit of profit to better future-proof the far-cheaper-to-manufacture midrange, eh?
When the 3080 came out, how many people said the VRAM limitation would be a huge deal? How big a deal has it actually been?
 

ancelotti

Member
There's a big difference between VRAM allocation and actual usage.

Forspoken will allocate up to 14-15 GB maxed out at 4K. However, when it comes to in-game performance, the 3080 (10 GB) is still faster than the 6900 XT (16 GB) at all resolutions and scales similarly to the 3090 (24 GB).
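If you want to see what the card is actually holding rather than what a game requests, you can poll device-level memory with NVIDIA's NVML bindings. A minimal sketch, assuming the pynvml package and the game running on GPU 0; note it reports whole-device usage, so other apps count too.

```python
# Minimal VRAM monitor using NVIDIA's NVML bindings (pip install pynvml).
# Prints device-level memory usage once per second; run it alongside a game
# to compare what's actually resident with what the game claims to "allocate".
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the game is on GPU 0

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"used {mem.used / 2**30:5.2f} GiB / total {mem.total / 2**30:5.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```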

 
No it won't. 8 GB is going to be obsolete even for that before 2023 ends. Also, the card will cost 500 euros or more.

I don't see how anyone can defend this.

Not defending this. They should offer more so it doesn't remain a bottleneck in certain situations.

I haven't gamed on PC for 3-4 years now, but shouldn't 8 GB be good for 1080p?
 

Superbean

Neo Member
It's a 1080p card; it doesn't need much VRAM. If you're playing at higher res on it, you have no taste, and performance getting choked by a lack of VRAM is the least of your worries.
1440p typically runs 6 to 8 GB, 4K 10 to 12, and 1080p less, 4 to 6 GB. Well-optimized games like Doom Eternal use even less.
 
About 6 dollars a gigabyte


Jumping up to 12 GB would eat a fair bit into margins. You might think, well, I'd pay that difference anyway, but they're also competing with each other to offer X performance at Y price point, and 8 vs. 12 probably isn't moving the needle enough to justify eating the margin hit or jumping the MSRP up. There's also the validation and such.

*not saying this is the way it should be, but it's the way it do be
Then there's the cost of mounting that memory onto the PCB; in the grand scheme of things it's probably costing them more than $6 per gigabyte.

Testing of parts and dead units cut into margins and such.
 
I refused to upgrade until there was a card with at least 16 GB, which is why I got the 4080. I know the 3090/Ti had 24 GB, but they were hard to find and more expensive. I was starting to feel like 8 GB wasn't enough anymore, and I wasn't going to get a card with just 10-12 GB.

16 GB is good enough right now for gaming, but some games can use 9-11 GB, so I'm kinda wishing I had just gotten a 4090 with 24 GB instead, but whatever, it's not a huge deal. I was mostly kicking myself for not getting a 4090 when I was messing about with Stable Diffusion, which eats up all your VRAM. 24 GB would have been amazing for AI rendering. I suppose if you're doing that then ideally you'd want a 48 GB card lol
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Meanwhile, Intel sticks to a 256-bit bus and 8/16 GB.

Nvidia dropping the bus widths is so weird to me.
I know people keep saying the larger cache makes up for it.
But I don't believe that.

I don't mind the VRAM amounts, but the reduction in bus width, that's a travesty.
I will wait for the 320-bit 4080 20G.
256-bit is for the xx70s.
Going much lower is legit shocking.

The 4060 Ti already looks shocking; I can't even imagine what the base 4060 is gonna be like.
288 GB/s of memory bandwidth... mate, what is this, a Series S?




Why is AMD in this conversation?
Isn't their current crop of cards 20 and 24 GB?
And last gen they launched with 16 GB only.
The lower-tier cards not meant for 4K gaming were still 12 GB.
You had to go down to the practically-1080p cards to find 8 GB.

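For reference, that 288 GB/s figure falls straight out of the leaked bus width and memory speed. A quick sketch of the arithmetic; the 128-bit / 18 Gbps combination is a rumor from the leak, not a confirmed spec.

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate per pin.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbs(128, 18.0))  # leaked 128-bit @ 18 Gbps: 288.0 GB/s
print(peak_bandwidth_gbs(256, 14.0))  # RTX 3060 Ti (256-bit, 14 Gbps): 448.0 GB/s
print(peak_bandwidth_gbs(384, 21.0))  # RTX 4090 (384-bit, 21 Gbps): 1008.0 GB/s
```

So whatever the larger L2 cache recovers, it's starting from a much lower raw number than last gen's 256-bit cards.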
 

winjer

Gold Member
Meanwhile, Intel sticks to a 256-bit bus and 8/16 GB.

Nvidia dropping the bus widths is so weird to me.
I know people keep saying the larger cache makes up for it.
But I don't believe that.

I don't mind the VRAM amounts, but the reduction in bus width, that's a travesty.
I will wait for the 320-bit 4080 20G.
256-bit is for the xx70s.
Going much lower is legit shocking.

The 4060 Ti already looks shocking; I can't even imagine what the base 4060 is gonna be like.
288 GB/s of memory bandwidth... mate, what is this, a Series S?




Why is AMD in this conversation?
Isn't their current crop of cards 20 and 24 GB?
And last gen they launched with 16 GB only.
The lower-tier cards not meant for 4K gaming were still 12 GB.
You had to go down to the practically-1080p cards to find 8 GB.


AMD is better at giving more VRAM at the mid and high end. But at the low end, it's very bad.
For example, look at the 6500 XT. It only has 4 GB of VRAM. And to make it worse, only a 64-bit memory bus. And to make it worse, only 16 MB of Infinity Cache. And to make it even worse, it only supports 4 lanes of PCIe, so when a game needs to pull data from main memory, which it will do a lot because the card only has 4 GB, it gets bottlenecked by that meager PCIe bandwidth.
And Nvidia managed to do just as badly with the GTX 1630 in 2022.
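To put a number on the PCIe point: usable per-lane bandwidth roughly doubles each generation, so an x4 link is a small fraction of a full x16 slot, and it halves again on an older PCIe 3.0 board. A rough sketch using rounded per-lane figures.

```python
# Approximate usable PCIe bandwidth per lane, one direction, after encoding
# overhead (rounded ballpark figures, GB/s).
PER_LANE_GBS = {"3.0": 1.0, "4.0": 2.0}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBS[gen] * lanes

print(link_bandwidth_gbs("4.0", 16))  # full x16 slot on PCIe 4.0: ~32 GB/s
print(link_bandwidth_gbs("4.0", 4))   # 6500 XT on a PCIe 4.0 board: ~8 GB/s
print(link_bandwidth_gbs("3.0", 4))   # 6500 XT on a PCIe 3.0 board: ~4 GB/s
```

So every time the 4 GB card has to spill into system RAM, it's pulling data over a link far slower than its own VRAM.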
 

M1chl

Currently Gif and Meme Champion
I see that people completely ignore the data. This is Forspoken at 4K; I cropped the screenshot to only the relevant cards.



VRAM propaganda is pushed mainly by Radeon fanboys; don't fall for it. VRAM bandwidth is the deficit at high resolution, capacity not so much (at least not yet).
 
Laughs in 24GB currently used mostly for Overwatch 2 at 4K/120Hz

Meanwhile, Intel sticks to a 256-bit bus and 8/16 GB.

Nvidia dropping the bus widths is so weird to me.
I know people keep saying the larger cache makes up for it.
But I don't believe that.

I don't mind the VRAM amounts, but the reduction in bus width, that's a travesty.
I will wait for the 320-bit 4080 20G.
256-bit is for the xx70s.
Going much lower is legit shocking.

The 4060 Ti already looks shocking; I can't even imagine what the base 4060 is gonna be like.
288 GB/s of memory bandwidth... mate, what is this, a Series S?




Why is AMD in this conversation?
Isn't their current crop of cards 20 and 24 GB?
And last gen they launched with 16 GB only.
The lower-tier cards not meant for 4K gaming were still 12 GB.
You had to go down to the practically-1080p cards to find 8 GB.


Pretty crazy to think about. It wasn't that long ago that entry-level GPUs like the RX 550/560 and GTX 1050 had a 128-bit bus; even the RTX 3050 does, come to think of it. The RTX 4050 is gonna be 64-bit at this rate.
 

winjer

Gold Member
I see that people completely ignore the data. This is Forspoken at 4K; I cropped the screenshot to only the relevant cards.



VRAM propaganda is pushed mainly by Radeon fanboys; don't fall for it. VRAM bandwidth is the deficit at high resolution, capacity not so much (at least not yet).

That graph is nice, but it doesn't show the high-res mipmaps that aren't being loaded by the game.
Still, this is such a poorly optimized game that these comparisons are a moot point.
 

RoboFu

One of the green rats
I see that people completely ignore the data. This is Forspoken at 4K; I cropped the screenshot to only the relevant cards.



VRAM propaganda is pushed mainly by Radeon fanboys; don't fall for it. VRAM bandwidth is the deficit at high resolution, capacity not so much (at least not yet).
Psst... no one gives a shit about Forspoken. 🫤 lol 😂 And it specifically lowers settings you can't control if it runs out of VRAM, which you will probably never notice. A lot of VRAM-hungry games will have an odd stutter or pause here and there if you push the VRAM to its limit.

But since consoles are not topping out VRAM above 10 gigs, you will probably be good gaming at 1440p with high settings for a while still in demanding games. It's not that VRAM doesn't matter; it's that the consoles have dictated the limit.
 

Ev1L AuRoN

Member
That's just a way to artificially limit the GPU at 4K so it won't cannibalize their higher-end tiers. A 3070 Ti-level card with decent VRAM and additions like DLSS 3 would do 4K well enough for people to just ignore the 4080 and up.
 

M1chl

Currently Gif and Meme Champion
Psst... no one gives a shit about Forspoken. 🫤 lol 😂 And it specifically lowers settings you can't control if it runs out of VRAM, which you will probably never notice. A lot of VRAM-hungry games will have an odd stutter or pause here and there if you push the VRAM to its limit.

But since consoles are not topping out VRAM above 10 gigs, you will probably be good gaming at 1440p with high settings for a while still in demanding games. It's not that VRAM doesn't matter; it's that the consoles have dictated the limit.


In this very thread, much concern

That graph is nice, but it doesn't show the high-res mipmaps that aren't being loaded by the game.
Still, this is such a poorly optimized game that these comparisons are a moot point.
Seems competent:


In a sense, VRAM size isn't the issue. My counterculture narrative is that Nvidia sold quite a few 3090s due to the fact that this exists. I would argue that gimping the 4080's bandwidth will cause more issues down the line.
 

Xyphie

Member


It's simply because DRAM prices have kind of stagnated on a $/GB basis for about 10 years now. When I bought my 4670K roughly ten years ago, a 16 GB DDR3 kit was ~100€. A 32 GB DDR5 kit is maybe 170€ or so now? Compare that to the massive fall over the previous ten years, 2003-2013.
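Running the poster's own numbers (rough street prices, so treat the output as illustrative):

```python
# Rough EUR/GB comparison using the approximate kit prices quoted above.
ddr3_eur_per_gb = 100 / 16   # ~2013: 16 GB DDR3 kit at ~100 EUR
ddr5_eur_per_gb = 170 / 32   # ~2023: 32 GB DDR5 kit at ~170 EUR

print(f"DDR3, ~2013: {ddr3_eur_per_gb:.2f} EUR/GB")   # 6.25
print(f"DDR5, ~2023: {ddr5_eur_per_gb:.2f} EUR/GB")   # 5.31
print(f"Drop over the decade: {1 - ddr5_eur_per_gb / ddr3_eur_per_gb:.0%}")  # 15%
```

Roughly a 15% drop in ten years, versus the massive falls the post mentions for 2003-2013.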
 

Petopia

Banned
The new AD106 GPU from Nvidia that was leaked (assumed to be the 4060) suggests performance similar to a 3070 Ti, which isn't a great deal to begin with considering the gap between the 3090 and 4090, but the thing that really cuckles my fuckles is the rumored VRAM: 8 GB. In 2023.

I don't get this shit. It feels like this card should be 12 GB at the very least, considering modern system requirements for many games. The 3060 was previously 12 GB, so why not extend that to the Ti version? Why are we regressing here? Is VRAM not cheap or something? I felt like this would be the generation where we truly jump up in VRAM and leave the days of single-digit VRAM behind us, yet everything below the 70 Ti is 8 GB.

Why can't we live in a world where the 4060 Ti is 12 GB?
Wouldn't matter if you're gaming at 1080p, dude.
 

StreetsofBeige

Gold Member


It's simply because DRAM prices have kind of stagnated on a $/GB basis for about 10 years now. When I bought my 4670K roughly ten years ago, a 16 GB DDR3 kit was ~100€. A 32 GB DDR5 kit is maybe 170€ or so now? Compare that to the massive fall over the previous ten years, 2003-2013.
That brings back memories.

I remember my bro buying a 4 MB RAM module to add to his PC, and it cost $200 CAD.
 

64bitmodels

Reverse groomer.
Wouldn't matter if you're gaming at 1080p, dude.
1080p is quickly falling into the same realm as 720p, starting to become a sort of "last resort" low-end resolution. We can't exactly keep using 1080p as the standard anymore when 1440p is today's 1080p.

I got 24? What's the problem?
I mean, for cards lower than the 80 series, VRAM is a joke. You're lucky to get more than 8 GB.
You own a 4090, so it's clearly not a problem for you.
 

DaGwaphics

Member
Both AMD and Nvidia seem to block the AIBs from clamshelling the memory now as well. If they didn't, I'm sure we would have seen some 16 GB 3060 Tis or even 6650 XTs.
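For context, "clamshelling" hangs two GDDR6 chips off each 32-bit channel (one per side of the board), doubling capacity without touching the bus width. A rough sketch of the arithmetic; the chip densities below are typical values, not statements about any specific SKU.

```python
# VRAM capacity from bus width, chip density, and clamshell mode.
# Each GDDR6 chip sits on a 32-bit channel; clamshell puts two chips per channel.
def vram_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * gb_per_chip

print(vram_gb(256, 1))                  # 256-bit bus, 1 GB chips            ->  8 GB
print(vram_gb(256, 1, clamshell=True))  # same bus, clamshelled              -> 16 GB
print(vram_gb(384, 2))                  # 384-bit bus, 2 GB chips (4090-ish) -> 24 GB
```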
 

bbeach123

Member
Planned obsolescence.

I've only used mid-range GPUs since 2004, and more than half of the time I had to upgrade because of a lack of VRAM.
 
Usually the market dictates the specifications. Between the COVID supply problems and the crypto fiasco, the market got a bit crazy, and we're still seeing the aftereffects. If people were clearly buying product X because of Y GB of VRAM, we would see those products winning. But the high end got its pricey solution, the mid range got pricier and was accepted by some, and many are still hanging on to a 1060 or the like, the last cards that offered proper value for little money.
It will take months, maybe years, until this weird phase becomes a bit more logical again. Or the average card will stay much more expensive than it was a few years back and carry more VRAM to match the better GPU, while many gamers switch to buying used cards and staying longer on older products. Before, it was buy, use, and throw away more often. From an environmental standpoint, the current situation may actually be better.

Just say "no thanks" if you don't feel like you're getting your money's worth with product XYZ.

I bought a used 6 GB card recently because it was in the price range I feel comfortable spending, anyway, and especially for a used card.
 