
8GB of VRAM is not enough even for 1080p gaming.

Bojji

Member
Well, I said at console settings... so no RT.

Also, running the game on Ultra is fucking stupid and isn't representative of how people play the game.
I bet with reasonable settings it runs fine with RT enabled.




It's an open-world game, so of course textures are less crisp than in a linear game.
But still, no issues with RT reflections enabled, and the issues I do get are not VRAM-related.

I completed Cyberpunk with ultra RT on my 2560x1080 monitor when I had a 3070 (bought at launch). The game was optimized for the just-released Nvidia Ampere lineup, so they scaled everything to work on the 3070 without problems.

This is not the same as back then; PC gaming is a lot more lucrative these days, and if you tailor your game to the 1% who can afford the top hardware you're going to lose out on all that PC gaming money, which makes no sense, especially in the current financial situation. Once again the myth of PC gaming being about the biggest, fastest, best seems to permeate the internet. PC gaming is really about the opposite for the majority. Yes, hardware requirements change over time, but the majority of users lag behind those requirements by years. Just look at the big sellers on PC.

The majority of PC gaming is built around free-to-play games that have super low requirements. But ports of current-gen AAA games will have huge requirements; you won't be able to get the same settings on 8GB GPUs as on 16GB consoles. Developers never really cared about Steam surveys and such: if you have inadequate hardware you have to buy better parts, that's it.
 

damidu

Member
It doesn't matter if it's a hardware issue or a dev issue;
the results are the same. Devs will not bother to tweak stuff; they'll go with console specs as the baseline.
In that regard, yeah, don't buy an 8GB GPU.
 

nkarafo

Member
Doesn't matter, NVIDIA will still sell you the 4060 with 8GB for $500+.

I play on a 1080p, 240Hz monitor, so I don't need as much VRAM as most people who play on 4K screens. But even I would not buy a card that only has 8GB. I made that mistake when I bought the 960 2GB back in the day, when "2GB is enough for 1080p" was the line, and within a single year I had problems with it. Never cheap out on VRAM.
 
The first half of the video was pure shits & giggles. Imagine having better RT hardware just to get slam dunked by RDNA2 due to a lack of VRAM :messenger_grinning_sweat:

You should check A Plague Tale: Requiem in the video in the OP; it's fine without RT on the 3070, but with RT enabled performance tanks to single digits.

Cyberpunk is old at this point, and it never had very good textures in the vanilla version.

A lot of folk use Cyberpunk as a point of reference with 8GB cards, but they should really try the game at 4K to see how bad some textures are. It has such a wide range of texture quality that it's no surprise 1440p is fine with RT on an 8GB card; 4K (DLSS Quality) with maxed RT only uses around 11-12GB on my 3090. On the odd occasion it can end up looking a bit goofy, with highly detailed characters in a low-detail environment.
 

64bitmodels

Reverse groomer.
Sad state of affairs where gamers have to upgrade their hardware because developers refuse to optimize for the majority market share of PC hardware.

& Nvidia scamming us with low-VRAM cards. In 2020 the 3070 was a beast, now it's reduced to a midrange card because of fucking VRAM.... if it were 16GB it'd be a bloodbath in all of those comparisons.
 

SmokedMeat

Gamer™
Doesn't matter, NVIDIA will still sell you the 4060 with 8GB for $500+.

I play on a 1080p, 240Hz monitor, so I don't need as much VRAM as most people who play on 4K screens. But even I would not buy a card that only has 8GB. I made that mistake when I bought the 960 2GB back in the day, when "2GB is enough for 1080p" was the line, and within a single year I had problems with it. Never cheap out on VRAM.

My first GPU was an EVGA GTX 960 with 2GB of VRAM. It was a learning experience trying to run games with that card.

Got a free copy of Arkham Knight, which, besides being an awful port, I couldn't play at all because it went beyond my VRAM limits.
 

hlm666

Member
Yes, not being able to run games at ultra textures purely because of VRAM limitations is very saddening, but some people act like it's the end of the world.
Any idea what is going on with The Last of Us in the video from the OP? It shows per-process VRAM use and the 3070 is at about 3GB vs the 6800 at 9GB. Is this really a lack-of-VRAM issue or is something else going on here? If it were like the other VRAM issues you have shown, where games only use 80% of the capacity, you'd expect a similar unused percentage on both cards. The rest of the VRAM is in use, but how much of that is not the game process? I've never seen such a weird VRAM split; I must be dodging all the bullets here (i.e. shit games).
 

nkarafo

Member
My first GPU was an EVGA GTX 960 with 2GB of VRAM. It was a learning experience trying to run games with that card.

Got a free copy of Arkham Knight, which, besides being an awful port, I couldn't play at all because it went beyond my VRAM limits.

My eye-opener with this card was Resident Evil 7. The 960 GPU is way more powerful than what the base PS4 has. I should have been able to play the game with console settings at 1080p with no issues; I didn't even want better quality. But I couldn't even do that, because my VRAM was filled at those settings. I had to play at 900p or something for a smooth experience.

Basically, my GPU couldn't even be used properly because of the tiny VRAM bottleneck. The 4060 will have the exact same problem with 8GB in 2023.
 

Bojji

Member
Sad state of affairs where gamers have to upgrade their hardware because developers refuse to optimize for the majority market share of PC hardware.

& Nvidia scamming us with low-VRAM cards. In 2020 the 3070 was a beast, now it's reduced to a midrange card because of fucking VRAM....

Developers never did it in the past with 256MB, 512MB, 1GB, 2GB and 4GB cards either. Same goes for one-, two- and four-core CPUs. This is the state of PC gaming: most of the time devs make straight ports, and it's no surprise that games built around 16GB consoles use that amount of memory while 8GB cards shit themselves trying to render them.

It's fully on Nvidia that they made powerful cards (more powerful than the consoles) that started to suffer with current-gen-only games; they knew this from the beginning. They want you to buy 40-series GPUs, that's it.
 

Gaiff

SBI’s Resident Gaslighter
Sad state of affairs where gamers have to upgrade their hardware because developers refuse to optimize for the majority market share of PC hardware.

& Nvidia scamming us with low-VRAM cards. In 2020 the 3070 was a beast, now it's reduced to a midrange card because of fucking VRAM.... if it were 16GB it'd be a bloodbath in all of those comparisons.
I wouldn't say that. This is 2023; 8GB cards have been available for almost 10 years now. The 3070 with its 8GB frame buffer was always a scam. It should have been 12GB.
 

Rossco EZ

Member
Get a 6800 XT instead. 12GB will be better than 8GB but you'll still have issues. You just gotta get an 80-series card to stand a chance.
Hard to find in the UK at the moment for a decent price. I'll check again tonight as I'm ordering parts this week, but yeah, I haven't seen any, or again they seem a bit out of budget.
 

Thaedolus

Member
I kinda feel like 8GB really should be enough for 1080p, but it's also silly that cards these days only have 8GB when my 5+ year old 1080 Ti had 11GB.

That said, if devs can’t get games running on 16GB cards with 32GB system RAM smoothly, they should be fired.
 

Spyxos

Member
If you ask GAF, they'll tell you it's not enough, you'll need at least 32GB of VRAM in a year. You'll be fine with 8GB, 10GB or 12GB at 1440p until the next-gen consoles, as long as you don't expect to use every visual setting at max like a dumbass.
You should have at least as much VRAM as the new consoles, and that is 12GB and 12.5GB. That way you have peace for this console generation.
 

yamaci17

Member
Any idea what is going on with The Last of Us in the video from the OP? It shows per-process VRAM use and the 3070 is at about 3GB vs the 6800 at 9GB. Is this really a lack-of-VRAM issue or is something else going on here? If it were like the other VRAM issues you have shown, where games only use 80% of the capacity, you'd expect a similar unused percentage on both cards. The rest of the VRAM is in use, but how much of that is not the game process? I've never seen such a weird VRAM split; I must be dodging all the bullets here (i.e. shit games).
Could be a monitoring error. The game wouldn't function that well while only using that much VRAM.
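If you want to sanity-check a reading like that yourself, here's a minimal sketch using NVIDIA's NVML counters through the pynvml Python bindings (assuming an NVIDIA card with the pynvml package installed; overlays like Afterburner/RTSS read the same counters). It just prints total board usage next to per-process usage, and the per-process number can come back empty on Windows, which is exactly the kind of thing that produces a weird split like 3GB:

    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

    # Total board usage: the game plus the OS compositor, browsers, overlays, capture tools...
    mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
    print(f"board usage: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

    # Per-process usage: this is the number a "per-process VRAM" overlay tries to show.
    # On Windows it can be reported as None for some processes, which skews the reading.
    for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
        used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**30:.2f} GiB"
        print(f"pid {p.pid}: {used}")

    pynvml.nvmlShutdown()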

Regardless, I'm able to get a stable 45 fps lock with tight 1% lows (nearly perfect) at 1440p with high textures (only visual effects and volumetrics are reduced to low).



The 45 fps lock is because of my aging Zen+ CPU. There is still some spare GPU power.

the whole "you need ps2 textures to get playable framerates with 8 gb cards" is a hyperbole
 

Marlenus

Member
I'm not afraid to click left on settings to get 60fps.

And if that's not good enough I can wait until I upgrade in a year or three. Plenty of games get released that don't need top of the line hardware.

For people who purchased lower-tier GPUs this would be fine. Even the 3060 Ti, to a degree. What is not so fine is when a $600 GPU suffers the same fate in just two years from something that was entirely foreseeable and easily avoidable. If the 3070 Ti had been 16GB it would be a monster card that would last a good while, and it would have been a pretty good buy at $600.

The 3060 12GB makes it even worse as people who went that far down the stack now can get a much better gaming experience in the latest games than those who went with higher tier models. That should not really happen either.
 

SmokedMeat

Gamer™
Sad state of affairs where gamers have to upgrade their hardware because developers refuse to optimize for the majority market share of PC hardware.

& Nvidia scamming us with low-VRAM cards. In 2020 the 3070 was a beast, now it's reduced to a midrange card because of fucking VRAM.... if it were 16GB it'd be a bloodbath in all of those comparisons.

Nvidia loves it. They want everyone upgrading, hence why they cut their cards down.

No way would I be dropping almost a grand on a GPU with 12GB of VRAM in 2023. I don't care how good the fake-frame tech is.
 

Marlenus

Member
You should have at least as much VRAM as the new consoles, and that is 12GB and 12.5GB. That way you have peace for this console generation.

I would probably go for 16GB for 1440p to allow a little slack for the sub-optimal ports that are bound to occur.

4K should really be 20-24GB, and 1080p should be around 12GB, with 8GB being okay at the very bottom level.

This should then be fine until we get out of the cross-gen phase of the PS6 and Xbox Next generation, when requirements will probably increase again.
 

Bojji

Member
If you ask GAF, they'll tell you it's not enough, you'll need at least 32GB of VRAM in a year. You'll be fine with 8GB, 10GB or 12GB at 1440p until the next-gen consoles, as long as you don't expect to use every visual setting at max like a dumbass.

Great advice. If he listens to you he will hate you some time in the future.

12GB is, in my opinion, a good minimum for current-gen games at 1440p (and should be for the foreseeable future). But for higher resolutions and with RT it might not be enough.
 

SmokedMeat

Gamer™
My eye-opener with this card was Resident Evil 7. The 960 GPU is way more powerful than what the base PS4 has. I should have been able to play the game with console settings at 1080p with no issues; I didn't even want better quality. But I couldn't even do that, because my VRAM was filled at those settings. I had to play at 900p or something for a smooth experience.

Basically, my GPU couldn't even be used properly because of the tiny VRAM bottleneck. The 4060 will have the exact same problem with 8GB in 2023.

I played at 900p all the time with that card. I remember GTA V being another game that my VRAM couldn’t handle.

Thankfully the 1070 was a nice step up for me with 8GB of VRAM. To think the 4060 is still going to use the same amount of VRAM as a card from mid 2016.
 

Kenpachii

Member


I have the RTX 3070 and RTX 3060 Ti, both used for 1080p gaming, and as you can see in this video, 8GB of VRAM is not enough at all, even at such a low resolution. Don't even get me started on higher resolutions.

00:00 - Welcome back to Hardware Unboxed
01:25 - Backstory
04:33 - Test System Specs
04:48 - The Last of Us Part 1
08:01 - Hogwarts Legacy
12:55 - Resident Evil 4
14:15 - Forspoken
16:25 - A Plague Tale: Requiem
18:49 - The Callisto Protocol
20:21 - Warhammer 40,000: Darktide
21:07 - Call of Duty Modern Warfare II
21:34 - Dying Light 2
22:03 - Dead Space
22:29 - Fortnite
22:53 - Halo Infinite
23:22 - Returnal
23:58 - Marvel’s Spider-Man: Miles Morales
24:30 - Final Thoughts


All these games play on 8GB VRAM cards, so it is enough.
 

ChiefDada

Gold Member
Alas, the biggest question is: how come TLOU Part I runs perfectly on a system with 16GB of total memory, when on PC you need 32GB of RAM + 16GB of video RAM to even display textures properly?

 

GHG

Member
If you ask GAF, they'll tell you it's not enough, you'll need at least 32GB of VRAM in a year. You'll be fine with 8GB, 10GB or 12GB at 1440p until the next-gen consoles, as long as you don't expect to use every visual setting at max like a dumbass.

Not sure how anyone can watch the evidence in the video that serves as the basis for this discussion and come to that conclusion, but OK. Nobody serious has said 32GB will be required either.

Nvidia have sold many of their customers down the river with their low-VRAM cards, but part of me thinks that was all part of the plan.
 
Great advice. If he listens to you he will hate you some time in the future.

12GB is, in my opinion, a good minimum for current-gen games at 1440p (and should be for the foreseeable future). But for higher resolutions and with RT it might not be enough.

I need receipts, because by that dumb logic that you and most of GAF have, more than 60% of the PC community will no longer be able to play games on PC. How are you people so fucking stupid? Who has a PC and plays PC games with everything maxed, even on a high-end GPU? Who buys bad ports? Common sense seems to have gone down the drain here.


 

Kenpachii

Member
It's a way for NVIDIA to troll their customers and make them feel bad for their purchase so they feel forced to upgrade.

Exactly. Nvidia always plays the VRAM game when they've got no competition, so that through sponsorships they can force people to buy into the next generation again.
 

yamaci17

Member
I think it is funny how many of these are AMD-sponsored titles that look no better than titles from 2018.

That's a new all-time low for Hardware Unboxed, milking the cow to the max here.

Alas, the biggest question is: how come TLOU Part I runs perfectly on a system with 16GB of total memory, when on PC you need 32GB of RAM + 16GB of video RAM to even display textures properly?
16GB of RAM + 8GB of VRAM is enough to "display textures properly" with decently stable performance:

native 1440p



I get better results without recording; sadly I don't have a capture card, and recording also uses RAM, VRAM and GPU resources.

1440p, DLSS Quality, high textures, no hitches, smooth operation:

 

nkarafo

Member
I played at 900p all the time with that card. I remember GTA V being another game that my VRAM couldn’t handle.

Thankfully the 1070 was a nice step up for me with 8GB of VRAM. To think the 4060 is still going to use the same amount of VRAM as a card from mid 2016.

Yeah, I bet they will do the same thing they did with the 3060 line. They will release the 4060 Ti with 8GB, and once that stock gets sold they will release the non-Ti 4060 with 16GB or something, so everyone can feel guilty about whatever card they bought previously and feel forced to upgrade.

I just wish all those customers would decide to buy an AMD card after that experience.
 
Not sure how anyone can watch the evidence in the video that serves as the basis for this discussion and come to that conclusion, but OK. Nobody serious has said 32GB will be required either.

Nvidia have sold many of their customers down the river with their low-VRAM cards, but part of me thinks that was all part of the plan.

"play"

[screenshots: badly degraded textures on the 8GB card]


At least it runs I guess.


You were the one bashing them and calling them out for shilling for AMD, but when it comes to VRAM you're just going to suck their dicks like a good boy now, eh? I already disproved the video above in a previous post, and so did other users. But yes, bad ports are clearly making your point valid. I should just block you.

Ah yes, Hardware Unboxed, the most unreliable source of information. I instantly clicked the RE4 section because I knew it was BS. First of all, texture quality in RE does not change besides low/medium/high, of course, but the VRAM allocation on High does, as the name implies. It just allocates more VRAM. There is no difference between the 8GB and 6GB High settings.

A Plague Tale runs perfectly fine on 8GB of VRAM, and as for the rest of the big names he pulled out, like Callisto Protocol, TLOU, Hogwarts and fucking Forspoken: do I need to remind everyone that these are absolutely terrible ports? Also, shocking that maxing out settings for almost no visual benefit consumes a lot of VRAM.

You can make Skyrim consume 24GB of VRAM. Jesus, so much misinformation over and over.

A fucking reminder, every single time, for the oblivious people here: you can absolutely play games at 1080p, 1440p and even 4K if you use DLSS/FSR and don't max out dumb shit settings that consume A TON of VRAM for no visual benefit, such as shadows (I think in RE4 max shadows consume almost 1GB of VRAM and you'd need a 5000% zoom to notice the difference) or volumetric shit in clouds/fog/rays. Common sense, which these YouTubers have none of; GAF doomsayers need a reality check.
 

nkarafo

Member
"play"

[screenshots: badly degraded textures on the 8GB card]


At least it runs I guess.

What really grinds my gears is that texture quality with an 8GB VRAM card can be so awful in modern games. I mean, you had much better texture quality from games in the base PS4 generation with 2GB of VRAM, or even during the PS3/360 gen. The medium-quality textures that fill 8GB of VRAM in TLOU are literally below PS2 quality. The example in Harry Potter you provided looks like something out of the N64. Why the fuck are N64 textures gobbling up 8GB of VRAM? What is wrong with developers these days?
 

GHG

Member
You can kill any card with insane settings; some cards are not meant to play games at certain settings.

When you consider that VRAM is the only bottleneck present in these examples, it's a problem.

It's an embarrassing showing for Nvidia cards with this VRAM configuration (and similar ones). Essentially you're looking at having to drop the settings to medium in order to be playable, while an AMD card of similar overall ability can still run the game at high/ultra at good framerates without any problems.
 

Marlenus

Member
You were the one bashing them and calling them out for shilling for AMD, but when it comes to VRAM you're just going to suck their dicks like a good boy now, eh? I already disproved the video above in a previous post, and so did other users. But yes, bad ports are clearly making your point valid. I should just block you.

You made a claim but that does not 'disprove' anything. The video evidence is far far far stronger than your claims. The frame time graphs are there, the IQ comparison is there and the gameplay comparison is there. The evidence is right in your face.
 

SlimySnake

Flashless at the Golden Globes
How much does VRAM cost anyway?

How can consoles put 16GB of VRAM in a $500 console but Nvidia can't?

IIRC, the BOM on the GTX 580 was around $80 and Nvidia was charging over $500 for it. I refuse to believe that these GPUs are so much more expensive these days. Seems to me that Nvidia wants their insane profit margins to continue. They made what, $10 billion in profit last year? They can surely afford to add some more VRAM to these cards.
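For a rough sense of scale, here's a back-of-envelope sketch; the per-GB GDDR6 price in it is a made-up placeholder for illustration, not an actual quote from any supplier:

    # Back-of-envelope VRAM cost math. The per-GB price is a hypothetical assumption,
    # not a figure from this thread or from Nvidia/AMD.
    assumed_gddr6_usd_per_gb = 3.5      # placeholder spot-price assumption
    extra_gb = 16 - 8                   # going from an 8GB card to a 16GB card

    extra_bom = extra_gb * assumed_gddr6_usd_per_gb
    print(f"~${extra_bom:.0f} of extra memory on a $500-600 card under this assumption")
    # Caveat: doubling capacity can also mean a wider bus or a clamshell board layout,
    # which adds cost beyond the memory chips themselves.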
 

GHG

Member
You were the one bashing them and calling them out for shilling for AMD, but when it comes to VRAM you're just going to suck their dicks like a good boy now, eh? I already disproved the video above in a previous post, and so did other users. But yes, bad ports are clearly making your point valid. I should just block you.

Because AMD got it right in terms of VRAM, while Nvidia got it wrong with their non-flagship cards. No company is perfect; we can point out their respective flaws.

Block me if you want, if it helps you feel OK. It won't change the reality of the situation though.
 