
8 GB of VRAM is not enough even for 1080p gaming.

DeepEnigma

Gold Member
Really? Because I have a 2060 in a laptop and I was able to run Hogwarts Legacy at 1080p at 40-60 fps with a mix of high/ultra settings. In the likes of Halo Infinite I'm hitting frame rates in the 70s at max settings.
Both are cross-gen games built with 2013 hardware in mind.
 
Last edited:

Marlenus

Member
Enable RT and DLSS and see what happens.

One card is better at one thing, the other at another thing. You choose.

Nobody's fault but your own.

I bought a 3080; I have no illusions that the 10GB won't become a limitation in the future.

A Plague Tale was with RT, Hogwarts was with RT, Callisto was with RT, RE4 was with RT.

Steve even showed that DLSS to 1080p does not always fix it, e.g. the 3070 can't even handle 720p in some titles with RT on.
 

Stuart360

Member
Funnily enough, Hogwarts and TLOU both averaged around 7GB of VRAM usage at 1080p with High textures.
Resident Evil 4 averaged around 7.5GB of VRAM usage, and that was with the '6GB texture' option.

I still haven't seen a game use more than 8GB of VRAM at 1080p, and even if some games do start doing that, we can turn down settings if needed (I have 11GB of VRAM by the way).
 
I don't know about some of those games but I had no issues with Dead Space, Plague Tale and Resident Evil 4 on my RTX 2060 Super at 1080p, high/very high settings, no RT, 60 FPS.
 

Kenpachii

Member
When you consider VRAM is the only bottleneck present in these examples, it's a problem.

It's an embarrassing showing for Nvidia cards with this VRAM configuration (and similar). Essentially you're looking at having to drop the settings to medium for the game to be playable, while the AMD card of similar overall ability can still run the game at high/ultra at good framerates without any problems.

Then don't buy 8GB cards, simple as that. Nvidia dropped a 24GB 3090 for the VRAM people; if it's too expensive, buy AMD or wait. Nobody forces you to buy anything.

VRAM has always been a limitation on Nvidia GPUs when Nvidia sits at the top, because Nvidia is then competing against itself. They are not idiots and won't release another 1000-series.
 

Wvrs

Member
Both are cross-gen games built with 2013 hardware in mind.
Is there any game in the video in the OP that isn't cross-gen, or couldn't have been except that it just wasn't a focus for them? With PS3 to PS4 I felt like I was playing new games that simply weren't possible on earlier hardware, but with PS4 to PS5 it just feels like higher resolution, higher framerate and a bit more detail.
 

Kenpachii

Member
A Plague Tale was with RT, Hogwarts was with RT, Callisto was with RT, RE4 was with RT.

Steve even showed that DLSS to 1080p does not always fix it, e.g. the 3070 can't even handle 720p in some titles with RT on.

My point is, the 3070 is better at raster performance (because of DLSS) and has better RT performance, while the 6800 has more VRAM and is worse at everything else. Different cards do different things. You choose what you want. Not that hard.

Nvidia wants you to have VRAM issues because it makes you upgrade to their next series of GPUs. That's the whole point.
 
Last edited:

GHG

Member
Then don't buy 8GB cards, simple as that. Nvidia dropped a 24GB 3090 for the VRAM people; if it's too expensive, buy AMD or wait. Nobody forces you to buy anything.

VRAM has always been a limitation on Nvidia GPUs when Nvidia sits at the top, because Nvidia is then competing against itself. They are not idiots and won't release another 1000-series.

You don't have to tell me. I recently upgraded to a 4090 from one of their 8GB cards because it was starting to struggle. This isn't a problem for me; I tend to take my own advice, especially when the situation is as obvious as this one.
 

ToTTenTranz

Banned
No matter how I look at this, it seems like really bad business practice to release a game which 80% of your target audience can't play.
Good luck with unrealistic RAM requirements; the majority of PC gamers are just going to play other things.


Nonsense. No one is losing any audience. Those games will play perfectly fine on your 8GB VRAM GPU.
You just need to play at significantly lower image quality settings than the people with 12 to 16GB of VRAM.

And if that bothers you too much, next time just listen to all the people saying "X amount of RAM is too little for the medium-to-long term" and don't buy a GPU with a VRAM amount designed for planned obsolescence.

Regardless, most people just go with the game's recommended settings and call it a day. They won't even notice how far the IQ is from the max settings.
 

64bitmodels

Reverse groomer.
If anything, Linux should have a lower idle VRAM footprint than Windows, though I have no idea how stuff works on Linux.
OK I read your post more carefully and came up with a similar guide for Linux

kill hardware acceleration for Steam and Discord as usual
use a window manager such as dwm or i3 rather than a desktop environment; those consume far less VRAM (and resources in general) than a desktop environment, especially DEs like GNOME and KDE. They also don't bloat up with usage as much as DEs and Windows do. i3 also has a shortcut to restart i3 in place (Mod/Windows key + Shift + R; a minimal config sketch follows this list)

and of course kill any background processes using up VRAM as usual, though there are likely to be far fewer of them on Linux
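
A minimal sketch of the relevant i3 config lines for that in-place restart shortcut, assuming the stock $mod variable (Mod4, i.e. the Super/Windows key); adjust to your own config:

# ~/.config/i3/config (fragment)
set $mod Mod4
# restart i3 in place: reloads the window manager without ending the session
bindsym $mod+Shift+r restart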
 
Last edited:
You can play TLOU with high textures at 1440p or 4K DLSS Performance. 1080p is a cakewalk.



- Disable hardware acceleration for Discord + Steam if you use them (the most critical ones)

- Untick 'GPU accelerated rendering in web views' in Steam's settings



- Open CMD and use "taskkill /f /im dwm.exe". Don't worry, it won't kill DWM for good, it will just reset it. This reduces dwm.exe's VRAM usage if your system has been on for a while. You can create a batch script if you want to and keep it on your desktop (a minimal sketch follows after this list); run it before running a game.
- Go to Task Manager, Details tab, Select columns, and tick "Dedicated GPU memory". Observe what gobbles up your VRAM and turn off everything you can.
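
A minimal sketch of the batch script mentioned above (the file name is just for illustration); it wraps the same taskkill command and may need an elevated (admin) prompt on some systems:

@echo off
rem reset_dwm.bat - resets the Desktop Window Manager to reclaim VRAM it has accumulated
rem dwm.exe restarts itself automatically; run this before launching a game
taskkill /f /im dwm.exe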

Ideally you can have:
a) 150-200 MB idle VRAM usage at 1080p / single screen
b) 250-350 MB idle VRAM usage at 1440p / single screen
c) 400-600 MB idle VRAM usage at 4K / single screen


If you can get your idle VRAM usage to 300 MB at 1440p, you can use 7.4 GB worth of texture/game data and run the game smoothly without problems (evidenced by the video, and even the recording itself takes VRAM).

- If you're on W11 and do not use Widgets, uninstall it, as it uses 100 to 200 MB of VRAM.
Open PowerShell with admin rights:
winget uninstall "windows web experience pack"
If it gets installed again through the Microsoft Store, disable it via the Group Policy Editor (a hedged registry sketch follows after this list).

- Or upgrade if you have the money. If you can't reduce your idle VRAM usage below 500 MB, or simply don't want to, devs won't cater to your multitasking needs.
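
For the group policy route, here is a hedged sketch of the registry value commonly cited as the backing for the "Allow widgets" policy; the key and value names are an assumption on my part, so verify them against your Windows 11 build before relying on it (run from an elevated prompt):

:: disable_widgets.bat - assumed registry equivalent of the "Allow widgets" group policy
reg add "HKLM\SOFTWARE\Policies\Microsoft\Dsh" /v AllowNewsAndInterests /t REG_DWORD /d 0 /f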

I warned everyone back in 2020 that you would have to sacrifice textures / multitasking ability. I bought a 3070 at MSRP at launch knowing that I'd have to turn down from Ultra; however, you can still get decent/acceptable image quality out of it.

I will keep using my GPU with the aforementioned tricks, as long as I'm not forced into PS2 textures.

Somehow when I read all this I'm just happy I can turn on my PS5 and get the optimal (maybe not the best, if you have a 4090) experience. No fuss. Just turn it on and go. I have a 3070 on PC and it's OK for most games. I'm too old now to fiddle with settings endlessly, so it's nice to be able to just sit down, power on, and you're done.
 
Users should be more demanding with Nvidia instead of doing damage control with these statements. It is clear that 8 GB of VRAM was low even at launch for some graphics cards released in recent years (even a PS4 had 8 GB in 2013...). Nvidia knows this, and it is more than possible that it does this on purpose to force the purchase of new graphics cards over this very VRAM issue.
 

RobRSG

Member
Maybe because a better I/O system comes into play on the console, and that's compensated for by increased memory requirements on PC.
Maybe, but only if you believe stuff like the second GPU hidden inside the Xbox One PSU.

This kind of conjecture will get you upgrading to whatever they want you to buy next.
 

Bojji

Member
8GB of VRAM is no longer good enough.

RTX3070 8GB vs 6700XT 12GB





I have had 8GB VRAM GPUs since the 1070 (and then a 2060S, 3070, no GPU, 5700XT, 3060 Ti), and for many years this was more than enough, but even back in 2020 when I was buying the 3070 I knew from the start that this would become a problem very soon. After buying a gaming GPU again in November last year, I started to see how many games have problems with this amount of memory at 4K, even with some DLSS/FSR reconstruction. And with newer games, even lower resolutions are starting to have problems.

People denying reality ITT are funny...
 

Rentahamster

Rodent Whores

To be a little more clear, these are probably the prices now, which raises the question of why Nvidia can't make new versions of the 3070 with more VRAM, or at least let their AIB partners do that like they used to back in the day. A couple of years ago, when the 3070 released, prices were different. However, even at 2019 prices, it probably wouldn't have been that hard to include more VRAM from the start, especially considering how high the markups on these cards were/still are. They were probably focusing their attention on crypto miners and taking advantage of the huge demand spike from COVID. A perfect storm of profit seeking and (probably) planned obsolescence. It would have been in Nvidia's self-interest to milk this for all it was worth. Good for them, bad for us.

(2019 article)

Electronic component dealers list various Micron GDDR5 and GDDR6 chips, with pricing for 2,000 units. From these it can be seen that GDDR6 is currently much more expensive than GDDR5. Going from 8 to 16 gigabytes of GDDR6 memory would probably add around 100 to 150 US dollars.

The mentioned prices correspond to a purchase quantity of 2,000 pieces. Manufacturers of video cards are likely to buy larger quantities, which means they could get the parts cheaper. 3dcenter.org estimates the possible discount at 20 to 40 percent, so the prices in the table below are only estimates.
 
Last edited:

ToTTenTranz

Banned


Yes, volatile memory is really cheap nowadays. Samsung is closing entire production lines because of how many unsold memory chips they're storing away in warehouses, and their operating profit dropped something like 96% compared to the same quarter in the previous year.


The 4060 / 4060 Ti, assuming they're using an 8-channel 256-bit bus, had better come equipped with 16GB by default, or the thing is going to get pretty bad results in 2023 games.
AMD's Navi 32 cards are definitely coming with 16GB, as they're replacing Navi 21's 6800/6900 XT.

I think we really need non-binary (non-power-of-two capacity) graphics memory. DDR5 already allows for it, but for graphics we'll need to wait until GDDR7 or later to get it.
 

Marlenus

Member
Yes, volatile memory is really cheap nowadays. Samsung is closing entire production lines because of how many unsold memory chips they're storing away in warehouses, and their operating profit dropped something like 96% compared to the same quarter in the previous year.


The 4060 / 4060 Ti, assuming they're using an 8-channel 256-bit bus, had better come equipped with 16GB by default, or the thing is going to get pretty bad results in 2023 games.
AMD's Navi 32 cards are definitely coming with 16GB, as they're replacing Navi 21's 6800/6900 XT.

I think we really need non-binary (non-power-of-two capacity) graphics memory. DDR5 already allows for it, but for graphics we'll need to wait until GDDR7 or later to get it.

The 4060 is going to be a 128-bit bus.

N32 will be 16GB for the 256-bit parts, but could be 32GB on that bus (each 32-bit channel pairs with one 2GB GDDR6 chip, or two in clamshell mode). That is massive overkill for its performance tier though, so AMD might stick with 16GB for the 7800 XT. They have a decision to make for the 7700 XT, which was probably going to use 3 MCDs attached to an N32 GCD for a 192-bit bus and 12GB of VRAM. They can still go that route, but pricing may need to be nearer to $400 than to $500. The other option would be to stick with 16GB and just use a cut-down N32 for this tier.

N33 with a 128-bit bus will be the curious one. I would expect it to come in at 8GB, but it could be 16GB, and that might allow them to stay nearer to $350 at launch.
 

Kataploom

Gold Member
Depends on how much you paid.

If I pay $500 for a 1080p experience, should I have to drag settings to the left? I feel like I shouldn't. Not for texture quality at least.
But high and ultra texture quality are mostly for 4K right now; unless it's a shit half-baked port like TLOU, you should be good on medium at 1080p IMO.

BTW, 8GB for 1080p + DirectStorage should even be overkill, considering consoles have barely 13GB that they have to share with non-graphics stuff (so around 8GB to 10GB max for graphics in most games) for freaking 4K.

But the reality is that the video is looking at games running at way higher settings than on consoles, at double the framerate, just at 1080p (the most popular PC resolution anyway).

I wouldn't touch Nvidia cards for a while unless I had money to throw at their top end without breaking the bank. Their RT advantage is being harmed by the lack of VRAM on their mid-range cards. I've had VRAM issues on only two cards: an integrated Vega 8 and a 1060 3GB; both could have given way more with just 1GB extra, but often sat at 70% to 80% GPU usage with 99% of the VRAM allocated... I can understand AMD capping an integrated GPU, but a dedicated one? Fuck off Nvidia, not going back to the dark days.
 

LostDonkey

Member
So for my next build I need at least 64GB of system RAM and 32GB of VRAM.
Users should be more demanding with Nvidia instead of doing damage control with these statements. It is clear that 8 GB of VRAM was low even at launch for some graphics cards released in recent years (even a PS4 had 8 GB in 2013...). Nvidia knows this, and it is more than possible that it does this on purpose to force the purchase of new graphics cards over this very VRAM issue.

Excuse me sir, but that's my avatar.
 

winjer

Gold Member
We also have to consider that there was a time when Nvidia didn't enforce an amount of VRAM on its GPUs.
For example, this meant the 8800 GT had a 256MB version, another with 512MB and another with 1GB. The standard was 512MB.
But AIBs could tweak the amount to hit more price points.
Up through Kepler there were GPUs with varying amounts of VRAM; some AIBs made GTX 680s with 4GB.
If it weren't for Nvidia limiting what AIBs can put on their cards, we could now have variants of the 3070 with 16GB of VRAM.
 
Last edited:

Kataploom

Gold Member
If you're playing at 1080p you might as well get a console.
That's if you only play console ports and don't care about reduced settings or the lack of 60 fps. Remember that even many console players play on 1080p monitors (you can see many Switch and XSS setups like that), so resolution is the last thing they'd care about as long as it hits the minimum. The games in the video are running at way higher settings and at double the framerate of the console Quality modes.
 

Crayon

Member
Yea, anyone remember games being optimised? Fun times.

Optimized for what, though? The games have medium/high texture settings to drop to if you want to do 1440p on an 8GB card, and it works. A card can't play ultra settings forever; there has to be a bottleneck somewhere, eventually.
 

GHG

Member
We also have to consider that there was a time when Nvidia didn't enforce an amount of VRAM on its GPUs.
For example, this meant the 8800 GT had a 256MB version, another with 512MB and another with 1GB. The standard was 512MB.
But AIBs could tweak the amount to hit more price points.
Up through Kepler there were GPUs with varying amounts of VRAM; some AIBs made GTX 680s with 4GB.
If it weren't for Nvidia limiting what AIBs can put on their cards, we could now have variants of the 3070 with 16GB of VRAM.

Yep, the two 660s I had in SLI were the 6GB variants, which actually matched the bus width (meaning all the VRAM was available at full speed). As a result, those two puppies paired together far outlasted the 970 at a similar cost.

The variations in terms of VRAM along with additional options like SLI meant you could really get creative and shop around for the best deal for any given use case.

Now it's just take what's given, and as such anyone not able to shop at the very top end gets shafted on the Nvidia side of things.

I'm pretty surprised this is an issue AGAIN when this has been happening since gaming existed.

Older people should know better by now.

Well at least it makes it 100% evident who the PC gaming old timers are.
 
Last edited:

Trunim

Member
8GB minimum VRAM for Star Wars is bullshit. I will be able to play that game just fine on my 6GB. People that are so stubborn and happy about it are just trying to justify their $700 purchase because they've got no games to put their GPU to the test.
 

yamaci17

Member
OK I read your post more carefully and came up with a similar guide for Linux

kill hardware acceleration for Steam and Discord as usual
use a window manager such as dwm or i3 rather than a desktop environment; those consume far less VRAM (and resources in general) than a desktop environment, especially DEs like GNOME and KDE. They also don't bloat up with usage as much as DEs and Windows do. i3 also has a shortcut to restart i3 in place (Mod/Windows key + Shift + R)

and of course kill any background processes using up VRAM as usual, though there are likely to be far fewer of them on Linux
yes, you got the gist of it

here's what happens with regular casual users, the famous trio: Chrome (or any Chromium-based, hardware-accelerated browser) + Discord + Steam

[screenshot: VRAM usage with hardware acceleration enabled]


hardware acceleration off, and a whole lot of VRAM is saved...

[screenshot: VRAM usage with hardware acceleration disabled]


[4K DLSS Performance, btw]
Again, what do I know? Not every user will be inclined to sacrifice multitasking capabilities (in this case you will still be able to use those apps, but they will be laggier since you run them on the CPU).

Most people are simply unaware that a healthy chunk of their GPU's memory is used by Windows and other apps. What is worse, the more hardware-accelerated software you run, the more Windows itself uses (not just the software): DWM.exe gets bloated, something to do with how these applications interact with the desktop compositor, I'd have to guess.

It is still unacceptable that you can't run hardware-accelerated software with peace of mind on such GPUs, but I'm glad the option to disable it exists.
 
Last edited:

SmokedMeat

Gamer™
That’s why I love consoles. You just slide in the disc and play. It costs a lot of money and a lot of time (time is more valuable than everything else) to play a game.

That’s great, but anyone can have that same console experience of not being able to do anything about the way a game is.
 

GHG

Member
Worse than that, for a solution they'll cough up another $1k+ to the company that duped them in the first place.

I think it's a stretch to say they were "duped". This isn't a 970 situation where a portion of the VRAM isn't available at advertised speeds; it is very much what it says on the tin.

The problem is a combination of ignorance (which in some cases is actually intentional, because a person has convinced themselves they must purchase a particular product instead of saving a bit more and/or looking at alternatives) and the poor advice being given in the PC building/gaming community, which has been rampant in the last couple of years.

On an unrelated note, at least it's become clear why there have been so many reports of stuttering and "bad textures" or texture pop-in in recent releases. Granted, there are some titles that will stutter even on a 4090 (which is almost always due to CPU bottleneck issues), but I'd be willing to bet most of those reports come down to people running up against the VRAM wall. The fact that the 6800 is so smooth in these tests, along with the respective VRAM usage stats, tells us exactly what the issue is. Developers simply need to be honest and up the VRAM recommendations for their games where appropriate.
 

bender

What time is it?
yes, you got the gist of it

here's what happens with regular casual users, the famous trio: Chrome (or any Chromium-based, hardware-accelerated browser) + Discord + Steam

[screenshot: VRAM usage with hardware acceleration enabled]


hardware acceleration off, and a whole lot of VRAM is saved...

[screenshot: VRAM usage with hardware acceleration disabled]


[4K DLSS Performance, btw]
Again, what do I know? Not every user will be inclined to sacrifice multitasking capabilities (in this case you will still be able to use those apps, but they will be laggier since you run them on the CPU).

Most people are simply unaware that a healthy chunk of their GPU's memory is used by Windows and other apps. What is worse, the more hardware-accelerated software you run, the more Windows itself uses (not just the software): DWM.exe gets bloated, something to do with how these applications interact with the desktop compositor, I'd have to guess.

It is still unacceptable that you can't run hardware-accelerated software with peace of mind on such GPUs, but I'm glad the option to disable it exists.

Ellie is not amused by the lack of hardware acceleration when web browsing.

ellie.jpg
 