
8 GB of VRAM is not enough even for 1080p gaming.

BootsLoader

Banned
Except that's a myth, and console games also release in shitty, unfinished states. Only first-party studios, who have all the support, time, and help they need, release optimized games. Even then, as seen on Switch, games can still be complete and utter shit.
Nah, you're wrong. Maybe some games are released in a bad state, but there's no way it's the same shitty situation as on PC. There are also many third-party games that play well on consoles.
They have to optimize them because there's no other way for them. The good thing about optimizing games on console is that you only have to optimize for one console, and your game is optimized for all consoles (of the same company). A PC, on the other hand, is a highly variable machine, so you can't optimize that easily. It takes more time and resources to do so.
 

BootsLoader

Banned
The thing is: if people were to play games at console settings and framerates, a lot of complainers wouldn't be complaining about lack of VRAM or unoptimized ports. To be fair, most of the PC complainers were those trying to enable RT, expecting it to run with no issues just because their other games do.
I understand what you mean, but for me comfort and convenience are the top priority. I consider gaming my "relaxing time", and in my relaxing time I just want to… relax, not think about TXER47900 VRAM bottlenecks, Nvidia, API updates, or other issues.
 

Hydroxy

Member
Meanwhile, I'm here coping with my 4GB 1650 laptop, but I play on medium settings with FSR, so it works fine. It is, however, disappointing that the upcoming 4050 mobile has only 6GB while the 4060 mobile has only 8GB.
 

Patrick S.

Banned
I've always been on Nvidia, with a very short excursion to an AMD R9 290 blower jet.

I have a 3080 that was marketed as a monster card that, together with DLSS, would be a beast for ages. Now it turns out that, because of the VRAM situation, the card is basically already obsolete.

I'm sitting out this GPU generation, and my next card will be AMD or Intel, if they still make GPUs by then. Meanwhile, I'm buying fresh AAA food for my PS5 and my Nintendo Switch, while the PC gets the scraps, like Barotrauma for €11 the other day and Elite Dangerous Odyssey for €13 yesterday. I'm fed up with the money-burning machine that is PC gaming.
 

gamer82

Member
RIP my 1060 with 3GB of VRAM. I upgraded my RAM, my CPU, and my PSU, and I had intentions of getting a graphics card; then along came the PS5, and now I'm waiting to hear about the new GPUs. I'm hardly playing lately, so no chance of getting a graphics card anytime soon, not at those prices, but hey, at least I have Windows 11 now. :messenger_beaming:
 

SNG32

Member
It boils down to shitty ports as well. Hogwarts Legacy runs badly across the board, on both consoles and PC. I know people shouldn't expect 4K ultra ray tracing at 60 fps on 8GB of VRAM, but you should be able to get stable frame rates at medium-to-high settings at 1440p. The frame rates are trash even at low settings, and that's with a decent mid-range GPU.
 

Loxus

Member
This is the perfect time for AMD to unveil their 16-18GB 7700XT/7800XT, and poke some fun at Nvidia.

Literally the perfect time to be taking marketshare away from Nvidia, and they’re not capitalizing on it.
Funny thing is, they did. Except for the 7700XT/7800XT.
AMD ain't playing around.

Building an Enthusiast PC
More Memory Matters
Without enough video memory your experience may feel sluggish with lower FPS at key moments, more frequent pop-in of textures, or – in the worst cases – game crashes. You can always fine tune the in-game graphics settings to find the right balance of performance, but with more video memory you are less likely to have to make these compromises. For this enthusiast build, we recommend graphics cards with at least 16GB of video memory for ultimate 1440p and 4K gaming. For more mid-range graphics that are targeting 1440p, AMD Radeon™ offers 12GB GPUs that are excellent for QHD displays.

Peak Memory Usage in Newer Games
Tested with Radeon™ RX 7900 XTX at 4K Ultra Settings with RT on and off.

[charts: peak VRAM usage in recent games]
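As a rough illustration of why the quoted guidance adds up, here is a back-of-envelope sketch of texture memory. The texture count and the uncompressed RGBA8 format are illustrative assumptions, not measurements from any game:

```python
# Back-of-envelope VRAM math for textures (illustrative assumptions only).

def texture_mib(width, height, bytes_per_texel=4, mipmapped=True):
    """Approximate size of one texture in MiB.

    bytes_per_texel=4 assumes uncompressed RGBA8; a full mip chain
    adds roughly one third on top of the base level.
    """
    size = width * height * bytes_per_texel
    if mipmapped:
        size = size * 4 // 3
    return size / (1024 ** 2)

one_4k = texture_mib(4096, 4096)
print(f"one 4K RGBA8 texture with mips: {one_4k:.1f} MiB")   # ~85.3 MiB

# Keeping ~100 such textures resident already eats ~8.3 GiB,
# before buffers, render targets, BVH data for RT, etc.
print(f"100 textures: {100 * one_4k / 1024:.1f} GiB")
```

Real games use block-compressed formats (BC7 is 1 byte per texel), which is exactly the kind of optimization work the thread keeps arguing about.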
 

64bitmodels

Reverse groomer.
Funny thing is, they did.
AMD ain't playing around.

[quotes AMD's "More Memory Matters" blurb and VRAM charts from the post above]
They barked, but they didn't bite... He was asking for a 7700XT/7800XT, which would be more powerful than the previous generation and offer a better alternative to Nvidia's low-VRAM cards. Right now they're just saying "look at how much more VRAM our GPUs have!"
 

rodrigolfp

Haptic Gamepads 4 Life
That's why I love consoles. You just slide in the disc and play. It costs a lot of money and a lot of time (and time is more valuable than anything else) to play a game on PC.
Play with 100~200+ ms of input lag, deadzones, shitty IQ, lower frame rates, etc., while paying $70, and there's nothing you can do about it other than play the PC version.
 

SmokedMeat

Gamer™
Funny thing is, they did. Except for the 7700XT/7800XT.
AMD ain't playing around.

[quotes AMD's "More Memory Matters" blurb and VRAM charts from the post above]

Waking people up is good, but they should’ve been ready to roll out their 4060/4070 competitors.

The 4060 especially is a giant joke in 2023, and only an uneducated fool would buy one from Jensen.
I guess AMD's looking to sell off more 6000-series cards, and the reality is 6700XT/6800XT buyers don't really need to upgrade the way 3060 Ti/3070/3070 Ti users do.
 

Loxus

Member
They barked, but they didn't bite... He was asking for a 7700XT/7800XT, which would be more powerful than the previous generation and offer a better alternative to Nvidia's low-VRAM cards. Right now they're just saying "look at how much more VRAM our GPUs have!"
This is AMD basically confirming more VRAM is better for the 7700XT/7800XT.

Leaks confirm at least 16GB for them.
I would assume 7700XT - 12GB (3 MCD)/192-bit and 7800XT - 16GB (4 MCD)/256-bit.
1 MCD is 4GB
4 MCD = 16GB

AMD's RDNA 3 Graphics

Navi32

  • gfx1101 (Wheat Nas)
  • Chiplet - 1x GCD + 4x MCD (0-hi)
  • 30 WGP (60 legacy CUs, 7680 ALUs)
  • 3 Shader Engines / 6 Shader Arrays
  • Infinity Cache 64MB (0-hi)
  • 256-bit GDDR6
  • GCD on TSMC N5, ~200 mm²
  • MCD on TSMC N6, ~37.5 mm²
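The MCD arithmetic above can be sketched in a few lines. The 2GB-per-chip GDDR6 and 16MB-of-cache-per-MCD figures are assumptions taken from the Navi32 specs quoted in this post, not confirmed numbers for the unannounced cards:

```python
# Sketch of the MCD math from the post above (RDNA 3 chiplet layout).
# Assumes one 2 GB (16 Gbit) GDDR6 chip per 32-bit channel and
# 16 MB of Infinity Cache per MCD, per the quoted Navi32 specs.

def navi3x_memory(mcds, chip_gb=2, cache_per_mcd_mb=16):
    bus_bits = mcds * 64          # each MCD provides a 64-bit interface
    chips = bus_bits // 32        # one GDDR6 chip per 32-bit channel
    return {
        "bus_bits": bus_bits,
        "vram_gb": chips * chip_gb,
        "infinity_cache_mb": mcds * cache_per_mcd_mb,
    }

print(navi3x_memory(4))  # 4 MCDs: 256-bit, 16 GB, 64 MB cache
print(navi3x_memory(3))  # 3 MCDs: 192-bit, 12 GB, 48 MB cache
```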
 

64bitmodels

Reverse groomer.
Come on now, I know that PC is master race blah blah blah, but when it comes to optimization and convenience, consoles are unmatched, you gotta admit that. It’s a fact.
First of all, when was the last time you heard someone unironically say "PC master race"? Probably in the prehistoric ages. Why do you console guys still think we believe in that stupid PCMR meme?

Second, everyone's admitted that since 2020. Even three years later, no PC can match the price-to-performance of the Xbox Series X, PS5, or even Series S. But shoving it in our faces and going on about how GLAD you are that you have a console and everything's so simple, in a PC-oriented thread... everyone can SEE what you're doing. Just because you have more convenience doesn't suddenly make consoles a smarter purchasing decision, and there's no reason to come waltzing in here saying "SO GLAD I GOT A CONSOLE, JUST PUT IN DISC AND PLAY!!!". We'll simply get cards with more VRAM and move on with our lives.
 

BootsLoader

Banned
Play with 100~200+ ms of input lag, deadzones, shitty IQ, lower frame rates, etc., while paying $70, and there's nothing you can do about it other than play the PC version.
Please elaborate on this. I don't have those problems on console. Give me some examples.
 

BootsLoader

Banned
[quotes 64bitmodels' reply above]
Master race is humor. Maybe my humor is bad.
I said why I love consoles, and suddenly it bothers you? If you like PC more, play on PC. I didn't say it's bad in any way.

Different people have different opinions. I shared mine; I didn't know I was insulting you.
 

SlimySnake

Flashless at the Golden Globes
[screenshot: Cyberpunk 2077 with path tracing using ~9GB of VRAM]


Path tracing enabled, 1440p, DLSS 3. BRO, I thought I needed 12 to 16 GB minimum for next-gen? What's going on?

Fucking dumbasses.
Exception, not the rule. Hardware Unboxed covered almost a dozen games, and almost all of the new ones had the same problem. Cyberpunk is technically a game from 2020.

It does look like adding path tracing almost doubled its VRAM usage, from 5GB to 9GB.
 

64bitmodels

Reverse groomer.
I said why I love consoles and suddenly it bothers you?
In a vacuum I wouldn't care, but there's been a recent trend of console players on GAF entering negative PC threads and just... explaining why they love the convenience of having a console. It's been happening since these shitty PC ports went into overdrive in 2022.

It's clearly done to get a rise out of people and stir up console warring; otherwise, people who play on PS5/Xbox wouldn't give a single flying fuck about what's happening on PC. That's why I'm so critical when I see posts like that in a thread like this: it's fucking bait meant to incite arguments.
 

The Cockatrice

Gold Member
Hardware Unboxed covered almost a dozen games

Half of them ran fine on 8GB, and the ones that didn't were bad ports. From now on, after all the hate CDPR got for how badly CP2077 ran, if any game without ray tracing demands more VRAM than a fully path-traced, open-world game, I'll give it my bad-port seal.
 

rodrigolfp

Haptic Gamepads 4 Life
Please elaborate on this. I don't have those problems on console. Give me some examples.
God of War 2018, the RE remakes, RDR2... Or every game still locked to 20~30fps without an option for native resolution, with forced shitty Vsync pushing the lag way up.
 

Spyxos

Gold Member
[screenshot: Cyberpunk 2077 with path tracing using ~9GB of VRAM]


Path tracing enabled, 1440p, DLSS 3. BRO, I thought I needed 12 to 16 GB minimum for next-gen? What's going on?

Fucking dumbasses.
Did you get lost in the woods? Wrong game: there was a ray tracing patch, not a texture resolution patch. And the game is old.
 

Bojji

Member
Half of them ran fine on 8GB, and the ones that didn't were bad ports. From now on, after all the hate CDPR got for how badly CP2077 ran, if any game without ray tracing demands more VRAM than a fully path-traced, open-world game, I'll give it my bad-port seal.

Most games released on PC in the last few years were classified as "bad ports" (at least at launch). Shader stuttering and now high VRAM requirements... "Bad ports" are the norm.

In your screenshot, Cyberpunk required more than 8GB of memory (and even with PT, the game has the same shitty textures it had in 2020)...
 

Kataploom

Gold Member
I understand what you mean, but for me comfort and convenience are the top priority. I consider gaming my "relaxing time", and in my relaxing time I just want to… relax, not think about TXER47900 VRAM bottlenecks, Nvidia, API updates, or other issues.
Those are barely any worry on PC, actually; the problem is so absurdly magnified that it makes people think this is the daily life of PC gamers lol. That's why these frequent waves of bad day-one ports get talked about so much: it was never this common. Even then, most problems hit people maxing out games way beyond console settings or chasing very high frame rates that some engines were never designed for, even today.

I've seen people complaining that some game won't let them go "above 90fps" and that, for them, makes it a "bad port". Nothing to do with the day-one fiascos we've been having recently, though.

My experience with PC is mostly "turn it on and play whatever from my store". Sometimes I have to mod something, but that's mostly for stuff like the usual Japanese port shipping without Japanese VA or without a 60 fps mode, things you would have to eat as-is on console with no possibility of changing them anyway.
 

BootsLoader

Banned
[quotes the replies from 64bitmodels, rodrigolfp, and Kataploom above]
Ok,
ok,
and ok.
 

Mr Moose

Member
PS4 games have expiration dates? We are talking about gaming overall. Plus, RDR2 and the RE remakes on current gen (RE4R) still have 200+ ms of lag and the deadzones (the REs).
They don't use 8GB of VRAM (the PS4 games; I think the most for PS4 is 5GB, and Pro 5.5GB?), they aren't $70, and most of those listed don't have native current-gen versions (except Resident Evil). Input lag is kinda crap in some, though.
 

rodrigolfp

Haptic Gamepads 4 Life
They don't use 8GB VRAM (the PS4 games, I think most for PS4 is 5? And Pro 5.5?), aren't $70 and most of those listed don't have native current gen versions (except Resi Evil). Input lag is kinda crap in some though.
I didn't say they do, or that they all have all of those problems I mentioned at the same time.

If RE4R doesn't use more than 8GB of VRAM on PS5 and XSX, then the ports are even worse than we knew, because they could and should.

Games without native current-gen versions are another problem consoles have.

He asked for some examples. If I were to list every game with problems never fixed on consoles, oh boy...
 

SlimySnake

Flashless at the Golden Globes
Half of them ran fine on 8GB, and the ones that didn't were bad ports. From now on, after all the hate CDPR got for how badly CP2077 ran, if any game without ray tracing demands more VRAM than a fully path-traced, open-world game, I'll give it my bad-port seal.
Almost all of the latest games have this issue. I've been bitching about it for the last few months. Gotham Knights, Forspoken, Callisto, Hogwarts, Witcher 3, and RE4 all have really poor RT performance. Not all of them are VRAM-related, like TLOU, but a lack of VRAM definitely doesn't help.

PCs will NEVER get optimized ports. You can go back 2-3 generations and every PC port released with issues. PCs are meant to brute-force through those poor optimizations, and these games do exactly that... unless you turn on RT, which increases VRAM usage, or enable ultra textures, and boom, those same cards crash and simply don't perform according to their specs.

This will only continue as devs release unoptimized console ports. Yes, console ports. You think TLOU is properly optimized on the PS5? Fuck no. 1440p 60 fps for a game that at times looks worse than TLOU2? TLOU2 ran at 1440p 60 fps on a 4-teraflop Polaris GPU with a Jaguar CPU. The PS5 has a far better CPU and a 3x more powerful GPU, yet all they managed to do was double the framerate. Dead Space on the PS5 runs at an internal resolution of 960p. That is not an optimized console game, I can promise you. PCs, just like consoles, are being asked to brute-force things, and the AMD and Nvidia cards with proper VRAM allocations can do exactly that.

Respawn's next Star Wars game is a next-gen exclusive. FF16 is a next-gen exclusive. Both look last-gen, but I can promise you they will not be sitting at 5GB of VRAM usage the way Cyberpunk, RDR2, and Horizon did. Those games still look better than these so-called next-gen games, but it doesn't matter: they're being designed by devs who no longer wish to target last-gen specs. And sadly, despite the 3070 being almost 35-50% faster than the PS5, its VRAM is going to hold it back going forward.
 

64bitmodels

Reverse groomer.
Respawn's next star wars game is a next gen exclusive. FF16 is a next gen exclusive. Both look last gen but i can promise you,
They are next-gen in everything but where it matters. Most third-party devs simply use next-gen as a way to get away with terrible optimization. Why push the medium forward when you can use the extra headroom to slack off a bit more?
 

The Cockatrice

Gold Member
Is that RAM or VRAM?
Either way, it's above 8GB, which falls under the 12GB minimum for 1440p.

It's VRAM per process, not the dumb stat everyone quotes, allocated VRAM. Either way, that's fully path-traced, maxed out, a technology that won't see proper daylight for another 6 years, and it uses 9GB. You can play games fine on 8GB with a few settings lowered, as long as you don't play bad ports maxed out or let GAF scare you. Anyway, I'm out of this topic.
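The per-process vs. allocated distinction can actually be queried. Below is a hedged sketch using NVML's Python bindings (the `pynvml` package); it only runs on a machine with an NVIDIA GPU and driver, and NVML's process lists may be empty or unavailable for graphics (game) workloads on some platforms:

```python
# Sketch: whole-GPU "allocated" VRAM vs. per-process usage via NVML.
# Requires an NVIDIA GPU, its driver, and the pynvml package.

def mib(n_bytes):
    """Convert bytes to whole MiB."""
    return n_bytes // (1024 ** 2)

def vram_report(gpu_index=0):
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # whole-GPU figures
    print(f"GPU used/total: {mib(mem.used)} / {mib(mem.total)} MiB")
    # Per-process figures, closer to what a single game actually needs;
    # graphics processes may not be reported on every platform.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        print(f"pid {proc.pid}: {mib(proc.usedGpuMemory)} MiB")
    pynvml.nvmlShutdown()
```

The whole-GPU `memory.used` figure is what most overlays report, which is why "allocated" numbers run well above what any single game actually needs.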
 
Why the hell are people still talking about last-gen games? Of course 8GB is enough for CP2077; its minimum required hardware is a toaster.

And I have no dog in this race; no real interest in triple-A games at all.
 