Not good enough? Based on...? AMD sponsored titles that strangely request more vram? Wonder why that is.
That VRAM usage readout in RE2 is more of a vague estimate than an actual hard limit. DF tested it and noted the same thing. I've also tested this myself by running settings that claim 12GB on a 1060 6GB card, without hitching or stuttering.
It will actually last you a lot longer than that. 12GB is barely bigger than 10GB.
The 3070 will last me until next generation.
Need to update that to $3000.
There probably won't be a game this gen that uses more than 12GB of VRAM. Hell, I still haven't seen a game that maxes the 6GB on my 980 Ti, albeit I mostly play at 1080p, but still.

12GB of memory is not what a future-proofed high-end card should have. I wanted to buy a new PC this year, but I'm not paying $1500 at scalper prices just for the GPU, for 12GB of memory.
Allocate.
Your card will never use that 12gb of vram at an acceptable frame rate, so it's a moot point. Unless of course you're doing video editing or something.
Admittedly, when it comes to this stuff I am quite ignorant.

That said, the main reason I went with the 3090 is the amount of VRAM, as I wanted to eliminate that as a possible bottleneck; I too thought 10GB was just too low. It may be overkill for me gaming at 4K 160fps (when possible), and I will never need the RAM the 3090 has, but I wanted it.

I thought the 3090 was going to be a little future-proofing for me, but I keep seeing that the 4000 series will be announced this year, and if the performance jump is there I am all in on the 4090.
Games will be targeting the RAM amounts on the consoles this gen, which means like 10GB tops.
If you're going to upgrade every gen then future-proofing is kind of irrelevant.
Uses, or allocates?

HZD in 8K uses around 21GB of VRAM on a 3090. On PC you'd aim for more than what the consoles can offer, naturally, with high-end cards.
I only really upgrade when the performance jump is worth it; the 2000 series was kind of a stinker, IMO.
I briefly had a 3070 in the house while playing Cyberpunk, and the frame rate would TANK if "allocated" exceeded 8GB on that card. The only way around it was to adjust settings to bring it under 8.

Allocated means fuck all though, unless I'm shown real VRAM utilization via Special K + MSI Afterburner. I think I can count on one hand the games that exceeded 10GB of dedicated VRAM usage, and I can personally only remember Ubisoft's Ghost Recon being the main culprit, with shit ports or excessive ultra settings. Between 1080p and 4K it's under a 1GB difference, ~700MB IIRC. But exceeding 12GB? No.

The worst offenders are the game settings menus trying to offer an estimate of memory usage; they're almost always off, which is mind-boggling coming from devs. RE8 comes to mind, asking for 12GB in its settings when dedicated memory averages 6.5GB of utilization. Between that and the memory leak at launch, a lot of people were heavy on confirmation bias that 8GB would not be enough, which was proven wrong further down the line. Same with Godfall, where the devs said 12GB would be required for max settings.. reality? 6.5GB… (trombone fail sound) Strange that so many AMD-sponsored games would inflate numbers.

Cyberpunk 2077 at max settings allocates 8.5GB while real dedicated usage is 6.4GB; in general you will be rasterization-limited way before using all the memory. Same with Microsoft Flight Simulator, which will take all the space it can allocate but needs 6-7GB in reality.

But but.. the future? you say. Well yes, the future: consoles with I/O management are the baseline for the upcoming generation, and DirectStorage plus Sampler Feedback are the equivalent on PC. When games support them, this will come down even further and save a ton of VRAM, because contrary to what's done now (overloading VRAM with things you might need), consoles and PC APIs are heading toward loading only what you actually need. Then there's DLSS, XeSS, UE5's TSR..

So no, I don't believe this argument.
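The allocated-vs-dedicated gap being argued about can be sketched as a toy model (my own illustration, not how any real driver or overlay works): an engine reserves a big pool up front "just in case", while only the assets actually touched count as real usage.

```python
# Toy model of the allocated-vs-used VRAM gap (illustrative only; real
# drivers and overlays such as MSI Afterburner measure this very differently).

class ToyVramPool:
    def __init__(self):
        self.allocated_mb = 0  # pool the engine has reserved up front
        self.resident_mb = 0   # assets actually uploaded and in use

    def reserve(self, mb: int) -> None:
        """Engine grabs a big chunk speculatively; shows up as 'allocated'."""
        self.allocated_mb += mb

    def use(self, mb: int) -> None:
        """Only touched assets count toward real, dedicated usage."""
        self.resident_mb = min(self.allocated_mb, self.resident_mb + mb)


pool = ToyVramPool()
pool.reserve(8500)  # a Cyberpunk-like 8.5GB allocation at max settings
pool.use(6400)      # but only ~6.4GB of it is actually in use
print(pool.allocated_mb - pool.resident_mb)  # 2100 MB reserved but never needed
```

The point of the sketch: a card with less VRAM than the "allocated" number can still run fine, because only the resident portion has to fit.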
You mean just for the GPU, right?
It's not moot. Textures primarily need VRAM, not processing power. The 3060 will be able to use higher-res textures before a 10GB 3080 runs out of VRAM.
By the time games are using over 10GB for textures, your 3060 won't be playing at acceptable framerates. The 3060 isn't a 4K card anyway.
*paper launches
That is just not true. There are games that already want more than 10GB for the highest textures.
Holy shit. Sadly that's how the market works, and it's the same for overpriced GPUs as for scalped consoles... They'll be there until people stop wanting to pay that much for them.

As an aside, I was at MicroCenter this past weekend buying a new NVMe for my rig. For giggles, I went to the GPU case to see what they had.
Behold, a lone 3090 for $2300.
I said to my daughter who was with me, "Holy crap. Look at that. That's friggin insane."
Some guy next to me overheard and replied, "Yeah, but it's worth it."
I looked at him like the idiot he deserved to be looked at as.
I says, "I bought a 3080 at release for $800. You're telling me this thing is worth $1500 over that!?"
He nods enthusiastically and says, "Absolutely."
This is why we can't have nice things.
What a bot, haha
You don't seem to be factoring in the current availability of an RTX 3090 at MSRP prices.
What games?
I can play heavy 4K games on my 3060 with reduced settings, but textures are always at the highest setting. They don't need to be reduced, thanks to the large VRAM; it's great. An 8GB 3070 will choke at native 4K in Doom Eternal with max textures. The Witcher 3, for example, basically runs at near max settings at 4K60 on the 3060.
It's worth whatever someone is willing to pay for it.
Just because something is jacked up in price from scarcity doesn't mean it's inherently worth that price. It may be the market value, but market value doesn't equal intrinsic value.
Does the item hold that same value when supply returns to normal? No.
Allocating ≠ requiring VRAM.
On my 6800 I would run Insurgency Sandstorm with texture streaming off and the HD texture pack, for better performance. That would allocate between 13 and 15GB of my VRAM. My 3080, with texture streaming on, still blows it away on performance.
This will become even less of an issue once DirectStorage is rolled out.
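To put rough numbers on why streaming only the needed detail saves so much memory, here is a back-of-the-envelope calculation (my own arithmetic, assuming uncompressed RGBA8 textures; real games use block compression, so absolute sizes are smaller, but the ratios hold):

```python
def mip_chain_mb(top_size: int, bytes_per_pixel: int = 4) -> float:
    """Total memory for a square texture plus its full mip chain, in MB."""
    total_bytes = 0
    size = top_size
    while size >= 1:
        total_bytes += size * size * bytes_per_pixel
        size //= 2
    return total_bytes / (1024 * 1024)

full = mip_chain_mb(4096)      # every mip resident, 4096x4096 on top
streamed = mip_chain_mb(2048)  # top mip streamed out when not needed
print(f"{full:.1f} MB vs {streamed:.1f} MB")  # 85.3 MB vs 21.3 MB
```

Dropping just the top mip level cuts the footprint by roughly 75%, which is why DirectStorage/sampler-feedback-style streaming can shrink VRAM requirements so dramatically.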
Doesn't matter; it's what it's valued at right now, and someone will pay that for it if they want it badly enough and have the money.
In capitalism, things are worth what people are willing to pay for them.
This is the reason why prices won't go back to normal once the scarcity ends. No producer or retailer wants them to get lower.
Far Cry 6 is an AMD-sponsored title with very strange performance. Wonder why that is.
What games what? That I run at 4k? Damn near all of them.
Look, I'm glad you like your 3080. It will still have VRAM issues before the 3060 does. Far Cry 6 already had an issue with the 10GB 3080, at least at launch. No matter how anyone wants to spin it, it's beyond stupid that a 1080 Ti/3060 have more VRAM.
Doom Eternal is more sensitive to bandwidth; otherwise the 1080 Ti would outperform the 2070, if VRAM were the issue.

DOOM Eternal Benchmark Test & Performance Analysis - 26 Graphics Cards Compared (www.techpowerup.com)
No, you got it wrong.

Well, we can go around all day on this, and I won't, so again I'll part by saying intrinsic value ≠ market value. If you think a 3090 is objectively worth $2,300, more power to you.
Enjoy your 3080, bub. I don't like purchase-justifying arguments. Nobody is saying the 3060 is better than the 3080! But you will have a VRAM bottleneck, no getting around it. Wait until you see what VRAM the 4xxx series has. The cross-gen period is also helping that 10GB VRAM buffer.

The fact is, you can have higher textures on a 3060, and time will reflect that more and more.
PC gaming is about being better than consoles; otherwise it wouldn't make much sense.
What does that even mean? Better in what way? Unless you're one of those strange people who cannot see beyond AAA titles, then yeah, better framerate/resolution/visuals is the only benefit you'll get on PC. But there's WAY more than that, starting with the genres/IPs that aren't available on, or suited for, consoles. Take the LoL playerbase alone, for example: in what way should they feel they're "better" than console players? They don't give a fuck; they just enjoy their beloved game, which isn't even available on consoles.

As for the GPU from the topic, let's wait for the OG 3080 to run out of VRAM first, because most likely RTX I/O will show up before that card runs out of VRAM, let alone the new model with 12GB.
This is just sad. Ultra settings are usually very over-demanding for the visual return. I'd rather have the extra cash than spend 800 bucks on a single card, a card with 10GB of VRAM, less than much older high-end cards have. But again, textures are basically free if you have the VRAM.

Is 8GB of VRAM enough for the 3070? (forums.overclockers.co.uk)
This isn't a purchase-justification topic; I'm simply stating that the VRAM differences are essentially irrelevant.

By your own logic you should have sprung for the 6800. The 3060 only gets 38 fps in Far Cry 6 at 4K ultra; the 8GB 2070 Super outperforms it too.

The 3060 has 12GB because it uses a 192-bit bus: six 32-bit chips, so with the GDDR6 densities available it could have either 6GB or 12GB. The 3060 Ti uses a wider 256-bit bus with eight chips, so its options were 8GB or 16GB, and it got 8GB. Obviously 6GB would have been too low for new games, which is why the 3060 ended up with 12.
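The bus-width arithmetic is easy to check. A quick sketch, under the assumption of 32-bit GDDR6 chips available in 1GB and 2GB densities (the densities shipping at the time), with one chip per 32 bits of bus:

```python
def vram_options_gb(bus_width_bits: int, chip_densities_gb=(1, 2)):
    """Possible capacities for a card: one 32-bit chip per 32 bits of bus,
    every chip the same density."""
    chips = bus_width_bits // 32
    return [chips * d for d in chip_densities_gb]

print(vram_options_gb(192))  # [6, 12] -> the 3060's only two choices
print(vram_options_gb(256))  # [8, 16] -> the 3060 Ti / 3070 class
```

With those densities, a 192-bit card jumps straight from 6GB to 12GB with nothing in between, which is why the 3060's large buffer is a side effect of its narrow bus rather than deliberate headroom.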
AMD's marketing is working well, I see.