
New 12 GB 3080 launched.

Bo_Hazem

Banned
Not good enough? Based on...? AMD-sponsored titles that strangely request more VRAM? Wonder why that is.

 

Elios83

Member
12GB of memory is not what a future-proofed high-end card should have.
I wanted to buy a new PC this year, but I'm not paying $1500 at scalper prices just for a GPU with 12GB of memory.
 
Last edited:

Xyphie

Member
[Average 4K performance chart]


The 3080 Ti and 3090 are essentially the same card (minor differences in SMs, bandwidth, and clocks) except for 12GB vs 24GB, so other than perhaps some extreme outlier, it's safe to say that 12GB vs 24GB makes no difference.
 

Stuart360

Member
12GB of memory is not what a future-proofed high-end card should have.
I wanted to buy a new PC this year, but I'm not paying $1500 at scalper prices just for a GPU with 12GB of memory.
There probably won't be a game this gen that uses more than 12GB of VRAM. Hell, I still haven't seen a game that has maxed the 6GB on my 980 Ti, albeit I mostly play at 1080p, but still.
Games will be targeting the RAM amounts of the consoles this gen, which means like 10GB tops.
 

HeisenbergFX4

Gold Member
Not good enough? Based on...? AMD-sponsored titles that strangely request more VRAM? Wonder why that is.

Allocate.

Your card will never use that 12GB of VRAM at an acceptable frame rate, so it's a moot point. Unless of course you're doing video editing or something.
Admittedly, when it comes to this stuff I am quite ignorant.

That said, the main reason I went with the 3090 is the amount of RAM; I wanted to eliminate that as a possible bottleneck, as I too thought 10GB was just too low.

It may be overkill for me gaming at 4K 160fps (when possible), and I will never need the RAM the 3090 has, but I wanted it :)

I thought the 3090 was going to be a little future-proofing for me, but I keep seeing that the 4000 series will be announced this year, and if the performance jump is there I am all in on the 4090.
 

Bo_Hazem

Banned
Admittedly, when it comes to this stuff I am quite ignorant.

That said, the main reason I went with the 3090 is the amount of RAM; I wanted to eliminate that as a possible bottleneck, as I too thought 10GB was just too low.

It may be overkill for me gaming at 4K 160fps (when possible), and I will never need the RAM the 3090 has, but I wanted it :)

I thought the 3090 was going to be a little future-proofing for me, but I keep seeing that the 4000 series will be announced this year, and if the performance jump is there I am all in on the 4090.

Once SSDs are normalized, less VRAM won't be an issue going forward. You still have normal RAM as well, unlike consoles. But games are designed around the Xbox Series S, and at best the XSX and PS5, for multiplats.
 

Bo_Hazem

Banned
There probably won't be a game this gen that uses more than 12GB of VRAM. Hell, I still haven't seen a game that has maxed the 6GB on my 980 Ti, albeit I mostly play at 1080p, but still.
Games will be targeting the RAM amounts of the consoles this gen, which means like 10GB tops.

HZD at 8K uses around 21GB of VRAM on a 3090. On PC you'd aim for more than what the consoles can offer with high-end cards, naturally.
 

Dream-Knife

Banned
Admittedly, when it comes to this stuff I am quite ignorant.

That said, the main reason I went with the 3090 is the amount of RAM; I wanted to eliminate that as a possible bottleneck, as I too thought 10GB was just too low.

It may be overkill for me gaming at 4K 160fps (when possible), and I will never need the RAM the 3090 has, but I wanted it :)

I thought the 3090 was going to be a little future-proofing for me, but I keep seeing that the 4000 series will be announced this year, and if the performance jump is there I am all in on the 4090.
If you're going to upgrade every gen, then future-proofing is kind of irrelevant.
HZD at 8K uses around 21GB of VRAM on a 3090. On PC you'd aim for more than what the consoles can offer with high-end cards, naturally.
Uses, or allocates?

But we're still a ways off from 8K in gaming, at decent frame rates at least.
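If you want to actually answer "uses, or allocates?" yourself, here's a minimal sketch using NVIDIA's NVML bindings for Python (pynvml, installed via nvidia-ml-py; assumes an NVIDIA card). The device-level number is the "usage" people screenshot; the per-process number is closer to what a game actually holds:

Code:
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-level view: allocations, caches, and driver reservations all count.
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"device 'used': {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

# Per-process view: what the driver actually granted each process,
# which is much closer to real usage than the device total.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    print(f"pid {p.pid}: {(p.usedGpuMemory or 0) / 2**30:.2f} GiB")

pynvml.nvmlShutdown()

Even the per-process number overstates things a bit, since engines keep caches resident as long as nothing else wants the space.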
 
Last edited:

HeisenbergFX4

Gold Member
If you're going to upgrade every gen, then future-proofing is kind of irrelevant.

Uses, or allocates?
I only really upgrade when the performance jump is worth it; the 2000 series was kind of a stinker IMO.

If the leaks on the 4000 series are right about said performance jump, I am jumping in on my main gaming PC.
 

AGRacing

Member
Allocated means fuck all though, unless I'm presented with Special K + MSI Afterburner VRAM utilization (real usage). I think I can count on one hand the games that exceeded 10GB of dedicated VRAM usage, and I can personally only remember Ubisoft's Ghost Recon being the main culprit, with shit ports or excessive ultra settings. Between 1080p and 4K it's under a 1GB difference, ~700MB IIRC. But exceeding 12GB? No.

The worst offenders are the game settings menus that try to estimate memory usage; they're almost always off, which is mind-boggling coming from devs. RE8 comes to mind, asking for 12GB in its settings when dedicated memory averaged 6.5GB of actual utilization. Between that and the memory leak at launch, a lot of people were heavy on confirmation bias that 8GB would not be enough, which was proven wrong further down the line. Same with Godfall, where the devs said 12GB would be required for max settings. The reality? 6.5GB... (trombone fail sound) Strange that so many AMD-sponsored games would inflate numbers.


Cyberpunk 2077 at max settings allocates 8.5GB while real dedicated usage is 6.4GB; in general you will be rasterization-limited way before you use all the memory. Same with Microsoft Flight Simulator, which will take all the space you give it but needs 6-7GB in reality.

But but... the future, you say? Well yes, the future: consoles with proper I/O management are the baseline for the upcoming generation, and DirectStorage and Sampler Feedback are the PC equivalents once games support them. That will cut memory pressure down even further and save a ton of VRAM, because contrary to what's done now (overloading VRAM with things you *might* need), consoles and PC APIs are heading toward loading only what you need. Then add DLSS, XeSS, UE5's TSR...

So no, I don’t believe this argument.
I briefly had a 3070 in the house while playing Cyberpunk ... and the frame rate would TANK if "allocated" exceeded 8GB on that card. The only way around it was to adjust settings to bring it under 8.

If that's the result of exceeding allocated memory, I don't care whether it's allocated or used.
 
Not good enough? Based on...? AMD-sponsored titles that strangely request more VRAM? Wonder why that is.

Allocate.

Your card will never use that 12GB of VRAM at an acceptable frame rate, so it's a moot point. Unless of course you're doing video editing or something.
It's not moot. Textures don't primarily require processing power; they require VRAM. It will be able to use higher-res textures before a 10GB 3080 runs out of VRAM.
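Rough numbers for scale, assuming BC7 block compression (1 byte per texel) and full mip chains, and ignoring everything else that lives in VRAM (render targets, geometry, buffers):

Code:
# How many 4K (4096x4096) BC7 textures fit in a given VRAM budget?
def texture_mib(size_px, bytes_per_texel=1.0):
    base = size_px * size_px * bytes_per_texel
    return base * 4 / 3 / 2**20  # full mip chain adds roughly one third

per_texture = texture_mib(4096)  # ~21.3 MiB each
for budget_gib in (8, 10, 12):
    print(f"{budget_gib} GiB holds ~{budget_gib * 1024 / per_texture:.0f} textures")

And sampling a higher-res texture costs the GPU almost nothing extra, which is why texture quality is nearly free as long as it fits in memory.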
 

Dream-Knife

Banned
It's not moot. Textures don't primarily require processing power; they require VRAM. It will be able to use higher-res textures before a 10GB 3080 runs out of VRAM.
By the time games are using over 10GB for textures, your 3060 won't be playing at acceptable framerates. The 3060 isn't a 4K card anyway.
 
By the time games are using over 10GB for textures, your 3060 won't be playing at acceptable framerates. The 3060 isn't a 4K card anyway.
That is just not true. There are games that already want more than 10GB for the highest textures.

I can play heavy 4K games on my 3060 with reduced settings; textures, however, are always at the highest setting. They don't need to be reduced, thanks to the high VRAM amount. It's great. An 8GB 3070 will choke in Doom Eternal at native 4K with max textures.

The Witcher 3 basically runs at near max settings at 4K60 on the 3060, for example.
 
Last edited:

Catphish

Member
As an aside, I was at MicroCenter this past weekend buying a new NVMe for my rig. For giggles, I went to the GPU case to see what they had.

Behold, a lone 3090 for $2300.

I said to my daughter who was with me, "Holy crap. Look at that. That's friggin insane."

Some guy next to me overheard and replied, "Yeah, but it's worth it."

I looked at him like the idiot he deserved to be looked at as.

I says, "I bought a 3080 at release for $800. You're telling me this thing is worth $1500 over that!?"

He nods enthusiastically and says, "Absolutely."

This is why we can't have nice things.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
As an aside, I was at MicroCenter this past weekend buying a new NVMe for my rig. For giggles, I went to the GPU case to see what they had.

Behold, a lone 3090 for $2300.

I said to my daughter who was with me, "Holy crap. Look at that. That's friggin insane."

Some guy next to me overheard and replied, "Yeah, but it's worth it."

I looked at him like the idiot he deserved to be looked at as.

I says, "I bought a 3080 at release for $800. You're telling me this thing is worth $1500 over that!?"

He nods enthusiastically and says, "Absolutely."

This is why we can't have nice things.
Holy shit. Sadly that's how the market works, and it's the same for overpriced GPUs as for scalped consoles... They'll be there until people stop wanting to pay that much for them.
 

Patrick S.

Banned
As an aside, I was at MicroCenter this past weekend... Behold, a lone 3090 for $2300. ... This is why we can't have nice things.
What a bot, haha
 

Ulysses 31

Member
As an aside, I was at MicroCenter this past weekend... Behold, a lone 3090 for $2300. ... This is why we can't have nice things.
You don't seem to be factoring in the current availability of an RTX 3090 at MSRP.
 

Dream-Knife

Banned
That is just not true. There are games that already want more than 10GB for the highest textures.

I can play heavy 4K games on my 3060 with reduced settings; textures, however, are always at the highest setting. They don't need to be reduced, thanks to the high VRAM amount. It's great. An 8GB 3070 will choke in Doom Eternal at native 4K with max textures.

The Witcher 3 basically runs at near max settings at 4K60 on the 3060, for example.
What games?

On my 6800 I would run Insurgency Sandstorm with texture streaming off and the HD texture pack for better performance. That would allocate between 13 and 15GB of my VRAM. My 3080 with texture streaming still blows it away on performance.

This will become even less of an issue once DirectStorage is rolled out.
 

Catphish

Member
You don't seem to be factoring in the current availability of an RTX 3090 at MSRP.


Just because something is jacked up in price from scarcity doesn't mean it's inherently worth that price. It may be the market value, but market value doesn't equal intrinsic value.
 
What games?

On my 6800 I would run Insurgency Sandstorm with texture streaming off and the HD texture pack for better performance. That would allocate between 13 and 15GB of my VRAM. My 3080 with texture streaming still blows it away on performance.

This will become even less of an issue once DirectStorage is rolled out.
Allocating =/= requiring VRAM.

What games what? That I run at 4K? Damn near all of them.

Look, I'm glad you like your 3080. It will still have VRAM issues before the 3060 does. Far Cry 6 already had an issue with the 10GB 3080, at least at launch. No matter how anyone wants to spin it, it's beyond stupid that a 1080 Ti/3060 has more VRAM.

This isn't a new revelation. My 4GB 1050 Ti always had the highest textures at 1080p for at least as long as I used it, whereas 3GB or 2GB cards sometimes needed to compromise on textures and other settings.
 
Last edited:

Doczu

Member
As an aside, I was at MicroCenter this past weekend... Behold, a lone 3090 for $2300. ... This is why we can't have nice things.
This is the reason the prices won't go back to normal once the scarcity ends. No producer or retailer wants them to get lower.

I hope the inflation and market crash will teach people to stop overpaying for stuff.
 

Dream-Knife

Banned
Allocating =/= requiring VRAM.

What games what? That I run at 4K? Damn near all of them.

Look, I'm glad you like your 3080. It will still have VRAM issues before the 3060 does. Far Cry 6 already had an issue with the 10GB 3080, at least at launch. No matter how anyone wants to spin it, it's beyond stupid that a 1080 Ti/3060 has more VRAM.
Far Cry 6 is an AMD-sponsored title with very strange performance. Wonder why that is.

Doom Eternal is more sensitive to bandwidth; otherwise the 1080 Ti would outperform the 2070 if VRAM were the issue.
 
Last edited:

Catphish

Member
Well, we can go around all day on this, and I won't, so again I'll part by saying intrinsic value =/= market value. If you think a 3090 is objectively worth $2,300, more power to you.
 
Far Cry 6 is an AMD-sponsored title with very strange performance. Wonder why that is.

Doom Eternal is more sensitive to bandwidth; otherwise the 1080 Ti would outperform the 2070 if VRAM were the issue.

Enjoy your 3080, bub. I don't like purchase-justifying arguments. Nobody is saying the 3060 is better than the 3080! But you will have a VRAM bottleneck, no getting around it. Wait until you see what VRAM the 4xxx series has. The cross-gen period is also helping that 10GB VRAM buffer.

The fact is, you can have higher textures on a 3060, and time will reflect that more and more.
 
Last edited:

Doczu

Member
Well, we can go around all day on this, and I won't, so again I'll part by saying intrinsic value =/= market value. If you think a 3090 is objectively worth $2,300, more power to you.
No, you got it wrong.

No power to anyone thinking those prices are OK. None. Zero. Zip. Nothing.
 

Dream-Knife

Banned

Enjoy your 3080, bub. I don't like purchase-justifying arguments. Nobody is saying the 3060 is better than the 3080! But you will have a VRAM bottleneck, no getting around it. Wait until you see what VRAM the 4xxx series has. The cross-gen period is also helping that 10GB VRAM buffer.

The fact is, you can have higher textures on a 3060, and time will reflect that more and more.

This isn't a purchase-justification topic; I'm simply stating that the VRAM differences are essentially irrelevant.

By your own logic you should have sprung for the 6800. The 3060 is only getting 38 fps in Far Cry 6 at 4K ultra; the 8GB 2070 Super outperforms it too.
The 3060 has 12GB because it uses a 192-bit bus, meaning it could have either 6GB or 12GB. The 3060 Ti has 8GB because it uses a faster 256-bit bus, so with the same memory chips its options were 8GB or 16GB, and 16GB wasn't happening at that price point.
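The arithmetic behind that, for the curious. Assuming GDDR6/GDDR6X chips with 32-bit interfaces at 1GB or 2GB per chip, one chip per controller (the 3090 is the exception: it doubles up with two 1GB chips per controller in clamshell mode to reach 24GB):

Code:
# VRAM capacity options implied by bus width (one 32-bit GDDR6 chip per controller).
def capacity_options(bus_bits, densities_gb=(1, 2)):
    chips = bus_bits // 32
    return [chips * d for d in densities_gb]

print("192-bit:", capacity_options(192))  # [6, 12]  -> RTX 3060
print("256-bit:", capacity_options(256))  # [8, 16]  -> RTX 3060 Ti
print("320-bit:", capacity_options(320))  # [10, 20] -> RTX 3080 10GB
print("384-bit:", capacity_options(384))  # [12, 24] -> 3080 12GB / 3080 Ti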

AMD's marketing is working well, I see.
 
Last edited:

Elios83

Member
There probably won't be a game this gen that uses more than 12GB of VRAM. Hell, I still haven't seen a game that has maxed the 6GB on my 980 Ti, albeit I mostly play at 1080p, but still.
Games will be targeting the RAM amounts of the consoles this gen, which means like 10GB tops.

PC gaming is about being better than consoles; otherwise it wouldn't make much sense.
Over time we'll see games with extra/ultra quality settings for textures and other assets that will need more memory than what is used on consoles.
And indeed, the 3090 already has 24GB.
This is why I absolutely can't justify spending more than $1000 for a GPU with just 12GB of RAM. 16GB should be the minimum for a future-proofed high-end card.
Unfortunately, the global shortage situation, plus nVidia being an arrogant market leader with poor competition, has led to the worst-case scenario: products with disappointing upgrades, at absurd prices, and impossible to find.
 

ZywyPL

Banned
PC gaming is about being better than consoles; otherwise it wouldn't make much sense.

What does that even mean? Better in what way? Unless you're one of those strange people who cannot see beyond AAA titles, then yeah, better framerate/resolution/visuals is the only benefit you'll get on PC. But there's WAY more than that, like the genres/IPs that aren't available or suited for consoles. Take the LoL playerbase alone, for example: in what way do you think they should feel they're "better" than console players? They don't give a fuck, they just enjoy their beloved game, which isn't even available on consoles.


As for the GPU from the topic, let's wait for the OG 3080 to run out of VRAM first, because really, RTX I/O will most likely show up before that card runs out of VRAM, let alone the new model with 12GB.
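A toy illustration of the streaming point; this is not the actual RTX IO/DirectStorage API, and the asset sizes, visible fraction, and SSD speed below are made-up, hypothetical numbers:

Code:
# Resident VRAM: preload everything a level MIGHT use vs. stream on demand.
level_assets_gb = 18.0  # hypothetical: all textures a level references
visible_share = 0.35    # hypothetical: fraction needed for the current view
ssd_gb_per_s = 5.0      # hypothetical: raw NVMe read speed

print(f"preload:   {level_assets_gb:.1f} GB resident")
print(f"streaming: {level_assets_gb * visible_share:.1f} GB resident")
# Worst-case refill after a hard camera cut, ignoring decompression:
print(f"refill:    ~{level_assets_gb * visible_share / ssd_gb_per_s:.1f} s")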
 

Elios83

Member
What does that even mean? Better in what way? Unless you're one of those strange people who cannot see beyond AAA titles, then yeah, better framerate/resolution/visuals is the only benefit you'll get on PC. But there's WAY more than that, like the genres/IPs that aren't available or suited for consoles. Take the LoL playerbase alone, for example: in what way do you think they should feel they're "better" than console players? They don't give a fuck, they just enjoy their beloved game, which isn't even available on consoles.


As for the GPU from the topic, let's wait for the OG 3080 to run out of VRAM first, because really, RTX I/O will most likely show up before that card runs out of VRAM, let alone the new model with 12GB.

Yeah, I was talking just graphics-wise, in the sense of "I'm buying a $1500 GPU to get something more, graphics-wise, than a $400-500 console", not as a general platform of course.
There are obviously other advantages like mods, more freedom, more customization, etc.
Let's see if nVidia announces the 4000 series later this year as the rumors claim, their prices, and the RAM.
I want to get a new PC this year because my 4-year-old notebook is well past its prime, but this general situation is really unfortunate.
 
Last edited:

Patrick S.

Banned
These discussions about needing a bazillion gigabytes of video RAM to load ultra-super-duper textures that let you distinguish a single grain of salt on a picnic table 600 meters away from the camera are really ridiculous.

People have really forgotten what gaming is about, and what is and isn't important in a videogame.
 
Last edited:

This isn't a purchase-justification topic; I'm simply stating that the VRAM differences are essentially irrelevant.

By your own logic you should have sprung for the 6800. The 3060 is only getting 38 fps in Far Cry 6 at 4K ultra; the 8GB 2070 Super outperforms it too.
The 3060 has 12GB because it uses a 192-bit bus, meaning it could have either 6GB or 12GB. The 3060 Ti has 8GB because it uses a faster 256-bit bus, so with the same memory chips its options were 8GB or 16GB, and 16GB wasn't happening at that price point.

AMD's marketing is working well, I see.
This is just sad. Ultra settings are usually very demanding for the visual return. I'd rather have the extra cash than spend 800 bucks on a single card, a card with 10GB of VRAM, less than much older high-end cards. But again, textures are basically free if you have the VRAM.

If you know how to optimize settings, you could get 1080p60 in The Witcher 3 with ultra textures back in the day on a 1050 Ti, which was far weaker than a 970. The ultra textures kept the game looking high-end.

You're incapable of admitting you've personally never seen a game where you had to lower textures because you didn't have enough VRAM; lack of experience. Or you're being intellectually dishonest.

So much fanboy-type arguing and lack of experience in your posts; I'm ignoring you now, haha.
 
Last edited: