
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

The bad AF has nothing to do with CB rendering though.

Personally I can't see a huge difference when sat ~7ft from a 55" TV. I also can't see a huge difference between 1440p and 4K at that distance.

I'm all about DLSS/CB/whatever because, unless you're sat up close, the difference isn't big enough considering the performance trade-off.
The post-processing has nothing to do with CB rendering either.

I also didn't say the difference was massive, I said I could easily tell it apart. 1440p once again looks noticeably worse than 4K even at normal viewing distances, but the trade-off is worth it. I'm willing to go to 1080p or even 720p if it gives me a stable 60fps. Resolution isn't all that high on my list of priorities.
 

DeaDPo0L84

Member
So is it safe to assume, out of all the 3000 series cards, both those soon to be revealed and potential future variants, that the 3090 will be the best of the bunch?
 

Ellery

Member
So is it safe to assume, out of all the 3000 series cards, both those soon to be revealed and potential future variants, that the 3090 will be the best of the bunch?

There is always room for more and you will be at Nvidia's mercy.

But I guess looking at the size of that lad, the power draw, and the flaunting of 24GB of GDDR6X for a four-digit price, it is relatively safe to assume that it is the top dog for at least this year and probably well into next year as well.
 

Myths

Member
The only justification I can come up with is that it does a lot of things other than play games, but yeah, if you're not doing any major video/photo editing there's no reason to get these crazily priced GPUs. Here's hoping AMD comes out with a nice price-performance beast to force NVIDIA to rethink its pricing, but it probably won't happen.
That’s exactly what I’m into other than just video games.

It makes very little sense to keep engaging in the platform comparison when one platform is largely more capable and has other factors going for it. The target group you'd want to engage with such an argument over pricing (that's really what all this ever amounts to) is those who buy the most expensive components/builds strictly for gaming. Not even the most expensive, even mid-range, just to ask why spend the money over a console.

I'm genuinely interested in seeing a poll thread on this: the number of consumers here purchasing a graphics card only for gaming and not for other uses.
 

supernova8

Banned
That’s exactly what I’m into other than just video games.

It makes very little sense to keep engaging in the platform comparison when one platform is largely more capable and has other factors going for it. The target group you'd want to engage with such an argument over pricing (that's really what all this ever amounts to) is those who buy the most expensive components/builds strictly for gaming. Not even the most expensive, even mid-range, just to ask why spend the money over a console.

I'm genuinely interested in seeing a poll thread on this: the number of consumers here purchasing a graphics card only for gaming and not for other uses.

I suppose if people can afford to splurge over and over then that's cool, but in the 'real world' others have to be a little smarter. I know that once I buy my motherboard I'm pretty much a prisoner to that year's and maybe the following year's worth of CPUs (AMD has been kind), so there's relatively little point in skimping on the CPU in my mind. RAM is middle ground depending on the point at which you jump in. If you get a DDR3 motherboard at the end of DDR3 then you're obviously a tool.

GPUs though, it takes a while to move between PCIe standards, so I can be pretty confident my base rig will take any GPU as long as the case is big enough and the PSU is capable. Since I've already spent the bulk on my other parts, I just end up going mid-range for the GPU. I had a GTX 1060 for quite a while. Served me well!

Aside from simply having so much disposable income that it's a no-brainer, I cannot understand why someone would buy a top-of-the-range, crazy expensive GPU.
 

Hostile_18

Banned
Hell will freeze over before I buy a card in 2020 with 10GB of VRAM. At the so-called 'rumoured prices' I'm seeing, it's doubly absurd.

Pricing issue aside, if no more than 10GB gets used in the next, say, 3 years, why do you want more that you won't use? Obviously if you're intending to keep the card for a long, long time then fair enough, but I don't get that impression from most users here.
 
So is it safe to assume, out of all the 3000 series cards, both those soon to be revealed and potential future variants, that the 3090 will be the best of the bunch?

Asking myself the same question. I might be tempted to jump on a 3090 (btw, purely for gaming), but I'd refrain from doing so if there was a 3090 Ti or something right around the corner (< 12 months from now).
 

FireFly

Member
So is it safe to assume, out of all the 3000 series cards, both those soon to be revealed and potential future variants, that the 3090 will be the best of the bunch?
We should hopefully find out at launch whether the 3090 is a cut-down chip at all. Most likely, if it is, the full version would appear as a Titan RTX.

But if these cards are really on 8nm, then I imagine Nvidia will have a 7nm refresh coming.
 
Blatant is a pretty strong word. The only reason the vast majority of people who claim to be able to tell the difference can do so is because they were shown the difference in a side-by-side YouTube video.
Or because they have a pair of functioning eyes.

The difference is obvious. Checkerboarding looks good enough, but let's not lie and pretend it looks the same; it doesn't.
 

nemiroff

Gold Member
I didn't expect Nvidia to make things difficult for me again, but the 10GB vs 24GB thing has thrown me off track completely. Now the only thing missing is the 3090 priced like an RTX Titan, and my plans will be completely cancelled and I'll be forced to wait for the Super version. Fucking hell, I need the release event badly.
 

Ellery

Member
I didn't expect Nvidia to make things difficult for me again, but the 10GB vs 24GB thing has thrown me off track completely. Now the only thing missing is the 3090 priced like an RTX Titan, and my plans will be completely cancelled and I'll be forced to wait for the Super version. Fucking hell, I need the release event badly.

How much would you be willing to pay for the RTX 3090 (60% performance increase over the 2080 Ti, 24GB GDDR6X, 350W TDP)?
 

sendit

Member
How much would you be willing to pay for the RTX 3090 (60% performance increase over the 2080 Ti, 24GB GDDR6X, 350W TDP)?

Likely, the RTX 3090 will last at least 2-3 years until something else matches it. A worthwhile investment in my book, so price really isn't a factor for me.
 

Ellery

Member
Likely, the RTX 3090 will last at least 2-3 years until something else matches it. A worthwhile investment in my book, so price really isn't a factor for me.

I don't know the financial situation you are in, but if price isn't really a factor then you are probably well off. So my guess would be around $1500-1600, and you probably have an RTX 2080 Ti you can sell to soften the cost, I guess?
 

Kenpachii

Member
I have a feeling a 3080 Ti will release next year.

That's what the 3090 is. Nvidia is just swapping names. They always do this.

They probably don't want to dig into the Ti or Super naming yet because they're waiting to see what AMD offers, so they can do a Super again but call it a Ti. I could even see those releasing 6 months later.
 

Mister Wolf

Member
That's what the 3090 is. Nvidia is just swapping names. They always do this.

They probably don't want to dig into the Ti or Super naming yet because they're waiting to see what AMD offers, so they can do a Super again but call it a Ti. I could even see those releasing 6 months later.

I'm aware that's what people besides Nvidia are saying, but I'm not ruling out Ti versions of these cards just yet.
 

sendit

Member
I don't know the financial situation you are in, but if price isn't really a factor then you are probably well off. So my guess would be around $1500-1600, and you probably have an RTX 2080 Ti you can sell to soften the cost, I guess?

Decently, as a software/web developer. However, I am extremely frugal with what I spend my money on and the value I get out of a purchase. Think about people who spend $5 on a cup of coffee every day; that equates to roughly $1,800 a year. $1,500-1,600 for something that should last 2-3 years really isn't that much.
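
For what it's worth, the arithmetic checks out (a quick sketch; the $5/day habit and 3-year lifespan are just the assumptions from the post):

```python
# Sanity-checking the coffee comparison.
coffee_per_year = 5 * 365   # $5/day, every day -> $1825/yr, roughly the ~$1800 cited
gpu_per_year = 1600 / 3     # a $1600 card kept for 3 years -> ~$533/yr
print(coffee_per_year, gpu_per_year)
```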
 

kiphalfton

Member
That's what the 3090 is. Nvidia is just swapping names. They always do this.

They probably don't want to dig into the Ti or Super naming yet because they're waiting to see what AMD offers, so they can do a Super again but call it a Ti. I could even see those releasing 6 months later.

I think the 3090 is the equivalent of a Titan. Why else would it supposedly have over twice as much RAM as the 3080, and launch early? The Titan generally preceded the x80 Ti; the x80 Ti was comparable but at a lower price and with less RAM. Turing is the only time the x80 Ti was released alongside the x70 and x80 cards.

My prediction is the 3080 Ti will have 16-18GB of RAM, cost about $1,000, and be released to coincide with Big Navi.
 

CuNi

Member
So... would a 750W (80+ Gold) suffice for a 3090?

Technically you just need to care about wattage. The rating (Bronze, Silver, Gold, etc.) just determines how much it will draw from your outlet.
A 750W PSU will deliver 750W to the PC; it'll draw around 833W from your outlet if it's running at 90% (Gold rating) efficiency.

Edit: To answer your question: it's hard to say. 750W is a good start, I'd say, but in the end it comes down to your complete system and, once specs are released, the card's actual power draw.
A 750W PSU might suffice for a 3090, but if you add an overclocked CPU, drives, fans, RGB and other peripherals, in theory it could get close to its limits. Even so, I'd say a 750W PSU is on the safer side of things.
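
To make that concrete, here's a minimal sketch of the efficiency math (assuming a flat 90% efficiency for a Gold-rated unit; real efficiency varies with load, and the 250W rest-of-system figure below is just an illustrative assumption):

```python
# Minimal sketch of PSU efficiency math; 90% is an assumed Gold-rating figure.
def wall_draw_watts(delivered_watts: float, efficiency: float = 0.90) -> float:
    """Watts pulled from the outlet to deliver `delivered_watts` to the PC."""
    return delivered_watts / efficiency

print(wall_draw_watts(750))        # ~833 W from the outlet at a full 750 W load
print(wall_draw_watts(350 + 250))  # ~667 W: 350 W GPU + assumed 250 W rest of system
```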
 
$2k and it's also super THICC (read: case compatibility issues)

lol, Nvidia isn't even hiding the fact that they're just milking the day-one adopters
 

b0uncyfr0

Member
Pricing issue aside, if no more than 10GB gets used in the next, say, 3 years, why do you want more that you won't use? Obviously if you're intending to keep the card for a long, long time then fair enough, but I don't get that impression from most users here.
Well yes, if I paid that much for a card, I expect it to last a long time. I'm sure I've seen some games use more than 8GB recently, and that's not taking into consideration that the next-gen consoles have a lot more at 16GB. If you upgrade every year, this won't be a problem. I keep my cards a minimum of 2 years (I prefer 3), so I like to plan ahead.
 

Hostile_18

Banned
Well yes, if I paid that much for a card, I expect it to last a long time. I'm sure I've seen some games use more than 8GB recently, and that's not taking into consideration that the next-gen consoles have a lot more at 16GB. If you upgrade every year, this won't be a problem. I keep my cards a minimum of 2 years (I prefer 3), so I like to plan ahead.

Don't forget though that the console RAM is shared; there is no separate system RAM. Not long till we find out everything 😊
 

kittoo

Cretinously credulous
Well yes, if I paid that much for a card, I expect it to last a long time. I'm sure I've seen some games use more than 8GB recently, and that's not taking into consideration that the next-gen consoles have a lot more at 16GB. If you upgrade every year, this won't be a problem. I keep my cards a minimum of 2 years (I prefer 3), so I like to plan ahead.

I would've gone for a 3080 in a heartbeat if Nvidia didn't gimp it with that amount of RAM. At 4K and high settings there are already games that use more than 8GB of video RAM; I think the Resident Evil 2 remake did that. I've personally seen my 1070's 8GB filled. So next-gen 4K is almost guaranteed to run into that wall pretty soon.
Really pissed at Nvidia about this.
 

ShirAhava

Plays with kids toys, in the adult gaming world
The RTX 3080 and below seem like they're for people who still have Fermi or Kepler cards; it's a great upgrade for them.

980 Ti and up, either get the 3090 or wait... the 3080 is great for now, but with 10GB of VRAM you are going to need a new card in 12-18 months.
 

llien

Member
Fucking hell... $2000
That's in China.
The recent update is that it's $1400.

I would've gone for a 3080 in a heartbeat
Without knowing how much it consumes or costs, or how performant it is.

 

Arun1910

Member
Pricing issue aside if no more than 10gb gets used in the next say 3 years why do you want more that you won't use? Obviously if your intending to keep the card for a long, long time then fair enough but I don't get that impression from most users here.

I'm just going to link this thread and copy/paste a post here for why 10GB may not be great for "next gen".

TL;DR: Better RT implementations = need more VRAM. Better console hardware = higher specs for PC = probably need more VRAM.


 

Hostile_18

Banned
I'm just going to link this thread and copy/paste a post here for why 10GB may not be great for "next gen".

TL;DR: Better RT implementations = need more VRAM. Better console hardware = higher specs for PC = probably need more VRAM.




See, if anyone is in a position to know this, it's Nvidia. So either A) they don't think more than 10GB is necessary for most people, B) they are deliberately gimping the cards to get people to upgrade again, or C) these cards are going to launch at a super competitive price.

I'm new to the PC market, so I haven't had enough experience to say which one it may be.
 

Kenpachii

Member
I think the 3090 is the equivalent of a Titan. Why else would it supposedly have over twice as much RAM as the 3080, and launch early? The Titan generally preceded the x80 Ti; the x80 Ti was comparable but at a lower price and with less RAM. Turing is the only time the x80 Ti was released alongside the x70 and x80 cards.

My prediction is the 3080 Ti will have 16-18GB of RAM, cost about $1,000, and be released to coincide with Big Navi.

The 3090 is nowhere near a Titan; a Titan's VRAM pool should be 32GB+. 24GB is basically where you would expect the 3080 Ti to sit.

Then, about the 3080.

The 3080 is a budget card sold for top-end money. Anything below 16GB of VRAM has no future, simple as that.

I would not even consider the card as a 3060 option.

People are currently looking through this generation's glasses. The next-generation baseline for VRAM will be 16GB; the baseline will go up massively.

This reminds me of the 580 1.5GB: totally solid GPU and memory performance, and the VRAM was fine for ultra settings on any game until PS4 games arrived. Then it couldn't even run Unity or Watch Dogs on any setting because of the VRAM bottleneck. It probably won't be as bad for the 3080, but expect medium settings to be a thing for the card through the entire generation, if not low.

Any GPU at $400+ when the next-gen consoles are out needs 16GB of VRAM.

10GB makes absolutely no sense in any way, other than that they eyeballed that number. The only way that card has any reason to exist is if it's dirt cheap, like 200 bucks. And we all know that won't be the case.

It's clear to me that the entire 3000 series besides the 3090 is going to be replaced in 6 months with proper versions once AMD has their cards on the market. Anybody buying into these cards will be burned hard.

And for me, with a 1080 Ti, I will be sitting this clusterfuck of a generation out until they offer something half decent.
 

ZywyPL

Banned
I'm just going to link this thread and copy/paste a post here for why 10GB may not be great for "next gen".

TL;DR: Better RT implementations = need more VRAM. Better console hardware = higher specs for PC = probably need more VRAM.




For what resolution, though? 1080p? That will be plenty of RAM. 1440p? Still more than enough. 4K? That's where the issue might indeed appear, but anyone planning to plug their PC into a 4K display will most likely be looking at the 3090. Still, the gap between the 3080 and the new top-end RTX a.k.a. the 3090 is too damn wide; looking at past generations, an 80SM card is clearly missing from the lineup, with potentially 12-16GB of RAM. But I think that's exactly the plan, and NV is back to their strategy of filling the gaps later on: forcing people either to get the highest possible model at launch (testing how many are really willing to buy the top, most expensive cards no matter the cost), or to start somewhere in the middle with, say, a 3060/3070 and upgrade later to a 3080 Super (72SM) or Ti (80SM) model, basically double-dipping.
 

Ellery

Member
The 3090 is nowhere near a Titan; a Titan's VRAM pool should be 32GB+. 24GB is basically where you would expect the 3080 Ti to sit.

Then, about the 3080.

The 3080 is a budget card sold for top-end money. Anything below 16GB of VRAM has no future, simple as that.

I would not even consider the card as a 3060 option.

People are currently looking through this generation's glasses. The next-generation baseline for VRAM will be 16GB; the baseline will go up massively.

This reminds me of the 580 1.5GB: totally solid GPU and memory performance, and the VRAM was fine for ultra settings on any game until PS4 games arrived. Then it couldn't even run Unity or Watch Dogs on any setting because of the VRAM bottleneck. It probably won't be as bad for the 3080, but expect medium settings to be a thing for the card through the entire generation, if not low.

Any GPU at $400+ when the next-gen consoles are out needs 16GB of VRAM.

10GB makes absolutely no sense in any way, other than that they eyeballed that number. The only way that card has any reason to exist is if it's dirt cheap, like 200 bucks. And we all know that won't be the case.

It's clear to me that the entire 3000 series besides the 3090 is going to be replaced in 6 months with proper versions once AMD has their cards on the market. Anybody buying into these cards will be burned hard.

And for me, with a 1080 Ti, I will be sitting this clusterfuck of a generation out until they offer something half decent.

Now that you mention it, the funny thing is that the 3080's 10GB and the 3090's 24GB differ by roughly the same ratio as the pixel counts of 1440p and 4K.
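
The rough numbers do bear that out (a quick sketch using the rumored memory sizes and standard resolutions):

```python
# Comparing the VRAM ratio to the pixel-count ratio.
vram_ratio = 24 / 10                         # 3090 vs 3080 VRAM -> 2.4x
pixel_ratio = (3840 * 2160) / (2560 * 1440)  # 4K vs 1440p pixels -> 2.25x
print(vram_ratio, pixel_ratio)               # 2.4 vs 2.25: roughly the same
```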
 

nemiroff

Gold Member
How much would you be willing to pay for the RTX 3090 (60% performance increase over the 2080 Ti, 24GB GDDR6X, 350W TDP)?

I don't know; it's kinda hard to come up with a clear thought about this, it's so dependent on the sum of all the facts, and we don't know for sure yet. To not feel bad about it, it would have to be less than $1,500. If the card is mindblowingly amazing I could potentially go up to $2,000. IDK, we'll see.

It's going to be so interesting to finally see the GPUs materialize with prices, and to hear Nvidia explain the 10GB of VRAM on the 3080. They must surely be aware that this is something their customers could be hesitant about.
 
See, if anyone is in a position to know this, it's Nvidia. So either A) they don't think more than 10GB is necessary for most people, B) they are deliberately gimping the cards to get people to upgrade again, or C) these cards are going to launch at a super competitive price.

I'm new to the PC market, so I haven't had enough experience to say which one it may be.

I can tell you right now that it's option B.
 

llien

Member
"Specifications leaked"

PCIe 4.0, HDMI 2.1, DP 1.4a
3090
24GB on a 384-bit bus at 19.5 Gbps (936 GB/s bandwidth)
5,248 CUDA cores @ 1695 MHz (if this is the boost frequency, that's 17.8 TFLOPS, 2080 Ti + ~30%)
350W TDP
(FE: "new connector"; AIB: two 8-pin)

About 14% more shaders for 25% more power compared to TU102.

3080
10GB on a 320-bit bus at 19 Gbps (760 GB/s bandwidth)
4,352 CUDA cores @ 1710 MHz
320W?? TDP

(Note: the RTX 2080 Ti has 4,352 CUDA cores (the same), 1545 MHz boost, 11GB, 616 GB/s, 250W, on a 12nm process.)

3070
It exists (surprising, eh?)
Older memory @ 16 Gbps on a 256-bit bus

Buzzwords skipped: "even better tensor cores", "even ray tracier ray tracing", AI, ML, Double Penetration deep learning.
The surprising bit (if true), and a big deal: rumored to be TSMC 7nm.
"The data that we saw clearly mention the 7nm fabrication node."

TPU
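
For anyone wanting to verify the headline figures above, here's a minimal sketch of the arithmetic (assuming the usual FP32 convention of 2 operations per CUDA core per clock at the boost frequency):

```python
# Deriving the rumored TFLOPS and memory-bandwidth figures.
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS: cores x 2 ops/clock x clock (GHz) / 1000."""
    return cuda_cores * 2 * boost_ghz / 1000.0

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

print(tflops(5248, 1.695))        # ~17.8 TFLOPS (rumored 3090)
print(tflops(4352, 1.545))        # ~13.4 TFLOPS (RTX 2080 Ti; 3090 is ~32% higher)
print(bandwidth_gb_s(384, 19.5))  # 936.0 GB/s (3090: 384-bit @ 19.5 Gbps)
print(bandwidth_gb_s(320, 19.0))  # 760.0 GB/s (3080: 320-bit @ 19 Gbps)
```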
 