
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

KungFucius

King Snowflake
So with regard to AIBs... is it the consensus here that EVGA and Gigabyte are the best options? Any other recommendations?

If you want it ASAP, you may only be able to find a random brand in stock. Even then there's still the silicon lottery, so you either wait or take your chances. It will take a long time before reviews really tell you which particular brand is better. It's rare that there's a truly bad model, but sometimes there are bad batches, and those take months to show themselves.

I expect this round of GPUs to be very hard to get because they are launching at the same time as the consoles and Cyberpunk. I plan to open tabs for all sites and hammer the fuck out of them until I see an Add to Cart button. I don't care too much what brand but would prefer a step up in the cooling department. I just want one to get here in time to play Cyberpunk on it.
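For what it's worth, here's roughly what I mean in script form. Purely a sketch: the retailer URLs are made up, and it just assumes the page literally shows an "Add to Cart" string once the item is buyable.

```python
import time

import requests  # assumes the requests package is installed

# Hypothetical product pages to poll; real launch-day URLs will differ.
URLS = [
    "https://example-retailer.com/rtx-3080",
    "https://example-retailer.com/rtx-3090",
]

def looks_in_stock(url):
    """Naive check: assume the listing shows 'Add to Cart' text once it's buyable."""
    try:
        page = requests.get(url, timeout=10)
        return "Add to Cart" in page.text
    except requests.RequestException:
        return False

while True:
    for url in URLS:
        if looks_in_stock(url):
            print("Possible stock:", url)
    time.sleep(30)  # be polite; retailers rate-limit or ban aggressive polling
```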
 

llien

Member
So, "tier up" in terms of CUs (3070 has 2080, 3080 has what 2080Ti had, with a bit higher clocks) and "doubling" of barely useful RT cores.
Strange bit is the much higher number of transistors involved (i.e. 3070 having more than double of the respective card).
Either that info is wrong, or NV made yet unseen major arch changes.

Based on what Alex said in the DF videos, the artifacts are visible when you zoom into sections of the screen, but at normal viewing distances are very hard to spot.
But the same can be said about 4K in general. Sitting at a normal distance? Things moving on the screen?
Good luck spotting the difference. No buzzwordy tech needed either.

I agree that ray tracing is currently (ie at this very moment in time) irrelevant but I think we'll see a change once the consoles (that both support ray tracing on some fundamental level) are released.
Even before the recent statement from Microsoft's side (on devs being reluctant to touch RT), it was hard to come up with reasons to use RT.
As a game developer, especially after seeing Epic's PS5 demo, why would you even consider using a DXR-like API? What for?

It's not just fancy image sharpening
Upscaling. Yes, that's exactly what it is. And it's hilarious that in the relevant thread a user pointed out that the raindrops on the backpack were wiped out, without a single reviewer "noticing" it. The "but you need to get really close to see the difference" line is a funny argument.
 

supernova8

Banned
Upscaling. Yes, that's exactly what it is. And it's hilarious that in the relevant thread a user pointed out that the raindrops on the backpack were wiped out, without a single reviewer "noticing" it. The "but you need to get really close to see the difference" line is a funny argument.

It's an upscaling solution that AMD cannot (or at least does not) currently replicate, and if it still cannot replicate it then that's a win for NVIDIA. Listen, I don't even want NVIDIA to 'win' because it's better for the consumer if AMD wins for once (to drive prices down a little), but sneering at DLSS is silly.
 

supernova8

Banned
only just becoming?

most of these 'insider' YT'ers are just recycling and padding rumors.
Thanks for your click$views

Stay away from 'insiders' who don't do their own content or have sponsored content.
They are a waste of time

Well yeah, literally all of these YouTubers are like that, because the actual insiders want to stay anonymous for obvious reasons. MLID and RedGamingTech clearly know they can make more money by getting their videos over 10-15 minutes, so literally every video is that long (or longer) despite them having practically nothing to talk about.

At least in RGT's case we can go to their website and see the information and be done in about 30 seconds, but with MLID you have to sit through that shit unless he puts timestamps (which he only does for those long-ass 'podcast' borefests).

But yeah it's not like I could do any better so I should stop complaining :p
 

McHuj

Member
I wonder if a 3090 would fit in any HTPC case? Anyone put anything similar in a small case?
 

Siri

Banned
So the 3090 probably won't fit in my mini-ITX case, probably won't run on my 700-watt PSU, and probably won't be readily available for many months after launch.

I wonder if one of the board partners will find a way to create a slightly toned down 3090?
 

Yoda

Member
I can't fit the 3090 in my case and I'm a big fan of the ITX form factor. I'll hold off for now; hopefully someone comes out with a 2-slot card that's not as long (doubtful on this point).
 

Xyphie

Member
If you're watercooling, you'll be fine in a Nano S: while the card is 310mm long, the PCB with a waterblock will be significantly shorter.
 

llien

Member
cannot (or at least does not) currently replicate
There is nothing but highly subjective (not to say dubious) assessment behind how well AI upscaling 2.0 actually works.
At the end of the day, it revolves around "people are not that good at figuring out true resolution".
 

FireFly

Member
But the same can be said about 4K in general. Sitting at a normal distance? Things moving on the screen?
Good luck spotting the difference. No buzzwordy tech needed either.
Well, in the DF videos they were doing an 8x magnification of sections of the screen to see the difference, so presumably your face would need to be pressed against the screen.

But I think the point is that for a given viewing distance, the reconstruction will let you get away with a lower source resolution. So if you "don't need" 4K, then instead of using 1440p, you can use 1080p with reconstruction instead. And if the extra performance can be used to add more visual effects, then you could end up with something that is better looking overall. At least, why would you not want the option of being able to do this?
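Rough pixel-count math illustrates the headroom (nothing DLSS-specific, just arithmetic on standard resolutions):

```python
# Pixel counts for common resolutions; shading cost scales roughly with pixel count.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K": 3840 * 2160,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / resolutions['4K']:.0%} of 4K")

# 1080p is ~56% of 1440p's pixel count, so reconstructing from 1080p instead of
# rendering native 1440p frees up a large chunk of the frame budget for effects.
print(f"1080p/1440p pixel ratio: {resolutions['1080p'] / resolutions['1440p']:.2f}")
```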

In a few years I imagine both AMD and Nvidia will have similar options and it will no longer be a buzzword, but an established part of the technology stack. At that point all of the flashy marketing will be irrelevant.
 

SmokSmog

Member
 

kiphalfton

Member
I wouldn't usually buy directly from EVGA's website, but in the event better models come out in October/November (to coincide with the Big Navi launch), you can just use the Step-Up program.
 

Nydus

Member
Well yeah, literally all of these YouTubers are like that, because the actual insiders want to stay anonymous for obvious reasons. MLID and RedGamingTech clearly know they can make more money by getting their videos over 10-15 minutes, so literally every video is that long (or longer) despite them having practically nothing to talk about.

At least in RGT's case we can go to their website and see the information and be done in about 30 seconds, but with MLID you have to sit through that shit unless he puts timestamps (which he only does for those long-ass 'podcast' borefests).

But yeah it's not like I could do any better so I should stop complaining :p
That's why I watch Gamer Meld. ~5-minute videos aggregating all the rumours floating around. He even gives a heads-up on how credible each leak is, based on the leaker's track record. Really helpful with those Twitter leaks from Chinese insiders.
 

Myths

Member
I am currently rocking a Zen 2 Ryzen 7 3700X with my overclocked RTX 2080 and it runs damn well. I haven't had any troubles so far in any game at 1440p 144hz.

If you can wait go with the Zen 3 4000 CPUs which are probably another 15% better than the Zen 2 CPUs.

Also I don't really like talking about bottlenecks that much, because it depends too much on games. There are games that bottleneck every CPU out there because of the nature of the game, but most modern AAA games are always running the GPU at max load before they run into any other bottleneck. Especially at 4K the GPUs are not good enough and won't be for a long while.

Additionally it has to be said that the differences between the high end CPUs (both from AMD and Intel) are not that big, and also depend on games. In some games the Intel CPUs are 10% better and then there are games like Troy where the 16 core AMD 3950X is 30% better than the best Intel CPU.
All of the high end CPUs 3700X/3800X(T)/3900X/3950X/9900K/10900K are good enough and won't be a bottleneck for any graphics card and the performance will only go up with the 4700X-4950X and 11900K.
I’m thinking about upping from 9900K to 10900K anyway. I’d feel the need to up my PSU 100 more watts from 850 paired with the 3090 though.
 

supernova8

Banned
That's why I watch Gamer Meld. ~5-minute videos aggregating all the rumours floating around. He even gives a heads-up on how credible each leak is, based on the leaker's track record. Really helpful with those Twitter leaks from Chinese insiders.

Yeah his short videos are good but I cannot stand his voice and especially the "welcome everywuun to geeeeeeeeeeeeeeeeeeeeemer maaalld"
 

Ellery

Member
I’m thinking about upping from 9900K to 10900K anyway. I’d feel the need to up my PSU 100 more watts from 850 paired with the 3090 though.

I guess that depends on the 3090, whether it's a special aftermarket design, and how much overclocking you want to do with the CPU/GPU. I could see an unlocked 10900K and a 3090 Lightning with a maxed power target being quite demanding in terms of power draw.

We will see soon. I expect the 3090 Founders Edition to be 350 W TDP.
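Napkin math on the PSU question; every figure below is a rough assumption rather than a measured number:

```python
# Rough system power budget; all numbers are ballpark assumptions, not measurements.
components_w = {
    "RTX 3090 (rumored FE board power)": 350,
    "10900K, unlocked, heavy load": 250,  # well above the 125 W base TDP
    "Motherboard / RAM / SSDs / fans": 75,
}

total = sum(components_w.values())
headroom = 1.25  # keep sustained draw around ~80% of the PSU rating for transients
print(f"Estimated load: {total} W -> suggested PSU: ~{total * headroom:.0f} W")
```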
 

BluRayHiDef

Banned
So, for someone who is looking for a card that will max out most games at 4K with at least decent ray tracing for the next four years, would it be foolish to snag a 3090 as soon as possible instead of waiting for Big Navi?
 

Blade2.0

Member
I'm sitting with an AMD Ryzen 2500 + a 1080 Ti and 16GB of RAM. How good am I for next-gen games? My case runs hot, though; I can get it to 90°C if the game is intense. I think I should get a new case with more fans and maybe watercool the 2500, or move up to a 3500. Anyone got tips on what to do with mine?
 

longdi

Banned
I'm sitting with an AMD Ryzen 2500 + a 1080 Ti and 16GB of RAM. How good am I for next-gen games? My case runs hot, though; I can get it to 90°C if the game is intense. I think I should get a new case with more fans and maybe watercool the 2500, or move up to a 3500. Anyone got tips on what to do with mine?

I guess you can move up to a 3700X now if you need to; that's an easy upgrade.
The new Zen 3 4000 series will of course be even better, but it's coming late in the year, and your current motherboard may or may not support it.
 
So, for someone who is looking for a card that will max out most games at 4K with at least decent ray tracing for the next four years, would it be foolish to snag a 3090 as soon as possible instead of waiting for Big Navi?

That sounds like you need the most powerful GPU possible. Which means AMD won't even be an option. Get the 3090.
 

888

Member
I have the money for it, but I don't want to buy a whole new case if I don't have to. I have the Corsair Crystal 570X case, for reference.

I have the same case. The 3090 is about 12.5 inches. You will be fine; I have tons of room even with a push-pull 360mm rad on the front.
 

BluRayHiDef

Banned
That sounds like you need the most powerful GPU possible. Which means AMD won't even be an option. Get the 3090.

I know that AMD often disappoints in the enthusiast market, but I keep hearing that Big Navi will change things (even though that's said before every new micro-architecture release and usually turns out to be false). Also, there's the issue of TDP: the 3090 is said to run hot and consume a lot of electricity, while Big Navi is said to run cooler and consume less power.
 
I know that AMD often disappoints in the enthusiast market, but I keep hearing that Big Navi will change things (even though that's said before every new micro-architecture release and usually turns out to be false). Also, there's the issue of TDP: the 3090 is said to run hot and consume a lot of electricity, while Big Navi is said to run cooler and consume less power.

Oh I wouldn't be surprised if AMD indeed delivers a compelling high end GPU in November. It's just that the 3090 will be faster, and if you're not willing to make any compromises with regards to 4K performance, the 3090 is the card to get.
 

BluRayHiDef

Banned
Oh I wouldn't be surprised if AMD indeed delivers a compelling high end GPU in November. It's just that the 3090 will be faster, and if you're not willing to make any compromises with regards to 4K performance, the 3090 is the card to get.
Yes, but what if the difference in performance is negligible and is outweighed by price and TDP?
 
Remember:



[image: "The more you buy, the more you save"]

This is very dumb.

The top end is never best at price/performance because cost per mm² is not linear: the larger the die, the more likely it is to have defects. That's just the nature of semiconductor manufacturing. The mid-range is where the price/performance sweet spot is.
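A toy yield model shows why cost per good die grows faster than area. The defect density here is an illustrative assumption, not a real foundry figure:

```python
import math

def die_yield(area_mm2, defects_per_mm2=0.001):
    """Toy Poisson yield model: fraction of dies that come out defect-free."""
    return math.exp(-area_mm2 * defects_per_mm2)

def relative_cost_per_good_die(area_mm2, defects_per_mm2=0.001):
    """Cost per *good* die, in arbitrary units of silicon area (ignores edge losses)."""
    return area_mm2 / die_yield(area_mm2, defects_per_mm2)

for area in (250, 400, 505, 630):
    print(f"{area} mm^2: yield ~{die_yield(area):.0%}, "
          f"cost per good die ~{relative_cost_per_good_die(area):.0f} units")

# A 630 mm^2 die costs ~3.7x as much per good die as a 250 mm^2 one under this model,
# despite having only ~2.5x the area, which is why the top end never wins price/perf.
```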

"The more you buy, the more you save." That's literally why Costco Wholesale exists. If I buy 10 toothbrushes for $6 instead of 1 toothbrush for $1, I'll be saving money in the long run. There is no controversy.
 

nemiroff

Gold Member
What do you think about this?



The funny thing about that video is that he basically said very little of substance. No new information, just speculating from the BLOPS trailer.

With that said, I'm definitely rooting for AMD to give Nvidia proper competition.
 

llien

Member
AMD's rumored 505mm² die should land close enough to the 3080 (and likely that is why it only wields 10GB; it's much cheaper that way), while the 3090 is likely 600mm²+ and will not be directly challenged.

What's interesting is the board power figures put out by Gainward. The RTX 3090's typical board power (at least for the Phoenix GS) is rated at 350 W, while that of the RTX 3080 is rated at 320 W.




TPU

Curious, since most of that 30 W difference is eaten up by the additional 14GB of GDDR6X.
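Back-of-the-envelope on that point; the per-module wattage is an assumption (GDDR6X power figures aren't public), so treat it as illustrative:

```python
# 3090: 24 x 1GB GDDR6X modules; 3080: 10 x 1GB modules -> 14 extra modules.
extra_modules = 24 - 10
watts_per_module = 2.5  # assumed per-module draw; not an official figure

extra_memory_w = extra_modules * watts_per_module
board_power_gap_w = 350 - 320

print(f"Extra memory: ~{extra_memory_w:.0f} W vs a {board_power_gap_w} W board power gap")
# ~35 W vs 30 W: the extra memory alone could plausibly account for the whole gap,
# which would mean the GPU core itself is being fed about the same on both cards.
```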
 

Armorian

Banned
you have to account for the better clock speeds and bandwidths.

This won't matter much if a game actually needs more than 10GB of memory; the 3080's memory setup is for people who play at high fps/1440p. The 20GB version will be for 4K lovers.

I’m thinking about upping from 9900K to 10900K anyway. I’d feel the need to up my PSU 100 more watts from 850 paired with the 3090 though.

You have to be shitting me, there's almost no performance upgrade if your 9900K is overclocked. And the 10-core/20-thread setup is weird; no games will use it. Wait for some new arch from Intel or for Ryzen 4xxx. Oh, and you would be locked to PCIe 3.0 for that 3090, and who knows if that card needs more bandwidth or not.

 

supernova8

Banned
Oh, and you would be locked to PCIe 3.0 for that 3090, and who knows if that card needs more bandwidth or not.

Yeah, I'm wondering about the PCIe 3 to 4 stuff. It's a bit beyond even my shit understanding. I've got a PCIe 3 board right now (B450), so I guess I'll be OK. How much performance are we leaving on the table without PCIe 4?
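The raw link bandwidth difference is easy enough to work out; how much of it actually matters to a GPU in games is the part nobody knows yet:

```python
# Approximate usable bandwidth per lane (after 128b/130b encoding overhead), in GB/s.
per_lane_gbs = {
    "PCIe 3.0": 0.985,  # 8 GT/s per lane
    "PCIe 4.0": 1.969,  # 16 GT/s per lane
}

lanes = 16
for gen, bw in per_lane_gbs.items():
    print(f"{gen} x{lanes}: ~{bw * lanes:.1f} GB/s")

# ~15.8 GB/s vs ~31.5 GB/s of raw link bandwidth. Whether a 3090 ever saturates a
# 3.0 x16 link in games is the open question; on earlier cards the measured gap was small.
```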
 

kittoo

Cretinously credulous
Are we sure that a 3080 20GB is coming? If so, I will get that. Ain't gonna get a gimped 3080 10GB or a super expensive 3090.
 