
GeForce RTX 3090 Ti launches on March 29th, reviews coming on the same day

Hezekiah

Banned
Sorry, I didn't mean to link you as someone else. If you notice, the 4080 in that leak is about the 3090 level, but with less VRAM. So in theory, I am not expecting much here. Maybe it will have better ray tracing lol.

The point is, there is a reliable leak that the full chip has been delayed, God knows till when. And even if they release it this year, it will be a paper release like usual, and then scalping and stuff. You will end up with nothing in general until 2023. While the 3090, if found below MSRP, is really good value with its 24 gigs and will last you for years.
Finding a 3090 below MSRP is like finding gold.

Also, as mentioned it has horrible power consumption, and I'm sure the 4080 will be better in that regard.
 
Sorry, I didn't mean to link you as someone else. If you notice, the 4080 in that leak is about the 3090 level, but with less VRAM. So in theory, I am not expecting much here. Maybe it will have better ray tracing lol.

The point is, there is a reliable leak that the full chip has been delayed, God knows till when. And even if they release it this year, it will be a paper release like usual, and then scalping and stuff. You will end up with nothing in general until 2023. While the 3090, if found below MSRP, is really good value with its 24 gigs and will last you for years.
So can you share the link with us or is it a secret?

edit: is this it? https://www.tweaktown.com/news/8486...s-gpu-specs-leak-thanks-to-hackers/index.html Are we to assume 103 means *80 series?
 
Last edited:

//DEVIL//

Member
Finding a 3090 below MSRP is like finding gold.

Also, as mentioned it has horrible power consumption, and I'm sure the 4080 will be better in that regard.
I am not sure what part of the world you live in, but here in Canada the 3090 is always below MSRP for used video cards. I bought a 3090 STRIX White with the stickers still on it for 2300 CAD (around 1850 USD),

and some are even selling brand new, sealed 3090 FEs for about the same price.
 

//DEVIL//

Member
So can you share the link with us or is it a secret?

About the delay.


About the leak



This doc was created from the latest hack on Nvidia where some of the stuff got leaked. This was a screenshot grab from the Graphically Challenged channel on YouTube, where he shared it.



This is his video as well.
In theoretical numbers it suggests it's more or less 40% higher than the 3090, but in reality this is going to be a 10 to 15% actual performance gain, and with less VRAM.

At this point, when everyone will jump on the 4000 series for mining or scalping, plus the limited quantity it will have assuming it's not delayed, we will be lucky to get a 4080 by early 2023.

Mind you, these cards are also being pushed hard on power. 500 and 600 watts for a video card? Sure, if you wanna BBQ on it go ahead; you will have a new furnace in your home.

Unless Nvidia switches to MCM, which isn't happening this 4000 generation... now come the 5000 series? ... yeah, that's worth it.
 
Last edited:

//DEVIL//

Member
That stinks, if true. So only big upgrade for 3080 owners might be 4090 for $2K.. :messenger_pensive:

The question is... do you really need it? Do you need all that power? I game at an ultrawide 2K monitor. This 3090 will be good with Unreal Engine 5 games, everything ultra, at an easily locked 60 fps and higher (with DLSS I am locking to 120 fps no matter what).

So why jump from the 3080? It's perfectly fine, unless you like the latest video card no matter what, then yeah. Also, you can't think of it as $2000 cash upfront; you have your card's resale value that would help.

But if anything, I will not sell my 3090 Strix White till I get the card I want... then I will sell. Made the mistake of selling the 2080 Ti before the release of the 3000 series thinking I made the right choice... man did I get fucked... never again.
 
Sorry, I didn't mean to link you as someone else. If you notice, the 4080 in that leak is about the 3090 level, but with less VRAM. So in theory, I am not expecting much here. Maybe it will have better ray tracing lol.

The point is, there is a reliable leak that the full chip has been delayed, God knows till when. And even if they release it this year, it will be a paper release like usual, and then scalping and stuff. You will end up with nothing in general until 2023. While the 3090, if found below MSRP, is really good value with its 24 gigs and will last you for years.
No way in hell the 4080 isn’t better than 3090.
 

Dream-Knife

Banned
My beef is that they will still be using the same standard cooling solution we've been using for years now on something that'll be pumping out real heat.
I wonder if we will eventually get GPUs with tower style coolers. Build some adjustable support stands in it, and go crazy.
Wow this is a stupid release that will probably still sell unfortunately :/

At a certain point extra power is just stupid if it runs much hotter to do it. This is maybe going to be 5 to 10% better but with 100 more watts? ridiculous.

This is one reason I stick with xx50 ti or x060 level nvidia cards typically and the 70 range is as high as i’ll ever go.

With enough vram, Lovelace 4070 will beat this card at much lower watts and be much cheaper if you can wait.
Rumor is these new cards are using a lot of power. They also will probably be using the ATX 3.0 connector.
We assumed a lot of the same things pre 3080 and we were mostly wrong. We'll see release date, but gut instinct is you're wasting money on 3090TI just like the Titans. But hey if ppl have money to waste then go for it.
Halo cards have always been a waste of money if you measure performance vs cost. The type of people who buy them have to have the best regardless, so this will sell.
On average at launch (cuz you know how Nvidia seemingly abandons old cards), the 2080 vs 3080 gap was closer to 50% than a complete doubling.

And at 4K, the 2080 was already NOT a 4K card at the 3080's launch; it was bandwidth and memory starved, which is why at 4K the performance differential suddenly changed.

The 3080 vs 4080 memory setup is a lot closer.
12G vs 16G
384bit vs 256bit + huge cache.

We don't know how much that extra cache will help, but I'd hazard a guess and say at launch the 4080 won't be double the performance of the 3080 12G at 4K.
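For rough context, here is a back-of-the-envelope raw bandwidth comparison (a minimal sketch in Python; the 19 Gbps figure is the 3080 12G's GDDR6X rating, the 21 Gbps and 256-bit figures for the 4080 are just the rumored numbers, and none of this accounts for the cache):

```python
# Peak memory bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8 = GB/s.
def raw_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

rtx_3080_12g = raw_bandwidth_gbs(384, 19.0)   # ~912 GB/s (known spec)
rumored_4080 = raw_bandwidth_gbs(256, 21.0)   # ~672 GB/s (rumored figures)
print(f"3080 12G: {rtx_3080_12g:.0f} GB/s, rumored 4080: {rumored_4080:.0f} GB/s")
```

So on raw numbers the 4080 would actually be behind, and everything hinges on how well the big L2 covers the gap.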
Not really fair to compare the $1200 12gb model to the 4080. It should be compared to the $700 10gb model. The 12gb model has about a 2-4% performance difference, which heavily ruins the price to performance ratio, and would justify an increase in MSRP of the 40 series cards.
Finding a 3090 below MSRP is like finding gold.

Also, as mentioned it has horrible power consumption, and I'm sure the 4080 will be better in that regard.
I wouldn't really count on it.
40 series right around the corner. Can't imagine being sweaty enough to buy this.
Some people have to have the best. I've never understood it, but there's some people who care more about benchmarking than actually using their system. Overclocking is a hobby into itself.
 
That stinks, if true. So only big upgrade for 3080 owners might be 4090 for $2K.. :messenger_pensive:
How is the 4080 being 40% better than the 3090 not a big upgrade over the 3080? That is Maxwell to Pascal territory.

According to that chart, I can't settle for anything less than a 4060 Ti, seeing as the 4060 has a 128-bit bus and 8GB? Which makes me doubt the whole chart, btw.

No way would I get a 4060 with 8GB since my 3060 has 12 lol.
 
Last edited:

//DEVIL//

Member
No way in hell the 4080 isn’t better than 3090.
In theoretical numbers it's 40% more. How much of that translates to video gaming? We don't know; usually less. And it will have less VRAM as well (so far it's pointing to 16 gigs for the 4080), and it will draw more watts, thus more heat or at least a 3.5-slot card. That is too much, really, for a gain of 20 to 30% at best.

By today's standard, is the 2080 Ti inferior to the 3080 for regular gaming? No, both are amazingly fine. The same will go for the 3090 and 4080.
 
In theoretical numbers it's 40% more. How much of that translates to video gaming? We don't know; usually less. And it will have less VRAM as well (so far it's pointing to 16 gigs for the 4080), and it will draw more watts, thus more heat or at least a 3.5-slot card. That is too much, really, for a gain of 20 to 30% at best.

By today's standard, is the 2080 Ti inferior to the 3080 for regular gaming? No, both are amazingly fine. The same will go for the 3090 and 4080.
In all reality, 16 gigs on the 4080 will be God's plenty. But higher heat is an issue, if true. 10 gigs on the 3080 is an issue in the long run. But 16? That's great.

But I am betting on Maxwell-to-Pascal performance, personally. If the 4080 is indeed 40% faster than the 3090, then that is definitely a 980 > 1080 leap comparing the 3080 to the 4080.
 

Dream-Knife

Banned
How is the 4080 being 40% better than the 3090 not a big upgrade over the 3080? That is Maxwell to Pascal territory.

According to that chart, I can't settle for anything less than a 4060 Ti, seeing as the 4060 has a 128-bit bus and 8GB? Which makes me doubt the whole chart, btw.

No way would I get a 4060 with 8GB since my 3060 has 12 lol.
The 3060 getting 12GB was likely a response to AMD, and a 192-bit bus is either 6 or 12. AMD's next cards are also rumored to have less VRAM.

These cards supposedly have a lot of L2, which mitigates the downside of a smaller bus. It's the same thing AMD did with RDNA2.
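The 6-or-12 point falls straight out of how capacity is tied to bus width: each 32-bit channel usually gets one GDDR6 chip, and the common chip densities are 1GB or 2GB. A minimal sketch, ignoring clamshell configurations:

```python
# Capacity options implied by bus width: one GDDR6 chip per 32-bit channel,
# with 1GB or 2GB per chip being the common densities (clamshell ignored).
def capacity_options_gb(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32
    return [channels * density for density in (1, 2)]

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit -> {capacity_options_gb(bus)} GB")
# 128-bit -> [4, 8] GB, 192-bit -> [6, 12] GB, 256-bit -> [8, 16] GB, 384-bit -> [12, 24] GB
```

Which is at least consistent with that chart's 128-bit / 8GB pairing for the 4060.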
In all reality, 16 gigs on the 4080 will be God's plenty. But higher heat is an issue, if true. 10 gigs on the 3080 is an issue in the long run. But 16? That's great.

But I am betting on Maxwell-to-Pascal performance, personally. If the 4080 is indeed 40% faster than the 3090, then that is definitely a 980 > 1080 leap comparing the 3080 to the 4080.
1080 -> 2080 was 30%. 2080 -> 3080 was 40%. I think it's safe to bet 30-40% every gen.
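As a rough sanity check on how those per-generation jumps compound (just arithmetic, assuming the 30-40% figures hold):

```python
# Two generations of 30-40% uplift each, compounded.
low, high = 1.30, 1.40
print(f"two gens at 30%: {low**2:.2f}x, two gens at 40%: {high**2:.2f}x")
# two gens at 30%: 1.69x, two gens at 40%: 1.96x
```

So someone skipping a generation would be looking at roughly 1.7-2x, if those rates hold.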
 
Dream-Knife Yeah, but the lower bus width on RDNA2 affected performance at either higher or lower resolutions, can't remember which. It definitely was a cost-cutting measure first and foremost.

Definitely wouldn't want to downgrade on my 3060's memory setup even if the 4060 outperforms it otherwise. Reducing textures / hitting VRAM limits sucks. If true, the 4060 Ti or 4070 looks like the upgrade for me. Hopefully they're not THAT power hungry.

We will see how big of a jump it is, I suppose.

Edit: just checked Hardware Unboxed; the 6800 XT performed better in some titles than the 3080 at lower resolutions but performed worse at 4K compared to the RTX 3080, definitely because of the memory bus.
 
Last edited:

winjer

Gold Member
Dream-Knife Yeah, but the lower bus width on RDNA2 affected performance at either higher or lower resolutions, can't remember which. It definitely was a cost-cutting measure first and foremost.

Definitely wouldn't want to downgrade on my 3060's memory setup even if the 4060 outperforms it otherwise. Reducing textures / hitting VRAM limits sucks. If true, the 4060 Ti or 4070 looks like the upgrade for me. Hopefully they're not THAT power hungry.

We will see how big of a jump it is, I suppose.

The Infinity Cache has many advantages: it increases effective memory bandwidth, it reduces latencies considerably, and it reduces memory accesses, resulting in lower power consumption.
At 4K, 128MB of this cache has a hit rate of over 60%. At 1440p, more than 70%.
The disadvantage is that it uses quite a bit of space on the die. But it's worth it, so much so that Nvidia is significantly increasing the caches for Ada.
It's also very good for RT, because the BVH can be stored in there, and since it has such low latency it gives a big benefit to performance.
And also consider that AI requires a lot from the memory subsystem, so having big caches helps a lot. RDNA2 doesn't have tensor units, but if RDNA3 has them, it will be very beneficial.
Ada will probably have a big jump in AI because of the big L2 cache increase.
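To put those hit rates in perspective, here is a very simplified model (a sketch only: it assumes cache hits cost no DRAM bandwidth and that the cache itself is never the bottleneck, which is not AMD's official methodology; the 512 GB/s figure is the 6800 XT's 256-bit, 16 Gbps GDDR6 spec):

```python
# Effective bandwidth model: only misses hit DRAM, so a hit rate h lets the same
# DRAM sustain 1 / (1 - h) times the request traffic.
def effective_bandwidth_gbs(dram_bw_gbs: float, hit_rate: float) -> float:
    return dram_bw_gbs / (1.0 - hit_rate)

dram_bw = 512.0  # 6800 XT: 256-bit GDDR6 at 16 Gbps
for res, hit in (("4K", 0.60), ("1440p", 0.70)):
    print(f"{res}: ~{effective_bandwidth_gbs(dram_bw, hit):.0f} GB/s effective")
# 4K: ~1280 GB/s effective, 1440p: ~1707 GB/s effective
```

Which is also why the hit rate dropping at 4K is exactly where a narrow bus starts to show.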
 
Last edited:
The Infinity Cache has many advantages: it increases effective memory bandwidth, it reduces latencies considerably, and it reduces memory accesses, resulting in lower power consumption.
At 4K, 128MB of this cache has a hit rate of over 60%. At 1440p, more than 70%.
The disadvantage is that it uses quite a bit of space on the die. But it's worth it, so much so that Nvidia is significantly increasing the caches for Ada.
It's also very good for RT, because the BVH can be stored in there, and since it has such low latency it gives a big benefit to performance.
And also consider that AI requires a lot from the memory subsystem, so having big caches helps a lot. RDNA2 doesn't have tensor units, but if RDNA3 has them, it will be very beneficial.
Ada will probably have a big jump in AI because of the big L2 cache increase.
Not discounting what the cache does or brings to the table, it just doesn't eliminate the need for a fast bus.

As I noted above, in some scenarios the 6800xt was bandwidth starved at 4k in comparison to lower resolutions.

It's going to be a very interesting battle between amd and nvidia this round. And also if Intel shows up hopefully that helps with prices.
 
Last edited:

winjer

Gold Member
Not discounting what the cache does or brings to the table, it just doesn't eliminate the need for a fast bus.

As I noted above, in some scenarios the 6800xt was bandwidth starved at 4k in comparison to lower resolutions.

It's going to be a very interesting battle between amd and nvidia this round. And also if Intel shows up hopefully that helps with prices.

Depending on the amount, it does reduce the need for a fast bus.
I don't think the 6800 XT loses at higher resolutions because of a lack of memory bandwidth. I think it's a matter of shader throughput.

Now with RDNA3, if it has Tensor Cores, more advanced RT, and more shader units, it's going to require even more cache.
 
How is the 4080 being 40% better than the 3090 not a big upgrade over the 3080? That is Maxwell to Pascal territory.

According to that chart, I can't settle for anything less than a 4060 Ti, seeing as the 4060 has a 128-bit bus and 8GB? Which makes me doubt the whole chart, btw.

No way would I get a 4060 with 8GB since my 3060 has 12 lol.
Yeah, I'm prob overreacting. It was nice to have an $800 card that is basically top tier [-10-15%], but it was silly of me to expect Nvidia to repeat that 'mistake' with the 40 series.
 

Dream-Knife

Banned
Dream-Knife Yeah, but the lower bus width on RDNA2 affected performance at either higher or lower resolutions, can't remember which. It definitely was a cost-cutting measure first and foremost.

Definitely wouldn't want to downgrade on my 3060's memory setup even if the 4060 outperforms it otherwise. Reducing textures / hitting VRAM limits sucks. If true, the 4060 Ti or 4070 looks like the upgrade for me. Hopefully they're not THAT power hungry.

We will see how big of a jump it is, I suppose.

Edit: just checked Hardware Unboxed; the 6800 XT performed better in some titles than the 3080 at lower resolutions but performed worse at 4K compared to the RTX 3080, definitely because of the memory bus.
Yeah, bandwidth is key, but keep in mind the AMD cards used GDDR6 vs GDDR6X. It's also easily overclocked to 20 Gbps.

Also, according to Microsoft at least, ideally you'll have over 12gb of Vram :messenger_grinning_smiling:.
[image: in-game VRAM usage screenshot]
 
Last edited:

//DEVIL//

Member
In all reality, 16 gigs on the 4080 will be God's plenty. But higher heat is an issue, if true. 10 gigs on the 3080 is an issue in the long run. But 16? That's great.

But I am betting on Maxwell-to-Pascal performance, personally. If the 4080 is indeed 40% faster than the 3090, then that is definitely a 980 > 1080 leap comparing the 3080 to the 4080.
40% is a theoretical number, not an actual performance gap, because the CUDA core and SM counts are still higher on the 3090 than the 4080. But due to the different design, the performance suggested on paper is about 40% faster. In reality though, I am expecting a 15 to 20 frame difference at 4K.
 
Also, according to Microsoft at least, ideally you'll have over 12gb of Vram :messenger_grinning_smiling:.
Lol what game is that?

Also, you know as well as I do some games waaaay overestimate VRAM (like in menus where it says you need 13GB etc.), and that even if a game only needs 8GB, it might use 10GB on the VRAM usage bar just because it's a 12GB card and it's got the available memory.

I'd be surprised if the 3060 ever needed more than 12GB. But maybe the next-gen cards might, in titles way into the future.
 
Last edited:

Dream-Knife

Banned
Lol what game is that?

Also, you know as well as I do some games waaaay overestimate VRAM (like in menus where it says you need 13GB etc.), and that even if a game only needs 8GB, it might use 10GB on the VRAM usage bar just because it's a 12GB card and it's got the available memory.

I'd be surprised if the 3060 ever needed more than 12GB. But maybe the next-gen cards might, in titles way into the future.
Halo Infinite.

Yeah, a lot of games will ask for much more than they need. Only time I've legitimately gone over 8 was when I disabled texture streaming in Insurgency Sandstorm and had the entire map loaded into memory (14.5gb-15gb). Perhaps that may be a reason these new cards will have less.
 
Halo Infinite.

Yeah, a lot of games will ask for much more than they need. Only time I've legitimately gone over 8 was when I disabled texture streaming in Insurgency Sandstorm and had the entire map loaded into memory (14.5gb-15gb). Perhaps that may be a reason these new cards will have less.
8GB is already obsolete; well, it wouldn't be enough for the 2080 Ti/3080 anyway even now. 10GB is just enough for the 3080 right now, but at some point it'll hit a wall like the 2GB 680 did.
 
is the 2080ti inferior to the 3080 for regular gaming?
3080 is 30-40% faster at 4k, and the 3080's MSRP was almost half.

Would've been dumb to buy a 2080 Ti with the 3080 right around the corner.
The CUDA core and SM counts are still higher on the 3090 than the 4080.
The 4080 is supposed to have higher everything except memory size and memory bandwidth:
more shading units, TMUs, ROPs, SMs, tensor cores, etc.
 
We'll see I guess. I trust Nvidia knows what they're doing. So far that 16gb has just been marketing.
Oh yeah definitely, 16gb is utterly useless so far. Probably makes sense for future proofing this next round of cards though, then 12gb for mid range and 8gb for the 4050’s and such just to keep them affordable.
 

//DEVIL//

Member
3080 is 30-40% faster at 4k, and the 3080's MSRP was almost half.

Would've been dumb to buy a 2080 Ti with the 3080 right around the corner.

The 4080 is supposed to have higher everything except memory size and memory bandwidth:
more shading units, TMUs, ROPs, SMs, tensor cores, etc.

Right... We ended up with the 2080 Ti costing more than its buying price thanks to the LHR cards and availability, etc., let alone the fact that the 3090 is an FHR card, so depending on mining this fall it might be a very popular card (usually it booms around fall). So we will see.
 

Dream-Knife

Banned
Oh yeah definitely, 16gb is utterly useless so far. Probably makes sense for future proofing this next round of cards though, then 12gb for mid range and 8gb for the 4050’s and such just to keep them affordable.
Keep in mind future proofing is a fallacy.

I don't think any current card is going to be playable at 4k once UE5 hits.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
[image: leaked spec sheet screenshot]


This doc was created from the latest hack on Nvidia where some of the stuff got leaked. This was a screenshot grab from the Graphically Challenged channel on YouTube, where he shared it.

This looks like some ol' bullshit.
The 4060 and 4070 are going to be 12GB with 192-bit G6 memory, the cache making up the loss in bandwidth from the usual 256-bit that the xx70 shoulda coulda gotten.
We aren't getting xx60s and above with 6GB or 8GB; I think the perf bump justifies abandoning 8GB and certainly abandoning 6GB.

Why would the 4080 be AD103?
It's also unlikely to be G6X, with the cache again making up the difference.


This guy looked at the leak of the chips and just speculated; pretty much all his guesses are off. Very rarely is a first batch the full-tilt chip.
He's assuming Nvidia will somehow max out every single chip first go?
There's wishful thinking, then there's this shit. Getting yields that good day one and then NOT selling those full-tilt chips to corporate clients is wild.
There won't be yields that high, nor would Nvidia even expect yields that high, so the first batch of AD102, AD104 and hopefully AD103 will be some percentage of the full chip, NOT 100% as he's speculating.
 
This looks like some ol' bullshit.
The 4060 and 4070 are going to be 12GB with 192-bit G6 memory, the cache making up the loss in bandwidth from the usual 256-bit that the xx70 shoulda coulda gotten.
We aren't getting xx60s and above with 6GB or 8GB; I think the perf bump justifies abandoning 8GB and certainly abandoning 6GB.

Why would the 4080 be AD103?
It's also unlikely to be G6X, with the cache again making up the difference.


This guy looked at the leak of the chips and just speculated; pretty much all his guesses are off. Very rarely is a first batch the full-tilt chip.
He's assuming Nvidia will somehow max out every single chip first go?
There's wishful thinking, then there's this shit. Getting yields that good day one and then NOT selling those full-tilt chips to corporate clients is wild.
There won't be yields that high, nor would Nvidia even expect yields that high, so the first batch of AD102, AD104 and hopefully AD103 will be some percentage of the full chip, NOT 100% as he's speculating.
Prices are literally decided days before launch anyway. So including them in a sheet 6 months before launch discredits everything.
 
Keep in mind future proofing is a fallacy.

I don't think any current card is going to be playable at 4k once UE5 hits.
When I say future proof I don't mean forever, I just mean that the parts won't be bottlenecked by some limitation like a good GPU paired with an old CPU, or not enough VRAM/RAM.

You don't want to upgrade sooner than you have to. I expect my 32GB of RAM to last the generation, for example, as well as my upcoming 5700X CPU upgrade.

Regarding UE5 at native 4K, I have no idea, but at least there's DLSS.
 
Last edited:
Actually, I could have bought the FE version easily. No wonder though. This is one of the worst-priced cards in recent history, especially as in 6 months it will be outperformed by a card at less than 50% of its price.
 

Ulysses 31

Member
This one's available in my area too, wondering if I should get this one instead of the Founders Edition... 👀

AORUS GeForce RTX 3090 Ti XTREME WATERFORCE 24G
 

kraspkibble

Permabanned.
This one's available in my area too, wondering if I should get this one instead of the Founders Edition... 👀

AORUS GeForce RTX 3090 Ti XTREME WATERFORCE 24G
lol 3 fan liquid cooling for a GPU is ridiculous. the amount of power these things use and the heat output is insane.
 

Ulysses 31

Member
lol 3 fan liquid cooling for a GPU is ridiculous. the amount of power these things use and the heat output is insane.
I'm using the 3090 version ATM which has 2 fans on the radiator. The cooling's pretty good so I'd assume the cooling on the 3090 TI would be too.
 

mitchman

Gold Member
It's not like the 3090 will be stored away to collect dust; I'll sell it to fund most of the 3090 Ti.
Sure, if you can get a good price for the 3090 closer to the TI price. I'm not craving to replace my 3080 or the 6900XT for this power gobbler though.
 

skneogaf

Member
Just sold my 3090 Founders Edition for 200 less than I paid for it nearly 9 months ago. I was always unhappy with the RAM heat hitting 100 degrees, which made the fans spin up to extreme levels at points in a game where it didn't make sense.

I'll try a 3090 Ti, as all the RAM is at least on the heatsink-fan side.

I have ordered the Founders Edition again, but I may cancel and pick up the Asus TUF 3090 Ti as it has two HDMI ports.
 

Celcius

°Temp. member
For those of you who bought one, how does the noise level compare to the 3090 or whatever card you were using before? Does the huge cooler allow it to be quieter than ever, or is it as loud as it is hot?
 
Last edited:

Ulysses 31

Member
For those of you who bought one, how does the noise level compare to the 3090 or whatever card you were using before? Does the huge cooler allow it to be quieter than ever, or is it as loud as it is hot?
Didn't seem that loud to me when I used the FE edition. I replaced it with the Aorus 3090 TI Waterforce and that one's much cooler and it doesn't heat up the motherboard/chipset as much.
 
Last edited:

GreatnessRD

Member
Didn't seem that loud to me when I used the FE edition. I replaced it with the Aorus 3090 TI Waterforce and that one's much cooler and it doesn't heat up the motherboard/chipset as much.
Are you buying the 40 series at launch or was this your big purchase for the time being?
 