
GeForce RTX 3090 Ti launches on March 29th, reviews coming on the same day

HeisenbergFX4

Gold Member
450W?? At least you don't need a heater in winter.
Every time you boot it up

[gas pilot light GIF]
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Will keep my regular 3090 but interested in seeing the reviews. Seems a bit too close to next gen though.

Next gen? The PS5 and XSX are barely two years old, and because of console limitations and the fact that most games are tailored to that hardware, there won't be a true next gen until closer to the usual console cycle.

Especially with how the demand problem is right now.
 

Celcius

°Temp. member
Next gen? The PS5 and XSX are barely two years old, and because of console limitations and the fact that most games are tailored to that hardware, there won't be a true next gen until closer to the usual console cycle.

Especially with how the demand problem is right now.
By next gen I mean the next generation of Nvidia graphics cards, not consoles.
 

FingerBang

Member
Wow, that's a pretty significant bump. That's probably how the $1500 3090 should have been from the start honestly. 10% for over twice the price of a 3080 isn't worth it. 20% can be argued however.

You can get a 450w bios for 3080, 3080ti, and 3090 right now. These cards are power starved.
What do these bios do? Do you see a good performance bump?

It's insane how we got to the point where AMD stuff is the most efficient on the market. The only reason I stuck to Nvidia this gen (3080FE) is DLSS and in part RT performance, but I am confident RDNA 3 is going to be their GPU ZEN 3 moment.
 

SmokSmog

Member
A 10% gain vs. the stock 3090, which has a 350W TDP limit. The RTX 3090 Ti is an overclocked RTX 3090 out of the box with 2 extra SMs. We already have AIB versions of the RTX 3090 with a 450W TDP. The difference between a 450W 3090 and a 450W 3090 Ti will be in the range of 2-4%.
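For a rough sanity check on that 2-4% figure, here's a quick sketch (Python), assuming performance at a matched 450W limit scales roughly with SM count; the 82 vs. 84 SM counts are the published specs, everything else is back-of-the-envelope:

# Power-matched 3090 Ti vs. 3090, assuming perf at the same 450W limit
# scales roughly linearly with SM count (a simplification).
sm_3090, sm_3090ti = 82, 84              # published SM counts
uplift = sm_3090ti / sm_3090 - 1.0
print(f"SM-count uplift: {uplift:.1%}")  # ~2.4%, right in that 2-4% range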
 

daffyduck

Member
Wow, that's a pretty significant bump. That's probably how the $1500 3090 should have been from the start honestly. 10% for over twice the price of a 3080 isn't worth it. 20% can be argued however.

You can get a 450w bios for 3080, 3080ti, and 3090 right now. These cards are power starved.
In an ideal world, it would have been twice the price of the 3080.

Being scalped for a 3080 vs paying a little bit more for a 3090 at retail made the price easier to stomach.
 

OsirisBlack

Banned
Normally I'd make a joke about people buying production cards for gaming, but at resale prices, if you can get one of these for RRP it might not even be a bad idea.
Depending on how much better perf is over my 3090, I'll grab one and hand the current one over to the lady of the house. After which I won't look at anything they do for the next 5 or 6 years.
 

AndrewRyan

Member
Am considering this card since there are 30 days left on my EVGA step-up program, meaning I can upgrade at MSRP. Current card is a 3080 12GB (LHR) Hybrid. The current offer is the 3090 Ultra for an additional $600. Am leaning towards not upgrading and waiting for the 4000 series, but waiting to see the price and performance details on Tuesday.
 

winjer

Gold Member
Using 100W more, but only getting 10% extra performance. This one is not very power efficient.
The temperatures should be rather high, as well.

But I guess that for those who want the absolute best, these things won't matter all that much.
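To put rough numbers on that, a quick perf-per-watt comparison (a sketch in Python using the 350W vs. 450W board limits and the ~10% uplift discussed here; real results will vary by game):

# Perf-per-watt of the 3090 Ti vs. a stock 3090, using the figures above.
perf_3090, tdp_3090 = 1.00, 350
perf_ti,   tdp_ti   = 1.10, 450
ratio = (perf_ti / tdp_ti) / (perf_3090 / tdp_3090)
print(f"3090 Ti perf/W vs. 3090: {ratio:.2f}x")  # ~0.86x, i.e. roughly 14% worse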
 

Dream-Knife

Banned
What do these bios do? Do you see a good performance bump?

It's insane how we got to the point where AMD stuff is the most efficient on the market. The only reason I stuck to Nvidia this gen (3080FE) is DLSS and in part RT performance, but I am confident RDNA 3 is going to be their GPU ZEN 3 moment.
It'll let you keep a higher clock in more demanding games. I haven't moved to the 450w bios personally. On the 380w I'm maxing out my monitor for the games I play anyway. Clocks will go higher if I bump up the power limit, but I'm limited to 399w on the "normal" bios with the +5% power slider.
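If you just want to see where your card's limits sit without flashing anything, nvidia-smi can report the current and BIOS-maximum power limits, and (with admin/root rights) raise the limit up to that cap. A minimal sketch; the 380 W / 399 W output is only an example of what a capped BIOS might report:

import subprocess

# Read the current and maximum board power limits the BIOS allows.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.limit,power.max_limit", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "380.00 W, 399.00 W" on a power-capped BIOS

# Raise the limit toward the BIOS maximum (needs admin/root; value in watts):
# subprocess.run(["nvidia-smi", "-pl", "399"], check=True)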

I switched from a 6800 to a 3080 last year because AMD is so terrible with drivers. Best of luck to you if you go that route.

As for efficiency, sure. Doesn't really matter on a desktop though.

Zen 2 was the breakout btw.
 

Silver Wattle

Gold Member
Next gen is out at the end of this year. If you need a card, just buy a mid-ranger that's at retail price and wait; buying this is stupid.
 

//DEVIL//

Member
Next gen is out at the end of this year. If you need a card, just buy a mid-ranger that's at retail price and wait; buying this is stupid.

1- The 4000 series will draw more than 350W, which is why they are using the new PCIe Gen 5 power connector (which, in return, lets you start a BBQ on your card if you want).
2- If a card with an MSRP of $2,000 (at least) is barely any more powerful than the 3090, how expensive do you really think the 4000 series will be? (Yes, I know, different design and all.) But if history has told us anything, you will be lucky to get a 30% performance gain over the 3090 with the 4080, and even then the MSRP will be high, with higher power draw and most likely less VRAM too.

Not trying to downplay future cards, but you can always play the "waiting game" card with everything. Some people want the most powerful thing, period. They can afford it, so be it.

I have a 3090 White Strix. I love this card. If the 4080 comes out and it's barely 20% more powerful (honestly I do not think it will be more than 30% max in actual gaming performance, not theoretical BS or "100% more powerful ray tracing" stuff), then they can keep that card, I won't even bother, especially since the 4080 will most likely be 12 or 16 gigs. I'll just wait for the 5000 series.
 
The biggest beef for me is most of the 3rd party 3090 TIs will have a 3.5 slot cooling solution. I like my pci-e slots for other peripherals dammit!
What peripherals do people even plug into PCI-e slots besides video cards these days? Everything else is integrated on the motherboard.
 

Hezekiah

Banned
Well said. Had the same effect on me, too.

I was holding out on getting a 3000 series card but all of the bullshit around them soured me greatly on Nvidia in general. I have a 1070Ti that can play most of my currently owned games at 1440p, but I had to tone down Red Dead Redemption 2 way more than I wanted to, so I decided to get a new card.

So, for this gen, I'm going Team Red (6900 XT). And the order just took a couple of clicks. No F5'ing. No subscribing to 3 different Twitter feeds, 5 Discord channels, and 15 stock apps. No need to buy from a scalper. No need to make a line outside of Micro Center. Ain't got time for any of that bullshit 😂

Maybe someday I'll come back to Nvidia. Maybe...
Out of interest how much did you pay for the 6900XT? And do you find FSR useful?

I also own a 1070ti which is long in the tooth now, plus I play at 3440*1440.
 
Out of interest how much did you pay for the 6900XT? And do you find FSR useful?

I also own a 1070ti which is long in the tooth now, plus I play at 3440*1440.
Way overpriced. I paid US$1600 for it, and it was "on sale" (!!!!). I haven't turned on FSR yet.

The one I got is made by MSI, one called a "Gaming Z Trio." About 2 weeks or so after I bought my card, a buddy pointed me to another 6900XT on sale at MicroCenter -- the MSI "Gaming X Trio." But this one was on sale much closer to the original MSRP of the 6900XT, I think it was $1050 or so. I think there's some difference between the X and Z variants of the card, in that the Z can be over-clocked in a way that the X can't. But I mean, we're talking about the 6900XT, which -- until the release of this 3090Ti -- was (by several benchmarks from reputable sources) the most "powerful" card for raw standard rasterization. So this over-clocking that the Z can do that the X can't... we're talking marginal, corner case shit here that shouldn't matter to most people. So if you're on the market, I would urge you to keep an eye out either for this X variant of the MSI card, or other 6900XT cards which from time to time go on sale for about $1300 or thereabouts.

I love my 1070Ti though, it has served me really well -- and will continue to serve me well, since I'm gonna move it to my backup rig. But I splurged for the 6900XT because I'm interested in Ultrawide (just like you, 3440x1440), and some recent games (RE2 Remake, RE3 Remake, and Red Dead 2) were starting to show the card's age. The Resident Evil games I could still play at 2560x1440, but to get decent framerates, I had to lower quite a few graphical settings. Red Dead 2 just couldn't work at 1440p, so -- for the first time since I got the 1070Ti -- I had to downgrade a game to 1080p. I think that's when I realized it was time for a new card.

Zero regrets. The 6900XT is an absolute BEAST.

I have a good friend who got lucky last year and got the 6800XT (AMD reference) at the original retail of $650 from AMD directly. But to get to that point, he spent weeks (maybe even months) looking. I value my time too much to be F5'ing, subscribing to stock apps and Twitter feeds, and all that nonsense.
 

mitchman

Gold Member
Probably a good time to introduce it now that GPU prices are falling and availability of graphics cards is way up. I'm happy with my RX 6900XT and RTX 3080, though.
 

Hezekiah

Banned
Way overpriced. I paid US$1600 for it, and it was "on sale" (!!!!). I haven't turned on FSR yet.

The one I got is made by MSI, one called a "Gaming Z Trio." About 2 weeks or so after I bought my card, a buddy pointed me to another 6900XT on sale at MicroCenter -- the MSI "Gaming X Trio." But this one was on sale much closer to the original MSRP of the 6900XT, I think it was $1050 or so. I think there's some difference between the X and Z variants of the card, in that the Z can be over-clocked in a way that the X can't. But I mean, we're talking about the 6900XT, which -- until the release of this 3090Ti -- was (by several benchmarks from reputable sources) the most "powerful" card for raw standard rasterization. So this over-clocking that the Z can do that the X can't... we're talking marginal, corner case shit here that shouldn't matter to most people. So if you're on the market, I would urge you to keep an eye out either for this X variant of the MSI card, or other 6900XT cards which from time to time go on sale for about $1300 or thereabouts.

I love my 1070Ti though, it has served me really well -- and will continue to serve me well, since I'm gonna move it to my backup rig. But I splurged for the 6900XT because I'm interested in Ultrawide (just like you, 3440x1440), and some recent games (RE2 Remake, RE3 Remake, and Red Dead 2) were starting to show the card's age. The Resident Evil games I could still play at 2560x1440, but to get decent framerates, I had to lower quite a few graphical settings. Red Dead 2 just couldn't work at 1440p, so -- for the first time since I got the 1070Ti -- I had to downgrade a game to 1080p. I think that's when I realized it was time for a new card.

Zero regrets. The 6900XT is an absolute BEAST.

I have a good friend who got lucky last year and got the 6800XT (AMD reference) at the original retail of $650 from AMD directly. But to get to that point, he spent weeks (maybe even months) looking. I value my time too much to be F5'ing, subscribing to stock apps and Twitter feeds, and all that nonsense.
I was exactly the same with Red Dead 2 - in fact I stopped playing it because my frame rates tanked going from 1080p to 1440 ultrawide. Even dropping down to low quality preset only had me in the mid 40s and I wasn't willing to drop resolution.

I have seen this card on Overclockers which is £1,100, but with the 4000 series apparently around the corner I am hesitant. I need to do more research on FSR, and then there's raytracing to consider, though I'm not as obsessed with it as some people are.

I might just wait until the end of the year and see what I can get for ~£750. I agree on the F5'ing, I ain't got time for that. Have just about accepted using stock apps which I needed to get my PS5.

It's crazy what has become acceptable price wise though. I paid £420 for my 1070ti (MSI FROZR). The 6900XT does look beastly at higher resolutions, but $1600 is more than I'm willing to pay!
 
The biggest beef for me is most of the 3rd party 3090 TIs will have a 3.5 slot cooling solution. I like my pci-e slots for other peripherals dammit!
My beef is that they will still be using the same standard cooling solution we've been using for years now on something that'll be pumping out real heat.
 
I was exactly the same with Red Dead 2 - in fact I stopped playing it because my frame rates tanked going from 1080p to 1440 ultrawide. Even dropping down to low quality preset only had me in the mid 40s and I wasn't willing to drop resolution.

I have seen this card on Overclockers which is £1,100, but with the 4000 series apparently around the corner I am hesitant. I need to do more research on FSR, and then there's raytracing to consider, though I'm not as obsessed with it as some people are.

I might just wait until the end of the year and see what I can get for ~£750. I agree on the F5'ing, I ain't got time for that. Have just about accepted using stock apps which I needed to get my PS5.

It's crazy what has become acceptable price wise though. I paid £420 for my 1070ti (MSI FROZR). The 6900XT does look beastly at higher resolutions, but $1600 is more than I'm willing to pay!
All very good points.

On the point of waiting for the 4000 series -- I wasn't willing to bet on that for a few reasons:
  1. Not sure where crypto is going to be when those cards release. Yes, it seems to be on a big downward trend at the moment. It could pick back up (in which case prospective 4000 series buyers would be in the same predicament they've been in for the past two years with the 3000 series). But if crypto still hasn't picked up by then, you win.
  2. I only have a 750W power supply that I had no desire to replace (got the cable management nice and neat the first time around!) and the analyses/rumors are that 4000 series cards are going to be very power hungry. Hell, even now, a 750W supply may be on the border for the 3000 series -- I could of course have gotten a 3070Ti, but even the 3080/3080Ti start getting into border territory on a 750W.
  3. Speaking of 3070Ti (when I was shopping for cards, this was the one that was least marked up, relatively speaking), it only has 8GB of VRAM, which in 2022 is some LOL-worthy shit. I've been a big Nvidia fan since I started gaming on PC, but damn they're skimpy with VRAM this gen. Does not bode well for the 4000 series. On the flip side... the 3060 and 3080 both have 12GB variants, so things may not be so terrible for Nvidia's next gen.
  4. I don't care about Ray Tracing -- at least not yet. The tech seems awesome but it seems like it needs a couple of years to become widely mainstream/pervasive. BUT, as happy as I am with my AMD card, I will be missing out on DLSS which does seem like a truly awesome technology, even now. (On the flip side, let's see how AMD's FSR 2.0 turns out...)
  5. To the great point you brought up about "what's acceptable" from a pricing perspective -- yeah, the days of me walking into a Best Buy and buying a 1070Ti for a value of $250, brand new -- those days may be LONG gone. (It was about $370 but it came with a couple of $60 games...). Hell, Nvidia beat themselves over the head for releasing the 3080 reference at $700 retail -- which is why the 3080Ti released at $1200 retail (or maybe it was $1100, can't remember) for marginal gains over the 3080. So if the rumors of the 4000 series being close to twice as powerful are true -- you may be looking at the 4080 releasing at $1,000 at retail, at a minimum.
There were other decisions that also factored into what price I would tolerate now (current inflation rates here in the US, the ongoing war in Europe and any potential fallouts in the global markets, etc) but this post is already way too long so I won't go into those. Needless to say, it wasn't an easy or spontaneous decision -- parting with $1600 is not something I casually do every day 😂

TL;DR you're doing the right thing by waiting. Even if it wasn't optimal for Red Dead 2, the beauty of the 1070Ti is that it *can* play like 95% of the current library on Steam at 1440p with no sweat. The other 5% are the "latest" AAA games that fall into the same category as Red Dead 2.
 
Imagine buying this shit and in 5 months you get a 4070/80 that is going to have at least twice the performance. I'm sure the two GAF members are excited about it. You know who you are.
 

//DEVIL//

Member
Imagine buying this shit and in 5 months you get a 4070/80 that is going to have at least twice the performance. I'm sure the two GAF members are excited about it. You know who you are.
There has never been a generational jump with twice the performance.

From the leaks, the 3090 is about the same level as the 4080, but with 24GB vs. 16GB of VRAM in favor of the 3090.

So if someone finds a decently priced 3090 today that is well below MSRP, go for it. Even if you have to sell it next year in favor of the 4000 series (and don't get your hopes up for the paper launch this year, you are not going to get one past the bots), you won't lose much since you paid well below MSRP.

The 3090 White Strix here retails for $3,000 CAD after tax; I got one with the stickers still on for $2,200 last week. Yeah, I'll take it, why not.
 

Celcius

°Temp. member
Here’s something that no one seems to be talking about - with ALL THAT ELECTRICAL CURRENT going through the card, what is coil whine going to be like?
 

Pagusas

Elden Member
Here’s something that no one seems to be talking about - with ALL THAT ELECTRICAL CURRENT going through the card, what is coil whine going to be like?
Low-powered SoCs can have coil whine; one does not imply the other. Coil whine is a manufacturing accuracy issue, not a "more power = more whine" issue.
 
Wow, this is a stupid release that will probably still sell, unfortunately :/

At a certain point, extra power is just stupid if the card has to run much hotter to get it. This is maybe going to be 5 to 10% better, but with 100 more watts? Ridiculous.

This is one reason I typically stick with xx50 Ti or xx60-level Nvidia cards, and the 70 range is as high as I'll ever go.

With enough VRAM, a Lovelace 4070 will beat this card at much lower wattage and be much cheaper, if you can wait.
 
There has never been a generational jump with twice the performance.

From the leaks, the 3090 is about the same level as the 4080, but with 24GB vs. 16GB of VRAM in favor of the 3090.

So if someone finds a decently priced 3090 today that is well below MSRP, go for it. Even if you have to sell it next year in favor of the 4000 series (and don't get your hopes up for the paper launch this year, you are not going to get one past the bots), you won't lose much since you paid well below MSRP.

The 3090 White Strix here retails for $3,000 CAD after tax; I got one with the stickers still on for $2,200 last week. Yeah, I'll take it, why not.
Can you share even a single one of those leaks?

The 2080 to 3080 was close to a 2x perf uplift, and I'd expect a similar case with the 4080, which would mean at minimum 50% faster than the 3090. Maybe you meant the 4060?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Imagine buying this shit and in 5 months you get a 4070/80 that is going to have at least twice the performance. I'm sure the two GAF members are excited about it. You know who you are.

A 4070 isn't going to match a 3090 Ti; it'll be lucky to walk with the 3080 12G.
Hell, even the 4080 at its best, absolutely shunted to death, is not going to be twice the performance.

Assuming the RTX 4070 is AD104 and they've cut down the memory bus and replaced it with their new large L2 cache, don't expect the 4070 to walk a 3090 Ti.
It'll be a good card, but it won't even be double the performance of the 3070, let alone double the performance of a 3090 Ti.

P.S. IF the 4070 is AD103-based, then we might be in for a treat indeed.
Assuming crypto doesn't boom again as new GPUs hit the market and send prices skyrocketing, with Nvidia's higher MSRPs I dread to even think how high prices will go.

P.P.S. As a skip-a-generation guy, RTX 40 can chill; I'll wait for RTX 50, which will likely be an MCM design and a true generational leap coming from an RTX 30.
 
A 4070 isn't going to match a 3090 Ti; it'll be lucky to walk with the 3080 12G.
Hell, even the 4080 at its best, absolutely shunted to death, is not going to be twice the performance.

Assuming the RTX 4070 is AD104 and they've cut down the memory bus and replaced it with their new large L2 cache, don't expect the 4070 to walk a 3090 Ti.
It'll be a good card, but it won't even be double the performance of the 3070, let alone double the performance of a 3090 Ti.

P.S. IF the 4070 is AD103-based, then we might be in for a treat indeed.
Assuming crypto doesn't boom again as new GPUs hit the market and send prices skyrocketing, with Nvidia's higher MSRPs I dread to even think how high prices will go.

P.P.S. As a skip-a-generation guy, RTX 40 can chill; I'll wait for RTX 50, which will likely be an MCM design and a true generational leap coming from an RTX 30.

We assumed a lot of the same things pre-3080 and we were mostly wrong. We'll see come release date, but my gut instinct is that you're wasting money on a 3090 Ti, just like with the Titans. But hey, if people have money to waste, then go for it.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Can you share even a single one of those leaks?

The 2080 to 3080 was close to a 2x perf uplift, and I'd expect a similar case with the 4080, which would mean at minimum 50% faster than the 3090. Maybe you meant the 4060?

How did you measure the close-to-2x perf uplift?
On average the 3080 was ~50% faster in gaming.

We assumed a lot of the same things pre-3080 and we were mostly wrong. We'll see come release date, but my gut instinct is that you're wasting money on a 3090 Ti, just like with the Titans. But hey, if people have money to waste, then go for it.
Going from the 2080 to the 3080, everyone expected between 40 and 50% better performance.
By the time the 3080 came out, the perf uplift was about 45%, so pretty much exactly what everyone was expecting.
If anything, I feel that with Ada people are hoping the uplift is much higher than 2080 to 3080.
We are likely setting ourselves up for disappointment.

[TechPowerUp relative performance chart, 3840x2160]
 
How did you measure the close-to-2x perf uplift?
On average the 3080 was ~50% faster in gaming.


Going from the 2080 to the 3080, everyone expected between 40 and 50% better performance.
By the time the 3080 came out, the perf uplift was about 45%, so pretty much exactly what everyone was expecting.
If anything, I feel that with Ada people are hoping the uplift is much higher than 2080 to 3080.
We are likely setting ourselves up for disappointment.

[TechPowerUp relative performance chart, 3840x2160]
It depends from game to game, but there are examples where you get 2x perf, especially when RT is involved. Here, look at TR, TW3, etc. > https://www.guru3d.com/articles_pages/msi_geforce_rtx_3080_suprim_x_12gb_review,12.html

AC Valhalla and F1 '21 are ~2x at 4K.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It depends from game to game, but there are examples where you get 2x perf, especially when RT is involved. Here, look at TR, TW3, etc. > https://www.guru3d.com/articles_pages/msi_geforce_rtx_3080_suprim_x_12gb_review,12.html

AC Valhalla and F1 '21 are ~2x at 4K.
On average at launch (because you know how Nvidia seemingly abandons old cards), the 2080 vs. 3080 was closer to 50% than a complete doubling.

And at 4K, the 2080 was already NOT a 4K card at the 3080's launch; it was bandwidth and memory starved, which is why the performance differential suddenly changed at 4K.

The 3080 vs. 4080 memory setup is a lot closer:
12G vs. 16G
384-bit vs. 256-bit + huge cache.

We don't know how much that extra cache will help, but I'd hazard a guess and say that at launch the 4080 won't be double the performance of the 3080 12G at 4K.
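For context on those bus widths, peak memory bandwidth is just bus width times per-pin data rate. A quick sketch; the 19 Gbps figure is the 3080 12G's actual GDDR6X speed, while the 21 Gbps / 256-bit combination is purely an illustrative assumption, not a confirmed 40-series spec:

# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
def peak_bw_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(peak_bw_gb_s(384, 19.0))  # 3080 12G: 912 GB/s
print(peak_bw_gb_s(256, 21.0))  # hypothetical 256-bit card at 21 Gbps: 672 GB/s

The open question is how much of that raw-bandwidth gap a large L2 cache can hide, which is exactly why the launch numbers are hard to predict.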
 
On average at launch (because you know how Nvidia seemingly abandons old cards), the 2080 vs. 3080 was closer to 50% than a complete doubling.

And at 4K, the 2080 was already NOT a 4K card at the 3080's launch; it was bandwidth and memory starved, which is why the performance differential suddenly changed at 4K.

The 3080 vs. 4080 memory setup is a lot closer:
12G vs. 16G
384-bit vs. 256-bit + huge cache.

We don't know how much that extra cache will help, but I'd hazard a guess and say that at launch the 4080 won't be double the performance of the 3080 12G at 4K.
Now I'm interested to know which games their [TechPowerUp] test suite consists of. I mean, there are quite a few games that give a 100% uplift and yet they show 40%, wtf? That would mean they had to find some games with single-digit gains for the average to drop to 40%. ed: 60% ...
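On how a suite average can sit well below the best-case games: review sites fold per-game results into one relative-performance number (the exact averaging method varies by site), so a handful of titles with smaller gains pulls the overall figure well under the +100% outliers. A toy sketch with made-up per-game ratios, using a geometric mean:

from statistics import geometric_mean

# Hypothetical per-game uplifts of card B over card A (1.0 = equal performance).
ratios = [2.00, 1.95, 1.45, 1.40, 1.35, 1.30, 1.25]
print(f"best case: +{max(ratios) - 1:.0%}")                 # +100%
print(f"suite average: +{geometric_mean(ratios) - 1:.0%}")  # ~+50%, no single-digit games needed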
 

//DEVIL//

Member
Now I'm interested to know which games their [TechPowerUp] test suite consists of. I mean, there are quite a few games that give a 100% uplift and yet they show 40%, wtf? That would mean they had to find some games with single-digit gains for the average to drop to 40%. ed: 60% ...
Sorry I didn't link you, I mistook you for someone else. If you notice, the 4080 in that leak is at about the 3090's level, but with less VRAM. So, in theory, I am not expecting much here; maybe it will have better ray tracing lol.

The point is, there is a reliable leak that the full chip has been delayed, God knows until when. And even if they release it this year, it will be a paper launch like usual, with the scalping and everything; you will generally end up with nothing until 2023. Meanwhile the 3090, if found below MSRP, is really good value with its 24 gigs and will last you for years.
 