> 450 power?? At least you don't need a heater in winter.
Every time you boot it up.
Will keep my regular 3090 but interested in seeing the reviews. Seems a bit too close to next gen though.
> 450 power?? At least you don't need a heater in winter.
You can get a 450w bios for 3080, 3080ti, and 3090 right now. These cards are power starved.
> Next gen? The PS5 and XSX are barely 2 years old, and because of console limitations, with the majority of games tailored towards that hardware, there won't be a true next gen until closer to the end of the usual console cycle.
By next gen I mean the next generation of Nvidia graphics cards, not consoles.
Especially with how bad the demand problem is right now.
> You can get a 450w bios for 3080, 3080ti, and 3090 right now. These cards are power starved.
> Wow, that's a pretty significant bump. That's probably how the $1500 3090 should have been from the start honestly. 10% for over twice the price of a 3080 isn't worth it. 20% can be argued however.
What do these bios do? Do you see a good performance bump?
Should I sell my kidney to get one?
Or settle for my Steam Deck? Choices, choices. Too bad I'm broke.
> You can get a 450w bios for 3080, 3080ti, and 3090 right now. These cards are power starved.
> Wow, that's a pretty significant bump. That's probably how the $1500 3090 should have been from the start honestly. 10% for over twice the price of a 3080 isn't worth it. 20% can be argued however.
In an ideal world, it would have been twice the price of the 3080.
> I'm envisioning the scalpers already.
I didn't think the scalpers were as interested in the top tier.
> I didn't think the scalpers were as interested in the top tier.
Anything popular is potential profit. I dread to think how much these will go for.
> Normally I'd make a joke about people buying production cards for gaming, but at resale prices, if you can get one of these for RRP it might not even be a bad idea.
Depending on how much better perf is on my 3090, I'll grab one and hand the current one over to the lady of the house. After which I won't look at anything they do for the next 5 or 6 years.
> Anything popular is potential profit. I dread to think how much these will go for.
At least early on, the 3090s weren't very popular because they were "twice the price of the 3080". (on paper)
Just 10%?
Normally that would be sufficient if they released close to the base model.
But the 3090 has been out for a while now.
Sounds to me like some quick milkage before they announce the 4000 Series.
> What do these bios do? Do you see a good performance bump?
It'll let you keep a higher clock in more demanding games. I haven't moved to the 450w bios personally. On the 380w I'm maxing out my monitor for the games I play anyway. Clocks will go higher if I bump up the power limit, but I'm limited to 399w on the "normal" bios with the +5% power slider.
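For anyone wondering where the 399w figure above comes from, the cap is just the BIOS power target scaled by the slider. A minimal sketch in Python; the function name effective_power_cap is made up for illustration, and only the 380w, +5%, and 450w numbers come from the posts above:

    # Illustrative only: how a driver-enforced power cap relates to the
    # BIOS power target (TGP) and the power-limit slider offset.
    def effective_power_cap(bios_tgp_watts: float, slider_percent: float) -> float:
        """Return the power cap in watts for a given BIOS target and slider offset."""
        return bios_tgp_watts * (1 + slider_percent / 100)

    # Numbers quoted in the thread:
    print(effective_power_cap(380, 5))   # 399.0 -- the "normal" bios at +5%
    print(effective_power_cap(450, 0))   # 450.0 -- the 450w bios at stock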
It's insane how we got to the point where AMD stuff is the most efficient on the market. The only reason I stuck to Nvidia this gen (3080FE) is DLSS and in part RT performance, but I am confident RDNA 3 is going to be their GPU ZEN 3 moment.
Next gen is out at the end of this year. If you need a card, just buy a mid-ranger that's at retail price and wait; buying this is stupid.
> The biggest beef for me is most of the 3rd party 3090 TIs will have a 3.5 slot cooling solution. I like my pci-e slots for other peripherals dammit!
What peripherals do people even plug into PCI-e slots besides video cards these days? Everything else is integrated on the motherboard.
> Well said. Had the same effect on me, too.
Out of interest, how much did you pay for the 6900XT? And do you find FSR useful?
I was holding out on getting a 3000 series card but all of the bullshit around them soured me greatly on Nvidia in general. I have a 1070Ti that can play most of my currently owned games at 1440p, but I had to tone down Red Dead Redemption 2 way more than I wanted to, so I decided to get a new card.
So, for this gen, I'm going Team Red (6900 XT). And the order just took a couple of clicks. No F5'ing. No subscribing to 3 different Twitter feeds, 5 Discord channels, and 15 stock apps. No need to buy from a scalper. No need to make a line outside of Micro Center. Ain't got time for any of that bullshit
Maybe someday I'll come back to Nvidia. Maybe...
> What peripherals do people even plug into PCI-e slots besides video cards these days? Everything else is integrated on the motherboard.
I use my extra PCI-e slots for NVMe expansion; it allows me to have up to 4x more NVMe drives instead of continuing to invest in outdated SATA SSDs after my motherboard's 2 NVMe slots are taken.
> Out of interest, how much did you pay for the 6900XT? And do you find FSR useful?
Way overpriced. I paid US$1600 for it, and it was "on sale" (!!!!). I haven't turned on FSR yet.
> Way overpriced. I paid US$1600 for it, and it was "on sale" (!!!!). I haven't turned on FSR yet.
I also own a 1070ti which is long in the tooth now, plus I play at 3440x1440.
I was exactly the same with Red Dead 2 - in fact I stopped playing it because my frame rates tanked going from 1080p to 1440 ultrawide. Even dropping down to the low quality preset only had me in the mid 40s and I wasn't willing to drop resolution.
The one I got is made by MSI, one called a "Gaming Z Trio." About 2 weeks or so after I bought my card, a buddy pointed me to another 6900XT on sale at MicroCenter -- the MSI "Gaming X Trio." But this one was on sale much closer to the original MSRP of the 6900XT, I think it was $1050 or so. I think there's some difference between the X and Z variants of the card, in that the Z can be over-clocked in a way that the X can't. But I mean, we're talking about the 6900XT, which -- until the release of this 3090Ti -- was (by several benchmarks from reputable sources) the most "powerful" card for raw standard rasterization. So this over-clocking that the Z can do that the X can't... we're talking marginal, corner case shit here that shouldn't matter to most people. So if you're on the market, I would urge you to keep an eye out either for this X variant of the MSI card, or other 6900XT cards which from time to time go on sale for about $1300 or thereabouts.
I love my 1070Ti though, it has served me really well -- and will continue to serve me well, since I'm gonna move it to my backup rig. But I splurged for the 6900XT because I'm interested in Ultrawide (just like you, 3440x1440), and some recent games (RE2 Remake, RE3 Remake, and Red Dead 2) were starting to show the card's age. The Resident Evil games I could still play at 2560x1440, but to get decent framerates, I had to lower quite a few graphical settings. Red Dead 2 just couldn't work at 1440p, so -- for the first time since I got the 1070Ti -- I had to downgrade a game to 1080p. I think that's when I realized it was time for a new card.
Zero regrets. The 6900XT is an absolute BEAST.
I have a good friend who got lucky last year and got the 6800XT (AMD reference) at the original retail of $650 from AMD directly. But to get to that point, he spent weeks (maybe even months) looking. I value my time too much to be F5'ing, subscribing to stock apps and Twitter feeds, and all that nonsense.
> The biggest beef for me is most of the 3rd party 3090 TIs will have a 3.5 slot cooling solution. I like my pci-e slots for other peripherals dammit!
My beef is that they will still be using the same standard cooling solution we've been using for years now on something that'll be pumping out real heat.
> I was exactly the same with Red Dead 2 - in fact I stopped playing it because my frame rates tanked going from 1080p to 1440 ultrawide. Even dropping down to the low quality preset only had me in the mid 40s and I wasn't willing to drop resolution.
All very good points.
I have seen this card on Overclockers for £1,100, but with the 4000 series apparently around the corner I am hesitant. I need to do more research on FSR, and then there's raytracing to consider, though I'm not as obsessed with it as some people are.
I might just wait until the end of the year and see what I can get for ~£750. I agree on the F5'ing, I ain't got time for that. Have just about accepted using stock apps which I needed to get my PS5.
It's crazy what has become acceptable price wise though. I paid £420 for my 1070ti (MSI FROZR). The 6900XT does look beastly at higher resolutions, but $1600 is more than I'm willing to pay!
> Imagine buying this shit and in 5 months you get a 4070/80 that is going to be at least twice better in performance. I'm sure the two GAF members are excited about it. You know who you are.
Never ever has there been a generational jump where you got twice the performance.
> Here's something that no one seems to be talking about - with ALL THAT ELECTRICAL CURRENT going through the card, what is coil whine going to be like?
Low-powered SoCs can have coil whine; one does not mean the other. Coil whine is a manufacturing accuracy issue, not a "more power = more whine" issue.
> Never ever has there been a generational jump where you got twice the performance.
Can you share just a single one of those leaks?
From the leaks, the 3090 is about the same level as the 4080, but 24 vs 16 GB of VRAM is in favor of the 3090.
So if someone finds a decently priced 3090 today that is way below MSRP, go for it. Even if you have to sell it next year in favor of the 4000 series (and don't count on the paper release this year, you are not gonna get one aside from bots), you won't lose much since you paid well below MSRP.
The 3090 white Strix here retails for $3,000 CAD after tax; I got it with stickers on it for $2,200 last week. Yeah, I'll take it, why not.
> Imagine buying this shit and in 5 months you get a 4070/80 that is going to be at least twice better in performance. I'm sure the two GAF members are excited about it. You know who you are.
A 4070 isn't going to match a 3090 Ti; it'll be lucky to walk with the 3080 12G.
Hell, even the 4080 at its best, absolutely shunted to death, is not going to be twice the performance.
Assuming the RTX 4070 is AD104 and they've cut down the memory bus and replaced it with their new large L2, don't expect the 4070 to walk a 3090 Ti.
It'll be a good card, but it's not even double the performance of the 3070, let alone double the performance of a 3090 Ti.
P.S. If the 4070 is AD103 based then we might be in for a treat indeed.
Assuming crypto doesn't boom again as new GPUs hit the market and send prices skyrocketing again, with Nvidia's higher MSRPs I dread to even think how high prices will go.
P.P.S. As a skip-a-generation type, RTX 40 can chill. I'll wait for RTX 50, which will likely be an MCM design and a true generational leap coming from an RTX 30.
> Can you share just a single one of those leaks?
2080 to 3080 was close to a 2x perf uplift, and I'd expect a similar case with the 4080, which would mean at minimum 50% faster than the 3090. Maybe you meant the 4060?
> We assumed a lot of the same things pre 3080 and we were mostly wrong. We'll see at release, but my gut instinct is you're wasting money on the 3090 Ti just like the Titans. But hey, if ppl have money to waste then go for it.
2080 to 3080, everyone expected between 40 and 50% better performance.
> How did you measure the close to 2x perf uplift?
It depends game to game, but there are examples where you get 2x perf, especially when RT is involved. Here, look at TR, TW3, etc.: https://www.guru3d.com/articles_pages/msi_geforce_rtx_3080_suprim_x_12gb_review,12.html
On average the 3080 was ~50% faster in gaming.
2080 to 3080 everyone expected between 40 and 50% better performance.
By the time the 3080 came out, the perf uplift was about 45% so pretty much exactly what everyone was expecting.
If anything I feel with Ada people are hoping the uplift is much higher than 2080 to 3080.
We are likely setting ourselves up for disappointment.
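To put the percentages being argued above in concrete terms: "twice the performance" means +100%, while the 2080-to-3080 gap quoted in this thread was closer to +45-50%. A quick sketch; the uplift_percent helper and the 60/87/120 fps numbers are made up purely for illustration, only the ~45% and 2x figures come from the posts above:

    # Illustrative only: relating "X% faster" to frame rates.
    def uplift_percent(old_fps: float, new_fps: float) -> float:
        return (new_fps / old_fps - 1) * 100

    print(round(uplift_percent(60, 87)))    # 45  -- roughly the 2080 -> 3080 gap cited above
    print(round(uplift_percent(60, 120)))   # 100 -- what "twice the performance" would mean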
> It depends game to game, but there are examples where you get 2x perf, especially when RT is involved. Here, look at TR, TW3, etc.: https://www.guru3d.com/articles_pages/msi_geforce_rtx_3080_suprim_x_12gb_review,12.html
On average at launch (cuz you know how Nvidia seemingly abandons old cards) the 2080 vs 3080 was closer to 50% than being completely double.
AC Valhalla, F1 '21 ~2x at 4K.
> On average at launch (cuz you know how Nvidia seemingly abandons old cards) the 2080 vs 3080 was closer to 50% than being completely double.
Now I'm interested to know which games their [techpowerup] test suite consists of. I mean, there are quite a few games that give a 100% uplift and yet they show 40%, wtf? That would mean they had to find some games where there must have been single digits for the average to drop to 40%. edit: 60% ...
And at 4K the 2080 was already NOT a 4K card; at the 3080's launch it was bandwidth and memory starved, which is why at 4K the performance differential suddenly changed.
The 3080 vs 4080 memory setup is a lot closer:
12G vs 16G
384-bit vs 256-bit + huge cache.
We don't know how much that extra cache will help, but I'd hazard a guess and say at launch the 4080 won't be double the performance of the 3080 12G at 4K.
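On the averaging question a few posts up, here is a rough sketch of the arithmetic behind a suite-wide figure. Every per-game number below is invented purely to show how a ~40-60% average can coexist with a few ~100% outliers; none of it is taken from any real benchmark suite:

    # Hypothetical per-game uplifts (percent), invented for illustration only:
    # a couple of ~100% outliers get pulled down by many smaller gains.
    uplifts = [100, 95, 60, 55, 50, 45, 40, 35, 30, 25]

    avg = sum(uplifts) / len(uplifts)
    print(f"average uplift: {avg:.1f}%")   # average uplift: 53.5%

    # If 2 of 10 games were +100% and the suite average were 40%,
    # the other 8 would only need to average (40*10 - 100*2) / 8 = 25%,
    # so single-digit results are not strictly required.
    print((40 * 10 - 100 * 2) / 8)          # 25.0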
> Now I'm interested to know which games their [techpowerup] test suite consists of. I mean, there are quite a few games that give a 100% uplift and yet they show 40%, wtf? That would mean they had to find some games where there must have been single digits for the average to drop to 40%. edit: 60% ...
Sorry, I didn't link you as someone else. If you notice, the 4080 in that leak is about the 3090's level, but with less VRAM. So, in theory, I am not expecting much here. Maybe it will have better ray-tracing lol.