
NVIDIA GeForce RTX 4070 Ti to replace canceled RTX 4080 12GB SKU, launch in January

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The top resolutions at the time :p As I do now.
I guess you have a point, but it doesn't change the fact that nobody does what you say. 99% of gamers simply use way more power now. So it is a valid point, but not a realistic one.
I don't think that's true.
Cuz people who would be 4090 customers (as in, they want the best of the best regardless of cost) were either doing SLI or running X2 cards way back when, and were already on ~850W+ power supplies.

Assuming it's still going strong, the power supply you used to SLI and OC your K chip would still handle the 4090 today without issue.

People who were in the 200W TDP range are likely still in the 200W TDP range today.
The 4060 is expected to be a 200W card.
The GTX 260 was a 185W card.
So realistically those customers are still in the same range.

You basically have to go back to pre-GTX days to find top-range cards that had a ~300W PSU as the recommendation (pre-NeoGAF, effectively).
And yes, I'm talking about the 6800GS and 8800GTS. (By the time GTX cards were a thing I was already on a pretty beefy PSU, and by the GTX 570 I was already at 850W. If my PSU weren't out of warranty I would probably still be using it in my main rig right now, but it's still going strong in my media center.)
And I'm not buying a new PSU for the 40 series, cuz 850W is enough and has been for a very, very long time; actual power draw at the wall for me hasn't really increased in years.
In fact, I think one of the main reasons it has increased at all is that I'm now using an HFR monitor, so my GPU is actually maxing out.
Even with SGSSAA and OGSSAA I generally wasn't maxing out my GPUs way back when, cuz I would vsync to 60/75Hz.
 

GymWolf

Member
Read my post again. You are so happy that perf per watt is so much better, but you take 2-3x more power from the socket. Hurray!


For me it comes down to this: 10 years ago I played top-graphics games using 300W; now to play top-graphics games I use 600W. Progress.
Optimization of power consumption can only go so far, dude. With a 4090 he can play at 4K120+ with incredible-looking games, not exactly what we had 15 years ago.
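The perf-per-watt argument the two posts are circling can be put as a quick worked example; the wattage and performance numbers below are invented purely for illustration, not benchmarks:

```python
# Hypothetical 300W card with a performance score of 100,
# vs. a hypothetical 600W card scoring 400.
old_watts, old_perf = 300, 100
new_watts, new_perf = 600, 400

old_ppw = old_perf / old_watts   # ~0.33 perf per watt
new_ppw = new_perf / new_watts   # ~0.67 perf per watt

print(f"perf/watt improved {new_ppw / old_ppw:.1f}x")       # efficiency doubled...
print(f"wall draw increased {new_watts / old_watts:.1f}x")  # ...yet so did socket draw
```

Both statements are true at once: efficiency doubled while absolute draw from the wall also doubled, which is exactly why the two posters talk past each other.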
 

thuGG_pl

Member
Oh
Optimization of power consumption can only go so far, dude. With a 4090 he can play at 4K120+ with incredible-looking games, not exactly what we had 15 years ago.
I'm not denying the progress in graphics. I'm denying the perception that we are somehow more eco-friendly today; we are not, the opposite actually.
 

GymWolf

Member
Oh

I'm not denying the progress in graphics. I'm denying the perception that we are somehow more eco-friendly today; we are not, the opposite actually.
Sure, I think you and GHG were just talking about different things, and you were both right.
 

01011001

Banned
Read my post again. You are so happy that perf per watt is so much better, but you take 2-3x more power from the socket. Hurray!


For me it comes down to this: 10 years ago I played top-graphics games using 300W; now to play top-graphics games I use 600W. Progress.

Undervolt it, underclock it, and even then it will still have really good performance.

Also, the card doesn't hit its max TDP that often; that 600W is the max it can draw at full load.
In many games it will hover at around 350W to 450W without any undervolting or downclocking.
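As a rough sketch of what that typical draw means in energy terms: the 350-450W range is from the post above (call it ~400W average), while 100 hours of gaming and a ~$0.15/kWh electricity price are assumed figures for illustration only:

```python
# Compare energy use at typical in-game draw vs. the 600W board limit.
avg_watts, max_watts = 400, 600
hours, price_per_kwh = 100, 0.15  # assumptions: 100h of gaming, $0.15/kWh

kwh_avg = avg_watts * hours / 1000   # 40.0 kWh at typical draw
kwh_max = max_watts * hours / 1000   # 60.0 kWh at the full 600W limit

print(f"at ~{avg_watts}W: {kwh_avg:.0f} kWh -> ${kwh_avg * price_per_kwh:.2f}")
print(f"at  {max_watts}W: {kwh_max:.0f} kWh -> ${kwh_max * price_per_kwh:.2f}")
```

The gap between the headline 600W figure and the typical 400W draw works out to about a third less energy over the same playtime.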
 
Last edited:

Dr.D00p

Member
Personally I don't really give a shit about power consumption of these or any other GPU, my main objection to them is being taken for a stupid cunt by a bunch of rich, greedy cunts at Nvidia HQ...
 

thuGG_pl

Member
Undervolt it, underclock it, and even then it will still have really good performance.

Also, the card doesn't hit its max TDP that often; that 600W is the max it can draw at full load.
In many games it will hover at around 350W to 450W without any undervolting or downclocking.

You are missing the point. But we can end it now, because it's off-topic already.
 

HoodWinked

Member
4D chess move. Sell weaker card by the same name. Causes massive outrage.

Now do the thing that you always should have done in the first place and sell it under a different name, look like geniuses.

 

thuGG_pl

Member
4D chess move. Sell weaker card by the same name. Causes massive outrage.

Now do the thing that you always should have done in the first place and sell it under a different name, look like geniuses.

I don't think they will if they keep the $900 price. There will be serious outrage if they do that.
Sadly, with this gen Nvidia and AMD basically decided to f**k us over.
 

hlm666

Member
That is slightly cheaper than what the "4080 12GB" was set at, right? Still expensive for that tier though.

There is definitely pushback on the prices; next quarter's financials should be interesting, as we will be able to see Nvidia's gaming revenue. Does AMD give a breakdown of GPU/CPU, or does it just lump both into client computing or something?

 

GHG

Gold Member
The top resolutions at the time :p As I do now.
I guess you have a point, but it doesn't change the fact that nobody does what you say. 99% of gamers simply use way more power now. So it is a valid point, but not a realistic one.

Higher resolutions, higher refresh rates, higher-quality assets in games, along with more complexity: all of these things require more and more power. There's only so much you'd be able to get out of 350W, but then consoles and laptops still exist, so it's not like you don't have options.
 

DaGwaphics

Member

Still seems like it's $100 overpriced, but at least it's not $899. :messenger_grinning_smiling:
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If you think for a second Nvidia will lower the price of the 4080 (and, to an extent, the 4070 Ti from the official old price of the 4080), then you are sadly mistaken. Nvidia is like Apple: all they need to do is stop making the 4090 for a couple of months and all the stock of the 4080 will be gone. Which is what is happening, to be honest, as it's almost impossible to get a 4090 at all, especially here in Canada; ever since the release date, no card has been restocked.

Unless they do Supers again like Turing, I'm pretty sure they will actually end up doing price drops on the 4080 sooner rather than later and releasing a 4080 20GB in its place.
The 4080 12GB, now the 4070 Ti, is already reported to be getting a price drop.
Granted, there are no Founders Editions, so you might never actually find it at that price.
But CES and GTC will have the answers for us.

 

HoofHearted

Member
I have never said it's cheap. I have the FE 4090 and I paid so much money for it, but like you said, it's a hobby, and it's my main one when I have time. But I still can't see Nvidia officially dropping the price on an FE card. They might lower the price for AIB cards so the AIBs lower theirs too. But Nvidia itself admitting their pricing is high and lowering it officially after release? Not gonna happen. Not till the 5000 series is almost here, or unless they decide to release the 4090 Ti for the same price as the 4090; then they would lower the price of the 4090 and 4080 accordingly (just like how they did later with the 3090 Ti, 3090, and the 3080 Ti, but even then, they only did it a couple of months before the 4000 series lol), and they didn't lower the price of the other cards. I also highly doubt the 4090 Ti will be a $1600 card either; it's gonna be $2000 for sure.

Again, the 4090 Ti isn't aimed at you or me. It's aimed at the guys with lots of money who want the best even if it's for a short period of time; the type of people that shit money, for whom $2000 is like $20 is to us.

People's problem with the 4080 is its name associated with its price. If the 4090 was called the 4090 Ti and the 4080 was called the 4090, people would be all over it: "OH MY GOD NVIDIA IS THE BEST, 4090 FOR 1200 ONLY, AND IT'S 50% FASTER THAN THE 3090TI BUT MUCH CHEAPER, NVIDIA ARE GODS, GAME OVER AMD"

As it stands, the 4080 is a supreme, top-of-the-line card for any 4K 60+ title, let alone with DLSS and frame generation (even if FG 1.0 isn't great to some, it will only get better with the upcoming updates; I played Spider-Man with frame generation and didn't feel any input lag, but I did feel the double frames).

To each his own. But honestly, if you can afford the 4090, go for it. If you can afford the 4080, go for it. I will never recommend the AMD 7900 XTX over the 4080 no matter what, let alone the huge mistake @Topher was about to make by buying a 7900 XT. Man dodged a bullet -_-

Certainly agreed that it's an expensive hobby.. :). I've spent waaaayyyy too much over the past several years upgrading my rig, and am now considering upgrading again over the course of this year. Comparatively speaking though, the price/performance tiers have changed significantly compared to what I bought last gen.

Even with the latest news above (if it holds true) regarding $799 for the 4070 Ti, that's still a ~100% increase in price versus last gen, and that's only one part of the overall build. This, along with the notable YoY drop in GPU sales, indicates that demand is (unsurprisingly) significantly lower than for the 3xxx series: obviously due to impacts from mining, but also due to the change in current pricing.

Adding to this: the overall cost of buying all the required components for a fully "new" build is nearly double or triple that of the past few generations if you're targeting top-of-the-line/latest-gen releases.

Realistically, budgeting and spending $3-4k+ on a custom "high-end" build in this market isn't sustainable IMHO. At that price point - priorities shift considering the significant investment. The average consumer (non-4090 buyer / low-mid range) will simply not buy/invest until the prices come lower - especially for those looking to upgrade to latest gen even for a mid-range build.

NVIDIA noted during their previous calls/updates that their pricing strategy was focused on putting downward pressure on the market to sell remaining 3xxx series stock. That's been largely accomplished over this past quarter, as most of the remaining 3xxx stock is the lower-end/tier cards. The question is: how much longer are NVIDIA, AMD, and other parts manufacturers willing to live with the reduced YoY sales?

Setting the 4090 aside: expecting continued 30%+ profit margins at 100%-increased MSRPs isn't sustainable. We're already seeing evidence of this across several markets/regions...

Your statements may certainly hold true, but I still think a price adjustment is already being reviewed internally, and we may see it sooner rather than later.
 
Last edited:

GreatnessRD

Member
They won't be forced to do it; it was the plan all along.

That $899 price tag of the XT already had an early adopter (at least) $100 price tax included in it, just waiting to be knocked off once Nvidia launched the 4070Ti.
That seems logical. Milk the early adopters and then get your originally planned target audience when the time comes; it's a win-win for AMD. CES is going to be interesting, to say the least.
 

Buggy Loop

Member


3080 vs 4080 is +48.8% at 4k, +61.5% at 4k RT.

Sites with better CPUs generally show >50%. Igor got +58.7%.

The problem isn't the gen-to-gen boost in performance anyway; it's the price: +72%.

3090 vs 4090 got a +70.8% boost for a +7% price increase. That makes sense.
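Those price percentages can be reproduced from the US launch MSRPs (assumed here for illustration: 3080 $699 → 4080 $1199, 3090 $1499 → 4090 $1599):

```python
# Percent price change between two generations of the same tier.
def pct_increase(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100

print(f"3080 -> 4080 price: +{pct_increase(699, 1199):.0f}%")   # ~+72%
print(f"3090 -> 4090 price: +{pct_increase(1499, 1599):.0f}%")  # ~+7%
```

So the 4090 roughly holds the line on price per frame, while the 4080 tier absorbs almost the entire price hike.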
 

DaGwaphics

Member

The Gigabyte Eagles aren't even dual-slot this time, damn. They need to start thinking outside the box on these things (vapor chambers, liquid metal, etc.) to avoid the constant giant cards. That will hurt sales too, as many will just opt out of the model rather than get a gigantic case.
 