
Next-Gen NVIDIA GeForce RTX 4090 With Top AD102 GPU Could Be The First Gaming Graphics Card To Break Past 100 TFLOPs

To the people whining about power consumption and electrical cost:

Even if you played 8 hours a day, 30 days a month, and your electric rate is $0.12 per kWh, the extra 100 W a 4090 draws over a 3090 amounts to a whopping extra $2.88 per month. Wow. From there you can figure out the added cost based on the final numbers: 200 W more = $5.76 a month; 300 W more (not going to happen) would be $8.64 more per month.

I'd like to think that if you're someone buying a $2,000 graphics card, a $500 CPU, a $400 motherboard, a $300 NVMe drive, $300 of DDR5 RAM, etc., then an extra $3-6 a month isn't worth caring about. It's such a silly, overblown thing.
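For anyone who wants to sanity-check that arithmetic, here's a quick sketch in Python (the 8 h/day, 30 days/month and $0.12/kWh figures are the assumptions from the post above, not universal numbers; plug in your own rate and play time):

```python
# Rough monthly cost of the *extra* power a hypothetical 4090 draws over a 3090.
# Assumptions from the post above: 8 hours/day, 30 days/month, $0.12 per kWh.
HOURS_PER_DAY = 8
DAYS_PER_MONTH = 30
PRICE_PER_KWH = 0.12  # USD

def extra_cost_per_month(extra_watts: float) -> float:
    """Monthly cost (USD) of an additional steady power draw while gaming."""
    extra_kwh = (extra_watts / 1000) * HOURS_PER_DAY * DAYS_PER_MONTH
    return extra_kwh * PRICE_PER_KWH

for watts in (100, 200, 300):
    print(f"+{watts} W -> ${extra_cost_per_month(watts):.2f}/month")
# +100 W -> $2.88/month, +200 W -> $5.76/month, +300 W -> $8.64/month
```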
 

Kenpachii

Member
To the people whining about power consumption and electrical cost:

Even if you played 8 hours a day, 30 days a month, and your electric rate is $0.12 per kWh, the extra 100 W a 4090 draws over a 3090 amounts to a whopping extra $2.88 per month. Wow. From there you can figure out the added cost based on the final numbers: 200 W more = $5.76 a month; 300 W more (not going to happen) would be $8.64 more per month.

I'd like to think that if you're someone buying a $2,000 graphics card, a $500 CPU, a $400 motherboard, a $300 NVMe drive, $300 of DDR5 RAM, etc., then an extra $3-6 a month isn't worth caring about. It's such a silly, overblown thing.

The only problem I have is heat, that's about it. I couldn't care less about energy consumption, especially since I have solar panels that provide the majority of the energy anyway.
 

ZywyPL

Banned
To the people whining about power consumption and electrical cost:

Even if you played 8 hours a day, 30 days a month, and your electric rate is $0.12 per kWh, the extra 100 W a 4090 draws over a 3090 amounts to a whopping extra $2.88 per month. Wow. From there you can figure out the added cost based on the final numbers: 200 W more = $5.76 a month; 300 W more (not going to happen) would be $8.64 more per month.

I'd like to think that if you're someone buying a $2,000 graphics card, a $500 CPU, a $400 motherboard, a $300 NVMe drive, $300 of DDR5 RAM, etc., then an extra $3-6 a month isn't worth caring about. It's such a silly, overblown thing.

Energy is almost free, really. I learned that a few months ago when I was away from home for a full month with only the fridge turned on, running at its minimum setting, and I still paid 65% of what I usually pay each month, because most of the bill turned out to be the fixed fees you pay your energy provider no matter what. Since then I've stopped paying attention to energy consumption at all: all my devices are in standby mode now and I charge my phone at home twice a day instead of at work. The extra 100, 200, 300 W that next-gen GPUs and CPUs draw will translate into nothing at the end of the day; most of the bill will still be those fixed fees.
 

Rickyiez

Member
The only problem I have is heat, that's about it. I couldn't care less about energy consumption, especially since I have solar panels that provide the majority of the energy anyway.
True, heat is the biggest issue for those of us living in tropical countries. Gaming at 60-70 degrees, the hot air exhausted from the PC just brings the room temperature up.
 
With node shrinks they can easily do it with their laptop-esque setups. Ten years ago we had 3.3 TF cards drawing 200 W. An undervolted RX 6800 does 16 TF at 170 W. In 5-6 years I'm sure they'll be able to get close.
Interesting.
What does this look like in terms of form factor? Do these consoles get physically bigger? Personally, I won't mind a fridge that's packing 40 TF, though a lot of people might take issue with that.
 

Dream-Knife

Banned
Interesting.
What does this look like in terms of form factor? Do these consoles get physically bigger? Personally, I won't mind a fridge that's packing 40 TF, though a lot of people might take issue with that.
Should be about the same size, really. 200 W is 200 W. These consoles could be even smaller with a better cooling system and more airflow, but you have to build for a common denominator and avoid requiring disassembly for regular maintenance.
 
Should be about the same size, really. 200 W is 200 W. These consoles could be even smaller with a better cooling system and more airflow, but you have to build for a common denominator and avoid requiring disassembly for regular maintenance.
Very interesting. I must admit I don't know much about the inner workings of these things, so I hope you don't mind the questions.
If they can maintain a similar form factor while increasing system performance, then great.
 

HeisenbergFX4

Gold Member
Replaying Spiderman with one of these?

[Spider-Man GIF]
 

Amiga

Member
Hardware needs more innovation in pipelines and memory handling, not TFLOPS. Make games easier to develop and worlds more sophisticated; things that would lead to fewer problems like the ones in Cyberpunk.
 

CuNi

Member
To the people whining about power consumption and electrical cost:

Even if you played 8 hours a day, 30 days a month, and your electric rate is $0.12 per kWh, the extra 100 W a 4090 draws over a 3090 amounts to a whopping extra $2.88 per month. Wow. From there you can figure out the added cost based on the final numbers: 200 W more = $5.76 a month; 300 W more (not going to happen) would be $8.64 more per month.

I'd like to think that if you're someone buying a $2,000 graphics card, a $500 CPU, a $400 motherboard, a $300 NVMe drive, $300 of DDR5 RAM, etc., then an extra $3-6 a month isn't worth caring about. It's such a silly, overblown thing.

To the Americans not understanding that their energy is basically free, let me explain it to you.
A 200 W increase in power draw for 8 hours a day, 7 days a week, would cost more than €200 a year here in Germany.
And that's already assuming a pretty good price. If I paid for it on my current plan, it would be €262.08 per year, to be exact.

Now obviously I'm not gaming 8 hours a day, 7 days a week, but since COVID I also use the PC for office work etc., so it's already running a lot each day.
So yes, especially in Europe, it IS a thing many people consider. And energy prices will only go up here given the war with Russia and everyone trying to cut ties and stop buying gas from them.
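For comparison, the same back-of-the-envelope calculation with those European numbers (a sketch only; the ~€0.45/kWh rate is simply what the €262.08 figure above implies, not a quoted tariff):

```python
# Yearly cost of a 200 W increase at 8 h/day, 7 days/week, at a German-style rate.
# The 0.45 EUR/kWh price is inferred from the 262.08 EUR/year figure above (assumption).
EXTRA_KW = 0.2            # +200 W of draw
HOURS_PER_WEEK = 8 * 7
WEEKS_PER_YEAR = 52
PRICE_PER_KWH = 0.45      # EUR, assumed

extra_kwh = EXTRA_KW * HOURS_PER_WEEK * WEEKS_PER_YEAR   # 582.4 kWh
print(f"~{extra_kwh * PRICE_PER_KWH:.2f} EUR per year")  # ~262.08 EUR
```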
 

OZ9000

Banned
To the people whining about power consumption and electrical cost:

Even if you played 8 hours a day, 30 days a month, and your electric rate is $0.12 per kWh, the extra 100 W a 4090 draws over a 3090 amounts to a whopping extra $2.88 per month. Wow. From there you can figure out the added cost based on the final numbers: 200 W more = $5.76 a month; 300 W more (not going to happen) would be $8.64 more per month.

I'd like to think that if you're someone buying a $2,000 graphics card, a $500 CPU, a $400 motherboard, a $300 NVMe drive, $300 of DDR5 RAM, etc., then an extra $3-6 a month isn't worth caring about. It's such a silly, overblown thing.
Heat and noise.

I want my room to be as cool and quiet as possible.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Hardware needs more innovation in pipelines and memory handling, not TFLOPS. Make games easier to develop and worlds more sophisticated; things that would lead to fewer problems like the ones in Cyberpunk.
Better pipelines?
That's exactly what's going on with virtual geometry, visual scripting and real-time global illumination.
Actually making games is getting much, much easier.
No need to bake lighting.
Basically every engine is PBR, so materials always look correct.
DCC -> game engine is basically a one-click job these days, textures, UVs and all.
What used to take me hours in UDK or CryEngine 3 takes me seconds today.

These new cards also have a massive L2 cache now, which explains the reduced memory interface. In certain applications Ada is going to absolutely wipe the floor with Ampere; if devs really take advantage of the new chips, Ampere will age really quickly.

More TFLOPs are always welcome.
 

Dream-Knife

Banned
To the Americans not understanding that their energy is basically free, let me explain it to you.
A 200 W increase in power draw for 8 hours a day, 7 days a week, would cost more than €200 a year here in Germany.
And that's already assuming a pretty good price. If I paid for it on my current plan, it would be €262.08 per year, to be exact.

Now obviously I'm not gaming 8 hours a day, 7 days a week, but since COVID I also use the PC for office work etc., so it's already running a lot each day.
So yes, especially in Europe, it IS a thing many people consider. And energy prices will only go up here given the war with Russia and everyone trying to cut ties and stop buying gas from them.
The US is the largest economy.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Like I said, an RTX 4070 will be on par with, or beat by a low margin, a 3090 Ti.

The problem with the 4070 will be its 12 GB.
If it manages to handily beat a 3080 12 GB I'll be mighty impressed.
Gaming performance will be really hard to gauge, because that L2 cache will massively benefit some games and do nothing in others.
I'm hoping the ~$500 RTX 4070 does actually beat a 3090 Ti, but I'm skeptical.

The 4090 Ti being such a huge upgrade over the 4080 is bullshit; for 3080 owners, the only upgrade that's really going to feel like an upgrade beyond the usual skip-a-generation is the likely $1,500 base price 4090.
The 4080 Ti had better come out sooner rather than later.
 

winjer

Gold Member
But this year there was a drop-off of 19% for the last quarter, compared to the same quarter last year.
And this last quarter in 2022 saw a 6.2% decrease over the first quarter.

People are no longer buying as many PCs and GPUs, in part because of the end of lockdown restrictions and working from home, and in part because mining is far less profitable.
The PC market is getting back to normal demand from consumers.
 

winjer

Gold Member
My point is, people complaining about efficiency can just buy a lower-powered card. Most people buy the lower-end cards anyway.

Or undervolting.
I've seen people undervolt a 3090, cutting almost 100 W and sacrificing just a bit of clock speed, with performance only ~5% lower.
The same can be done with any GPU, from NVIDIA or AMD.
 
Now obviously I'm not gaming 8 hours a day, 7 days a week, but since COVID I also use the PC for office work etc., so it's already running a lot each day.
Which is specifically why I made such a massive overestimate of time spent gaming under maximum load. It's totally unrealistic and a worst-case scenario. Nobody games 8 hours a day, 7 days a week, at maximum GPU load. Realistically, your rip-off EU energy cost per year can be cut in half or even to a quarter. Most people can only game a couple of hours a day, and many can only game on the weekend. Your 200 euro estimate is massively overblown, just like my gorgeous American estimate is a massive overestimate too.

TL;DR: energy costs are a joke when you look at the big picture, yes, even for you europoors.
 

winjer

Gold Member


According to Kopite7kimi, the next-gen RTX 4060 graphics card is reportedly launching with a higher TDP than the RTX 3070. The RTX 3070 is based on a cut-down GA104 GPU with its TDP limited to 220 W. This means the RTX 4060 would ship with an even higher TDP.

Yesterday, a new rumor with release dates for the RTX 40 series emerged. Citing its own sources, Wccftech claims that the launch date has again been shifted by a month. Interestingly, for the first time, the RTX 4060 graphics card was mentioned with a possible CES 2023 debut:

  • NVIDIA GeForce RTX 4090 – October 2022 Launch
  • NVIDIA GeForce RTX 4080 – November 2022 Launch
  • NVIDIA GeForce RTX 4070 – December 2022 Launch
  • NVIDIA GeForce RTX 4060 – January CES 2023 Unveil

At a time when energy prices are on the rise, this news of much higher power usage is a bit worrisome.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not






At a time when energy prices are on the rise, this news of much higher power usage is a bit worrisome.
The GPU price crash is coming and it's coming in hot.
The delay is almost certainly because board partners need to get rid of RTX 30 / RX 6000 stock.

Those poor scalping bastards... can't even sell their scalps, so they need to start returning stuff they bought at already-scalped prices... hahaha, KARMA!


[attached images: returned GPUs on a store counter]



P.S. Scalpers are scum; every time one gets burned, an angel gets its wings.
 

Leonidas

Member
At a time when energy prices are on the rise, this news of much higher power usage is a bit worrisome.
Power going up is a good thing as long as performance and performance per watt are also good. If these GPUs are 2x the performance of the 30 series, as some rumors suggest, these cards will have much better performance per watt than the 30 series while also being way more powerful.
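A toy illustration of that point (the 2x figure is the rumor quoted above; the wattages below are placeholders, not confirmed TDPs):

```python
# Higher absolute power draw can still mean better performance per watt.
# 2x performance comes from the rumor quoted above; both TDPs are hypothetical.
old_perf, old_watts = 1.0, 350   # baseline 30-series card (placeholder TDP)
new_perf, new_watts = 2.0, 450   # rumored 2x performance at a higher, assumed TDP

improvement = (new_perf / new_watts) / (old_perf / old_watts)
print(f"perf/W improvement: {improvement:.2f}x")  # ~1.56x despite drawing +100 W
```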
 

kraspkibble

Permabanned.
The GPU price crash is coming and it's coming in hot.
The delay is almost certainly because board partners need to get rid of RTX 30 / RX 6000 stock.

Those poor scalping bastards... can't even sell their scalps, so they need to start returning stuff they bought at already-scalped prices... hahaha, KARMA!


P.S. Scalpers are scum; every time one gets burned, an angel gets its wings.
It would be even better if the store only offered store credit or straight up refused to take the returns. That cunt shouldn't be getting their money back.
 

hlm666

Member
The GPU price crash is coming and it's coming in hot.
The delay is almost certainly because board partners need to get rid of RTX 30 / RX 6000 stock.

Those poor scalping bastards... can't even sell their scalps, so they need to start returning stuff they bought at already-scalped prices... hahaha, KARMA!


P.S. Scalpers are scum; every time one gets burned, an angel gets its wings.
I was looking at the first image and wondering if the box on the ground was also his; scrolled down, and now there are three cards on the counter, so that's a yes...
If the delay is because of leftover stock, that annoys me even more than the other possible reasons.
 

David B

An Idiot
Jumping to 100 teraflops? That's just insane. I really don't think things are gonna get that high yet. The 3090 Ti with 40 teraflops is already high as wowsers. I'm thinking they're just gonna go to 50 teraflops for the 4090. 10 teraflops is already a huge milestone for graphics; it will make for a solid 4K-to-8K game at 60 to 100 FPS. It's already hard to reach 4K at 60 fps, but with the 4090 we will be there and more. But 100 teraflops, again, is not gonna happen for at least 3 more years.
 
The GPU price crash is coming and it's coming in hot.
The delay is almost certainly because board partners need to get rid of RTX 30 / RX 6000 stock.

Those poor scalping bastards... can't even sell their scalps, so they need to start returning stuff they bought at already-scalped prices... hahaha, KARMA!


P.S. Scalpers are scum; every time one gets burned, an angel gets its wings.

There's an economic crisis on the horizon, and potentially war. Even if scalpers no longer buy GPUs, I'm not so sure GPU prices will go down in the near future. I want to be wrong though..., so please explain to me why I'm wrong :D.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There's an economic crisis on the horizon, and potentially war. Even if scalpers no longer buy GPUs, I'm not so sure GPU prices will go down in the near future. I want to be wrong though..., so please explain to me why I'm wrong :D.
Cuz NVIDIA is gonna release a new generation of chips by the end of the year.
Vendors can't sell RTX 30s for anywhere near the price of the RTX 40s, and they can't realistically sell the RTX 40s at 200% of MSRP (economic crisis, demand and all that jazz).
The only real option is to drop the price of the RTX 30s so they can actually sell them off.
Otherwise they'll just be sitting on a bunch of stock while people wait out the economic crisis you mentioned.
Holding stock costs more than selling it low.

RTX 30s are practically at MSRP as we speak; once the RTX 40s drop, demand for them even at MSRP will plummet. How do they balance the reduced demand with the excess supply they will have?
Econ 101 is the answer.
 

Sanepar

Member
Jumping to 100 teraflops? That's just insane. I really don't think things are gonna get that high yet. The 3090 Ti with 40 teraflops is already high as wowsers. I'm thinking they're just gonna go to 50 teraflops for the 4090. 10 teraflops is already a huge milestone for graphics; it will make for a solid 4K-to-8K game at 60 to 100 FPS. It's already hard to reach 4K at 60 fps, but with the 4090 we will be there and more. But 100 teraflops, again, is not gonna happen for at least 3 more years.
Yeah man, the flagship for a new gen will jump 25%. Come on, guys! Every new gen besides the 2000 series was a jump of 50-70%.

This will be the same. People who spent 2x MSRP are wishing for a bad series. I have bad news for you guys (including me, who has a 6800).

I can't wait to jump on an RTX 4080 beating the 3090 Ti by 25-30% in performance.
 
Cuz NVIDIA is gonna release a new generation of chips by the end of the year.
Vendors can't sell RTX 30s for anywhere near the price of the RTX 40s, and they can't realistically sell the RTX 40s at 200% of MSRP (economic crisis, demand and all that jazz).
The only real option is to drop the price of the RTX 30s so they can actually sell them off.
Otherwise they'll just be sitting on a bunch of stock while people wait out the economic crisis you mentioned.
Holding stock costs more than selling it low.

RTX 30s are practically at MSRP as we speak; once the RTX 40s drop, demand for them even at MSRP will plummet. How do they balance the reduced demand with the excess supply they will have?
Econ 101 is the answer.
In my country, people are still selling used GPUs like the Pascal GTX 1080 for around $350-370 even though better, new cards like the 3060 cost only a little bit more, so I'm not sure people will be willing to sell Ampere GPUs at a much lower price even when the 40-series GPUs launch.
 

CuNi

Member
Which is specifically why I made such a massive overestimate of time spent gaming under maximum load. It's totally unrealistic and a worst-case scenario. Nobody games 8 hours a day, 7 days a week, at maximum GPU load. Realistically, your rip-off EU energy cost per year can be cut in half or even to a quarter. Most people can only game a couple of hours a day, and many can only game on the weekend. Your 200 euro estimate is massively overblown, just like my gorgeous American estimate is a massive overestimate too.

TL;DR: energy costs are a joke when you look at the big picture, yes, even for you europoors.

Ah, sorry. I didn't immediately see that you're only here to troll.

Handwaving away energy costs because they don't fit your narrative. There's a reason the EU reset its energy labels: to push companies to innovate toward more power-efficient devices instead of stagnating, but I guess such efforts are a joke in your opinion too. Just like conserving gas. Why do that when you can drive an SUV, am I right?

Also, even if you cut the cost by a factor of four, €60 a year is still horrible. It's not even what you pay for the card per year; it's what you pay MORE per year. If this trend continues, in two generations my gaming PC would be responsible for 50% of my energy bill, since everything else, even TVs etc., is only going down in power consumption.

Whether you want to believe it or not, efficiency and power consumption will only gain importance in the near future, and AMD and NVIDIA will have to follow the trend.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
In my country, people are still selling used GPUs like the Pascal GTX 1080 for around $350-370 even though better, new cards like the 3060 cost only a little bit more, so I'm not sure people will be willing to sell Ampere GPUs at a much lower price even when the 40-series GPUs launch.
Are any GPUs sold at MSRP in your country?
Cuz that's a defining factor: there are economies where any forecast based on US MSRP is basically pointless, places where the second-hand and grey markets are bigger than the "legit" market. Price balancing may take a lot longer wherever you're from, or prices may generally just be more expensive.
In my home country, PS5s launched at ~$1,000; those weren't scalped prices, that's just what the price is.
 
Are any GPUs sold at MSRP in your country?
Cuz that's a defining factor: there are economies where any forecast based on US MSRP is basically pointless, places where the second-hand and grey markets are bigger than the "legit" market. Price balancing may take a lot longer wherever you're from, or prices may generally just be more expensive.
In my home country, PS5s launched at ~$1,000; those weren't scalped prices, that's just what the price is.
The 3060 is close to MSRP, yet many people still sell their old GTX 1080 for just a little less than a 3060. I'm still happy with my GTX 1080 and I don't want to sell it (it will be my backup card), but I think after nearly six years this card is worth no more than $200, yet people still sell it for around $350.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 3060 is close to MSRP, yet many people still sell their old GTX 1080 for just a little less than a 3060. I'm still happy with my GTX 1080 and I don't want to sell it (it will be my backup card), but I think after nearly six years this card is worth no more than $200, yet people still sell it for around $350.
Well, the GTX 1080 still performs well today; in raw raster it's within spitting distance of an RTX 3060, so it kinda makes sense it would be priced just below a 3060 when there are no other cards to buy.
And are people consciously choosing to buy (not sell) GTX 1080s over RTX 3060s now that MSRP is basically met and they (according to you) cost effectively the same?
That's odd, cuz I could understand that during the shortage/scalping any GPU was a good GPU.
But in the current climate, aren't those guys just buying RTX 3060s instead?

Anyway, in the larger markets such as Europe and the US, everything prior to the RTX 30s has dropped in price tremendously. Not too long ago Turing 2060s were going for $400-500, well over MSRP, but now they're ~$250 at most. The same crash will happen to the RTX 30s when the RTX 40s show up.
Partners can't charge double MSRP, eBay can't charge double MSRP, and they can't just hold on to the stock and pray demand for RTX 30s bounces back; no one will want an RTX 30 when they can get an RTX 40 for the same price or less and nearly double the performance.
So the market will have to fix itself, and in doing so they will have to reduce the price of the RTX 30s.

I'm really curious why you think RTX 30s would hold their value at a global scale after the RTX 40s drop.
 
Well, the GTX 1080 still performs well today; in raw raster it's within spitting distance of an RTX 3060, so it kinda makes sense it would be priced just below a 3060 when there are no other cards to buy.
And are people consciously choosing to buy (not sell) GTX 1080s over RTX 3060s now that MSRP is basically met and they (according to you) cost effectively the same?
That's odd, cuz I could understand that during the shortage/scalping any GPU was a good GPU.
But in the current climate, aren't those guys just buying RTX 3060s instead?

Anyway, in the larger markets such as Europe and the US, everything prior to the RTX 30s has dropped in price tremendously. Not too long ago Turing 2060s were going for $400-500, well over MSRP, but now they're ~$250 at most. The same crash will happen to the RTX 30s when the RTX 40s show up.
Partners can't charge double MSRP, eBay can't charge double MSRP, and they can't just hold on to the stock and pray demand for RTX 30s bounces back; no one will want an RTX 30 when they can get an RTX 40 for the same price or less and nearly double the performance.
So the market will have to fix itself, and in doing so they will have to reduce the price of the RTX 30s.
Raw performance is not everything. I know my GTX 1080 OC is close to a stock 3060 in old games, but I don't have features like HW RT, HW decompression, or mesh shaders, and these can make a big difference. For example, the 3DMark mesh shader demo suggests up to 4x more performance, and if a game is using RT you can add another 4x. With DLSS on top of that, an RTX 3060 should run circles around my GTX 1080, and there are already games showing a very big performance gap.

For example, Crysis 2 Remastered:

1440p results:
GTX 1080 - 10 fps
RTX 3060 - 38 fps, and 77 fps with DLSS

People who have a 3060 can play this game at 77 fps, while I can only look at screenshots :p.

And Crysis 2 Remastered doesn't even use all the Turing/Ampere features, yet the results already show how much faster a 3060 can be. I won't be surprised if some upcoming games don't run on my GTX 1080 at all, especially if developers use RT GI (I really hope Dead Space will feature SW RT like Lumen, because otherwise my GTX 1080 will have a problem).

I'm still happy with my GTX 1080's performance (with a 2.1 GHz OC it has 10.7 TF, so it's just a little slower than a stock 1080 Ti at 11.3 TF), because for now it runs 99% of my games at 1440p 60 fps, but I think this GPU can't be compared to something like an RTX 3060. If you also consider how old the GTX 1080 is (it has no warranty), it's certainly not worth a $350 asking price.
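(For anyone wondering where numbers like 10.7 TF come from: FP32 TFLOPS are conventionally estimated as shader count x 2 FLOPs per clock x clock speed. A quick sketch, using the GTX 1080's published 2560 CUDA cores with the 2.1 GHz overclock mentioned above, and the 1080 Ti's 3584 cores at its stock ~1.58 GHz boost:)

```python
# Conventional FP32 TFLOPS estimate: cores * 2 (FMA = 2 FLOPs per clock) * GHz / 1000.
def tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000

print(f"GTX 1080 @ 2.1 GHz OC:   {tflops(2560, 2.1):.2f} TF")    # ~10.75 TF
print(f"GTX 1080 Ti stock boost: {tflops(3584, 1.582):.2f} TF")  # ~11.34 TF
```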

I'm really curious why you think RTX 30s would hold their value at a global scale after the RTX 40s drop.

Because people are greedy. If the economic experts are correct and current events really do lead to a huge economic crisis, then we will see shortages of everything, including semiconductors of course, and if NVIDIA and AMD have trouble making their new GPUs, people will be forced to buy old GPUs, and I'm sure Ampere GPU owners will absolutely try to exploit that situation. I hope the experts are wrong with their predictions, but only time will tell who was correct.
 