
Next-Gen NVIDIA GeForce RTX 4090 With Top AD102 GPU Could Be The First Gaming Graphics Card To Break Past 100 TFLOPs

CuNi

Member
Unless you are mining with them, even an 800W card will barely show up on the power bill. My 3090 is running at 500W overclocked, and my computer amounts to a rounding error on the energy bill. People WAY overplay the actual cost of computer power consumption, quoting the full 100% peak load power draw as the scary figure and failing to mention that 90%+ of the time the GPU sits at a much lower power profile.

You are wrong and I already corrected someone else in another thread on it.
It may be "a rounding error on the energy bill" in America, but it surely isn't in the EU.
In the other thread, we had an example of how much of a footprint a 200W increase alone would leave on the energy bill in a hypothetical 8h-a-day, whole-year scenario, and it came out to around 140€ more per year.
And that is only the INCREASE of 200W.

I can give you a more realistic scenario.
Let's assume the GPU draws 600W (the GPU alone!) and we run it for 21h a week (an average of 3h per day), which is realistic.
That GPU alone would cost me, at my current rate, 275.18€ per year. And that is for the GPU only; no CPU or anything else is included.
The same GPU at 800W would already come to an insane 366.91€.

Yes, this assumes 100% of the advertised power draw, but even if you take 10% or even 20% off those figures, the price paid is still ridiculous.
And as for idle power: this example assumes only 3h of gaming a day. Many people use the PC far longer each day for browsing, work, etc. While it isn't insanely high, idle draw, depending on the card, still easily reaches 40W or more.
And the rest of the system still has to be added to the whole cost calculation.

I don't know you, so I cannot judge your wealth, but to be honest, nearly 400€ per year spent on GPU energy alone is, to me, far from a rounding error.
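
For anyone who wants to check the arithmetic, here is a minimal sketch of that calculation, assuming the ~0.42 €/kWh rate the figures above imply:

```python
# Annual GPU energy cost: kWh/year = watts / 1000 * hours/week * 52,
# cost = kWh/year * price per kWh. The 0.42 EUR/kWh rate is the one
# implied by the figures above; substitute your own tariff.
def annual_cost_eur(watts, hours_per_week, eur_per_kwh=0.42):
    kwh_per_year = watts / 1000 * hours_per_week * 52
    return kwh_per_year * eur_per_kwh

print(round(annual_cost_eur(600, 21), 2))  # 275.18 for the 600W scenario
print(round(annual_cost_eur(800, 21), 2))  # 366.91 for the 800W scenario
```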
 

CrustyBritches

Gold Member
This is from the energy.gov appliance energy calculator. This would be for a 500W GPU running at gaming load 3hrs/day, 200 days/year.
[Image: energy.gov appliance energy calculator result for the 500W, 3hrs/day, 200 days/year scenario]


Idle power consumption for a 3080 is like 10-20W. Media playback is around 25-30W.
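
The same arithmetic, sketched with an assumed ~$0.13/kWh US residential rate (not necessarily the exact rate behind the calculator result above):

```python
# energy.gov-style estimate: kWh/year = watts / 1000 * hours/day * days/year.
# The $0.13/kWh rate is an assumed ballpark US residential price, not
# necessarily the exact rate the calculator screenshot used.
watts, hours_per_day, days_per_year, usd_per_kwh = 500, 3, 200, 0.13
kwh = watts / 1000 * hours_per_day * days_per_year   # 300 kWh/year
print(f"{kwh:.0f} kWh/year -> ${kwh * usd_per_kwh:.2f}/year")  # ~$39/year
```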
 

Pagusas

Elden Member
You are wrong and I already corrected someone else in another thread on it.
It may be "a rounding error on the energy bill" in America, but it surely isn't in the EU.
[...]
I don't know you, so I cannot judge your wealth, but to be honest, nearly 400€ per year spent on GPU energy alone is, to me, far from a rounding error.
You are correct, I'm thinking in terms of America; our energy price in my area is 12.8¢/kWh. Our electric bill averages about $500-$600 a month, with the super hot summer months sometimes hitting $1,000/month, and of that the electronics in the house (computers, game consoles, TVs) are basically nothing on the bill (we monitor everything closely with a home energy monitoring system clamped to the breaker box and individual circuits). It's pool pumps and air conditioners that chew through 70%+ of the bill. My computer on average costs $17 a month to run, with moderate gaming and video production work; even doubling that wouldn't make much difference on the power bill as a whole to me.
 

Rickyiez

Member
Unless you are mining with them, even an 800W card will barely show up on the power bill. My 3090 is running at 500W overclocked, and my computer amounts to a rounding error on the energy bill. People WAY overplay the actual cost of computer power consumption, quoting the full 100% peak load power draw as the scary figure and failing to mention that 90%+ of the time the GPU sits at a much lower power profile.
Heat is the real issue, especially in tropical countries. With the GPU and CPU under gaming load, the room isn't comfortable without AC.
 

Tomi

Member
I think we won't see anything about new RTX graphics cards this year because the market is flooded with RTX 3xxx series cards.
Maybe some announcement in December, but I don't believe that either.
 

Airbus Jr

Banned
Nah just trying to get those QD-OLEDs to actually work out.

[Benchmark charts: Cyberpunk 2077, Deathloop, and Watch Dogs Legion with ray tracing at 3840×2160]

Can you imagine when actual next-gen games show up?
Yeah man, that 3090 performance is not good enough.

Can't run Watch Dogs Legion with ray tracing at 60 fps.

Let's wait and see for the RTX 4000 series this year.
 

ZywyPL

Banned
Boring.
The consoles haven't even been tapped out or even come close to being tapped out.

Tapped out? They can't even run PS4/XB1 games at 4K60. If you want to "tap them out" you'll end up with The Matrix demo running at 1080p 20-30 FPS, and that's exactly what the upcoming cards are for - that kind of next-gen visuals, except at 4K60.
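
As a rough sanity check on that gap, the implied performance multiplier can be sketched from pixel throughput alone (a simplification that ignores CPU limits and how ray-tracing cost scales with resolution):

```python
# Rough multiplier needed to go from 1080p at 20-30 FPS to 4K at 60 FPS,
# assuming cost scales with pixels per second (ignores CPU limits, RT/BVH
# cost scaling, and upscalers such as DLSS).
base_pixels = 1920 * 1080
target_pixels = 3840 * 2160

for base_fps in (20, 30):
    factor = (target_pixels * 60) / (base_pixels * base_fps)
    print(f"1080p{base_fps} -> 4K60 needs roughly {factor:.0f}x the throughput")
# Prints 12x and 8x respectively.
```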
 

Hot5pur

Member
Synthetic benchmark and a game developed in partnership with Nvidia ... Gee I wonder why the results are so promising.

Feels like a controlled leak. Maybe the full AD102 could be quite good but I wouldn't count on the other product categories to be much better than the standard 50% gen on gen uplift.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Synthetic benchmark and a game developed in partnership with Nvidia ... Gee I wonder why the results are so promising.

Feels like a controlled leak. Maybe the full AD102 could be quite good but I wouldn't count on the other product categories to be much better than the standard 50% gen on gen uplift.
Considering the 3080 was a GA102 and the 4080 is an AD103 with a narrower bus, it's almost certainly not going to be the same jump as the 3090 to the 4090.
AD102 is huge and fast as fuck because it's on TSMC now.... I fully expect the 4090 to be ballpark twice as fast as the 3090.
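
(A quick aside on the thread title's 100 TFLOPs claim: the usual back-of-envelope FP32 math is sketched below; the full-AD102 core count and boost clock are the rumored figures under discussion, not confirmed specs.)

```python
# Back-of-envelope FP32 throughput: TFLOPS = 2 (FMA) * shader cores * clock in GHz / 1000.
# The 3090 numbers are its official specs; the full-AD102 core count and boost
# clock are the rumored figures from the thread, used purely for illustration.
def fp32_tflops(cores, boost_ghz):
    return 2 * cores * boost_ghz / 1000

rtx_3090   = fp32_tflops(10496, 1.695)  # ~35.6 TFLOPS
full_ad102 = fp32_tflops(18432, 2.75)   # ~101 TFLOPS if it really boosts near 2.75 GHz
print(f"RTX 3090:   {rtx_3090:.1f} TFLOPS")
print(f"Full AD102: {full_ad102:.1f} TFLOPS ({full_ad102 / rtx_3090:.1f}x on paper)")
```

Raw TFLOPS overstate real-game scaling, so a ~2.8x paper number landing closer to the "ballpark twice as fast" estimate in practice would not be surprising.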

The people doing the leaks are almost always partner employees in China; Nvidia has no real control and would likely rather they not leak shit in case things change.
Remember the 16GB 3070Ti and 20GB 3080Ti?
People were disappointed and angry at Nvidia about products that were never even announced.

P.S. You can buy 13th-gen ES chips right now if you know where to look.... these aren't controlled leaks.
No benefit to them.
 