
AMD confirms Radeon 7000 (RDNA3) will have increased power consumption

If the 7900 XT has 54 TFLOPS at 350 W, I don't see the point of going Nvidia. DLSS will be useless.

It's sounding like the 7900 XT will have a lot more than 54 TF and consume much more than 350 watts, going by leaks and rumors anyway.

I'm personally guessing it'll be around 75 TF with power consumption between 425 and 450 watts. No solid idea on shader cores, though; rumors/leaks have bounced between 12,288 and 15,360 a few times.
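For what it's worth, the back-of-envelope math behind a ~75 TF guess looks like this (a minimal Python sketch; the clock speeds are my own illustrative assumptions, not leaked figures):

    # Rough FP32 throughput: TFLOPS = shader cores x 2 ops per clock x clock (GHz) / 1000.
    # The clocks below are assumptions picked to land near 75 TF, not real specs.
    def tflops(shaders: int, clock_ghz: float) -> float:
        return shaders * 2 * clock_ghz / 1000

    print(tflops(12_288, 3.05))  # ~75.0 TF if the smaller config clocks very high
    print(tflops(15_360, 2.45))  # ~75.3 TF if the bigger config clocks lower

Either shader count can hit that ballpark; it just shifts how aggressive the clock has to be.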

As long as there are IPC improvements, console mid-gen updates might just opt not to draw as many watts and run at lower frequencies. If the rumors for the 4090 are true, it might go past the 100 TFLOP threshold while the PS5 is sitting at 10 TFLOPS. That's quite a generational leap; a PS5 Pro probably doesn't have to go past 20-25 TFLOPS.

Just keep in mind TF aren't everything when it comes to gaming performance, not even close. Even if/when mesh shading becomes more prominent in the graphics pipeline (that's one area where TF can matter, because it's compute-driven and TF measure compute capability), you'll still need fixed-function processing, still need ROPs & TMUs, still need your primitive units, etc., because different things may benefit from different functions.

It'd be funny (to me, anyway) if the 4090 is 100 TF but only has, say, 450 Gpixel/s of pixel fillrate. I mean, the 3090 is already over 3x the TF of the PS5 but barely does 40 Gpixel/s more in pixel fillrate.
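A quick sketch of where that fillrate gap comes from, using theoretical peak numbers (the ROP counts and clocks are approximate figures from public spec sheets as I recall them):

    # Theoretical pixel fillrate ~= ROPs x clock (GHz), giving Gpixel/s.
    def fillrate_gpix(rops: int, clock_ghz: float) -> float:
        return rops * clock_ghz

    print(fillrate_gpix(112, 1.70))  # RTX 3090 at boost clock: ~190 Gpixel/s
    print(fillrate_gpix(64, 2.23))   # PS5: ~143 Gpixel/s

Roughly a 40-50 Gpixel/s gap despite the 3090's 3x compute advantage, which is the point being made above.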

Nvidia TFLOPS are worth roughly half of AMD TFLOPS in practice nowadays, so comparing 4090 Ti performance directly to the PS5 would be more like 50 TF vs 10 TF.

7900 XT will be 46-54 TFLOPS, probably at 400 W.
7800 XT will be 36-42 TFLOPS, probably at 300 W.

Even if you consider possible Pro models on RDNA4, this gen is not going to catch these cards.

There's no way the 7900 XT comes in that low on performance unless they set the clock to like 1.8 GHz, or it can only boost as high as 2.2 GHz, and that's with a 12,288 shader core model. RDNA2 can already clock much higher; even the 6900 XT can boost to 2.5 GHz.
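To make that concrete, you can invert the TFLOPS formula and see what clocks the quoted 46-54 TF range actually implies for a 12,288-shader part (plain arithmetic, no leaked specs):

    # Implied clock = TFLOPS x 1000 / (2 ops per clock x shader cores), in GHz.
    def implied_clock_ghz(tflops: float, shaders: int) -> float:
        return tflops * 1000 / (2 * shaders)

    print(implied_clock_ghz(46, 12_288))  # ~1.87 GHz for the low end of the range
    print(implied_clock_ghz(54, 12_288))  # ~2.20 GHz for the high end

Both implied clocks are at or below what RDNA2 already ships at, which is why the 46-54 TF estimate looks too conservative.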

And again, it's not just about TFs. The consoles can make GPU, CPU, and I/O customizations that PC GPUs cannot, which can help even the score in some cases. And unless a game is massively reliant on compute-side processing, TF isn't going to matter that much.
 

twilo99

Member
There is no way to keep up with Nvidia unless they up the watts...

I think a 7700 XT will be able to outperform a 6900 while using less power.

IF any of the numbers are actually right, yes... that would be great for the industry.
 
Odd that these big corps are pushing out these power-hungry components while everyone is freaking out about the environment and micromanaging their lives around it.

Forgetting the polar bears for a second, the cost per kWh alone right now makes these a big no, and I'm hesitant to even fire up my rig with a GTX 1080.
 

mitchman

Gold Member
Hotter systems? I need to move to Norway.
Won't help you. My apartment in downtown Oslo is 30°C inside during summer, and there's no air con installed anywhere.
 
Odd that these big corps are pushing out these power-hungry components while everyone is freaking out about the environment and micromanaging their lives around it.

Lol, fools.

Forgetting the polar bears for a second, the cost per kWh alone right now makes these a big no, and I'm hesitant to even fire up my rig with a GTX 1080.

Guessing you're either staying in a hut or living in Britain, where air conditioners are the enemy.
 

KungFucius

King Snowflake
What is the trend in GPU power up till now? It has always been increasing as far as I can recall. My first GPUs got their power from the PCI and/or AGP bus. At some point they needed 6-pin, then 8-pin, then 12-pin add-ons. Even higher-end desktop CPUs are more power hungry. Is this next gen going to be worse relative to the increase in capability, or do we have better competition causing each company to design higher-power GPUs to achieve the performance they feel they need to satisfy the market?
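For reference on that connector progression, here's a rough sketch of the per-source power budgets as I understand the PCIe specs (ballpark figures; the old PCI/AGP slot-only era was lower still):

    # Approximate maximum board power by power source (PCIe spec ballpark figures).
    POWER_BUDGET_W = {
        "PCIe slot only": 75,
        "slot + 6-pin": 75 + 75,
        "slot + 8-pin": 75 + 150,
        "slot + 2x 8-pin": 75 + 300,
        "slot + 12VHPWR (16-pin)": 75 + 600,
    }
    for source, watts in POWER_BUDGET_W.items():
        print(f"{source}: up to ~{watts} W")

Each new connector roughly doubles the available budget, which tracks with the trend described above.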
 

JackSparr0w

Banned
What is the trend in GPU power up till now? It has always been increasing as far as I can recall. My first GPUs got their power from the PCI and/or AGP bus. At some point they needed 6-pin, then 8-pin, then 12-pin add-ons. Even higher-end desktop CPUs are more power hungry. Is this next gen going to be worse relative to the increase in capability, or do we have better competition causing each company to design higher-power GPUs to achieve the performance they feel they need to satisfy the market?
Yeah, but it has now gotten to the point where power draw can cause noticeable problems. When a GPU produces enough heat to keep a small bedroom warm in the middle of winter, you kind of know things have gone a little too far.
 

Trogdor1123

Member
This is getting crazy. They need to start making them much more efficient. I don't want some hippy government forcing it, either. Consumers should start to act, especially with the cost of power.
 