
RDNA 3 To Offer More Than 50% Performance Per Watt Over RDNA 2, Confirmed To Be Chiplet Based.

AMD has confirmed RDNA 3 will offer more than 50% performance per watt over RDNA 2.

Confirmed to be chiplet based and built on a 5nm process.

Rearchitected Compute Unit

Offers an “optimised graphics pipeline” and “Next-Generation Infinity Cache.”

RDNA 4 confirmed to land in 2024.









EDIT: My first ever thread, go easy on me, folks.
 
It does have machine learning/deep learning/neural cores or whatever, right? That way AMD can do their own DLSS down the line.
 

Dream-Knife

Banned
7nm to 5nm yields 40% better performance alone.

So basically 10% performance gains from RDNA 3 over RDNA 2.

A 50% perf-per-watt increase is not enough to justify new Pro consoles. That's 15 TF vs 10 TF.

The PS4 Pro doubled the TF.
15TF is nearly an RX 6800. That would be a pretty good upgrade for consoles. It won't make people happy, though, as they'll just compare it to high-end PC parts and complain.
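The 10 TF vs 15 TF figures being debated here are just perf-per-watt arithmetic. A quick sketch, using the posters' numbers (PS5 at roughly 10.28 FP32 TF, AMD's ">50%" claim taken at exactly 50%), not official specs:

```python
# Back-of-envelope check of the ~15 TF estimate discussed above.
# Assumption: same power budget as the PS5, and the ">50%" claim
# taken at exactly 50%.
ps5_tf = 10.28  # PS5 FP32 teraflops
gain = 1.5      # 50% better performance per watt

pro_tf_same_power = ps5_tf * gain
print(f"{pro_tf_same_power:.2f} TF at the same power budget")
```

Which is where the "15 TF vs 10 TF" framing in the post comes from.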
 

SlimySnake

Flashless at the Golden Globes
Wait, just perf per watt? No perf-per-clock increase? That's disappointing.

Still, perf per watt means they can run the PS5 much cooler if they ever decide to make a slim model. I really don't know how they can get a 20 tflops PS5 Pro without going over 250 watts, even with this 50% perf-per-watt increase.

For PCs, this is going to be fun, because the 6900 XT is already only 260 watts or so, while the 3080 12 GB models can go up to 400 watts. Even the 10 GB ones can hit 320 watts. So if their 80 CU card comes in around 170 watts, they can potentially aim for 100-160 CUs and still come in under 400 watts. That's probably how they get to 50 tflops.
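The CU scaling in that last paragraph assumes power grows linearly with CU count, which is a simplification (real GPUs don't scale that cleanly, and the 80 CU / 170 W card is the poster's hypothetical). As a sketch:

```python
# Sketch of the CU/wattage scaling argument above.
# Assumptions: a hypothetical 80 CU RDNA 3 card at 170 W, and power
# scaling linearly with CU count (a simplification for illustration).
watts_80cu = 170
for cus in (100, 120, 160):
    print(f"{cus} CUs -> ~{watts_80cu * cus // 80} W")
```

Even the 160 CU extreme stays under the 400 W ceiling the poster mentions, which is the point of the argument.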
 

Buggy Loop

Member
Sounds good. Just don’t get all stupid with the pricing.

Or, you know, actually manufacture them.

The Steam hardware survey shows they are nearly non-existent. The 6800 XT is at 0.15%.


To me this screams paper launch; even a year and a half later, it didn't make a dent.
 

Dream-Knife

Banned
For PCs, this is going to be fun, because the 6900 XT is already only 260 watts or so, while the 3080 12 GB models can go up to 400 watts. Even the 10 GB ones can hit 320 watts. So if their 80 CU card comes in around 170 watts, they can potentially aim for 100-160 CUs and still come in under 400 watts. That's probably how they get to 50 tflops.
The 6900 XT stock is 300 W. The Nitro+ is 320 W, the Red Devil is ~380-400 W. The EVGA FTW3 and MSI Suprim X 3080 10 GB use 380 W stock.

Or, you know, actually manufacture them.

The Steam hardware survey shows they are nearly non-existent. The 6800 XT is at 0.15%.


To me this screams paper launch; even a year and a half later, it didn't make a dent.
Microcenter is full of them. When I picked my card up, all you could get was AMD. People just don't want them.
 
I hope RT performance doesn't suck again.

I'm sure it's been properly implemented; they (AMD) did a whole video on addressing ray tracing issues with their current graphics.

They'll definitely have more dedicated silicon to accelerate RT instead of relying on TMUs (the RDNA 2 implementation). It's similar to what Nvidia is doing.
 
7nm to 5nm yields 40% better performance alone.

So basically 10% performance gains from RDNA 3 over RDNA 2.

A 50% perf-per-watt increase is not enough to justify new Pro consoles. That's 15 TF vs 10 TF.

The PS4 Pro doubled the TF.
The Pro sold for $399 back then and was their cheap mid-gen option. They won't make the same mistake with the PS5 Pro. Expect 20TF minimum (I'd say around 22TF) and probably $600.

Besides, they'll have to double the CUs or BC won't be possible (or would be incredibly hard to do).
 

SlimySnake

Flashless at the Golden Globes
The 6900 XT stock is 300 W. The Nitro+ is 320 W, the Red Devil is ~380-400 W. The EVGA FTW3 and MSI Suprim X 3080 10 GB use 380 W stock.


Microcenter is full of them. When I picked my card up, all you could get was AMD. People just don't want them.
In game, and I have seen dozens of benchmarks, it hovers around 260 watts. I have posted several videos on GAF about this.
 
The Pro sold for $399 back then and was their cheap mid-gen option. They won't make the same mistake with the PS5 Pro. Expect 20TF minimum (I'd say around 22TF) and probably $600.

Besides, they'll have to double the CUs or BC won't be possible (or would be incredibly hard to do).

72 CUs is a lot of silicon for a game console, especially one that's mass produced; there will be yield implications as well.

However, if they pull it off, I won't be complaining.
 
You are misreading the stats published by TSMC.
It's either a 40% power reduction at the same clock, or a 20% faster clock speed at the same power.

They will be increasing the number of transistors while keeping the die size the same as the PS5's.

Maybe slightly increasing clocks too. 15TF looks to be the ballpark for a PS5 Pro. Not worth it.
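The two N7-to-N5 options from the TSMC correction above are mutually exclusive operating points, not additive gains. A small sketch, using the PS5's 2.23 GHz GPU clock purely for illustration:

```python
# TSMC's stated N7 -> N5 scaling, as described above: ~40% less power
# at the same clock, OR ~20% higher clock at the same power (not both).
base_clock_ghz = 2.23  # PS5 GPU clock, used only as an illustrative baseline

power_at_same_clock = 1.0 * (1 - 0.40)   # normalized power, iso-performance
clock_at_same_power = base_clock_ghz * 1.20
print(f"iso-clock: {power_at_same_clock:.2f}x power")
print(f"iso-power: {clock_at_same_power:.2f} GHz")
```

Hence the earlier "40% better performance alone" reading overstates what the node shrink gives by itself.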
 

winjer

Gold Member
They will be increasing the number of transistors while keeping the die size the same as the PS5's.

Maybe slightly increasing clocks too. 15TF looks to be the ballpark for a PS5 Pro. Not worth it.

Nothing indicates a change to RDNA 3 for consoles. In fact, quite the opposite.
A wafer in N5 costs more than 80% more than one in N7, and all process nodes have risen in price at TSMC over the last couple of years.
That is too expensive for a Pro console, or even a node reduction.

Sony is ramping up production on N7, and might eventually go for N6, since that is just an optimized N7.
But N5 is just too much in this market.
 

Buggy Loop

Member
Microcenter is full of them. When I picked my card up, all you could get was AMD. People just don't want them.

And that's the second problem: false MSRP. You can expect AIBs to be priced a bit higher than reference cards, especially in this crazy market, but AMD AIBs are batshit insane. Basically, people are waiting on AMD drops.
 

SlimySnake

Flashless at the Golden Globes
The Pro sold for $399 back then and was their cheap mid-gen option. They won't make the same mistake with the PS5 Pro. Expect 20TF minimum (I'd say around 22TF) and probably $600.

Besides, they'll have to double the CUs or BC won't be possible (or would be incredibly hard to do).
Yeah, I can definitely see them releasing the Pro version for $599. They will likely want the same performance as the 72 CU 6800 XT, which is 20 tflops, but even with a node shrink that's going to be expensive, and the TDP will be really high, around 250 watts, since it will need to be paired with a Zen 3 CPU with its own power consumption. For $500, cooling 250 watts is going to be a pain. But for $600 they could probably use vapor chamber cooling instead of that big heatsink brick. I hope they go that route.

Found a couple of videos of 6800 XT (20 tflops) and 6600 XT (10 tflops) power consumption, and I don't think 50% gets them from 250 watts to the 130-150 watts we see in the benchmarks. 50% perf per watt gets them down to 170 watts for the GPU alone. That's way higher than the 130-140 watts we see here for the 10.6 tflops 6600 XT, which hits way higher clocks than the PS5.

Dream-Knife Looks like the 6900 XT is indeed 270-300 watts consistently. I stand corrected. I don't know which benchmarks I was looking at before. Must have misread the numbers.
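The 170 W figure in the post above follows directly from the perf-per-watt claim: delivering the same performance at 1.5x perf/watt takes 1/1.5 of the power. As a check, using the poster's 250 W estimate for a 20 TF RDNA 2 GPU:

```python
# Same performance at 1.5x performance per watt -> 1/1.5 of the power.
# 250 W is the poster's estimate for a 20 TF RDNA 2 GPU, not a spec.
rdna2_gpu_watts = 250
rdna3_gpu_watts = rdna2_gpu_watts / 1.5
print(round(rdna3_gpu_watts))
```

That lands at roughly 167 W, which is why the poster argues it's still well above the 130-150 W console-class range.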

 

Crayon

Member
My next GPU is whatever is good when ES6 drops. Preferably AMD, so I hope they catch up on the ray tracing. Even the sprinkles of ray tracing games have now are really nice.
 

Dream-Knife

Banned
In game, and I have seen dozens of benchmarks, it hovers around 260 watts. I have posted several videos on GAF about this.
It depends on the game. I have my card set to 400w, but if I'm playing Elden Ring it will only pull 95w for example.
And that's the second problem: false MSRP. You can expect AIBs to be priced a bit higher than reference cards, especially in this crazy market, but AMD AIBs are batshit insane. Basically, people are waiting on AMD drops.
Nvidia cards had dumb prices in Microcenter at the time as well. People just didn't want the AMD cards.

Yeah, I can definitely see them releasing the Pro version for $599. They will likely want the same performance as the 72 CU 6800 XT, which is 20 tflops, but even with a node shrink that's going to be expensive, and the TDP will be really high, around 250 watts, since it will need to be paired with a Zen 3 CPU with its own power consumption. For $500, cooling 250 watts is going to be a pain. But for $600 they could probably use vapor chamber cooling instead of that big heatsink brick. I hope they go that route.

Found a couple of videos of 6800 XT (20 tflops) and 6600 XT (10 tflops) power consumption, and I don't think 50% gets them from 250 watts to the 130-150 watts we see in the benchmarks. 50% perf per watt gets them down to 170 watts for the GPU alone. That's way higher than the 130-140 watts we see here for the 10.6 tflops 6600 XT, which hits way higher clocks than the PS5.

Dream-Knife Looks like the 6900 XT is indeed 270-300 watts consistently. I stand corrected. I don't know which benchmarks I was looking at before. Must have misread the numbers.


It wouldn't have to be paired with a Zen 3, and tbh I don't see how that would be a benefit to the system, as games are GPU limited and still have to run on the base console.
 
My wallet is ready for an Xbox Series X Elite with RDNA 3.

But we barely have games taking full advantage of the Series X (and PS5) as-is.

Anyway I'd like to pick one of these up for an eGPU/desktop swap setup if the price is right particularly on the lower end. I really want to hear more about power consumption, core counts, cache sizes etc. as well.
 

Tripolygon

Banned
They will be increasing the number of transistors while keeping the die size the same as the PS5's.

Maybe slightly increasing clocks too. 15TF looks to be the ballpark for a PS5 Pro. Not worth it.
You keep saying not worth it like that's up to you to decide. Pro consoles are inevitable, and they will no doubt employ the newer architecture. 15TF at the same price as current consoles is a great value proposition not only for flat screen gamers but also PSVR 2 owners who may want higher resolution.
 
You keep saying not worth it like that's up to you to decide. Pro consoles are inevitable, and they will no doubt employ the newer architecture. 15TF at the same price as current consoles is a great value proposition not only for flat screen gamers but also PSVR 2 owners who may want higher resolution.

50% is not a big enough jump to justify a split dev-environment target.
 

Corndog

Banned
You keep saying not worth it like that's up to you to decide. Pro consoles are inevitable, and they will no doubt employ the newer architecture. 15TF at the same price as current consoles is a great value proposition not only for flat screen gamers but also PSVR 2 owners who may want higher resolution.
Not a big enough jump.
 
>50% perf per watt? That's pretty insane to me. Sounds amazing. Too bad they didn't say a word about the RT performance. That doesn't sound good.
 

SmokedMeat

Gamer™
Or, you know, actually manufacture them.

The Steam hardware survey shows they are nearly non-existent. The 6800 XT is at 0.15%.


To me this screams paper launch; even a year and a half later, it didn't make a dent.
Nvidia has an iron grip on the market. AMD’s GPUs have been readily available in my area, but prices were absurd. Not sure what they’re like now, as I wound up getting an Nvidia card.

I don’t think we’ll see a paper launch. We just can’t have miners clamoring for them.
 

CrustyBritches

Gold Member
Pro consoles aren't dropping anytime soon. Expect 2025 at the earliest (and a ten year generation).
Yeah, RDNA3 is this year, so I don’t really see how that lines up with Pro consoles. Although I have doubts there will even be Pro consoles, it would make more sense if they came with RDNA4 in 2024.
 

Dirk Benedict

Gold Member
Pro consoles aren't dropping anytime soon. Expect 2025 at the earliest (and a ten year generation).
Maybe... what with one of China's top economists calling for China to take Taiwan as a strategic chip play? He even acknowledged that the U.S. was transferring the ability to make these chips to its own soil.
 
Because you'll buy it, and they can push the resolution higher. More and more people buy 8K TVs every year.
4K is not even the standard yet. What kind of console hardware would we be looking at to run 8K content? The PS5 won't hit native 4K on next-gen exclusives. We're looking at 1440p for games, at best.
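The resolution gap being argued about here is easy to quantify in raw pixels:

```python
# Raw pixel counts behind the 1440p / 4K / 8K comparison above.
resolutions = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 8K is 4x the pixels of 4K; 4K is itself 2.25x the pixels of 1440p.
```

So 8K is a 9x pixel jump over the 1440p that current consoles often actually render, which is why the reply is skeptical.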
 