
RDNA 3 GPUs Confirmed For Launch on 3rd November

GreatnessRD

Member
Eagerly awaiting this for my new PC build.

When is Nvidia’s supposed to be?
The 40 Series press conference is in 2 hours. We'll see shortly!

And... it's showtime, folks! Techmas has come! I am really curious to see what AMD has been able to do with the MCM design. Hoping the raytracing is on par with Nvidia this go-around. Exciting times, folks!
 

SlimySnake

Flashless at the Golden Globes
100 tflops is so close I can feel it.

That said, I hope they invest in machine learning and dedicated RT cores rather than just pushing standard rasterization performance.
 

winjer

Gold Member
Sort out your RT performance, AMD, and then I'll at least consider your cards... but not until then.

There were some leaks claiming RDNA3 would increase RT performance by 2.5X
But this is just some supposed leak, so....

Also, I expect them to add some sort of tensor units, similar to Nvidia and Intel.
 
Exactly. Killing the nvidia hype … sadly they are doing it wrong lol.

If you really wanted to kill the hype, you'd release a figure that beats the 4090 or is 2x more powerful than a 3090 Ti, etc.,
for $99 and with just 23W power draw, including free lifetime subscriptions to GP, PS+, Prime, Stadia, Netflix, Disney+, HBO, Spotify and daily foot massages from Lisa, Phil and Jimbo.

While the performance crown is important for marketing, having competitive products in the lower segments seems way more interesting to me with those insane prices on the upper end.
 

GreatnessRD

Member
Jacket Man has lost his got damn mind with those prices. The 16GB 4080 version will probably come close to FE 4090 prices with the AIB cards. Fuckin' stupid, but you knew this was gonna happen. Covid Era showed people were willing to spend money wildly. Just hope he and the investors understand that the stimmy money is long gone. I expect the High-end enthusiast will grab the 4090 and top tier 4080 as expected, but the regular folks? I don't know, lol.
 

hlm666

Member
Not sure AMD is going to be much better than NV in the power department. This seems to suggest they are going to have similar power requirements.

 

Hot5pur

Member
The 4090 has a power draw similar to the 3090 but double the performance. So effectively a performance-per-watt jump of 100 percent? I guess we'll have to see whether those benchmarks were done at 600 W or 450 W.

Either way, only a 50 percent performance-per-watt bump for AMD likely won't suffice. They will increase the power too. If they go up 100 W relative to the 6900 XT, that would give them roughly 2x (1.5x from the node times 1.33x from the power increase), which would be competitive with Nvidia.

The question is how good raytracing performance will be. I don't expect they will forgo the opportunity to price their cards as obnoxiously as Nvidia; likely slightly cheaper. They know Nvidia can cut prices, so AMD will likely just follow suit and ride the high margins rather than gain market share.
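The napkin math in that post can be sketched out quickly. Note the ~300 W figure for the 6900 XT is an assumption here (its reference board power), not something quoted above:

```python
# Rough perf/watt scaling estimate, using the post's claimed numbers
# (not official specs): +50% perf/W, +100 W over a ~300 W 6900 XT.
perf_per_watt_gain = 1.5   # rumored RDNA 3 perf/W improvement
power_6900xt = 300         # assumed 6900 XT board power in watts
power_increase = 100       # hypothetical extra watts for the new flagship

power_ratio = (power_6900xt + power_increase) / power_6900xt  # ~1.33x
total_gain = perf_per_watt_gain * power_ratio                 # ~2.0x

print(f"Power ratio: {power_ratio:.2f}x, total perf gain: {total_gain:.2f}x")
```

That lines up with the post's "1.5 from node times 1.33 from power gives 2x" reasoning, which is why a 50% perf/W bump alone wouldn't close the gap without raising power.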
 

hlm666

Member
The 4090 has a power draw similar to the 3090 but double the performance. So effectively a performance-per-watt jump of 100 percent? I guess we'll have to see whether those benchmarks were done at 600 W or 450 W.

Either way, only a 50 percent performance-per-watt bump for AMD likely won't suffice. They will increase the power too. If they go up 100 W relative to the 6900 XT, that would give them roughly 2x (1.5x from the node times 1.33x from the power increase), which would be competitive with Nvidia.

The question is how good raytracing performance will be. I don't expect they will forgo the opportunity to price their cards as obnoxiously as Nvidia; likely slightly cheaper. They know Nvidia can cut prices, so AMD will likely just follow suit and ride the high margins rather than gain market share.
I don't think they have an answer for the 4090; they seem to be just letting Nvidia totally have its own way with its launch. You would think if they had something to run against it, there would have been some leaks or tweets and stuff like there are with CPUs. The November launch means they might have something to deal with the 4080 16GB, seeing as that's supposed to launch that month. Only another few weeks to find out which way I'm going anyway.
 

GHG

Gold Member
Not sure AMD is going to be much better than NV in the power department. This seems to suggest they are going to have similar power requirements.


If true, this will end up being closer than most people anticipate, considering how much power the 6900 and 6950 XT consume.

 

GymWolf

Member
If they launch on Nov 3, does that mean the presentation is some days before that? Or do they present and launch on the same day?
 

Xyphie

Member
Not sure AMD is going to be much better than NV in the power department. This seems to suggest they are going to have similar power requirements.


[Image: PCB.jpg]


Based on the PCB we have seen from Igor's Lab, we should expect the top-end card to use upwards of 375-450W, based on the 3x 8-pin connectors (150W each) alone, with possibly an additional 75W from the PCIe slot. I don't really see why some people have convinced themselves the 7900 XT will be massively more power efficient than the 4090: both will be made on the same TSMC N5P-derived node, so there will be process parity, unlike RDNA2 vs. Ampere (and even then Ampere compared favorably).
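For what it's worth, the connector budget that estimate comes from is simple to tally. The 150 W and 75 W figures are the per-connector spec limits, so this is a theoretical ceiling, not an actual measured draw:

```python
# Theoretical max board power from the power delivery seen on the PCB,
# using PCIe spec limits per connector (a ceiling, not a real-world TBP).
EIGHT_PIN_W = 150   # spec limit per 8-pin auxiliary connector
PCIE_SLOT_W = 75    # spec limit for the x16 slot itself

num_eight_pin = 3   # 3x 8-pin seen on the leaked PCB
max_board_power = num_eight_pin * EIGHT_PIN_W + PCIE_SLOT_W

print(f"Connector budget: {max_board_power} W")  # 525 W ceiling
```

Cards rarely run at their full connector budget, which is why the post's 375-450W estimate sits well below that 525 W ceiling.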
 

phil_t98

#SonyToo
Are any of the RDNA3 features gonna pop up on PS5 and Xbox Series consoles? I know the chipset will be different, but I wonder if some features will find their way over.
 

MarkyG

Member
RDNA3 needs to pull off some magic here. Obi-Wan isn't our only hope to prevent the Dark Lord Nvidi-sidius from prevailing!
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
RDNA3 needs to pull off some magic here. Obi-Wan isn't our only hope, to prevent the Dark Lord Nvidi-sidius from prevailing!
In pure raster AMD was already matching Nvidia, in price/performance they were beating Nvidia.
Nvidia wins in DLSS, DLAA, Raytracing, Freestyle and Mindshare.

AMD needs to steal mindshare this generation with good RT, hopefully some ML hardware, and a damn good price.
Their Ryzen moment hasn't happened in the GPU space just yet... we are still waiting for it.


Unfortunately I'm locked into the CUDA/OptiX workspace, so I can't really support AMD directly, but for any friends who need a machine built for them, assuming the prices are reasonable, I will probably be recommending Ampere and/or RDNA3.
 

Kenpachii

Member
The 6000 series was pretty disappointing, so let's hope they manage to push some more performance forward with the 7000 series. With Nvidia doubling performance over a 3080, and DLSS 3.0 basically quadrupling it over a 3080, it doesn't look good for AMD.

Let's see what they got.
 

hlm666

Member
In pure raster AMD was already matching Nvidia, in price/performance they were beating Nvidia.
Nvidia wins in DLSS, DLAA, Raytracing, Freestyle and Mindshare.

AMD needs to steal mindshare this generation with good RT, hopefully some ML hardware, and a damn good price.
Their Ryzen moment hasn't happened in the GPU space just yet... we are still waiting for it.


Unfortunately I'm locked into the CUDA/OptiX workspace, so I can't really support AMD directly, but for any friends who need a machine built for them, assuming the prices are reasonable, I will probably be recommending Ampere and/or RDNA3.
Their Ryzen moment happened when Nvidia went to Samsung this gen, and they failed to take advantage of it. It gave them a massive frequency advantage, which they lose this time, because Nvidia were not going to do an Intel and keep making the same mistake for multiple generations. I'm waiting for the announcements personally, because I will buy AMD myself if they live up to expectations, but the more info that comes out, the more it seems they are not going to change the landscape much.
 

FireFly

Member
Their Ryzen moment happened when Nvidia went to Samsung this gen, and they failed to take advantage of it. It gave them a massive frequency advantage, which they lose this time, because Nvidia were not going to do an Intel and keep making the same mistake for multiple generations. I'm waiting for the announcements personally, because I will buy AMD myself if they live up to expectations, but the more info that comes out, the more it seems they are not going to change the landscape much.
AMD got the frequency advantage going from RDNA 1 to RDNA 2 on 7nm. Now Nvidia are merely catching up to RDNA 2 despite having full process advantage. It remains to be seen what kind of clocks RDNA 3 can hit.
 

winjer

Gold Member
RDNA 1 and RDNA 2 were both on TSMC's N7 node. The reason RDNA2 clocks much higher is that AMD sent a team of CPU engineers to RTG to help improve the execution pipeline.
They managed to remove limits at several critical points of the RDNA2 pipeline, allowing it to clock much higher.

NVidia has been on similar clock speeds for several generations now. Ever since Pascal.
Pascal, Turing and Ampere were all on different process nodes, and almost the same clock speed. I doubt that Samsung's 8nm was the reason for Ampere clocking much lower than RDNA2.
Ada Lovelace is doing the same thing that AMD did with RDNA2. But also pushing a lot of voltage to push clocks higher.

There are rumors that RDNA3 can clock close to 4 GHz. But that is something we'll have to wait to confirm.
 

Sanepar

Member
RDNA3 needs to pull off some magic here. Obi-Wan isn't our only hope, to prevent the Dark Lord Nvidi-sidius from prevailing!
Depends. If you care about BS like DLSS and RT, you'll keep going Nvidia. If you care about raster and price, AMD will easily deliver.
 

b0uncyfr0

Member
Regarding the usual comparison, I think RT is still too young and taxing to be worthwhile.

I will pay a bit more for RT, but not much (no more than 10%). With FSR and DLSS, there is almost no difference now, at least nothing I can see unless squinting. What I don't like is AMD's approach. It almost seems like they're saying 'our products are on the same level' when they're not.

You're inferior, AMD. Price your products accordingly and give us a reason to jump onto team Red.
 

Buggy Loop

Member
They didn’t “kill” Ampere at rasterization with RDNA 2. It was on par or even worse; the 3080 had an overall +3% average as soon as you went beyond 1080p.

Which is massively disappointing, considering the sacrifices they made to their RT and ML to actually have better rasterization performance than the competition. It should have been much better.
 

b0uncyfr0

Member
They didn’t “kill” Ampere at rasterization with RDNA 2. It was on par or even worse; the 3080 had an overall +3% average as soon as you went beyond 1080p.

Which is massively disappointing, considering the sacrifices they made to their RT and ML to actually have better rasterization performance than the competition. It should have been much better.
I think this is not the case anymore at 1080p/1440. AMD cards are beating their equivalents (price-wise) quite handily. It changes when it comes to 4K though.
 