
AMD RDNA 5 To Be A Completely New GPU Architecture From The Ground Up, RDNA 4 Mostly Fixes RDNA 3 Issues & Improves Ray Tracing







My brother's 5700 XT had 2 years of black screens before it was fixed

The 7000 series had 8 months of worse performance than the 6000 series in VR before being fixed. How is that acceptable?


Jake Gyllenhaal No GIF

"AMD drivers are as good as Nvidia" is basically the same thing as "Gamepass increases game sales"
 

ABnormal

Member
I mean, if Tom Henderson says PS6 is 2028, then we have no choice but to believe him, given his immaculate track record of accurately leaking the PlayStation Portal, PlayStation 5 Slim, PlayStation 5 Pro and various PlayStation 5 accessories.

Regarding console hunger, well, that's what the PS5 Pro will be all about: giving a much higher-end way to experience the games designed for the PS5, which will remain the target console hardware for third-party developers until the arrival of the PS6.

It is also worth noting that PlayStation Studios have their roadmap for what remains of this generation, and likely the cross-gen period with the next one, locked down by now. That means that even if Sony wanted to bring forward the release of the PS6, they wouldn't really have the next-gen games to support its launch, so they ultimately wouldn't proceed with such a plan.
Yeah, I remember his statement. I still hope that he only got some provisional plans and that there's no rigidly fixed timeline, because this would be a very long one, especially considering that we are at the start of an AI revolution which will heavily impact the way graphics output (and other things) is managed. The PS5 Pro surely can satisfy some of that hunger, but it's just an introduction to it, and four years are LONG, technically speaking. Very long. In the coming years we will witness many breakthroughs, and the current gen will feel obsolete quickly, not simply from a rendering standpoint. But it's also true that a delay would lead to true utilization of this gen's capabilities through dedicated game development.
For sure, covid created a long delay in game development and we will feel that delay across most of this gen, so it's probably inevitable in some measure.
 

Loomy

Thinks Microaggressions are Real
That’s not how it works.

Sony may have a rough framework of specs, but they will be working with AMD in tandem on next gen architecture changes
Yeah that's kind of what I meant with the second sentence. "Planning" and "creating a custom version" may have been the wrong terms to use at this stage. You are of course right that they will be working on it together. My skepticism is on timelines.

Like I said, Sony may have had what you described as a 'rough framework' for at least 4 years now, and it's clear AMD's plans around their GPU business have shifted since they launched the RDNA 3 cards (probably partly why Scott Herkelman left late last year).

What isn't clear is how far into that process they are and how that has affected plans for RDNA 5. If those have recently shifted dramatically, that would affect any plans their partners started formulating 4 years ago. Who knows, maybe it affected them for the better, but I always lean towards the conservative estimates on these things.
 
IKR. One minute they are on track, the next they're just focusing on larger integrated chiplets. Who knows when they will release another round of desktop cards.

I am hoping they at least manage to compete at the level AMD is at, at some point. Options are always good.
Battlemage will launch this year for Intel. Guaranteed. The question is how good it will be vs RDNA 4.
 

Tams

Member
RDNA 5 is due in 2026. If the PS6 is 2028, which was the date we got from the FTC trial, then it should be RDNA 6.

Going by precedent, I reckon the PS6 will be a customised version of RDNA 5 with some elements of RDNA 6.

As RDNA 5 is a major change, that would be the minimum Sony would accept, but they'll also want something that has been tested out in the wild.

RDNA 6 might not be until 2028, and even 2027 would be too late for a 2028 PS6. However, some elements will be there: either ones AMD have planned for RDNA 6, or ones from their work with Sony on the PS6 that end up in RDNA 6.
 

HeisenbergFX4

Gold Member
That’s not how it works.

Sony may have a rough framework of specs, but they will be working with AMD in tandem on next gen architecture changes
Exactly, well, assuming their process is close to how Xbox goes about it.

Even before a current gen launched, Xbox already had some people ballparking what the next gen would look like for a set price.
 

DonkeyPunchJr

World’s Biggest Weeb
Their "Zen" moment against Nvidia

Ill Be Back Jim Carrey GIF


So fucking delusional. Fucking hell wccftech, such dumb bait.
It’ll be their “Zen moment” as in, building a completely new architecture instead of building on and adding new functionality to their existing architecture (similar to how Zen was a completely new design that replaced AMD’s previous mobile + desktop architectures)

Will it put them in striking distance of Nvidia and give them an architecture that can be built upon and overtake Nvidia like Zen did with Intel? Wouldn’t bet on it but I’d love to be wrong.
Here we go again... hope for AMD Radeon.... -.-
I’m already reading the “well we already knew RDNA4 was a stopgap, just wait for RDNA5, that’s when the next generation really begins, and it’s gonna be a monster if all those rumors are true!” posts in my head.

seriously though, nothing would make me happier than seeing AMD bitch slap Nvidia like they did with the 9700 Pro, but I’ll believe it when I see it.
 

simpatico

Member






My brother's 5700 XT had 2 years of black screens before it was fixed

The 7000 series had 8 months of worse performance than the 6000 series in VR before being fixed. How is that acceptable?


Jake Gyllenhaal No GIF

5-10% of my gaming hours in any given year are spent playing old Fallout games. And to think I was ready to give Radeon a chance as my next GPU...
 
It’ll be their “Zen moment” as in, building a completely new architecture instead of building on and adding new functionality to their existing architecture (similar to how Zen was a completely new design that replaced AMD’s previous mobile + desktop architectures)

Will it put them in striking distance of Nvidia and give them an architecture that can be built upon and overtake Nvidia like Zen did with Intel? Wouldn’t bet on it but I’d love to be wrong.

I’m already reading the “well we already knew RDNA4 was a stopgap, just wait for RDNA5, that’s when the next generation really begins, and it’s gonna be a monster if all those rumors are true!” posts in my head.

seriously though, nothing would make me happier than seeing AMD bitch slap Nvidia like they did with the 9700 Pro, but I’ll believe it when I see it.
RDNA 3 was hyped to be the "Zen" moment, with the first-ever chiplet-based GPU.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So they are basically sending RDNA4 out to die.....a brutal brutal death.



Intel at least has marketshare, so Raptor Lake-R, as aborted a CPU as it was, could sustain them until the true next-gen chip.


RDNA 4 is gonna sell peanuts... probably won't even be worth the shipping costs.
Who are AMD's major board partners for GPUs? MSI abandoned them, yeah?


As much as I've been looking forward to Battlemage, I've always been apprehensive about its ability to actually keep Intel happy enough for Celestial and Druid to make it to market.
But now I'm looking at Celestial and seriously thinking they might have enough marketshare at that point to be a serious contender for AMD with Druid.
 

hlm666

Member
I don't see this imaginary "Zen moment" happening for AMD; they're just not as good as Nvidia.

Even when they have the chance, like they did against Lovelace, they completely dropped the ball on price and performance.
It's not that AMD aren't as good; it's that Nvidia aren't Intel. They're not going to coast on their leader status for multiple generations like Intel did. AMD's moment was when Nvidia went with Samsung 8nm, and instead of punishing that, AMD shit the bed by selling to Sony the TSMC allocation they could have used to make more GPUs (and CPUs, but Intel was still failing, so they got away with that).

There won't be a Zen moment for GPUs until Nvidia screws up repeatedly like Intel.
 

winjer

Member
It's not that AMD aren't as good; it's that Nvidia aren't Intel. They're not going to coast on their leader status for multiple generations like Intel did. AMD's moment was when Nvidia went with Samsung 8nm, and instead of punishing that, AMD shit the bed by selling to Sony the TSMC allocation they could have used to make more GPUs (and CPUs, but Intel was still failing, so they got away with that).

There won't be a Zen moment for GPUs until Nvidia screws up repeatedly like Intel.

Nvidia slowed down a bit with the 8nm from Samsung, but they weren't slacking off like Intel did for a decade.
Intel slowed down a lot after Sandy Bridge. At best, we were getting 10% gen over gen. And things got much worse after Skylake, with zero IPC improvements for 3 generations.
And then there was the problem of Intel not adopting EUV, which meant that TSMC also managed to surpass them.
So in a few years, Intel went from being the leader in CPUs and in process nodes to being outclassed by TSMC, Samsung, AMD, Apple, ARM and Qualcomm.
And to make things even worse, servers started using GPUs, and Intel had nothing. So Nvidia started to gain a huge market share.
On top of that, Intel stopped the development of Larrabee and fired its main developer, who then went to Nvidia to make Tensor Cores and help create the huge AI boom we see today.
 
Going by precedent, I reckon the PS6 will be a customised version of RDNA 5 with some elements of RDNA 6.

As RDNA 5 is a major change, that would be the minimum Sony would accept, but they'll also want something that has been tested out in the wild.

RDNA 6 might not be until 2028, and even 2027 would be too late for a 2028 PS6. However, some elements will be there: either ones AMD have planned for RDNA 6, or ones from their work with Sony on the PS6 that end up in RDNA 6.
Yes, this is true; if it was Microsoft, it'd be easy to say it would be full RDNA 6 in 2028. My biggest worry is PS5 cross-gen and the reason to upgrade to a PS5. The console industry right now is in a very challenging environment, what with many people happy to play Apex, Fortnite and PUBG on PS4s instead of needing to upgrade to a PS5, and the lack of price reductions for the PS5 even in 2024.
 

Trogdor1123

Member
Everyone wants AMD to compete, we do. But our heads don't align with our hearts. I wouldn't hold my breath on this.
 

SmokedMeat

Gamer™






My brother's 5700 XT had 2 years of black screens before it was fixed

The 7000 series had 8 months of worse performance than the 6000 series in VR before being fixed. How is that acceptable?


Jake Gyllenhaal No GIF


I guess it you’re not playing VR games like me currently, you won’t notice?

But yes, AMD’s drivers are generally fine.
 

Ecotic

Member
PCGamer

Ultimately, then, I'll stick to what I said last time around. RDNA 4 and the Radeon RX 8000-series, as it will presumably be known, will be limited in scope and something of a stopgap. It'll be RDNA 5 in late 2025, or more likely 2026, that could be the last roll of the dice for AMD and its Radeon gaming graphics. If that's a flop, it's hard to see why AMD will keep investing in what AMD itself dismisses as a low-margin business. And so it could be adios for discrete Radeon GPUs on the PC.

Is there any truth to this? I read it this weekend and it worried me. I always figured there were good reasons for AMD to be in the discrete GPU gaming business even if they have low market share, like having current tech ready to go for Sony and Microsoft's console business, and powering handhelds like the Steam Deck or Rog Ally. And I imagine it reassures A.I. customers to know that in all aspects of the GPU business, AMD trades blows with Nvidia. To just bow out and cede it to Nvidia seems short-sighted in the big picture.

I'm genuinely asking, the PC GPU business isn't my specialty.
 

Dr.D00p

Member
I doubt Nvidia even wants AMD to exit the GPU market, as that would make them a monopoly by default, which would mean regulators could step in and demand the break-up of their various divisions.
 

Buggy Loop

Member
PCGamer



Is there any truth to this? I read it this weekend and it worried me. I always figured there were good reasons for AMD to be in the discrete GPU gaming business even if they have low market share, like having current tech ready to go for Sony and Microsoft's console business, and powering handhelds like the Steam Deck or Rog Ally. And I imagine it reassures A.I. customers to know that in all aspects of the GPU business, AMD trades blows with Nvidia. To just bow out and cede it to Nvidia seems short-sighted in the big picture.

I'm genuinely asking, the PC GPU business isn't my specialty.

I mean, I'm surprised Nvidia even bothers with dedicated GPUs now, and wonder for how long they will, since that silicon wafer could be better used to sell AI cards with much bigger margins.

Wouldn't be surprised if in 10 years Nvidia is out of dedicated GPUs. Streaming tech will likely be in another league compared to today, as well.
 

OmegaSupreme

advanced basic bitch
They need to get their shit together.
Reduce the power consumption and increase the raytracing performance drastically. Also fix FSR; it's nowhere near as good as DLSS.
I don't know why you got lols for this. FSR is dog shit compared to DLSS. AMD is going to have to do a lot to get my money as far as GPUs go.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I doubt Nvidia even wants AMD to exit the GPU market, as that would make them a monopoly by default, which would mean regulators could step in and demand the break-up of their various divisions.

I don't think you need to worry about Nvidia becoming a monopoly anytime soon.
 

winjer

Member
I mean, I'm surprised Nvidia even bothers with dedicated GPUs now, and wonder for how long they will, since that silicon wafer could be better used to sell AI cards with much bigger margins.

Wouldn't be surprised if in 10 years Nvidia is out of dedicated GPUs. Streaming tech will likely be in another league compared to today, as well.

I doubt Nvidia will leave the GPU market. It might not be as profitable as the AI clusters, but it is profitable.
More importantly, it's the entry gate for the Nvidia ecosystem. Be it computer science students or small businesses, they will use Nvidia GPUs, and CUDA and AI tools.
They will be trained on, and used to using, Nvidia software and hardware.
There is a reason why Microsoft used to give away licenses to students, schools and universities: it was to make sure that everyone knew how to use Windows, not MacOS, not Linux.
So when a company hired people, it was much cheaper to buy an enterprise license than to re-train everyone on some other OS.

But gaming GPUs will become a third focus for Nvidia, well behind the AI and professional markets.
Every architecture will be developed with AI and the professional market in mind, and then adapted for gaming.

I really doubt cloud streaming will ever not suck. Unless we find a way to break the laws of physics and transfer data faster than light, latency will always be much worse than on a local machine.
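That physical floor is easy to ballpark. A back-of-envelope sketch (the distances and the roughly 2/3 c propagation speed in optical fibre are illustrative assumptions; it ignores encoding, routing hops, and display latency entirely):

```python
# Lower bound on cloud-streaming round-trip latency from signal propagation alone.
# Assumptions (illustrative): light travels at ~2/3 of c in optical fibre,
# and the path to the datacentre is a straight run of fibre.

C_VACUUM_KM_S = 299_792                  # speed of light in vacuum, km/s
C_FIBRE_KM_S = C_VACUUM_KM_S * 2 / 3     # typical propagation speed in fibre

def min_round_trip_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time over fibre, in milliseconds."""
    return 2 * distance_km / C_FIBRE_KM_S * 1000

for d in (100, 500, 2000):               # player-to-datacentre distances, km
    print(f"{d:>5} km -> at least {min_round_trip_ms(d):.1f} ms round trip")
```

Even in the ideal case, a server 2000 km away costs roughly 20 ms of round trip before a single frame is captured or encoded, so streaming starts with a latency budget that local rendering simply doesn't have.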
 

Buggy Loop

Member
I doubt Nvidia will leave the GPU market. It might not be as profitable as the AI clusters, but it is profitable.
More importantly, it's the entry gate for the Nvidia ecosystem. Be it computer science students or small businesses, they will use Nvidia GPUs, and CUDA and AI tools.
They will be trained on, and used to using, Nvidia software and hardware.
There is a reason why Microsoft used to give away licenses to students, schools and universities: it was to make sure that everyone knew how to use Windows, not MacOS, not Linux.
So when a company hired people, it was much cheaper to buy an enterprise license than to re-train everyone on some other OS.

But gaming GPUs will become a third focus for Nvidia, well behind the AI and professional markets.
Every architecture will be developed with AI and the professional market in mind, and then adapted for gaming.

I really doubt cloud streaming will ever not suck. Unless we find a way to break the laws of physics and transfer data faster than light, latency will always be much worse than on a local machine.

Good points for entry point.

10 years is a long time with what's coming on the horizon for AI, though. The concept of a CPU/GPU could drastically change and dissolve into some AI-focused compute that just interprets what the graphics should look like. I think the battle of raw rendering, trying to do path tracing and the like, is gonna be pushed towards AI.

Streaming too, who knows really, but even GeForce Now I found totally playable when I tried it. Better than local? No, not on PC at least, but it does have lower latency in many tests and games than even consoles. But again, 10 years is a long damn time. Let's see.
 

winjer

Member
Good points for entry point.

BTW, did you see Nvidia saying that their GPUs are better than NPUs?
With Intel and AMD adding NPUs to their CPUs, there is the potential for the new entry point for AI to become things like Strix Halo and Arrow Lake.
Nvidia must see this as a potential threat, if they have to publicly reinforce that their GPUs are the best for AI.

 
Nvidia slowed down a bit with the 8nm from Samsung, but they weren't slacking off like Intel did for a decade.
Intel slowed down a lot after Sandy Bridge. At best, we were getting 10% gen over gen. And things got much worse after Skylake, with zero IPC improvements for 3 generations.
And then there was the problem of Intel not adopting EUV, which meant that TSMC also managed to surpass them.
So in a few years, Intel went from being the leader in CPUs and in process nodes to being outclassed by TSMC, Samsung, AMD, Apple, ARM and Qualcomm.
And to make things even worse, servers started using GPUs, and Intel had nothing. So Nvidia started to gain a huge market share.
On top of that, Intel stopped the development of Larrabee and fired its main developer, who then went to Nvidia to make Tensor Cores and help create the huge AI boom we see today.

Intel recently bought up ASML's entire stock of the newest EUV machines for 2024; it seems like they want to compete more seriously on the foundry front. Each machine retails for around $300 million, I think. I'm curious to see what kind of implications this will have for their CPU nodes.

 

Cyborg

Member
I would freaking pay extra money if PlayStation switched to Nvidia. DO IT!
AMD is such a joke company in the GPU market
 

twilo99

Member
BTW, did you see Nvidia saying that their GPUs are better than NPUs?
With Intel and AMD adding NPUs to their CPUs, there is the potential for the new entry point for AI to become things like Strix Halo and Arrow Lake.
Nvidia must see this as a potential threat, if they have to publicly reinforce that their GPUs are the best for AI.


It's more about the software and dev tools which come with Nvidia's AI offerings... that's how they lock you in.
 

E-Cat

Member
Completely new architecture, then why the hell is it called RDNA 5?

The marketing department at AMD is embarrassing.
 

DonkeyPunchJr

World’s Biggest Weeb
Completely new architecture, then why the hell is it called RDNA 5?

The marketing department at AMD is embarrassing.
From OP:
The GPU generation following RDNA 4 is expected to be built on a brand new architecture designed from the ground up and it isn't even known if AMD will retain the RDNA 5 branding for it or switch to something entirely new.
 

Tams

Member
I would freaking pay extra money if PlayStation switched to Nvidia. DO IT!
AMD is such a joke company in the GPU market

Lol, I doubt you would be willing to pay the extra cost that using Nvidia would incur. Not to mention, they already tried that, and Nvidia screwed them over.

The main point of consoles is a cheaper computer for gaming.
 

DonkeyPunchJr

World’s Biggest Weeb
Lol, I doubt you would be willing to pay the extra cost that using Nvidia would incur. Not to mention, they already tried that, and Nvidia screwed them over.

The main point of consoles is a cheaper computer for gaming.
Seriously, does anybody think we would get a better console at a $500 price if Sony went with Nvidia? The reason Sony goes with AMD is because they probably get a really good deal. They wouldn’t get a good deal from Nvidia.

(BTW the one time PlayStation went with Nvidia it was an overpriced console stuck with a gimped 7900GT on a 128 bit bus… and launched just before 8800 series came and destroyed everything before it)
 
Intel recently bought up ASML's entire stock of the newest EUV machines for 2024; it seems like they want to compete more seriously on the foundry front. Each machine retails for around $300 million, I think. I'm curious to see what kind of implications this will have for their CPU nodes.

Intel is about 10 years late to the EUV party but I guess it's good they finally turned it around LMAO
 

Buggy Loop

Member
Seriously, does anybody think we would get a better console at a $500 price if Sony went with Nvidia? The reason Sony goes with AMD is because they probably get a really good deal. They wouldn’t get a good deal from Nvidia.

(BTW the one time PlayStation went with Nvidia it was an overpriced console stuck with a gimped 7900GT on a 128 bit bus… and launched just before 8800 series came and destroyed everything before it)

Because Kutaragi was almost about to launch a PlayStation 3 with their internally made GPU, which was greatly underpowered because the fucking madman thought Cell was everything. Internally, devs said it would be a massive mistake to launch the console as-is. They knocked on Nvidia's door without any time to make anything custom.

I hope this is the truth so it drives down the price of Nvidia GPUs

And that's why AMD might as well close down the dedicated GPU division :messenger_tears_of_joy:

You and millions of others, me included actually, want AMD to compete to get cheaper Nvidia cards. Unless AMD comes up with something disruptive and has the edge software-wise, most are doing just that, hoping for lower Nvidia prices.
 
Seriously, does anybody think we would get a better console at a $500 price if Sony went with Nvidia? The reason Sony goes with AMD is because they probably get a really good deal. They wouldn’t get a good deal from Nvidia.

(BTW the one time PlayStation went with Nvidia it was an overpriced console stuck with a gimped 7900GT on a 128 bit bus… and launched just before 8800 series came and destroyed everything before it)
This is what happens when people pretend Nintendo doesn't exist

You don't seriously think that Nintendo "wouldn't get a good deal" from Nvidia? When the Switch launched with a $300 price and was reportedly profitable from day 1? And Nintendo is probably going to be profitable from day 1 with the Switch 2...
 

Panajev2001a

GAF's Pleasant Genius
Because Kutaragi was almost about to launch a PlayStation 3 with their internally made GPU, which was greatly underpowered because the fucking madman thought Cell was everything. Internally, devs said it would be a massive mistake to launch the console as-is. They knocked on Nvidia's door without any time to make anything custom.
Partially yes, mostly probably, but believe it or not, Cell was not going to be the GPU. It offloaded geometry processing to the Cell BE (like on PS2) and kept triangle setup, rasterisation, and pixel shading on the GPU, with a LOT of eDRAM. Rumors are the issues were related to thermals and chip complexity.

I reckon they expected Xbox 360 to come out much later than it did and that they would have time to sort that out.
 