
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

I have a feeling that rasterization-wise it might be in the ballpark of the 4090, give or take a small percentage (<10%), but RT-wise it will be much, much less. Throw in DLSS/FSR and you have even more variables to account for. No idea where this is going to land without proper benchmarks. Seems nice though.
 

FingerBang

Member
I'm very happy with what AMD is showing here and there's no doubt their value proposition is fantastic compared to what Nvidia is offering.

This seems to be their Zen 1 moment, and it's even more exciting for the people waiting for low-to-mid range cards who will not be ass-fucked with insane pricing.

Also, the fact this card can be powered by only two 8-pin connectors with no adapters and can just fit in a case is a great advantage. I thought people were overreacting, but for fuck's sake the 4090 is huge beyond ridiculous.
 

Buggy Loop

Member
The 4080 is only about 20% faster in raster than the 3090 Ti according to NV's own slides.

[Image: relative performance chart]


I hate using this stupid TechPowerUp chart because of the CPU-bottlenecked 4090, and it's funny how r/AMD only uses that site... anyway, it doesn't matter, it's just for the 4080 estimates.

But you see, the 4090 exceeded the estimates from Nvidia's slides when it was benchmarked; nobody thought it was such a beast after the presentation. Who's to say the 4080 won't benefit similarly? I'm calling it now: the 4080 and the 7900 XTX will trade blows in rasterization. I don't think I need to mention how it'll go for RT.
 

Buggy Loop

Member
Remember this is a big architectural change. The drivers will improve over time (knowing AMD, over a long, slow amount of time 😂) and the cache setup is going to make these cards very interesting.

Oh and nvidia is doomed.

Oh god, it’s the AMD copium already

Let me guess, RDNA 4 will be the Nvidia killer?

The AMD cycle continues, as predictable as Sonic fans' hope for a good new game, only to be crushed.
 
Last edited:

hlm666

Member
Starting next year, many more games like Metro Exodus EE will release completely designed around ray-traced lighting. Silent Hill 2 Remake is using Lumen from UE5 in 2023. I'm not sure whether Jedi Survivor will use Lumen, but it is a UE game and the developers have already said they intend to use a ray-traced lighting system. Both are slated for a 2023 release. We even got confirmation through a job listing that Starfield will utilize raytracing.
Callisto Protocol has RT and AMD didn't even mention it, which pretty much speaks volumes, but they talked about RT shadows in Halo... To top things off, none of these RT numbers being used for Nvidia have SER enabled either.
 

Mister Wolf

Gold Member
Callisto Protocol has RT and AMD didn't even mention it, which pretty much speaks volumes, but they talked about RT shadows in Halo... To top things off, none of these RT numbers being used for Nvidia have SER enabled either.

Now that UE has Lumen, ray-traced lighting is going to be widely adopted across the industry. Lumen is GPU/CPU heavy like every other ray-traced lighting system.
 
Last edited:

64bitmodels

Reverse groomer.
Oh god, it’s the AMD copium already

Let me guess, RDNA 4 will be the Nvidia killer?

The AMD cycle continues, as predictable as Sonic fans' hope for a good new game, only to be crushed.
RDNA 3 has already killed nvidia wdym
even if the 4080 traded blows with the 7900xtx (lmao fat chance) why the fuck would you get one over the 7900xtx, which is definitely way cheaper and far more worth your money.
nvidia is doomed
 
Last edited:

Buggy Loop

Member
What are you talking about? We can't talk about the new architecture in an RDNA3 thread? And why are you posting theoretical benchmarks? Why not wait for actual benchmark comparisons?
Seems like you are the one coping for Nvidia here in an AMD thread.

Who watches a conference with no mention of competitor FPS to compare against, then sees that AMD even stacked the deck against their very own 6950 XT to inflate the 7900 XTX gains, and still says that Nvidia is doomed?

It's even more dire than RDNA 2. The 3090 had less of an advantage over the 6900 XT than what we're estimating (from AMD's own bullshit slide) for the 4090 vs the 7900 XTX, and what happened last gen? The 3090 sold more than the entirety of the RDNA 2 series.

It’s a fucking déjà vu

And same with MSRP: that's bollocks with AMD AIBs, they aren't just $50 higher, they go way higher.

64bitmodels
Who the fuck puts down $1,000 at a minimum in the hope of a unicorn reference card you'll never get on amd.com, and tells himself it's much cheaper than $1,200? Oh it's okay, I'll pick the card with 50% of the RT performance, at the dawn of wave after wave of RT-featured games.
 
Last edited:

FireFly

Member
I hate using this stupid TechPowerUp chart because of the CPU-bottlenecked 4090, and it's funny how r/AMD only uses that site... anyway, it doesn't matter, it's just for the 4080 estimates.

But you see, the 4090 exceeded the estimates from Nvidia's slides when it was benchmarked; nobody thought it was such a beast after the presentation. Who's to say the 4080 won't benefit similarly? I'm calling it now: the 4080 and the 7900 XTX will trade blows in rasterization. I don't think I need to mention how it'll go for RT.
Even in the heavily ray-traced games in the presentation, AMD was 50% faster, so I doubt we will see only a 40% average speed-up. And a 67% maximum increase for the 4080 over the 3080 is beyond the 64% increase in compute, based on boost clocks.
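For what it's worth, that ~64% figure does check out as napkin math from the paper specs (official shader counts and boost clocks only; real-game scaling rarely tracks raw TFLOPS one-to-one):

    # peak FP32 = shaders x 2 FLOPs per clock (FMA) x boost clock
    tflops_3080 = 8704 * 2 * 1.71e9 / 1e12      # RTX 3080 10GB -> ~29.8 TFLOPS
    tflops_4080 = 9728 * 2 * 2.505e9 / 1e12     # RTX 4080 16GB -> ~48.7 TFLOPS
    print(f"{tflops_4080 / tflops_3080 - 1:.0%}")   # ~64% more raw compute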
 

kuncol02

Banned
Top card vs top card. We've been doing this since time immemorial.

Also news to me that 450 is almost 2x 300. You can also power limit the 4090 to 350W and retain 95% of its 450W performance.
Do you also compare top Fiat vs top BMW vs top Iveco?

 

64bitmodels

Reverse groomer.
Who the fuck puts down $1,000 at a minimum in the hope of a unicorn reference card you'll never get on amd.com, and tells himself it's much cheaper than $1,200? Oh it's okay, I'll pick the card with 50% of the RT performance, at the dawn of wave after wave of RT-featured games.
1. the cryptomining bs is over, the scalpers won't do shit for at least like a week lol
2. yes, i will in fact pick the card with 50% of the rt performance, because literally every game announced for 2023, 2024 and 2025 has shown us nothing about how much it'll be used. Even if it's used, it'll be some optional toggle that removes like 70% of the performance for some prettier lighting effects
"we're at the dawn of wave after wave of RT-featured games" ok so where the fuck are they? Why are people still doubting RT if we're at the dawn of RT games?

Nvidia is doomed, imma keep saying it cuz it's pissing you off
 
Last edited:

OZ9000

Banned
1. the cryptomining bs is over, the scalpers won't do shit for at least like a week lol
2. yes, i will in fact pick the card with 50% of the rt performance, because literally every game announced for 2023, 2024 and 2025 has shown us nothing about how much it'll be used. Even if it's used, it'll be some optional toggle that removes like 70% of the performance for some prettier lighting effects
"we're at the dawn of wave after wave of RT-featured games" ok so where the fuck are they? Why are people still doubting RT if we're at the dawn of RT games?

Nvidia is doomed, imma keep saying it cuz it's pissing you off
Metro Exodus EE features full RT and it hardly looks any different from the original game. The shadows and lighting look more realistic but don't necessarily improve the visuals.

Unless we see games utilising RT from the ground up, it's always going to be a feature which gives prettier lighting for a huge performance penalty (as opposed to a revolution in visuals).

Granted with DLSS 3.0 and FSR 3.0 you'll get 'free frames' but the input lag sounds pretty shit on both. I like my games to be fast and responsive. Input lag feels horrendous when playing with a controller.

Given how games are built with consoles in mind, I think we're 4-5 years away from seeing the full benefit of raytracing.
 
Last edited:

benno

Member
That article isn't correct for the FE. Only overclocked cards hit the 600W limit.
 

Buggy Loop

Member
Even in the heavily ray-traced games in the presentation, AMD was 50% faster, so I doubt we will see only a 40% average speed-up. And a 67% maximum increase for the 4080 over the 3080 is beyond the 64% increase in compute, based on boost clocks.

We should stop using whatever AMD put in their slides. Maybe a few of you missed previous page :

https://www.neogaf.com/threads/amd-...vailable-december-13th.1644577/post-266872747



https://www.neogaf.com/threads/amd-...vailable-december-13th.1644577/post-266873895
 
Last edited:

Buggy Loop

Member
Nvidia is using a more advanced node for Ada Lovelace - TSMC 4N vs TSMC N5 + N6 on RDNA 3

Cost-wise, RDNA 3 cards will be significantly cheaper to produce due to die size and the MCM design.

The 4N node is a slight mod of N5, so no real advantage there.
 

Pagusas

Elden Member
Great to see a sub-$1K price, but sad knowing the RT performance will still suck. Nvidia is still the only flagship game in town if you care about it, which I had hoped would change. But that's okay, the 4090 is a beast of a card and there was basically no chance AMD touched it.
 

Tams

Member
Metro Exodus EE features full RT and it hardly looks any different from the original game. The shadows and lighting look more realistic but don't necessarily improve the visuals.

Unless we see games utilising RT from the ground up, it's always going to be a feature which gives prettier lighting for a huge performance penalty (as opposed to a revolution in visuals).

Granted with DLSS 3.0 and FSR 3.0 you'll get 'free frames' but the input lag sounds pretty shit on both. I like my games to be fast and responsive. Input lag feels horrendous when playing with a controller.

Given how games are built with consoles in mind, I think we're 4-5 years away from seeing the full benefit of raytracing.
I reckon it'll be like all the other Nvidia proprietary technologies; they technically work and do make graphics look better, but they aren't worth the hit in performance and will either be dropped or a more open implementation will eventually be adopted.

PhysX is pretty cool. You never see mention of it now. Hairworks does look pretty great. Name five games that use it. 3D Vision is/was actually a really good implementation of 3D. It's deader than a dodo.
 

Buggy Loop

Member
1. the cryptomining bs is over, the scalpers won't do shit for at least like a week lol

You don't get it. AMD does not produce many cards at all. They know their market share. There's no flow of their reference cards. They are a non-factor: 1.48% for the entire RDNA 2 range.

[Image: GPU market share chart]




2. yes i will infact pick the card that's 50% of the rt performance

r/AyyMD can't wait to welcome you. Especially the ones that will tell you "WuT? I don't have driver issues! Driver problems are overstated, I've not had problems for X months! Why are you having problems with drivers!" while you have grey/black screens among many other problems.

 

Gaiff

SBI’s Resident Gaslighter
Do you also compare top Fiat vs top BMW vs top Iveco?

If there were two brands of cars only, yes, I would.

Did you bother reading the URL you posted?

As you can see, the RTX 4090 draws a total power of 425W in MSI Kombuster, a stress-testing application, while sustaining a GPU core clock of 3GHz. This is nearly 500MHz higher than its stock boost clock of 2,520MHz.

That's a stress test pushing the card far above gaming scenarios. Additionally, this was four days before release; the FE caps out at 450W, with some AIB models going higher. The 4090 draws 400-450W under load and is easy to power limit to 350-400W while retaining almost the same performance. It doesn't draw almost 2x what the 7900 XTX does.
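If anyone wants to try that themselves, a quick way is nvidia-smi (assuming a reasonably recent driver; the allowed range depends on the card's BIOS, and setting the limit needs admin rights):

    nvidia-smi -q -d POWER      # show the current/default/min/max power limits
    nvidia-smi -pl 350          # cap the board power limit to 350 W

Afterburner's power limit slider does the same thing if you'd rather not touch the command line.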
 
Last edited:

kyliethicc

Member
Can anyone explain to me why they went from 7nm to 5nm and REDUCED the clockspeeds to 2.3 GHz? The 6900xt used to regularly hit 2.5 ghz.

Especially when nvidia went from 2.0 ghz to 3.0 ghz when going from 8nm to 4nm. I was expecting clocks to hit 3.0 GHz and 100 tflops. 61 is impressive but way below what the rumors were indicating. The total board power is also very conservative. If Nvidia is willing to go to 450-550 watts, why are they locking themselves to slower clocks and just 350 watts?

I really wonder what happened here. Poor performance at higher clocks? A logic failure? Poor thermals? Even at 2.5 GHz they could've beaten the 4090.
the official boost clock of the 6900xt reference is 2.2 GHz

the official boost clock of the 7900xtx reference is 2.5 GHz
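Napkin math with those boost clocks (assuming the official 96 CU / 6144 shader configuration and RDNA 3's dual-issue FP32):

    sps = 96 * 64                              # 6144 stream processors on the 7900 XTX
    # peak FP32 = SPs x 2 (dual-issue) x 2 FLOPs per clock (FMA) x clock
    print(sps * 2 * 2 * 2.5e9 / 1e12)          # ~61.4 TFLOPS at the 2.5 GHz boost clock
    print(sps * 2 * 2 * 3.0e9 / 1e12)          # ~73.7 TFLOPS even if it clocked to 3.0 GHz

So the 100 TFLOPS rumours needed more shaders, not just higher clocks.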
 

PhoenixTank

Member
Also news to me that 450 is almost 2x 300. You can also power limit the 4090 to 350W and retain 95% of its 450W performance.
My understanding is that, unfortunately, the 4090 will not boot with just two 8-pins into the adapter, so you're limited to software-side corrections with the power limit and non-bypassable power requirements on the PSU. That will be a factor for some.
This doesn't invalidate what you're saying, just thinking of the practical implications.
 

JohnnyFootball

GerAlt-Right. Ciriously.
My understanding is that, unfortunately, the 4090 will not boot with just two 8-pins into the adapter, so you're limited to software-side corrections with the power limit and non-bypassable power requirements on the PSU. That will be a factor for some.
This doesn't invalidate what you're saying, just thinking of the practical implications.
Your understanding is incorrect. The Founders Edition absolutely will boot with just two 8-pin connectors on the adapter.
 

b0bbyJ03

Member
I’m surprised by how much people care about RT. I’ve had a 3080 since launch and never ever turned that shit on, other than to see what it looks like for a moment. I’ll take the performance over the slightly better lights and shadows. I’m not saying I will buy this new set of AMD GPUs, but if I was in the market the RT performance would be low on the list of things I’d be looking for. I’m more concerned about things like the 1% lows and overall driver stability. If they get those right I might consider buying the XTX.
 

Fredrik

Member
To be honest even with the likely better RT performance it still isn't worth getting a 4080 for that price over the XTX.

The 4080 would only be worth it if it cost sub 1k.
If it were better then it may be worth it for someone who just wants the better card 🤷‍♂️ We're all enthusiasts here, and a price difference of $300 on a $4,000+ PC setup is hardly a deciding factor.

BUT! Just comparing specs alone will put 4080 behind on everything and I assume that non-enthusiast gamers don’t look at performance graphs but look at dumbed down comparison tables and ”experts” on Youtube. Things could easily get awkward for Nvidia. Just like with AMD vs Intel on CPUs a few years ago everyone wants AMD to succeed here.
 

OZ9000

Banned
If it were better then it may be worth it for someone who just wants the better card 🤷‍♂️ We're all enthusiasts here, and a price difference of $300 on a $4,000+ PC setup is hardly a deciding factor.

BUT! Just comparing specs alone will put 4080 behind on everything and I assume that non-enthusiast gamers don’t look at performance graphs but look at dumbed down comparison tables and ”experts” on Youtube. Things could easily get awkward for Nvidia. Just like with AMD vs Intel on CPUs a few years ago everyone wants AMD to succeed here.
I just know that there is no chance in hell anyone will be able to buy a 4080 for $1200 flat. AMD cards can be purchased for below MSRP at present. Nvidia cards are way more expensive than MSRP. The UK pricing for the 4080 is pretty retarded - I think it's £1500-1600. AMD's pricing seems to translate better to the UK.
 
Last edited:

winjer

Gold Member
I doubt that AMD cards will be CPU limited at 4K in ray tracing, and it looks like the same CPU was used for the rasterization benchmarks. But I agree we should wait until the reviews.

That is probably correct.
But RT reflections increase CPU usage significantly.
Usually a game engine will cull everything not within the field of view. But to render RT reflections, it's necessary to process objects outside the player's FOV.
So the amount of geometry, draw calls, NPC AI, etc. to be handled increases, as more of the world is rendered.

Another thing to consider is ports with a bad RT implementation, like Spider-Man.
That one uses the CPU to do BVH sorting and traversal, so it puts more load on the CPU and the PCI-e bus.
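Roughly, the difference looks like this (just a toy sketch to illustrate the point, not any engine's real code):

    from dataclasses import dataclass

    @dataclass
    class SceneObject:
        name: str
        in_frustum: bool   # pretend result of the camera frustum test

    def objects_to_process(scene, rt_reflections):
        # Raster only: anything outside the view frustum can be skipped entirely.
        if not rt_reflections:
            return [o for o in scene if o.in_frustum]
        # RT reflections: off-screen objects can still show up in a reflection,
        # so they keep costing animation, draw calls and BVH updates every frame.
        return list(scene)

    scene = [SceneObject("player", True),
             SceneObject("npc_behind_camera", False),
             SceneObject("car_down_the_street", False)]
    print(len(objects_to_process(scene, rt_reflections=False)))   # 1
    print(len(objects_to_process(scene, rt_reflections=True)))    # 3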
 

DaGwaphics

Member
Can anyone explain to me why they went from 7nm to 5nm and REDUCED the clockspeeds to 2.3 GHz? The 6900xt used to regularly hit 2.5 ghz.

Especially when nvidia went from 2.0 ghz to 3.0 ghz when going from 8nm to 4nm. I was expecting clocks to hit 3.0 GHz and 100 tflops. 61 is impressive but way below what the rumors were indicating. The total board power is also very conservative. If Nvidia is willing to go to 450-550 watts, why are they locking themselves to slower clocks and just 350 watts?

I really wonder what happened here. Poor performance at higher clocks? A logic failure? Poor thermals? Even at 2.5 GHz they could've beaten the 4090.

I'm sure higher clocks can be supported. It's possible that AMD is trying to create a situation where the reference specs are easier for AIBs to deal with. One thing that has happened recently (especially on the Nvidia side) is that you don't get many reference-size designs anymore with smaller coolers at closer-to-MSRP prices (the 6800 was supposed to be a dual-slot card, but AMD was the only vendor that made one). You might see cheaper models closer to the reference specs, with more expensive models with higher clocks and power draw being sold as the OC cards, etc.
 

64bitmodels

Reverse groomer.
r/AyyMD can't wait to welcome you. Especially the ones that will tell you "WuT? I don't have driver issues! Driver problems are overstated, I've not had problems for X months! Why are you having problems with drivers!" while you have grey/black screens among many other problems.
Struck a nerve, didn't i?

troll GIF


The funny part is that i actually haven't had any driver issues at all with my 6650 XT, so they're unironically right on that one in my experience.

There won't be many major games releasing with RT support as a requirement in the coming years because
1. the vast majority of steam users use a 1060, a card which doesn't even support RT, so good luck getting any sales lmao
2. you need to purchase a 1200/1600 dollar card to even use it
3. these games are being made for the next gen consoles which are much weaker and can barely handle RT. They only get ported up to PC after the fact

And i know you aren't gonna refute any of those points, because the moment i brought them up to you, you just resorted to insults like a petulant child. Jensen isn't gonna ride your dick bro
 
Last edited:

DaGwaphics

Member
Starting next year, many more games like Metro Exodus EE will release completely designed around ray-traced lighting. Silent Hill 2 Remake is using Lumen from UE5 in 2023. I'm not sure whether Jedi Survivor will use Lumen, but it is a UE game and the developers have already said they intend to use a ray-traced lighting system. Both are slated for a 2023 release. We even got confirmation through a job listing that Starfield will utilize raytracing.



Important to note that so far games that require the tech for basic operation have been much better optimized for AMD hardware (because of the consoles) than titles where it is an added mode for PC players. Metro Exodus EE and the UE5 city sample are good examples of that. As much as Nvidia would like to double down on RT, the consoles will ensure that base requirements are well optimized for AMD.
 
Last edited:
If it were better then it may be worth it for someone who just wants the better card 🤷‍♂️ We're all enthusiasts here, and a price difference of $300 on a $4,000+ PC setup is hardly a deciding factor.
Is there really any point in buying a 4080 though when the 4090 is just a few hundred dollars more.
 

Gaiff

SBI’s Resident Gaslighter
Important to note that so far games that require the tech for basic operation have been much better optimized for AMD hardware (because of the consoles) than titles where it is an added mode for PC players. Metro Exodus EE and the UE5 city sample are good examples of that. As much as Nvidia would like to double down on RT, the consoles will ensure that base requirements are well optimized for AMD.
Not sure what you mean by "better optimized for AMD hardware" because Metro Exodus still runs far better on NVIDIA cards and even Spider-Man which is a PS5 port also runs much better on NVIDIA hardware.
 

DaGwaphics

Member
Not sure what you mean by "better optimized for AMD hardware" because Metro Exodus still runs far better on NVIDIA cards and even Spider-Man which is a PS5 port also runs much better on NVIDIA hardware.

Nvidia will run it better still, but the disparity isn't there to nearly the extent that it is on other titles.

As can be seen in the screen grabs from the JayzTwoCents review of the Arc 770. Notice how the 6600 XT closes the gap with the 3060 in the game with the baseline RT requirement (thanks to heavy optimization for consoles); the same behavior repeats itself in titles like F1 and the UE5 City Sample as well. GI and things like that are going to be well optimized for AMD when they are part of a game's basic specification.

[Image: benchmark screen grabs from the JayzTwoCents Arc A770 review]
 
Last edited:

Rickyiez

Member
I’m surprised by how much people care about RT. I’ve had a 3080 since launch and never ever turned that shit on, other than to see what it looks like for a moment. I’ll take the performance over the slightly better lights and shadows. I’m not saying I will buy this new set of AMD GPUs, but if I was in the market the RT performance would be low on the list of things I’d be looking for. I’m more concerned about things like the 1% lows and overall driver stability. If they get those right I might consider buying the XTX.
I finished Control with RT on using my 3080 Ti. It looks amazing and runs pretty well with DLSS.

I hope more games utilize it as well as Control did; it should be the minimum benchmark going forward.
 
Last edited:

winjer

Gold Member
r/AyyMD can't wait to welcome you. Especially the ones that will tell you "WuT? I don't have driver issues! Driver problems are overstated, I've not had problems for X months! Why are you having problems with drivers!" while you have grey/black screens among many other problems.

Yes, issues with AMD drivers have been very overstated by nvidia fanboys.
In the last 20 years, close to one third of my GPUs were AMD/ATI. I didn't have many more driver issues with one brand than with the other.
Most of the time, they were just different.

Just recently nvidia has had a few severe issues with their drivers. For example:
Corruption with textures in a few games like FH5 and COD.
Severe performance issues with a release of GFE.
An increase in thread handles, resulting in lower system performance.
And these are known bugs, some already fixed by nvidia. Maybe you were affected by them, or not. But several people have been.

I also participate in the Guru3D forums, where at least one member, who goes by the name ManuelG, is part of nvidia's driver team.
That's why I know about these bugs in nvidia drivers. And if anyone is having driver issues, they can report them to ManuelG, and he and his team will test and try to fix them.

And before you try to claim that I'm only talking about nvidia bugs, there are also threads on Guru3D talking about the bugs in AMD's drivers.
 
Last edited:

Buggy Loop

Member
Yes, issues with AMD drivers have been very overstated by nvidia fanboys.
In the last 20 years, close to one third of my GPUs were AMD/ATI. I didn't have many more driver issues with one brand than with the other.
Most of the time, they were just different.

Just recently nvidia has had a few severe issues with their drivers. For example:
Corruption with textures in a few games like FH5 and COD.
Severe performance issues with a release of GFE.
An increase in thread handles, resulting in lower system performance.
And these are known bugs, some already fixed by nvidia. Maybe you were affected by them, or not. But several people have been.

I also participate in the Guru3D forums, where at least one member, who goes by the name ManuelG, is part of nvidia's driver team.
That's why I know about these bugs in nvidia drivers. And if anyone is having driver issues, they can report them to ManuelG, and he and his team will test and try to fix them.

And before you try to claim that I'm only talking about nvidia bugs, there are also threads on Guru3D talking about the bugs in AMD's drivers.

Dude, I was on the ATI/AMD bandwagon since the Mach series in the early 1990s.

I know all about the cycle of AMD hopes. I'm saying it's déjà vu with RDNA 2 vs Ampere. Don't expect more than 2% Steam market share by the end of the gen. It's dire, sad to say, but the majority just want AMD to do well (who doesn't?), only to end up buying cheaper Nvidia cards in the end. A few will buy just out of spite toward Nvidia, but that's literally the "there's dozens of us" meme at this point.
 

winjer

Gold Member
Dude, I was on the ATI/AMD bandwagon since the Mach series in the early 1990s.

I know all about the cycle of AMD hopes. I'm saying it's déjà vu with RDNA 2 vs Ampere. Don't expect more than 2% Steam market share by the end of the gen. It's dire, sad to say, but the majority just want AMD to do well (who doesn't?), only to end up buying cheaper Nvidia cards in the end. A few will buy just out of spite toward Nvidia, but that's literally the "there's dozens of us" meme at this point.

I'm one of the few on this forum who stated that building up high expectations for RDNA3 was a mistake that would only lead to disappointment, back when people were hyping 2.5x performance increases for RT and other nonsense.
Don't try to put me in the group of AMD fanboys. I'm neither for AMD, nor NVidia, nor Intel. I'm for my wallet. And I'll buy the best bang for the buck.

Now, you are correct, RDNA3 vs Ada Lovelace is almost a repeat of the RDNA2 vs Ampere fight.
But that's why AMD has priced RDNA3 cards much lower than NVidia.
 

DaGwaphics

Member
FYI, the A770 is a ~17 TFLOPS card with a 256-bit GDDR6 bus.

Who cares about the A770 in this context? I used the graphic to compare the 3060 and the 6600 XT, and how the two are much closer together in RT performance when the feature is part of the basic specification of the game (optimized for the RDNA2-based consoles) vs. when it is a mode added as a bonus for PC players.
 

rnlval

Member
Oh god, it’s the AMD copium already

Let me guess, RDNA 4 will be the Nvidia killer?

The AMD cycle continues, as predictable as Sonic fans' hope for a good new game, only to be crushed.
RX 7900 XTX has ~61 TFLOPS FP32 compute while RTX 4090 has ~82.6 TFLOPS FP32 compute. Modulate your expectations.
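Same napkin math for the comparison (official shader counts and boost clocks; raw FP32 only, not actual game performance):

    xtx = 6144 * 2 * 2 * 2.5e9 / 1e12       # ~61.4 TFLOPS (dual-issue RDNA 3 at 2.5 GHz)
    ada = 16384 * 2 * 2.52e9 / 1e12         # ~82.6 TFLOPS (RTX 4090 at 2.52 GHz)
    print(f"{ada / xtx:.2f}x")              # ~1.34x in the 4090's favour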
 
It kills the 4080. 4090 is better but not sure it is worth the price difference.

Why are you so sure the chiplet design will OC so well? 30% is insane gains.
Because RDNA 2 was already an extreme overclocker, and RDNA 3 is an improvement on that. I don't know if AMD changed something drastic, but normally RDNA 3 should overclock extremely well. We will see what's possible when third-party cards show up.
 