
RX 7900XTX and 7900XT review thread

Leonidas

Member
Doesn't really indicate much though. The fact the card was continuously available for the first few weeks of its life was a lot more telling.

AMD is selling what they put on shelves, I doubt they had much to do with Nvidia's sales. LOL
The XT has been readily available from Newegg and AMD.com at the $899 MSRP for the past few days. Still in stock on both right now despite it being a new release in the middle of the holiday season... the XT is a failure.

Hopefully this will lead to quick price reductions.
 

DaGwaphics

Member
The XT has been readily available from Newegg and AMD.com at the $899 MSRP for the past few days. Still in stock on both right now despite it being a new release in the middle of the holiday season... the XT is a failure.

Hopefully this will lead to quick price reductions.

It doesn't seem like buyers are tripping over themselves to scoop them all up, no (though maybe it's doing a bit better than the 4080 over the same period). Which is why I thought it was weird that the 7900XT was said to make it easy for Nvidia to get $899 out of the RTX 4070 Ti. Why?

If the XT was also immediately sold out, maybe you could say it was establishing that performance level for the price point. As it is, I don't see where AMD's less-than-optimal sales would do anything to improve the forecast for a $899 4070 Ti. Nvidia may choose that price because of the 7900XT (assuming similar head-to-head performance -- because maybe they won't price any performance level less than the AMD equivalent), but then both vendors will just have cards with "meh" interest at the $899 price point.

The 4070 Ti might move a little better than the 4080 just because of the lower price, and because it would be the same price as the AMD option instead of $200 more, but I'd expect it to be way below launch 3070 numbers if it is $899.

But, like I said before, maybe Nvidia doesn't care about the sell-through as long as they are getting the higher margins per chip. They might net more profits in the end that way, who knows.
 

Crayon

Member
They're both to blame: Nvidia with the 4080 pricing and AMD with the 7900XTX and 7900XT pricing. If the 4080 were priced like last gen it would be $700-$800; AMD would then have to rename the 7900XTX to 7800XT and sell it at a similar price point. AMD knew they didn't have a 90-series competitor but still gave it that moniker so they could sell it at $999 and have reviewers praise them for not increasing pricing over the previous gen.

They're both to blame for setting their own prices, not each other's. The theory that Nvidia gets to announce crazy high prices and then, a month later, AMD is expected to force Nvidia to double back and cut them smells like cope.

AMD's part in this is that they saw Nvidia's high prices and decided of their own volition to also price gouge and play naming games. AMD is responsible for AMD prices, Nvidia is responsible for Nvidia prices.

And above all, these overpriced top cards are flying off shelves while the step-down upsell shitshows sit around, all according to plan. Consumers consuming. In a way I could criticize Nvidia for not charging $2000 for the 4090, because apparently there's no shortage of 'enthusiasts' who would pay it.
 

MikeM

Member
Same. Even the motherboard LEDs that indicate stuff is working are annoying; I don't understand why people add all these extra LED fans and cables. You can even get RAM with RGB stuff on it for some reason.
I have ARGB fans and RAM. Set them all to white. Looks clean AF with my case.
 

welshrat

Member
Going to pick up an AIB version next year. Probably the Nitro+. I currently have an MSI RX 6800 and it's the best card I have had in a decade. All this AMD hate is really odd. I know if you really want RT then Nvidia is the choice, but honestly I haven't missed it, and this card and drivers haven't missed a beat. I always swap between Nvidia and AMD depending on what takes my fancy, but at the moment I am sticking with red.
 
It's pretty amazing that in some rare instances the 7900 XTX gets the same raster performance as the 4090, and in some rare instances the 6900 XT reaches the 3090 in raster performance. :pie_thinking:
 
It's pretty amazing that in some rare instances the 7900 XTX gets the same raster performance as the 4090, and in some rare instances the 6900 XT reaches the 3090 in raster performance. :pie_thinking:
Average AMD driver quality tbh. There's a reason why the vast majority of gamers choose Nvidia and it isn't what AMD fanboys screech about daily.

Their hardware isn't terrible but it's always held back by the software.
 

Hoddi

Member
Average AMD driver quality tbh. There's a reason why the vast majority of gamers choose Nvidia and it isn't what AMD fanboys screech about daily.

Their hardware isn't terrible but it's always held back by the software.
It's basically the history of ATi/AMD. Excellent hardware but shoddy software.

I still think it's unfounded in many ways. People often talk about the drivers (in plural) being poor when a single driver package contains something like 10-15 different drivers, and it's really just a couple of them that have problems. Their OpenGL driver has always performed poorly against Nvidia's, and their D3D11 driver lacks support for certain optional performance features that Nvidia supports. But beyond that, it's like 90% nonsense that their drivers have issues. Their Vulkan and D3D12 drivers both seem perfectly fine from what I can tell, while OGL/D3D11 don't really matter nowadays.

I haven't bought an AMD card for myself since 2011, but I wouldn't hesitate if they were priced better nowadays. They're just not good enough to warrant $1000, for the same reason the RTX 4080 isn't good enough to warrant that either.
 

AGRacing

Member
Well, I rolled the dice and managed to get an XTX ordered this morning direct from AMD. It seems they're releasing cards every morning at or just after 10 AM. I believe my system as configured, once the card arrives, will be the last upgrade until an entirely new platform/PC.

I loved the 5700 XT and 6900 XT, and I'm confident this card will be treated just as well for the next couple of years.
 

twilo99

Member
They really had me with this launch, because both my 5700 XT and 6800 XT have been great and really over-delivered at their price points, so I was 80% sure they would do the same with RDNA 3... maybe they can improve things with drivers, but I'm not buying the GPU, at least not for now.
 

PaintTinJr

Member
Average AMD driver quality tbh. There's a reason why the vast majority of gamers choose Nvidia and it isn't what AMD fanboys screech about daily.

Their hardware isn't terrible but it's always held back by the software.
It isn't when you factor out Windows and DirectX and instead use Proton/Linux in many cases.

It's just like how AMD CPUs were second-class citizens to Intel on Windows over the years; only now, with immense levels of compute and massive CPU caches, is it becoming harder for the Wintel MO to play out as normal. When Microsoft redeveloped DirectX for the original Xbox, Nvidia provided not only the GPU but also Nvidia Cg, which used HLSL as a unified shader language for both Cg and DirectX IIRC, and it has been that way ever since. Nvidia's hand in DirectX makes them a first-class citizen for the API - which is inferior as a hardware-agnostic API to OpenGL, Mantle and Vulkan - whereas AMD is effectively a second-class citizen, and the Windows vs Linux benchmark differences support that IMHO.

Nvidia have something like 80% of the Windows gaming market, which is 95% of the PC gaming market, so blaming AMD for a rigged game where they are always playing driver catch-up hardly seems fair IMO. I haven't bought an AMD card - or ever bought an AMD CPU for myself - since they were ATI, so I'm not an AMD fanboy saying this, but I do recognise that benchmarking DirectX games on Windows isn't a reflection of the hardware, or even of the effort AMD puts into their drivers most of the time, and even Intel alluded to the additional performance their Arc cards can get in Vulkan-based benchmarks.

Look at how Valve are getting results with the Steam Deck APU that are way above equivalent AMD APUs on Windows, and look at benchmarks of modern games like Callisto Protocol - which was optimised for the PS5's Vulkan-style API - to see a better comparison of the hardware, even if that doesn't change the reality that the product situation is parity at best, and likely worse, versus buying Nvidia for Windows gaming.
 

twilo99

Member
It isn't when you factor out Windows and DirectX and instead use Proton/Linux in many cases.

It's just like how AMD CPUs were second-class citizens to Intel on Windows over the years; only now, with immense levels of compute and massive CPU caches, is it becoming harder for the Wintel MO to play out as normal. When Microsoft redeveloped DirectX for the original Xbox, Nvidia provided not only the GPU but also Nvidia Cg, which used HLSL as a unified shader language for both Cg and DirectX IIRC, and it has been that way ever since. Nvidia's hand in DirectX makes them a first-class citizen for the API - which is inferior as a hardware-agnostic API to OpenGL, Mantle and Vulkan - whereas AMD is effectively a second-class citizen, and the Windows vs Linux benchmark differences support that IMHO.

Nvidia have something like 80% of the Windows gaming market, which is 95% of the PC gaming market, so blaming AMD for a rigged game where they are always playing driver catch-up hardly seems fair IMO. I haven't bought an AMD card - or ever bought an AMD CPU for myself - since they were ATI, so I'm not an AMD fanboy saying this, but I do recognise that benchmarking DirectX games on Windows isn't a reflection of the hardware, or even of the effort AMD puts into their drivers most of the time, and even Intel alluded to the additional performance their Arc cards can get in Vulkan-based benchmarks.

Look at how Valve are getting results with the Steam Deck APU that are way above equivalent AMD APUs on Windows, and look at benchmarks of modern games like Callisto Protocol - which was optimised for the PS5's Vulkan-style API - to see a better comparison of the hardware, even if that doesn't change the reality that the product situation is parity at best, and likely worse, versus buying Nvidia for Windows gaming.

Xbox Series X/S? Those are millions of Windows/DirectX machines exclusively using AMD.
 
Well, I rolled the dice and managed to get an XTX ordered this morning direct from AMD. It seems they're releasing cards every morning at or just after 10 AM. I believe my system as configured, once the card arrives, will be the last upgrade until an entirely new platform/PC.

I loved the 5700 XT and 6900 XT, and I'm confident this card will be treated just as well for the next couple of years.
Would never buy the shitty stock cards. Partner cards are way better, especially regarding noise levels.
 

hlm666

Member
It isn't when you factor out Windows and DirectX and instead use Proton/Linux in many cases.

It's just like how AMD CPUs were second-class citizens to Intel on Windows over the years; only now, with immense levels of compute and massive CPU caches, is it becoming harder for the Wintel MO to play out as normal. When Microsoft redeveloped DirectX for the original Xbox, Nvidia provided not only the GPU but also Nvidia Cg, which used HLSL as a unified shader language for both Cg and DirectX IIRC, and it has been that way ever since. Nvidia's hand in DirectX makes them a first-class citizen for the API - which is inferior as a hardware-agnostic API to OpenGL, Mantle and Vulkan - whereas AMD is effectively a second-class citizen, and the Windows vs Linux benchmark differences support that IMHO.
Didn't AMD leverage the fact that their hardware is in MS consoles to make DXR 1.1 a better fit for their GPUs on Windows?

If AMD hardware works better through a translation layer, doesn't that mean their Windows driver is the problem? The DirectX functions are still being used, so if the problem were the DirectX API itself, pumping the calls through Proton wouldn't fix anything. If Proton ends up more performant than their native Windows driver, they should get whoever writes their Linux driver to write their Windows one.
 

AGRacing

Member
Would never buy the shitty stock cards. Partner cards are way better, especially regarding noise levels.
Ah, one of these guys. I'd love to buy a Red Devil 7900 XTX, but at that price I'd just buy a stock 4080. You guys read price-to-performance reviews for stock cards... use them as a reference point for all your flame wars... and then you all run out and buy cards with $300 price premiums because they run a little cooler. Relax. After you buy a Ferrari you don't run to Walmart to put on a skull shifter knob with glowing eyes.

My stock 6900 XT was absolutely fine. A great card, actually. Temps for this card have been lower than those in all of the reviews I have read. I will only undervolt the card, as I did with the 6900 XT. The price was right.

I appreciate the concern but I'm gonna be just fine. Especially for what I paid.
 

Loxus

Member
I didn't catch the joke...
It's a joke because Twitter users were claiming AMD launched the 7900 series before it was ready because the chip revision was A0, but all of that turned out to be false, because the 6900 and many other AMD cards also released with A0 revisions.

We have to remember that there's an AMD vs Nvidia war going on on Twitter, so don't believe many of the 'bad this, bad that' rumors there.
 

marquimvfs

Member
It's a joke because Twitter users were claiming AMD launched the 7900 series before it was ready because the chip revision was A0, but all of that turned out to be false, because the 6900 and many other AMD cards also released with A0 revisions.

We have to remember that there's an AMD vs Nvidia war going on on Twitter, so don't believe many of the 'bad this, bad that' rumors there.
Ah, gotcha. A0 revisions on retail products aren't a bad thing. On the contrary, it means they nailed the process and all bugs (there are always bugs) were easily solved in software. It happens with every manufacturer. The bad case (for the manufacturer; it almost never affects the end customer) is the opposite, when the fully functional product is only reached in later revisions, like some Intel chips that needed an E0 stepping, and so on.
 
Ah, one of these guys. I'd love to buy a Red Devil 7900 XTX, but at that price I'd just buy a stock 4080. You guys read price-to-performance reviews for stock cards... use them as a reference point for all your flame wars... and then you all run out and buy cards with $300 price premiums because they run a little cooler. Relax. After you buy a Ferrari you don't run to Walmart to put on a skull shifter knob with glowing eyes.

My stock 6900 XT was absolutely fine. A great card, actually. Temps for this card have been lower than those in all of the reviews I have read. I will only undervolt the card, as I did with the 6900 XT. The price was right.

I appreciate the concern but I'm gonna be just fine. Especially for what I paid.

Take a chill pill. Flame wars? Are you dumb??

I own a PowerColor Red Devil 5700 XT, and I'm just saying that I wouldn't buy a card at this noise level.

I don't give a fuck about your 4080 comparison.

Red Devil 5700 XT in silent BIOS: 34 dB(A)

4080 stock is 37.5 dB(A)

So what's your point??
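For reference, dB(A) is a logarithmic scale, so the gap between those two figures is smaller than the raw numbers suggest. A quick back-of-the-envelope sketch in Python (the loudness-doubles-per-10 dB rule used here is a common approximation, not a measurement):

```python
# Rough comparison of the two quoted noise figures. Assumes the usual
# rules of thumb: sound power scales as 10^(dB/10), and perceived
# loudness roughly doubles for every 10 dB.
quiet, loud = 34.0, 37.5  # dB(A): silent-BIOS 5700 XT vs. stock 4080

power_ratio = 10 ** ((loud - quiet) / 10)    # ~2.2x the sound power
loudness_ratio = 2 ** ((loud - quiet) / 10)  # ~1.3x perceived loudness

print(f"sound power: x{power_ratio:.2f}, perceived loudness: x{loudness_ratio:.2f}")
```

So a 3.5 dB(A) gap is audible but nowhere near "twice as loud".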
 

PaintTinJr

Member
Xbox Series X/S? Those are millions of Windows/DirectX machines exclusively using AMD.
DirectX access on Xbox is through a hardware abstraction layer that Microsoft updates in an opaque fashion to eke out more performance after release, if absolutely needed or possible, all because they don't want to break backwards compatibility.

IIRC Elden Ring on PS4/Pro/PS5 and Steam Deck ran without stutters at, or close to, launch, while the DirectX versions - including the Series consoles - still have issues, unless I missed the memo where it was fixed.

Carmack himself found that Rage ran better on Linux through Wine in pre-release testing than on Windows through DirectX, showing the problem was definitely Windows/DirectX.

It could still be AMD's drivers being second-class citizens to Windows/DirectX compared to Nvidia's, but then you have to wonder why performance is better on Linux - typically better than Nvidia at the same tier, too - which would mean AMD's Linux drivers are superior to Nvidia's Windows drivers, as well as to their own.
 

PaintTinJr

Member
Didn't AMD leverage the fact that their hardware is in MS consoles to make DXR 1.1 a better fit for their GPUs on Windows?

If AMD hardware works better through a translation layer, doesn't that mean their Windows driver is the problem? The DirectX functions are still being used, so if the problem were the DirectX API itself, pumping the calls through Proton wouldn't fix anything. If Proton ends up more performant than their native Windows driver, they should get whoever writes their Linux driver to write their Windows one.
The remapping to Vulkan unlocks more performance because Vulkan is leaner and meaner, and the translation layer catches inefficiencies before translating, AFAIK.
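To make that concrete, here is a toy sketch of the kind of filtering a D3D-to-Vulkan layer such as DXVK can do. This is illustrative Python, not actual DXVK code; real layers also batch work, cache pipeline state, and far more:

```python
# Toy model of a translation layer: a chatty D3D11-style call stream
# comes in, redundant state changes are dropped before they ever reach
# the (Vulkan) driver, and a leaner command stream comes out.
from typing import Any

class TranslationLayer:
    def __init__(self) -> None:
        self.current_state: dict[str, Any] = {}
        self.vulkan_commands: list[tuple[str, Any]] = []

    def d3d_set_state(self, name: str, value: Any) -> None:
        # Redundant state sets are a classic inefficiency; filter them here.
        if self.current_state.get(name) == value:
            return
        self.current_state[name] = value
        self.vulkan_commands.append(("set_state", (name, value)))

    def d3d_draw(self, vertex_count: int) -> None:
        self.vulkan_commands.append(("draw", vertex_count))

layer = TranslationLayer()
layer.d3d_set_state("blend", "alpha")
layer.d3d_set_state("blend", "alpha")  # duplicate: filtered out
layer.d3d_draw(3)
print(layer.vulkan_commands)  # [('set_state', ('blend', 'alpha')), ('draw', 3)]
```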
 

AGRacing

Member
Take a chill pill. Flame wars? Are you dumb??

I own a PowerColor Red Devil 5700 XT, and I'm just saying that I wouldn't buy a card at this noise level.

I don't give a fuck about your 4080 comparison.

Red Devil 5700 XT in silent BIOS: 34 dB(A)

4080 stock is 37.5 dB(A)

So what's your point?
I made it - read it again, slower and repeatedly, until your (concern?) for my saving a few hundred dollars at the cost of 3 dB just melts away and you stop feeling bad for me.

I do find it funny that I'm about to install a "shitty" card that's several times faster than what you're using though. Thanks for the LOL.
 
I do find it funny that I'm about to install a "shitty" card that's several times faster than what you're using though. Thanks for the LOL.

Maybe you can laugh about your own stupidity.

A card that got released years after mine and costs more than double the price is faster?? No shit.
 

Crayon

Member
It isn't when you factor out Windows and DirectX and instead use Proton/Linux in many cases.

It's just like how AMD CPUs were second-class citizens to Intel on Windows over the years; only now, with immense levels of compute and massive CPU caches, is it becoming harder for the Wintel MO to play out as normal. When Microsoft redeveloped DirectX for the original Xbox, Nvidia provided not only the GPU but also Nvidia Cg, which used HLSL as a unified shader language for both Cg and DirectX IIRC, and it has been that way ever since. Nvidia's hand in DirectX makes them a first-class citizen for the API - which is inferior as a hardware-agnostic API to OpenGL, Mantle and Vulkan - whereas AMD is effectively a second-class citizen, and the Windows vs Linux benchmark differences support that IMHO.

Nvidia have something like 80% of the Windows gaming market, which is 95% of the PC gaming market, so blaming AMD for a rigged game where they are always playing driver catch-up hardly seems fair IMO. I haven't bought an AMD card - or ever bought an AMD CPU for myself - since they were ATI, so I'm not an AMD fanboy saying this, but I do recognise that benchmarking DirectX games on Windows isn't a reflection of the hardware, or even of the effort AMD puts into their drivers most of the time, and even Intel alluded to the additional performance their Arc cards can get in Vulkan-based benchmarks.

Look at how Valve are getting results with the Steam Deck APU that are way above equivalent AMD APUs on Windows, and look at benchmarks of modern games like Callisto Protocol - which was optimised for the PS5's Vulkan-style API - to see a better comparison of the hardware, even if that doesn't change the reality that the product situation is parity at best, and likely worse, versus buying Nvidia for Windows gaming.

I switched to AMD when they got into gear with that open-source Linux driver. Read an article about it, went right out to Best Buy and picked up an RX 570 8GB. It kicked ass. I just upgraded that thing after four flawless years, and (the reasonably priced, thank god) RDNA 2 was an easy decision.

Using the FOSS driver rolled into the OS feels like not even having drivers. (They unfortunately don't port any of that fun control-panel software, though, so it's even more like not having a driver, lol.) I've never even installed a driver. Stability is excellent. It feels integrated, and it'll be a downer (but not necessarily a deal-breaker) to give that up if I go Nvidia. So you could say it's a tangible selling point for me. Also cool if you're more FOSS-oriented.

TL;DR - can confirm: AMD is awesome on Linux.
 

Loxus

Member
The 7900 XTX is slower than the RTX 3060 in Blender. lmao. As I said months ago, AMD GPUs are trash in both software and hardware.




[Charts: Blender 3.3 Cycles GPU render benchmarks]
Isn't this because of driver support, and not necessarily the cards not being powerful enough?
You can't blame AMD because Blender doesn't support their RT hardware.

AMD Hardware Ray-Tracing Hopes To Be Ready For Blender 3.5
NVIDIA has long provided an OptiX back-end for Cycles that makes use of the RT cores on modern NVIDIA GPUs as an alternative to their CUDA back-end. The OptiX ray-tracing support within Blender has worked out very well, delivering faster render times for modern NVIDIA GPUs. We've been eager to see AMD's similar ray-tracing support in upstream Blender, but it's not coming with the next release (v3.4) and is now confirmed to be targeting Blender 3.5.
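For anyone who wants to check what their own card is doing, Cycles exposes its compute backend through Blender's Python API. A minimal sketch, assuming Blender 3.x with the bundled Cycles add-on (run it in Blender's Python console):

```python
# Inspect which Cycles compute backend is active. On Nvidia, 'OPTIX'
# uses the RT cores; AMD cards are limited to the 'HIP' backend, with
# ray-tracing acceleration only arriving in the work targeting 3.5.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.get_devices()  # refresh the detected device list

print("Backend:", prefs.compute_device_type)  # e.g. 'OPTIX', 'CUDA', 'HIP'
for dev in prefs.devices:
    print(dev.type, dev.name, "enabled" if dev.use else "disabled")

# To render on the GPU with a specific backend, something like:
# prefs.compute_device_type = 'OPTIX'  # or 'HIP' on AMD
# bpy.context.scene.cycles.device = 'GPU'
```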
 

Crayon

Member
I've taken in a bunch of YouTube rambling, and at this point it seems all the believable factors are on the table when it comes to why N31 came up short. I'm convinced that it did fail to meet AMD's targets. I know there are others that aren't.

At this point we might know all we can know until (a) there is some driver update that changes something, (b) N32 and N33's performance is revealed, or (c) a 7950 revision comes out. Any other news is just as likely to come in the form of leaks, aka rumors.
 
I guess the best way to see if these new cards can run optimally is to wait for:

- future driver updates
- FSR 3.0
- stable overclocking close to 3.0 GHz

🤷‍♂️
 

Zathalus

Member
It isn't when you factor out Windows and DirectX and instead use Proton/Linux in many cases.

Just like AMD CPUs were a 2nd class citizen to Intel on Windows over the years and only now with immense levels of compute and massive CPU caches is becoming harder for the Wintel MO to playout as normal. When Microsoft redeveloped DirectX for the original Direct-X-box Nvidia provided not only the GPU but provided Nvidia CG which used HLSL as a unified shader language to be used both with Nvidia CG and DirectX IIRC, and has been that way ever since. Nvidia's hand in DirectX makes them a first class citizen for the API - which is inferior as a hardware agnostic API to Opengl, Mantle and Vulkan - whereas AMD is effectively a 2nd class citizen, and the Windows vs Linux benchmarks differences support that IMHO.

Nvidia have something like 80% of the Windows gaming market, which has 95% of PC gaming market, so blaming AMD for the rigged game where they are always playing driver catch up hardly seems fair IMO. I haven't bought an AMD card - or ever bought an AMD CPU for myself - since they were ATI, so I'm not an AMD fanboy saying this, but I do recognise that benchmarking on WIndows with DirectX games isn't a reflection of the hardware or even the efforts AMD make with their drivers most of the time, and even Intel eluded to the additional performance in comparison their Arc can get using Vulkan based benchmarks.

Look at how Valve are getting way above AMD APU on Windows results with the SteamDeck APU and look at modern games like Calisto Protocol benchmarks - which was optimised for the PS5 Vulkan style API to see a better comparison of the hardware, even if it doesn't solve the reality that it is parity or likely worse product situation than buying Nvidia to use with Windows for gaming.
Come on now. DX12 was better on AMD hardware for like 5 years. Massively so at times. Microsoft leveraged a low-level API that directly benefitted AMD immensely. DX12 was designed to directly take advantage of GCN, and it was only when Turing released that Nvidia achieved parity.

It's odd that you mention the Vulkan-style PS5 API when Vulkan and DX12 are extremely similar as well.

Heck, the game where an overclocked 7900 XTX can match the 4090 is in fact a DX12 game, Cyberpunk 2077. A Vulkan game like Doom Eternal has the 4090 stomping all over the 7900 XTX.
 

GymWolf

Member
Well, I rolled the dice and managed to get an XTX ordered this morning direct from AMD. It seems they're releasing cards every morning at or just after 10 AM. I believe my system as configured, once the card arrives, will be the last upgrade until an entirely new platform/PC.

I loved the 5700 XT and 6900 XT, and I'm confident this card will be treated just as well for the next couple of years.
Tell me more about the 6900.

Kinda thinking about getting one for me.

Do you consider it a full-fledged 4K GPU, or just a high-end 1440p high-framerate card?
 

winjer

Gold Member
AMD clarifies the confusion regarding the shader prefetcher.

"As with previous hardware generations, shader prefetching is supported on RDNA 3 according to [ gitlab link ]. The code in question controls an experimental feature that was not intended to be included in these products and will not be enabled in this product generation. It is standard in the industry to include experimental features to enable research and tuning for use in a future generation of products."
 

GreatnessRD

Member
That one is like, what? 4% more powerful while being like 300 euros pricier.

I'm not from the US; here in Europe prices are way over-inflated.
Yeah, like 6% before you clock it for a bit more, but yeah, 300 euros more doesn't sound fun. You'll still have a great time with the 6900 XT. I'd also consider it a high-framerate card for 1440p, as AGRacing stated. It can do 4K for sure, but it depends on what you're looking for there. 4K Ultra + ray tracing or something, then you'd have to relax, lol
 

GymWolf

Member
Yeah, like 6% before you clock it for a bit more, but yeah, 300 euros more doesn't sound fun. You'll still have a great time with the 6900 XT. I'd also consider it a high-framerate card for 1440p, as AGRacing stated. It can do 4K for sure, but it depends on what you're looking for there. 4K Ultra + ray tracing or something, then you'd have to relax, lol
I don't give a damn about RT, and I have no problem turning some useless settings like shadows, AO, reflections, etc. down a notch or two.

But not more than that; I'm not gonna pay 800 euros if I have to spend two hours studying the perfect combination of settings to achieve 4K60 in every modern heavy/broken game.

This really doesn't look like a full-fledged 4K GPU, unfortunately.
 

b0uncyfr0

Member
I keep flip-flopping between getting a 3080/6800 XT/6700 XT now or waiting a few months for the 7700 XT/7800 XT line.

AM5 boards still being more expensive, plus the X3D being teased, certainly puts me in check.
 