
RX 7900XTX and 7900XT review thread

GreatnessRD

Member
I don't give a damn about ray tracing and I have no problem turning less essential settings like shadows, AO, reflections, etc. down a notch or two.

But not more than that; I'm not gonna pay 800 euros if I have to spend two hours studying the perfect combination of settings to achieve 4K60 in every modern heavy/broken game.

This really doesn't look like a full-fledged 4K GPU, unfortunately.
Goes with that ole saying of it depending on what you play. But it can for sure achieve your 4K60 at high settings. Check out this 6900 XT vs. 3080 12GB review for newer data from Hardware Unboxed.

 

GymWolf

Member
Goes with that ole saying of it depending on what you play. But it can for sure achieve your 4K60 at high settings. Check out this 6900 XT vs. 3080 12GB review for newer data from Hardware Unboxed.


I play demanding games and a lot of open worlds, and I expect a ROCK SOLID 4K60 without scaling down stuff like textures, details, effects, etc. I can live with everything else on normal/high, but I'm never gonna turn any of the important stuff lower than ultra in a 2000+ dollar build.

Honestly, it feels like the 4090, 7900 XTX and maybe the 4080 16GB are the only viable "future proof" 4K60 cards around.

I'm trying to save some money by looking at 6900-tier cards, but I honestly feel like they're not the right choice for something I would keep for a minimum of 3, maximum 4 years. Stuff like Atomic Heart, Hogwarts, etc. is gonna bring these cards to their knees, and I'm basically making a build to play all the big hits of 2023.

But I can't honestly pay the absurd euro tax on those already overpriced products and be OK with myself.

So I'm kinda stuck right now; I have almost all the parts ready except the GPU...
 

GreatnessRD

Member
I play demanding games and a lot of open worlds, and I expect a ROCK SOLID 4K60 without scaling down stuff like textures, details, effects, etc. I can live with everything else on normal/high, but I'm never gonna turn any of the important stuff lower than ultra in a 2000+ dollar build.

Honestly, it feels like the 4090, 7900 XTX and maybe the 4080 16GB are the only viable "future proof" 4K60 cards around.

I'm trying to save some money by looking at 6900-tier cards, but I honestly feel like they're not the right choice for something I would keep for a minimum of 3, maximum 4 years. Stuff like Atomic Heart, Hogwarts, etc. is gonna bring these cards to their knees, and I'm basically making a build to play all the big hits of 2023.

But I can't honestly pay the absurd euro tax on those already overpriced products and be OK with myself.

So I'm kinda stuck right now; I have almost all the parts ready except the GPU...
The 6950 XT is that card, but since you're in Europe, it wouldn't make sense to spend what it will cost to get it. I'd just say wait for the 7900 XTX to get back in stock, settle for a used 3090/Ti, or hope for the supposed 4080 price drop that won't happen now that AMD dropped the ball.
 

GymWolf

Member
The 6950 XT is that card, but since you're in Europe, it wouldn't make sense to spend what it will cost to get it. I'd just say wait for the 7900 XTX to get back in stock, settle for a used 3090/Ti, or hope for the supposed 4080 price drop that won't happen now that AMD dropped the ball.
I sincerely doubt that a GPU 4% more powerful than a 6900 is "that card" :lollipop_grinning_sweat:

The 7900 XTX is like 1500 euros in Europe; this is why I said euro tax...

A 3090 Ti is around 1500-1700, same for the 4080 16GB; unless Nvidia cuts the price by 500-700 euros, there are no options, really.

Believe me, I did my math.

Europeans are royally fucked.

P.S. I don't buy used PC parts.
 

GreatnessRD

Member
I sincerely doubt that a GPU 4% more powerful than a 6900 is "that card" :lollipop_grinning_sweat:

The 7900 XTX is like 1500 euros in Europe; this is why I said euro tax...

A 3090 Ti is around 1500-1700, same for the 4080 16GB; unless Nvidia cuts the price by 500-700 euros, there are no options, really.

Believe me, I did my math.

Europeans are royally fucked.

P.S. I don't buy used PC parts.
Well, according to HUB's 12-game average at 4K, the 6950 XT is 12% faster than the 6900 XT and firmly over 60 FPS. Do with that what you will. And I'm sure you did your math; I was just letting you know your grim choices based on your preferences and what you're looking for. The TL;DR for your outcome is that it's gonna be pricey in Euro land any way you slice it, unfortunately. And used parts aren't that bad, but I feel you. Haha

Good luck with whatever you decide and end up getting. Make sure to show us your masterpiece!
 

GymWolf

Member
Well, according to HUB's 12-game average at 4K, the 6950 XT is 12% faster than the 6900 XT and firmly over 60 FPS. Do with that what you will. And I'm sure you did your math; I was just letting you know your grim choices based on your preferences and what you're looking for. The TL;DR for your outcome is that it's gonna be pricey in Euro land any way you slice it, unfortunately. And used parts aren't that bad, but I feel you. Haha

Good luck with whatever you decide and end up getting. Make sure to show us your masterpiece!
Sorry if my tone was a bit jackass-y (I think?), it was not my intention; I always appreciate any kind of help.

The "I did my math" was mostly out of frustration, because even if I consider a 6950 viable, they are way pricier than the single ASRock 6900 I found for 780 euros (the other 6900s are way pricier).
 

winjer

Gold Member
Sorry if my tone was a bit jackass-y (I think?), it was not my intention; I always appreciate any kind of help.

The "I did my math" was mostly out of frustration, because even if I consider a 6950 viable, they are way pricier than the single ASRock 6900 I found for 780 euros (the other 6900s are way pricier).

Take a look at an Italian store called BPM Power; they have an Asus 6950 XT for 873 euros.
It looks really beefy and well built.

The advantage of the 6950 XT is that it has higher memory and cache clocks.
RDNA2 tends to lose a bit of performance at higher resolutions because of a lack of memory bandwidth, so this will help out a bit.
 

GreatnessRD

Member
Sorry if my tone was a bit jackass-y (I think?), it was not my intention; I always appreciate any kind of help.

The "I did my math" was mostly out of frustration, because even if I consider a 6950 viable, they are way pricier than the single ASRock 6900 I found for 780 euros (the other 6900s are way pricier).
Oh nah, you good. I didn't take it any kind of way. Just was replying. You good my mans. I'm sorry that Nvidia and AMD have y'all in a severe chokehold over there. Hopefully, it calms down soon.
 

GymWolf

Member
Take a look at an Italian store called BPM Power; they have an Asus 6950 XT for 873 euros.
It looks really beefy and well built.

The advantage of the 6950 XT is that it has higher memory and cache clocks.
RDNA2 tends to lose a bit of performance at higher resolutions because of a lack of memory bandwidth, so this will help out a bit.
I have 2 super noob questions, because I've always had Nvidia GPUs and I have kind of a strange monitor setup.

Basically I have a PC monitor attached to the GPU via DisplayPort and my OLED television attached via HDMI. To switch from one panel to the other I just go to the control panel > multiple displays, select which monitor is the active one, and press apply.

Is the AMD way of doing this as easy? Windows automatically advises me that I'm using a G-Sync display when I switch to my LG C1; is AMD as user-friendly when it comes to multiple monitors?

Is FreeSync as good as G-Sync? (The LG C1 has both.)

Thanks.
 

winjer

Gold Member
I have 2 super noob questions, because I've always had Nvidia GPUs and I have kind of a strange monitor setup.

Basically I have a PC monitor attached to the GPU via DisplayPort and my OLED television attached via HDMI. To switch from one panel to the other I just go to the control panel > multiple displays, select which monitor is the active one, and press apply.

Is the AMD way of doing this as easy? Windows automatically advises me that I'm using a G-Sync display when I switch to my LG C1; is AMD as user-friendly when it comes to multiple monitors?

Is FreeSync as good as G-Sync? (The LG C1 has both.)

Thanks.

I don't know about switching displays, so I can't help you with that.
But it's probably the same. AMD does have a tech called HydraVision for multiple displays.
Maybe someone else can help you with that question.

What is your monitor? If it's G-Sync Premium, then it won't work with an AMD GPU.
If it's G-Sync Compatible, then it works, simply because that's FreeSync rebranded by Nvidia.
 

GymWolf

Member
I don't know about switching displays, so I can't help you with that.
But it's probably the same. AMD does have a tech called HydraVision for multiple displays.
Maybe someone else can help you with that question.

What is your monitor? If it's G-Sync Premium, then it won't work with an AMD GPU.
If it's G-Sync Compatible, then it works, simply because that's FreeSync rebranded by Nvidia.
My monitor is a piece of shit that I only use for forum browsing; I do everything on my OLED LG C1, which has both G-Sync and FreeSync.
 

winjer

Gold Member
My monitor is a piece of shit that I only use for forum browsing; I do everything on my OLED LG C1, which has both G-Sync and FreeSync.

Then you won't have any problems.
Just a different control panel to change settings and stuff.

I recently got a 6800 XT, after almost a decade with only Nvidia.
Some things are similar, some things are different.
 

MikeM

Member
I got my 7900 XT card today. Rock solid after a stress test. No coil whine, boosts to 2700 MHz with good temps.
That's good to know. What temps are you getting?
I have 2 super noob questions, because I've always had Nvidia GPUs and I have kind of a strange monitor setup.

Basically I have a PC monitor attached to the GPU via DisplayPort and my OLED television attached via HDMI. To switch from one panel to the other I just go to the control panel > multiple displays, select which monitor is the active one, and press apply.

Is the AMD way of doing this as easy? Windows automatically advises me that I'm using a G-Sync display when I switch to my LG C1; is AMD as user-friendly when it comes to multiple monitors?

Is FreeSync as good as G-Sync? (The LG C1 has both.)

Thanks.
Yes it is. I do the same with my 6700 XT. Display settings > flip to the other display. Win.
 

PaintTinJr

Member
Come on now. DX12 was better on AMD hardware for like 5 years, massively so at times. Microsoft leveraged a low-level API that directly benefited AMD immensely. DX12 was designed to directly take advantage of GCN, and it was only when Turing released that Nvidia achieved parity.

It's odd that you mention a Vulkan-style PS5 API when Vulkan and DX12 are extremely similar as well.

Heck, the game where an overclocked 7900 XTX can match the 4090 is in fact a DX12 game, Cyberpunk 2077. A Vulkan game like Doom Eternal has the 4090 stomping all over the 7900 XTX.
Vulkan and DX12 have completely different abstraction layers, even if they have similar feature sets (although on Windows all GPU APIs have worked through an opaque DX layer since Vista). Saying DX12 was better on AMD hardware would imply that AMD's hardware is easily ahead, given most games on PC are DX, and yet the perception of AMD hardware running these DX games on Windows is that Nvidia hardware at similar tiers benchmarks higher, with or without DXRT enabled.

As for those benchmarks of Cyberpunk and Doom Eternal, do the Linux/Proton benchmarks of those identical Windows binaries give the same results for the Nvidia RTX 4090 and AMD 7900 XTX when translated to Vulkan on Linux? Or are those results never checked because there are native Linux versions, so it is assumed the Windows version will be slower through translation, even though those tests would rule out the Windows/DX layer eating performance and giving slanted results?
 

GymWolf

Member
It's useless to watch tests for a GPU on YouTube when they don't even say whether they are using RT or not.

Like, is Spider-Man (a PS4 game) running with all the RT turned on to get an average of 52 FPS at 4K on a 6950? I'm not even gonna speak about A Plague Tale: Requiem's performance...

At least it is nice to see that a 3090 Ti doesn't do that much better and only costs 600-800 euros more...
 

M1chl

Currently Gif and Meme Champion
I sincerely doubt that a GPU 4% more powerful than a 6900 is "that card" :lollipop_grinning_sweat:

The 7900 XTX is like 1500 euros in Europe; this is why I said euro tax...

A 3090 Ti is around 1500-1700, same for the 4080 16GB; unless Nvidia cuts the price by 500-700 euros, there are no options, really.

Believe me, I did my math.

Europeans are royally fucked.

P.S. I don't buy used PC parts.
Going to be honest, normally I wouldn't recommend it, but given the crypto crash and the Eth merge, you can get a practically new-looking GPU for peanuts. Many of them still have a warranty attached, so there's no worry. And mining cards are generally undervolted, so I wouldn't worry about them being abused.
 

Zathalus

Member
Vulkan and DX12 have completely different abstraction layers, even if they have similar feature sets (although on Windows all GPU APIs have worked through an opaque DX layer since Vista). Saying DX12 was better on AMD hardware would imply that AMD's hardware is easily ahead, given most games on PC are DX, and yet the perception of AMD hardware running these DX games on Windows is that Nvidia hardware at similar tiers benchmarks higher, with or without DXRT enabled.

As for those benchmarks of Cyberpunk and Doom Eternal, do the Linux/Proton benchmarks of those identical Windows binaries give the same results for the Nvidia RTX 4090 and AMD 7900 XTX when translated to Vulkan on Linux? Or are those results never checked because there are native Linux versions, so it is assumed the Windows version will be slower through translation, even though those tests would rule out the Windows/DX layer eating performance and giving slanted results?
DX12 and Vulkan may have different abstraction layers, but the end result is almost exactly the same: very close performance, which makes sense as they were both designed for similar use cases, DX12 as a low-level API for GCN (because of Xbox) and Vulkan heavily influenced by Mantle, again a low-level API for GCN. AMD has had tons of input on DX12 and ongoing DX updates, just check GPUOpen.

AMD was far ahead of Nvidia when it came to both DX12 and Vulkan; any benchmark from before Turing is proof enough of that.

As for your benchmark query, nobody has done that sort of testing, but the 7900 XTX has not been shown to be faster on Linux compared to Windows. Nor have any historical tests shown Linux to be faster than Windows when it comes to gaming; performance between the two is usually very close, with Windows being faster on average. This is true for both AMD and Nvidia.
 

AGRacing

Member
My reference 7900 XTX arrives direct from AMD on Friday. Selling my reference 6900 XT.

My long-term ambition for this new card is to run the upcoming wave of Unreal 5.1 titles at 1440p Ultra at maybe 60-90 fps. In Unreal 5.1 Fortnite, at the settings below, the 6900 XT runs at about 90 fps on the ground (note: post-processing, effects, shadows and ray tracing settings changed). It used to approach 165 fps on ultra with post-processing turned to high. This PC is a 5800X3D AM4 platform, so any future PC money will be a total platform reset 3+ years down the line.

Into the future!!!
[Fortnite settings screenshot]
 
I find it kinda funny how people are raving about the new AMD cards... they are still bad value, lack features and good RT performance, and eat a lot of power.
Sure, Nvidia needs its ass kicked, but it's still not good...
And Intel dropped the ball. Nvidia keeps winning and fucking everyone.
 

GymWolf

Member
Do we have definite proof about whether AMD SAM works for gaming or not?

Like, how much of a performance gain are we talking about when switching from a 13600K to an AMD counterpart in the same price range?
 

Kataploom

Gold Member
I keep flip-flopping between getting a 3080/6800 XT/6700 XT now or waiting a few months for the 7700 XT/7800 XT line.

AM5 boards still being more expensive, plus the X3D being teased, certainly puts me in check.
I'll get the 6700 XT as soon as my phone notifies me about my paycheck, and probably get the 7700 XT and sell this one once it comes out... The 6700 XT is a bitch of a card that can easily do 4K high to ultra at around 60 fps if the game has FSR 2.0 or above, but I'm in need of an upgrade since I have a 1060 3GB.
 

winjer

Gold Member
Do we have definite proof about whether AMD SAM works for gaming or not?

Like, how much of a performance gain are we talking about when switching from a 13600K to an AMD counterpart in the same price range?

Yes, several reputable sites have tested SAM and it works very well in several games. In others, the gains are minor to non-existent.
But it's good to have it enabled.

Intel also has support for SAM, although they call it ReBAR. And it works well with Nvidia, AMD and Intel GPUs, as it follows the industry standard.
So you don't need to change CPUs to enable SAM/ReBAR.
 

GymWolf

Member
Well, according to HUB's 12-game average at 4K, the 6950 XT is 12% faster than the 6900 XT and firmly over 60 FPS. Do with that what you will. And I'm sure you did your math; I was just letting you know your grim choices based on your preferences and what you're looking for. The TL;DR for your outcome is that it's gonna be pricey in Euro land any way you slice it, unfortunately. And used parts aren't that bad, but I feel you. Haha

Good luck with whatever you decide and end up getting. Make sure to show us your masterpiece!
Can you link me this HUB 12-game comparison video?

Thanks.
 

b0uncyfr0

Member
I'll get the 6700 XT as soon as my phone notifies me about my paycheck, and probably get the 7700 XT and sell this one once it comes out...
Too much hassle. I'm coming from a 3770K and Intel onboard GPU; another few months is nothing. Plus I have an X in the meantime.
 

b0uncyfr0

Member
Dude, 2012...
Time for a completely new PC at this point. There is nothing to recycle from a 10-year-old computer.
I beg to differ.

An FT02 - the best air-cooling case of all time. Still unbeaten, or damn near close to the top
A Noctua D14
A Silverstone PSU - not quite sure about this one
A few NVMe drives
 

PeteBull

Member
Do we have definite proof about whether AMD SAM works for gaming or not?

Like, how much of a performance gain are we talking about when switching from a 13600K to an AMD counterpart in the same price range?
Depends on the particular game and on what CPU you're jumping to, but usually maybe an additional 5-10% if you go AMD+AMD (of course, if you downgrade your CPU from a 13600K to a much weaker-tier AMD CPU, you can't expect any gains in CPU-limited scenarios).
Some games like fast RAM, some just like high IPC/frequency, so it's never a 1-to-1 scenario where you can be sure you're gonna get that 5-10%; it can even vary depending on whether you play the game or just run its built-in benchmark, which usually isn't CPU-heavy (with some exceptions, of course).
The best way to see the performance of a particular CPU+GPU combo is to check YouTube vids that show the settings/resolution and an MSI Afterburner overlay with all the details of how the particular game runs, so you can see how it was tested. Dry FPS numbers from benchmarks can vary a lot - some outlets test GPUs in CPU-bottlenecked scenarios, some outlets test CPUs by running in-game benchmarks, hence the big variation in scores.
 

GymWolf

Member
Depends on the particular game and on what CPU you're jumping to, but usually maybe an additional 5-10% if you go AMD+AMD (of course, if you downgrade your CPU from a 13600K to a much weaker-tier AMD CPU, you can't expect any gains in CPU-limited scenarios).
Some games like fast RAM, some just like high IPC/frequency, so it's never a 1-to-1 scenario where you can be sure you're gonna get that 5-10%; it can even vary depending on whether you play the game or just run its built-in benchmark, which usually isn't CPU-heavy (with some exceptions, of course).
The best way to see the performance of a particular CPU+GPU combo is to check YouTube vids that show the settings/resolution and an MSI Afterburner overlay with all the details of how the particular game runs, so you can see how it was tested. Dry FPS numbers from benchmarks can vary a lot - some outlets test GPUs in CPU-bottlenecked scenarios, some outlets test CPUs by running in-game benchmarks, hence the big variation in scores.
Winjer told me that you can do kinda the same thing with an Intel CPU and a thing called ReBAR or something.

Not gonna switch to an AMD CPU if I can get similar results with Intel.
 

winjer

Gold Member
Winjer told me that you can do kinda the same thing with an Intel CPU and a thing called ReBAR or something.

Not gonna switch to an AMD CPU if I can get similar results with Intel.

SAM is just a fancy name that AMD gave to a feature that exists in the PCI specs (Resizable BAR).
AMD does get the credit for being the first to enable it and for paving the way for Intel and Nvidia.

Both modern AMD and Intel motherboards have support for this tech. It's not exclusive to AMD.
To enable it, all it takes is going into the BIOS and enabling Above 4G Decoding and Resizable BAR support.
Then boot into Windows, open the control panel of your GPU, be it AMD, Intel or Nvidia, and enable SAM/ReBAR.

You don't need to switch systems to enable this feature.

EDIT: here is a video showing how to enable SAM/ReBAR on an Intel platform with an AMD GPU.
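If you want to double-check that the bigger BAR actually took effect (and not just the BIOS toggle), one rough way on a Linux box is to look at the GPU's PCI memory region sizes: with Resizable BAR/SAM active, one prefetchable BAR is typically the card's full VRAM size (e.g. 16G) instead of the legacy 256M window. Below is a minimal sketch, assuming lspci is installed and run with enough privileges to show region sizes; the helper name is just for illustration:

```python
# Hedged sketch: list GPU PCI BAR sizes from `lspci -vv` output on Linux.
# A BAR close to the card's full VRAM (e.g. 16G) usually means Resizable BAR/SAM
# is active; a 256M window usually means it is not. May need sudo to see sizes.
import re
import subprocess

def gpu_bar_sizes():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    sizes_by_gpu = {}
    for device in out.split("\n\n"):  # lspci separates devices with blank lines
        if "VGA compatible controller" in device or "3D controller" in device:
            name = device.splitlines()[0]
            # Lines look like: "Region 0: Memory at ... (64-bit, prefetchable) [size=16G]"
            sizes_by_gpu[name] = re.findall(
                r"Region \d+: Memory at .*\[size=([0-9]+[KMGT])\]", device)
    return sizes_by_gpu

if __name__ == "__main__":
    for gpu, sizes in gpu_bar_sizes().items():
        print(gpu)
        print("  BAR sizes:", ", ".join(sizes) if sizes else "none visible (try sudo)")
```

On Windows, the Radeon/Nvidia control panels and GPU-Z report the same status without any scripting; this is only for people who want to verify it from the command line.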
 

GymWolf

Member
I have a 7900 XT coming Friday. Seems to be right at the price/performance level I was looking for.
If I may ask, why are you buying a 7900 XT over a 6950 XT? Better ray tracing?

I'm seeing some benchmarks and the difference is kinda small...
 

GreatnessRD

Member
I'm 99% close to buying a 6950 for around 800-850 euros, but there is a PowerColor 7900 XT for less than 1100 euros...
You can't go wrong with either choice. I'm cheap, so I'd always say go 6950 XT, even though the 7900 XTX is like 30% better for 18% more monies. HOWEVER, if you're gonna keep the system for a long time, I might bite the bullet and grab the 7900 XTX. History has shown that AMD's drivers get better with time, especially with the 6000 series of GPUs. The ray tracing is a lot better on RDNA 3, however; it's Ampere level, and that's another thing to look at. So I guess my final answer would be: if you can afford the 7900 XTX and won't miss the savings, grab it. If you're pushing to fit it in your budget, grab the 6950 XT and have a good time.
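To make that value math concrete, here is a rough back-of-the-envelope sketch. The relative performance figures and euro prices are just the numbers floated in this thread (the "18% more monies" only really holds at US MSRP; the euro prices quoted above make the gap bigger), so treat them as assumptions rather than benchmarks:

```python
# Back-of-the-envelope value comparison using figures quoted in this thread.
# Relative performance and euro street prices are assumptions from the posts
# above (not measured results); swap in your own local quotes.
cards = {
    # name: (relative 4K performance vs the 6950 XT, rough street price in EUR)
    "6950 XT":  (1.00,  850),
    "7900 XT":  (1.12, 1100),   # "like 10-15% more powerful" per the thread
    "7900 XTX": (1.30, 1500),   # "like 30% better" per the thread
}

base_perf, base_price = cards["6950 XT"]
for name, (perf, price) in cards.items():
    print(f"{name:9s}  perf {(perf / base_perf - 1):+.0%} vs 6950 XT, "
          f"price {(price / base_price - 1):+.0%}, "
          f"perf per 100 EUR = {100 * perf / price:.3f}")
```

On those assumed numbers the 6950 XT still wins on raw performance per euro, which is essentially the point being made here; whether the newer architecture, RT and VRAM are worth the premium is the separate question.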
 

GymWolf

Member
You can't go wrong with either choice. I'm cheap, so I'd always say go 6950 XT, even though the 7900 XTX is like 30% better for 18% more monies. HOWEVER, if you're gonna keep the system for a long time, I might bite the bullet and grab the 7900 XTX. History has shown that AMD's drivers get better with time, especially with the 6000 series of GPUs. The ray tracing is a lot better on RDNA 3, however; it's Ampere level, and that's another thing to look at. So I guess my final answer would be: if you can afford the 7900 XTX and won't miss the savings, grab it. If you're pushing to fit it in your budget, grab the 6950 XT and have a good time.
My limit was 1000 euros, but I could go the extra 100 euros for something noticeably more powerful. From the benchmarks I'm seeing, the 7900 XT is like 10-15% more powerful, but sometimes even less.
 

GreatnessRD

Member
My limit was 1000 euros, but I could go the extra 100 euros for something noticeably more powerful. From the benchmarks I'm seeing, the 7900 XT is like 10-15% more powerful, but sometimes even less.
I just realized you said the 7900 XT. I read your post as the XTX, because I'm like, that thing is closer to 30% better, lol. Absolutely DO NOT waste money on the 7900 XT. It's either the 7900 XTX or nothing. And if that's too far out of range, the 6950 XT is a no-brainer. Please don't waste your money on the 7900 XT.
 

GymWolf

Member
I just realized you said the 7900 XT. I read your post as the XTX, because I'm like, that thing is closer to 30% better, lol. Absolutely DO NOT waste money on the 7900 XT. It's either the 7900 XTX or nothing. And if that's too far out of range, the 6950 XT is a no-brainer. Please don't waste your money on the 7900 XT.
Well, the 7900 XT is a newer card with more modern tech, better RT, FSR 3, more VRAM, and the drivers are probably gonna switch focus from the 6000 series to the 7000 series pretty soon. Also, the one I found is PowerColor branded, basically one of the best AMD third parties, not some shitty brand like Manli or Gainward, so less than 1100 is kind of a good/great deal...

It's like a 250 euro difference for a modern card over the older flagship card.

I don't think the 6950 is gonna be enough for the heaviest titles in the next 2-3 years.
UE5 uses Lumen, which is a form of ray tracing; I don't wanna be limited by shitty RT performance, directly or indirectly.

I could just wait, but I really wanted to make the build before Feb 2023.
 

RoboFu

One of the green rats
If I may ask, why are you buying a 7900 XT over a 6950 XT? Better ray tracing?

I'm seeing some benchmarks and the difference is kinda small...
Idk what benchmarks you are looking at.

The hardware differences are big, and the 7900 will definitely improve with driver updates over time. Plus it's only a $100-200 difference at most places I shop online. 🤷‍♂️
 

GHG

Member
If I may ask, why are you buying a 7900 XT over a 6950 XT? Better ray tracing?

I'm seeing some benchmarks and the difference is kinda small...

The 7900 XT is a better buy than the 6950 XT.

Better performance, less power consumption, less heat.

People need to stop being swayed by the narratives that form around certain products online. The 7900 XT is only a "bad" buy if you have the extra cash for the 7900 XTX and don't make the step up. However, if the 7900 XT is the best-performing GPU you can get within budget, then go for it.
 

GreatnessRD

Member
Well, the 7900 XT is a newer card with more modern tech, better RT, FSR 3, more VRAM, and the drivers are probably gonna switch focus from the 6000 series to the 7000 series pretty soon. Also, the one I found is PowerColor branded, basically one of the best AMD third parties, not some shitty brand like Manli or Gainward, so less than 1100 is kind of a good/great deal...

It's like a 250 euro difference for a modern card over the older flagship card.

I don't think the 6950 is gonna be enough for the heaviest titles in the next 2-3 years.
UE5 uses Lumen, which is a form of ray tracing; I don't wanna be limited by shitty RT performance, directly or indirectly.

I could just wait, but I really wanted to make the build before Feb 2023.
Do what you feel is best, friend. I just wouldn't do it for the cost. The 7900 XT, in my opinion, is not worth the extra $250 over the 6950 XT.
The 7900 XT is a better buy than the 6950 XT.

Better performance, less power consumption, less heat.

People need to stop being swayed by the narratives that form around certain products online. The 7900 XT is only a "bad" buy if you have the extra cash for the 7900 XTX and don't make the step up. However, if the 7900 XT is the best-performing GPU you can get within budget, then go for it.
10% better performance, at best 15% in a select game or two, isn't worth the extra $250. Hell, he might be able to overclock the 6950 XT and close the gap even more, another 3 or 4%. The 7900 XT is bad because the value is bad, plain and simple. Yes, it's a solid card, no doubt about it, but with other options out there, you'd have to be a psychopath to buy it at its current price. Given folks bought 3090 Tis at upwards of $2500-3000, those folks are out there.
 

RoboFu

One of the green rats
Do what you feel is best, friend. I just wouldn't do it for the cost. The 7900 XT, in my opinion, is not worth the extra $250 over the 6950 XT.

10% better performance, at best 15% in a select game or two, isn't worth the extra $250. Hell, he might be able to overclock the 6950 XT and close the gap even more, another 3 or 4%. The 7900 XT is bad because the value is bad, plain and simple. Yes, it's a solid card, no doubt about it, but with other options out there, you'd have to be a psychopath to buy it at its current price. Given folks bought 3090 Tis at upwards of $2500-3000, those folks are out there.
nah
 

winjer

Gold Member
Idk what benchmarks you are looking at.

The hardware differences are big, and the 7900 will definitely improve with driver updates over time. Plus it's only a $100-200 difference at most places I shop online. 🤷‍♂️

Does this guy even have the hardware he claims he is testing?
Anyone can forge the MSI Afterburner stats, including the frame rate.
Unfortunately, there are too many channels on YouTube that falsify results.

For example, in the video you posted, they claim a difference of around 40-50% while playing CP2077.
But people who we can verify have the cards report much lower percentages.
Hardware Unboxed noted a 20% difference at 4K, Guru3D noted 21%, and Gamers Nexus 29%.
 

GHG

Member
Do what you feel is best, friend. I just wouldn't do it for the cost. The 7900 XT, in my opinion, is not worth the extra $250 over the 6950 XT.

10% better performance, at best 15% in a select game or two, isn't worth the extra $250. Hell, he might be able to overclock the 6950 XT and close the gap even more, another 3 or 4%. The 7900 XT is bad because the value is bad, plain and simple. Yes, it's a solid card, no doubt about it, but with other options out there, you'd have to be a psychopath to buy it at its current price. Given folks bought 3090 Tis at upwards of $2500-3000, those folks are out there.

Overclocking should never be factored into a GPU purchasing decision; it's a complete lottery.

Purchasing an older, less performant, more power-hungry card for similar money, just because of feelings about the pricing situation we find ourselves in, is something a "psychopath" would do.
 