
RX 7900XTX and 7900XT review thread

winjer

Gold Member
Overclocking should never be factored into a GPU purchasing decision, it's a complete lottery.

Purchasing an older, less performant, more power hungry card for similar money just because of feelings around the pricing situation we find ourselves in is something a "psychopath" would do.

In this case it's not that simple.
The performance difference between the 7900XT and 6950XT is around 15% on average. A bit more in RT games.
But he can get a 6950XT with 2 free games for 870€, while a 7900XT goes for 1090€.
The price difference does not justify the performance difference. Even if he were to OC the 7900XT, the 6950XT would still be the better value.
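The value argument above is just arithmetic on the quoted figures; a quick sketch makes it explicit (prices and the ~15% uplift are taken from this thread, with the 6950XT treated as the 1.0x performance baseline):

```python
# Quick price/performance check using the figures quoted above.
# Assumes the 6950 XT as the 1.0x performance baseline and a ~15%
# average uplift for the 7900 XT, per the posts in this thread.

def value_per_euro(relative_perf: float, price_eur: float) -> float:
    """Relative performance delivered per euro spent."""
    return relative_perf / price_eur

rx_6950xt = value_per_euro(1.00, 870)   # 870 EUR, baseline performance
rx_7900xt = value_per_euro(1.15, 1090)  # 1090 EUR, ~15% faster

# The 7900 XT costs ~25% more for ~15% more performance,
# so its perf-per-euro comes out lower than the 6950 XT's.
print(f"6950 XT: {rx_6950xt:.5f} perf/EUR")
print(f"7900 XT: {rx_7900xt:.5f} perf/EUR")
print(f"7900 XT price premium: {(1090 / 870 - 1) * 100:.1f}%")
```

A ~25% price premium for a ~15% performance gain is the gap the post is pointing at; the free bundled games only widen it.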
 

GreatnessRD

Member
Overclocking should never be factored into a GPU purchasing decision, it's a complete lottery.

Purchasing an older, less performant, more power hungry card for similar money just because of feelings around the pricing situation we find ourselves in is something a "psychopath" would do.
Fair enough. We'll agree to disagree on this one. I just don't see the juice being worth the squeeze in this case.
 

thuGG_pl

Member
Does this guy even have the hardware he claims he is testing?
Anyone can forge the MSI Afterburner stats, including frame rate.
Unfortunately, there have been too many channels on YouTube that falsify results.

For example, in that video you posted, they claim a difference of around 40-50% while playing CP2077.
But reviewers who verifiably have the cards report much lower numbers:
Hardware Unboxed noted a 20% difference at 4K, Guru3d noted 21%, and Gamers Nexus 29%.
Such videos are usually bullshit.
 

RoboFu

One of the green rats
Does this guy even have the hardware he claims he is testing?
Anyone can forge the MSI Afterburner stats, including frame rate.
Unfortunately, there have been too many channels on YouTube that falsify results.

For example, in that video you posted, they claim a difference of around 40-50% while playing CP2077.
But reviewers who verifiably have the cards report much lower numbers:
Hardware Unboxed noted a 20% difference at 4K, Guru3d noted 21%, and Gamers Nexus 29%.
Here's another. Most games show a 10-20 fps advantage on the 7900 XT. That's a lot, really, and that's with out-of-the-gate drivers that have a few known issues. It's already known that undervolting can get you to near-stock XTX levels if you want to mess around with it. But I play most games at 1440p, so I'm just going to run it stock.






Plus, if you compare prices, there really isn't any reason to get a 6950 with lesser hardware and less RAM. It just doesn't make any sense. Maybe if you find a sale somewhere that gives you a $400 price difference? But from what I've seen, the 6950 is usually stuck at scalper prices, and the majority of 7900 XTs actually cost less.

https://www.newegg.com/p/pl?N=100007709 601403917

https://www.newegg.com/p/pl?d=amd+radeon™+rx+7900+xt
 
Last edited:

winjer

Gold Member
Here's another. Most games show a 10-20 fps advantage on the 7900 XT. That's a lot, really, and that's with out-of-the-gate drivers that have a few known issues. It's already known that undervolting can get you to near-stock XTX levels if you want to mess around with it. But I play most games at 1440p, so I'm just going to run it stock.






Plus, if you compare prices, there really isn't any reason to get a 6950 with lesser hardware and less RAM. It just doesn't make any sense. Maybe if you find a sale somewhere that gives you a $400 price difference? But from what I've seen, the 6950 is usually stuck at scalper prices, and the majority of 7900 XTs actually cost less.

https://www.newegg.com/p/pl?N=100007709 601403917

https://www.newegg.com/p/pl?d=amd+radeon™+rx+7900+xt


Another? Seriously, try posting benchmarks from verified, reliable sites that actually have the hardware.

And just so you know, the person asking for advice on which of these cards to buy is in Italy. So the prices you posted mean nothing.
 

GreatnessRD

Member
Lol, I didn't want to cause a fight over my GPU choice :lollipop_grinning_sweat:

:pie_gsquint:
 

RoboFu

One of the green rats
Another? Seriously, try posting benchmarks from verified, reliable sites that actually have the hardware.

And just so you know, the person asking for advice on which of these cards to buy is in Italy. So the prices you posted mean nothing.
You are focusing on the people pointing out the very few outliers, like Forza Horizon 5, which a lot of people think is just down to the known driver issues. Most games, even on beta drivers benched 9+ days ago on all the big tech YT channels, still show a 10-20 fps advantage in the majority of cases. Hell, some F1 game beat a 4090 on the XTX.

But going by your criteria, we can't take any comparison for real; it all changes as time goes on. Can we take beta drivers from 9+ days ago at face value, even from an outlet like Linus or Gamers Nexus? Not really. I'd rather take an aggregate of many comparisons and reviews to make a more informed decision, and they all point to the 7900 XT being all-around better, even in its infancy, than a 6950 at around the same price.
 
Last edited:

GymWolf

Gold Member
Idk what benchmarks you are looking at.



The hardware differences are big and the 7900 will definitely improve with its drivers with time. Plus it’s only a $100 - $200 difference at most places I shop at online. 🤷‍♂️

I saw a lot of benchmarks and the difference varied from 5 to 20 frames, 10-15 on average.

Unfortunately I live in Europe and, as we speak, there is around a 250 euro difference between the two cards, and the 6950 is sold by a more trustworthy seller (not super important, but still a factor to consider).

I can wait maybe another 30-50 days max, but I fear losing these deals on the 6950/7900XT...
 

GHG

Gold Member
GymWolf, you will be able to see some reputable benchmarks for 4K gaming and thermals/power consumption etc. here:


They have both the 6950xt and 7900xt in their comparisons.
 

RoboFu

One of the green rats
I saw a lot of benchmarks and the difference varied from 5 to 20 frames, 10-15 on average.

Unfortunately I live in Europe and, as we speak, there is around a 250 euro difference between the two cards, and the 6950 is sold by a more trustworthy seller (not super important, but still a factor to consider).

I can wait maybe another 30-50 days max, but I fear losing these deals on the 6950/7900XT...

Well, the gist of it all is that even a 3070 will probably last someone to the end of this console gen if you don't play at 4K. A 6950 will take you further if you can get one at a good price. The RT in the 7000 series is a lot better, if that's what you are looking for, but even then a 3080 Ti may be enough.
 

GymWolf

Gold Member
Well, the gist of it all is that even a 3070 will probably last someone to the end of this console gen if you don't play at 4K. A 6950 will take you further if you can get one at a good price. The RT in the 7000 series is a lot better, if that's what you are looking for, but even then a 3080 Ti may be enough.
I'm aiming for 4K60 with limited to no RT, and some settings turned down a notch or two.

My fear is for future UE5 games that use Lumen, a form of ray tracing.

I watched a benchmark of the famous UE5 demo with the rocks and the fairy chick, and it does around 70 frames average at 1440p on a 6950XT...
A full game with that engine and graphical prowess is going to perform much, much worse... I think?

I know that tech demo math doesn't always work this way, but still, it doesn't bode well for 4K60 at all...
 
Last edited:

GHG

Gold Member
I don't know how anyone can recommend a 6xxx series card in good conscience for 4K gaming when it's well documented that they are bandwidth-starved. For 1440p and below they are a great buy, but at 4K those cards are going to be chugging hard in a couple of years.
 
Last edited:

winjer

Gold Member
You are focusing on the people pointing out the very few outliers, like Forza Horizon 5, which a lot of people think is just down to the known driver issues. Most games, even on beta drivers benched 9+ days ago on all the big tech YT channels, still show a 10-20 fps advantage in the majority of cases. Hell, some F1 game beat a 4090 on the XTX.

But going by your criteria, we can't take any comparison for real; it all changes as time goes on. Can we take beta drivers from 9+ days ago at face value, even from an outlet like Linus or Gamers Nexus? Not really. I'd rather take an aggregate of many comparisons and reviews to make a more informed decision, and they all point to the 7900 XT being all-around better, even in its infancy, than a 6950 at around the same price.

I'm focusing on results from real reviewers.
HU, for example, showed a 12% performance difference at 4K.

For the prices the user has in his country, the difference is 220 euros, plus another 120 euros' worth of games.
For such a limited performance difference, it's not worth it.
 

winjer

Gold Member
I don't know how anyone can recommend a 6xxx series card in good conscience for 4K gaming when it's well documented that they are bandwidth-starved. For 1440p and below they are a great buy, but at 4K those cards are going to be chugging hard in a couple of years.

One of the things the 6950XT improves over the 6900XT is memory clock speed, along with Infinity Cache speed.
Look at Guru3d's Cyberpunk 2077 results with RT enabled at 4K.
There is a 21.7% difference between the 7900XT and the 6950XT, but an even bigger difference, 27.7%, between the 6950XT and the 6900XT.
The 7900XT is the faster of the two cards, but the 6950XT has the better price/performance.

 

GymWolf

Gold Member
I don't know how anyone can recommend a 6xxx series card in good conscience for 4K gaming when it's well documented that they are bandwidth-starved. For 1440p and below they are a great buy, but at 4K those cards are going to be chugging hard in a couple of years.
I watched many benchmarks; the best 6950 models stay comfortably in the 70-90+ frame range at 4K except for games with heavy RT, and way higher in stuff like COD, Apex, etc. I don't know how the fuck this is possible when on paper the card is only 4% better than a 6900, but here we are...

I only need 60 frames, and when I turn stuff like shadows, SSR, occlusion and other unimportant settings down a notch or two, I'll gain another 10-20 frames at the very worst (in games where graphics options actually work). That's 30 to 50 frames over what I need, so yeah, the best 6000 series card can be considered a 4K GPU FOR NOW.

But yeah, future games are what's holding me back from buying a 6950. I'm not sure games made for consoles with literally half the power are going to stress that GPU that much, but anything is possible.

It would have been very useful if a lot of PS5 ports were already on PC, so we could see how they run... For now we can only speculate.
 
Last edited:
Does this guy even have the hardware he claims he is testing?
Anyone can forge the MSI Afterburner stats, including frame rate.
Unfortunately, there have been too many channels on YouTube that falsify results.

For example, in that video you posted, they claim a difference of around 40-50% while playing CP2077.
But reviewers who verifiably have the cards report much lower numbers:
Hardware Unboxed noted a 20% difference at 4K, Guru3d noted 21%, and Gamers Nexus 29%.
Pretty sure his benchmarks are fake.
 
Last edited:

GymWolf

Gold Member
GymWolf, you will be able to see some reputable benchmarks for 4K gaming and thermals/power consumption etc. here:


They have both the 6950xt and 7900xt in their comparisons.
In games with HEAVY RT they are both trash; without RT they are both capable at 4K.

It doesn't look like the better RT in the 7900XT makes much of a difference, really; they are both way below 60 frames at 4K.
 
Last edited:

RoboFu

One of the green rats
In games with HEAVY RT they are both trash; without RT they are both capable at 4K.

It doesn't look like the better RT in the 7900XT makes much of a difference, really; they are both way below 60 frames at 4K.
They are not trash; they are still better than the last-gen 3000 cards, barring maybe a 3090 Ti. Are they as good as the $1,200 or $1,600 4080 and 4090 at RT? No, but I wouldn't call them trash.
 

GymWolf

Gold Member
They are not trash; they are still better than the last-gen 3000 cards, barring maybe a 3090 Ti. Are they as good as the $1,200 or $1,600 4080 and 4090 at RT? No, but I wouldn't call them trash.
Under 30 frames at 4K for 1000+ euros is no bueno; maybe not trash, but surely not good either.
 

RoboFu

One of the green rats

He says his title was clickbait in the video, lol.
But no, none of these cards are going to be at MSRPs from 4 years ago (a 2080 was $700 for the reference model and $800+ for AIBs). Unfortunately, today we have massive inflation and, on top of that, crazy scalping.
So $200 more than what these YouTubers THINK it should be isn't that bad, really.
 

GreatnessRD

Member
@GreatnessRD and GHG working to get "poor GPU purchasing decisions" added as a symptom of psychopathy in the next DSM. :messenger_beaming:
Nah, lol. GHG is my guy. We both mean well even if we disagree this go-around. I can even understand his point about being on a newer architecture, but when the 6950 XT is literally within spitting distance of the 7900 XT, spending that extra $200 just doesn't seem sound to me. As I told gym, if you're gonna go 7900 series, it has to be 7900 XTX or nothing until prices drop. Then it would make sense for the 7900 XT. But poor GPU purchasing decisions should be added as a symptom of psychopathic ways. In real life you had people buying 3050s while 6600s were stronger and CHEAPER, just because of that Nvidia badge.
 

GymWolf

Gold Member
Nah, lol. GHG is my guy. We both mean well even if we disagree this go-around. I can even understand his point about being on a newer architecture, but when the 6950 XT is literally within spitting distance of the 7900 XT, spending that extra $200 just doesn't seem sound to me. As I told gym, if you're gonna go 7900 series, it has to be 7900 XTX or nothing until prices drop. Then it would make sense for the 7900 XT. But poor GPU purchasing decisions should be added as a symptom of psychopathic ways. In real life you had people buying 3050s while 6600s were stronger and CHEAPER, just because of that Nvidia badge.
I'm a bit of a psycho, yeah; right now I'm considering a 7900XT or a 4080...

Curiously enough, they both cost less than a damn vanilla 3090; don't ask me how that's possible...
 
Last edited:

Buggy Loop

Member
Here is an aggregate of a bunch of reviews, with percentage comparisons across several GPUs.

An interesting result I see for the 4080 and 4090 series is that they are sensitive to the CPU. Benchers who crank their CPUs to the max see the 4080 surpassing the 7900XTX, sometimes even getting close to an OC'd 7900XTX, while AMD seems less affected by lower-tier CPUs. I wonder why.
 

Kataploom

Gold Member
SAM is just a fancy name that AMD gave to a function that exists in the PCI specs.
AMD does get the credit for being the first to enable it and paving the way for Intel and NVidia.

Both modern AMD and Intel motherboards support this tech. It's not exclusive to AMD.
To enable it, all it takes is going into the BIOS and enabling Above 4G Decoding and Resizable BAR support.
Then boot into Windows, open the control panel of your GPU, be it AMD, Intel or NVidia, and enable SAM/ReBAR.

You don't need to switch system to enable this feature.

EDIT: here is a video showing how to enable SAM/REBAR on an Intel platform and AMD GPU.

I finally got myself a 6700XT and had to go through this, but first I had to update the BIOS (if the option doesn't appear despite having the right CPU + GPU combo, that may be the solution)... My question is: why is it not enabled by default? I don't see the benefit of having it disabled...
 

winjer

Gold Member
An interesting result I see for the 4080 and 4090 series is that they are sensitive to the CPU. Benchers who crank their CPUs to the max see the 4080 surpassing the 7900XTX, sometimes even getting close to an OC'd 7900XTX, while AMD seems less affected by lower-tier CPUs. I wonder why.

AMD has a more advanced and complex frontend on their GPUs, which reduces driver overhead by doing more of the work in hardware.
NVidia has a simpler frontend, so it relies more on the CPU.
 

//DEVIL//

Member
This channel is fake bullshit.

If you want a real channel with similar content, "Testing Games" is the one.

Why is it so hard for people to get this?

I will repeat this as a PSA one more time: 99% of videos like this (or from channels called Mark or Mike or Mike Joe or Joker, or whatever) with footage like this, where the screen is split into 2/3/4 views showing different video cards, are FAKE.

Again, let me repeat that: FAKE. FAKE.

You can actually run a 4090 and name it a 7900 XT in the overlay.

You would think that someone who has the time to make benchmark videos like this would at least have his own real channel where he talks about performance and such, like the rest of the normal YouTubers.

Please stop posting misleading YouTube videos. If you want to bring something, get a screenshot from a trusted reviewer like Hardware Unboxed, Linus, JayzTwoCents or Gamers Nexus.

This is not aimed at you, Denton, directly, but at anyone who posts stuff like this.
 

GHG

Gold Member
Yeah, the XTX is like 1400 to 1800.

The 4080 is cheaper, strangely enough, at least the MSI Ventus 3 I found...

Prices are completely fucked up in Europe.

Yep, the XTX is more expensive than the 4080 here in the Middle East as well. Not sure what the distributors for AMD's AIBs are thinking.

Is Newegg's international store not an option for you?

Nah, lol. GHG is my guy. We both mean well even if we disagree this go-around. I can even understand his point about being on a newer architecture, but when the 6950 XT is literally within spitting distance of the 7900 XT, spending that extra $200 just doesn't seem sound to me. As I told gym, if you're gonna go 7900 series, it has to be 7900 XTX or nothing until prices drop. Then it would make sense for the 7900 XT. But poor GPU purchasing decisions should be added as a symptom of psychopathic ways. In real life you had people buying 3050s while 6600s were stronger and CHEAPER, just because of that Nvidia badge.

I think sometimes it's difficult to get a perspective on what each other are seeing, given the wild regional differences in prices at the moment. New 3090s are still listed for more than the 4080 here. Make it make sense.

At this point, my advice for anyone shopping for a new GPU is to either avoid it entirely or just get the best-performing card you can for your budget, regardless of whether it's a "good value" buy or not. Bottom line: if you're going to use it and enjoy it every day for years, then that's where the value is.
 
Last edited:

//DEVIL//

Member
Yep, the XTX is more expensive than the 4080 here in the Middle East as well. Not sure what the distributors for AMD's AIBs are thinking.

Is Newegg's international store not an option for you?
Anyone who buys a 7900XTX over a 4080, if they are at the same price or the 4080 is slightly more expensive, needs a whack.

The 4080 is the better card overall, even just in terms of noise and heat. And DLSS and frame generation alone (even if you want to argue frame generation isn't great, you'd be wrong, and there's always a frame generation 2.0 to fix some of the issues) make the 4080 a much better card, especially for the real next-gen games coming next year that will be very demanding at 4K.

That's not to say the 4080's price is attractive, but if I'm forced to pick between two bowls of shit, I'll take the Nvidia card.

IMO, either pick up a cheap 6900XT or a 3080/3090 for $500 or so, or go all the way to a 4090 at MSRP. The rest in the middle is really garbage: just overpriced new plastic that is not worth the money for the performance gain. At least the 4090 is a halo product and almost, if not always, twice the performance of a 3080 (which is still an amazing, great card).
 

GymWolf

Gold Member
Yep, the XTX is more expensive than the 4080 here in the Middle East as well. Not sure what the distributors for AMD's AIBs are thinking.

Is Newegg's international store not an option for you?

I think sometimes it's difficult to get a perspective on what each other are seeing, given the wild regional differences in prices at the moment. New 3090s are still listed for more than the 4080 here. Make it make sense.

At this point, my advice for anyone shopping for a new GPU is to either avoid it entirely or just get the best-performing card you can for your budget, regardless of whether it's a "good value" buy or not. Bottom line: if you're going to use it and enjoy it every day for years, then that's where the value is.
Customs duties on a product shipped from America would erase the advantage of the lower starting price.

If it's just to save a small amount, I prefer to buy from a much closer location.
 
Last edited:
In this case it's not that simple.
The performance difference between the 7900XT and 6950XT is around 15% on average. A bit more in RT games.
But he can get a 6950XT with 2 free games for 870€, while a 7900XT goes for 1090€.
The price difference does not justify the performance difference. Even if he were to OC the 7900XT, the 6950XT would still be the better value.
Value is subjective. If your only metric is frames per dollar, then sure, the 6950 is the better value. But if you see value as "the best performance within a given budget" and your budget can afford a 7900 XT, then get the 7900 XT.
 

GymWolf

Gold Member
I'm reading dreadful things about AMD drivers on Reddit; are they really that bad compared to Nvidia's?

If there's one thing a noob like me wants to avoid, it's driver problems that I don't know how to identify or solve.

Even if Nvidia had some problems with their drivers, I never noticed anything, so the flaws can't have been that bad, I guess; I don't think I ever had to roll back once...

Like, I heard you have to disable fast boot in Windows because you can get a driver reset message or some such?

P.S. I can't get over the fact that a 6950 only does 70 frames average in Spider-Man: Miles Morales WITHOUT RT...

Aren't most Sony ports decently optimized?

At least the other two are not THAT far behind...

And none of them achieve 60 in Cyberpunk without RT... Jesus, that game is in a league of its own...
 
Last edited:

winjer

Gold Member
I'm reading dreadful things about AMD drivers on Reddit; are they really that bad compared to Nvidia's?

No, they are not. In the last few months NVidia probably had more issues than AMD, mostly texture corruption in some games like FH5, and issues with GeForce Experience.
But recently they have been mostly on par.

If there's one thing a noob like me wants to avoid, it's driver problems that I don't know how to identify or solve.

Just use DDU when going from AMD to NVidia. Aside from that, there is nothing special about it.
And if you do run into problems, a tech-centric forum like Guru3d can help a lot, be it for AMD or NVidia.

Like, I heard you have to disable fast boot in Windows because you can get a driver reset message or some such?

I have fast boot enabled with no issues on the 6800XT.

P.S. I can't get over the fact that a 6950 only does 70 frames average in Spider-Man: Miles Morales WITHOUT RT...

Aren't most Sony ports decently optimized?

At least the other two are not THAT far behind...

And none of them achieve 60 in Cyberpunk without RT... Jesus, that game is in a league of its own...


That's why there are things like DLSS and FSR.
But CP2077 has a few settings that are just dumb. For example, SSR at Psycho level is heavier than RT reflections at medium settings, and SSR at Psycho doesn't even look that good.
Unfortunately, some games have settings pushed too high, costing performance while adding little to no improvement in image quality.
Digital Foundry, Hardware Unboxed and a few other tech channels usually do an optimized settings guide to avoid losing performance on features that don't improve image quality.
 

GymWolf

Gold Member
No, they are not. In the last few months NVidia probably had more issues than AMD, mostly texture corruption in some games like FH5, and issues with GeForce Experience.
But recently they have been mostly on par.



Just use DDU when going from AMD to NVidia. Aside from that, there is nothing special about it.
And if you do run into problems, a tech-centric forum like Guru3d can help a lot, be it for AMD or NVidia.



I have fast boot enabled with no issues on the 6800XT.



That's why there are things like DLSS and FSR.
But CP2077 has a few settings that are just dumb. For example, SSR at Psycho level is heavier than RT reflections at medium settings, and SSR at Psycho doesn't even look that good.
Unfortunately, some games have settings pushed too high, costing performance while adding little to no improvement in image quality.
Digital Foundry, Hardware Unboxed and a few other tech channels usually do an optimized settings guide to avoid losing performance on features that don't improve image quality.
Yeah, when I wrote the post I completely forgot about FSR and some ultra settings being virtually useless. I have no problem turning some stuff down a notch or two; I'm not fixated on ultra settings everywhere. I literally started watching FSR benchmarks after I wrote the post; it averages 75 frames at 4K + RT with FSR 2.1 Quality, if I remember correctly.

Is it true that most games still have the 1.0 version of FSR?

Unfortunately I'm reading a lot of stuff and I can't tell what is true or not.
 
Last edited:

Crayon

Member
I want to throw in my two cents
Yeah, when I wrote the post I completely forgot about FSR and some ultra settings being virtually useless. I have no problem turning some stuff down a notch or two; I'm not fixated on ultra settings everywhere. I literally started watching FSR benchmarks after I wrote the post; it averages 75 frames at 4K + RT with FSR 2.1 Quality, if I remember correctly.

Is it true that most games still have the 1.0 version of FSR?

Unfortunately I'm reading a lot of stuff and I can't tell what is true or not.

FSR 1 is super easy to implement and can actually be forced universally. FSR 2 is way more interesting and effective, so I don't think developers would opt for 1 over 2; they'd support either or both on their own merits. FSR 1 is probably better for older cards, so that would be a reason to include both. FSR 2 looks like it could be pretty common going forward, but it's still relatively new, so there are more games with 1 at the moment.
 

GymWolf

Gold Member
I want to throw in my two cents


FSR 1 is super easy to implement and can actually be forced universally. FSR 2 is way more interesting and effective, so I don't think developers would opt for 1 over 2; they'd support either or both on their own merits. FSR 1 is probably better for older cards, so that would be a reason to include both. FSR 2 looks like it could be pretty common going forward, but it's still relatively new, so there are more games with 1 at the moment.
Can I update the FSR version like I do with DLSS, just by copying a file into the game folder?

P.S. In theory, FSR 2 should be the one for older cards and very common going forward, and FSR 3 for the new ones and rarer.

From what I've seen, FSR 1 is pretty terrible, like DLSS 1 was.
 
Last edited:

winjer

Gold Member
Yeah, when I wrote the post I completely forgot about FSR and some ultra settings being virtually useless. I have no problem turning some stuff down a notch or two; I'm not fixated on ultra settings everywhere. I literally started watching FSR benchmarks after I wrote the post; it averages 75 frames at 4K + RT with FSR 2.1 Quality, if I remember correctly.

Is it true that most games still have the 1.0 version of FSR?

Unfortunately I'm reading a lot of stuff and I can't tell what is true or not.

Yes, there are still more games with FSR1 than with FSR2. But consider a few things:
There are games that have both FSR1 and FSR2 implemented.
There are already 101 games with FSR2 implemented, and more to come.
There is a mod that lets users replace DLSS2 with FSR2. It does not work with every game, but it works with a lot of them, significantly increasing the number of games with FSR2.

FSR2 and DLSS are not the only upscalers on the market. UE4 games have TAAU; some expose it as an option, but even in those that don't, TAAU can be enabled with a couple of commands copied into the Engine.ini file.
UE5 has TSR, which is very similar in quality and performance to FSR 2.1.
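For UE4 games that don't expose TAAU in their menus, the usual trick is adding console variables to the per-game Engine.ini mentioned above. A minimal sketch; the 67% screen percentage is just an illustrative value, roughly the internal resolution of 4K "quality" upscaling modes:

```ini
[SystemSettings]
; Enable UE4's temporal upsampling (TAAU) instead of plain spatial upscaling
r.TemporalAA.Upsampling=1
; Render at 67% of output resolution per axis, then temporally upscale
r.ScreenPercentage=67
```

The user-writable Engine.ini usually lives under the game's Saved\Config\WindowsNoEditor\ folder; whether a given game honors these cvars varies by title.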
 