Since late 2023, AMD has been seriously underperforming Nvidia's equivalent GPUs, and RDNA 2 in particular is aging badly.

Avatar: Frontiers of Pandora


Alan Wake 2


Lords of the Fallen
https://www.techpowerup.com/review/lords-of-the-fallen-performance-benchmark/5.html
(UE5 has been shifting from favoring AMD to favoring Nvidia recently in AA titles)

Banishers: Ghosts of New Eden
(Again, a UE5 title)

Helldivers 2 (Nvidia is more than 30% faster than the AMD equivalent; the RTX 3080 Ti is competing with the RX 7900 XTX)



Palworld (Nvidia is more than 20% faster than the AMD equivalent)
 

DenchDeckard

Moderated wildly
Poor AMD. We really needed Intel to step up, but it looks like that's not happening either. Hopefully they can turn it around, because NVIDIA is more or less the only player now, and PC is starting to get leaps ahead of consoles again.
 

Bojji

Member
Avatar used RT.

Alan Wake 2 is very Nvidia-specific.

For UE5 games, most of them perform exactly how they should, or even favor AMD:

[benchmark charts]
 
Avatar used RT.

Alan Wake 2 is very Nvidia-specific.

For UE5 games, most of them perform exactly how they should, or even favor AMD:

[benchmark charts]
I think you did not read. I meant late 2023. Just search how many titles have launched in the past 4 months and check their performance. You are posting mid-2023 titles, and after some updates Nvidia is 20% faster than AMD in RoboCop.
 

Bojji

Member
I think you did not read. I meant late 2023. Just search how many titles have launched in the past 4 months and check their performance. You are posting mid-2023 titles, and after some updates Nvidia is 20% faster than AMD in RoboCop.

You need to show me some proof for RoboCop.

Here are the late 2023/early 2024 games you missed:

[benchmark charts]


The 3080 and 6800 XT are very close to each other, just like they always were. There are always games that favor AMD or Nvidia; this is nothing new.
 
You have to go further back than just 2023 to find the point when AMD fell behind and never caught up again.

The Pascal architecture (GTX 1000 series) was introduced in 2016, and AMD has never again had a GPU competitive in both performance and features since that one.
 

LiquidMetal14

hide your water-based mammals
Is this really a hot take worthy of a thread?

We know the perf vs cost.

I have a 4090 and can be modest enough to see the value in AMD cards.

I would say, where we're at today, if an equivalent RTX card is within $50-100 of the AMD variant, then I would consider Nvidia. The features are worth that much on the Nvidia side. Any more than that, all bets are off.
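As a toy illustration of that rule of thumb, a minimal sketch in Python (the $100 cutoff is just the figure from this post, and all prices are made-up examples, not market data):

# Hypothetical sketch of the "$50-100 Nvidia premium" rule of thumb above.
# Prices are invented examples, not current market data.
def worth_nvidia_premium(nvidia_price, amd_price, max_premium=100):
    # The RTX feature set (DLSS, RT, Reflex) is judged worth up to max_premium.
    return (nvidia_price - amd_price) <= max_premium

print(worth_nvidia_premium(600, 520))  # True: an $80 premium is within the cutoff
print(worth_nvidia_premium(750, 520))  # False: at $230 over, all bets are off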
 
You need to show me some proof for RoboCop.

Here are the late 2023/early 2024 games you missed:

[benchmark charts]

The 3080 and 6800 XT are very close to each other, just like they always were. There are always games that favor AMD or Nvidia; this is nothing new.


When you increase the resolution to 2K, Nvidia starts beating AMD; Ampere and Turing have an architecture issue that holds them back at lower resolutions, which the RTX 4xxx series avoids thanks to its large L2 cache.
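One common way to reason about why a big on-die cache changes the picture as resolution rises, as a minimal sketch (the hit rates and bandwidth figures below are purely illustrative placeholders, not measured values):

# Effective bandwidth when a large on-die cache sits in front of VRAM.
# All numbers are illustrative placeholders, not measurements.
def effective_bandwidth(vram_gbs, cache_gbs, hit_rate):
    # Blend cache and VRAM bandwidth by the fraction of accesses that hit cache.
    return hit_rate * cache_gbs + (1.0 - hit_rate) * vram_gbs

# Higher resolutions mean bigger working sets, so the hit rate drops
# and raw VRAM bandwidth matters more.
for res, hit in [("1080p", 0.75), ("1440p", 0.65), ("4K", 0.50)]:
    print(res, round(effective_bandwidth(512, 2000, hit)), "GB/s effective")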
 

Cyborg

Member
I mean, consoles should use Nvidia tech, as that would be more "future proof" in my humble opinion. AMD is so far behind in GPUs and the tech around them (like DLSS).
 
Is this really a hot take worthy of a thread?

We know the perf vs cost.

I have a 4090 and can be modest enough to see the value in AMD cards.

I would say, where we're at today, if an equivalent RTX card is within $50-100 of the AMD variant, then I would consider Nvidia. The features are worth that much on the Nvidia side. Any more than that, all bets are off.
I am not comparing the RTX 4090.

My main goal is to compare the RTX 3080 10 GB with the RX 6800 XT.

When it launched it used to be equal to the RX 6800 in raw performance; however, since mid-2023 it is often beating the RX 6800 XT, and in the fall 2023 releases it is beating even the RX 6950 XT.
 
You need to show me some proof for RoboCop.

Here are the late 2023/early 2024 games you missed:

[benchmark charts]

The 3080 and 6800 XT are very close to each other, just like they always were. There are always games that favor AMD or Nvidia; this is nothing new.
You're proving my point. Look at the RTX 3080 12 GB's performance, beating the RX 6900 XT or RX 6950 XT.
 

Bojji

Member


When you increase the resolution to 2K, Nvidia starts beating AMD; Ampere and Turing have an architecture issue that holds them back at lower resolutions, which the RTX 4xxx series avoids thanks to its large L2 cache.


[benchmark chart]


The 6800 XT and 3080 are pretty much on par here.

The 6800 XT loses in 4K, but this was always the case; the 6800 XT is limited by memory bandwidth (same goes for the 4070):

[benchmark chart]
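For reference, a quick sanity check on the paper-spec numbers behind that bandwidth argument (peak bandwidth = bus width / 8 × effective data rate, using the publicly listed specs):

# Peak VRAM bandwidth from public specs: bus width (bits) / 8 * data rate (Gbps).
def vram_bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print("RTX 3080 10 GB:", vram_bandwidth_gbs(320, 19), "GB/s")  # 760.0
print("RX 6800 XT:    ", vram_bandwidth_gbs(256, 16), "GB/s")  # 512.0
print("RTX 4070:      ", vram_bandwidth_gbs(192, 21), "GB/s")  # 504.0
# The 6800 XT and 4070 lean on large on-die caches (128 MB Infinity Cache,
# 36 MB L2) to offset the narrower bus, which helps less at 4K.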


I said performance AND features.

Try a ray tracing comparison instead to understand what I mean.

Also try comparing DLSS to FSR.

I know that, but raster is still (and will be for a few years) the most important factor; their cards are cheaper for the same performance and usually come with more VRAM.

You're proving my point. Look at the RTX 3080 12 GB's performance, beating the RX 6900 XT or RX 6950 XT.

There was always just a few percent of difference between the 6800 XT and 6900 XT, and between the 3080 and the 3080 12 GB; they trade blows with each other:

[benchmark chart]
 

yamaci17

Member
Do you think that's the whole story? FSR Quality at 1440p is near unusable, and downright unusable at 1080p.
1440p DLSS Ultra Performance, with a 480p input resolution, has better temporal stability and almost the same image clarity as FSR has with a 960p input resolution.
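For context, a minimal sketch of the input resolutions behind that comparison, using the standard published scale factors for both upscalers:

# Input height for a given output height and upscaler quality mode.
# Scale factors are the standard ones DLSS and FSR 2 ship with.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

def input_height(output_height, mode):
    return round(output_height * SCALE[mode])

print(input_height(1440, "Ultra Performance"))  # 480
print(input_height(1440, "Quality"))            # 960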



Aside from raw performance, you get better per-pixel treatment with DLSS than with FSR in cases where players use upscaling.

I'm not even going to delve into Reflex and how responsive games become with it, especially in GPU-bound scenarios.
 

winjer

Gold Member
[benchmark chart]

The 6800 XT and 3080 are pretty much on par here.

The 6800 XT loses in 4K, but this was always the case; the 6800 XT is limited by memory bandwidth (same goes for the 4070):

[benchmark chart]

I know that, but raster is still (and will be for a few years) the most important factor; their cards are cheaper for the same performance and usually come with more VRAM.

There was always just a few percent of difference between the 6800 XT and 6900 XT, and between the 3080 and the 3080 12 GB; they trade blows with each other:

[benchmark chart]

Don't feed the troll.
This guy is not interested in facts, as we could already see from his previous thread.
 

PaintTinJr

Member
Instead of lumping all games together in threads like this, can people break out the benchmarks for twitch/competitive shooters? Those matter disproportionately more to the average kid getting into PC gaming who's also planning on buying a 144Hz monitor.

My nephew sent me specs yesterday for a "more powerful" system than his own, with an 8 GB 4060 versus his current 6600 XT, and for the £1,100 he'd have to spend on the new system I told him he wasn't going to get value for money in the non-RTX, non-DLSS, native-resolution competitive shooters he plays.
 

Gaelyon

Gold Member
Ray tracing doesn't count because it makes AMD look bad.
Ray tracing is to graphics features what 8K is to resolutions: nice, but overly expensive. The cost of RT is disproportionate for the current graphics card generation, and it is one of the reasons we have such expensive cards.
Given the choice, I'd rather use a card with better rasterization, more RAM, and a better price per FPS. DLSS is better than FSR, though.
 

Bojji

Member
Ray tracing doesn't count because it makes AMD look bad.

RT still mostly makes very little difference; the only game where RT makes a massive difference is CP2077 in PT mode, but you need at least something around a 4070 Super to make it playable at 1440p with frame generation. Without frame gen it will be hard to achieve playable framerates on 30-series cards; there are mods for that now, but the quality of native DLSS 3 is superior.

Alan Wake 2's PT is mostly useless; the game already has good GI with its raster solution. What irritated me in this game was that they didn't let you use RT reflections without PT lighting. RT reflections actually make a good difference in this game, but PT lighting does not, and the performance impact of just RT reflections would be smaller. Of course, they had to add more performance impact to sell overpriced Nv cards, haha.

Of course this is subjective, and you may think there are many more games with impactful RT, but I have had a 3070, a 4070, and a 4070 Ti in the last few years and played most games with RT implementations, so I can judge.
 

LiquidMetal14

hide your water-based mammals
I am not comparing the RTX 4090.

My main goal is to compare the RTX 3080 10 GB with the RX 6800 XT.

When it launched it used to be equal to the RX 6800 in raw performance; however, since mid-2023 it is often beating the RX 6800 XT, and in the fall 2023 releases it is beating even the RX 6950 XT.
Let me tell you something
 

SantaC

Member
Same OP who made this thread:


He got dogpiled in that thread lol.
 
I know that, but raster is still (and will be for a few years) the most important factor; their cards are cheaper for the same performance and usually come with more VRAM.
Your argument is a bad one, because nobody spending $700-$1,500 on a video card is going to settle for turning off features like RT. I'm not going to buy a fucking 7900 XT or whatever and be like, OK, sure, I'll turn RT off and only play with raster in Cyberpunk 2077. How fucking retarded would I need to be to do that when I could just buy the 4080 instead and have RT and DLSS and a superior gaming experience?

Even in the $400-$500 range of video cards, I think most people would say yes, they prefer to play with RT when available. You have to get down to the $200-$300 range before RT is so non-performant on Nvidia cards that someone would be justified in turning it off entirely and playing with just raster.
 

Bojji

Member
Your argument is a bad one, because nobody spending $700-$1,500 on a video card is going to settle for turning off features like RT. I'm not going to buy a fucking 7900 XT or whatever and be like, OK, sure, I'll turn RT off and only play with raster in Cyberpunk 2077. How fucking retarded would I need to be to do that when I could just buy the 4080 instead and have RT and DLSS and a superior gaming experience?

Even in the $400-$500 range of video cards, I think most people would say yes, they prefer to play with RT when available. You have to get down to the $200-$300 range before RT is so non-performant on Nvidia cards that someone would be justified in turning it off entirely and playing with just raster.

The difference in RT is mostly overblown; Radeon cards are much slower with PT, but in standard RT calculations the difference is relatively small. It depends on how heavy the RT is and how much the game is "Nvidia sponsored":

[benchmark chart]
 

Zathalus

Member
AMD is basically dead to me until they offer a meaningful competitor to DLSS. I can take the reduced RT performance, but FSR can fuck right off.

Intel managed to release XeSS over 2 years ago, and FSR has seen almost zero improvements since its initial release; frankly, the software team at AMD comes off as incompetent.
 

Zathalus

Member
The difference in RT is mostly overblown; Radeon cards are much slower with PT, but in standard RT calculations the difference is relatively small. It depends on how heavy the RT is and how much the game is "Nvidia sponsored":

[benchmark chart]
RDNA3 can do alright if the RT is relatively light; in the games you listed, Far Cry 6's RT is basically unnoticeable, for example. It's with heavy RT implementations, such as Cyberpunk, Control, or Metro Exodus, that RDNA3 falls quite far behind.
 

MH3M3D

Member
Yeah... well, my 7800 XT was 100 euros cheaper than a regular 4070 here in the Netherlands at the time, and it blows away the 4070.
 

Gaiff

SBI’s Resident Gaslighter
The difference in RT is mostly overblown; Radeon cards are much slower with PT, but in standard RT calculations the difference is relatively small. It depends on how heavy the RT is and how much the game is "Nvidia sponsored":

[benchmark chart]
The problem is that the games where Radeon doesn't fall by the wayside have shit-tier RT. Far Cry 6 and Forza? You can't even tell it's there. Cyberpunk, R&C, Dying Light 2, or Alan Wake 2, though? Yeah, massive difference. When RT matters, AMD is usually much slower.
 

AGRacing

Member
The 4080 Super trades blows with the 7900 XTX and smokes it in ray tracing, and they're both around the same price.

EDIT: and the 4080 uses less power
True. But we are less than a month into the existence of the Super. I've had my 7900 XTX for over a year; it was cheaper than the non-Super 4080 I compared it against at that time and remained so all of 2023. It is faster at raster on average. Not as fast at RT, but it could trade blows with the 3090 Ti in many games.

AMD will eventually answer the 4080 Super challenge (maybe with a price drop, maybe a higher-clocked card, etc.), but if I were buying TODAY I'd go 4080 Super myself... but let's be real: the only reason the Super DOES exist and IS $999 is the 7900 XTX.
 

Bojji

Member
RDNA3 can do alright if the RT is relatively light; in the games you listed, Far Cry 6's RT is basically unnoticeable, for example. It's with heavy RT implementations, such as Cyberpunk, Control, or Metro Exodus, that RDNA3 falls quite far behind.

The problem is that the games where Radeon doesn't fall by the wayside have shit-tier RT. Far Cry 6 and Forza? You can't even tell it's there. Cyberpunk, R&C, Dying Light 2, or Alan Wake 2, though? Yeah, massive difference. When RT matters, AMD is usually much slower.

HL, RT reflections, shadows and AO

CP, RT shadows and reflections

Hitman, RT reflections



In those games the difference is as big as in Control (or as small, depending on your opinion); Fortnite is using hardware Lumen. The biggest performance differences are in games that carry the Nvidia sponsor logo; those may be (more) Nvidia-optimized.
 