
Since late 2023, AMD has been looking seriously poor in GPU performance compared to its Nvidia equivalents; RDNA 2 in particular is aging badly.

Mister Wolf

Member
HL, RT reflections, shadows and AO



CP, RT shadows and reflections



Hitman, RT reflections



In those games the difference is as big as in Control (or as little, depending on opinion). Fortnite is using hardware Lumen. The biggest performance differences are in games that have an Nvidia sponsor logo; they may be (more) Nvidia-optimized.


None of those games feature RTGI.
 

OverHeat

« generous god »
Fortnite has hardware lumen and performs very close.

RTGI was invented by Nv, it isn't optimized for AMD by design.

[image]
 

DaGwaphics

Member
It is interesting to look back at cards that once performed similarly and see which ones stood the test of time. But maybe I'm just nerdy that way.
 

Zathalus

Member
HL, RT reflections, shadows and AO



CP, RT shadows and reflections



Hitman, RT reflections



In those games the difference is as big as in Control (or as little, depending on opinion). Fortnite is using hardware Lumen. The biggest performance differences are in games that have an Nvidia sponsor logo; they may be (more) Nvidia-optimized.

Hogwarts Legacy performs quite a bit better on AMD GPUs, but as soon as you switch RT on, Nvidia dominates. The same goes for Hitman 3.

The 7900 XTX is quite a bit faster than the 4080 in both games without RT, and when RT is switched on its performance drops to well under it, so the relative RT performance difference is much greater than that graph would indicate. Instead of being a 10% drop, it's actually over 30%.
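
A quick back-of-the-envelope sketch of that point, with purely hypothetical frame rates (not taken from any actual benchmark): a card can trail by well under 20% on an RT chart and still have paid a 30%+ relative RT cost if it started out ahead in raster.

```python
# Back-of-the-envelope: how much more performance card A loses from enabling RT
# than card B does. All fps values below are hypothetical placeholders.

def relative_rt_cost(raster_a, rt_a, raster_b, rt_b):
    """Extra fraction of its performance card A loses with RT on, versus card B."""
    hit_a = rt_a / raster_a   # fraction of raster performance A keeps with RT on
    hit_b = rt_b / raster_b   # same for card B
    return 1 - hit_a / hit_b

# Card A leads in raster but trails once RT is enabled (hypothetical numbers).
raster_a, rt_a = 120, 54      # card A: 120 fps raster, 54 fps with RT
raster_b, rt_b = 100, 65      # card B: 100 fps raster, 65 fps with RT

print(f"A raster lead:   {raster_a / raster_b - 1:+.0%}")   # +20%
print(f"A RT deficit:    {rt_a / rt_b - 1:+.0%}")           # -17%
print(f"A extra RT cost: {relative_rt_cost(raster_a, rt_a, raster_b, rt_b):.0%}")  # 31%
```

So a card that shows up "only" ~17% behind on the RT graph has, in this hypothetical, actually lost roughly 31% more of its performance to RT than the other card, which is the gap a raw RT chart hides.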
 

Buggy Loop

Member
A company that makes affordable graphics cards is less powerful than a graphics card that costs 3x as much, more news at 12

That's not the reality of when RDNA 2 released. A -$50 MSRP for a unicorn reference card that had super low quantities, which AMD even wanted to stop producing, probably because their MSRP was bullshit and they had super low margins at that price. Then the AIB versions were more expensive than Nvidia's from the same board partners. Not realizing this makes me think you really didn't look for a GPU back in 2020.
 

00_Zer0

Member
The dawn of ray tracing has screwed up and skewed the results for AMD badly; however, it's still almost not worth it to own a ray-tracing-enabled card, even for most Nvidia card users. This mostly has to do with mid-tier and budget-friendly cards.

Sometimes it's because of the performance hit it takes, even on expensive graphics cards that are only 1 to 2 years old. Other times it's how developers implement ray tracing in their games. Sometimes it looks good, other times it looks like crap.
Sometimes it runs well, other times it runs like crap.

That's too big of a gamble for mid-tier graphics card users to justify upgrading from the cards they use now. Since these are still the wild west days of ray tracing, I don't think it will be worth owning a mid-tier or budget-friendly ray-tracing-enabled graphics card for one or two more generations.
 

Buggy Loop

Member
The difference in RT is mostly overblown. Radeon cards are much slower with PT, but in standard RT calculations the difference is relatively small; it depends on how heavy the RT is and how "Nvidia sponsored" the game is:

[image: RT benchmark comparison]

It also depends on when that Far Cry 6 benchmark was taken and whether it was under this famous *sponsored* bug that also crawled its way into early Watch Dogs and almost every Ubisoft crap with RT.

[image: Far Cry 6 reflection quality comparison]


For the same settings, AMD had lower resolution reflections. How much of this shit slid by from reviewers who just run benchmarks for numbers and don't really look?

But yeah, if you don't stress RT, that's AMD's best use case: in-line ray tracing with the fewest dynamic shaders. While Nvidia is slower at lighter effects due to the ASIC nature of its RT cores, it can handle much more stressful effects.
 

Buggy Loop

Member
Fortnite has hardware lumen and performs very close.

RTGI was invented by Nv, it isn't optimized for AMD by design.

RTGI is PR naming, but fundamentally it's not doing anything "invented" by or exclusive to Nvidia. It's maths.

The DXR consortium had AMD, Nvidia & Microsoft in the same room to implement it at the API level. When a developer calls a DXR function, it's up to the card to figure out what the function calls for. It's agnostic. Nvidia was ahead at the time because DXR was basically Nvidia's proof of concept at SIGGRAPH 2017, but AMD co-developed it during the consortium. They saw Turing available a generation before their own implementation and still managed to make a worse solution than the Turing RT core on a per-CU and per-clock basis.

By your definition, nothing is optimized for AMD by design. In-line ray tracing works best on AMD? Who do you think also co-developed it, shipped the first in-line ray tracing games with DXR 1.1 support, and refers to it in their developer RT guidance as the best practice (if dynamic shaders are kept low)? Nvidia; the games were Minecraft RTX & Metro Exodus EE. Guess who runs Metro Exodus EE very well even though it was optimized on Nvidia? AMD.

Can't blame Nvidia for this shit. AMD doesn't optimize for an agnostic API they helped create and then acts like a victim? Tough fucking luck.
You sleep in class, you get shit results.

[image]


They also managed to blame "Nvidia optimization" for Portal RTX because of the shit performance they had initially, when that performance had nothing to do with the RT calls.


AMD's compiler decided to make a ridiculous 99ms ubershader without Portal RTX calling for it. 🤷‍♂️
I guess Nvidia is also to blame for AMD's drunk compiler.
 

Bojji

Member
RTGI is PR naming, but fundamentally it's not doing anything "invented" by or exclusive to Nvidia. It's maths.

The DXR consortium had AMD, Nvidia & Microsoft in the same room to implement it at the API level. When a developer calls a DXR function, it's up to the card to figure out what the function calls for. It's agnostic. Nvidia was ahead at the time because DXR was basically Nvidia's proof of concept at SIGGRAPH 2017, but AMD co-developed it during the consortium. They saw Turing available a generation before their own implementation and still managed to make a worse solution than the Turing RT core on a per-CU and per-clock basis.

By your definition, nothing is optimized for AMD by design. In-line ray tracing works best on AMD? Who do you think also co-developed it, shipped the first in-line ray tracing games with DXR 1.1 support, and refers to it in their developer RT guidance as the best practice (if dynamic shaders are kept low)? Nvidia; the games were Minecraft RTX & Metro Exodus EE. Guess who runs Metro Exodus EE very well even though it was optimized on Nvidia? AMD.

Can't blame Nvidia for this shit. AMD doesn't optimize for an agnostic API they helped create and then acts like a victim? Tough fucking luck.
You sleep in class, you get shit results.

[image]


They also managed to blame "Nvidia optimization" for Portal RTX because of the shit performance they had initially, when that performance had nothing to do with the RT calls.


AMD's compiler decided to make a ridiculous 99ms ubershader without Portal RTX calling for it. 🤷‍♂️
I guess Nvidia is also to blame for AMD's drunk compiler.

I was talking about RTXGI; I meant Nvidia's version of it, and some games are using it.

Of course AMD is worse in RT and no one is denying that, but not all of the difference comes from just that. Some games were even released without RT working on AMD cards at launch, which shows how much developers cared.

Here is one of the most technically advanced games around, Avatar, which has RTGI and other RT effects running all the time:

[images: Avatar benchmark charts]


A 12-16% difference. Of course Nvidia is faster, but it isn't mind-blowing. Some people here said that you can't even use RT on AMD at a playable framerate; that's the kind of hyperbole that spreads misinformation.
 

Shut0wen

Member
The issue is that they are becoming less competitive even on the price/performance basis, which was usually their strong point.
AMD has got such a reputation for being the cheap man's chip that at this point, is it even worth it? Samsung tried it against Apple; now look at their phones, nowhere near as competitive with Apple now (this is coming from a Samsung user). Even if AMD made a graphics card that was better than Nvidia's, Nvidia cucks would still come up with an excuse that it's shit. Pure facts.
 

Shut0wen

Member
That's not the reality of when RDNA 2 released. A -$50 MSRP for a unicorn reference card that had super low quantities, which AMD even wanted to stop producing, probably because their MSRP was bullshit and they had super low margins at that price. Then the AIB versions were more expensive than Nvidia's from the same board partners. Not realizing this makes me think you really didn't look for a GPU back in 2020.
I gave up on PCs years ago, but the tech Nvidia offers to consumers is overpriced. They literally adopted the Apple strategy years ago and people lap it up.
 

poppabk

Cheeks Spread for Digital Only Future
Are we looking at the same data? My 6900 XT pretty much outperforms the $100 more expensive 3070 in everything. The Nvidia cards from that generation that should outperform it in RT are often just as shit, if not more so, because they lack the VRAM.
 

Gaiff

SBI’s Resident Gaslighter
HL, RT reflections, shadows and AO
Goes from winning by 14% to losing by 11% with RT at 4K. That's a huge difference.
CP, RT shadows and reflections


RT in this game is shit, and the game still runs like ass and is a stuttering mess even today. One of the worst ports of 2022, so congrats, AMD.
Hitman, RT reflections



In those games the difference is as big as in Control (or as little, depending on opinion). Fortnite is using hardware Lumen. The biggest performance differences are in games that have an Nvidia sponsor logo; they may be (more) Nvidia-optimized.

Goes from winning by 17% to losing by 13% in 4K.

2 of the 3 games you cited see the 7900 XTX lose over 25% of its performance relative to the 4080 when enabling RT, and you're seriously going to tell me it's alright? The 7900 XTX has a massive advantage in those games to begin with, so the fact that it ends up trailing by over 10% in both is seriously damning.

In Alan Wake 2, the 4080 murks it by 62% with just ray tracing at 1440p. The 7900 XTX is 38% slower in Rift Apart. 27% slower in Dying Light 2. There are many more examples of games with great RT where the 7900 XTX loses horribly than the other way around.
 
RT and DLSS don't keep me dazzled like they do others, so I could switch to AMD, but the true Nvidia killer app keeping me around is DLDSR.

I was talking about RTXGI; I meant Nvidia's version of it, and some games are using it.

Of course AMD is worse in RT and no one is denying that, but not all of the difference comes from just that. Some games were even released without RT working on AMD cards at launch, which shows how much developers cared.

Here is one of the most technically advanced games around, Avatar, which has RTGI and other RT effects running all the time:

[images: Avatar benchmark charts]


A 12-16% difference. Of course Nvidia is faster, but it isn't mind-blowing. Some people here said that you can't even use RT on AMD at a playable framerate; that's the kind of hyperbole that spreads misinformation.

Everyone always laughs at AMD for being "a generation behind in RT", unironically typed out on their RTX 3060/3070/3080 PCs.
 

Danknugz

Member
The issue is that they are becoming less competitive even on the price/performance basis, which was usually their strong point.
Do people really think someone would write a whole article saying "haha AMD sucks, look at how much better the 4090 is than the <whatever small amd card is i wouldn't know the name>"?

I don't understand these kinds of takes; it's like people don't even think.
 

Buggy Loop

Member
I gave up on PCs years ago, but the tech Nvidia offers to consumers is overpriced. They literally adopted the Apple strategy years ago and people lap it up.

If Nvidia is overpriced, then AMD is overpriced too. We're far, far away from the ATI days of mid-range cards being hyper-competitive price-wise, and back then they were almost all on the same feature set. Nowadays it's a lot more skewed technology-wise.

DLSS's superiority, RT performance up to path tracing, easy reselling (it keeps its value) to professionals who want CUDA / OptiX / ML, better VR, more waterblock options (because it's the most sold), and much lower latency overall than AMD cards even without Reflex; even at nearly double the FPS at 1080p vs 2160p, AMD still has higher latency with Anti-Lag on. The drivers are better too. People will jump on this, but there's clear-as-day stuff that is unacceptable, such as my brother's 5700 XT black-screening with no solution for years. Or convincing me that AMD taking a year to fix RDNA 3's VR drivers, because it was performing worse than RDNA 2, is acceptable. Even an AMD driver guy, a year after release, thanked a random internet dude for generating graphs. Like, dude, what is your team doing? There are bugs on both sides, but that's just bad. Excusing this is not doing AMD any favors.

But you could just ignore all this to save a few bucks, why not; some people will just run their cards for pure rasterization and have no professional uses. But were you able to actually save money?

Back in 2020's GPU craze: impossible to get an AMD reference card (oh, I tried every drop for months). Impossible to get an Nvidia Founders Edition in Quebec (French law bullshit).
AIB vs AIB, Nvidia was cheaper. Asus TUF 6800 XT vs Asus TUF 3080, AMD was more expensive, and that was the cheapest AIB I can recall; let's not even go into derp territory with bullshit like the Red Devil/Dragon and so on.

Then add the above factors. Why would I pick AMD other than just getting lucky and scoring anything by beating the bots at any drop? My luck landed on Nvidia, because AMD effectively had a paper launch; even if they comically said they wouldn't, their Steam hardware survey percentage says otherwise.

So MSRP vs MSRP during RDNA 2 vs Ampere is a dead-end discussion. It was all bullshit. It can vary by location, but in Canada it was far easier to score cheaper Nvidia AIBs. I spent months on Discord warning people that drops were happening, and the ratio of people scoring Nvidia vs AMD was not even close.

After everything calmed down and people stopped chasing GPUs? The 6700 XT was the go-to card I recommended for mid-range builds for a long time, but that was effectively a generation later.

People freaked out when Nvidia basically kept Ampere pricing the same and extrapolated the perf/$ of Ada accordingly. Then AMD did the same. There's no savior in this fucked-up market; they're both fleecing us. Not calling AMD overpriced because of a usually fake MSRP comparison is delusional.
 
Intel will eventually overtake AMD in the GPU market.
The rumors are that RDNA 4 is not going to have a high-end card at all. If this is true, then Intel Arc Battlemage might end up being better than RDNA 4. Meanwhile, Nvidia will have sole control of the high end with the successor to Lovelace, reportedly in early 2025.
 

StereoVsn

Member
I should’ve made a thread when 7900XTX was beating a $300 more expensive 4080 in rasterization.

But I never did because what would be the point?

OP’s just trying to stir things up with nonsense.
I think there is now a much better case for the 4080 Super after the price drop, with the 7900 XTX around $920-930 on the low side.

Overall, I think the equivalent AMD card should be 10-15% less due to the functionality gap. A lot of newer games are a lot more liberal about RT use too.
 