
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

RX 7900 XTX has ~61 TFLOPS FP32 compute while RTX 4090 has ~82.6 TFLOPS FP32 compute. Modulate your expectations.
And considering everything that comes along with a 4090, if the XTX does well in real-world benchmarks it looks like I can finally get off Nvidia for a while.

I really want to see the numbers for the XTX vs my current card, a 3090.
 

LiquidMetal14

hide your water-based mammals
RX 7900 XTX has ~61 TFLOPS FP32 compute while RTX 4090 has ~82.6 TFLOPS FP32 compute. Modulate your expectations.
Pixel Rate: 443.5 GPixel/s | 479.6 GPixel/s*
Texture Rate: 1,290 GTexel/s | 1,395.2 GTexel/s*
FP16 (half) performance: 82.58 TFLOPS | 83.07 TFLOPS* (1:1)
FP32 (float) performance: 82.58 TFLOPS | 83.07 TFLOPS*
FP64 (double) performance: 1,290 GFLOPS | 1,298 GFLOPS* (1:64)

*Gigabyte Gaming OC 4090
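
For reference, a quick sketch of where those headline TFLOPS figures come from: shaders × clock × 2 FLOPs per FMA. The clocks below are approximate boost specs, so treat this as ballpark math only.

Code:
# Rough FP32 throughput estimate: TFLOPS = shaders * clock (GHz) * 2 FLOPs (FMA) / 1000.
# Clock figures are approximate published boost specs, not measured values.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000

print(f"RTX 4090:    ~{fp32_tflops(16_384, 2.52):.1f} TFLOPS")    # ~82.6
# RDNA 3 can dual-issue FP32, so the 7900 XTX's 6,144 shaders count double here.
print(f"RX 7900 XTX: ~{fp32_tflops(6_144 * 2, 2.5):.1f} TFLOPS")  # ~61.4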
 

DonkeyPunchJr

World’s Biggest Weeb
There won't be many major games releasing with RT support as a requirement in the coming years because …..
2. you need to purchase a 1200/1600 dollar card to even use it

You need a 1200/1600 card just to use ray tracing?

Come on man, you aren’t doing the Red Team any favors by going full retard like this.
 

Buggy Loop

Member
I doubt that AMD cards will be CPU limited at 4K in ray tracing, and it looks like the same CPU was used for the rasterization benchmarks. But I agree we should wait until the reviews.

https://www.hardwaretimes.com/amd-r...rown-with-potent-ray-tracing-cpu-performance/

There’s sadly a serious lack of the latest CPUs being benched for RT, and even fewer with RT at 4K. But at 1080p there’s a 20-25% advantage, and up to a 40% uplift in Hitman 3. Even if it’s only 6-7% at 4K (can’t find a bench), it’s still shady as fuck not to bench the two cards on the same platform. Results against the 6950 XT are worthless.
 

64bitmodels

Reverse groomer.
You need a 1200/1600 card just to use ray tracing?

Come on man, you aren’t doing the Red Team any favors by going full retard like this.
ok, you caught my bluff
but still, if you're the type of person still using a 1060 in 2022... it's safe to say i don't think you're going to be able to get hold of any RT-capable GPUs in the near future
 

benno

Member
They didn't state anything of the sort. In fact they avoided referencing Nvidia at all when it came to performance claims.
they stated "the world's most advanced gaming GPU, powering your gaming rigs for years to come"
So, they did state something of the sort, actually.

44m:15s
 

Pagusas

Elden Member
They didn't state anything of the sort. In fact they avoided referencing Nvidia at all when it came to performance claims.
Why are people upset comparing two flagship cards?

For many of us, money is no object, we want the best of the best, so comparing AMD's best offering to Nvidia's best offering makes complete sense.


A LOT of us would like to leave Nvidia, or at least see AMD give them an equal fight, like they have been doing with Intel recently, but they just can't seem to get all the way there. With the chiplet design, maybe gen 2 will see massive growth and get them back to the top. Nvidia has been standing alone at the top of the hill for too long. Still, they overdid it with the 4090 and made a dream card; I just wish AMD would do the same.
 

GHG

Member
they stated "the world's most advanced gaming GPU, powering your gaming rigs for years to come"
So, they did state something of the sort, actually.

44m:15s


"Advanced" != "powerful"

You originally said powerful.

Why are people upset comparing two flagship cards?

I don't have a problem with it, because the estimated/claimed performance figures are more than good enough for me. But the reality is that the 7900XTX is priced more similarly to the 4080 than to the 4090.
 

benno

Member
"Advanced" != "powerful"

You originally said powerful.
Sorry if my memory of a YouTube video from 17 hours ago isn't perfect.
"we've created the world's most advanced gaming GPU - but that doesn't mean it's powerful or can run games as well as other GPUs"
You're really stretching this, you know.
 

GHG

Member
Sorry if my memory of a YouTube video from 17 hours ago isn't perfect.
"we've created the world's most advanced gaming GPU - but that doesn't mean it's powerful or can run games as well as other GPUs"
You're really stretching this, you know.

No I'm not. It has features that are yet to appear on other vendors' graphics cards. They are allowed to advertise that.

Don't you have burning power cables to go and defend anyway?
 
Sorry if my memory of a YouTube video from 17 hours ago isn't perfect.
"we've created the world's most advanced gaming GPU - but that doesn't mean it's powerful or can run games as well as other GPUs"
You're really stretching this, you know.
I mean, technically speaking the XTX is more advanced; AMD just didn't push it to the edge of catching fire to compete with the 4090.

For its price point, we're talking maybe the 2nd most powerful GPU?

I think it's a great sweet spot.
 

benno

Member
No I'm not. It has features that are yet to appear on other vendors' graphics cards. They are allowed to advertise that.
yeah yeah. They use a new DisplayPort connector, so now it's the most advanced GPU. Sure.
Don't you have burning power cables to go and defend anyway?
You mean that other thread where it turns out I was correct?
As already mentioned, it seems you want to turn this into an us-vs-them thing when it isn't. Nothing is stopping me from putting an AMD card in my 2nd PC.
 

GHG

Member
yeah yeah. They use a new DisplayPort connector, so now it's the most advanced GPU. Sure.

And a chiplet design and new encoders, but let's ignore all of that and twist their words.

You mean that other thread where it turns out I was correct?
As already mentioned, it seems you want to turn this into an us-vs-them thing when it isn't. Nothing is stopping me from putting an AMD card in my 2nd PC.

Time will tell my friend, time will tell.

If there's nothing stopping you then there's no need for you to get upset about any of this.
 

DonkeyPunchJr

World’s Biggest Weeb
I think they said most powerful under $1000 and I am ok with that
Yup, the price is what makes this a good product. More specifically, the fact that they kept their flagship at $1000 while Nvidia is launching the 4080 at a freaking $1200.

If this had been a repeat of last gen (where AMD’s $1000 flagship 6900XT targeted the gap between the $700 3080 and the $1500 3090), it’d be a much different story.
 

benno

Member
If there's nothing stopping you then there's no need for you to get upset about any of this.
I'm not getting upset, you are. I replied to a question. You called me out as a liar. I pointed you to what I meant, and you decided to turn it into a wordplay argument and bring up shit from other threads I took part in.
 

Gaiff

SBI’s Resident Gaslighter
I updated my post with a graphic demonstrating the behavior I'm talking about:

comparison.jpg
This has nothing to do with AMD optimizations. Cyberpunk is a heavier RT workload than Metro Exodus, and the heavier the RT workload, the wider the gap, because in pure RT benchmarks Ampere is twice as fast as RDNA 2.
 
Based on what I'm gathering here from the comments:

-Rasterization performance close to the 4090 (5-10% less)
-RT performance close to or on par with the 3090 (but not gimped like RDNA2)
-Costs less than the 4080
-DisplayPort 2.1 support
-FSR 3.0 support coming soon (similar to DLSS 3.0)
-Updated FSR 2.2 support removing ghosting
-Updated streaming and video encoding
-Less power draw
-I'm sure you can get boost clocks close to 3 GHz

You are getting a pretty decent product. I don't think anyone here was delusional enough to hope for a 4090 killer, but it comes close in rasterization, and that is pretty damn impressive.

I kept my expectations in check, and AMD met them and even exceeded them. Looking at the overall picture and the value proposition for this card, I think AMD nailed it.

NVIDIA is in trouble, because this architecture, along with Intel's Arc, is trying to lock NVIDIA out through APUs and the SmartShift advantage of having the CPU and GPU from the same vendor, plus a decent price/performance ratio. Making RT a luxury NVIDIA exclusive, discrete and expensive while tanking performance, is just ridiculous, and Intel and AMD have to find a way to negate and dismantle that narrow approach.

1) How much role does the CPU have in RT performance?
2) Can you offset some RT calculations to the CPU by writing and modifying code?
3) Can there be hardware implementations of RT on the CPU side in future CPU products from AMD and Intel? Or is it only ever going to be locked to the GPU with its dedicated RT cores? RT is broad and I am sure there is more than one way to implement it.

Crytek made this using Vega in 2019?


Developers really have to make their RT implementations polished and software-driven in their game engines. Relying on NVIDIA's proprietary ray tracing cores seems lazy to me. I'm sure you can mimic or approximate "real-time RT" closely without tanking performance. I couldn't care less if it's baked in, as long as it's almost indistinguishable from real-time RT. It's like comparing a high-polygon model to a low-polygon model that look nearly identical because of tricks and other factors that create the illusion.
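
On the "can RT run without dedicated hardware" question: the core of ray tracing is vendor-neutral intersection math that can run on a CPU or in plain compute shaders (roughly what Crytek's software-RT demo did); dedicated RT cores accelerate the BVH traversal and triangle tests rather than doing anything unique. A toy sketch of that core test (Möller–Trumbore), purely for illustration:

Code:
# Toy CPU ray/triangle intersection (Moller-Trumbore), purely illustrative.
# Real renderers run millions of these per frame against a BVH; RT cores
# accelerate exactly this kind of test in hardware.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the hit distance t along the ray, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:          # ray is parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

# Ray fired along +Z at a triangle in the z=0 plane: hits at t = 1.0
print(ray_triangle((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0)))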
 

GHG

Member
I'm not getting upset, you are. I replied to a question. You called me out as a liar. I pointed you to what I meant, and you decided to turn it into a wordplay argument and bring up shit from other threads I took part in.

You're kidding me right?

The discussion here is whether the 7900XTX should be compared to the 4090 or to the more similarly priced 4080, and you claimed AMD stated it was the most powerful GPU, which is factually incorrect. You were then corrected, and you still doubled down despite it having been explained to you what they meant by their wording and why they chose those words.

If anyone here is turning this into a play on words, it's you, claiming people are stretching despite them interpreting AMD's claims correctly:

you're really stretching this you know.

All you've done since the start of this thread is talk about how the 4090 is more powerful. Everyone knows this; it's just that people looking to get a 7900 over a 4090 don't care, since the two are in different price categories.
 

DaGwaphics

Member
This has nothing to do with AMD optimizations. Cyberpunk is a heavier RT workload than Metro Exodus, and the heavier the RT workload, the wider the gap, because in pure RT benchmarks Ampere is twice as fast as RDNA 2.

LOL it has everything to do with optimizing around the RDNA2 consoles. That's why you see this behavior in UE5 as well.

I was referring to @Mister Wolf regarding future games requiring RT hardware to run properly (a position I was agreeing with). I was just pointing out that because the minimum base for these games will be the consoles, any RT or GI required for basic operation will be performant on AMD hardware, as you can see from the example I listed. RTGI is going to be standard for some upcoming games, but those games are not going to be unplayable on AMD hardware; on the contrary, they will be built first for AMD hardware (the consoles).
 

iQuasarLV

Member
I'm not getting upset, you are. I replied to a question. You called me out as a liar. I pointed you to what I meant, and you decided to turn it into a wordplay argument and bring up shit from other threads I took part in.
Just to lean in here with an 'actually'
We are using words to express our points; we are literally typing out our thoughts, and words require interpretation for others to understand what you are getting across. If you use the wrong words to express yourself, you only have yourself to blame when others start pulling the conversation off the course you intended.
 

Gaiff

SBI’s Resident Gaslighter
LOL it has everything to do with optimizing around the RDNA2 consoles. That's why you see this behavior in UE5 as well.

I was referring to @Mister Wolf regarding future games requiring RT hardware to run properly (a position I was agreeing with). I was just pointing out that because the minimum base for these games will be the consoles, any RT or GI required for basic operation will be performant on AMD hardware, as you can see from the example I listed. RTGI is going to be standard for some upcoming games, but those games are not going to be unplayable on AMD hardware; on the contrary, they will be built first for AMD hardware (the consoles).
No, it has fuck-all to do with what you're claiming, and I have no idea where you even got that information. The game is simply a much lighter workload than Cyberpunk 2077, which runs at a whopping 23 fps at Ultra/4K on a 3090.

Your claim that the game is better optimized for AMD hardware has no basis. It runs well on AMD because the RT effects simply aren't as heavy as Cyberpunk's. The moment you crank them up more, the gap widens. That's like saying a game is better optimized for weaker hardware because it runs at lower settings.
 

rnlval

Member
Pixel Rate: 443.5 GPixel/s | 479.6 GPixel/s*
Texture Rate: 1,290 GTexel/s | 1,395.2 GTexel/s*
FP16 (half) performance: 82.58 TFLOPS | 83.07 TFLOPS* (1:1)
FP32 (float) performance: 82.58 TFLOPS | 83.07 TFLOPS*
FP64 (double) performance: 1,290 GFLOPS | 1,298 GFLOPS* (1:64)

*Gigabyte Gaming OC 4090
Pixel and texture rates are rasterization metrics.
 

HoofHearted

Member
All you've done since the start of this thread is talk about how the 4090 is more powerful. Everyone knows this; it's just that people looking to get a 7900 over a 4090 don't care, since the two are in different price categories.
 

Kenpachii

Member
So basically RDNA2 all over again.

Can't compete with Nvidia's top card, sits around x80 level, but is a gen behind in RT.

The price is only good because Nvidia is asking way, way too much. It's easily solvable by Nvidia just dropping a 4080 Ti.
 

DaGwaphics

Member
No, it has fuck-all to do with what you're claiming, and I have no idea where you even got that information. The game is simply a much lighter workload than Cyberpunk 2077, which runs at a whopping 11 fps at Ultra on a 3090.

Your claim that the game is better optimized for AMD hardware has no basis. It runs well on AMD because the RT effects simply aren't as heavy as Cyberpunk's. The moment you crank them up more, the gap widens. That's like saying a game is better optimized for weaker hardware because it runs at lower settings.

You just explained the optimizations yourself. LOL

Because of the consoles, games that require RT/GI by default will need to run on lower-powered RDNA2 parts. The result, just like with F1, ME EE, and the UE5 demo, is that the RT won't be as heavy by design, which lowers the Nvidia advantage. The consoles are the optimization in this scenario because they limit what the base RT requirements can be.

Nvidia will continue to win big in superfluous modes that Nvidia pays devs to include to sell their GPUs, but when the tech is a required building block for new games, AMD will hang in there just fine.
 

Dr.D00p

Member
It's easily solvable by Nvidia just dropping a 4080 Ti.

??

A 4080 Ti is likely to cost a minimum of $1399, probably $1499 MSRP, and is probably still nearly a year away.

That's still a significant price difference that will keep many from going Nvidia.
 

Gaiff

SBI’s Resident Gaslighter
You just explained the optimizations yourself. LOL

Because of the consoles, games that require RT/GI by default will need to run on lower-powered RDNA2 parts. The result, just like with F1, ME EE, and the UE5 demo, is that the RT won't be as heavy by design, which lowers the Nvidia advantage. The consoles are the optimization in this scenario because they limit what the base RT requirements can be.

Nvidia will continue to win big in superfluous modes that Nvidia pays devs to include to sell their GPUs, but when the tech is a required building block for new games, AMD will hang in there just fine.
The games will work. They certainly won't hang in there just fine compared to NVIDIA. AMD will be woefully outclassed.
 

Rentahamster

Rodent Whores
The 7900XT probably has a $100 price reduction built into its launch pricing, just in case Nvidia responds by dropping the price of the 4080.

The 7900XTX will stay at $999, but dropping the 7900XT to $799 would make more sense... AMD just won't do it until they have to.
That sounds like a plausible strategy.
 

DonkeyPunchJr

World’s Biggest Weeb
LOL it has everything to do with optimizing around the RDNA2 consoles. That's why you see this behavior in UE5 as well.

I was referring to @Mister Wolf regarding future games requiring RT hardware to run properly (a position I was agreeing with). I was just pointing out that because the minimum base for these games will be the consoles, any RT or GI required for basic operation will be performant on AMD hardware, as you can see from the example I listed. RTGI is going to be standard for some upcoming games, but those games are not going to be unplayable on AMD hardware; on the contrary, they will be built first for AMD hardware (the consoles).
Do you have any source on this? “Any RT or GI required for basic operation” - what does that even mean? To a skeptic this just sounds like you’re saying “any console-quality RT can also run well on a 7900.” Well, duh.

It sounds to me like all this “optimized for AMD consoles” just means lower settings than what a PC is capable of (and the vast majority of PC games will allow higher RT settings, just like all the other settings), not that there’s some AMD-specific optimization that won’t run as well on RTX.
 

HoofHearted

Member
It makes me wonder what Nvidia is thinking right now in regards to the 4080

The Ti model better have DP 2.1 as well
Their only option at this point appears to be to reduce the price to remain competitive. Once they do, I wouldn't be surprised if AMD cuts the price of the XT by another $100-150 in response.

Watching the details from LTT's update above is proving even more interesting, given their testing results from their own reference card.
 

iQuasarLV

Member
Nvidia pricing, they can lower it any time they want.
Umm, not at the price of 5nm wafers and their monolithic die size. There is just no way they can lower the price without taking a huge hit on profits. The current 4000 series is pegged at around $200+ per die just for the wafer cuts, not to mention the added cost of covering wafer failures and a multitude of other logistical factors beyond R&D costs.
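
For a rough sense of where a per-die figure like that comes from, here's a back-of-envelope sketch. The ~$17,000 wafer price is an assumed, commonly reported estimate rather than a disclosed figure, and the dies-per-wafer formula is the standard approximation.

Code:
import math

# Back-of-envelope per-die cost for AD102 (RTX 4090).
# The wafer price is an assumed, commonly cited estimate, not an official figure.
WAFER_COST_USD = 17_000      # assumed TSMC "5 nm"-class wafer price
DIE_AREA_MM2 = 608           # AD102 die area
WAFER_DIAMETER_MM = 300

def gross_dies_per_wafer(die_area: float, diameter: float) -> float:
    # Standard approximation: wafer area / die area, minus edge losses.
    return (math.pi * (diameter / 2) ** 2 / die_area
            - math.pi * diameter / math.sqrt(2 * die_area))

dies = gross_dies_per_wafer(DIE_AREA_MM2, WAFER_DIAMETER_MM)
print(f"~{dies:.0f} gross dies per wafer, ~${WAFER_COST_USD / dies:.0f} per die before yield")
# ~89 candidate dies at roughly $190 each; add defect losses and binning and you
# land in the same ballpark as the $200+ figure above.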
 

DaGwaphics

Member
Do you have any source on this? “Any RT or GI required for basic operation” - what does that even mean? To a skeptic this just sounds like you’re saying “any console-quality RT can also run well on a 7900.” Well, duh.

It sounds to me like all this “optimized for AMD consoles” just means lower settings than what a PC is capable of (and the vast majority of PC games will allow higher RT settings, just like all the other settings), not that there’s some AMD-specific optimization that won’t run as well on RTX.

I'm talking about games that can't function without RT, like ME EE. Or that require it to function well, like the UE5 demo. Those games will have to be well optimized for the RDNA2 consoles, and the results so far have shown that those games scale a lot better on AMD hardware than games that don't support RT on console. Like I said, I was just commenting on the post of another user that was talking about UE5/Lumen and other RT/GI based games releasing in the future.
 