
RX 7900XTX and 7900XT review thread

Topher

Gold Member
I am not rooting for either team. But to me the 4090 and the 4080 are high-end cards, while the 7900 XT is a mid-range card in terms of performance (not really mid-range, but you know what I mean). If I am going that route, then I would just go and buy a used 3090 for less, since I'd still have access to the same level of ray tracing as well as close performance, but cheaper.

Yeah, I get what you are saying. I doubt I'd buy a used card, though, with all the ones used for mining probably going on eBay, but I get your point.

So here is what I'm seeing right now as far as how the GPUs are laid out in the market.

$1600 4090
$1200 4080
$1099 3090 Ti
$999 7900 XTX
$899 7900 XT
$799 6950 XT
$679 6900 XT
$449 6750 XT

Of the top five, only the 4080, 3090 Ti, and 7900 XT are actually available to buy at MSRP.
 

//DEVIL//

Member
Yeah, I get what you are saying. I doubt I'd buy a used card, though, with all the ones used for mining probably going on eBay, but I get your point.

So here is what I'm seeing right now as far as how the GPUs are laid out in the market.

$1600 4090
$1200 4080
$1099 3090 Ti
$999 7900 XTX
$899 7900 XT
$799 6950 XT
$679 6900 XT
$449 6750 XT

Of the top five, only the 4080, 3090 Ti, and 7900 XT are actually available to buy at MSRP.
Anyone who buys any last-gen card at MSRP today needs a smack lol

Last-gen cards are for buying used (if clean) or out of the picture.
You can usually tell whether it came from a mining rig or not: less dusty fans, pics of the card in the seller's personal PC, box and receipt. These are pretty good indications that it was a personal-use card most of the time.

Anyway, congrats, you made the right call now (finally)
 

Buggy Loop

Member
How does the 7900 XTX handle raytracing?
About as good as a 3090 Ti/3080, which is to say far from the "shit" RT that some people claim.

All of those are ultra + psycho RT FSR quality 1440p

Corrected

It's less than a 3080 Ti/3090 at native, it's just barely above a 3080. At a gen difference with much more CUs.

Welp... I pulled a GymWolf and flip-flopped from a 7900 XT to an RTX 4080 FE. Found out Best Buy had a 10% off code, so I got the GPU for $1079.

For anyone wanting to get that deal, here is the code:

q4fy23save10

Where do you live again? Best Buy Canada or USA?
 

Topher

Gold Member
Corrected

It's less than a 3080 Ti/3090 at native, it's just barely above a 3080. At a gen difference with much more CUs.



Where do you live again? Best Buy Canada or USA?

USA

If that code doesn't work then try the Honey extension. It might find one that does. Worked for me today though.
 

SatansReverence

Hipster Princess
The XTX seems like a damn good card to me. If I were in the market for a new PC I'd absolutely consider it. I heard that it's a poor overclocker? Is that true?
Reference cards, maybe; 3x8-pin AIB cards offer markedly improved performance after overclocking, but as always that depends on the silicon lottery.
Corrected

It's less than a 3080 Ti/3090 at native, it's just barely above a 3080. At a gen difference with much more CUs.



Where do you live again? Best Buy Canada or USA?
You corrected nothing.



Doesn't even lose to a 3090 once, beats the 3090ti twice.

https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/34.html
[TechPowerUp ray tracing benchmark chart]

It trades blows with the 3090ti, a card that retailed for twice the MSRP 8 months ago.
 

Gaiff

SBI’s Resident Gaslighter
Reference cards, maybe; 3x8-pin AIB cards offer markedly improved performance after overclocking, but as always that depends on the silicon lottery.

You corrected nothing.



Doesn't even lose to a 3090 once, beats the 3090ti twice.

https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/34.html
[TechPowerUp ray tracing benchmark chart]

It trades blows with the 3090ti, a card that retailed for twice the MSRP 8 months ago.

That's correct. Techpowerup are kinda stupid as they used a 5800X for the 4090 review and with RT, it gets CPU-limited in some games. Here is the meta review from 3DCenter with over 7000 benchmarks gathered from the biggest review sites.

[3DCenter meta-review performance charts]


The 7900 XTX is functionally equal to a 3090 Ti in ray tracing.
 

MikeM

Member
Welp... I pulled a GymWolf and flip-flopped from a 7900 XT to an RTX 4080 FE. Found out Best Buy had a 10% off code, so I got the GPU for $1079.

For anyone wanting to get that deal, here is the code:

q4fy23save10
Congrats! I would have bought one, but in Canada they retail at $1,699, which is $450 more than what I paid for my 7900 XT before taxes. Would have been about $500 more including tax.
I am not rooting for either team. But to me the 4090 and the 4080 are high-end cards, while the 7900 XT is a mid-range card in terms of performance (not really mid-range, but you know what I mean). If I am going that route, then I would just go and buy a used 3090 for less, since I'd still have access to the same level of ray tracing as well as close performance, but cheaper.
The 7900 XT should have been named the 7800 XT. However, it is the fourth most powerful card on the market for raster.

I looked at 3080 Ti and up cards from Nvidia and they were either the same price or vastly more expensive than the 7900 XT in Canada. The 7900 XT was a no-brainer in that regard.
 

PaintTinJr

Member
That's correct. Techpowerup are kinda stupid as they used a 5800X for the 4090 review and with RT, it gets CPU-limited in some games. Here is the meta review from 3DCenter with over 7000 benchmarks gathered from the biggest review sites.

[3DCenter meta-review performance charts]


The 7900 XTX is functionally equal to a 3090 Ti in ray tracing.
In those conditions, aren't we just seeing those top cards limited by (the fundamentals of the architecture first, and then) the type of GDDR6/6X memory and the amount and arrangement of their GPU caches, rather than by compute?

AFAIK the Nvidia cards are all using GDDR6X versus the 7900 XT and XTX using GDDR6, and the Nvidia cards have an advantage in GPU cache. IMO, if the AMD cards just levelled up on those two things, the comparisons would do more to show how good the AMD design is at competing without the same reliance on dedicated RT silicon, which may not age so well with the advances in real-time RT techniques coming from console use in the coming years.
 

winjer

Gold Member
In those conditions, aren't we just seeing those top cards limited by (the fundamentals of the architecture first, and then) the type of GDDR6/6X memory and the amount and arrangement of their GPU caches, rather than by compute?

AFAIK the Nvidia cards are all using GDDR6X versus the 7900 XT and XTX using GDDR6, and the Nvidia cards have an advantage in GPU cache. IMO, if the AMD cards just levelled up on those two things, the comparisons would do more to show how good the AMD design is at competing without the same reliance on dedicated RT silicon, which may not age so well with the advances in real-time RT techniques coming from console use in the coming years.

A few things to consider. For one, GDDR6 has increased in speed quite a bit in the last couple of years, catching up to GDDR6X.
GDDR6X uses PAM4, resulting in more bits transferred per clock cycle, but it can't clock as high as GDDR6. GDDR6X also consumes significantly more power and emits more heat.
We also have to consider that the 7900 uses a 384-bit bus, not the 256-bit bus of RDNA2 high-end GPUs.
So the 7900 XTX has 960.0 GB/s and the 4090 has 1,008 GB/s of memory bandwidth. And the 4080 "only" has 716.8 GB/s.
Nvidia does have the advantage of better delta color compression.
But AMD has bigger caches. L1 is 256 KB per shader array; Nvidia has 128 KB per SM.
The 7900 XTX has only 6 MB of L2 cache, but also has 96 MB of L3 cache. The 4080 has 64 MB of L2 cache and no L3 cache.
So in terms of the memory subsystem, Nvidia and AMD are pretty close this time around.
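
(For reference, those bandwidth figures fall straight out of bus width times effective data rate. A minimal sketch, assuming 20 Gbps GDDR6 on the XTX and 21 / 22.4 Gbps GDDR6X on the 4090 / 4080; the helper name is made up.)

Code:
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RX 7900 XTX (GDDR6)": (384, 20.0),
    "RTX 4090 (GDDR6X)":   (384, 21.0),
    "RTX 4080 (GDDR6X)":   (256, 22.4),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.1f} GB/s")
# prints 960.0, 1008.0 and 716.8 GB/s respectively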
 

RoboFu

One of the green rats
I am not rooting for either team. But to me the 4090 and the 4080 are high-end cards, while the 7900 XT is a mid-range card in terms of performance (not really mid-range, but you know what I mean). If I am going that route, then I would just go and buy a used 3090 for less, since I'd still have access to the same level of ray tracing as well as close performance, but cheaper.
What!? You do know the XT beats the 4080 in more than a few games and ties it in the others?
The XT is the 4080's direct competition while the XTX sits above it.
 

Topher

Gold Member
What!? You do know the XT beats the 4080 in more than a few games and ties it in the others?
The XT is the 4080's direct competition while the XTX sits above it.

That's not what I've seen when researching GPUs. Typically, the XTX and the 4080 are neck and neck in non-RT benchmarks, and the XT is slower than both.

[benchmark comparison charts]





Having said that, it is the cheapest card of the new generation of GPUs so I think it is still a good option for those not wanting to spend over $1k.
 

Topher

Gold Member
That's correct. Techpowerup are kinda stupid as they used a 5800X for the 4090 review and with RT, it gets CPU-limited in some games. Here is the meta review from 3DCenter with over 7000 benchmarks gathered from the biggest review sites.

[3DCenter meta-review performance charts]


The 7900 XTX is functionally equal to a 3090 Ti in ray tracing.

That's good stuff. What site provides that data?
 

Gaiff

SBI’s Resident Gaslighter
What!? You do know the XT beats the 4080 in more than a few games and ties it in the others?
The XT is the 4080's direct competition while the XTX sits above it.
Not really? The XT might win in games that heavily favor AMD such as COD but otherwise it sits firmly below the 4080 which is a tad bit slower than the 7900 XTX in raster (like 2-3% overall) but significantly faster in RT (around 25%).

That's good stuff. What site provides that data?
https://www.3dcenter.org/artikel/launch-analyse-amd-radeon-rx-7900-xt-xtx
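
(Aside: meta-reviews like that one boil many per-game results down into a single relative index. A minimal sketch of the usual approach, a geometric mean of per-game fps ratios against a reference card; the fps numbers below are made-up placeholders, and 3DCenter's exact weighting may differ.)

Code:
from math import prod

def perf_index(fps_card: list[float], fps_ref: list[float]) -> float:
    """Geometric mean of per-game fps ratios vs a reference card (1.0 = parity)."""
    ratios = [c / r for c, r in zip(fps_card, fps_ref)]
    return prod(ratios) ** (1 / len(ratios))

# Made-up per-game fps, purely to show the mechanics
fps_7900xtx = [92.0, 61.0, 140.0, 48.0]
fps_3090ti  = [90.0, 65.0, 131.0, 50.0]   # reference card
print(f"7900 XTX vs 3090 Ti index: {perf_index(fps_7900xtx, fps_3090ti):.3f}")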
 

PaintTinJr

Member
A few things to consider. For one, GDDR6 has increased in speed quite a bit in the last couple of years, catching up to GDDR6X.
GDDR6X uses PAM4, resulting in more bits transferred per clock cycle, but it can't clock as high as GDDR6. GDDR6X also consumes significantly more power and emits more heat.
We also have to consider that the 7900 uses a 384-bit bus, not the 256-bit bus of RDNA2 high-end GPUs.
So the 7900 XTX has 960.0 GB/s and the 4090 has 1,008 GB/s of memory bandwidth. And the 4080 "only" has 716.8 GB/s.
Nvidia does have the advantage of better delta color compression.
But AMD has bigger caches. L1 is 256 KB per shader array; Nvidia has 128 KB per SM.
The 7900 XTX has only 6 MB of L2 cache, but also has 96 MB of L3 cache. The 4080 has 64 MB of L2 cache and no L3 cache.
So in terms of the memory subsystem, Nvidia and AMD are pretty close this time around.
Good summary comparison, thanks.

My take would be that the L2 advantage matters with RT on, and the L3 advantage with RT off. Because of the depth and volume of the calculations in RT, whether it's AMD's RT solution via compute and generic BVH acceleration or Nvidia's dedicated RT units, I suspect the workload hits the L1/L2 as a hotspot at a job level and negates any L3 latency hiding.

I also suspect the latency using PAM4 will be almost half that of PAM2 (assuming the additional latency to decode twice as many signal levels per transfer isn't 1-to-1 but only a fractional addition), which maybe allows Nvidia to drop the number of lines and still get a latency advantage over GDDR6 in RT workloads. I guess we'll see, once Samsung makes the competing GDDR6W product available at a good price (because AFAIK Micron and Nvidia have an exclusive partnership for GDDR6X, from what I read), whether AMD chooses that instead for their top-tier cards.
 

winjer

Gold Member
Good summary comparison, thanks.

My take would be that the L2 advantage matters with RT on, and the L3 advantage with RT off. Because of the depth and volume of the calculations in RT, whether it's AMD's RT solution via compute and generic BVH acceleration or Nvidia's dedicated RT units, I suspect the workload hits the L1/L2 as a hotspot at a job level and negates any L3 latency hiding.

I also suspect the latency using PAM4 will be almost half that of PAM2 (assuming the additional latency to decode twice as many signal levels per transfer isn't 1-to-1 but only a fractional addition), which maybe allows Nvidia to drop the number of lines and still get a latency advantage over GDDR6 in RT workloads. I guess we'll see, once Samsung makes the competing GDDR6W product available at a good price (because AFAIK Micron and Nvidia have an exclusive partnership for GDDR6X, from what I read), whether AMD chooses that instead for their top-tier cards.

RT units and especially Tensor units are very dependent on the memory system.
But I think the biggest reason why AMD went with a big L3 cache and a smaller L2 cache is chiplets.
AMD can just paste some cache chips at the borders of the GPU die and get the advantage of better yields.
The other reason is that AMD had already developed the 32 MB of L3 cache for its Ryzen CPUs. So it was relatively cheap and easy to implement.

But what really matters on these GPUs is avoiding repeated trips to VRAM, as those cost latency, bandwidth and power.
Also, this focus on caches is because memory speeds, in both latency and bandwidth, have lagged behind the improvements in compute capability of GPUs and CPUs.
 

MikeM

Member
That's not what I've seen when researching GPUs. Typically, the XTX and the 4080 are neck and neck in non-RT benchmarks, and the XT is slower than both.

[benchmark comparison charts]





Having said that, it is the cheapest card of the new generation of GPUs so I think it is still a good option for those not wanting to spend over $1k.

Agreed. Regions make a difference too.

Cheapest 3090 (non-Ti model at that) used in my area is $1,300. Newegg's cheapest 3090 is well over that.

On the 3090 Ti side, new models on Newegg are going for almost $2,200. Used, it's around $1,950 for the cheapest in the province I reside in.

I got my 7900 XT for $1,223 new. It's not as good a deal as the XTX, but compared to the rest of the market it's an absolutely smoking deal. People need to keep that in mind.
Not really? The XT might win in games that heavily favor AMD such as COD but otherwise it sits firmly below the 4080 which is a tad bit slower than the 7900 XTX in raster (like 2-3% overall) but significantly faster in RT (around 25%).


https://www.3dcenter.org/artikel/launch-analyse-amd-radeon-rx-7900-xt-xtx
As it should.
 

Topher

Gold Member
Not really a bad result for them, all things considered.

Yeah, but at the same time I would like to see AMD release a generation of GPUs that competes with Nvidia's current generation of GPUs (in all aspects) rather than Nvidia's last generation of GPUs.

Agreed. Regions make a difference too.

Cheapest 3090 (non-Ti model at that) used in my area is $1,300. Newegg's cheapest 3090 is well over that.

On the 3090 Ti side, new models on Newegg are going for almost $2,200. Used, it's around $1,950 for the cheapest in the province I reside in.

I got my 7900 XT for $1,223 new. It's not as good a deal as the XTX, but compared to the rest of the market it's an absolutely smoking deal. People need to keep that in mind.

And that's the thing. Right now everyone is going nuts to get the XTX so it is sold out and prices are being jacked up. I had an XT on order because I could get it at MSRP. Doesn't matter if XTX is a good deal or not if you can't buy it.

The prices of last gen cards are just completely nuts. The only viable option available is the 6950 at right under $800 as long as you have a PSU that can accommodate it. Somehow 6900 XT cards are typically above $1000. I mean....look at this shit...

[screenshot of current GPU retail listings]


So either wait for XTX prices to normalize (good luck), go last gen with the 6950, or get the 7900 XT. I might have opted for the 6950 if I hadn't had to upgrade my 750 watt PSU (probably need to anyway), or if the two free games were something other than The Callisto Protocol and Dead Island 2, which I have no interest in. The 7900 XT is a good choice all things considered.
 

Deanington

Member
Welp... I pulled a GymWolf and flip-flopped from a 7900 XT to an RTX 4080 FE. Found out Best Buy had a 10% off code, so I got the GPU for $1079.

For anyone wanting to get that deal, here is the code:

q4fy23save10

It's still $1,174 for me after the code. Fucking sales tax is pretty much $100. Still worth it?

EDIT: Fuck it, bought it. Thanks for the code!
 

Topher

Gold Member
It's still $1,174 for me after the code. Fucking sales tax is pretty much $100. Still worth it?

Mine was a bit less than that, but it really doesn't matter, as you'll still pay sales tax on top of whatever you end up buying. So an XTX would still be less than $100 cheaper if you could find one at MSRP. I guess the question is how important ray tracing is to you and, if not, whether you're willing to wait for the XTX to come back in stock. I decided that the discounted 4080 has added benefits with ray tracing and DLSS that warrant the extra $79.
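
(For what it's worth, a rough sketch of that comparison, assuming the 4080 FE's $1,199 list price from the lineup earlier in the thread, the 10% code, and a placeholder 8% sales-tax rate; tax applies to either card, so it only scales the gap.)

Code:
# Illustrative only: discounted 4080 FE vs 7900 XTX at MSRP
MSRP_4080_FE = 1199.00   # list price from the thread's lineup
MSRP_7900XTX = 999.00
DISCOUNT = 0.10          # the q4fy23save10 code
TAX = 0.08               # placeholder rate; varies by state

price_4080 = MSRP_4080_FE * (1 - DISCOUNT)     # ~$1079.10
gap = price_4080 - MSRP_7900XTX                # ~$80 before tax
print(f"4080 FE after code: ${price_4080:,.2f}")
print(f"Premium over XTX MSRP: ${gap:,.2f} pre-tax, ${gap * (1 + TAX):,.2f} with tax")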
 

PaintTinJr

Member
RT units and especially Tensor units are very dependent on the memory system.
But I think the biggest reason why AMD went with a big L3 cache and a smaller L2 cache is chiplets.
AMD can just paste some cache chips at the borders of the GPU die and get the advantage of better yields.
The other reason is that AMD had already developed the 32 MB of L3 cache for its Ryzen CPUs. So it was relatively cheap and easy to implement.

But what really matters on these GPUs is avoiding repeated trips to VRAM, as those cost latency, bandwidth and power.
Also, this focus on caches is because memory speeds, in both latency and bandwidth, have lagged behind the improvements in compute capability of GPUs and CPUs.
I think you are correct about the L3 size, but the L2 sizes Nvidia is using are far more expensive IMO, and with Nvidia having something like 80% of the market share, their ability to get a bigger volume discount and spread it across more cards (making it more cost-effective) is probably the reason AMD doesn't have that L2 size at MSRP parity.

AMD, having won both home console contracts, also has the advantage that consoles are usually the lead target platform for AAA games with RT, so the developer techniques for maximizing performance with less L2 cache and lesser GDDR6 on console come for free to their PC card customers.
 

GymWolf

Member
Agreed. Regions make a difference too.

Cheapest 3090 (non-Ti model at that) used in my area is $1,300. Newegg's cheapest 3090 is well over that.

On the 3090 Ti side, new models on Newegg are going for almost $2,200. Used, it's around $1,950 for the cheapest in the province I reside in.

I got my 7900 XT for $1,223 new. It's not as good a deal as the XTX, but compared to the rest of the market it's an absolutely smoking deal. People need to keep that in mind.

As it should.
Wait, isn't the 7900 XT's American price 899 dollars? Did you buy a third-party card?

How the fuck is the US pricier than Europe? Here you can buy a 7900 XT for like 1,100 euros, way less if you can deduct the VAT.
 

winjer

Gold Member
I think you are correct about the L3 size, but the L2 sizes Nvidia is using are far more expensive IMO, and with Nvidia having something like 80% of the market share, their ability to get a bigger volume discount and spread it across more cards (making it more cost-effective) is probably the reason AMD doesn't have that L2 size at MSRP parity.

AMD, having won both home console contracts, also has the advantage that consoles are usually the lead target platform for AAA games with RT, so the developer techniques for maximizing performance with less L2 cache and lesser GDDR6 on console come for free to their PC card customers.

AMD has similar volumes to Nvidia. Remember that AMD also has CPUs, SoCs, FPGAs, GPUs, chipsets, GPGPU, etc.
Of course, not all of it is on N5. But neither is all of Nvidia's.
 

Buggy Loop

Member
Reference cards, maybe; 3x8-pin AIB cards offer markedly improved performance after overclocking, but as always that depends on the silicon lottery.

You corrected nothing.



Doesn't even lose to a 3090 once, beats the 3090ti twice.

https://www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-merc-310-oc/34.html

It trades blows with the 3090ti, a card that retailed for twice the MSRP 8 months ago.


GN isn't running native, apples and oranges 🤷 and a 12700KF, seriously?
Techpowerup has had so many shenanigans with mixing test benches and different CPUs for other cards... it's not even worth bringing up. What are their RT settings for Cyberpunk, even? They don't say shit.

Babeltech with an i9-13900KF
[Babeltech benchmark chart]


Frame Chaser with a 13900K @ 5.8 GHz, probably the fastest rig for reviews, removing the CPU bottleneck as much as possible


Watch the full video, actually; it's brutal. Even in rasterization against the 4080, it's not good. The 7900 XTX loses at almost everything with a good CPU removing as many bottlenecks as possible. Meaning the 4000 series still has performance on the table for the next wave of CPUs.

Digital Foundry's testing with a 12900K puts the 7900 XTX at 3080 Ti level.

But let's not even mention the current-gen 4080 in all this ;)

Cyberpunk's Overdrive patch will be brutal on AMD.
 

GymWolf

Member
GN isn't running native, apples and oranges 🤷 and a 12700KF, seriously?
Techpowerup has had so many shenanigans with mixing test benches and different CPUs for other cards... it's not even worth bringing up. What are their RT settings for Cyberpunk, even? They don't say shit.

Babeltech with an i9-13900KF
[Babeltech benchmark chart]


Frame Chaser with a 13900K @ 5.8 GHz, probably the fastest rig for reviews, removing the CPU bottleneck as much as possible


Watch the full video, actually; it's brutal. Even in rasterization against the 4080, it's not good. The 7900 XTX loses at almost everything with a good CPU removing as many bottlenecks as possible. Meaning the 4000 series still has performance on the table for the next wave of CPUs.

Digital Foundry's testing with a 12900K puts the 7900 XTX at 3080 Ti level.

But let's not even mention the current-gen 4080 in all this ;)

Cyberpunk's Overdrive patch will be brutal on AMD.

Sorry if I ask, but what is wrong with an i7 12th-gen CPU for gaming?

Do I have to worry with a 13600K? Technically an inferior but newer CPU?
 

thuGG_pl

Member
GN isn't running native, apples and oranges 🤷 and a 12700KF, seriously?
Techpowerup has had so many shenanigans with mixing test benches and different CPUs for other cards... it's not even worth bringing up. What are their RT settings for Cyberpunk, even? They don't say shit.

Babeltech with an i9-13900KF
[Babeltech benchmark chart]


Frame Chaser with a 13900K @ 5.8 GHz, probably the fastest rig for reviews, removing the CPU bottleneck as much as possible


Watch the full video, actually; it's brutal. Even in rasterization against the 4080, it's not good. The 7900 XTX loses at almost everything with a good CPU removing as many bottlenecks as possible. Meaning the 4000 series still has performance on the table for the next wave of CPUs.

Digital Foundry's testing with a 12900K puts the 7900 XTX at 3080 Ti level.

But let's not even mention the current-gen 4080 in all this ;)

Cyberpunk's Overdrive patch will be brutal on AMD.


Yeah, because everyone will be rocking an expensive and hot 13900K with OCed 8000+ DDR5. From what I'm seeing, many gamers choose more sensible options for CPUs, like a 12600/12700/13600/13700/5800X3D etc., and still pair them with beefy graphics cards.
 

GymWolf

Member
Well shit, if fucking Fortnite in UE5 puts these cards on their knees (while still looking like absolute ass), we have no hope for 4K60 gaming with future, more graphically demanding UE5 games...

Well, it was a good run




Please tell me that this is not indicative of anything, for some reason...
 

thuGG_pl

Member
Well shit, if fucking Fortnite in UE5 puts these cards on their knees (while still looking like ass), we have no hope for 4K60 gaming with future, more graphically demanding UE5 games...

Well, it was a good run




Please tell me that this is not indicative of anything, for some reason...


Why not? Every breakthrough technology is demanding at first, then the hardware catches up. In five years' time RT will be the new normal.

It happened many times before and it's happening now. What comes to my mind is the HL2 and Doom 3 releases, when even the strongest cards had difficulty running them at 60 fps at high resolution. But they pushed the industry.
 

GymWolf

Member
Why not? Every breakthrough technology is demanding at first, then the hardware catches up. In five years' time RT will be the new normal.

It happened many times before and it's happening now. What comes to my mind is the HL2 and Doom 3 releases, when even the strongest cards had difficulty running them at 60 fps at high resolution. But they pushed the industry.
Yeah, but Epic said that Nanite and Lumen were lighter than classic RT, and now a game that looks like absolute shit tanks $1,000+ GPUs just because of that. That thing is not pushing anything except the tech; it doesn't look anywhere near as good as the last Ratchet game, which only has RT reflections, and that runs at 4K40 on a 10 TF machine.

I thought that Lumen was the smart way of doing RT: not as good, but less of a performance hog.

Same for Nanite: no LODs and super detail by using the SSD and some sort of black magic to not be overly heavy.

It looks like I got it wrong.
 

thuGG_pl

Member
Yeah, but Epic said that Nanite and Lumen were lighter than classic RT, and now a game that looks like absolute shit tanks $1,000+ GPUs just because of that. That thing is not pushing anything except the tech; it doesn't look anywhere near as good as the last Ratchet game, which only has RT reflections, and that runs on a 10 TF machine.

I thought that Lumen was the smart way of doing RT: not as good, but less of a performance hog.

Same for Nanite: no LODs and super detail by using the SSD and some sort of black magic to not be overly heavy.

It looks like I got it wrong.

Well, RT is simply expensive because it traces light rays. It doesn't matter if you put it in a cartoonish-looking game like Fortnite or something more realistic. Just look at Portal RTX: it's completely path traced, which makes it kill almost everything bar the 40xx series. Plus, in UE5 you get Nanite, which basically eliminates LOD pop-in, which is a huge plus for me.
 

SatansReverence

Hipster Princess
GN isn't running native, apples and oranges 🤷 and a 12700KF, seriously?
Techpowerup has had so many shenanigans with mixing test benches and different CPUs for other cards... it's not even worth bringing up. What are their RT settings for Cyberpunk, even? They don't say shit.

Babeltech with an i9-13900KF
[Babeltech benchmark chart]


Frame Chaser with a 13900K @ 5.8 GHz, probably the fastest rig for reviews, removing the CPU bottleneck as much as possible


Watch the full video, actually; it's brutal. Even in rasterization against the 4080, it's not good. The 7900 XTX loses at almost everything with a good CPU removing as many bottlenecks as possible. Meaning the 4000 series still has performance on the table for the next wave of CPUs.

Digital Foundry's testing with a 12900K puts the 7900 XTX at 3080 Ti level.

But let's not even mention the current-gen 4080 in all this ;)

Cyberpunk's Overdrive patch will be brutal on AMD.


That moment you try to shit on a 12700k claiming it's a bottleneck when talking to someone who is using a 12700k and can confirm it's not a bottleneck at all.

Follow that up with literal nobody YouTube channels that verifiably have fucked benchmarks? Supposedly 46 fps RT Medium at 1440p? Well, here is an actual benchmark of the 7900 XTX at 1440p native, RT Medium, in Cyberpunk

[Cyberpunk 2077 1440p RT Medium benchmark chart]


What's that? Around a 30% deficit?

The aggregation of damn near every review of the reference model 7900 XTX puts it at the same level as a 3090 Ti, including in 4K benchmarks where there absolutely won't be any CPU bottlenecks. Deal with it.
 

Buggy Loop

Member
That moment you try to shit on a 12700k claiming it's a bottleneck when talking to someone who is using a 12700k and can confirm it's not a bottleneck at all.

Yes it is for the 4000 series, and even the Ampere series scales almost as well with CPUs to some extent. The 7900 XTX is barely affected by mid-range CPUs.

Take TechPowerUp's benchmarks of the 4090, which were initially done on a 5800X, the same chart everyone at r/AMD used for their "up to" napkin-math scaling that made them think they were within spitting distance of Nvidia's flagship (lol).


And then take into account 13900K vs 5800X3D: it scales less at 4K, but the 13900K is an overclocking monster if you want it to be. Stock, the difference would average about 1.3% at 4K; overclocked, it would easily reach 2-3%.

The 4000 series basically keeps wanting more, especially the 4090; maybe next-gen CPUs will quench its thirst.

Follow that up with literal nobody YouTube channels that verifiably have fucked benchmarks? Supposedly 46 fps RT Medium at 1440p? Well, here is an actual benchmark of the 7900 XTX at 1440p native, RT Medium, in Cyberpunk

Digital foundry a nobody?
Oh, here's another "nobody" with a 3090 Ti over a 7900 XTX

[RT benchmark chart with the 3090 Ti ahead of the 7900 XTX]


Another "nobody"
[Cyberpunk 2077 4K DX12 Ultra RT benchmark chart]


As per AMD's marketing methods, the 3090 Ti would be "up to" 1.21x the 7900 XTX in Cyberpunk 2077; taking Hardware Unboxed into account, up to 1.25x. The lead will probably increase with the Overdrive patch.

The aggregation of damn near every review of the reference model 7900 XTX puts it at the same level as a 3090 Ti, including in 4K benchmarks where there absolutely won't be any CPU bottlenecks. Deal with it.

The aggregation that includes all the silly RT shadow games?

[1440p RT performance index chart]



Or Farcry 6 maybe?



The game that lowers RT reflections when it detects an AMD card and still somehow finds its way into benchmark suites?

But hey congrats! You saved maybe $200 if you picked the reference card, you're competing against last gen cards in RT!
 

SatansReverence

Hipster Princess
Yes it is for the 4000 series, and even the Ampere series scales almost as well with CPUs to some extent. The 7900 XTX is barely affected by mid-range CPUs.
A 12700K can handle Cyberpunk ultra RT at over 100 fps average; it's not bottlenecking anything.

Apparently, you're incapable of understanding what an aggregate means. I'll explain it. Take all the results from all the people, smash them together and the result is you're wrong.

I actually saved about $2000 by not buying a 3090ti, but do go on.

However will I live, being a whole 20% behind a 4080 while spanking it in raster? And that would be if I had bought an underpowered and undercooled reference card, which (this is a secret, alright) I didn't. Oh, the humanity!
 

analog_future

Resident Crybaby
Man, I keep trying to price out a PC build because the Series X and PS5 leave something to be desired in multiplat games at times, but every time I try to price something out I'm just gobsmacked by the costs compared to these $500 consoles.

To make a PC build worthwhile, I'd require XSX/PS5 "Quality" mode preset equivalents, maybe with a few higher settings, at 60fps. So pretty much 4k/high settings/60fps in new modern games for the entirety of this generation, so at least ~5 years.

But to build a PC that's capable of that, it sure seems like I need to be spending $1500 bare minimum, if not closer to $2000.

All this research has been an exercise in frustration. I just wish Microsoft and/or Sony would come out with a super high end console for $800+, but there's probably a tiny market for that.
 

Buggy Loop

Member
Man, I keep trying to price out a PC build because the Series X and PS5 leave something to be desired in multiplat games at times, but every time I try to price something out I'm just gobsmacked by the costs compared to these $500 consoles.

To make a PC build worthwhile, I'd require XSX/PS5 "Quality" mode preset equivalents, maybe with a few higher settings, at 60fps. So pretty much 4k/high settings/60fps in new modern games for the entirety of this generation, so at least ~5 years.

But to build a PC that's capable of that, it sure seems like I need to be spending $1500 bare minimum, if not closer to $2000.

All this research has been an exercise in frustration. I just wish Microsoft and/or Sony would come out with a super high end console for $800+, but there's probably a tiny market for that.

Then don’t look into AMD or Nvidia flagship cards?



Rich's setup to match a Series X with console-equivalent settings is roughly $700. Not sure about Intel? Find a deeply discounted RDNA 2 card or a used Ampere and be done with it. For console-equivalent performance, no need for an ultra-high-end CPU either.
 

Buggy Loop

Member
However will I live, being a whole 20% behind a 4080 while spanking it in raster? And that would be if I had bought an underpowered and undercooled reference card, which (this is a secret, alright) I didn't. Oh, the humanity!



You’re a lost cause with that comment lol, spanked 4080 in rasterization. Where have you been since the reviews? Not a single outlet would be brainlet enough to claim that.

You’re prime material for r/AyyMD

Btw, find me the aggregate of all reviews with ultra RT for cyberpunk 2077 and that game only
 

analog_future

Resident Crybaby
Then don’t look into AMD or Nvidia flagship cards?



Rich's setup to match a Series X with console-equivalent settings is roughly $700. Not sure about Intel? Find a deeply discounted RDNA 2 card or a used Ampere and be done with it. For console-equivalent performance, no need for an ultra-high-end CPU either.


If you read my post, I don't want a console equivalent (and I certainly don't want an equivalent that costs $200 more). I want something that can do better but at a reasonable price.
 

SatansReverence

Hipster Princess
You’re a lost cause
The irony of that coming from someone who desperately searches for hours to find a shitty youtube channel that has benchmark results that are 30% off reality while crying about a 12700k being a bottleneck.

And now it's got to be Cyberpunk and only Cyberpunk



That must be some weapons grade copium you're huffing right now.
 

GHG

Member
If you read my post, I don't want a console equivalent (and I certainly don't want an equivalent that costs $200 more). I want something that can do better but at a reasonable price.

Define "reasonable".

You're essentially asking for >double console performance for <double the price.
 

GymWolf

Member
If you read my post, I don't want a console equivalent (and I certainly don't want an equivalent that costs $200 more). I want something that can do better but at a reasonable price.
A 6800 with FSR can probably do much better than any console in 4K.

You don't need DDR5 or 32 GB; fast 16 GB DDR4 is pretty cheap.

You don't need a Z790 or a Z690 motherboard if you don't want to overclock, and I think you can get away with a non-K 12th-gen i5 or equivalent AMD to save money and still be far superior to any console.

There are good cases well under 100 dollars.

I think you can make a more than decent build for not much more than 1,000 dollars, less if you catch deals for every part.
 

Buggy Loop

Member
The irony of that coming from someone who desperately searches for hours to find a shitty youtube channel that has benchmark results that are 30% off reality while crying about a 12700k being a bottleneck.

And now it's got to be Cyberpunk and only Cyberpunk



That must be some weapons grade copium you're huffing right now.

Well, you did say that you have aggregate results for Cyberpunk 2077, right?

Ooohhh, you meant all RT games from all reviewers? Well, that's what I was saying: useless. It doesn't represent hard-hitting RT games, because they're filled with RT shadow shit or even games that are broken on AMD like Far Cry 6.
 

GHG

Member
TIL that $1500 max is < double the price of $500

That's not what you said you were willing to spend. You mentioned $800:

But to build a PC that's capable of that, it sure seems like I need to be spending $1500 bare minimum, if not closer to $2000.

All this research has been an exercise in frustration. I just wish Microsoft and/or Sony would come out with a super high end console for $800+, but there's probably a tiny market for that.
 

SatansReverence

Hipster Princess
Well you did say that you have aggregate results for cyberpunk 2077 right?

Ooohhh, you meant all RT games from all reviewers? Well that's what i was saying, useless. Doesn't represent hard hitting RT games because they're filled with RT shadow shit or even games that are broken on AMD like Farcry 6.


Keep huffing.

I'm done here. You've been proven wrong beyond any question. The end.
 