
AMD Radeon RX 7950 XTX, 7950 XT, 7800 XT, 7700 XT, 7600 XT, 7500 XT RDNA Spotted

Kataploom

Gold Member
Not in RT tho just to clarify
That's what happens when a new generation launches. 4090 is over 50% faster than 3090 Ti in RT. AMD flagship is only matching Nvidia last gen flagship in RT.
I'd say up to 20% more money is worth it, so $1200 vs. $1000, which happens to be the MSRP of both cards...
Or $1150 vs. $960, which happens to be the price you can get the cheapest card on Newegg today.
What if I don't care about RT? Most people out there don't enable RT after the first time; they try it just to see how hard it hits performance, have a laugh and disable it again lol
 

Leonidas

Member
That’s what I paid for my 7900XT. Same price as what I paid for a 3070ti, and it’s been a pretty substantial upgrade.

Had that deal not come up I probably would’ve just gotten a 4070ti.

Got to say I am happy with it thus far. Driver support has been solid, and I think Adrenalin is much better than Control Panel. I’ve also undervolted and overclocked my card, which was so much easier than I expected. First time doing so.

I’m really hoping Intel’s Battlemage is a contender. Their upscaler is very good, and I think they’re hungry enough to deliver strong value.
If you got 7900 XT for $600 you got a great deal, at that price I would have done so too.

What if I don't care about RT? Most people out there don't enable RT after the first time; they try it just to see how hard it hits performance, have a laugh and disable it again lol
Most people out there are running weak RT hardware. If people had more powerful hardware they might keep it on.

I always turn it on nowadays since my hardware is powerful enough for it.
 

hinch7

Member
If the 7700XT only has 8GB of memory, that would be a real shame.
12GB should really be the baseline by now. Even for people on a strict budget, 8GB is going to be ass for upcoming ports of current gen only titles.

Those lower end cards look largely pointless as well. Can't see them being much better value than current RDNA 2 offerings if they are just higher clocked with the same specs. Honestly, this gen feels more like an updated refresh than anything.
 

PC Gamer

Has enormous collection of anime/manga. Cosplays as waifu.
AMD... Yawn.

They've shown their pricing and performance relative to Nvidia is garbage time after time.
 

MikeM

Member
At what price point do you then say, goddamn, I can't ignore this, and just pick up an AMD GPU? Then later curl up in the shower saying how dirty you feel?

All I ever see is people on 6-series or 7-series cards acting like they bought into the ecosystem, so they have the same level of voice as those 9-series owners. The only reason they follow the competition is to see if it can light a fire under their preferred brand's ass to lower prices. Never mind if the competition has higher performance in raw metrics everywhere else; God forbid, that one metric has to be hyper-focused to exhaustion.

Running a 3070 Ti, 5600 XT, and 6700 XT, I couldn't give a rat's ass about a $1000 GPU. I broke my bar of $300 to pay $500 for a GPU these last two generations. I will be damned if I double it.

That said, I would like to see just how far you put the two brands apart. What is your preferred brand worth? X dollars more because you just trust it? Where is the trust separator? Is it 3:1, 2:1, 1:1? Because worldwide sales show that I doubt more than a few of you folks actually plopped money down to buy a 4080, let alone a 4090. Given that a majority of this site's user base is in the 35-50 age category, I really wonder just when life's expenses have you double-checking those price tags before you look at the brand label. Yet people sit here and shit on product performance because a single metric is posting last generation's numbers, even though the generational uplift is 45-50%. People say, "No, not good enough!"

We obviously have a floor, and that floor is $350 thanks to Intel. If up to five cards are releasing this year, the stack is going to be squeezed hard between $970 and $350. So where would you put your hypocrisy aside and buy the competition because the money is in the right place? Because at the end of the day it IS about what you think your purchase is valued at. Not metrics. Not brand loyalty. Not marketing gimmicks. What is your ceiling and what is your performance-per-dollar point?

I will go first. $500 per purchase and $7 per frame (that math is sketched after this post). Anything higher can piss right off.

As an aside, it is also psychotic to pay $20 for a movie ticket, $13 for a foot long sub, $7 for a basic latte, $4 for a loaf of bread. Times are just downright fucking bonkers and I hate it.
Even though I like high end with my 7900 XT, I still value having some money left for other things. The leap from my 7900 XT to a 4080 was gonna cost about $500 CAD after tax. That's a lot of extra games for a minor uptick in perf.
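To make the quoted dollars-per-frame yardstick concrete, here is a minimal Python sketch; the card names, prices, and average-fps figures are invented for illustration, not benchmark data:

```python
# Illustrative only: prices and average-fps numbers are made up, not benchmarks.
# The point is the "$ per frame" value metric with a $500 price cap and a $7/frame ceiling.

def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    """Cost of each frame of average performance at a given price."""
    return price_usd / avg_fps

candidates = {
    "Card A": (500, 80),    # hypothetical: $500 card averaging 80 fps
    "Card B": (1000, 130),  # hypothetical: $1000 card averaging 130 fps
}

for name, (price, fps) in candidates.items():
    dpf = dollars_per_frame(price, fps)
    verdict = "buy" if price <= 500 and dpf <= 7 else "pass"
    print(f"{name}: ${dpf:.2f}/frame -> {verdict}")
```

Under those made-up numbers the $500 card lands at $6.25/frame (a buy) and the $1000 card at about $7.69/frame (a pass), which is the kind of cutoff the post above is describing.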
 

Buggy Loop

Member
What if I don't care about RT? Most people out there don't enable RT after the first time; they try it just to see how hard it hits performance, have a laugh and disable it again lol

Well then there's VR performance, NVENC, DLSS, frame gen, superior Reflex latencies, and the resale value of CUDA and machine learning for professionals.

I leave RT on all the time 🤷‍♂️ plus the cool path tracing projects.

What would I get for, say, picking a 7900 XTX? ~2-4% higher rasterization performance with higher power consumption? Only to lock my monitor to 120 fps and waste performance? The rasterization race is getting meaningless; these games perform well when it's not a shit port. The games that stress hardware into the sub-60 fps range are RT games, and DLSS brings them back up.

I don’t get why honestly.

And I was ATI-AMD for 20 years, from their Mach series up to 2016. It used to be just about fps vs. $ vs. power. ATI was very aggressive on price and I kept buying them. Things are a lot more complex now for decisions. I also went through the tougher driver periods of ATI/AMD. I know RDNA 2 & 3 are pretty good now, but my brother also had a 5700 XT with constant weird problems and black screens until he sold it. That was not that long ago. The AMD subreddit had like two years' worth of people having trouble with those cards.

RDNA 2 I even attempted to get because getting any GPU during that period was a win, but if you put a 6800XT and 3080 on the table in 2020 for $50 difference? Why would I go AMD?

I honestly don’t know what tour de force they have to do to gain massive market share back. Pity buying is not a solution.
 

Skifi28

Member
In the past have we ever had any other card where the successor was a downgrade? That 7700 is looking so sad.
 
If you got 7900 XT for $600 you got a great deal, at that price I would have done so too.


Most people out there are running weak RT hardware. If people had more powerful hardware they might keep it on.

I always turn it on nowadays since my hardware is powerful enough for it.
If price is no object, by all means get a 4090; AMD can't match it.

But there are morons out there who will buy a 4060 without even looking at AMD.
 

Kataploom

Gold Member
Well then there's VR performance, NVENC, DLSS, frame gen, superior Reflex latencies, and the resale value of CUDA and machine learning for professionals.

I leave RT on all the time 🤷‍♂️ plus the cool path tracing projects.

What would I get for, say, picking a 7900 XTX? ~2-4% higher rasterization performance with higher power consumption? Only to lock my monitor to 120 fps and waste performance? The rasterization race is getting meaningless; these games perform well when it's not a shit port. The games that stress hardware into the sub-60 fps range are RT games, and DLSS brings them back up.

I don’t get why honestly.

And I was ATI-AMD for 20 years, from their Mach series up to 2016. It used to be just about fps vs. $ vs. power. ATI was very aggressive on price and I kept buying them. Things are a lot more complex now for decisions. I also went through the tougher driver periods of ATI/AMD. I know RDNA 2 & 3 are pretty good now, but my brother also had a 5700 XT with constant weird problems and black screens until he sold it. That was not that long ago. The AMD subreddit had like two years' worth of people having trouble with those cards.

RDNA 2 I even attempted to get because getting any GPU during that period was a win, but if you put a 6800XT and 3080 on the table in 2020 for $50 difference? Why would I go AMD?

I honestly don’t know what tour de force they have to do to gain massive market share back. Pity buying is not a solution.
The thing is that all of that is still very optional to most people... I'd take DLSS, but if I can't run a game decently at quality settings, at which point I'd be fine with FSR 2.2 Quality too, then I'd rather avoid the game entirely until I can upgrade.

See:
- VR... Not my thing at all
- NVENC, CUDA, etc. It could have been my thing years ago, but I'm not in video production anymore. I do a lot of 3D, but those are for real-time applications like games anyway.
- Frame gen... I avoid that tech like the plague; I literally can't see the benefit as a consumer... As an Nvidia marketing representative? Of course there is.
- Reflex... Yeah, that one I'd buy, but even then it doesn't make or break a gaming experience; it won't make you win or lose matches like 99.99999999% of the time.
- Path tracing... Same as RT, still years before games get designed with it in mind.
- Lower consumption... I'll give you this one on the 7900 cards, they fucked up, but I'd rather wait and see how the other cards do.

Everything you mentioned is very optional for basically anyone gaming, at least for the rest of this gen. They're the kind of things that are cool to have, but if someone only wants to play games with decent performance, I don't see how AMD's value isn't worth it.

BTW, RDNA 1 cards had a hardware bug IIRC, which earned AMD the "bad drivers" stigma; they deserved it imo. They now have amazing products for what matters to most people... Nvidia's advantages are extremely niche imo, and each of them targets a different, very niche group, so I don't see how AMD is actually inferior for the common denominator of gamers.

You know what's actually a feature some gamers would rather have than all of that? A decent amount of VRAM.
 

Leonidas

Member
If price is no object, by all means get a 4090; AMD can't match it.
AMD can't match any of the other 40-series cards either if you look at efficiency and RT performance per dollar and upscaling quality (DLSS2), which is why they have to price their cards cheaper.

But there are morons out there who will buy a 4060 without even looking at AMD.
Why do you say they are morons? Most people probably just want the best product for a certain price. 4060 Series will beat the AMD competing series on features and the 4060 will most likely end up being one of the best selling GPUs this generation like the 2060 and 3060 before it... it's not because people are morons, it's because Nvidia has the better product.
 
If it has a 128-bit bus it kinda has to have 8GB.

Only "ok" GPU on this list is 7800XT but price is not, rest is trash:

[attached image: rumored RX 7000 series spec table]
What's the source on this? All this time the 7700 XT was rumored to run a 192-bit bus and 12GB VRAM. The original source at r/AMD mentioned in the OP does not mention any specifications as far as I can see, just the names of the various SKUs?

An 8GB 7700 XT doesn't make sense in this product line-up either. Everything below the 7800 XT and its 16GB VRAM has only 8GB VRAM or less?? Where are the 12GB VRAM cards?? (The bus-width/capacity math is sketched below.)
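Since the 128-bit-bus point keeps coming up, here is a rough Python sketch of the usual GDDR6 capacity arithmetic; it assumes the common configurations (a 32-bit interface per chip, 1GB or 2GB chip densities, clamshell doubling), not any confirmed RX 7000 spec:

```python
# Why bus width tends to dictate VRAM size on GDDR6 cards:
# each GDDR6 chip uses a 32-bit interface, common densities are 1GB or 2GB,
# and "clamshell" mounts two chips per 32-bit channel to double capacity.

def vram_options_gb(bus_width_bits: int) -> dict:
    chips = bus_width_bits // 32  # chips needed to populate the bus
    return {
        "1GB chips": chips * 1,
        "2GB chips": chips * 2,
        "2GB chips, clamshell": chips * 4,
    }

for bus in (128, 192, 256):
    print(f"{bus}-bit:", vram_options_gb(bus))
# 128-bit -> 4/8/16GB, 192-bit -> 6/12/24GB, 256-bit -> 8/16/32GB,
# i.e. a 128-bit 7700 XT would realistically be 8GB, a 192-bit one 12GB.
```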
 

hlm666

Member
I don't understand wtf is going on with AMD. PlayStation is selling gangbusters, the 7000 series CPUs have the edge over Intel, and RDNA 3 should have been able to compete with Nvidia better than RDNA 2 could, yet they posted their first loss in years last quarter. All their chiplet designs were supposed to be the land of high performance without the monolithic costs, and even with prices going up they seem to be making less profit; it would have been tragic without Xilinx. Nvidia gets a lot of shit, but they currently execute better from a business perspective than AMD and Intel.
 

Bojji

Member
In the past have we ever had any other card where the successor was a downgrade? That 7700 is looking so sad.

The 4060 will be worse than the 3060 in memory amount, but I can't recall anything like that in the past.

What's the source on this? All this time the 7700 XT was rumored to run a 192-bit bus and 12GB VRAM. The original source at r/AMD mentioned in the OP does not mention any specifications as far as I can see, just the names of the various SKUs?

An 8GB 7700 XT doesn't make sense in this product line-up either. Everything below the 7800 XT and its 16GB VRAM has only 8GB VRAM or less?? Where are the 12GB VRAM cards??

Yeah it doesn't make much sense, we will see.
 
AMD can't match any of the other 40-series cards either if you look at efficiency and RT performance per dollar and upscaling quality (DLSS2), which is why they have to price their cards cheaper.


Why do you say they are morons? Most people probably just want the best product for a certain price. 4060 Series will beat the AMD competing series on features and the 4060 will most likely end up being one of the best selling GPUs this generation like the 2060 and 3060 before it... it's not because people are morons, it's because Nvidia has the better product.
The 4060 will not do any ray tracing. In that segment ray tracing is just a marketing tool and morons fall for it.
 
The 4060 will not do any ray tracing. In that segment ray tracing is just a marketing tool and morons fall for it.
LMAO no, because Nvidia owners also have DLSS and you can use that to offset the performance penalty of RT with minimal IQ degradation.

I played through the entirety of Portal RTX on a gaming laptop with a 3070 Laptop, which is roughly analogous to a desktop 3060 Ti in performance. It was fine because I was at 1080p with DLSS Performance, and it looked perfectly acceptable considering the amount of crazy-ass path tracing that was going on.
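For a sense of how much work that setting saves, here is a quick Python sketch using the commonly documented DLSS 2 per-axis scale factors (treat the exact ratios as approximate):

```python
# DLSS 2 per-axis render scales as commonly documented:
# Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333.

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to the output."""
    return round(out_w * scale), round(out_h * scale)

w, h = internal_resolution(1920, 1080, 0.5)  # 1080p output, Performance mode
print(w, h)  # 960 x 540
print(f"{(w * h) / (1920 * 1080):.0%} of the output pixel count")  # 25%
```

So at 1080p with DLSS Performance the path tracing is being shaded at roughly a quarter of the output pixels, which is why it stays playable on midrange RT hardware.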
 

PeteBull

Member
3080 Ti owner here, playing at 4K. I never play with RT enabled; I do check how games look with it on, but in the end I always prefer higher res/more fps. Even with DLSS enabled, RT is always off.
 