
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Between 3070 and 3080 performance with actual availability to purchase and I will buy.

I tend to hear this phrase repeated a lot, but between 3070 and 3080 is a pretty huge performance range.

For example if the 6800XT was say.. 1% slower on average than the 3080, technically that would be "between 3070 and 3080" in performance, while in the real world that would be on par performance wise. Similarly if it was say 3-5% slower, or any other small % metric.

I'm not speaking about you specifically with this next part as I don't believe what I'm about to say is your intent or thinking process, but I do think that some people state this to try and potentially downplay the amazing win this would be from AMD and especially the leap in performance from a 5700XT to a 6800XT.

When people say "between X and Y" the image it invokes tends to be sitting somewhere around the 50% mark (perhaps 40-60%), so whenever I hear that phrase it tends to make me believe the person using it is trying their best to downplay the performance of a potential 6800XT card, for example.

Interestingly, if the 6800XT or any other AMD card was say 1-5% faster than a 3080, I get the impression the same people stating "between 3070 and 3080" would instead be mentioning "basically on par" in terms of performance and not say that the 3080 was "between a 3070 and 6800xt". Just an interesting observation I've noticed.
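
To put a number on that point: where a card actually sits in the "between 3070 and 3080" range can be expressed as a fraction of the gap between the two. A quick sketch with purely hypothetical relative-performance scores (the ~30% gap and the 6800XT figure below are placeholders for illustration, not real benchmark data):

Code:
# Hypothetical relative-performance scores, 3070 normalised to 100.
# The 30% gap and the 6800XT value are placeholders, not real benchmarks.
rtx_3070 = 100
rtx_3080 = 130   # assume the 3080 is ~30% faster, for illustration only
rx_6800xt = 128  # the "1-2% slower than a 3080" scenario from above

# Where the card sits in the gap: 0.0 = 3070-level, 1.0 = 3080-level.
position = (rx_6800xt - rtx_3070) / (rtx_3080 - rtx_3070)
print(f"{position:.0%} of the way from a 3070 to a 3080")  # -> 93%

So a card that is technically "between a 3070 and a 3080" can be sitting 93% of the way up that range, which is a very different picture from the ~50% midpoint the phrase tends to conjure.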

Again, I don't believe this is the case with you GHG, so please do not take offense. I believe you are genuinely stating that somewhere in that range would tick the boxes needed for you to buy; you just happened to be the most recent person to use that phrase, which I have been meaning to respond to in the abstract for some time. Sorry if this came off as a jab at you in any way.
 

FireFly

Member
I tend to hear this phrase repeated a lot, but between 3070 and 3080 is a pretty huge performance range.

For example if the 6800XT was say.. 1% slower on average than the 3080, technically that would be "between 3070 and 3080" in performance, while in the real world that would be on par performance wise. Similarly if it was say 3-5% slower, or any other small % metric.
I suspect if the difference was within 5%, people would say the 6800 XT was on par, especially if the 6800 XT won its fair share of games. Since the mean is heavily influenced by outliers and a couple of bad losses could put the 6800 XT below the 3080, even if it wins in 50% or more of the titles.
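
A toy illustration of that point, using made-up per-game numbers (6800 XT performance relative to the 3080, where 1.00 is parity; none of these are real results):

Code:
# Made-up per-game results: 6800 XT relative to the 3080 (1.00 = parity).
# Purely illustrative, not real benchmark data.
relative_perf = [1.03, 1.02, 1.04, 1.01, 1.02, 1.03, 0.80, 0.78]

mean = sum(relative_perf) / len(relative_perf)
wins = sum(r > 1.0 for r in relative_perf)

print(f"mean: {mean:.3f}")                   # ~0.966 -> "slower than the 3080 on average"
print(f"wins: {wins}/{len(relative_perf)}")  # 6/8 -> wins 75% of the titles

Two bad outliers drag the average below parity even though the card wins three quarters of the games, which is why win rates and medians are worth looking at alongside the mean.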
 

GHG

Gold Member
I tend to hear this phrase repeated a lot, but between 3070 and 3080 is a pretty huge performance range.

For example if the 6800XT was say.. 1% slower on average than the 3080, technically that would be "between 3070 and 3080" in performance, while in the real world that would be on par performance wise. Similarly if it was say 3-5% slower, or any other small % metric.

I'm not speaking about you specifically with this next part as I don't believe what I'm about to say is your intent or thinking process, but I do think that some people state this to try and potentially downplay the amazing win this would be from AMD and especially the leap in performance from a 5700XT to a 6800XT.

When people say "between X and Y" the image it invokes tends to be sitting somewhere around the 50% mark (perhaps 40-60%), so whenever I hear that phrase it tends to make me believe the person using it is trying their best to downplay the performance of a potential 6800XT card, for example.

Interestingly, if the 6800XT or any other AMD card was say 1-5% faster than a 3080, I get the impression the same people stating "between 3070 and 3080" would instead be mentioning "basically on par" in terms of performance and not say that the 3080 was "between a 3070 and 6800xt". Just an interesting observation I've noticed.

Again, I don't believe this is the case with you GHG, so please do not take offense. I believe you are genuinely stating that somewhere in that range would tick the boxes needed for you to buy; you just happened to be the most recent person to use that phrase, which I have been meaning to respond to in the abstract for some time. Sorry if this came off as a jab at you in any way.

No offense taken.

Bang smack in between the 3070 and 3080 (somewhere around the 50% mark like you stated) is the minimum amount of performance uptick I need from my current GPU to make it a worthwhile upgrade.

Of course the more performance the better though, so if it's trading blows with (or betters) the 3080 then there's nothing to complain about and I'll be very happy.

The main thing is I need to actually be able to purchase something instead of seeing out of stock everywhere.
 

supernova8

Banned
I feel like price will be the massive deciding factor for getting this. Fingers crossed it's a decent one

My spidey sense tells me raytracing will be quite important once the consoles are out and there's a baseline of machines out in the wild that can all actually do raytracing. We'll end up with "basic" raytracing on consoles and then it'll be scaled up on PC potentially all the way to proper GI for any (future) GPUs that can handle it. So yeah you might say price but I'm actually starting to think ray tracing performance is quite important.

Don't want to splash on a GPU from AMD if I'm gonna get fucked over on the RT side of things like 2 or 3 years down the line.
 

geordiemp

Member
Hope we get some notes on Navi 21 Lite.

I doubt it, they are presenting high performance RDNA2 cards for PC. They all SEEM to have the same pattern of SEs, front ends and back ends, with the same shader array configuration of 10 CUs per shader array, and they appear to scale similarly.

I just hope we get a peek at Navi 22, but it will be easy to infer anyway with such scaling and driver data.

 

llien

Member
The card that scores around GTX 1080 levels in Synthetics then gets around GTX 1080 levels in gaming?

That Vega card or another one?
Fair enough, my memory failed me.

6800 12GB - 3070 +/- 5% $450
6800 XT 16GB - 3080 -5% $600
6900 XT 16GB - 3080 +10% $900

6800 12GB - 3070 +/-5% $499
6800 XT 16GB - 3080 +/-5% $750 (cause you can't really buy a 3080, and it has +6GB)
6900 XT 16GB - 3090 +/-5% $899 (cause 3080 + 10% is 3090)
 
Don't want to splash on a GPU from AMD if I'm gonna get fucked over on the RT side of things like 2 or 3 years down the line.

While I understand the sentiment, and can't say people are wrong if they choose to buy Nvidia cards for the superior RT performance, at the same time it depends on how long you intend to keep whatever Nvidia/AMD GPU you decide to buy this cycle.

The release cadence is supposedly becoming more rapid, with RDNA3/Hopper expected to launch by the end of 2021 I think? (Based on current projections/roadmaps, to the best of my knowledge.)

Those cards will likely increase RT performance even more than these latest cards from AMD/Nvidia.

In 3 years' time there could even be another refresh, RDNA4 and the 5000 series from Nvidia. So it will definitely depend on how long you intend to keep the card and how much further RT performance gets pushed. With consoles as a baseline, I think you would be fairly safe with either card from Nvidia or AMD this gen and not feel like you are missing out on features or lagging behind in performance.
 

FranXico

Member
I doubt it, they are presenting high performance RDNA2 cards for PC. They all SEEM to have the same pattern of SEs, front ends and back ends, with the same shader array configuration of 10 CUs per shader array, and they appear to scale similarly.

I just hope we get a peek at Navi 22, but it will be easy to infer anyway with such scaling and driver data.


Kitty waiting for confirmation of his/her assertions I see.
 

geordiemp

Member
While I understand the sentiment, and can't say people are wrong if they choose to buy Nvidia cards for the superior RT performance, at the same time it depends on how long you intend to keep whatever Nvidia/AMD GPU you decide to buy this cycle.

The release cadence is supposedly becoming more rapid, with RDNA3/Hopper expected to launch by the end of 2021 I think? (Based on current projections/roadmaps, to the best of my knowledge.)

Those cards will likely increase RT performance even more than these latest cards from AMD/Nvidia.

In 3 years' time there could even be another refresh, RDNA4 and the 5000 series from Nvidia. So it will definitely depend on how long you intend to keep the card and how much further RT performance gets pushed. With consoles as a baseline, I think you would be fairly safe with either card from Nvidia or AMD this gen and not feel like you are missing out on features or lagging behind in performance.

The big unknown is AMD's ray tracing performance, which has not been shown; all we can go on is the PS5 console stuff shown so far, which is not too shabby for a console APU.

It will be interesting to see what these big caches or shared caches (or both) bring to the performance table, especially for BVH efficiency.

Kitty waiting for confirmation of his/her assertions I see.

Safe bet, as so far all the Navi 21, 22 and 23 leaks seem to scale the same way. So it's a fair bet that something with 60 CUs will do the same, but that seems to be without any CUs disabled, so..... who knows.
 
Fair enough, my memory failed me.



6800 12GB - 3070 +/-5% $499
6800 XT 16GB - 3080 +/-5% $750 (cause you can't really buy a 3080, and it has +6GB)
6900 XT 16GB - 3090 +/-5% $899 (cause 3080 + 10% is 3090)

Actually you're right on the 6800 XT, $600 would be unrealistic, maybe $650 for a base SKU without any stock (to use Nvidia's game), but $750 for AIBs? We'll see soon...
 

vkbest

Member
3. I expect a DLSS alternative to be talked about but not launching until "2021 summer". I expect it to be a universal upscaling AI that is not game dependent (thus 99.9% compatible with games). I expect it won't look as good, but it'll be roughly good enough to check the box and please most people.

DLSS 2 works nicely because they have a model based on textures from each game; it's faster and more efficient. What kind of model would you need to get a good scaling AI for all games?
 
All I know is that AMD better be extremely cheap and absolutely kill Nvidia's lineup in performance in every metric like some of these fanboys claim. If not, I'm definitely gonna call out some people. ~3 hours to go!
 

llien

Member
Actually you're right on the 6800 XT, $600 would be unrealistic, maybe $650 for a base SKU without any stock (to use Nvidia's game), but $750 for AIBs? We'll see soon...

I think easily. Cool running card, more RAM, actually available, beats $1200 2080Ti by 20-30%.

3080 might be a $699 card the way 2080Ti is $999, chuckle.

DLSS 2 works nicely because they have a model based on textures from each game
FUD.
You're describing 1.0, which failed miserably.
 

geordiemp

Member
DLSS 2 works nicely because they have a model based on textures from each game; it's faster and more efficient. What kind of model would you need to get a good scaling AI for all games?

Just temporal and a bit of ML. Go watch DF's latest Spiderman analysis; they had no idea what Spiderman was doing, nor could they tell.

The tech already exists in games, so....

Spiderman.... Was that using an Insomniac special technique, or is it in the hardware? Nobody knows.
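
For anyone wondering what "just temporal and a bit of ML" actually means: the common core of these techniques (DLSS 2, TAA-style upscalers, and presumably whatever Insomniac is doing) is to render a jittered lower-resolution frame, reproject the previous full-resolution output using motion vectors, and blend the two. A deliberately simplified sketch of that accumulation step, not any vendor's actual pipeline:

Code:
import numpy as np

def temporal_upscale_step(low_res_frame, history, motion_vectors, blend=0.1):
    """One step of a toy temporal upscaler (assumes an even output resolution).

    low_res_frame  : current frame rendered at half resolution, shape (H/2, W/2, 3)
    history        : previous full-resolution output, shape (H, W, 3)
    motion_vectors : per-pixel offsets, shape (H, W, 2), used to reproject the history
    blend          : weight given to the new frame; real upscalers vary this, and
                     DLSS 2 replaces the fixed heuristic with a trained network
    """
    h, w, _ = history.shape

    # 1. Naively upsample the current low-res frame to output resolution.
    current = np.repeat(np.repeat(low_res_frame, 2, axis=0), 2, axis=1)[:h, :w]

    # 2. Reproject the history buffer along the motion vectors (nearest-neighbour).
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip((ys - motion_vectors[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip((xs - motion_vectors[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]

    # 3. Blend: keep most of the accumulated detail, fold in a little new information.
    return (1.0 - blend) * reprojected + blend * current

Real implementations add disocclusion/confidence handling and neighbourhood clamping, and DLSS 2 swaps the fixed blend heuristic for a trained network, but that loop is the "temporal" part being referred to.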
 
Just temporal and a bit of ML. Go watch DF's latest Spiderman analysis; they had no idea what Spiderman was doing, nor could they tell.

The tech already exists in games, so....

Spiderman.... Was that using an Insomniac special technique, or is it in the hardware? Nobody knows.
Special technique? The ray traced reflection resolution was higher than that of the actual object. Or is it the other way around, by a humongous margin?
 
DLSS is too big a deal for me if AMD has no equivalent to offer

They've got some sharpening tech they should show off in today's presentation. Not much is known about it, but they're not stupid; they know they have to have some answer to DLSS 2.0, even if it's half as good. I personally would take something inferior that can be used in most games over something that is only implemented in about 1% of titles.
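
For context on why a sharpening-based approach can be "used in most games": sharpening is just a post-process over the final frame, so it needs no per-title training or engine integration. A rough, generic illustration of the idea (a plain local-contrast boost, not AMD's actual CAS/RIS code):

Code:
import numpy as np
from scipy.ndimage import uniform_filter

def simple_sharpen(frame, amount=0.5):
    """Toy post-process sharpen: push each pixel away from its local average.
    Operates on any final (H, W, 3) frame with values in [0, 1];
    no game-specific data or training needed."""
    local_avg = uniform_filter(frame, size=(3, 3, 1))  # 3x3 spatial average per channel
    return np.clip(frame + amount * (frame - local_avg), 0.0, 1.0)

The flip side, as the post implies, is that a filter like this can't reconstruct detail the way a temporally accumulated, ML-weighted upscaler can; it can only make an upscaled image look crisper.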
 
All I know is that AMD better be extremely cheap and absolutely kill Nvidia's lineup in performance in every metric like some of these fanboys claim. If not, I'm definitely gonna call out some people. ~3 hours to go!

Who claimed they were going to beat Nvidia on every metric?

Already we pretty much know they are going to lose at Ray Tracing for example.

The cards will likely trade blows at 4K with it being game by game wins for each card. One card or the other might end up with a slight advantage overall depending on how many games are benchmarked once we get 3rd party benchmarks.

Regarding price they could potentially match Nvidia but I think they will likely be $50 lower.

Power draw and efficiency are up in the air, but I think these cards will most likely draw a little less than Nvidia's; then again, who knows, maybe they match or even draw more?

AMD will probably have a slight performance advantage at 1440p.

They will almost assuredly clock higher than Nvidia.

At least below the 3090 tier they will have more memory.

Only like 3 hours to go! I'd be curious to see what other metrics you are talking about here and who was claiming AMD would beat Nvidia in all of these?
 

tusharngf

Member
6800 12GB - 3070 +/- 5% $450
6800 XT 16GB - 3080 -5% $650
6900 XT 16GB - 3080 +10% $900

If true!! Count me excited.
 
Looks like the naming is confirmed for the 6800XT (72CU?) and 6800 (60-64CU?)



The only question left is the rumoured AMD only Navi 21 XTX card.

Is it 80CU 6900XT?
Or is it 80CU special edition 6800XTX?
Or does it have the same CU count as 6800XT but higher clocks/binned chips?
 
I find the XT/XTX naming dumb.
OK, I can take XT as a "level up" of sorts, but one is enough.
A 6800 XTX would be better off as a 6900.

What makes it even more confusing is that the code names of the die configurations don't necessarily match the retail card names.

We know that there are three Navi21 variants:
Navi21 XTX
Navi21 XT
Navi21 XL

The XT seems to correspond to the 6800XT, while the XL seems to correspond to the 6800.

Will they actually use "XTX" for the retail name of the top card?
 

AGRacing

Member
I want AMD to get cocky about stock on these cards... take some shots... "We have hundreds upon hundreds of thousands of these ready to go - I mean we went crazy - don't bother scalpers".
 
I want AMD to get cocky about stock on these cards... take some shots... "We have hundreds upon hundreds of thousands of these ready to go - I mean we went crazy - don't bother scalpers".

Even if they have far more stock than Nvidia did (and I think they most likely do), I think there will still probably be shortages after a while due to pent up demand for GPUs, people switching from Pascal/Polaris and all the extra lockdown money burning a hole in everyone's pockets as they can't go outside and spend it.

I mean I could be wrong but I expect there will still be shortages of some variety with these cards.
 

Dodkrake

Banned
Navi 21
- 6800: 64CU
- 6800 XL: 72CU
- 6900 XT: 80CU

Navi 22
- 6700: 36CU
- 6700 XTL: 40CU

Navi 21 Lite
- 56 CU

This is what we know so far about the Navi 22 and Navi 21 cards. The 6800 CU counts just leaked. Navi 21 Lite will likely not be shown.
 

duhmetree

Member
Navi 21
- 6800: 64CU
- 6800 XL: 72CU
- 6900 XT: 80CU

Navi 22
- 6700: 36CU
- 6700 XTL: 40CU

Navi 21 Lite
- 56 CU

This is what we know so far about the Navi 22 and Navi 21 cards. The 6800 CU counts just leaked. Navi 21 Lite will likely not be shown.
So, the PS5 is a 6700?

Final guesses...
6800 - $399 - 3070 comparable
6800XL - $599 - 3080 comparable
6900XT - $799 - 3090 comparable
 

Oh I see, that just confirms the name and memory size for 6800 and 6800XT.

The CU counts shown in the chart are just the current speculation based on the best rumours that we have. The actual CU counts have not been officially confirmed, although most well-connected people are betting on 80/72/64.
 
I don't care about the price. I care about availability. Give me performance on par with Nvidia and the ability to JUST BUY IT and you'll have my money, AMD.
 

Dodkrake

Banned
Oh I see, that just confirms the name and memory size for 6800 and 6800XT.

The CU counts shown in the chart are just the current speculation based on the best rumours that we have. The actual CU counts have not been officially confirmed, although most well-connected people are betting on 80/72/64.

Is it not based on a listing as mentioned? Because the rumors didn't point to a 64CU card.
 

AGRacing

Member
... and all the extra lockdown money burning a hole in everyone's pockets ...

Gee... I'm glad so many of you made extra $$$ during the lockdown .... enough where it is a known factor. My family had to dig into savings. Not blaming you... but SMH... that really annoys the hell out of me.... especially since I will bet you my taxes get spiked yet again to pay for all these apparently government funded PC upgrades. Ridiculous.
 
Gee... I'm glad so many of you made extra $$$ during the lockdown .... enough where it is a known factor. My family had to dig into savings. Not blaming you... but SMH... that really annoys the hell out of me.... especially since I will bet you my taxes get spiked yet again to pay for all these apparently government funded PC upgrades. Ridiculous.

I'm sorry to hear about your situation, I certainly wasn't trying to rub it in or anything of that nature.

Right now the majority of people are still working in some capacity; if their earnings are unaffected, it means they would normally be going to restaurants, bars, the cinema, shopping more, etc...

The fact that many people are not doing that during lockdowns means they likely have a surplus of income compared to how they normally operate.

These GPUs and the 3000 series are luxury, high-cost items, so it makes sense that when talking about cards that start (right now anyway) at $500 and go all the way up to $1500, it would be people with enough disposable income to afford them. I'm certainly not rich or in a six-figure job; I live in Ireland, and if you are from the US then the average wage here is likely much lower than in your country (depends on the state and city I guess).

Luckily my wife and I are both still employed right now. I likely won't be buying a new GPU until March of next year at the earliest, so I'm in no rush, but it makes sense that 90% of people in these threads would have enough disposable income in the current economic climate to afford these cards.

I'm sorry about your situation and I genuinely wish you and your family the best. But coming into a thread about luxury, non-essential electronics on a gaming forum to browbeat people over offhand comments, including their predictions for sales and shortages of these luxury items given the current big changes to society and the economy, and the fact that most people are still working in some capacity, is a little much.

Not trying to start anything with you, and I do hope things improve for you and your family, but I don't really want to get into a big coronavirus argument right now. We should all be excited about the prospect of these cards as gaming enthusiasts, regardless of our financial or other real-life situations.
 