
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

TheContact

Member
What: AMD is holding an event for their RDNA2 GPUs.

When: Wednesday, October 28th at 12 PM EST / 4PM GMT [Edit: The Event has concluded but you can watch the VoD using the same link]

Where: AMD's official YouTube page [HERE]


Summary from the event

AMD Radeon RX 6800 specs: (release date 11.18.2020 | price $579, around £440 / AU$820)
  • 60 compute units
  • 1,815MHz game clock
  • 2,105MHz boost clock
  • 128MB Infinity cache
  • 16GB GDDR6 memory
  • 250W total board power


AMD Radeon RX 6800 XT specs: (release date 11.18.2020 | price $649, around £500 / AU$1,000)
  • 72 compute units
  • 2,015MHz game clock
  • 2,250MHz boost clock
  • 128MB Infinity cache
  • 16GB GDDR6 memory
  • 300W total board power


AMD Radeon RX 6900 XT specs: (release date 12.8.2020 | price $999, around £770 / AU$1,400)
  • 80 compute units
  • 2,015MHz game clock
  • 2,250MHz boost clock
  • 128MB Infinity cache
  • 16GB GDDR6 memory
  • 300W total board power
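
For a rough sense of what these spec sheets mean on paper, here's a back-of-envelope sketch (a minimal sketch assuming RDNA2's usual 64 stream processors per CU and 2 FLOPs per clock per processor; theoretical paper numbers, not benchmark results):

```python
# Back-of-envelope FP32 throughput from the spec lists above.
# Assumption: 64 stream processors per CU and 2 FLOPs per clock per
# processor (one fused multiply-add). Paper numbers, not benchmarks.
def peak_tflops(compute_units: int, clock_mhz: int) -> float:
    stream_processors = compute_units * 64
    flops_per_second = stream_processors * 2 * clock_mhz * 1e6
    return flops_per_second / 1e12

for name, cus, game_mhz in [
    ("RX 6800", 60, 1815),
    ("RX 6800 XT", 72, 2015),
    ("RX 6900 XT", 80, 2015),
]:
    print(f"{name}: ~{peak_tflops(cus, game_mhz):.1f} TFLOPS at game clock")
# RX 6800: ~13.9 | RX 6800 XT: ~18.6 | RX 6900 XT: ~20.6
```
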
Images from the event

[GENERAL INFO: image]

[6800 XT STATS: images]

[COOL SHIRT: image]

[SMART ACCESS MEMORY (USES RYZEN 5000 CPU TO BOOST GPU PERFORMANCE): images]

[REDUCED LAG TECHNOLOGY: image]

[HARDWARE BENEFITS: image]

[6800 XT PRICE ($649): image]

[6800 (NON-XT) STATS: images]

[6800 NON-XT PRICE ($579): image]

!!BIG DADDY GPU ALERT!!

[RX 6900 XT STATS: images]

[6900 XT PRICE ($999): image]

[RELEASE DATES: image]
 

Papacheeks

Banned
When: October 28th at 12 PM EST

Where: The Event can be watched on AMD's official YouTube page [HERE]

What: AMD is holding an event for their RDNA2 GPUs. Their flagship GPU is set to compete with Nvidia's 3080.

This GPU uses the same RDNA2 architecture found in the next-gen systems, so even non-PCMR gamers might find this interesting.

The flagship card is reportedly named the RX 6900 XT and is set to compete with the 3080. While it uses slower GDDR6 memory compared to Nvidia's GDDR6X, it does have 16GB instead of the 10GB we find on the 3080.
The clock speeds are also supposedly up to 2.3GHz (compared to Nvidia's 1.7GHz).
On paper this would make the 6900 XT more powerful than the 3080, but we'll have to wait for real-world benchmarks to know for sure.
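
To put the memory comparison in numbers: raw bandwidth is just bus width times per-pin data rate. A quick sketch (the 256-bit/16Gbps configuration for Big Navi was the rumour at the time, so treat those figures as assumptions):

```python
# Raw memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin rate.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(320, 19))  # RTX 3080, GDDR6X: 760.0 GB/s
print(bandwidth_gbs(256, 16))  # rumoured Big Navi, GDDR6: 512.0 GB/s
```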

I would like to see them show off how their ray tracing works, specifically in comparison to Nvidia's.

[image]

According to the other thread and Igor's Lab, it's better than Turing in ray tracing performance, but not as good as Ampere. Rasterization-wise it's better than Turing and, in a good number of tests, beats Ampere at lower wattage.

And that's not including AIB cards, which will clock higher.
 

regawdless

Banned
I don't think you got the boost clocks right. The 3080 boosts up to around 2GHz in-game.

It's 1.71GHz on the spec sheet due to how their boost clocks are defined, I think. Correct me if I'm wrong here though.
 

ethomaz

Banned
I don't think you got the boost clocks right. The 3080 boosts up to around 2GHz in-game.

It's 1.71GHz on the spec sheet due to how their boost clocks are defined, I think. Correct me if I'm wrong here though.
nVidia cards go over their boost clocks... in fact they run above them most of the time.
AMD cards run below the boost clock most of the time... closer to the game clock.

They have different measurements.
nVidia's boost clock is like the minimum it will reach in most cases.
AMD's boost clock is like the maximum it will reach in some very favorable scenarios (which is why they have a Game clock, which is more like nVidia's boost clock).
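
A toy way to encode that reading of the spec sheets (illustrative only; the clocks are the spec-sheet figures discussed in this thread, not measurements):

```python
# Toy encoding of the heuristic above: nVidia's spec-sheet boost clock acts
# roughly as a floor in games, while AMD's realistic figure is its game clock
# and its boost clock is an occasional peak.
def expected_sustained_mhz(vendor: str, boost_mhz: int, game_mhz: int = 0) -> int:
    if vendor == "nvidia":
        return boost_mhz  # in-game clocks tend to sit at or above this
    return game_mhz       # AMD's own "typical gaming" number

print(expected_sustained_mhz("nvidia", 1710))     # RTX 3080: >= ~1710 in practice
print(expected_sustained_mhz("amd", 2250, 2015))  # RX 6800 XT: ~2015 typical
```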
 

Bolivar687

Banned
It's been a long time since AMD broke my heart with Vega.

[gif]


I have a G-Sync monitor now, but none of that stuff really matters at high framerates anyway - just give me the performance and the availability and I'm there!!! :messenger_blowing_kiss:
 

Papacheeks

Banned
It's been a long time since AMD broke my heart with Vega.

[gif]


I have a G-Sync monitor now, but none of that stuff really matters at high framerates anyway - just give me the performance and the availability and I'm there!!! :messenger_blowing_kiss:

It's seriously good that there is competition again. Even if these don't flat-out beat Nvidia, it puts pressure on them to give us better bang for the buck, something Nvidia has been overcharging on for many years.

It's the reason I won't buy an Nvidia card for a long time.

In my lifetime I've bought Voodoo cards, a 6800 GT, 7800 GT, 7900 GTX, GTX 285, GTX 570, and GTX 770. After the 700 series and the debacle with the memory on the 900 series, I was done with Nvidia for a while.
During the DX9 era I had a couple of Radeon cards, and for the money, a Sapphire X1600 Pro was surprising back in the day for a 128-bit card.

Nvidia, though a large company with a lot of investment in AI among other things, has become kind of an ass of a company about giving people actual bang for the buck.
And I've found that, over time, Radeon cards seem to age better.

I'm waiting till spring to go all out on a new build to replace my Radeon VII, but I will be buying close to the highest-end Radeon by Sapphire when they release.
 

Papacheeks

Banned
I'm a realist and I don't expect Big Navi to beat Ampere, but nevertheless I'm hyped. It looks like they are slowly but surely catching up, and as opposed to the RDNA1 cards, we won't be getting outdated hardware at launch this time around.

RDNA 1 on average beat an RTX 2070 Super. It just didn't have dedicated hardware for ray tracing, though it could do it through software.
 
nVidia cards go over their boost clocks... in fact they run above them most of the time.
AMD cards run below the boost clock most of the time... closer to the game clock.

They have different measurements.
nVidia's boost clock is like the minimum it will reach in most cases.
AMD's boost clock is like the maximum it will reach in some very favorable scenarios (which is why they have a Game clock, which is more like nVidia's boost clock).
Off topic, but those rare occasions when I can actually agree with ethomaz give me hope for our society as a whole.
 
Nice, looking forward to seeing what they reveal, along with the final configuration, software stack/features, and pricing.

I get the feeling a lot of people expecting bargain basement pricing will probably be pretty disappointed.

If leaks are accurate (which they seem to roughly be?), then it looks like the 6800XT should be a solid competitor to the 3080, with ray tracing a bit behind Ampere but ahead of Turing.

Which would be a massive win for AMD, especially when there were tons of people only expecting them to be able to compete with the 3070 at best.

I will be interested in seeing the final naming scheme and their very top card. Is it a special edition of the 6800XT (same CUs, better binned, higher clocked)?

Is it the full 80CU (6900XT?) card vs a potential 72CU 6800XT?

I suspect the clocks on the reference models will sit somewhere between 2.1-2.2GHz (game clock), maybe boosting occasionally higher?

AIB cards, OC'd I'd assume, can according to leaks reach around 2.3-2.4GHz sustained clocks.

Just setting expectations correctly, as some might mistake the leaked clocks of AIB models for reference-model clocks once the reveal happens and claim the cards don't live up to the hype. I'm looking at you, longdi :messenger_winking_tongue:
 
Yeah, I think it is unfair to place the success or failure of large projects on just one person.

We see this a lot with famous directors of films or games (Spielberg, Kojima, etc.) when it is normally a whole team that's responsible. So I think Raja probably gets a pretty raw deal in the head canon of AMD superfans.

Having said that, he seemed to be the one pushing towards compute-heavy cards with Vega, and since he left, the Radeon group seems to have gone from strength to strength with RDNA. Of course, this could be coincidental with the fact that AMD were actually profitable again thanks to Ryzen and had revenue to invest in R&D, new hires, and expansion. In addition, since he joined Intel, their GPU division hasn't seemed to move the needle yet, but again that could be coincidence, with Intel being Intel and floundering a lot as of late.

Either way, AMD looks like they have come a long way if all the leaks end up panning out.
 

Pagusas

Elden Member
So expectations:

1. I expect the 6900XT to trade blows with the 3080 in rasterized performance and likely be 5% better in a few cases.
2. I expect it to be about 33% worse in RT cases.
3. I expect a DLSS alternative to be talked about but not launching until "summer 2021". I expect it to be a universal upscaling AI that is not game-dependent (thus 99.9% compatible with games). I expect it won't look as good, but it'll be roughly good enough to check the box and please most people.
4. I expect a 6900XTX to be a limited-availability version with higher boost clocks and maybe more CUs? Doubtful, but maybe.
5. I expect the 6900XT to be $799, and the 6900XTX to be $899 or $949.


And one thought: if they are using their standard cores for RT calculations, I wonder if they can do a Crossfire solution that has one card dedicating its full power to RT operations? If so, that could be monumental for a lot of us, especially if you could pair different cards, like a 6700 and a 6900, together. Yeah, it would cost a lot, but who cares, we are enthusiasts, and people like me would buy a 3900 RTX if it were available right now.
 

Senua

Member
So expectations:

1. I expect the 6900XT to trade blows with the 3080 in rasterized performance and likely be 5% better in a few cases.
2. I expect it to be about 33% worse in RT cases.
3. I expect a DLSS alternative to be talked about but not launching until "summer 2021". I expect it to be a universal upscaling AI that is not game-dependent (thus 99.9% compatible with games). I expect it won't look as good, but it'll be roughly good enough to check the box and please most people.
4. I expect a 6900XTX to be a limited-availability version with higher boost clocks and maybe more CUs? Doubtful, but maybe.
5. I expect the 6900XT to be $799, and the 6900XTX to be $899 or $949.
I assume you mean the 6800XT?
 
And one thought: if they are using their standard cores for RT calculations, I wonder if they can do a Crossfire solution that has one card dedicating its full power to RT operations? If so, that could be monumental for a lot of us, especially if you could pair different cards, like a 6700 and a 6900, together. Yeah, it would cost a lot, but who cares, we are enthusiasts, and people like me would buy a 3900 RTX if it were available right now.
That would be HUGE and amazing if they implemented something like this.
 
For my expectations:

1. I expect the 6800XT to match or slightly beat the 3080 in performance (obviously depending on the title, with some wins for both cards).

2. I expect there to be a card above the 6800XT, either a 6900XT or a 6800XT special edition, with maybe higher base clocks... faster memory maybe (16Gbps vs 18)?

3. I expect Infinity Cache to be a real thing, talked about on stage, and to be how they are closing the bandwidth gap of a 256-bit bus (see the sketch after this list).

4. I expect RT to be about 20% behind the 3080 for the 6800XT, so this will be most noticeable in fully path-traced games (Quake, Minecraft). In hybrid rendering situations (i.e. most games) I expect that gap to shrink a bit. (Wasn't the real FPS difference for RT from Turing to Ampere in most games measured at around 10%? I remember reading that somewhere.) Who knows by how much, but Ampere clearly has the win with RT here over RDNA2.

5. I also expect some kind of DLSS competitor announced, possibly demoed on stage in one title, and probably launching with a driver update in December/January. I think it will likely be almost as good as DLSS but work on every game at the driver level. My thinking is very similar to Pagusas here, only I believe they will launch it earlier than summer.

6. I expect $649 for the 6800XT. If the higher card is a 6900XT that competes with the 3090, and is not a special edition of the 6800XT, then I expect it to be priced at $799-$1,000.

7. I expect lower, or at most equal, power draw across the stack compared to Ampere.

8. I expect some implementation of Microsoft's DirectStorage API, much like RTX IO.

9. I expect possibly some new unannounced software features, or upgraded versions of existing features such as Anti-Lag. I expect these will likely be available as part of GPUOpen. This one is kind of vague, I admit, but if they are working on some new features for their suite, then this would be the perfect time to announce them.
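
On point 3, here is a minimal sketch of the usual argument for how a big on-die cache can close a bus-width gap. The hit rate and cache bandwidth below are illustrative assumptions, not AMD figures:

```python
# If a fraction `hit_rate` of memory traffic is served from the on-die cache,
# DRAM only sees (1 - hit_rate) of it, so the request bandwidth the chip can
# sustain is dram_gbs / (1 - hit_rate), capped by the cache's own bandwidth.
def effective_bandwidth_gbs(dram_gbs: float, hit_rate: float, cache_gbs: float) -> float:
    return min(dram_gbs / (1.0 - hit_rate), cache_gbs)

# 256-bit GDDR6 at 16Gbps = 512 GB/s raw; the 0.58 hit rate and 2000 GB/s
# cache bandwidth are assumptions for illustration.
print(effective_bandwidth_gbs(512.0, hit_rate=0.58, cache_gbs=2000.0))  # ~1219 GB/s
```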
 

longdi

Banned
Nice, looking forward to seeing what they reveal, along with the final configuration, software stack/features, and pricing.

I get the feeling a lot of people expecting bargain basement pricing will probably be pretty disappointed.

If leaks are accurate (which they seem to roughly be?), then it looks like the 6800XT should be a solid competitor to the 3080, with ray tracing a bit behind Ampere but ahead of Turing.

Which would be a massive win for AMD, especially when there were tons of people only expecting them to be able to compete with the 3070 at best.

I will be interested in seeing the final naming scheme and their very top card. Is it a special edition of the 6800XT (same CUs, better binned, higher clocked)?

Is it the full 80CU (6900XT?) card vs a potential 72CU 6800XT?

I suspect the clocks on the reference models will sit somewhere between 2.1-2.2GHz (game clock), maybe boosting occasionally higher?

AIB cards, OC'd I'd assume, can according to leaks reach around 2.3-2.4GHz sustained clocks.

Just setting expectations correctly, as some might mistake the leaked clocks of AIB models for reference-model clocks once the reveal happens and claim the cards don't live up to the hype. I'm looking at you, longdi :messenger_winking_tongue:

Hi, you called me? :messenger_waving:

Just think: if the reference game clocks are about 2GHz, why would AMD leave performance on the table by letting AIBs play up to 2.4GHz?

AMD hasn't had a big competitive success since the 290X-ish days. You would think they would go in hard on Nvidia ASAP.
 
Hi, you called me? :messenger_waving:

Just think: if the reference game clocks are about 2GHz, why would AMD leave performance on the table by letting AIBs play up to 2.4GHz?

AMD hasn't had a big competitive success since the 290X-ish days. You would think they would go in hard on Nvidia ASAP.

Ah you know it's all in good fun! Just ribbing you a little! :messenger_grinning_smiling:

Supposedly the rumour is that the cards overclock very well, and AMD wanted to make AIBs happy by leaving them a little room to OC and differentiate their cards/brand vs the competition/reference. This also allows AMD to keep their official power draw lower and show good perf-per-watt in reviews, while allowing AIBs to go balls to the wall with higher clocks and power draw.

We have already seen leaks from people in possession of some AIB cards, and they do in fact seem to clock at 2.3-2.4GHz game clocks. So I'm just saying: if the reference cards clock at 2.1 or 2.2GHz, I want to nip that narrative of underperformance, or of not living up to the hype train, in the bud before it starts.
 

Boneyblaff

Member
I've got a Red Devil 5700 XT, so the angel on my shoulder says I don't need to buy whatever they're peddling... But I already have money set aside, sooo there's that.
 

ethomaz

Banned
My expectations are low.

- The RX 6800XT will trade blows with the RTX 3070, being a bit better in some games.
- The RX 6800XT will be a benchmark monster, with scores similar to or even higher than the RTX 3080 in Firestrike.
- I expect a future card to match RTX 3080 performance in games.
- I expect it to be heavily limited in RT performance.
- No solution like nVidia DSLL yet.
 

Rentahamster

Rodent Whores
And one thought: if they are using their standard cores for RT calculations, I wonder if they can do a Crossfire solution that has one card dedicating its full power to RT operations? If so, that could be monumental for a lot of us, especially if you could pair different cards, like a 6700 and a 6900, together. Yeah, it would cost a lot, but who cares, we are enthusiasts, and people like me would buy a 3900 RTX if it were available right now.
That would be great if possible. I wonder if the increased bandwidth of PCIe 4.0 could make implementing something like that easier now than in the past.
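
For reference, the doubling in question (PCIe has used 128b/130b encoding since gen 3, so per-direction throughput scales directly with the transfer rate):

```python
# Approximate per-direction PCIe throughput:
# transfer rate (GT/s) * 128b/130b encoding efficiency / 8 bits per byte * lanes.
def pcie_gbs(gt_per_s: float, lanes: int = 16) -> float:
    return gt_per_s * (128 / 130) / 8 * lanes

print(f"PCIe 3.0 x16: ~{pcie_gbs(8):.1f} GB/s")   # ~15.8 GB/s per direction
print(f"PCIe 4.0 x16: ~{pcie_gbs(16):.1f} GB/s")  # ~31.5 GB/s per direction
```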
 

Rentahamster

Rodent Whores
Now that the console releases are close, I expect AMD can be a bit more forthcoming about the new features that the consoles also have. I suspect that AMD had to schedule their announcements and info dumps around MS's and Sony's schedules so as not to conflict with their plans for releasing info about their consoles.

I'm curious to see what fruits have been borne of AMD's collaborations with MS and Sony.
 

thelastword

Banned
My expectations are low.

- The RX 6800XT will trade blows with the RTX 3070, being a bit better in some games.
- The RX 6800XT will be a benchmark monster, with scores similar to or even higher than the RTX 3080 in Firestrike.
- I expect a future card to match RTX 3080 performance in games.
- I expect it to be heavily limited in RT performance.
- No solution like nVidia DSLL yet.
What is DSLL?
 
My expectations are low.

- The RX 6800XT will trade blows with the RTX 3070, being a bit better in some games.
- The RX 6800XT will be a benchmark monster, with scores similar to or even higher than the RTX 3080 in Firestrike.
- I expect a future card to match RTX 3080 performance in games.
- I expect it to be heavily limited in RT performance.
- No solution like nVidia DSLL yet.

Why are there so many uninformed opinions on this board? And, just as bad, so many AMD fanboys? Guys, there is a rational middle between these two things. Do some research before posting this nonsense.
 

Ascend

Member
My expectations are low.

- The RX 6800XT will trade blows with the RTX 3070, being a bit better in some games.
- The RX 6800XT will be a benchmark monster, with scores similar to or even higher than the RTX 3080 in Firestrike.
- I expect a future card to match RTX 3080 performance in games.
- I expect it to be heavily limited in RT performance.
- No solution like nVidia DSLL yet.
[reaction gif]
 

Rentahamster

Rodent Whores
My expectations are low.

- The RX 6800XT will trade blows with the RTX 3070, being a bit better in some games.
- The RX 6800XT will be a benchmark monster, with scores similar to or even higher than the RTX 3080 in Firestrike.
Comparable to a 3070 in actual games but beating a 3080 in benchmark apps? What?

Maybe, depending on whether ray tracing is involved, but that's a pretty big discrepancy.
 

llien

Member
Yeah, I think it is unfair to place the success or failure of large projects on just one person.
Jim Keller?
Von Braun?
Korolev (Soviet Space Program)

There are people with monumental impact.
While it depends on the work of many, most of them are easily replaceable.
 
Jim Keller?
Von Braun?
Korolev (Soviet Space Program)

There are people with monumental impact.
While it depends on the work of many, most of them are easily replaceable.

I'm not disagreeing with you exactly, and personally I'm glad that Raja has moved on. I agree that certain key people can have a gigantic impact if they have the right level of ability, vision, team behind them, backing from execs, and finances, and most importantly are in the right place at the right time to maximize that impact.

But at the same time, in a general sense, forgetting about AMD or GPUs for a moment: on any large project, oftentimes the figurehead or the overall person in charge tends to take the spoils and credit if things go well, and also the blame and ire of management/the public if things go wrong. It's just a fact of life, so I was thinking maybe Raja is actually a really nice guy with a lot of talent who happened to be in the wrong place at the wrong time while a bunch of dominoes were falling? Hard to say exactly; more just food for thought.
 

llien

Member
But at the same time, in a general sense, forgetting about AMD or GPUs for a moment: on any large project, oftentimes the figurehead or the overall person in charge tends to take the spoils and credit if things go well, and also the blame and ire of management/the public if things go wrong. It's just a fact of life, so I was thinking maybe Raja is actually a really nice guy with a lot of talent who happened to be in the wrong place at the wrong time while a bunch of dominoes were falling? Hard to say exactly; more just food for thought.
Well, that's because someone always leads, that someone is not necessarily particularly good at it, and looking from the outside we can only guess.

My problem with Koduri is not only the lackluster projects (Vega was outright terrible; RDNA on the same process beats a 330mm² chip with HBM2 using a 250mm² chip with GDDR), but also the embarrassing comments and marketing.
Both of those things stopped once he was gone.
I doubt it is just a coincidence.
 

SantaC

Member
Jim Keller?
Von Braun?
Korolev (Soviet Space Program)

There are people with monumental impact.
While it depends on work of many, most of them are easily replaceable.
You forgot the most important one in history: Robert Oppenheimer.
 

notseqi

Member
I just saw a documentary about the Manhattan Project. It is easily the most difficult project ever handled.

I was in awe.
It's bananas. Did you read the Feynman autobiography, 'Surely You're Joking, Mr. Feynman!'? It sheds some light on how unattached they were to the political cause while using it to further their own ambitions. Great read.

edit: i a word
 