
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Rikkori

Member
Since when does "trading blows" mean "wiping the floor" with someone? You are either trolling or just living in a dream reality.

VRS the future??? That future would have come to fruition a while ago if it were true. AMD is developing something to combat DLSS because they know they need to, or else they would not have asked everyone to please wait until 2021 to see what they have. I have a feeling it will be competitive with DLSS.

It's good to see AMD competitive, but it's hyperbolic misinformation like this shit that leads to disappointment when people take it seriously.

I also agree that it "wipes the floor", since it's slightly faster, cheaper, consumes less power, overclocks better and has 60% to 100% more VRAM. What more could you possibly want?
Exactly. It's like when Olympic athletes beat each other by a second: it's not a big deal to us, but in context that one second can be the difference between the elite and an amateur. It's the same here. Neither AMD nor Nvidia is going to come out with a card 100% faster than the other's, but winning in ALL these metrics is basically a wipe-out. I mean really, Nvidia literally has nothing left on the HW side where it wins! People are really underestimating what a gigantic achievement this is, and remember, Nvidia is no slouch; they've been pushing hard with every release!
 

BluRayHiDef

Banned
Microsoft's DirectML upscaling vs bilinear:

carcompare.png


twocars.png
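For context on what's being compared there: bilinear filtering just blends the four nearest source pixels, so it can't invent detail, while an ML upscaler infers plausible high-frequency detail from training data. A minimal sketch of the bilinear baseline (Python/numpy assumed; this does not attempt the DNN path DirectML would actually run):

```python
# Naive bilinear upscaling: every output pixel is a weighted blend of its
# four nearest source pixels. No new detail is created, hence the softness.
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)   # fractional source coordinates
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                  # vertical blend weights
    wx = (xs - x0)[None, :]                  # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

An ML-based upscaler replaces that fixed blend with a trained network, which is why the two images above look so different.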




DLSS 2 is a glorified TAA => a blurry, detail-wiping transformation, a very far cry from what insane people claim it to be.

Ahem.

 
While I think the RDNA2 cards are very impressive performance-wise, let's not go crazy.

They don't "wipe the floor" with Ampere, that's silly.

The 6800XT will trade blows with the 3080; on a game-by-game basis either card will "win", normally by a small number of frames or a small percentage. Some will be so close it will essentially be a draw (1-3fps difference). If you turn on SAM and Rage Mode the 6800XT will generally pull ahead of the 3080 by a little, and the 3080 will probably pull ahead a little in games with RT enabled. We are not going to see night-and-day differences here, people.

The 6900XT without SAM and RM on will likely be just a hair or so behind the 3090 in rasterization when all is said and done. Maybe like 2 or 3% behind overall? Possibly 5% at max. If you turn on SAM and RM it will generally close the distance to essentially be on par. Granted the card draws 50 watts less and costs $500 less for very close performance. If the OC headroom rumours are true then pushing up the power envelope to match 3090 should provide pretty great results.

And the 6800 clearly beats the 3070; a clear, undisputed win.

Now, given that these cards are cheaper, draw less power, and sit on a smaller chip/card with fewer transistors and a smaller bus, not to mention cool exclusive features like SAM and Infinity Cache and more VRAM (outside the 3090), the fact that they are competitive in performance with Nvidia is pretty amazing.

But they are not going to demolish Nvidia or wipe the floor with the 3000 series in rasterization performance. Anyone expecting that is going to be disappointed come benchmark day. They will be roughly on par on a game-by-game basis, with wins and losses for each card. Depending on the titles benchmarked and how many of them there are, the 3000 series might actually pull ahead in the overall average by a 1-2% margin. Let's just wait and see final performance in real benchmarks; the impressive part is that both Nvidia and AMD are pretty much on par now in performance, which is great for everyone.

The cards reportedly do have a lot of overclocking headroom, so doing a manual OC, especially in the case of the 6900XT, which is up against a 3090 that draws 50 more watts, should provide additional performance gains. I expect AIB OC models of all the 6000 series to really excel. However, we have not seen "proof" of this with benchmarks/announcements from AIBs, so let's not take it as confirmed fact until we get confirmation.

I think AMD seem to have done an amazing job here, especially seeing how far behind they were before and how much bigger Nvidia is as a company compared to the Radeon group inside AMD.
 

Lethal01

Member
but winning in ALL these metrics is basically a wipe-out. I mean really, Nvidia literally has nothing left on the HW side where it wins!

I guess I haven't been paying attention.
Does the 6800XT have better Raytracing performance than the RTX3080?
 

llien

Member
DF shit, I won't even click it.

Why do we even need some youtuber to make up our minds?
Had the statement in the title been true, I wouldn't have spotted the blurry mess and identified which of the pics is actual 4K and which is Nvidia's TAA-derivative upscaling from 1440p, so quit pushing this BS.
 
I know November 18th (?) is when they'll be available, but will that just mean pre-orders are up, or will they come sooner? I am in the UK and want to order a 6800XT as it will be a nice upgrade to my 1080Ti for 4K. Not fussed about raytracing - just want a solid 4K60fps in as high a setting as possible.
 

llien

Member
done a quick SOTR raytraced shadows bench maxed at 4k on my rtx3080:

sotr4krtmjj9o.png


vs.

rx 6800 (presumably)

amd-radeon-rx-6800-toemj92.jpg

That's curious, although exactly the same FPS looks fishy.

I don't think brute-force hardware RT as NV is pushing it is going anywhere, but it would be hilarious if AMD beats greens even at that.

That Zen3 cache inside might play a major role in it.


I know November 18th (?) is when they'll be available, but will that just mean pre-orders are up, or will they come sooner?
The 6800 and 6800XT "will go on sale" on the 18th of Nov, the 6900XT on the 8th of Dec.
Whatever "will go on sale" means.
The statement would sound dumb if it actually meant "would go on pre-order", but we'll see soon enough.

I personally expect a sizable number of cards, definitely more than the 3080, to be available for sale, and pre-orders won't be closed the way they were for the 3080, as AMD actually wants to sell those cards at that price, unlike Nvidia.
 


This really sells the RX 6000 graphics cards. What they demonstrated:

- DXR 1.1 ray tracing, which the 6800XT seems to handle with ease
- Infinity Cache allowing the ultra texture quality setting (4K x 4K texture sizes), which eats 12GB of VRAM (this simply wouldn't be possible on a 3070 with the same performance; rough math in the sketch below)
- FidelityFX Contrast Adaptive Sharpening (CAS) looking very impressive, the image going from washed-out and blurry to sharp and contrasty (toy sketch below)
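Rough math behind the 12GB claim, as a back-of-the-envelope sketch (uncompressed RGBA8 with full mip chains is my assumption; real games use block compression, which cuts this by 4-8x):

```python
# Back-of-the-envelope VRAM cost of one 4K x 4K texture (assumed
# uncompressed RGBA8 plus a full mip chain; not a measurement).
def texture_mib(size: int = 4096, bytes_per_texel: int = 4) -> float:
    base = size * size * bytes_per_texel   # 64 MiB at 4096^2 RGBA8
    return base * 4 / 3 / 2**20            # a full mip chain adds ~1/3

print(f"{texture_mib():.0f} MiB per texture")                  # ~85 MiB
print(f"~{12 * 1024 / texture_mib():.0f} textures in 12 GiB")  # ~144
```

And a toy version of the CAS idea (my simplification, not AMD's actual FidelityFX code): sharpen strongly where local contrast is low and back off where it is already high, which is how CAS avoids the halos of plain unsharp masking:

```python
import numpy as np

def toy_cas(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Toy contrast-adaptive sharpen on a [0, 1] grayscale image."""
    p = np.pad(img, 1, mode="edge")
    # Stack of the 4 neighbours of each pixel: a cheap local-contrast probe.
    n = np.stack([p[:-2, 1:-1], p[2:, 1:-1], p[1:-1, :-2], p[1:-1, 2:]])
    contrast = np.clip(n.max(axis=0) - n.min(axis=0), 0.0, 1.0)
    w = strength * (1.0 - contrast)   # less sharpening where contrast is high
    return np.clip(img + w * (img - n.mean(axis=0)), 0.0, 1.0)
```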
 
That's curious, although exactly the same FPS looks fishy.

I don't think brute-force hardware RT as NV is pushing it is going anywhere, but it would be hilarious if AMD beats greens even at that.

That Zen3 cache inside might play a major role in it.

Turned out that RT shadows were only set to High in the AMD bench, not Ultra.
 

Elias

Member
This really sells the RX 6000 graphics cards. What they demonstrated:

- DXR 1.1 ray tracing, which the 6800XT seems to handle with ease
- Infinity Cache allowing the ultra texture quality setting (4K x 4K texture sizes), which eats 12GB of VRAM (this simply wouldn't be possible on a 3070 with the same performance)
- FidelityFX Contrast Adaptive Sharpening (CAS) looking very impressive, the image going from washed-out and blurry to sharp and contrasty
Also, this is without Rage Mode or SAM.
 
Also, this is without Rage Mode or SAM.

Well, actually, we don't know that for sure; it wasn't mentioned anywhere in the video. If this is a promotional partner video from AMD to hype their GPUs/this game, then I would expect them to want every advantage they can get to show the best performance. In that sense I expect that this was using SAM+RM; nothing really wrong with that, as they are real things you can turn on.
 

Elias

Member
Well, actually, we don't know that for sure; it wasn't mentioned anywhere in the video. If this is a promotional partner video from AMD to hype their GPUs/this game, then I would expect them to want every advantage they can get to show the best performance. In that sense I expect that this was using SAM+RM; nothing really wrong with that, as they are real things you can turn on.
I may have misspoken on the use of Rage Mode, but this definitely isn't using SAM. See the timestamp.
 

FireFly

Member
Microsoft's DirectML upscaling vs bilinear:
From the article:

"We couldn’t write a graphics blog without calling out how DNNs can help improve the visual quality and performance of games. Take a close look at what happens when NVIDIA uses ML to up-sample this photo of a car by 4x."

Based on their DirectML talk, Microsoft have so far only been showing results with Nvidia's DLSS model, applied through DirectML.
 

llien

Member
Based on their DirectML talk, Microsoft have so far only been showing results with Nvidia's DLSS model, applied through DirectML.
That is your conjecture, not something from the article.
There is no TAA/blurring in DirectML upscaling, which DLSS 2.0 is using so heavily.
 

FireFly

Member
That's (at best) DLSS 1.0, and you know what happened to that, right?
Ok, so it's conceivable that the Nvidia model used to produce the car screenshot in 2018 is somehow more accurate than the one Microsoft showed off in 2019. But:

1.) If it was so much better why aren't Nvidia using it themselves, since it was their model?
2.) Who cares anyway, since Nvidia will never give away their technology, and especially not to a competitor? The Super Resolution feature that AMD releases will depend on what AMD and Microsoft can create together, independently of Nvidia.
 

llien

Member
FireFly
In the video they mention that even a 1080p upscale takes 15ms on a 2080Ti.
That would be 60ms for 4K upscaling.

Not usable in games at these perf levels.
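Spelling out the scaling assumption in that estimate (mine; real inference cost is not always perfectly linear in pixel count):

```python
# 4K has exactly 4x the pixels of 1080p, so a linear-in-pixels
# extrapolation of the quoted 15 ms figure gives:
ms_1080p = 15.0
scale = (3840 * 2160) / (1920 * 1080)   # 4.0
ms_4k = ms_1080p * scale                # 60.0 ms per 4K frame
print(ms_4k, 1000 / ms_4k)              # 60.0 ms -> ~16.7 fps on upscaling alone
```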

Code is here:

Contrary to your assumptions, it includes the weights and topology, so the "model".
 

The Skull

Member
If Pascal users didn't upgrade to Turing GPUs due to their poor RT performance, what makes you think they'll want to upgrade to a GPU where it's not even clear whether it'll have RT at all, due to the APIs used in games?

What??? The 6000 series will have native ray tracing support for all DXR games. It's only Nvidia's proprietary crap it won't support. Saying it won't have it at all is just wrong.
 

ZywyPL

Banned
What??? The 6000 series will have native ray tracing support for all DXR games. It's only Nvidia's proprietary crap it won't support. Saying it won't have it at all is just wrong.

That's exactly my point: some games will support it, the performance is still yet to be revealed, but there's a big chance it won't be available in others at all, just like, let's say, HairWorks in TW3. Then what? Because we're talking about $500-1000 at stake, so sorry, but AMD has a lot to prove if they want to win gamers' wallets.
 

Elias

Member
That's exactly my point: some games will support it, the performance is still yet to be revealed, but there's a big chance it won't be available in others at all, just like, let's say, HairWorks in TW3. Then what? Because we're talking about $500-1000 at stake, so sorry, but AMD has a lot to prove if they want to win gamers' wallets.
All RTX games use DXR.
 

Ascend

Member
If Pascal users didn't upgrade to Turing GPUs due to their poor RT performance, what makes you think they'll want to upgrade to a GPU where it's not even clear whether it'll have RT at all, due to the APIs used in games?
Pascal users didn't refrain from upgrading to Turing because of the poor RT performance. They refrained from upgrading to Turing due to the poor rasterization performance for the price.
 
That's exactly my point: some games will support it, the performance is still yet to be revealed, but there's a big chance it won't be available in others at all, just like, let's say, HairWorks in TW3. Then what? Because we're talking about $500-1000 at stake, so sorry, but AMD has a lot to prove if they want to win gamers' wallets.
They would have to open up a huge performance gap for me. Especially after the 5700XT fiascos. I can't imagine spending that much on a GPU just to deal with driver issues, black screens, crashes, etc., and no raytracing or AI upscaling. Upgrading from Pascal only led to Turing, as you couldn't get better GPUs from anyone else.
 

spyshagg

Should not be allowed to breed
Pascal users didn't refrain from upgrading to Turing because of the poor RT performance. They refrained from upgrading to Turing due to the poor rasterization performance for the price.

Correct.

The raytracing "weight" only changed when AMD revealed GPUs with the same rasterisation as Ampere for less money, last week.
 

llien

Member
If Pascal users didn't upgrade to Turing GPUs due to their poor RT performance, what makes you think they'll want to upgrade to a GPU that's not even clear whether it'll have RT at all due to the APIs used in games?

That Pascal users didn't upgrade to Turing (which opened new frontiers in overpriced cards) "due to RT" is your, rather naive, assumption.
 

ZywyPL

Banned
They would have to open up a huge performance gap for me. Especially after the 5700XT fiascos. I can't imagine spending that much on a GPU just to deal with driver issues, black screens, crashes, etc., and no raytracing or AI upscaling. Upgrading from Pascal only led to Turing, as you couldn't get better GPUs from anyone else.

That Pascal users didn't upgrade to Turing (which opened new frontiers in overpriced cards) "due to RT" is your, rather naive, assumption.

What if price isn't an issue? Because it isn't for many PC gamers with top-tier $3-5k rigs that get replaced in 2-3 years with new top-tier hardware, and again, and again. I'm fully aware that Turing cards broke the limit for many people of how much they are willing to spend on a PC component, but many wouldn't bat an eye for more or less the same performance but with fancy RT effects instead. Bear in mind a ton of initial 2080Ti benchmarks showed 100-150FPS at 4K, so it was absolutely brutal, but then the first wave of RT benchmarks with a mere 35-40FPS completely killed the hype; barely anyone wanted to go from a high-refresh gaming experience to a console-like level of performance, especially at that price. And I feel the situation might repeat itself with the upcoming Big Navi cards, where the initial benchmarks show remarkable performance, but then when RT benchmarks show up, they could single-handedly kill all the hype for those cards as well. If AMD was confident they would've shown some potential benchmarks, but instead they are giving us some strange hints that RT might not even be toggleable in some titles. That's not how you sell your product to people...
 

llien

Member
What if price isn't an issue?
Because it isn't for many PC gamers with top-tier $3-5k rigs
It is somewhat of a "what if the sun were cold" question.

I paid $4k for a TV; then my son broke it with, cough, a broken sword (one part flew out of it), and I immediately bought a replacement (same model, though it was $1k cheaper).
That doesn't mean, though, that I simply go and grab the highest-priced crap available.

Had price not mattered to people, only the 1080Ti and later the (much more expensive) 2080Ti (which is quite a bit faster than the former) would be sold.
Mid and low end simply wouldn't exist, since "price isn't an issue".

The 2080Ti is 30-50% faster than the 1080Ti, which by itself is a great reason to upgrade.
But it costs twice as much.

And if you think "not so cool RT performance" (what RT performance does the 1080Ti have, pretty please?) and not the doubled price is what affected sales, that's a naive take in my books.
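The perf-per-dollar arithmetic behind that argument, sketched out (the launch prices and the ~40% uplift figure are my assumptions for illustration, not benchmarks):

```python
# Relative performance per dollar, 1080 Ti as the 100 baseline and an
# assumed ~40% uplift for the 2080 Ti at roughly double the price.
cards = {"1080 Ti": (100, 699), "2080 Ti": (140, 1199)}  # (rel. perf, USD)
for name, (perf, usd) in cards.items():
    print(f"{name}: {perf / usd:.3f} perf/$")
# 1080 Ti: 0.143 perf/$ vs 2080 Ti: 0.117 perf/$ -- faster card, worse value.
```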
 
It is somewhat of a "what if the sun were cold" question.

I paid $4k for a TV; then my son broke it with, cough, a broken sword (one part flew out of it), and I immediately bought a replacement (same model, though it was $1k cheaper).
That doesn't mean, though, that I simply go and grab the highest-priced crap available.

Had price not mattered to people, only the 1080Ti and later the (much more expensive) 2080Ti (which is quite a bit faster than the former) would be sold.
Mid and low end simply wouldn't exist, since "price isn't an issue".

The 2080Ti is 30-50% faster than the 1080Ti, which by itself is a great reason to upgrade.
But it costs twice as much.

And if you think "not so cool RT performance" (what RT performance does the 1080Ti have, pretty please?) and not the doubled price is what affected sales, that's a naive take in my books.
Pays $7k for TVs, but doesn't understand spending more than $250 on a GPU, and tries to call out people who love enthusiast PC gaming. Color me fucking blind. On top of that, shilling for AMD even long before the RX 6000 was announced... You already lost all credibility after the whole Death Stranding debacle and not being able to tell the most obvious differences apart on that $4k TV... I'm lost for words.
 
Not sure which part of "if price didn't matter, only high end parts would exist" was hard to comprehend.

Strong butthurt is scary to watch.
Butthurt definitely is scary to watch when called out on BS, definitely agree there. We all pick and choose what our priorities are and what we spend money on. It's petty as hell to hate on enthusiast gamers' decisions while criticizing them, while also having a huge bias to back it up. You can spend $7k on TVs, yet hate on people who want the best raytracing for a mere fraction of the price you paid. That's the definition of butthurt in a nutshell.
 

llien

Member
It's petty as hell to hate on enthusiast gamers' decisions while criticizing them
All of this is your imagination.
Nobody singled out "enthusiast gamers" in this thread, and you were the only user in the thread to criticize somebody's buying habits.
Calm down.

1080Ti didn't have any RT.
2080Ti was nearly 50% faster at 4k.

I find it obvious it sold less because the 2080Ti costs twice as much as the 1080Ti.
If somebody thinks "it was because RT performance wasn't there" coupled with "price doesn't matter" coupled with "50% faster performance is not a good reason to upgrade", I can shrug it off as naive, but I won't lose sleep over someone thinking that.
 
All of this is your imagination.
Nobody singled out "enthusiast gamers" in this thread, and you were the only user in the thread to criticize somebody's buying habits.
Calm down.

1080Ti didn't have any RT.
2080Ti was nearly 50% faster at 4k.

I find it obvious it sold less because the 2080Ti costs twice as much as the 1080Ti.
If somebody thinks "it was because RT performance wasn't there" coupled with "price doesn't matter" coupled with "50% faster performance is not a good reason to upgrade", I can shrug it off as naive, but I won't lose sleep over someone thinking that.
You've definitely done it in many threads now. Let's not pretend here. I'm not criticizing you for spending money on your TV dude. I'm only pointing out your blatant contradictions is all.

We all know enthusiast cards never sell as much as other models, no surprise there... Unless you think the 6900XT will sell much more than the 6800...??

I upgraded from the 1080Ti to the 2080Ti because it has better rasterization and supports raytracing and DLSS. It also gave me better frames than anything the 5700XT could, by a long shot, and that was AMD's best card at the time. Now it's a much closer lineup between the two companies, and it all comes down to whether you want better raytracing and AI upscaling, or slightly faster rasterization.
 

mitchman

Gold Member
That Pascal users didn't upgrade to Turing (which opened new frontiers in overpriced cards) "due to RT" is your, rather naive, assumption.
Having a 1080, I didn't upgrade because the performance increase didn't warrant the price increase. I will upgrade now to either 3080 or 6800XT, depending on which pre-order arrives first.
 

GymWolf

Gold Member
So uh, is it true that AMD is not gonna have scarcity problems??
I'm seeing this narrative in memes and gifs about Big Navi but I don't know how true it is...
 
So uh, is it true that AMD is not gonna have scarcity problems??
I'm seeing this narrative in memes and gifs about Big Navi but I don't know how true it is...

I think they will have a hell of a lot more stock than Nvidia, but given the pent-up demand for high-end GPUs, especially with nobody able to buy 3080s etc., I think they will sell out whatever they make regardless of the number.

There will be shortages, I think; plus the same 7nm capacity also has to supply Zen 3 and the consoles.
 

GymWolf

Gold Member
I think they will have a hell of a lot more stock than Nvidia, but given the pent-up demand for high-end GPUs, especially with nobody able to buy 3080s etc., I think they will sell out whatever they make regardless of the number.

There will be shortages, I think; plus the same 7nm capacity also has to supply Zen 3 and the consoles.
Fuck, so even if I go with AMD I still need to wait 2-3 months for a restock at decent, non-inflated prices...
 

Ascend

Member
I think they will have a hell of a lot more stock than Nvidia, but given the pent-up demand for high-end GPUs, especially with nobody able to buy 3080s etc., I think they will sell out whatever they make regardless of the number.

There will be shortages, I think; plus the same 7nm capacity also has to supply Zen 3 and the consoles.
I agree... But I think restocking should go better than for nVidia, because the TSMC 7nm node has been used by AMD for quite a while and they have likely optimized production and shipment. Yields at Samsung must be terrible if nVidia has so much trouble with restocking.

I am waiting for a Sapphire Nitro anyway, if I decide to get the 6800XT.
 