
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

BluRayHiDef

Banned
Does anyone think that the benchmark results will hold up when independent reviewers get a hold of the cards? Also, how bad do you think the ray tracing benchmarks will be, particularly in Control?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Clock speeds are key to RDNA 2

Wonder which APU runs at RDNA 1 speeds



Yo!
 
Does anyone think that the benchmark results will hold up when independent reviewers get a hold of the cards? Also, how bad do you think the ray tracing benchmarks will be, particularly in Control?
Vendor benchmarks will always look better than the actual performance reviewers see, whether it's AMD, Nvidia, Intel, etc. I honestly don't even want to think about how Control will play with RT, as it's pointless without DLSS or an alternative. With AMD's GPUs, I'd imagine you can only play at a lower resolution with RT enabled, or at a high resolution with no RT, but not both.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Does anyone think that the benchmark results will hold up when independent reviewers get a hold of the cards? Also, how bad do you think the ray tracing benchmarks will be, particularly in Control?
I'll tell you this: few people give a shit about ray tracing. Most will gladly turn it off to get the performance.
 

notseqi

Member
Cerny worked on PS5 and RDNA2 at the same time licensing AMazeDtek back to AMD.

Fuck RT until the FPS pick up significantly. Shitty DLSS as a dumb holdover because they were scared to get pipped in high res.
 

Malakhov

Banned
But the performance is better. Look at Watch Dogs: Legion, which just came out. It's extremely heavy on hardware at max settings and can't be played at native 4K/60 on anything. But Nvidia has DLSS and ray tracing, which lets the game run at 4K/60. AMD will launch its $1,000 card and it will have to run this game at 1440p to be playable, and without ray tracing.
 
In this topic I learned that the red team vs. green team war is as hilarious as the console war.

please continue 🕺
It really is. I let it be known a while back that I would switch back to AMD if they nail performance of RT and have an answer to DLSS.


Upon further looking though, I think a bunch of people in here are rooting for AMD because of the console wars, as many of these guys don't even game on PC 🤡🤡🤡
 

Invalid GR

Member
Can someone clarify how 8 additional CUs from the 6800XT to the 6900XT can make up for the difference in performance and pricing while staying at the same wattage?
 

llien

Member
These things are damn crazy good on the AMD front:

1) RDNA2 at 80 CUs is more than 2 times faster than RDNA1 at 40 CUs
2) Power consumption is 300W vs 220W, so +36% power consumed for +100% more performance (quick check below)
3) Shockingly cool on the memory front, merely a 256-bit bus (that cache thing, which they literally "stole" from Zen3, must be something)
4) Amazing progress on the power consumption front; note that RDNA1 was a bit behind, but still 7nm
5) Taking on the $1499 lolomonster of Jensen Huang
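
Here's that quick check on point 2, a back-of-envelope sketch using only the figures claimed above (AMD's slide numbers, not independent measurements):

# Perf-per-watt sanity check using the claimed figures (assumptions, not measurements)
old_perf, old_power = 1.0, 220.0   # RX 5700 XT baseline, ~220 W
new_perf, new_power = 2.0, 300.0   # RDNA2 80CU card, ~2x the performance at ~300 W
power_increase = new_power / old_power - 1                                # ~0.36 -> +36% power
perf_per_watt_gain = (new_perf / new_power) / (old_perf / old_power) - 1  # ~0.47 -> ~+47% perf/W
print(f"+{power_increase:.0%} power for +{new_perf / old_perf - 1:.0%} performance")
print(f"perf-per-watt improvement: +{perf_per_watt_gain:.0%}")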


I was not expecting this, but the lowest RDNA 2 card is $579. People getting high-end cards would definitely want good RT though.

Even for those strange people who are into funny reflections in a handful of games: remind me, what is the fastest card, and at what price, that they can buy from NV at this point?

Can someone clarify how 8 additional CUs from the 6800XT to the 6900XT can make up for the difference in performance and pricing while staying at the same wattage?
Binned chips most likely.
AMD did this crazy thing with CPUs too.
Note that 8 CUs is more than 10% more computing power.
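
A rough back-of-envelope on that CU gap (a sketch; the 72 and 80 CU counts are from AMD's announced specs):

# 6800 XT = 72 CUs, 6900 XT = 80 CUs (announced specs)
cu_6800xt, cu_6900xt = 72, 80
extra_cus = cu_6900xt - cu_6800xt            # 8 extra CUs
scaling = cu_6900xt / cu_6800xt - 1          # ~0.111 -> ~11% more raw compute at equal clocks
print(f"{extra_cus} extra CUs = +{scaling:.1%} compute, before binning/clock differences")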
 

rodrigolfp

Haptic Gamepads 4 Life
Where is the benchmark for the AMD cards running this on par or better?
 

00_Zer0

Member
I missed the last part of the presentation; work pulled me away after the lower-tier 6800 reveal. My question is: did AMD give any indication of when third parties can start to debut their offerings, including prices and specs?
 

Xyphie

Member
Can someone clarify how 8 additional CUs from the 6800XT to the 6900XT can make up for the difference in performance and pricing while staying at the same wattage?

It will only be ~10% or so faster at equal clocks. If they end up using similar power, it's down to the 6900XT having higher-quality silicon.
 

llien

Member
AMD IS BACK BOYS!!!!! HELL YEAH!!!!
ZEN 3 5900X
BIG NAVI 6900 XT


Pure ownage, on two fronts.
Fuck, I normally settle for $250-ish GPUs, but dayum, that 6800 is so exciting.

Where is the benchmark for the AMD cards running this on par or better?
There was a rumor (which this reveal aligns well with) mentioning game benches shown to the source (6800 or 6800XT, I don't remember). He said that, RT-wise, the card was on par with a 2080 Ti.
 

BluRayHiDef

Banned
Cerny worked on PS5 and RDNA2 at the same time licensing AMazeDtek back to AMD.

Fuck RT until the FPS pick up significantly. Shitty DLSS as a dumb holdover because they were scared to get pipped in high res.

Ridiculous post. DLSS provides image quality that matches a target resolution while actually rendering at a lower resolution, and the resulting boost in frame rate is immense. Control at native 4K with ray tracing fully enabled runs at about 25 frames per second on an RTX 3080; however, via DLSS Quality Mode, the frame rate jumps to the mid fifties and low sixties.
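
For what it's worth, those numbers line up roughly with the pixel counts involved. A sketch, assuming DLSS Quality mode renders internally at 1440p for a 4K output (its usual ~0.67x-per-axis scale factor); the upscaling pass itself has a cost, so this is an upper bound:

# Pixels shaded per frame: native 4K vs DLSS Quality's assumed 1440p internal render
native = 3840 * 2160          # 8,294,400 pixels
dlss_internal = 2560 * 1440   # 3,686,400 pixels, then reconstructed to 4K by the network
ratio = native / dlss_internal                 # 2.25x fewer pixels to shade
print(f"{ratio:.2f}x fewer pixels -> ~25 fps * {ratio:.2f} ~= {25 * ratio:.0f} fps ceiling")

25 fps times 2.25 lands in the mid fifties, which is roughly where the Control numbers above end up.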
 
AMD has done incredible things with this. I’m even more thrilled to see such beastly ‘top shelf’ gear in the next gen consoles this time around. This is going to be a really awesome gen I can feel it already.

Compared to how bottlenecked and weak the consoles from the beginning of last gen were, this really is incredible for us gamers.
 

OverHeat

« generous god »
I have both an RTX 3080 and an RTX 3090 and I regret NOTHING. DLSS makes playing games at 4K with ray tracing enabled a viable option, and AMD's RX 6000 Series doesn't have an equivalent to it yet.

Furthermore, there are a dozen upcoming games that will feature ray tracing and DLSS.

How many times are you going to tell GAF that you have both cards...
 

GymWolf

Gold Member
It really is. I let it be known a while back that I would switch back to AMD if they nail performance of RT and have an answer to DLSS.


Upon further looking though, I think a bunch of people in here are rooting for AMD because of the console wars, as many of these guys don't even game on PC 🤡🤡🤡
I'm very interested in that 6800XT, and AMD has 2-3 months to show me their version of DLSS and convince me to buy their GPU. I don't care about RTX, but I do care about getting 4K with 30-40% fewer resources, which I can then use for other things, you know, unimportant stuff like framerate and details.

Yes, DLSS is not widely used for now, but it's used in enough triple-A games and I buy a lot of them, Cyberpunk being a very important one.

AMD has to show me why their 6800XT is better than a 3080 beyond the slightly better price; when you are already spending 800 euros on a GPU, 50 euros less is really nothing. I'll take DLSS in 10 games every year (the most pessimistic estimate, because there will be more than that) over 50 euros, every day of the week.

edit: I forgot about the 6800XT having 6GB more than a 3080; that's already something to think about.
 

justiceiro

Marlboro: Other M
Huh, so I guess there is no replacement for the 5600XT yet...
I think I'll pick that one up on Black Friday then... unless Nvidia announces a 3060 by then.
 

jonnyXx

Gold Member
No RT-specific hardware, no Tensor-core equivalent either. Without those, AMD needed to blow Nvidia out of the water for it to really be close.
Taking that into consideration, I think the 6800 and 6800XT should be priced lower. The 6900XT, however, is $500 cheaper than the 3090, so I can't complain.
 

notseqi

Member
Ridiculous post. DLSS provides image quality that matches a target resolution while actually rendering at a lower resolution, and the resulting boost in frame rate is immense. Control at native 4K with ray tracing fully enabled runs at about 25 frames per second on an RTX 3080; however, via DLSS Quality Mode, the frame rate jumps to the mid fifties and low sixties.
Nice. Too expensive and a shame on PC.
 

llien

Member
So, the AMD lineup at the moment looks like:

6900XT - $999 => 2-2.1x faster than 5700XT
6800XT - $650 => 6900XT minus ~10%
6800 - $579 => about 2080Ti + 18%
NOTHING
5700XT - $350 (street price)
5700 - $300


I'm very interested in that 6800XT, and AMD has 2-3 months to show me their version of DLSS and convince me to buy their GPU
I doubt AMD would be able to produce enough cards to satisfy demand, so nobody needs to coax you out of the green reality distortion field...

The real 6900XT competitor will be a 3080 Ti 12GB at ~$999.
Even without Zen3 synergy and OC, 6900XT beats 3090, why would anyone pay $999 for a more power hungry card with 4GB less RAM?
I mean, green fanboys would, but they would no matter what anyhow, so who cares.
 

llien

Member
I think the 6800 and 6800XT should be priced lower.
The 6800 is 18% faster than the 3070 and has twice the RAM; why can't it cost $79 more?
The 6800XT is faster than the 3080, has 6GB more RAM and... is even cheaper than it, despite the 3080's availability issues.

Why would AMD price stuff even lower?

On top of that, AMD seems to still have OC headroom, at least on the 6800 series, unlike Ampere.
Even without OC, the Fury X was matching the 980Ti at resolutions above 1080p.
 

notseqi

Member
The 6800 is 18% faster than the 3070 and has twice the RAM; why can't it cost $79 more?
The 6800XT is faster than the 3080, has 6GB more RAM and... is even cheaper than it, despite the 3080's availability issues.

Why would AMD price stuff even lower?

On top of that, AMD seems to still have OC headroom, at least on the 6800 series, unlike Ampere.
Even without OC, the Fury X was matching the 980Ti at resolutions above 1080p.
Ti's incoming.
 

Kenpachii

Member
It's $500 less than the RTX 3090? It uses less energy, hits higher clocks, and will go beyond the game boost clock on top of the optimizations they showed if you're using a Ryzen 5000 CPU. Add all of that extra performance up.

But you won't.

Once you realize 1500 bucks for a 3090 is an utter joke, you will understand why I draw my conclusion the way I do.
 

rofif

Banned
Can someone TL;DR this for me?
Was my 3080 FE a good purchase for 700 USD? I even sold Watch Dogs 2 for 30, so the 3080 FE cost me 670 :p
BTW I still have a year's sub of GeForce Now if someone wants to buy it :D
 

Chiggs

Gold Member
Ridiculous post. DLSS provides image quality that matches a target resolution while actually rendering at a lower resolution, and the resulting boost in frame rate is immense. Control at native 4K with ray tracing fully enabled runs at about 25 frames per second on an RTX 3080; however, via DLSS Quality Mode, the frame rate jumps to the mid fifties and low sixties.

I think DLSS is nice in theory, but when I use it, the image just looks...weird. I am basing this on my experience with Wolfenstein Youngblood and Control...but mostly Youngblood. Maybe it's a title by title sort of thing.
 
Great showing from AMD. Hopefully Nvidia will respond with something, since I'm still buying an Nvidia GPU (G-Sync, DLSS, DLSS in VR, etc.).

But please stop posting that nonsense about DLSS.
 

Pagusas

Elden Member
Can someone TL;DR this for me?
Was my 3080 FE a good purchase for 700 USD? I even sold Watch Dogs 2 for 30, so the 3080 FE cost me 670 :p

The 3080 is fine at that price. Basically, AMD showed it will likely have the best brute-force card this gen, but we haven't seen RT or DLSS-alternative demos or numbers yet. Wait for reviews.
 

rofif

Banned
I think DLSS is nice in theory, but when I use it, the image just looks...weird. I am basing this on my experience with Wolfenstein Youngblood and Control...but mostly Youngblood. Maybe it's a title by title sort of thing.
Depends on the game. When the implementation is good, it stabilizes aliasing incredibly well.
 

Irobot82

Member
Can someone TL;DR this for me?
Was my 3080 FE a good purchase for 700 USD? I even sold Watch Dogs 2 for 30, so the 3080 FE cost me 670 :p
BTW I still have a year's sub of GeForce Now if someone wants to buy it :D

Yeah man, you're fine.

This is more for those of us who are still looking at what to upgrade to.
 

Xyphie

Member
Even without Zen3 synergy and OC, 6900XT beats 3090, why would anyone pay $999 for a more power hungry card with 4GB less RAM?
I mean, green fanboys would, but they would no matter what anyhow, so who cares.

"+Rage mode/+SAM" is overclock/Zen 3 synergy, so this is the best case (+AMD picked games). Expect that claimed 300W will be more like ~330-350W for these results, so that's the same as a RTX 3090. AMD giving no figures at all for RT performance (this means it's behind nVidia) is a good reason to buy a Ampere card over the similarly priced RDNA2 card, not that I'd expect you to recognize that.

 

smbu2000

Member
In this topic I learned that the red team vs. green team war is as hilarious as the console war.

please continue 🕺
Yes, it definitely happens. Nvidia has been dominant for a long time so many people get caught up in only supporting them even when the competition helps to bring about better prices and better performance.
Same with Intel (blue) vs AMD (red).

I prefer to go with what’s best or gives the best price/performance depending on what’s available.
I previously had an Intel CPU/AMD GPU combo and I’m currently using an AMD CPU/Nvidia GPU combo.

I’d have no qualms about switching back to an AMD GPU though. 6800XT is looking pretty tempting.
 

BluRayHiDef

Banned
The simple fact that you need DLSS to play with ray tracing decently should tell you that (at least right now) it's not a feature that is important to many people, because it's too expensive in terms of performance loss.

Ridiculous post. Why does the use of DLSS matter if the end result is indistinguishable from native resolution? This is a silly argument. Resolutions are becoming so massive (e.g. 4K and 8K) that using brute force to run games, especially with advanced and heavily taxing features such as ray tracing, makes no sense, whereas using intelligence (i.e. AI upscaling) to do so does.
 

Kenpachii

Member
Can someone TL;DR this for me?
Was my 3080 FE a good purchase for 700 USD? I even sold Watch Dogs 2 for 30, so the 3080 FE cost me 670 :p
BTW I still have a year's sub of GeForce Now if someone wants to buy it :D

Nvidia:
- DLSS: more performance in games that will use it
- Better ray tracing performance
- Driver stability and support in games
- More heat
- Probably bigger
- Bigger power supply needed (more energy consumption)
- Low VRAM, not next-gen proof: 8GB/10GB (the 3090 doesn't have this issue)
- Good cooler

AMD:
- Less heat
- A bit cheaper
- Next-gen future-proof VRAM: 16GB/16GB/16GB
- Driver stability unknown
- A bit faster
- Good cooler
- Lower ray tracing performance
- No DLSS alternative atm
- Probably a bit smaller

That's it so far.

The only next-gen GPU on the market that currently makes sense for the higher end is the 6800XT.
 

longdi

Banned
"+Rage mode/+SAM" is overclock/Zen 3 synergy, so this is the best case (+AMD picked games). Expect that claimed 300W will be more like ~330-350W for these results, so that's the same as a RTX 3090. AMD giving no figures at all for RT performance (this means it's behind nVidia) is a good reason to buy a Ampere card over the similarly priced RDNA2 card, not that I'd expect you to recognize that.


Yep, bloody AMD doesn't support their 3000-series Ryzen with Smart Access Memory.
Not buying their 6000 series.

I expect Nvidia to launch the rumored 3080 Super/Ti with 12GB of RAM now. Probably around $900; get that instead!
Hopefully it's built at TSMC too.
 

00_Zer0

Member
Until I see RT performance I'm holding my breath. Not buying a $1k card if it can't play RT games at 90% of Nvidia's offerings.
Not an AMD fanboy, but I will go out on a limb and say that their future super-sampling offering will allow the 6800 XT to grow into a greater card than it already is. I am not too concerned about RT performance right now, and by the time this new SS is offered, it will gain a much-needed performance boost in RT-intensive games.

For them to mention they are working on this and other future technology with MS (DX12 Ultimate) to make games load faster and run smoother gives me the confidence to go team red this time.
 

Pagusas

Elden Member
Not an AMD fanboy, but I will go out on a limb and say that their future super-sampling offering will allow the 6800 XT to grow into a greater card than it already is. I am not too concerned about RT performance right now, and by the time this new SS is offered, it will gain a much-needed performance boost in RT-intensive games.

For them to mention they are working on this and other future technology with MS (DX12 Ultimate) to make games load faster and run smoother gives me the confidence to go team red this time.

I care about Cyberpunk right now, so "future promises" don't do much for that game, when we already know AMD won't even be supported with RT at launch. I have a feeling a lot of people are basing things on Cyberpunk performance, so that's a big knock against them that'll hopefully be rectified quickly.
 

rofif

Banned
Nvidia:
- DLSS: more performance in games that will use it
- Better ray tracing performance
- Driver stability and support in games
- More heat
- Probably bigger
- Bigger power supply needed (more energy consumption)
- Low VRAM, not next-gen proof: 8GB/10GB (the 3090 doesn't have this issue)
- Good cooler

AMD:
- Less heat
- A bit cheaper
- Next-gen future-proof VRAM: 16GB/16GB/16GB
- Driver stability unknown
- A bit faster
- Good cooler
- Lower ray tracing performance
- No DLSS alternative atm
- Probably a bit smaller

That's it so far.

The only next-gen GPU on the market that currently makes sense for the higher end is the 6800XT.
It's 10GB of GDDR6X vs 16GB of GDDR6. I wonder which will matter more... probably the 3080 will be too slow by the time games require that much VRAM anyway.
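
For the bandwidth side of that comparison, a rough sketch using the public specs (3080: 320-bit bus at 19 Gbps GDDR6X; 6800 XT: 256-bit bus at 16 Gbps GDDR6; the 128 MB Infinity Cache's effective contribution isn't modeled):

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # GB/s = (bus width in bytes) * per-pin data rate in Gbps
    return bus_width_bits / 8 * data_rate_gbps

rtx_3080 = bandwidth_gb_s(320, 19.0)    # ~760 GB/s, 10 GB GDDR6X
rx_6800xt = bandwidth_gb_s(256, 16.0)   # ~512 GB/s, 16 GB GDDR6 + Infinity Cache on die
print(f"3080: {rtx_3080:.0f} GB/s vs 6800 XT: {rx_6800xt:.0f} GB/s")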
 