
AMD Publishes More Radeon RX 6900 XT, RX 6800 XT & RX 6800 RDNA 2 Graphics Card Benchmarks in 4K & WQHD

Smart Access Memory (SAM) does seem like the killer feature revealed during the show. A lot of people are sleeping on the potential of this.

Very smart move from AMD here to leverage the momentum and mindshare of the Ryzen CPUs to upsell Radeon GPUs.

The benchmarks shown on the AMD site all have SAM enabled, which gives a nice boost. In some games the boost is 1-2% or not noticeable at all; in one outlier, Forza, we see an 11% boost, and for most others a 5-6% boost.

At the actual reveal event we did see 6800XT numbers without SAM enabled, although in the website results they have it enabled for all of the cards.

As always, wait for 3rd party benchmarks; my predictions are as follows:

6800 - Demolishes 2080ti/3070

6800XT - On par with 3080 at 4K; they trade blows, with close wins and losses in various titles. When the total FPS average across all titles is shown at the end of a review, the cards should be within a few % of each other. Which card "wins" overall may actually vary depending on the titles benchmarked by the reviewers. With SAM + Rage Mode the 6800XT takes the lead. At 1440p the 6800XT will have a slight lead over the 3080 (without SAM or RM).
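To make that "total FPS average" point concrete, here is a minimal Python sketch of how a reviewer-style aggregate can flip between the two cards depending on the game list. Every per-title number below is made up purely for illustration, not a real benchmark:

```python
from math import prod

# Hypothetical per-title results: 6800XT fps divided by 3080 fps.
# Values above 1.0 mean the 6800XT "wins" that title. These are
# invented numbers, not measurements.
results = {
    "Title A": 1.04,
    "Title B": 0.97,
    "Title C": 1.02,
    "Title D": 0.95,
    "Title E": 1.01,
}

# Reviewers usually aggregate with a geometric mean so a single
# outlier title cannot dominate the overall figure.
geomean = prod(results.values()) ** (1 / len(results))
print(f"Overall: 6800XT at {geomean:.1%} of the 3080")

# Swap one or two titles in or out and the overall "winner" flips,
# which is exactly why different reviewers may crown different cards.
```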

6900XT - Will be a few % slower than the 3090 when all is said and done (maybe 3-5% slower?). Granted, it will still be quite close for $500 less than the 3090, and while using 50 watts less power. If you OC the reference model to match the 3090's power draw then they will likely be on par. If the OC headroom rumours are true, expect AIB models of this card to exceed the 3090, which doesn't have much OC headroom left in the tank. If you enable SAM + RM on the reference it should roughly match the 3090.
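As a quick value check on those numbers, here is a back-of-the-envelope sketch. The prices and the 50 W gap come from the paragraph above; the ~4% performance deficit is my own guess, so treat every number as an assumption:

```python
# Assumed figures: 3090 as the 100% baseline at $1499 / 350 W,
# 6900XT ~4% slower at $999 and drawing ~50 W less.
cards = {
    "RTX 3090":   {"perf": 1.00, "price": 1499, "watts": 350},
    "RX 6900 XT": {"perf": 0.96, "price": 999,  "watts": 300},
}

for name, c in cards.items():
    per_dollar = c["perf"] / c["price"] * 1000  # relative perf per $1000
    per_watt = c["perf"] / c["watts"] * 100     # relative perf per 100 W
    print(f"{name}: {per_dollar:.2f} perf/$1000, {per_watt:.2f} perf/100W")
```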

Regarding Super Resolution, not a lot is clear right now other than that AMD are working on this and we should hopefully hear more soon. It will likely release as a driver update; current rumours suggest Dec/Jan for release, but who knows how accurate that is.

It is not clear if the Super Resolution feature simply utilizes MS's DirectML Super Resolution technology or if AMD is building a solution on top of it that adds more. It could also be the case that, just as Nvidia worked closely with MS on DXR 1.0 and AMD worked closely with them on DXR 1.1, AMD is working in tandem with MS to develop this Super Resolution feature as part of DirectX 12 Ultimate. Hard to say exactly.

The rumour suggests that this technique works in a different way to DLSS but will achieve similar results. Supposedly this technology will be slightly behind DLSS in image quality but much faster in terms of performance. We will have to wait and see what happens.

Regarding ray tracing, it seems clear that AMD is a little behind Nvidia this generation. According to the Port Royal RT benchmark, a 6800XT (an AIB model, I assume) was a bit faster than the 2080ti and around 20% slower than the 3080.

Granted, this is a synthetic benchmark, so it is likely a theoretical max for all cards listed. Outside of Minecraft and Quake, most games use hybrid rendering, where the gap between Turing and Ampere shrinks massively. For a worst-case scenario, look at 2080ti performance with RT enabled, add a few extra frames for the additional RT performance of RDNA2, then add more frames for the additional raster performance of RDNA2 vs Turing, and you should get a rough idea of performance.
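That back-of-the-envelope method is easy to write down. Both uplift factors in this sketch are placeholders I picked to show the shape of the estimate, not real RDNA2 data:

```python
def estimate_rdna2_rt_fps(fps_2080ti_rt: float,
                          rt_uplift: float = 0.05,
                          raster_uplift: float = 0.25) -> float:
    """Worst-case estimate per the heuristic above: start from the
    2080ti with RT enabled, add a small bump for RDNA2's extra RT
    throughput, then scale by its raster advantage over Turing.
    The default uplifts are guesses, not measurements."""
    return fps_2080ti_rt * (1 + rt_uplift) * (1 + raster_uplift)

# Example: a title that runs at 60 fps on a 2080ti with RT on.
print(f"~{estimate_rdna2_rt_fps(60):.0f} fps")  # ~79 fps with these guesses
```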

Interestingly, if Super Resolution is in fact faster than DLSS in any meaningful way, then RDNA2 cards may see an additional uplift in performance when RT is enabled. But we will have to wait and see if this is a real thing or not; so far AMD/MS are pretty quiet on this front, so hopefully we will hear more soon.

But overall, even taking these things into account, I still think Ampere is going to maintain the RT performance crown, and there will still be a somewhat noticeable but not extreme gap in RT games, with Ampere coming out on top. If you need the absolute best RT performance this generation then you are best going for a 3000 series card, as RDNA2 will be a bit behind.

Speaking of RT with DXR, something interesting that I've seen mentioned is that DXR 1.0 was developed as a collaboration between Nvidia and MS; it was designed to work with Nvidia's hardware RT solution and optimized for it. AMD then worked with MS on DXR 1.1, which seems to work differently than 1.0 and should presumably work better with AMD's hardware RT solution than DXR 1.0 does.

So far, to the best of my knowledge, all of the RT benchmarks and RTX titles released use DXR 1.0 as the basis of their RT. It will be interesting to see what kind of performance uplift (if any) we see when games supporting DXR 1.1 start to appear. Interestingly, pretty much all RTX titles right now are sponsored by Nvidia and as such optimized for their hardware/RT solution. This definitely puts AMD in a rough position for performance comparisons, as all the games are optimized for their competitor's solution, which could make their RT performance seem worse than it is (although Nvidia is still ahead one way or the other).

To add a final bit of potential consumer confusion/compatibility nightmares: right now it is not clear whether current "RTX"-branded titles are using a proprietary fork of DXR 1.0, or using it as a baseline but adding proprietary extensions. If so, the current crop of RTX-branded titles may not work on AMD hardware at all, which would be pretty shitty from a consumer point of view. We will have to wait and see how this pans out, as it is not super clear at the moment.

There is also the possibility of a contractual "ray tracing exclusivity" clause, which could mean RTX-sponsored titles may not be allowed to enable RT on AMD cards for some undetermined amount of time; I'm pulling numbers from my ass here, but for example 6 months, maybe a year. Granted, I have no confirmation of any of this, but seeing Cyberpunk state that RT will not work on AMD cards at launch does not bode well.
 

thelastword

Banned
Yup, that's why I don't really put much weight in those AMD numbers. I want to see the real numbers without the 5000-exclusive stuff. It's a shitty move by AMD for sure.
Let's be honest. RT is only in a few games at the moment. Paying all that Nvidia money for lower rasterization performance at a higher power draw, just because of a few ray-tracing games that it can't even run properly or performantly without lowering the rendering footprint through DLSS, is not exactly something to be praised.

I remember a time when matching an Intel CPU with an Nvidia GPU got you extra performance. Nvidia did better in DX11, having collaborated with Intel, and all NV cards with Intel CPUs had a lower CPU footprint than with an AMD card; couple that with GameWorks, PhysX and the tessellation artillery of war vs AMD, and it got even sillier as far as monopolistic and cruel business moves go. What AMD is doing here is nothing similar... Would Toyota parts not work better in a Toyota? Besides, AMD announced SmartShift over a year ago; they announced synergy between their GPUs and CPUs to leverage more performance. They are in a good place in that they are doing both CPUs and GPUs, and they should use that advantage to bolster their products even more... It's more performance for us as gamers, and they do specify that they are using Smart Access Memory in the benches. These cards are not even OC'd yet like the Nvidia FE cards, so I'm happy that they can use their advantage there to give us more performance. We should be praising the technology AMD is pushing here, having the CPU access the entire 16GB stack of VRAM and Infinity Cache, and as Herkelman said, it can go up to 13% of extra performance per title; but developers can program their games to draw even more performance from Smart Access Memory, so who knows, if programmed with SAM in mind we may see uplifts of 20-30% in the future...

You can't knock AMD for that; you have to thank them for these revolutionary technologies as a gamer. It may very well mean a boost to their RT performance as well. The truth is you can't knock a company for making their two products work well or better together; they are here to sell their products, not the competition's. Apple products work better together, and I'm sure Intel CPU + GPU will work better together. We can't be sorry for Nvidia because they don't have a CPU. We all know what Nvidia did when they had the monopoly, and what they are still doing with all their proprietary stuff. All AMD stuff is open, even their RT. Synergy between their products should be their advantage, and no one should knock them for this...

I think the biggest point is this. Come the 5th of November, the most performant gaming CPUs will be the Ryzen 5000 series. Hardware Unboxed have already been running a 3950X in their benchmarking rig for a while now, so the highest-end 5000 series Ryzen will be the go-to CPU in a few days. AMD is now in pole position, so it makes no sense to complain that it's using Smart Access, because who would test the 6900XT on a less performant CPU when matching it with a Ryzen 5000 gives you up to 13% of extra performance, without devs even programming for SAM yet and without the slightest OC? People pitched Nvidia as the undisputed king and said AMD would not even match the 3070 in rasterization, and now that it blows their 3090 away with less power draw, and there is a mountain of performance that is not even tapped yet with SAM and Rage OC, people shift the goalposts to DLSS and RT... for 6 or so games. AMD has RT too, and the way it's engineered means all these technologies will boost its RT performance just like rasterization. The work they did on the CUs, the work they did to achieve even lower latency, is something that should be commended and embraced, because gamers are winning here. People are so concerned about RT. Would you pay an extra $500 for better RT performance or DLSS in 6 games? Yet, just as they shoved AMD aside as never being able to compete with Nvidia in rasterization, they are doing the same with RT and DLSS... People forget that the RT in AMD cards will be leveraged much more than the RT in Nvidia cards going forward, and Super Resolution will be much more performant than DLSS. So imagine this: if AMD already has the perf crown on GPUs in rasterization, how much more performant will it be with Super Resolution, or SR with its RT tech...

In short, all of these AMD technologies are early days; there is lots of performance to be had in both AMD's RT technology and of course SAM, and I see lots of potential in SAM in particular. Devs can push performance much further when they program specifically for these CPU+GPU twins. Have we seen any ground-up game utilizing Radeon Rays and SAM? I'd say no, it's early days, but even with RT and Super Resolution it's going to be an interesting show in the coming months. AMD already has rasterization; that alone is reason to pick an AMD card and an AMD CPU and forget the competition. And to think AMD is giving you all that performance at $500 less than a 3090... So everybody expecting less performance when the reviewers do their breakdown is in for a rude awakening; it looks more like hope for AMD to fail than anything, even though they are already offering better performance with or without SAM, and at a much cheaper price too. When the reviewers do their analysis, they will use Rage Mode, they will OC, they will keep it at stock, they will use SAM. Are people going to say it's unfair, when every FE card is already OC'd from the factory? Imagine: Nvidia used DLSS in benchmarks to sell 8K and 4K performance in its initial showing. A 35TF card has to depend on DLSS... I think it will become even sillier when these reviewers OC these AMD cards or at least run them up to Nvidia power draw numbers... And lord forbid what I've been hearing about Super Resolution. People want to talk about DLSS, but the only reason that's there is because RT is still not performant on NV cards at normal resolutions. SR is going to make the AMD rasterization numbers look even more overwhelming and will boost their RT even more... It's what the consoles will use, so you can bank on it being more successful and more widespread among the devs, both SR and Radeon Rays...
 

thelastword

Banned
Super Resolution is the dark horse here; I wouldn't buy any new GPU until I know what that's all about.
The most important thing right now is higher frames in rasterization games; anything else is just a bonus. All that performance for much cheaper and at lower power draw, with even more power and potential through Smart Access and Rage Mode... This is what gamers have been asking from AMD, and they have delivered here in spades... I believe their Smart Access Memory is the real game changer here, their best feature so far; the way this will be leveraged almost guarantees AMD performance dominance in the future... Gamers win massively here. They asked for performance; not only do they have it, but the potential for devs to squeeze even higher performance out of these technologies, beyond the automatic 13% uptick, is what's interesting. Now put Super Resolution in the picture and I don't see how AMD will lose the performance crown again...
 
Cool but no RT performance and DLSS equivalent, no buy...

That's what I am wondering: is the performance based on

Ray tracing ON
DLSS equivalent ON

at the same time?


Also don't forget: AMD said an additional 5% to 10% performance boost when you pair it with a Ryzen 5000 series CPU.
 

thelastword

Banned
Leaked and preliminary RT performance in Tomb Raider on the RX 6800 vs the 2080ti and 3070...


4k
[Image: AMD-Radeon-RX-6800-Shadow-of-the-Tomb-Raider-4K.png]



1440p
[Image: AMD-Radeon-RX-6800-Shadow-of-the-Tomb-Raider-QHD.png]
 

WakeTheWolf

Member
My only worry for these new cards is having the same issues that the RX 5700 XT had. The black screen issues are still happening, and I've just had it happen about 5 times today with my Red Devil edition. The only resolution is an RMA; I hope we don't have to deal with this on the next cards.
 
My only worry for these new cards is having the same issues that the RX 5700 XT had. The black screen issues are still happening, and I've just had it happen about 5 times today with my Red Devil edition. The only resolution is an RMA; I hope we don't have to deal with this on the next cards.

As far as I know most of the show-stopper driver issues for the 5700 XT have been resolved by now; it sounds like a hardware fault might be causing it?
 

Ascend

Member
My only worry for these new cards is having the same issues that the RX 5700 XT had. The black screen issues are still happening, and I've just had it happen about 5 times today with my Red Devil edition. The only resolution is an RMA; I hope we don't have to deal with this on the next cards.
The Red Devil became very popular, but PowerColor is not exactly the most reliable hardware vendor. I would never buy that brand, and I wouldn't be surprised if yours is a hardware issue rather than a driver issue.
 

thelastword

Banned
The Red Devil became very popular, but PowerColor is not exactly the most reliable hardware vendor. I would never buy that brand, and I wouldn't be surprised if yours is a hardware issue rather than a driver issue.
As far as I'm aware the issues were fixed a long time ago with an update, and even then most people did not have any issues, like Hardware Unboxed... I think people must also realize that you can get a defective product in a batch full of perfectly fine ones. It happens with every product known to man, even those with the highest reputation for quality. Some people had driver issues, but some people had badly manufactured cards too... I still remember lots of people had space-invaders graphics at Turing's launch, cards that used to shut down and BSOD their machines, or even cards that never worked... From memory, NV has had more of these issues than AMD.

https://www.techpowerup.com/249077/...with-turing-based-cards-its-not-a-broad-issue
 

llien

Member
Since this is a Ryzen 5000 CPU based platform, AMD GPUs take full advantage of the Smart Access Memory technology, which allows the CPU to make full use of the graphics memory featured on the Radeon RX 6000 series graphics cards, allowing for up to an 11% increase in performance across select titles.
Isn't it a combo of "Rage" (auto-OC) and SAM?
I recall in the demo they only showed them combined.
And other sources state SAM gives around a 3% difference.
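If the demoed numbers really were a combo, the two effects compound multiplicatively, so you can back out the Rage share from the combined figure. Quick sketch with the numbers floating around (the "up to 11%" combined figure from AMD's page and the ~3% SAM-only estimate; both unverified):

```python
# Assumed inputs: combined Rage+SAM uplift and a SAM-only estimate.
combined = 0.11  # the "up to 11%" figure quoted above
sam = 0.03       # the ~3% SAM-only number other sources mention

# Uplifts stack multiplicatively: (1 + combined) = (1 + sam) * (1 + rage)
rage = (1 + combined) / (1 + sam) - 1
print(f"Implied Rage Mode share: ~{rage:.1%}")  # ~7.8% with these inputs
```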

Of course, they're only showing results that are favorable to them
They show a very wide range of games, including ones where AMD cards did not perform well.
People should stop treating AMD presentations the way they (rightfully) treat NV's.
At least with Lisa Su as presenter, AMD hasn't been caught in any lies so far.
 

JimboJones

Member
I wonder if Intel will try something similar with their GPUs (if they ever appear 🙄). Shame it's limited to the 5000 series, but it's still an interesting advantage.
 
Ready tracing
I wonder if Intel will try something similar with their GPUs (if they ever appear 🙄). Shame it's limited to the 5000 series, but it's still an interesting advantage.
AMD is now going to become an ecosystem if everything is going to be limited to a generation, although it does have some upsides.
 

Eliciel

Member
It should work with any 500 series Motherboard, so the X570 or the B550 should both work.

I'm not sure if this is to do with hardware specification on the new boards or something lacking in the old boards. Maybe AMD can backport SAM to X470/B450 boards too in the future?

Asus and MSI supporting X470/B450 with a BIOS/firmware update has already been confirmed, and I will update to a Ryzen 5000 series CPU on an Asus X470 in the future.
 

gspat

Member
Will Smart Access Memory only work on X570? So if you use a 5000 series processor with a B550 or a 400 series board, you can't use it?
Using Windows, yes. This has been a feature under Linux for a while now; I forget what they call it there. It works with any CPU/GPU that meets whatever the criteria are.

Edit:

From Phoronix.com:

Smart Access Technology works just fine on Linux. It is resizable BAR support, which Linux has supported for years (AMD actually added support for this) but which is relatively new on Windows. You just need a platform with enough MMIO space. On older systems this is enabled via SBIOS options with names like ">4GB MMIO".
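For anyone who wants to check this on a Linux box, here is a minimal Python sketch that reads a GPU's BAR sizes out of sysfs. The PCI address below is a made-up example; find your own with `lspci | grep VGA`:

```python
from pathlib import Path

dev = "0000:0b:00.0"  # hypothetical PCI address of the GPU
resource = Path(f"/sys/bus/pci/devices/{dev}/resource").read_text()

# The first six lines of the resource file are BAR0..BAR5, each as
# "start end flags" in hex; unused BARs are all zeros.
for i, line in enumerate(resource.splitlines()[:6]):
    start, end, _flags = (int(x, 16) for x in line.split())
    if end > start:
        print(f"BAR {i}: {(end - start + 1) / 2**20:.0f} MiB")

# With resizable BAR active, one BAR should cover roughly the whole
# VRAM (e.g. ~16384 MiB) instead of the usual 256 MiB window.
```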
 

MadYarpen

Member
Asus and MSi will be supporting X470/B450 with BIOS/Firmware Update, has been confirmed already, and I will Update to Ryzen 5000 Series on an Asus X470 in Future..

Could you please give us a link?

I'm using a B450 from MSI; this would be a massive factor in my GPU decision.
 

Eliciel

Member
Could you please give us a link?

I'm using a B450 from MSI; this would be a massive factor in my GPU decision.



 

Amiga

Member
What does 30% better ray tracing actually look like? How are we supposed to appreciate the difference?

For DLSS 2, Nvidia has it and AMD doesn't (for now), so it's easy to decide here.
 

Eliciel

Member
With no PCIe 4.0, there won't be SAM on those boards.


Allegedly even that will be supported. Surprisingly, the baseline boards will have the biggest "benefit".

Although I am not certain if there is more up-to-date information about that.

It wouldn't make sense to me if that didn't include full Zen 3 capability support.
 

Allegedly even that will be supported. Surprisingly, the baseline boards will have the biggest "benefit".

Although I am not certain if there is more up-to-date information about that.

It wouldn't make sense to me if that didn't include full Zen 3 capability support.

While we know that a BIOS update is coming for 400 series motherboards to support Zen 3, we don't know if this includes the SAM feature or not. It is still unclear, as all AMD have mentioned at their conference and stated on their website is that a 500 series motherboard is required. I would love for there to be support for the 400 series motherboards; I have an X470 mobo right now, so I would love to be able to just pop in a Zen 3 CPU and avail of the SAM functionality to get a little boost.
 

Eliciel

Member
While we know that a BIOS update is coming for 400 series motherboards to support Zen 3, we don't know if this includes the SAM feature or not. It is still unclear, as all AMD have mentioned at their conference and stated on their website is that a 500 series motherboard is required. I would love for there to be support for the 400 series motherboards; I have an X470 mobo right now, so I would love to be able to just pop in a Zen 3 CPU and avail of the SAM functionality to get a little boost.
Fully agreed, same situation, and yeah, we do not know for certain what is going to happen... fingers crossed we can make use of these boards with the SAM feature.
 

Kenpachii

Member
Let's be honest. RT is only in a few games at the moment [...] It's what the consoles will use, so you can bank on it being more successful and more widespread among the devs, both SR and Radeon Rays...

I don't care much myself for ray tracing at this point; the performance hit is still laughable, and it offers nothing but tech demos to this day. DLSS is interesting to me, as it gives a performance and quality uplift, something I always want. So yeah, I agree with you on that.

Then, about Nvidia GPUs and Intel CPUs and drivers:

Nvidia GPUs did better because Nvidia actually made half-decent drivers and went out to developers, something AMD used to do to make their products work well. Saying that Nvidia and Intel had a pact together is simply false, because the gains were also there on AMD CPUs; it was purely driver related, plus techniques like PhysX that didn't get offloaded to the CPU because AMD had no answer to it. It's AMD's fault, not Nvidia's, on this front. No excuses here for AMD; they failed hard.

And this is why I slammed AMD earlier in another thread: stop wasting people's time with side games like Dirt 5 and start focusing on juggernaut titles and how the card performs there, because people in the market for a high-end GPU probably already have a high-end CPU in their PC setups. AMD should be working with Cyberpunk like they did with GTA 5, the AC games and Battlefield in the past, to get support going and make the game work well on their products at day one. Yet we hear nothing from them on this matter. It's a bad sign, while Nvidia is screaming from the rooftops about how they support this and that. It feels like 2013 all over again, where The Witcher got HairWorks and TressFX from AMD was nowhere to be found. And no, this was not Nvidia's or Cyberpunk's fault; it was AMD refusing to communicate at any level or make any effort. It took them 4-5 months to fix CrossFire while SLI worked perfectly fine on day one. Hell, people even fixed the drivers for AMD themselves, and it took AMD months to implement that in their own drivers.

Look, I am not against progress, and I am not against better frame rates, improvement techniques, etc. What I am against is shady advertising where people think they are getting something but in reality they aren't. For example, if Nvidia simply showcases numbers with DLSS active, says that's how their card performs, and doesn't show it without DLSS, it's misleading, because a game without DLSS will not show those gains and can fall behind in the games you bought the card for without you knowing. People aren't that smart; they fall for these kinds of shit marketing tricks all day long and get burned by it.

Example:

[Image: 6900xtvs-768x768.jpg]


Where's the chart without it? Not found that day.

Another good example is where Jimmy buys a GPU from AMD because it's faster than the Nvidia card and 150 bucks cheaper. But it's tested by all the tech YouTubers on NASA CPUs that cost as much as Jimmy's car, and when he comes home he realizes he gets half the frame rates because his CPU isn't up to the task, and he would have gotten far better performance out of that Nvidia GPU as a result. Jimmy is mad now.

And that will repeat again for the same reasons you mention. A good 6000 series benchmark that shows you exactly how it performs would use a Ryzen 5000 series CPU, a Ryzen 3000 series CPU and an Intel CPU. Sadly tech YouTubers won't realize this and will go with whatever floats their boat, which will result in really good numbers because they pushed a 5000 series chip into their test box, misleading Jimmy because he only has a 3000 series CPU or an Intel CPU (75% of the market is still Intel, basically), and there you go.

Then there's your argument that everybody will test with a 5000 series chip.

We don't know this. It could very well be that the chip doesn't perform the way AMD is advertising and falls below the 10900K; Intel could release a new CPU or series that is faster again, and bye-bye 6000 series gains. Also, even if the 5000 series is 1-5% faster, will benchers care to swap?

See how this is going?

Then about pricing.

The 3090 isn't worth 1500; basing your price on that card is simply dumb as hell, the same as the 2080ti wasn't worth 1200 bucks.

Then there's your part about Super Resolution being more performant, RT being better, etc. It's all wishful thinking at this point; we have nothing to indicate that's going to be the case. As it stands now they are behind the curve and have done nothing to clear this up. Also, Nvidia could adopt that solution too if they wanted to.

Look, the cards are great: I love the 16GB model, I like their raw GPU performance, and I hate them locking their cards' full performance behind specific hardware.
What AMD needs to do, in my view, is this:

Do a new presentation:
- showcase Super Resolution and its performance + release date
- show ray-tracing performance with it
- explain how they will focus on drivers and developers
- showcase a list of high-profile games people buy these cards for (hint: not Dirt 5) and show that you worked with the developers to make your card run optimally.

Create confidence.

Because what I read on tech forums is nothing but people bitching about the price (from 400 > 1000), the CPU locking, and no confidence in their drivers. That's why I stated they needed sharp prices, but they didn't do that.

We will see, though.
 
I don't care much myself for ray tracing at this point; the performance hit is still laughable, and it offers nothing but tech demos to this day. [...] We will see, though.

If I can paraphrase and distill down your main points I think they would look something like this:

1. Advertising only with RM+SAM for 6900XT is a little misleading.

2. You believe benchmarking games on this GPU with a top of the line CPU could be misleading as the average person might not be able to afford that CPU.

3. You think the 6900XT is too expensive.

4. You believe benchmarking reviews should use a bunch of different CPUs to show performance with lower tier/different vendors.

5. You would like more info about/benchmarks with RT and Super Resolution.

6. You want AMD to partner with/sponsor more high profile tentpole AAA PC titles/ports.

Would you say that was a roughly accurate summary given some artistic/editorial license?

OK, if so, let's address these one by one:

1. Advertising only with RM+SAM for 6900XT is a little misleading.

I agree. AMD want to show their card in the best possible light, as do Nvidia, Intel, Sony, etc. As people who are reasonably well informed, we should call out marketing BS and theoretical figures/lab results that don't reflect real-world scenarios. We should do this no matter who is presenting the information; I said the same during the 3000 series reveal, and it is also true here. No disagreement from me. AMD should have shown the 6900XT without SAM/RM.

2. You believe benchmarking games on this GPU with a top of the line CPU could be misleading as the average person might not be able to afford that CPU.

Can't say I agree with you here. Differences in performance due to a CPU should be saved for CPU reviews; they are not that helpful in GPU reviews, as the fairest representation is seeing the GPU unleashed at its maximum potential with as little bottlenecking from the CPU as possible. After all, it is a GPU benchmark/review, not a CPU one. While top-of-the-line CPUs are expensive, they are still products that people can actually buy for a reasonable amount of money; they are consumer grade. We are not talking about data center, server or professional-grade CPUs here that cost thousands. Also, in AMD's slides both the Nvidia and AMD GPUs were using the same CPU, so if any uplift is there they both benefit. Same for 3rd party benchmarks.
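To illustrate the bottleneck point with a toy model (all numbers invented): the delivered frame rate is roughly capped by whichever of the CPU or GPU finishes its per-frame work last.

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Toy model: the slower of the two sides caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Reviewer rig with a very fast CPU: the GPU sets the pace, so the
# benchmark actually measures the GPU.
print(delivered_fps(cpu_fps=220, gpu_fps=144))  # -> 144 (GPU-limited)

# Mid-range CPU: both GPUs under test would flatline at the CPU's
# number and look identical, hiding the gap the review is meant to show.
print(delivered_fps(cpu_fps=110, gpu_fps=144))  # -> 110 (CPU-limited)
```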

3. You think the 6900XT is too expensive.

Compared to the performance delta between the 6800XT and 6900XT, yes, it is overpriced for that small amount of extra performance. I agree with you, but products don't exist in a vacuum abstracted away from the market and competitors' products/pricing/positioning. All companies price in relation to their competitors and segments in the market. The competition for the 6900XT is the 3090, which is $1499; as such, for that segment and compared to the competition, the pricing is actually amazing, with a $500 saving for similar performance. Would it be nicer if it was cheaper? Sure, but AMD is not a charity, and if they have a competitive product it will be priced competitively. I could understand if the 3090 was $900, for example, and the 6900XT was more expensive, but complaints about the price are a little silly when the 3090 exists as direct competition.

4. You believe benchmarking reviews should use a bunch of different CPUs to show performance with lower tier/different vendors.

I think it would be nice to see how GPUs perform across the CPU stack, but doing benchmarks across 12-20 games is incredibly time-consuming as it is; I don't realistically see this happening anytime soon, for practical reasons. In addition, people are interested in how the GPU performs, not the CPU.

5. You would like more info about/benchmarks with RT and Super Resolution

No argument here; being very quiet on RT makes it appear as though they have something to hide. If they are a little behind the competition in RT performance, it would be logical to assume they would not show benchmarks that have them losing across the board. However, if the leaked Tomb Raider RT benchmark result for the 6800 is true, and if that same performance carries across to most RT titles, then I think they perform pretty well here. The second possibility is that the RT drivers may not be final yet, with more performance still to squeeze out before launch, so they didn't want to show something unfinished. There is also the fact that, even if they are happy with raw RT performance, they know that without Super Resolution being ready they would be directly compared to Nvidia GPUs running with DLSS enabled, so it would not be a favourable comparison until SR is ready.

Granted, we can only speculate, but AMD have mentioned that we will get more information about RT and Super Resolution closer to launch, so we should hear more soon. As for 3rd party reviews/benchmarks, they are going to benchmark with RT on and off for the titles that support it, so we will get to see the performance in a real scenario. Super Resolution will likely launch a bit later than the reviews, which is a pity as we won't be able to directly compare SR vs DLSS, but it is what it is. I think we all want to know more about these two features.

6. You want AMD to partner with/sponsor more high profile tentpole AAA PC titles/ports.

I agree, AMD could improve more here, although they are partnering with Far Cry 6, which is a pretty big title. The problem with this approach is that developers will go to whoever has the largest pockets/staff for support/marketing budget. Nvidia is far larger than the Radeon group, even than AMD itself, so for the foreseeable future Nvidia will have an advantage here, until AMD grows market share/mindshare/profits, which they seem to be on the right path to achieving.
 

thelastword

Banned
I don't care much myself for ray tracing at this point; the performance hit is still laughable, and it offers nothing but tech demos to this day. [...] We will see, though.
AMD has shown some impressive stats already; the gains on AMD cards with SAM are not 13% across the board, but rather up to 13%... just bigger gains in games they are already leading by a great deal. Pairing an AMD card with an Intel CPU will still give great performance vs Nvidia, because the AMD cards are faster; they are not even OC'd yet like NV cards are, and neither have we seen AIB Radeon cards yet, which will be reviewed too down the line. By launch day their drivers will be even better, so I'm not seeing the concern about lower performance from the YouTubers come launch day. If anything I expect even higher performance on top of what AMD showed, because YouTubers are going to maximize the power slider by default; they may even use the free Rage Mode OC button. Whatever they do, they will compare across the board, even with Intel CPUs, but YouTubers have always used the best CPUs for testing games; I don't know why they would need to break that trend now that AMD has potent GPUs at the high end...
The consoles using RDNA2 should help somewhat with PC optimization. Look at RDR2: it was so optimized for GCN that an RX 580 could match a GTX 1070.
Yeah, there are some games I want to see on RDNA 2, like RDR 2, Horizon, etc... I think Dirt will continue to perform better on AMD as well. So the games shown at AMD's presser were just the tip of the iceberg...
 

01011001

Banned
Is there any word on ray tracing? All these benchmarks are only rasterization, I assume?

The real test will be whether Watch Dogs + ray tracing is comparable in terms of performance.
 

Pagusas

Elden Member
Is there any word on ray tracing? All these benchmarks are only rasterization, I assume?

The real test will be whether Watch Dogs + ray tracing is comparable in terms of performance.

Nothing but leaks so far. At least we'll have Watch Dogs for reviewers to pit the cards head-to-head in terms of RT. Though I fully expect AMD to say "immature drivers" are causing depressed results and that they expect uplifts in the future.
 

01011001

Banned
Nothing but leaks so far. At least we'll have Watch Dogs for reviewers to pit the cards head-to-head in terms of RT. Though I fully expect AMD to say "immature drivers" are causing depressed results and that they expect uplifts in the future.

Well, if the Series X version is anything to go by, it won't be that bad.
But I wouldn't expect RTX 30 levels of RT performance, of course.
 

Pagusas

Elden Member
Well, if the Series X version is anything to go by, it won't be that bad.
But I wouldn't expect RTX 30 levels of RT performance, of course.
That's my expectation as well. I think we'll see 30% of the raster performance of a 3080 with RT when head-to-head without DLSS. I think AMD's DLSS solution will need to get here quickly for them to be able to compete, though.
 