Quote: "AMD's recent GPUs have all been very late responses to what Nvidia has long had on the market, and only then they come out with a marginally better value product, if the only thing you really care about is performance/$ in mainstream benchmarked games. Try a little something outside mainstream like OpenGL emulation on CEMU, and performance was atrocious until finally we're getting Vulkan options, which we can't really thank AMD for."
Uh... You're aware that Vulkan would not exist without AMD's Mantle, right?
As for the whole OpenGL emulation on CEMU... How is that AMD's fault? Isn't it the job of the developers to optimize it for AMD hardware?
Quote: "And let's not go into AMD's old DX9/DX11 CPU overhead issue that they never fixed. These sort of issues prop up left and right once you go out of the comfort zone that is the mainstream benchmark games, and even those have some doozies that AMD often fails to address. I don't want it to be like that, but that's just reality."
Do you know what that CPU overhead 'issue' actually entails? Let me give you a hint. Why does Crysis, even to this day, run like sh*t on even the latest hardware?
AMD did improve a lot on the misnamed 'CPU overhead' issue through driver updates, but I guess either nobody knows or nobody really cares.
Yes, things are like that, but the why is still important.
Quote: "There's far more to Nvidia's feature support than just Ray-tracing. They've pushed most of the real improvements in the industry, beginning with things like G-Sync. AMD has merely followed suit. Things like Chill, Boost, Async compute, what do these really offer me when AMD GPUs are louder, slower and demand more power regardless? FreeSync was useful response to proprietary G-Sync, but it would not exist without it either, and most often the implementations were sub-par."
How can you say AMD 'follows suit' and use Chill and Boost as an example when nVidia never implemented those? Additionally, this really shows how skewed your perspective really is... I mean...
Who uses smaller nodes first? (In Before "They do it because they need to!")
Who had DX 10.1 features first?
Who came with a unified shader architecture first?
Who first came with tessellation support on their cards?
Who was the first to put sound through HDMI on their cards?
Who had DX12 support first?
Who had concurrent async compute first?
And how exactly did the whole Anti-Lag thing go? Oh right... nVidia lied about having done it first, and then released a low latency mode to compete only after AMD had already shipped Anti-Lag. But somehow it's AMD that follows suit...
But seriously, AMD cards are rarely louder, slower and more power hungry all at the same time, especially if you take AIB cards into account. And reducing power consumption is exactly what Radeon Chill does, which no one cares about because it's specifically AMD's. It cuts power draw significantly, if you really, truly care about that. In reality, people don't care about it, except to trash AMD.
I disagree that FreeSync would not exist without G-Sync. Maybe the timing of it changed, but it was pretty much inevitable. And if we start talking about variable refresh rate over HDMI, nVidia still does not have that. And that is one example that has dragged on for ages on nVidia's side. But I guess people only pay attention when it's AMD dragging their feet and missing certain features....
Quote: "Then there's the software feature set. AMD has a new pretty GUI, and half the shit in it doesn't work on their newest GPUs as per Gamers Nexus. Nvidia's control panel is old, but at least it works. Then there's features like Ansel and now adding Reshade filter support on top of what they already have. Where's AMD's equivalent OSD features? They took forever even with Relive or whatever they call the equivalent of Shadowplay these days. Even if AMD manages to bring some marginally useful software feature that Nvidia doesn't have, image sharpening for example, it's usually replicated in no time, whereas the opposite takes forever or never happens at all. That's the reality of the differences between AMD and Nvidia when it comes to software support and resources they can dedicate."
Hello variable refresh rate over HDMI again.
Radeon Chill?
ReLive is now practically superior to ShadowPlay. Look at which one is awarded best for features:
www.lifewire.com
As for the Gamers Nexus stuff that doesn't work: did you look at the fixed issues list in the latest driver release? Here;
Are there issues? Yes. But at least AMD is open about them in their driver release notes, listing which issues are fixed and which are still pending. nVidia's issues are never published like this; just go to the nVidia forums and you'll see how many issues people have. But somehow, that still is not seen as a problem among the gaming community. It's still true that nVidia has way more resources than AMD, which makes it even more baffling that AMD is slammed for what they offer, rather than praised, considering their limited resources.
Quote: "I'm not acting like it's bad in itself. Not sure how you got that idea."
Maybe because you said "if this GPU comes some time in H2 and goes right against Nvidia's next gen, suddenly it'll just be another 5700XT at best".
Expressed that way, it comes across as extremely belittling.
Quote: "Not sure how you got that idea. I got a 2070S for ~100€ more compared to what I could've gotten a 5700XT for a few reasons. I play games like BOTW on CEMU, and modded Skyrim LE. Performance with these games on Radeon is atrociously bad, due to their ignored optimizations for things like OpenGL and CPU overhead issues like I already mentioned. I play on a 4K panel, and even 2070S isn't fast enough for it, and going 5700XT would not help there and beyond there's no choice at all. AMD cards also like to suck a bunch of idle power for no reason on multi-monitor setups. I could go on endlessly about all these sort of little issues that AMD never gets around to addressing, which ultimately makes it more reasonable to pick an Nvidia card. It's so much more than just the perf/$ in currently benchmarked titles, and that's where Radeon starts to stumble."
Remember this?
NVIDIA Finally Fixes Multi-Monitor Power Consumption of Turing GeForce 20 (www.techpowerup.com)
Didn't think so. Because it only matters when it's AMD.
Node "advantage"... The reason nVidia is practically always later on a node is because they prefer the matured process. Releasing products on a smaller nodes early is not really an advantage at all.You seem to forget AMD has a node advantage, one that'll be erased this year.
Having said that, even if you correct for node size, the 5700XT's die is still smaller than the 2070 Super's. The architecture advantage of RDNA will be reduced but will not go away after nVidia's node shrink, if things remain similar. This might change with AMD's incorporation of hardware RT, but we'll see.
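As a rough, node-agnostic sanity check, here are the publicly listed figures (quoted from memory, so treat the exact numbers as approximate): Navi 10 (5700 XT) is about 251 mm² with roughly 10.3 billion transistors, while TU104 (2070 Super) is about 545 mm² with roughly 13.6 billion transistors. Even ignoring process density entirely:

\[
\frac{13.6\ \text{B transistors (TU104)}}{10.3\ \text{B transistors (Navi 10)}} \approx 1.32
\]

So the Turing chip carries roughly a third more transistors on a much larger die, which is what the "still smaller after correcting for node" point is getting at.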
Additionally, AMD has reserved practically the whole of TSMC's 7nm wafer capacity. nVidia would have to go to Samsung for their chips, and it's well known that TSMC's 7nm tech, both DUV and EUV, is superior to Samsung's where it matters most: yields.
Quote: "Clock to clock is a useless metric in real world products."
No it isn't. It shows how slow or fast an architecture is. It becomes arguably useless in end products if the clock speeds vary by a lot, like in the case of Gen 1 Ryzen (4GHz) and Intel's CPUs (5GHz). But for GPUs of AMD vs nVidia the clocks aren't really that different, so that makes it extremely relevant.
While Vega was about 15% slower than Pascal, RDNA is equal to Turing (technically it's 1% faster, but that's margin of error stuff). Put differently, RDNA is a 39% uplift in per clock performance compared to Polaris, and 28% over Vega. If you don't understand how huge this is, no one can help you. And this is the reason why some of us are quite optimistic about big Navi and RDNA2.
And if what they just said at CES is true, they should not be underestimated. They updated the Vega CUs and claim a 59% uplift in performance. Combined with the 28% figure above, that would put the updated Vega at roughly 24% higher performance per clock than RDNA 1... So.. Yeah...
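For transparency, here's the arithmetic behind that figure, taking the 28% and 59% numbers above at face value (a back-of-the-envelope sketch, not a measurement):

\[
\frac{1.59 \times \text{Vega}}{1.28 \times \text{Vega}} = \frac{1.59}{1.28} \approx 1.24
\]

In other words, the updated Vega would sit roughly 24% above RDNA 1 per clock; simply subtracting the percentages (59 - 28 = 31) overstates the gap.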
Quote: "I'm not saying AMD is dragging because Nvidia has a lot of cards on the market. I'm saying they're dragging because they don't have competitive products out on the market in several segments. Any fool can see that. They had essentially 5700XT and non-XT last year, everything else was either old or effectively useless in terms of moving the market. This year they'll have Navi 20. That's probably it. Nvidia in turn will likely refresh their whole stack or close to it."
As already mentioned, the 5700 series is the go-to card for anything over $300. If someone buys nVidia, they were most likely going to buy nVidia anyway.
Vega 56 and 64 were pretty good deals last year, despite their age. The 5600 series is coming out now, and is the obvious choice for the $200 - $300 range. But I guess it will be considered 'late', because it's AMD.
Polaris, especially the RX 570, was still the king in the $100 - $200 segment, except no one cared about it because it's not an nVidia card. The nVidia cards released later in this segment were never considered 'late', despite Polaris dominating that range for years.
AMD is doing what they need to be doing. They are following their own releases, rather than adapting to nVidia. I don't know if you noticed, but, after a long while, nVidia has released a huge list of cards to actually combat AMD's releases. When was the last time that happened? The momentum has shifted from AMD adapting to nVidia, to nVidia adapting to AMD. nVidia are faster for sure, but I see a change in trend. Most don't see it yet.
Quote: "If AMD is just competitive, people still go for Nvidia, right? It's not the job of consumers to help failing companies. It's their job to sell the product, and in AMD's case that's going to require a product faster and more efficient than Nvidia."
It's not the job of consumers, but it's a good idea to be conscious of what you're supporting with your money.
People blindly buying nVidia cards is the equivalent of people blindly putting money into loot boxes. And when someone comes around and says that there are no loot boxes in this publisher's game, so buy this game instead, everyone starts saying that the loot boxes add to the gaming experience and that games with loot boxes are superior. And then they go on to say that it's the job of the publisher without the loot boxes to convince people to buy their game. Sounds like those people are already convinced, and that publisher is better off not bothering.
As for what AMD requires, nobody really knows. There was no real competition for the RX 570 for quite a while. No one bought it. Everyone went for the 1050 Ti instead.
When AMD releases a card that is faster and more efficient, people will start talking about drivers.
When AMD fixes the drivers, people will start talking about price.
When AMD lowers the price, people will start talking about ray tracing.
When AMD adds ray tracing, people will start talking about whatever else is required to still justify how nVidia is better.
Quote: "There's absolutely no worry right now that AMD is going to deliver a product that will dominate Nvidia in the GPU market. They could have a dominating product for half a decade and they still wouldn't be where Nvidia is right now in terms of market leadership."
And do you see that as a good, or a bad thing? To me, that is one of the worst things to happen in this industry.
And I'll say it again. People thought the same about Intel. Yeah yeah, nVidia isn't sleeping like Intel is. I get it. But AMD's CPU brand making a name for itself will inevitably carry over to their GPU sales. Because people are shallow like that. Don't underestimate what can happen.
Quote: "If AMD can push some next gen console RT/Navi advantage in the PC market, great for them, but we've all heard these stories before. They amounted to less than nothing this gen."
They didn't have an updated architecture to go along with it.
Quote: "I would like nothing more than AMD to actually offer better products."
So that you can buy nVidia cheaper, right?
Quote: "As it stands, we're a long way off from that. I don't bemoan people who get themselves an AMD GPU, there's plenty of use cases where they offer plenty of value and sufficient support, aka the mainstream volume segment I talked about."
And for some reason the mainstream doesn't really use AMD cards. Why is that?
Quote: "I bemoan people who think there's no reason to pick Nvidia other than overpaying for more performance or blind brand worship."
There are always reasons. But the majority of reasons given are either obviously biased or simply excuses. It's basically the same reasons that people use when they prefer an iPhone rather than an Android phone. The funny thing is that the majority of the large companies see the value in AMD. But for some reason, gamers don't. If AMD really was as trash as people make them out to be, they wouldn't be the primary choice for consoles, super computers and even Apple products.
Quote: "Or people that claim AMD is on par with their feature set outside 'that useless ray-tracing feature'. That's crassly oversimplifying things. On paper Radeon is always better perf/$. Once you start playing with them (year or two later) and shit don't work right (you don't play that mainstream benchmark), it might not feel that way anymore."
I find this argument quite amusing, considering nVidia generally does way better in mainstream benchmarks than AMD, and AMD products perform better and better over time.