
Why are nvidia dominating amd with so much higher prices?

twilo99

Member
I've had no major driver issues with my 6800xt and overall it plays all current titles I've tried very well so I dunno
 

KungFucius

King Snowflake
The 4090 is cheaper than the 3090Ti and cheaper than the 3090 adjusted for inflation. The 'problem' is that the 4080s are priced relative to the 4090 instead of less than half the price like last gen and people don't like that.
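The inflation claim above is easy to sanity-check. Here's a quick sketch in Python; the launch MSRPs and the cumulative CPI figure are my own rough assumptions, not numbers from this thread:

```python
# Sanity check: is the 4090 really cheaper than the 3090 adjusted for inflation?
# Assumed launch MSRPs: RTX 3090 $1,499 (Sept 2020), RTX 3090 Ti $1,999 (2022),
# RTX 4090 $1,599 (2022). US CPI rose roughly 14% from late 2020 to late 2022.

CPI_2020_TO_2022 = 1.14  # approximate cumulative US inflation, an assumption

msrp_3090 = 1499
msrp_3090ti = 1999
msrp_4090 = 1599

# Express the 3090's launch price in 2022 dollars
adjusted_3090 = msrp_3090 * CPI_2020_TO_2022

print(f"3090 MSRP in 2022 dollars: ${adjusted_3090:.0f}")
print(f"4090 cheaper than 3090 Ti: {msrp_4090 < msrp_3090ti}")
print(f"4090 cheaper than inflation-adjusted 3090: {msrp_4090 < adjusted_3090}")
```

With those assumed numbers, the adjusted 3090 lands around $1,700, so both comparisons come out true.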

Look it gives the perfect opportunity for AMD to come in and take over in the 500-900 range if they have something compelling. If AMD has something better then they can win some mindshare. If people buy a launch 4090 and find out the AMD top card is better and cheaper they will remember that. If not, they will continue to ignore AMD and get in line on day 1 for Nvidia.
 

bitbydeath

Member
People don’t like change.
If price wasn’t a factor everyone would have moved onto the superior iMacs, but many are set in their ways with what they already use.
 

Soltype

Member
Feels like AMD is always playing catch up. I can't speak for anyone else, but buying an AMD card feels like settling as opposed to getting a bargain.
 

LiquidMetal14

hide your water-based mammals
Much like when Intel was the best but more expensive, I migrate to Nvidia... for now.

I'm all AMD now and not loyal to just one brand, even on GPUs.
 

supernova8

Banned
People who complain about Nvidia pricing and threaten to go to AMD never actually do it, they just want AMD to undercut Nvidia and force Nvidia to lower prices so they can buy Nvidia again.


Yeah I've actually jumped between NVIDIA and AMD quite a lot over the years and I can say I've never had any issues with NVIDIA GPUs but have semi-regularly had problems with AMD (ATI at the time) cards including driver issues and heat issues.

The interesting thing is that AMD managed to turn the tables on Intel despite having a terrible product offering for many years (basically from the Phenom era until Zen it was just hot trash that couldn't compete, despite seemingly having way more cores). Also consider that switching from Intel to AMD requires a major leap of faith, because once you've bought in you're kinda stuck with it as your "platform", as opposed to GPUs, which can be switched out year to year.

With GPUs, setting aside the driver and heat issues (which seem to be gone with the RX 6000 series), I don't recall AMD having a really bad product. They may be behind NVIDIA (and have been clearly behind since around the 8800 series if my memory serves) but it always seems like they're "good but not good enough" as opposed to "terrible".

With that in mind, it makes you wonder why they cannot "beat" NVIDIA when they have shown they can beat Intel.

I reckon it's simple - Intel dropped the ball so Ryzen ended up looking like the better option compared to Intel's stale offering.
In contrast, Nvidia has almost never dropped the ball so even if Radeon innovates, GeForce is still ahead.
In other words, maybe AMD just got lucky against Intel and it's unlikely to happen against Nvidia.
 

Neo_game

Member
Some theories say they control inventory a lot better.

By making their cards seem scarce/not as readily available as AMD GPUs, they end up selling more. It's obvious that they have a lot of RTX 3000 stock now, but instead of dumping it they'll make RTX 4000 more expensive. With RTX 2000 and 3000 it's obvious they never wanted to flood the market.

They also sell directly to a lot of OEMs, as well as miners (before the crash).

Better hashrate and better resale value.

Since those cards used to pay for themselves through mining, in the long run that made them more profitable per unit.

They are, but FSR2 is closing the gap on DLSS.

It's not that they needed tensor cores for DLSS: Control shipped with an early version (often called DLSS 1.9) running on regular shader units before Nvidia revamped it to require them. It's just that the tensor cores were free to do DLSS there, seeing as they weren't being used for anything else in a gaming scenario.

Nvidia using them for that, though, most likely limits their use for AI or other game-related tasks if developers wanted to. So AMD doesn't really need to have them if they have more performance in the first place. For games, that is.

Tensor Cores of course are valuable for professional workloads and AMD doesn't have a solution for that.

With RT, AMD's solution is seemingly similar, adding ray accelerators to the CUs. It can pay off later on against dedicated units as well, if the GPUs have overhead, that is. Again, it's "free" on Nvidia because the dedicated units would be sitting unused otherwise.

They should, yes.

Undercutting them in price shouldn't be hard though.

I also want AMD to do well and be competitive, and I think RDNA3 will be good, although I personally haven't bought an AMD card, like most guys here lol. Competition is good for consumers when both manufacturers are competitive, just like how it is in the CPU space ATM, but AMD's market share is still low in both CPUs and GPUs. Nvidia is just too big; I didn't even know they were richer than Intel until recently. https://www.pcgamer.com/which-is-bigger-intel-amd-nvidia/
 
NVidia branding and marketing is a huge advantage. Just like it was against Intel on the CPU front, AMD only has a chance if they completely outclass NVidia.
 

Ironbunny

Member
I can only speak for myself, but I've been locked into Nvidia due to a G-Sync monitor. Now that I'm forced to buy a new monitor, AMD is an option again. Having a monitor that supports both FreeSync and G-Sync means in future I can leap between brands. I'm mostly hoping AMD does a Zen for GPUs.

The same talk was going around when Intel ruled the CPU world, but now you can't go wrong whichever you choose.

In contrast, Nvidia has almost never dropped the ball so even if Radeon innovates, GeForce is still ahead.
In other words, maybe AMD just got lucky against Intel and it's unlikely to happen against Nvidia.

It's not luck if you actually innovate and the competition does not push forward. And unlike Intel, Nvidia seems to be moving forward every generation.
 

Hoddi

Member
Yeah I've actually jumped between NVIDIA and AMD quite a lot over the years and I can say I've never had any issues with NVIDIA GPUs but have semi-regularly had problems with AMD (ATI at the time) cards including driver issues and heat issues.
I can kinda agree with that since I've also been using both for two decades now. But I think people tend to conflate stability issues with performance issues and I don't think nvidia has been any better in terms of stability. I think both are actually quite fine in that regard (overall) but nvidia's drivers have caused me more than a few BSODs when their cards are still new. Things like sleep mode causing the KM driver to crash.

My issues with AMD have always been purely down to performance and mainly in their OpenGL and D3D11 drivers. But there's like 20 other drivers in a driver package and I've never really had any issues with the rest of them. System stability has been no better or worse than nvidia's, in my opinion, and their D3D12/Vulkan drivers even seem to edge out nvidia nowadays.
 

supernova8

Banned
Its not luck if you actually innovate and the competition does not push forward. And unlike Intel Nvidia seems to be moving forward every generation.

Yeah, maybe I worded that badly. What I meant is that if NVIDIA were in Intel's shoes, they'd probably have been innovating for many years before that, so that when Ryzen finally came out the competition would be so far ahead that it wouldn't be considered "innovation", rather "catching up".
 

twilo99

Member
RDNA3 and AM5 will be a very potent combo..

With both consoles running on RDNA I think we will end up with a lot of well optimized games for that architecture too.
 

PhoenixTank

Member
Honestly I don't like having 30% of the silicon dedicated to AI/RT. Nvidia also increased the L2 cache by a massive amount, which will eat even more silicon. If they made a GPU without all that specialized silicon we could end up with much faster GPUs that can handle 4K with ease. Also, Nvidia is using this tech to make their previous GPUs obsolete very fast.
This is a bit off on a tangent from the main topic but I haven't seen it mentioned elsewhere on Gaf:
Twitter analysis so, handful of salt required... but apparently the Ada SM (Streaming Multiprocessor) is barely changed over Ampere. Yes they've heavily pumped up L2 cache and they can fit more SMs, clocks will be higher, and they have that interesting reordering operations tech etc. which will all help performance but my understanding is that architecturally not much has improved on the raster side. This would somewhat explain why Nvidia is very much sticking to "Look at our RT performance! Look at how fuckin' sweet that DLSS3 framerate is! 2X-4X better!" in their own comparisons against last gen.
I don't typically spend my days looking at block diagrams but:

Ampere:
[Ampere SM block diagram]

Ada:
[Ada SM block diagram]

Ignore sizing, that isn't representative but the layout and counts are really damn similar.
Those left INT32 and FP32 blocks are pregrouped on the Ada block diagram, as one datapath, and that is also the case with Ampere. Second datapath has a bit of a discrepancy showing FP64 on Ampere rather than FP32 but I'm pretty sure that is just down to a datacenter block diagram for Ampere vs consumer Ada.

Again, I could be misinterpreting, and I wouldn't have thought much of it if not for some replies on the tweet, but it suggests that the meat of those big performance gains is elsewhere rather than across the board.
Could make for a very interesting result over the next couple of months.
 

PeteBull

Member
NVidia branding and marketing is a huge advantage. Just like it was against Intel on the CPU front, AMD only has a chance if they completely outclass NVidia.
Branding and marketing play a role, but believe me, no marketing is going to help if your card crashes constantly because of unstable drivers, as was so often the case with AMD products. And yep, even if only 5% of their cards crash like that, it can make huge waves. A customer lost not to price or performance but to the actual usability/stability of a card is a customer you lose for a very long time, if not forever.
 

Kumomeme

Member
Nvidia currently has advantages in terms of mindshare,

so it is not easy for AMD to turn the tables in the blink of an eye.
 
People just trust Nvidia more when it comes to software support and support for new features.
That sounds like a sound strategy if you need those features. But while Nvidia has the creativity to go ahead and implement these "next gen" features in their hardware, they're always proprietary, leaving the rest of the industry to figure out how to do it in an open environment.

CUDA = OpenCL
Nvidia G-Sync = FreeSync and VRSS
DLSS = FSR
RT Cores = Ray Accelerators on compute units?

Probably more. I really value AMD's ethos here: when they make their own solution to something, they actually release it openly, and if their solution is not the one picked despite being open (or is renamed and developed further), they actually contribute to the development of the winning solution.

Nvidia is only focused on maintaining the lead instead, but it works for them. We need AMD, and we also need Nvidia, but they could stand to be a better company than they are. I never had extensive problems with AMD drivers personally, and I had a PC with a top-range Nvidia GPU crashing every day due to something GPU-related, so I never understood that part of the issue. But that's just my own personal experience.

What I never enjoyed is the extra AMD heat. I'm not sure I was comparing apples to apples, since I wasn't running benchmarks side by side, but damn.
 

Admerer

Member
It started with drivers, before Nvidia's exclusive features like PhysX, G-Sync, RTX, etc.
ATi had a bad reputation with its drivers; Nvidia has always had a better driver/software team and was one step ahead of ATi even when they had inferior hardware (5870 vs GTX 480).

There was always a feature or two that Nvidia cards had that was lacking on the red side, even small stuff like RivaTuner or better compatibility with emulators.
Even now, if you ask any VR enthusiast or streamer what graphics card to buy, they'll say Nvidia, just because they have a better video encoder right now.

I think Nvidia GeForce is also a better brand to market than ATi Radeon; it just sounds cooler. GeForce even sounds fast.
 

Nocturno999

Member
I always had issues with AMD cards. Drivers are just terrible.

Still, I'm going to skip the RTX 40 series and keep my 2070 Super for a couple of years more.
 

Admerer

Member
When AMD bought ATi, AMD was way behind Intel in market share and mindshare, so when the average consumer saw the AMD brand they thought of a second-rate CPU manufacturer, which didn't help its graphics brand.

Plus, Nvidia having the high-end performance crown the last few generations meant enthusiasts/influencers shilled for Nvidia, hurting the AMD graphics brand.
 