
Nvidia financial results for third quarter of FY23 - 51% YoY decrease in gaming GPUs

Gaiff

Gold Member
Oh really? You might want to let the stock trackers know, who have an entire month's history of 1-2 minute sell-outs (which are actually instantaneous sell-outs, because 1-2 minutes just happens to be how long it takes retail product pages to update and trackers to log it).
Seems to be an issue in the US. You can find them here but they do sell out quickly. If you see 10 in stock, they'll likely be gone within an hour.
 

chromhound

Member
 

winjer

Gold Member
This is bad. Very bad. They can't allow their stock to tank, so they will overcharge us to maintain high margins and satisfy their investors.

A company like Nvidia makes its profits by selling millions of units, not by selling thousands.
If they price their GPUs out of most consumers' reach, they get far fewer sales.
The 4090 is a prosumer card. That huge VRAM buffer is very useful for rendering, editing, game making, etc.
Miners could justify paying thousands of dollars per GPU. But few people can justify paying thousands of dollars just to play games.

These tech companies have to understand that the last 2 years were the exception, not the rule.
The sooner they get this into their heads, the sooner they start selling normal quantities.
 

Reallink

Member
Seems to be an issue in the US. You can find them here but they do sell out quickly. If you see 10 in stock, they'll likely be gone within an hour.

Because your retailers are pre-scalping them. If US retailers sold them all at $2200+, they'd sit in stock here as well.

A company like Nvidia makes its profits by selling millions of units, not by selling thousands.
If they price their GPUs out of most consumers' reach, they get far fewer sales.
The 4090 is a prosumer card. That huge VRAM buffer is very useful for rendering, editing, game making, etc.
Miners could justify paying thousands of dollars per GPU. But few people can justify paying thousands of dollars just to play games.

These tech companies have to understand that the last 2 years were the exception, not the rule.
The sooner they get this into their heads, the sooner they start selling normal quantities.
They make the bulk of their profits now by selling to corporations for cloud and AI. They have no reason or incentive to waste wafers on a low-margin part like an affordable 4060. One Quadro sale is worth something like twenty 4060s, and at $4 billion a quarter, corporations are buying basically the same volume as gamers ever were.
 
Last edited:

Gaiff

Gold Member
Because your retailers are pre-scalping them. If US retailers sold them all at $2200+, they'd sit in stock here as well.
No, they're selling at MSRP or AIB prices, which closely match US prices when converted. I've been monitoring for a few weeks and there are stock drops every day or every other day.
 
Last edited:

winjer

Gold Member
Because your retailers are pre-scalping them. If US retailers sold them all at $2200+, they'd sit in stock here as well.


They make the bulk of their profits now by selling to corporations for cloud and AI. They have no reason or incentive to waste wafers on a low-margin part like an affordable 4060. One Quadro sale is worth something like twenty 4060s, and at $4 billion a quarter, corporations are buying basically the same volume as gamers ever were.

True, they make a lot of money in the server and professional markets.
But the gaming market also makes a lot of money. Margins may be smaller, but the market is bigger.
As long as there are wafers, it's worth making gaming GPUs.
Mind you, it was the gaming division that, for nearly two decades, paid for the development of all the other sectors.
 

Reallink

Member
True, they make a lot of money in the server and professional markets.
But the gaming market also makes a lot of money. Margins may be smaller, but the market is bigger.
As long as there are wafers, it's worth making gaming GPUs.
Mind you, it was the gaming division that, for nearly two decades, paid for the development of all the other sectors.

Contrary to fake rumors spun up to drive clicks and likes about them scrambling to offload excess wafers, there is very clearly nowhere near enough capacity. They can't even come close to satisfying demand for a (comparatively) ultra-low-volume $2000 halo product, never mind an XX60 that'll move 20x this volume.

Same goes for the made-up horseshit about warehouses sitting on mountains of 3XXX inventory. EVGA cleared out every one of their 3XXXs within an hour or two of discounting them. Obviously that wouldn't be possible if they were really sitting on these unfathomably large surpluses of cards. Logically, the AIBs themselves would be getting into price wars with each other, desperate to sell their unsellable inventories. In reality, all the 3080s and lower (which don't have 4XXX replacements on the horizon) are still selling for well above FE MSRP. Sure looks like Nvidia and the AIBs are worried about overstock they'll never sell /s. I don't know who makes this shit up, or to what end.
 
Last edited:

supernova8

Banned
We're sort of fucked in the short term because the scalping pandemic has demonstrated that people really are willing to pay these stupid prices.

Plus, you can't hand wave it away with "oh crypto miners" because look at stuff like the PS5, people are still seemingly willing to pay inflated prices well above MSRP for it, and it has no use for mining.

I think part of it is because people have built up a lot of wealth over the last 2 years or so (either through saving or investing), due to having excess capital as a result of not going out much (or as much) following COVID. Therefore, we'll probably have to wait for people to burn through said extra capital (read: wait for people to not have the extra financial headroom they had) before GPU prices come back down.
 
Last edited:

Reallink

Member
We're sort of fucked in the short term because the scalping pandemic has demonstrated that people really are willing to pay these stupid prices.

Plus, you can't hand wave it away with "oh crypto miners" because look at stuff like the PS5, people are still seemingly willing to pay inflated prices well above MSRP for it, and it has no use for mining.

I think part of it is because people have built up a lot of wealth over the last 2 years or so (either through saving or investing), due to having excess capital as a result of not going out much (or as much) following COVID. Therefore, we'll probably have to wait for people to burn through said extra capital (read: wait for people to not have the extra financial headroom they had) before GPU prices come back down.

GPU mining has been dead for at least 6-8 months, and was officially coffined and buried 2 months ago when Ethereum went proof-of-stake. A new shitcoin will likely rise to take its place some day, but for now this is 100% real demand; there's nothing artificial propping it up. It's not just GPUs either: a bunch of products are doing gangbusters at prices 50%, 100%, and even 200% higher than they were a couple of years ago. People are still paying thousands of dollars over sticker for cars 4-6 months in advance. Unfortunately, the root cause is not transitory stay-at-home savings or stimulus checks; it's the raises so many businesses doled out (and continue to dole out) to combat labor shortages. Meaning it's permanent, even if you choose to ignore the fact that prices never go back down once they go up.
 
Last edited:
Tip to all the people waiting for AMD to launch price-competitive GPUs just to buy Nvidia GPUs at a lower price later on:
- Don't be surprised if AMD eventually stops making PC GPUs and Nvidia starts charging $2000 for a 3060 equivalent.

Why would AMD stop making GPUs?
 

supernova8

Banned
GPU mining has been dead for at least 6-8 months, and was officially coffined and buried 2 months ago when Ethereum went proof-of-stake. A new shitcoin will likely rise to take its place some day, but for now this is 100% real demand; there's nothing artificial propping it up. It's not just GPUs either: a bunch of products are doing gangbusters at prices 50%, 100%, and even 200% higher than they were a couple of years ago. People are still paying thousands of dollars over sticker for cars 4-6 months in advance. Unfortunately, the root cause is not transitory stay-at-home savings or stimulus checks; it's the raises so many businesses doled out (and continue to dole out) to combat labor shortages. Meaning it's permanent, even if you choose to ignore the fact that prices never go back down once they go up.
Haha, that last bit really hit hard! Fucking annoying, isn't it? They (corporations) always find a way/excuse to keep prices high once they go up.
 

CuNi

Member
Good. Let them bleed money for their greed.

Next gen GPUs need to drop the price by at least 30% before I'm even going to consider buying on release day.
 

Buggy Loop

Member
What is it YoY? I expect last year was unusually inflated by the interest in crypto mining.

This

All tech companies are downtrending from the stupidly inflated COVID times. Like, no shit you didn't sell as much as during the crypto/COVID craze.
 

Buggy Loop

Member
I don't know who makes this shit up, or to what end.




And a few others

It’s ok, let your brain exit this existence and come aboard the AMD hype train, where doubting every little tidbit is tinfoil-hattery and this time we beat evil Nvidia! Oh wait, 80% of the stupid rumours surrounding RDNA 3 last year were already debunked? Never mind that, to hype and beyond. Sit on that MOUNTAIN of Ampere cards, Nvidia, lolololol.
 

Haggard

Banned
We're sort of fucked in the short term because the scalping pandemic has demonstrated that people really are willing to pay these stupid prices.
Lol, what? 90% of the ridiculously priced hardware went to miners... and that market segment ceased to exist.
 

Kadve

Member
I really hate this whole "people aren't buying because our stuff is too expensive, let's jack up prices even further so that we can keep profits the same" mentality every company seems to have now.
 

Buggy Loop

Member
For people like Buggy Loop, NVidia can do no wrong. And AMD is always wrong.

I guess 30 years with ATI/AMD fried a few brain cells. Too high on copium all those years for the next Nvidia killer; it was hard to stop the dopamine rush of all the stupid rumours that always fizzled into nothing.

The thing is that we clearly don’t live in the same universe, and most people here have no fucking clue about tech. You only have to look at the discussions we had about the lighter PS5 cooling radiator to get an idea, or the fact that even /r/AMD has lower expectations than some users here.

AMD is going to see the same downturn as any tech company after the COVID craziness, at the dawn of a recession. Just look at the Ryzen 7000 series not flying off the shelves; you think that won’t reflect in their profits?

I mean, all aboard the “hur hur hur Nvidia hurt, Nvidia bad, yayyyy!” train, let’s just leave the brain at the door.


Interesting take … what exactly did he “debunk“?

That Nvidia / PCI-SIG did not fuck up the cable.

User error being the vast majority, with clear indication from the cables sent to him that they were not properly seated, and a possibility of FOD (foreign object debris) sprinkled in, as with anything manufactured.

All of Igor’s and JayZ’s speculation: out the window.



I mean just take 30 mins of your time to watch this.

This is why Gamers Nexus is a league apart from all other tech YouTubers.
 
Last edited:

winjer

Gold Member
I guess 30 years with ATI/AMD fried a few brain cells. Too high on copium all those years for the next Nvidia killer; it was hard to stop the dopamine rush of all the stupid rumours that always fizzled into nothing.

The thing is that we clearly don’t live in the same universe, and most people here have no fucking clue about tech. You only have to look at the discussions we had about the lighter PS5 cooling radiator to get an idea, or the fact that even /r/AMD has lower expectations than some users here.

AMD is going to see the same downturn as any tech company after the COVID craziness, at the dawn of a recession. Just look at the Ryzen 7000 series not flying off the shelves; you think that won’t reflect in their profits?

I mean, all aboard the “hur hur hur Nvidia hurt, Nvidia bad, yayyyy!” train, let’s just leave the brain at the door.

Give me a break. I have had more Nvidia GPUs than AMD, including the one I have now.
Just because other people are not Nvidia fanboys does not mean they are AMD fanboys.

You are constantly spamming AMD threads with negative comments. Constantly with that "copium" nonsense, trying to bait people into arguing with you.
And in Nvidia threads, you are constantly defending them, even when they do badly.
A lot of people have already noticed your bias.
 
Last edited:

Dr.D00p

Member
The 4080 FE has just been restocked at the Nvidia store here in the UK, and it's still showing as in stock a full 15 minutes after dropping!

Lol... the cheapest (and best, IMO) 4080 you can buy, and not sold out instantly just one day after launching... Nvidia should take heed of what this is telling them.
 

adamosmaki

Member
I guess 30 years with ATI/AMD fried a few brain cells. Too high on copium all those years for the next Nvidia killer; it was hard to stop the dopamine rush of all the stupid rumours that always fizzled into nothing.

The thing is that we clearly don’t live in the same universe, and most people here have no fucking clue about tech. You only have to look at the discussions we had about the lighter PS5 cooling radiator to get an idea, or the fact that even /r/AMD has lower expectations than some users here.

AMD is going to see the same downturn as any tech company after the COVID craziness, at the dawn of a recession. Just look at the Ryzen 7000 series not flying off the shelves; you think that won’t reflect in their profits?

I mean, all aboard the “hur hur hur Nvidia hurt, Nvidia bad, yayyyy!” train, let’s just leave the brain at the door.




That Nvidia / PCI-SIG did not fuck up the cable.

User error being the vast majority, with clear indication from the cables sent to him that they were not properly seated, and a possibility of FOD (foreign object debris) sprinkled in, as with anything manufactured.

All of Igor’s and JayZ’s speculation: out the window.



I mean just take 30 mins of your time to watch this.

This is why Gamers Nexus is a league apart from all other tech YouTubers.

Definitely, the whole thing was blown out of proportion, considering almost everything is user error. But IMO Nvidia should have used an IC on the cable to detect whether it's properly seated or not. It wouldn't really cost them that much.
 

Buggy Loop

Member
Give me a break. I have had more Nvidia GPUs than AMD, including the one I have now.
Just because other people are not Nvidia fanboys does not mean they are AMD fanboys.

Oh, you took it personally? I’m referring to myself as an AMD fan, lol. But hey, if the hat fits.
 
Nvidia can go fuck themselves. The prices they ask are insane. I'd like to upgrade my card, but there's no way I'm paying stupid money for one. I'm not buying a last-gen 3000 card.

Even if you can get one at a good price, there is the ridiculous cable/power problem. I would need a completely new PSU, burn through more electricity, and possibly damage the card, case, or even my house.

I hope the new AMD cards deliver. I'm not looking for an insane performance increase. My current card is a 2080 and I'm playing at 1440p 144Hz. Not interested in ray tracing either.
 

Buggy Loop

Member
I'm just tired of seeing you shitpost on every thread.
Everyone else is trying to have a conversation, until you show up.

Everyone is trying to have a wet dream, you mean, and an uninterrupted circle jerk. You seriously want to go through the list of nonsense rumours RDNA 3 had in the past year that fell flat on their face?

Let’s pick one of many, which btw MLID and his church even delete videos of after they’re proven to be bollocks.




The dies are not 3D stacked and there is one single compute die not multiple compute dies.

There is only 96MB of infinity cache not 512MB.

The memory controller is 384-bit and not 256-bit like he claimed with “very high confidence”

It uses PCIe 4.0, not 5.0.

These 4 things I mentioned would have been decided by AMD and been in production before he made his video. If he had any real sources or legitimate information he would not have gotten them so wrong.

I would jump into Nvidia threads if there were unrealistic expectations about ML or RT, or silicon-fab unicorns with expected lower costs at TSMC, or some wet dream of that sort, but I don’t have to 🤷‍♂️
Where are the stupid rumours on the Nvidia side? Can you even name an MLID equivalent on the Nvidia side who is expecting Jesus in GPU form?

I was on the AMD side all these years and saw stupid shit like the 970 memory debacle and so on. 4080 pricing is stupid; is it to clear Ampere cards? Maybe. I don’t care, it is what it is; it won’t fly off the shelves till a price drop, because that card is priced too close to the 4090 behemoth.

The cable connector had a lot of finger-pointing in the past month, a lot of trolling from the AMD side (“hur hur hur, I don’t want a fire”), and for what in the end? User error?

A bunch of tech YouTubers clickbaiting. A bunch of people falling for it. This forum’s knowledge of tech, and the engineering side of it, is weak. The shit I read in the PS5 cooler debate, eeessshhh.
 
Last edited:

winjer

Gold Member
Everyone is trying to have a wet dream, you mean, and an uninterrupted circle jerk. You seriously want to go through the list of nonsense rumours RDNA 3 had in the past year that fell flat on their face?

Let’s pick one of many, which btw MLID and his church even delete videos of after they’re proven to be bollocks.




The dies are not 3D stacked and there is one single compute die not multiple compute dies.

There is only 96MB of infinity cache not 512MB.

The memory controller is 384-bit and not 256-bit like he claimed with “very high confidence”

It uses PCIe 4.0, not 5.0.

These 4 things I mentioned would have been decided by AMD and been in production before he made his video. If he had any real sources or legitimate information he would not have gotten them so wrong.

I would jump into Nvidia threads if there were unrealistic expectations about ML or RT, or silicon-fab unicorns with expected lower costs at TSMC, or some wet dream of that sort, but I don’t have to 🤷‍♂️
Where are the stupid rumours on the Nvidia side? Can you even name an MLID equivalent on the Nvidia side who is expecting Jesus in GPU form?

I was on the AMD side all these years and saw stupid shit like the 970 memory debacle and so on. 4080 pricing is stupid; is it to clear Ampere cards? Maybe. I don’t care, it is what it is; it won’t fly off the shelves till a price drop, because that card is priced too close to the 4090 behemoth.

The cable connector had a lot of finger-pointing in the past month, a lot of trolling from the AMD side (“hur hur hur, I don’t want a fire”), and for what in the end? User error?

A bunch of tech YouTubers clickbaiting. A bunch of people falling for it. This forum’s knowledge of tech, and the engineering side of it, is weak. The shit I read in the PS5 cooler debate, eeessshhh.

You really think I'm defending AMD rumors and all that nonsense? I'm one of those who criticized those leakers with their huge expectations. Just go look at those threads.
Don't bother trying to bait me into a fanboy fight. I don't care about Nvidia, AMD, or Intel. I care about myself. And I'll criticize every one of them when they screw with consumers.
 

64bitmodels

Reverse groomer.
User error being the vast majority, with clear indication from the cables sent to him that they were not properly seated
User error like this doesn't show up with 8-pins. Even if he said in the video that all of these were caused by user error, it's pretty easy to deduce that the design is flawed, even if the failures weren't caused by the pins directly. You can't blame someone for dropping a carton full of eggs when the carton holding the eggs was made out of slippery ice that was giving the guy frostbite.
 
Last edited:

ToTTenTranz

Banned
Why would AMD stop making GPUs?


AMD isn't in the dGPU business to do charity for consumers who are loyal to Nvidia. They're in it to make money like everyone else.

If the general consumer market only regards AMD's dGPU offerings as a convenient tool to lower dGPU prices from Nvidia, then AMD will be stuck as the "low-end option" and never get the sales volume or profitability that would be expected from the high-performing company that they became.
If AMD doesn't get sufficient returns from the R&D needed to make new GPU chips and graphics cards, they'll eventually stop making them.


They're the undisputed leader in PC and console APUs, and it seems their datacenter GPUs are doing quite well too. AMD is now in a position where they don't really need to sell discrete PC GPUs to survive or even to prosper.
Though it's not like AMD would stop making GPUs altogether; they'd only stop making discrete GPUs for the PC consumer market. APUs with powerful iGPUs went past low-end dGPU performance territory with Rembrandt, and with Phoenix they're probably going up towards RTX 3050/RX 6500 levels. With chiplet tech and rising CPU/APU power delivery, we're probably going to see AMD's and Intel's iGPUs progressively creeping up and taking market share away from discrete GPUs.
 

HoofHearted

Member
That Nvidia / PCI-SIG did not fuck up the cable.

User error being the vast majority, with clear indication from the cables sent to him that they were not properly seated, and a possibility of FOD (foreign object debris) sprinkled in, as with anything manufactured.

All of Igor’s and JayZ’s speculation: out the window.



I mean just take 30 mins of your time to watch this.

This is why Gamers Nexus is a league apart from all other tech YouTubers.


I did watch the video - multiple times - my takeaways from the video:

Key direct quotes below:
  • All of the cables can fail
  • What makes them fail (in order of what he documented in the video):
    • Foreign Object Debris in the cable caused by improper manufacturing or scraping of the bumps within the connector combined with high current and/or poor connection
      • Pins can scrape on the bumps of connection and create debris that can heat up inside of the plastic by creating poor points of contact
    • Another point of failure
      • Extremely improper insertion by the user
        • This is where they show an example of what an "extreme improper insertion" looks like where the adapter is practically falling out of the connector
        • This is also shown at the beginning of the video
      • Even if it is user error, at some point, if the design is so bad that it encourages user error this often, then it is a combination of user error and design error
    • Extreme Improper Insertion in combination with a taut wire on one or more pins
What he called out in the video as debunked:
  • Weak solder joints
Other notable comments:
  • Over time, the copper will oxidize, raising the resistance (in the cable). Our Failure Analysis Lab contact wasn't particularly happy about the build quality of the cables; specifically, he was dissatisfied with their lack of durability
  • We notably were unable to re-create the failure (attempted in our previous video) with a partial connection of approximately 1mm sticking out, giving the user enough of a feeling that the adapter is connected properly even though it didn't fully click
  • These cables don't really click, which we think is part of the potential design oversight
Final review - Reasons for failure (in order):
  • Foreign Object Debris / Manufacturing Defects
  • Heat generated through a parallel high resistance conductive path caused by Foreign Object Debris
  • EXTREMELY bad user error
Just watched it again, and at no point during the video did he state that the root cause of this was unilaterally "user error being the vast majority"...

If anything, he tends to focus on the FOD as the primary root cause of the issue. Overall, the video simply provides additional information/details further outlining potential manufacturing problems with the cable, along with design issues/impacts that ultimately contribute to user error(s).
 

Buggy Loop

Member
Just watched it again, and at no point during the video did he state that the root cause of this was unilaterally "user error being the vast majority"... If anything, he tends to focus on the FOD as the primary root cause of the issue. Overall, the video simply provides additional information/details further outlining potential manufacturing problems with the cable, along with design issues/impacts that ultimately contribute to user error(s).

"However the failure in general is overwhelmingly uncommon based on the statistics that we have today and many of the failures are actually very easily avoidable, not all of them, but the vast majority are"

Which one could he be talking about? Oh, it must be the FOD one, the one that is clearly "avoidable": that one cable from a user.

"..that's as far as that particular analysis can go, a delaminating plating isn't good if that's what's happening here [talking about the sole FOD cable they have], but based on our other findings, we don't believe it's the cause of the melting or at least not the primary one... as we said there's one major last piece of the puzzle and that's the partial insertion of the connector.."

Then they proceed to make 3D animations of the connector badly inserted and angled, and they melt cables with only one method: the user-error one.

"... Foreign object debris which happens from insertion cycles and that's not.. we don't think that's the most common failure to be extremely clear about that.."



Also, another theory debunked: the double-split terminal pried apart as a cause of failure.

You must be the only one on the internet who interpreted his video as FOD being the primary root cause of the issue. Congratulations! Please send Steve an email to thank him for his investigative work, with a link to your post.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
User error like this doesn't show up with 8-pins. Even if he said in the video that all of these were caused by user error, it's pretty easy to deduce that the design is flawed, even if the failures weren't caused by the pins directly. You can't blame someone for dropping a carton full of eggs when the carton holding the eggs was made out of slippery ice that was giving the guy frostbite.
Agreed! Trying to prevent user errors should be part of the design goals you set. Also, it was not just user errors…
 

Buggy Loop

Member
Agreed! Trying to prevent user errors should be part of the design goals you set. Also, it was not just user errors…

Can we return to the real world for a moment? The guy you’re replying to is saying “user error like this doesn’t show up with 8-pins”.

That’s a really fucking bold claim in a world of fuckups of all kinds, with Molex, USB, AIO, 24-pin, CPU 8-pin, etc.

https://www.google.com/search?q=8+p...hUKEwjmzJ_Axrj7AhXEGDQIHZ8SB_4QrQIoBHoECCQQBQ


Should we go into higher failure rates that actually got swept under the rug as negligible?

https://www.pcworld.com/article/394099/ryzen-5000-failure-rates-we-reality-check-the-claims.html

  • Ryzen 5000 series fails at 2.9 percent.
  • Ryzen 3000 series fails at 3 percent.
  • ThreadRipper 3000 series fails at 2.5 percent.
For comparison, the company’s data on Intel chips:

  • Intel 9th-gen fails at 0.9 percent.
  • Intel 10th-gen fails at 1.2 percent.
Didn’t stop me from buying a 5600x and then a 5800x3d.

Now imagine: 3% for a small supplier. Here we are talking about 0.04% out of 125k cards sold.

What about those poor sobs who fuck up the pins on a CPU by not seating it correctly in the socket? Certainly more than 50 people over the span of all these sockets?
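For scale, the arithmetic behind those two rates can be sketched like this (a rough back-of-the-envelope check; the 125k cards sold and ~50 reported failures are the post's own loose figures, not official data):

```python
# Back-of-the-envelope comparison of the failure rates discussed above.
# Figures are the post's own rough numbers: ~50 melted-connector reports
# out of ~125,000 RTX 4090s sold, vs. a ~3% Ryzen 3000 return rate
# reported by one retailer.
cards_sold = 125_000
reported_failures = 50

connector_rate = reported_failures / cards_sold  # 0.0004, i.e. 0.04%
print(f"Connector failure rate: {connector_rate:.2%}")  # -> 0.04%

ryzen_rate = 0.03  # ~3% cited for Ryzen 3000 at the retailer above
print(f"Roughly {ryzen_rate / connector_rate:.0f}x lower than the ~3% CPU rate")  # -> 75x
```

Even taking the reported numbers at face value, the connector failure rate is roughly two orders of magnitude below return rates that are routinely treated as negligible.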

Where do we stop with idiot proofing?

Why aren’t we talking about actual design failures that have been sold for thousands of units and will fail catastrophically?

https://www.arctic.de/en/lf-service-kit

Not a fucking beep on the radar from tech YouTubers, because that doesn’t bring the clicks. It’s a fucking clown show. Igor and JayZ lost a lot of credibility here after Gamers Nexus dropped real research.
 