
How Nvidia Won Graphics Cards

BennyBlanco

aka IMurRIVAL69
Thinking about Nvidia/AMD in terms of good/bad quality also simply means you were exposed to the immense marketing Nvidia invests in streamers, forums, and YouTube influencers. The money the two companies have at their disposal to invest in new products and marketing is also night and day in Nvidia's favor. You have to be impressed by AMD at this point.

Yeah, everyone who likes Nvidia products is a paid shill.
 

T8SC

Member
They won because 3DFX made the mistake of betting on motion blur while Nvidia bet on transform and lighting. Then Nvidia bought 3DFX and brought motion blur back, and I still turn it off to this day.

3Dfx's mistake was trying to produce the chips along with the PCBs themselves from the Voodoo 3 onwards. The earlier Voodoo products were made by the likes of Hercules & Creative, with the graphics chip sourced from 3Dfx.


The delays to the "Rampage" project also hindered them further. They struggled to get the Voodoo 4 & 5 to market, and the two ended up arriving almost at the same time; because of the delays they were already behind the GeForce 256, having been aimed at the TNT2 era.

Rampage was meant to compete with the GeForce 256/GeForce 2, but because of the delays 3Dfx decided to keep bolting more and more VSA-100 chips together and rely on brute force until Rampage was ready. The Voodoo 5 6000 never made it to market, though, due to many problems with drivers and the like.

[image: 3dfx Voodoo 5 6000]


The loss of revenue eventually caught up with them and they declared bankruptcy, with Nvidia buying the majority of their assets; a lot of that technology went into the GeForce FX series cards.

The demise of 3Dfx left only ATI, Nvidia & Matrox as the main players, with Matrox quitting the "gaming" market after their failed Parhelia range of cards.
 

Laptop1991

Member
3Dfx's mistake was trying to produce the chips along with the PCBs themselves from the Voodoo 3 onwards. […] The demise of 3Dfx left only ATI, Nvidia & Matrox as the main players, with Matrox quitting the "gaming" market after their failed Parhelia range of cards.

I had 5 Voodoos, and 2 of them were Voodoo 3s that ran games right up until Shader Model 2 was introduced in 2004; I also bought a GeForce 1 at Xmas '99, as I had a couple of PCs back then. While all of that may be true in the background, the average PC gamer wouldn't have known about it until after the fact and bought cards on their features, and 3DFX made a mistake with motion blur at the time while Nvidia's features were more impressive. I remember the articles in PC Zone and I switched. And yeah, the Voodoo 5 was never released, and by then Voodoo had lost its customer base to Nvidia.
 
Actually, after ATI became AMD they never had a flagship GPU again.

That says a lot.

AMD indeed killed ATI.

Who developed those GPUs?
AMD's CPU side that worked on CPUs, or the people who came from ATI who actually developed the GPUs?
Disgusting nostalgia.
 
[a flagship GPU is] A GPU that outperforms the competition.
No, they didn't.

"flagship" generally just means the company's most important or top-of-the-line product. it doesnt need to be the best in the market.
("flagship" refers to the most important ship in a fleet, or the ship that carries the captain)

for example, only considering consumer-grade GPUs, which is nvidia's flagship card? the 3080ti or the 3090?
arguably, the 3090 is just for bragging rights... a boutique item, if you will; with the 3080ti being the card that gamers will be actually interested in... so the 3080ti actually matters more.
nvidia may agree: the 3090 was released first, buy they even use the term "flagship" in a promo for the 3080ti when it released (link).
 

Kenpachii

Member
This is why Nvidia won the war:

1) Drivers
2) Drivers
3) Drivers
4) Goes to developers to make sure the titles people want to play work on their hardware, which results in day-one drivers and a consistently stable, good experience.
5) The user experience is great and stable and the card performs well, because drivers.
6) Constantly pushing technology advancements that AMD can't compete with and that are genuinely useful: PhysX, tessellation, NVENC, RTX Voice, ShadowPlay, 3D, raytracing, DLSS, AI.
7) It just works.

What does AMD do, however?

1) Make comparable hardware (sometimes worse, sometimes better)
2) Total silence
3) Say it's everybody else's fault but theirs
4) Always late
5) Inferior user experience
6) Time for the next card announcement, everything is great guys
7) Repeat.

AMD's and ATI's biggest enemies are themselves, and their latest Far Cry 6 and Valhalla debacle, where they clearly paid to nuke Ampere performance because they simply can't keep up otherwise, isn't helping their image even remotely.

As I said back in the day, 10 years ago, and it still holds up today:

I'd rather pay 50 bucks more and get 10% less performance if that means I avoid AMD's drivers and get Nvidia's instead. It's just sad.

However, for consoles and CPUs their hardware works well. Why? You get rid of their dog-shit driver teams and support teams with their victim complex. The hardware was never the issue; it's the shit-tier company itself that simply doesn't give a rat's ass about support outside the launch window of its new product.

Anyway, funnily enough, I've owned all the cards he talked about in the video.
 

01011001

Banned
AMD is, so far, the only company in the world technologically capable of providing the levels of performance, wattage, and packaging needed to make the PS5/XSX for the price you can get them at the store. You would get a lot worse from an Intel/Nvidia derivative consuming the same 200 watts.

Thinking about Nvidia/AMD in terms of good/bad quality also simply means you were exposed to the immense marketing Nvidia invests in streamers, forums, and YouTube influencers. The money the two companies have at their disposal to invest in new products and marketing is also night and day in Nvidia's favor. You have to be impressed by AMD at this point.

That is simply not true. There are gaming laptops that draw about as much power as the new consoles, maybe slightly more, while having Nvidia GPUs that are more powerful than theirs. And that's with a full-on 144Hz screen, on top of the GPU and CPU, that also draws power.

Also, you don't need to see any marketing from Nvidia to know they've had the better graphics cards on the market for a long time now. AMD is catching up slowly, but they are still not there, and DLSS has only made it harder for AMD to catch up.

AMD is a full generation behind in terms of raytracing performance, and they don't have any tech like DLSS to soften the performance impact raytracing has... and no, FSR is not even remotely comparable, as it is actually worse than using TAAU in UE4 games and the similar TAA upsampling solutions other games have. Even checkerboarding or Capcom's interlacing options on PC are often better than FSR, even in Ultra Quality mode.

Just because Nvidia has more money to throw around doesn't mean I have to pity-buy an AMD product because poor AMD doesn't have the PR money to spend, btw... so that point is also not really of interest to a consumer.


If AMD can keep up the per-generation performance gains they've had in the past few years, then maybe... MAYBE they will catch up to Nvidia soon. But that also depends on what Nvidia's next GPU lineup looks like.
 

peronmls

Member
I can't be the only one who saw AMD commercials back in the day. Also, haven't most console GPU chips been AMD anyway since the Wii?
 

spyshagg

Should not be allowed to breed
Yeah, everyone who likes Nvidia products is a paid shill.

No wonder discussions start out of thin air. People read what they want.

Look here, I implied that the "shills" are the ones who believe AMD to be of poor quality. Take your bait elsewhere.
 

spyshagg

Should not be allowed to breed
That is simply not true. There are gaming laptops that draw about as much power as the new consoles, maybe slightly more, while having Nvidia GPUs that are more powerful than theirs. […]


You will have to deal with the truth.


AMD was the only company with the technology to deliver the performance seen in the PS5/XSX in a single SoC under 200W at the prices Sony/MS were willing to pay.


Try to understand why:


- An Intel/Nvidia solution would be more expensive (three chips instead of one: CPU + GPU + chipset) from two different nodes (Intel + Samsung).

- No Infinity Fabric equivalent to connect all three. A new interconnect would be needed. More expense.

- Nvidia on Samsung's node is not ahead of AMD on TSMC's in perf/watt this generation; it's behind!

- Intel's node is not ahead of AMD + TSMC in perf/watt on the CPU side.

- At the suggested 200W budget for the consoles, Intel + Nvidia would deliver less rasterization and processing capability than what you got with AMD.



I, and everyone here, will grant you the raytracing side of the conversation.


But on AMD the raytracing silicon occupies an insignificant portion of the chip, while on Nvidia you have to give up rasterization performance to accommodate the dedicated raytracing cores, which sit idle whenever they aren't being used.
 

spyshagg

Should not be allowed to breed
"flagship" generally just means the company's most important or top-of-the-line product. it doesnt need to be the best in the market.
("flagship" refers to the most important ship in a fleet, or the ship that carries the captain)

for example, only considering consumer-grade GPUs, which is nvidia's flagship card? the 3080ti or the 3090?
arguably, the 3090 is just for bragging rights... a boutique item, if you will; with the 3080ti being the card that gamers will be actually interested in... so the 3080ti actually matters more.
nvidia may agree: the 3090 was released first, buy they even use the term "flagship" in a promo for the 3080ti when it released (link).

He is indeed wrong. AMD did have both a "flagship" and the fastest single GPU on the market under the AMD brand: the R9 290X.

And it came in at $450 less than the previous champion, the $1000 Titan.

 

ZywyPL

Banned
No one's questioning whether Sony made a last-minute (or rather, last-year, considering the delay) change to the dual-Cell PS3, but the original GeForce 8800 GTX/GTS hit stores 3 days before the PS3 did in Japan, 9 days before the US launch and 4 months before Europe.

So it wasn't ready for the PS3 launch anyway, as the PS3 prototypes had to be completed long before launch so that devs could start preparing the launch games; hence the old, already proven GeForce that could be dropped in with no extra time needed.
 
I don't understand the hate for Nvidia that so many have. Yes, I had to overpay for a 3060, but the pandemic, the chip shortage and the cryptocurrency bullshit are not their fault.
 

winjer

Gold Member
The G80 actually came out a couple of weeks before the PS3.
That's why the "timings" excuse doesn't really fit.

That's a very interesting "what if" scenario. The PS3 with a G80 and full DX10 support would have been an impressive console for the time.
But I guess the main reason was price. The PS3 was already losing Sony a lot of money because of low yields on the Cell CPU and the Blu-ray drives.
Adding the newest and best GPU on the market would have made the PS3 too expensive.
 

ZywyPL

Banned
That's a very interesting "what if" scenario. The PS3 with a G80 and full DX10 support would have been an impressive console for the time.
But I guess the main reason was price. The PS3 was already losing Sony a lot of money because of low yields on the Cell CPU and the Blu-ray drives.
Adding the newest and best GPU on the market would have made the PS3 too expensive.

That would make the PS3...

NINE HUNDRED AND NINETY NINE US DOLLARS!!!


But like I said in my previous post, the devs needed the hardware much sooner than that to start working on the games already.
 

winjer

Gold Member
But like I said in my previous post, the devs needed the hardware much sooner than that to start working on the games already.

There's some truth to that.
But I bet that if Sony could have had a G80 in the PS3 at the same price as the G71, they would rather have taken the G80.
Devs be damned. They could have had a much more advanced and powerful GPU.
 