
How Nvidia Won Graphics Cards

gundalf

Member
I highly recommend watching this channel's other videos; pure gold if you are interested in how those small tech startups became the big companies they are today.

Also, on a side note: while this video was not about video game consoles or SGI, I think 3:33 was a missed opportunity to mention the Nintendo 64 and Nintendo's partnership with SGI. The Nintendo 64 was basically a slimmed-down SGI workstation, and they burned a lot of capital with SGI to make this happen, with (as we all know) lackluster success, which shaped Nintendo forever into the company they are today.
 

stranno

Member
How they won PC graphics cards, rather. Console graphics have mostly been AMD cards or APUs; I think they only developed the NV2A for the Xbox and the PlayStation 3's GPU.

That failed start was like: let's develop a Saturn graphics card, what could possibly go wrong?
 

stranno

Member
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.
Not "paying for quality" is how you get a console for $250-500 instead of $800 or $1,000.

Add Nintendo to the pack. They used ATI/AMD on GameCube, Wii and Wii U.
 

lils

Member
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.

OK, there's a lot to unpack here.

Sony and Microsoft want a system on a chip (SoC). That means the CPU and GPU in the same package, and only AMD can provide this.

Intel graphics would be godawful and Nvidia probably couldn't make a CPU at all, so there you go.

Also, consoles are low-budget gaming machines. Paying more for hardware would mean an increased cost to the consumer (no, they won't take a loss on the consoles; that ship sailed a long, long time ago).

Console users don't want the best graphics, they want a system that's just about workable for the minimum cost possible. Sony/Microsoft make their money by locking them into their walled-garden ecosystem so they can extract their online fees and a big slice of game sales.

In short, the way consoles work is to get the cheapest possible hardware into as many people's hands as possible, and an AMD SoC is the only way to do that.
 

Sentenza

Member
How they won PC graphics cards, rather. Console graphics have mostly been AMD cards or APUs; I think they only developed the NV2A for the Xbox and the PlayStation 3's GPU.
The profit margins of the console market were virtually irrelevant to Nvidia, which is why they left it to AMD, so I'm not sure what you are even trying to argue here.
 

stranno

Member
And what do they use on the Switch again?
Because it was the only decent SoC on the market (the alternatives were Rockchip or MediaTek, yeah... no) and the Switch was already built, since it was probably just the Nvidia Tegra X1 tablet that was cancelled a few months before the announcement of the Switch.
 

winjer

Gold Member
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.

Actually, it's the gamers' fault.
Most console gamers don't want to pay high prices for their consoles.
This gen the price went up to $500 and a lot of people started complaining.
Now imagine if the price were $1000. There would be a huge backlash.
You can bet that Sony and MS would not subsidize $500 on a console.

Consoles are a product of compromise. They are not the ultimate expression of performance.
They need to have a low cost, because most of the consumer base is very price sensitive.
They have a limited power budget that has to be shared between the CPU, GPU, memory, SSD/HDD, Blu-ray drive, etc.
They have limited die space for an SoC that includes the CPU, GPU and I/O.

nVidia has had one console with MS and one with Sony. Just one each.
Rumors say nVidia is a very difficult partner to work with, especially in such a price-sensitive market as console manufacturing.
AMD has already had several with Sony and MS. And Nintendo.
Who knows if the Switch will be the last Nintendo console for which nVidia does the SoC.
At the time, AMD had nothing that could fit inside the Switch's power budget.
Maxwell was so far ahead in terms of power efficiency that AMD was not an option for Nintendo and its handheld console.
But AMD has made great improvements in power efficiency and is now a viable option, as the Steam Deck shows.
 

Derktron

Banned
I mean, I've had them for 4 years and no issues so far with the games I want to play, whereas with AMD there were some games I did have problems with.
 

Derktron

Banned
Jensen is that you?

Maybe
 

ToTTenTranz

Banned
And what do they use on the Switch again?
An old and pretty crappy SoC from 2015 that Nvidia couldn't sell to anyone else after the failure that was the Pixel C, so they sold it to Nintendo for peanuts after two years of letting the stock gather dust in a warehouse.
IIRC Nvidia couldn't even get the Cortex-A53 cluster to work in that chip. There's a reason Nvidia stopped developing ULP chips and steered Tegra development toward automotive.


The Switch could have been a much more powerful and future-proof console, had they used e.g. a Snapdragon 835 radio-less derivative, or an in-house chip using a contemporary PowerVR architecture.
Don't think for a second that the old and failed Tegra X1 was the best choice for a handheld gaming console back in 2017. It was just the only sensible choice for the money Nintendo was willing to pay for its processing hardware.
 

Arioco

Member
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.


Yes, because when Sony went with Nvidia for the PS3 they delivered a super high-end GPU that humiliated the 360. Oh, wait! Actually, AMD's design for the Xbox 360 was a generation ahead a year before the PS3 launched, with unified shaders and the ability to do HDR and AA simultaneously. 🤷‍♂️
 

spyshagg

Should not be allowed to breed
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.

AMD is, so far, the only company in the world technologically capable of providing the levels of performance, wattage, and packaging needed to make the PS5/XSX for the price you can get them at the store. You would get a lot worse from an Intel/Nvidia derivative consuming the same 200 watts.

Thinking about Nvidia/AMD in terms of good/bad quality also simply means you were exposed to the immense marketing Nvidia invests in streamers, forums, and YouTube influencers. The money the two companies have at their disposal to invest in new products and marketing is also night and day in favor of Nvidia. You have to be impressed by AMD at this point.
 

ArtHands

Thinks buying more servers can fix a bad patch
How they won PC graphics cards, rather. Console graphics have mostly been AMD cards or APUs; I think they only developed the NV2A for the Xbox and the PlayStation 3's GPU.

That failed start was like: let's develop a Saturn graphics card, what could possibly go wrong?

The thread isn't wrong though, because he combined GPUs on PC and consoles.
 

01011001

Banned
Yes, because when Sony went with Nvidia for the PS3 they delivered a super high-end GPU that humiliated the 360. Oh, wait! Actually, AMD's design for the Xbox 360 was a generation ahead a year before the PS3 launched, with unified shaders and the ability to do HDR and AA simultaneously. 🤷‍♂️

To be fair, that was also Sony's fault, because at first they wanted to have two Cell chips in there, one for graphics and one for CPU tasks, until their developers basically told them how stupid that idea was and that the best they could do with that hardware was HD PS2 graphics.

Then they had to hastily get a GPU in there and went with whatever Nvidia could come up with in that short timespan.

So that was more of a planning issue, not so much an Nvidia issue.
ATI (AMD) and Microsoft worked together all along on the 360 and so had a fitting GPU in the pipeline for much longer, hence it was way better than what Nvidia came up with on short notice.
 

Arioco

Member
To be fair, that was also Sony's fault, because at first they wanted to have two Cell chips in there, one for graphics and one for CPU tasks, until their developers basically told them how stupid that idea was and that the best they could do with that hardware was HD PS2 graphics.

Then they had to hastily get a GPU in there and went with whatever Nvidia could come up with in that short timespan.

So that was more of a planning issue, not so much an Nvidia issue.
ATI (AMD) and Microsoft worked together all along on the 360 and so had a fitting GPU in the pipeline for much longer, hence it was way better than what Nvidia came up with on short notice.


That doesn't change the fact that a whole year after the 360 launched, Nvidia still didn't have anything like the GPU ATI had designed for the 360. Not even for PC. No matter how early Sony had asked Nvidia for a GPU for their new console, they were lagging behind ATI at that time. Their next gen of graphics cards did have unified shaders, but that was too late for the PS3 anyway, even though the PS3 launched a whole year after the 360, as I said.
 
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.
You need a history lesson. For the 8th generation, Nvidia's Kepler was inferior to the PS4's AMD GCN.

For the 7th generation, Nvidia was inferior until the 8xxx series in 2006; the 360 was already out and Nvidia hoodwinked Sony into buying the inferior 7xxx series for the PS3. Nvidia actively misled Sony by saying fixed pixel pipelines were better than unified ones (Xbox 360), and then Nvidia had a unified architecture the next year.

With the original Xbox, Nvidia did indeed provide the best graphics chip (minus memory bandwidth), but screwed MS by not letting them die-shrink the graphics chip, costing MS a lot of money on the Xbox, which was ultimately one factor in motivating them to launch the 360 sooner.

Basically, Nvidia has only been good to Nintendo (so far).
 

ToTTenTranz

Banned
So that was more of a planning issue, not so much an Nvidia issue.
ATI (AMD) and Microsoft worked together all along on the 360 and so had a fitting GPU in the pipeline for much longer, hence it was way better than what Nvidia came up with on short notice.
The X360 was developed in record time (close to 2 years?), hence the RRoD issue.
Microsoft picked up a wild GPU project from Bitboys and ArtX (both acquired by ATI/AMD at the time) that was eventually named R500, which wasn't great for PC because it probably couldn't scale very well due to the eDRAM requirements.


More likely, Nvidia just wasn't willing to sell Sony their state-of-the-art G8x architecture, as they wanted to claim PC Master Race superiority in their discrete GPU offerings. A PS3 with a G84 would have probably been a completely different beast.


Nvidia actively misled Sony by saying fixed pixel pipelines were better than unified ones (Xbox 360), and then Nvidia had a unified architecture the next year.
The G80 actually came out a couple of weeks before the PS3.
That's why the "timings" excuse doesn't really fit.
 

01011001

Banned
That doesn't change the fact that a whole year after the 360 launched, Nvidia still didn't have anything like the GPU ATI had designed for the 360. Not even for PC. No matter how early Sony had asked Nvidia for a GPU for their new console, they were lagging behind ATI at that time. Their next gen of graphics cards did have unified shaders, but that was too late for the PS3 anyway, even though the PS3 launched a whole year after the 360, as I said.

The PS3's GPU didn't only lack unified shaders, it was also less powerful in general, and I could imagine that might have had something to do with the short timeframe, combined with them not having a good low-power chip ready in time.
 

ZywyPL

Banned
More likely, Nvidia just wasn't willing to sell Sony their state-of-the-art G8x architecture, as they wanted to claim PC Master Race superiority in their discrete GPU offerings. A PS3 with a G84 would have probably been a completely different beast.

Sony initially wanted the Cell to render the graphics; they even planned for the PS3 to actually have two Cell processors, but it turned out that wasn't really up to the task (politely speaking) once the devs got their hands on the prototypes, so Sony turned to NV for a quick-turnaround solution, and NV gave them basically a GF7800 with half the ROPs. Besides, the 8000 GPU series wasn't even ready until half a year later.
 

ToTTenTranz

Banned
Sony initially wanted the Cell to render the graphics; they even planned for the PS3 to actually have two Cell processors, but it turned out that wasn't really up to the task (politely speaking) once the devs got their hands on the prototypes, so Sony turned to NV for a quick-turnaround solution, and NV gave them basically a GF7800 with half the ROPs. Besides, the 8000 GPU series wasn't even ready until half a year later.
No one's questioning whether or not Sony made a last-minute (or rather, last-year, considering the delay) change to the dual-Cell PS3, but the original GeForce 8800 GTX/GTS went into stores 3 days before the PS3 did in Japan, 9 days before the US and 4 months before Europe.
 
The Switch could have been a much more powerful and future-proof console, had they used e.g. a Snapdragon 835 radio-less derivative, or an in-house chip using a contemporary PowerVR architecture.
Don't think for a second that the old and failed Tegra X1 was the best choice for a handheld gaming console back in 2017. It was just the only sensible choice for the money Nintendo was willing to pay for its processing hardware.

No, the Switch could only have used Tegra X1. It was the only hardware available with a PC GPU at the time, a GPU that allowed all these ports.

- AMD killed ATI.
- nVidia delivered a superior product.

This video talks about the beginning, and no, AMD didn't "kill" ATI; "maybe" ATI did.
When they went inside AMD they had all the liberty to work, and they worked well. Just at some point (around GCN) they stopped delivering, leading to low sales, less money, crisis.
 

ethomaz

Banned
No, the Switch could only have used Tegra X1. It was the only hardware available with a PC GPU at the time, a GPU that allowed all these ports.



This video talks about the beginning, and no, AMD didn't "kill" ATI; "maybe" ATI did.
When they went inside AMD they had all the liberty to work, and they worked well. Just at some point (around GCN) they stopped delivering, leading to low sales, less money, crisis.
Actually, after ATI became AMD, they never had a flagship GPU again.

That says a lot.

AMD indeed killed ATI.
 

ethomaz

Banned
Don't know what you mean by flagship GPU, but if you are talking about the High end, they had several.
A GPU that outperforms the competition.
No. They didn’t.

AMD killed ATI in 2010, and that was around the last time AMD/ATI could claim to have a better GPU than nVidia.

It's been over a decade of nVidia dominance already.
 

ToTTenTranz

Banned
No, the Switch could only have used Tegra X1. It was the only hardware available with a PC GPU at the time, a GPU that allowed all these ports.
I don't know what you mean by "PC GPU".
Starting with the PowerVR Series 7 Plus (2016 iPhones) and the Adreno 500 series (2017 Androids like the Pixel 2 and OnePlus 5), all GPUs from PowerVR and Qualcomm are fully compliant with Vulkan 1.0/1.1, OpenGL 3.3 and DirectX 12 with FL11_1 / Shader Model 5.1.


They're both well above the level of feature compliance needed for the PC ports you mention. The devs probably have a lot more trouble compiling the original x86-64 code for the ARMv8 ISA than they'd ever have adapting to either of these GPUs.
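For illustration only (my own sketch, not from the thread): in practice that feature-compliance claim boils down to which Vulkan version the driver reports. A minimal C example that enumerates the devices the Vulkan loader sees and prints the version each one claims; the Vulkan calls are standard, the device cap and output format are arbitrary.

/* Minimal sketch (illustrative assumption): list physical devices visible to
 * the Vulkan loader and print the Vulkan version each one reports, e.g. 1.0
 * or 1.1 on the mobile GPUs mentioned above. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = {0};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;   /* request the baseline version */

    VkInstanceCreateInfo ci = {0};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "No Vulkan driver available\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice devices[16];
    if (count > 16) count = 16;            /* arbitrary cap for the sketch */
    vkEnumeratePhysicalDevices(instance, &count, devices);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devices[i], &props);
        printf("%s reports Vulkan %u.%u\n", props.deviceName,
               VK_VERSION_MAJOR(props.apiVersion),
               VK_VERSION_MINOR(props.apiVersion));
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}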
 

winjer

Gold Member
A GPU that outperforms the competition.
No. They didn’t.

They did.
The Radeon HD 7970 is much better than the GeForce GTX 680.
The start was a bit shaky for AMD, with a new arch and new drivers.
But after a while, the 7970 just beat the 680 completely, especially when games started using low-level APIs.

After a while, the R9 290X was slightly faster than the 780 Ti.
Its only problem was the asinine idea of not allowing AIBs to use custom coolers for the first 6 months.
This gave the card a reputation for being too hot, when in fact it consumed the same power as the 780 Ti.

The Radeon X1000 range of cards was faster than the GeForce 7000 series, especially in games with heavier pixel shader loads, where it destroyed the GeForce.

For most of its life, AMD/ATI was very competitive with nVidia.
The big downfall came with Maxwell. With that gen, nVidia got a huge performance and efficiency boost.
On the other hand, AMD bet everything on low-level APIs like DX12, Vulkan and Mantle.
But those APIs were adopted slowly, and Radeon GPUs suffered a lot under DX11 and OpenGL.
 

ethomaz

Banned
They did.
The Radeon HD 7970 is much better than the GeForce GTX 680.
The start was a bit shaky for AMD, with a new arch and new drivers.
But after a while, the 7970 just beat the 680 completely, especially when games started using low-level APIs.

After a while, the R9 290X was slightly faster than the 780 Ti.
Its only problem was the asinine idea of not allowing AIBs to use custom coolers for the first 6 months.
This gave the card a reputation for being too hot, when in fact it consumed the same power as the 780 Ti.

The Radeon X1000 range of cards was faster than the GeForce 7000 series, especially in games with heavier pixel shader loads, where it destroyed the GeForce.

For most of its life, AMD/ATI was very competitive with nVidia.
The big downfall came with Maxwell. With that gen, nVidia got a huge performance and efficiency boost.
On the other hand, AMD bet everything on low-level APIs like DX12, Vulkan and Mantle.
But those APIs were adopted slowly, and Radeon GPUs suffered a lot under DX11 and OpenGL.
What?

The fact that the HD 7970 never beat the GTX 680 completely (the opposite, to be fair) says a lot about why you think AMD delivered in the GPU industry.


[Relative performance charts at launch: perfrel.gif, perfrel_1920.gif, perfrel_2560.gif]
 

winjer

Gold Member
What?

The fact that the HD 7970 never beat the GTX 680 completely (the opposite, to be fair) says a lot about why you think AMD delivered in the GPU industry.

You've got graphs that show performance at the launch of the 680, two months after the launch of the 7970.
But if you recall, I said that at the start of that gen the Radeon fared not so well, because it was a completely new arch that targeted low-level APIs.
Now compare with games released a bit later, and newer drivers, and the tables have turned, by a huge margin.
But consider that even in your graphs, the 7970 only loses by 4-7%. That's a small difference.

Here's a test with plenty of games.

The RX 570 is the fastest of the four tested cards followed by the GTX 1050 Ti, then by the HD 7970, and finally by the GTX 680 in distant last place. Both of these 7 year old former flagship cards fall behind the newer cards, and most of the demanding newer games cannot be well-played even at 1920×1080 without dropping down well below Ultra settings.
In most games, the HD 7970 has aged better and can still handle higher settings with better framerates than the GTX 680 which is crippled by its limited 2GB vRAM framebuffer.
 

ethomaz

Banned
You've got graphs that show performance at the launch of the 680, two months after the launch of the 7970.
But if you recall, I said that at the start of that gen the Radeon fared not so well, because it was a completely new arch that targeted low-level APIs.
Now compare with games released a bit later, and newer drivers, and the tables have turned, by a huge margin.
But consider that even in your graphs, the 7970 only loses by 4-7%. That's a small difference.

Here's a test with plenty of games.
It doesn't need to be at launch lol
The HD 7970 did not outperform the GTX 680 even years later, except in games at resolutions that required more than 2GB of VRAM.
 

winjer

Gold Member
It doesn't need to be at launch lol
The HD 7970 did not outperform the GTX 680 even years later, except in games at resolutions that required more than 2GB of VRAM.

Even with tons of games showing the 7970 winning, you insist the 680 was the best.
The 680 was only the best at launch, by a small margin.

And then there is also the 290X vs 780Ti
And the X1000 series.

Also, the 4870 and the 5870, although not as fast as the GeForce, were competitive, and much more power efficient than the 200 and 400 series.
 

ethomaz

Banned
Even with tons of games showing the 7970 winning, you insist the 680 was the best.
The 680 was only the best at launch, by a small margin.

And then there is also the 290X vs 780Ti
And the X1000 series.

Also, the 4870 and the 5870, although not as fast as the GeForce, were competitive, and much more power efficient than the 200 and 400 series.
Of course, there are these others lol
The last time AMD was ahead of nVidia was when nVidia used Fermi.

After that, as demonstrated with the GTX 680 vs the HD 7970, nVidia was the top dog.

Coincidentally, after AMD killed ATI.
 

winjer

Gold Member
Of course, there are these others lol
The last time AMD was ahead of nVidia was when nVidia used Fermi.

After that, as demonstrated with the GTX 680 vs the HD 7970, nVidia was the top dog.

Coincidentally, after AMD killed ATI.

Only after Maxwell was nVidia top dog.
During Kepler, AMD was the better of the two.

And still, there are several examples showing that your statement, that after AMD purchased ATI the company never had flagship products, is wrong.
So don't try to move the goalposts from 2006 to 2012.
 

spyshagg

Should not be allowed to breed
Actually, after ATI became AMD, they never had a flagship GPU again.

That says a lot.

AMD indeed killed ATI.

You forgot about the R9 290X. It beat the Titan when it was released, becoming the fastest GPU in the world at the time.

Nvidia responded with the 780 Ti, which, although a bit faster than the 290X, eventually lost the battle as the years went by and the 290X matured.
 

Laptop1991

Member
They won because 3DFX made a mistake betting on motion blur while Nvidia went with transform and lighting; then Nvidia bought 3DFX and brought motion blur back, and I still turn it off to this day.

As for ATI/AMD, they did win with the Radeon 9800 series back in 2003, but since then their drivers and cards mostly weren't as good, until now IMO. I would say it's good they are competing with Nvidia again, but the prices for new cards are ridiculous now.
 
OK, there's a lot to unpack here.

Sony and Microsoft want a system on a chip (SoC). That means the CPU and GPU in the same package, and only AMD can provide this.

Intel graphics would be godawful and Nvidia probably couldn't make a CPU at all, so there you go.

Also, consoles are low-budget gaming machines. Paying more for hardware would mean an increased cost to the consumer (no, they won't take a loss on the consoles; that ship sailed a long, long time ago).

Console users don't want the best graphics, they want a system that's just about workable for the minimum cost possible. Sony/Microsoft make their money by locking them into their walled-garden ecosystem so they can extract their online fees and a big slice of game sales.

In short, the way consoles work is to get the cheapest possible hardware into as many people's hands as possible, and an AMD SoC is the only way to do that.
Explain Switch now.
 

lils

Member
Explain Switch now.

Of course. The Switch has an ARM CPU, meaning it's more difficult to port games over because it does not use the x86 architecture that is standard in the industry, so the Switch doesn't have as many ports.

Also, its CPU is a piece of garbage at barely over 1GHz.
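For illustration only (my own sketch, not a claim about any particular port): where a codebase leans on ISA-specific intrinsics, moving from x86-64 to ARMv8 means writing and validating a second code path, which is one concrete reason ARM ports cost extra effort. A minimal C example of the same 4-float add written for both ISAs, selected by the compiler's predefined architecture macros:

/* Minimal sketch (illustrative assumption): the same 4-float add written once
 * with x86-64 SSE intrinsics and once with ARMv8 NEON intrinsics, chosen via
 * the compiler's predefined architecture macros. */
#include <stdio.h>

#if defined(__x86_64__) || defined(_M_X64)
#include <xmmintrin.h>
static void add4(const float *a, const float *b, float *out) {
    _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
}
#elif defined(__aarch64__) || defined(_M_ARM64)
#include <arm_neon.h>
static void add4(const float *a, const float *b, float *out) {
    vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
}
#else
static void add4(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; ++i) out[i] = a[i] + b[i];   /* portable fallback */
}
#endif

int main(void) {
    const float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40};
    float r[4];
    add4(a, b, r);
    printf("%g %g %g %g\n", r[0], r[1], r[2], r[3]);
    return 0;
}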
 