> How they won PC graphics cards, since console graphics have been mostly AMD cards or APUs.
Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.
Not "paying the quality" is how do your get a console for $250-500. Instead of 800 or 1000.Blame Microsoft and Sony being cheap skates over not wanting to pay for quality.
Jensen, is that you?
The mass market isn't going to pay $1,500 (more in line with what it would actually cost) for a console with a "quality" GPU. It was a good decision.
The profit margins of the console market were virtually irrelevant to Nvidia, which is why they left it to AMD, so I'm not sure what you are even trying to argue here.
Not "paying the quality" is how do your get a console for $250-500. Instead of 800 or 1000.
Add Nintendo to the pack. They used ATI/AMD on GameCube, Wii and Wii U.
> And what do they use on the Switch again?
Because it was the only decent SoC on the market (the alternatives were Rockchip or MTK, yeah... no), and the Switch was already built, since it was probably just the Nvidia X1 tablet that was cancelled a few months before the announcement of the Switch.
An old and pretty crappy SoC from 2015 that Nvidia couldn't sell to anyone else after the failure that was the Pixel C, so they sold it to Nintendo for peanuts after two years of letting all the stock dry up in a warehouse.
How they won PC graphics cards, since console graphics have been mostly AMD cards or APUs. I think they only developed the NV2A for the Xbox and the PlayStation 3 graphics.
That failed start was like: let's develop a Saturn graphics card, what could possibly go wrong.
Yes, because when Sony went with Nvidia for the PS3, they delivered a super high-end GPU that humiliated the 360. Oh, wait! Actually, ATI's design for the Xbox 360 was a generation ahead a year before the PS3 launched, with unified shaders and the ability to do HDR and AA simultaneously.
To be fair, that was also Sony's fault, because at first they wanted to have two Cell chips in there, one for graphics and one for CPU tasks, until their developers basically told them how stupid that idea was and that the best they could do with that hardware was HD PS2 graphics.
Then they had to hastily get a GPU in there and went with whatever Nvidia could come up with in that short timespan.
So that was more of a planning issue, not so much an Nvidia issue.
ATI (AMD) and Microsoft worked together on the 360 all along and so had a fitting GPU in the pipeline for much longer; hence it was way better than what Nvidia came up with on short notice.
> Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.
You need a history lesson. For the 8th generation, Nvidia's Kepler was inferior to the PS4's AMD GCN.
> So that was more of a planning issue, not so much an Nvidia issue.
The X360 was developed in record time (close to 2 years?), hence the RRoD issue.
> Nvidia actively misled Sony by saying fixed pixel pipelines were better than unified (Xbox 360), and then Nvidia had a unified architecture the next year.
The G80 actually came out a couple of weeks before the PS3.
> Blame Microsoft and Sony being cheapskates over not wanting to pay for quality.
Actually, it's because...
> Actually, it's the gamers' fault.
...this.
That doesn't change the fact that a whole year after the 360 launched, Nvidia still didn't have anything like the GPU ATI had designed for the 360, not even for PC. No matter how early Sony had asked Nvidia for a GPU for their new console, Nvidia was lagging behind ATI at that time. Their next generation of graphics cards did have unified shaders, but that was too late for the PS3 anyway, even though the PS3 launched a whole year after the 360, as I said.
More likely, Nvidia just wasn't willing to sell Sony their state-of-the-art G8x architecture, as they wanted to claim PC Master Race superiority in their discrete GPU offerings. A PS3 with a G84 would have probably been a completely different beast.
> Sony initially wanted the Cell to render the graphics. They even planned for the PS3 to have two Cell processors, but it turned out the Cell wasn't really up to the task (politely speaking) once the devs got their hands on the prototypes, so Sony turned to Nvidia for a quick turnaround solution, and Nvidia gave them basically a GeForce 7800 with half the ROPs. Besides, the 8000 GPU series wasn't even ready until half a year later.
No one's questioning whether or not Sony made a last-minute (or rather, last-year, considering the delay) change to the dual-Cell PS3, but the original GeForce 8800 GTX/GTS went into stores 3 days before the PS3 did in Japan, 9 days before the US launch and 4 months before Europe.
The Switch could have been a much more powerful and future-proof console, had they used e.g. a Snapdragon 835 radio-less derivative, or an in-house chip using a contemporary PowerVR architecture.
Don't think for a second that the old and failed Tegra X1 was the best choice for a handheld gaming console back in 2017. It was just the only sensible choice for the money Nintendo was willing to spend on its processing hardware.
- AMD killed ATI.
- Nvidia delivered a superior product.
No, the Switch could only have used the Tegra X1. It was the only hardware available with a PC GPU at the time, a GPU that allowed all these ports.
This video talks about the beginning, and no, AMD didn't "kill" ATI; "maybe" ATI killed itself.
When they went inside AMD they had all the liberty to work, and they worked well. It's just that at some point (around GCN) they stopped delivering, leading to low sales, less money, crisis.
Actually, after ATI became AMD, they never had a flagship GPU again.
That says a lot.
AMD indeed killed ATI.
> Don't know what you mean by flagship GPU, but if you are talking about the high end, they had several.
A GPU that outperforms the competition.
> No, the Switch could only have used the Tegra X1. It was the only hardware available with a PC GPU at the time, a GPU that allowed all these ports.
I don't know what you mean by "PC GPU".
No. They didn’t.
What? They did.
The Radeon 7970 is much better than the Geforce 680.
The start was a bit shaky for AMD, it being a new arch with new drivers.
But after a while, the 7970 just beat the 680 completely, especially once games started using low-level APIs.
After a while, the R9 290X was slightly faster than the 780 Ti.
Its only problem was the asinine idea of not allowing AIBs to use custom coolers for the first 6 months.
This gave the card a reputation for running too hot, when in fact it consumed the same power as the 780 Ti.
The Radeon X1000 range of cards was faster than the GeForce 7000 series, especially in games with heavier pixel shader loads, where it destroyed the GeForce.
For most of its life, AMD/ATI was very competitive with Nvidia.
The big downfall came with Maxwell. With that gen, Nvidia got a huge performance and efficiency boost.
On the other hand, AMD bet everything on low-level APIs like DX12, Vulkan and Mantle.
But those APIs were adopted slowly, and Radeon GPUs suffered a lot under DX11 and OpenGL.
The fact that the HD 7970 never beat the GTX 680 completely (the opposite, to be fair) says a lot about why you think AMD delivered in the GPU industry.
The RX 570 is the fastest of the four tested cards followed by the GTX 1050 Ti, then by the HD 7970, and finally by the GTX 680 in distant last place. Both of these 7 year old former flagship cards fall behind the newer cards, and most of the demanding newer games cannot be well-played even at 1920×1080 without dropping down well below Ultra settings.
In most games, the HD 7970 has aged better and can still handle higher settings with better framerates than the GTX 680 which is crippled by its limited 2GB vRAM framebuffer.
You got two graphs that show performance from the launch of the 680, two months after the launch of the 7970.
But if you recall, I said that at the start of that gen the Radeon fared not so well, because it was a completely new arch that targeted low-level APIs.
Now compare games released a bit later, with newer drivers, and the tables have turned, by a huge margin.
But consider that even in your graphs, the 7970 only loses by 4-7%. That's a small difference.
Here's a test with plenty of games.
The HD 7970 vs. the GTX 680 – revisited after 7 years – babeltechreviews.com
It doesn’t need to be at launch lol
The HD 7970 did not outperform the GTX 680 even years later, except in games at resolutions that required more than 2GB of VRAM.
Even with tons of games showing the 7970 winning, you insist the 680 was the best.
The 680 was only the best at launch, by a small margin.
And then there is also the 290X vs. the 780 Ti.
And the X1000 series.
Also, the 4870 and the 5870: although not as fast as the GeForce cards, they were competitive, and much more power efficient than the 200 and 400 series.
Of course, there are these others lol
The last time AMD was ahead of Nvidia was when Nvidia used Fermi.
After that, as demonstrated with the GTX 680 vs. the HD 7970, Nvidia was the top dog.
Coincidentally, right after AMD killed ATI.
OK, there's a lot to unpack here.
Sony and Microsoft want a system on a chip (SoC); that means the CPU and GPU in the same package. Only AMD can provide this.
Intel graphics would be god-awful and Nvidia probably couldn't make a CPU at all, so there you go.
Also, consoles are low-budget gaming machines. Paying more for hardware would mean an increased cost to the consumer (no, they won't take a loss on the consoles; that ship sailed a long, long time ago).
Console users don't want the best graphics, they want a system that's just about workable for the minimum cost possible. Sony/Microsoft make their money by locking them into their walled-garden ecosystem so they can extract their online fees and a big slice of game sales.
In short, the way consoles work is to get the cheapest possible hardware into as many people's hands as possible, and an AMD SoC is the only way to do that.
> Explain Switch now.
Switch is ARM, not x86.