
12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

kraspkibble

Permabanned.
Pragmatically, NV would not do this if there were no need.
Most likely the top-end cards on Samsung 8nm would have peak power consumption way above what normal connectors are guaranteed to deliver.

So far in the leaks, the biggest card was rated at "only" 320W.


Oh boy.

I think a 12-pin connector would be more for cards like the Titan or Quadro. There's really no need for gaming GPUs to use that much power. Nvidia would be royally screwing over a lot of people if they were asking for silly money for a new GPU and requiring them to replace their PSU.
 

llien

Member
I think a 12-pin connector would be more for cards like the Titan or Quadro. There's really no need for gaming GPUs to use that much power. Nvidia would be royally screwing over a lot of people if they were asking for silly money for a new GPU and requiring them to replace their PSU.
Welp, is asking people who are into $1k+ GPUs to get a PSU upgrade really too much?
 


GHG

Member
Welp, is asking people who are into $1k+ GPUs to get a PSU upgrade really too much?


It's not about the money, it's a fucking ball-ache.

I always feel like I'm gonna break something when removing the 24-pin from a motherboard. Then there's the cable management... Urgh.

Ideally I only ever want to do all that shit once when I build a PC, then I can just add hard drives, RAM and change the graphics card if and when I need to.
 

M1chl

Currently Gif and Meme Champion
It's probably much more about the power distribution on the card than the power demand, at least I hope.
 

GHG

Member
I have this PSU. Will it be sufficient?

It's not so much about the overall power requirements; the main concern seems to be around this new 12-pin connector that doesn't exist on any current PSUs. There's no cable that will come directly out of your PSU (modular or otherwise) that fits this new spec.

We need to see if an adapter/converter is possible and what that looks like.
 
Last edited:

BluRayHiDef

Banned
It's not so much about the overall power requirements; the main concern seems to be around this new 12-pin connector that doesn't exist on any current PSUs. There's no cable that will come directly out of your PSU (modular or otherwise) that fits this new spec.

We need to see if an adapter/converter is possible and what that looks like.
Would two 6-pin connectors not suffice?
 

GHG

Member

kraspkibble

Permabanned.
Welp, is asking people who are into $1k+ GPUs to get a PSU upgrade really too much?
Well no, but that's not the point, is it?

I don't understand why people think this would require new PSUs... there would have to be a connector in the box
Yeah, fuck that. If the PSU isn't designed to deliver that much power then I'm not risking frying my entire PC.

If it can be powered with the cables that come with the PSU, then what's the point in making any changes?
 
Last edited:

llien

Member
A100 is 400W, so there's 0% chance the consumer cards will have that much power consumption, let alone much higher. Top-end Ampere will be 250-300W, as always.

Uh, A100 is TSMC 7nm; the rest is supposed to be Samsung 8nm.
The leaked card is shown with a TDP of 320W, and we don't even know if it's the fastest to come.


Nvidia wouldn't release a card that doesn't fit 99% of current PSUs.
Nvidia wouldn't introduce a new connector if there was no need, either.
xx80Ti buyers are anyhow less than 1% of the consumers (about 8% of the revenue, though).
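For context, here's what the existing connectors are guaranteed to deliver on paper versus that 320W leak (a quick back-of-the-envelope check; the wattages are the standard PCIe spec figures, and 320W is just the leaked TDP):

```python
# What the PCIe spec guarantees on paper, vs. the leaked 320 W TDP.
SLOT = 75        # W from the PCIe slot itself
EIGHT_PIN = 150  # W per 8-pin plug (a 6-pin is 75 W)

budget_2x8 = SLOT + 2 * EIGHT_PIN  # 375 W total on a 2x 8-pin card
leaked_tdp = 320
print(f"Spec budget: {budget_2x8} W, leaked TDP: {leaked_tdp} W, "
      f"headroom: {budget_2x8 - leaked_tdp} W")  # only 55 W to spare
```

55W of headroom sounds fine until you remember cards spike well above their rated TDP for milliseconds at a time, which is exactly the kind of peak the connectors have to be guaranteed for.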
 

PhoenixTank

Member
If it can be powered with the cables that come with the PSU, then what's the point in making any changes?
I assume occupying the space of 12 pins on the board is preferable to 16 pins from a 2x8. From what I believe I read, the 12-pin format is meant to require more stringent power delivery and a thicker cable gauge too.
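Roughly, that's why fewer pins can still carry more power (a rough sketch; the per-pin current ratings below are assumptions based on commonly cited Mini-Fit/Micro-Fit terminal figures, not official specs):

```python
# Back-of-the-envelope connector ceilings (assumed per-pin ratings,
# not official specs): an 8-pin PCIe plug has 3x 12 V power pins,
# commonly cited around 7 A each; the rumored 12-pin has 6x 12 V
# power pins with higher-rated terminals (~8.5 A) and thicker wire.
def ceiling_watts(power_pins, amps_per_pin, volts=12.0):
    return power_pins * amps_per_pin * volts

dual_8pin = 2 * ceiling_watts(3, 7.0)  # ~504 W from 16 total pins
twelve_pin = ceiling_watts(6, 8.5)     # ~612 W from 12 total pins
print(f"2x 8-pin ceiling: ~{dual_8pin:.0f} W")
print(f"12-pin ceiling:   ~{twelve_pin:.0f} W")
```

So the same or more deliverable power in a smaller board footprint, assuming those terminal ratings are in the right ballpark.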
 

ShirAhava

Plays with kids toys, in the adult gaming world
Really glad I waited to get a whole new PC and just got an RTX 2060 Super.

Between these PSU changes, new GPUs/CPUs coming out, HDMI 2.1, and next-gen RAM/SSD tech on the way,

a top-of-the-line PC today is gonna be really outdated in two years (more so than normal).
 

Kuranghi

Member
Pragmatically, NV would not do this if there were no need.
Most likely the top-end cards on Samsung 8nm would have peak power consumption way above what normal connectors are guaranteed to deliver.

So far in the leaks, the biggest card was rated at "only" 320W.


Oh boy.

I take it you think the pricing will be that low again. Apart from the obvious (new technology + no competition in high-end models meaning they can charge whatever they like), are there other reasons for it? Like who they have doing the process? I don't follow hardware nearly as much as software.

Or let me know what you meant if it's not that, I don't have much to go on lol. I thought you maybe meant AMD won't ever bother to compete in the high end again, but you didn't bold that part of my message.
 
Maybe that's the reason why the PS5 is gigantic in size?

New cards are probably setting a new record for power draw. Since the consoles are using AMD parts, I wonder what the situation will be with big(ger) Navi.
 

YCoCg

Gold Member
FFS, are we really going to let Nvidia dictate power supplies now? First they jack up GPU prices for years (LOL, $600 entry point for the lowest model) and now they probably want us to buy into their "unique" design PSUs, which will probably be fucking $400 or something.
 

llien

Member
I take it you think the pricing will be that low again. Apart from the obvious (new technology + no competition in high-end models meaning they can charge whatever they like), are there other reasons for it? Like who they have doing the process? I don't follow hardware nearly as much as software.

Or let me know what you meant if it's not that, I don't have much to go on lol. I thought you maybe meant AMD won't ever bother to compete in the high end again, but you didn't bold that part of my message.

Makes no sense for NV to go through the hassle of introducing a new connector unless they need it.
NV's biggest (datacenter, machine-learning-oriented) A100 chip was done at TSMC => they wouldn't have done that if Samsung 8nm could deliver better or at least comparable results.

Rumors put AMD's biggest chip at 505mm² and 80 CUs. The 2080Ti is about 40% faster than the current 40CU chip.
Rumors also note that the fastest gaming Ampere struggles to be 40% faster than the 2080Ti (that is, 96% faster than AMD's 40CU), and is bigger than 600mm² but on Samsung 8nm (which is more of a 10nm rebrand with some improvements).
Lisa explicitly said that AMD will roll out high-end GPUs.

So, at the end of the day:
A given: AMD having a card faster than the 3080. The Radeon team is in full swing, no more starving budgets.
Possible, but not very likely: AMD beating the 3080Ti/3090.
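That 96% figure is just the two rumored ~40% gains compounded (a quick sanity check, using the rumored numbers as inputs):

```python
# Compounding the rumored gains: 2080Ti ~= 1.4x the 40CU card, and
# top Ampere ~= 1.4x the 2080Ti, so vs. the 40CU card:
relative = 1.40 * 1.40                         # = 1.96
print(f"~{(relative - 1) * 100:.0f}% faster")  # -> ~96% faster
```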

Maybe that's the reason why the PS5 is gigantic in size?
I'd say the 40 CUs vs. the 56 CUs in XSeX is. Sony is forced to squeeze out a bit more perf, at the cost of a lot more power consumption.
 
Last edited:

Ovek

7Member7
If NV does stick a new power connector on one, it will be the Titan ultra-high-end card, because if you have the cash to drop on one of those, you'll have the cash to drop on a new power supply as well.
 

Kuranghi

Member
Makes no sense for NV to go through the hassle of introducing a new connector unless they need it.
NV's biggest (datacenter, machine-learning-oriented) A100 chip was done at TSMC => they wouldn't have done that if Samsung 8nm could deliver better or at least comparable results.

Rumors put AMD's biggest chip at 505mm² and 80 CUs. The 2080Ti is about 40% faster than the current 40CU chip.
Rumors also note that the fastest gaming Ampere struggles to be 40% faster than the 2080Ti (that is, 96% faster than AMD's 40CU), and is bigger than 600mm² but on Samsung 8nm (which is more of a 10nm rebrand with some improvements).
Lisa explicitly said that AMD will roll out high-end GPUs.

So, at the end of the day:
A given: AMD having a card faster than the 3080. The Radeon team is in full swing, no more starving budgets.
Possible, but not very likely: AMD beating the 3080Ti/3090.


I'd say the 40 CUs vs. the 56 CUs in XSeX is. Sony is forced to squeeze out a bit more perf, at the cost of a lot more power consumption.

Cheers, that's cool. Might go back to AMD again if the perf-to-price ratio is right. Last time was a 7870; I've been Nvidia since the 970 though.

Also, you guys ever heard of something like this before?:

So I bought a Club3D 7870 from scan.co.uk, and when it came I put it in the PC, connected the power to the card, etc. I start using it, and when it takes on a load it just black-screens and shuts down the PC. I open the case to check it's all seated right and the power connectors are securely connected as well. Same problem, so I start emailing Club3D to make sure I'm not doing something stupid before I send it back to Scan for a replacement.

They give me basic instructions, which ofc include: make sure you have a proper PSU and that you connect the GPU power connectors. The pictures show 2 x 6-pin connectors, but I look at the card and it has only ONE. I tell them this and they don't believe me and just keep giving basic information about setup/troubleshooting. After a few emails I send them a pic of the card's power connectors and they immediately change their tune, saying "Please return the card to Scan and we'll sort you out with a new one". I checked in GPU-Z before returning it and it identifies as a 7870 and NOT a 7850 (which has 1 x 6-pin), and the branding is all 7870, from the box to the card itself.

So I ended up getting it sorted and it was fine (kinda, more on that below). I guess the card just wasn't getting enough power from the 1 x 6-pin + the slot power, so it crapped out at load.

Do you all think that was just improper manufacturing? Or that they mixed up a 7850, put the 7870 firmware on it, branded it as a 7870 and then put it in a 7870 box? The latter seems SO unlikely, because that's a lot of mix-ups, but with the former I don't even know if it's possible for a GPU to be manufactured wrongly in that way.

The worst part was that they had no more 7870s at that price, so I had to go with the 7850 instead. Still good, but I always missed that 7870...

Anyone ever heard of anything like that?
 

Ailike

Member
To those asking if such-and-such a PSU can run whatchamacallit... if you don't know, you have no business having the top-of-the-line card.
 

GHG

Member
To those asking if such-and-such a PSU can run whatchamacallit... if you don't know, you have no business having the top-of-the-line card.

That's really not a helpful attitude.

Some people might be willing to upgrade their PSU if necessary but won't want to do so if they don't need to.
 

kiphalfton

Gold Member
Welp, is asking people who are into $1k+ GPUs to get a PSU upgrade really too much?

Yes, because it will effectively add an additional cost to an already expensive GPU. Doesn't help that PSU prices are already high right now. An 850W PSU is $150 on sale.

Nothing wrong with not wanting to have to invest more money into your PC; the GPU shouldn't require EVERYBODY to upgrade their PSU if they elect to go with the high-end options.

Imagine you just bought a new PSU at these inflated prices, and 2 months down the line Nvidia announces this out of left field. How long has a 750W-850W PSU sufficed? Historically speaking, for what, the past 10 years? Making a purchase decision based on trends (that kind of wattage having proven enough for many, many years), only for something to come out that doesn't really seem all that necessary because "they want to avoid making people have two PCIe power cables coming out of their PSU, hooked up to their GPU", is ridiculous. Yes, it's a rumor, but if 2 x 8-pin power connectors work, FFS, just use that. I understand nothing is "future proof", but these high-end GPUs will be the only thing making use of this. And it will only be used by Nvidia.
 
Last edited:

CuNi

Member
Just to make it clear, the 12-pin connectors will come eventually; there's no reason not to adopt them.
The current cable setup is way outdated, and a replacement is overdue.
 

TriSuit666

Banned
Just to make it clear, the 12-pin connectors will come eventually; there's no reason not to adopt them.
The current cable setup is way outdated, and a replacement is overdue.

I'm sure I saw Steve from GN mention a new prototype PSU they were trying to get their hands on that does away with the +3.3V and +5V rails and moves that conversion onto the motherboard.
 

Siri

Banned
Not everyone has a computer they built themselves. Replacing a PSU is not an easy place to begin, and there could be proprietary parts as well.
 

cryogenic7

Member
It's not so much about the overall power requirements; the main concern seems to be around this new 12-pin connector that doesn't exist on any current PSUs. There's no cable that will come directly out of your PSU (modular or otherwise) that fits this new spec.

We need to see if an adapter/converter is possible and what that looks like.
It will come with an adapter, like the cards always have.
 

Gp1

Member
I know this C R A Z Y dude who runs an RTX 2070 Super on a 550W PSU.
The box said 600, but this dude does not give a fuck.

To be fair, those PSU wattage recommendations are pretty generous. As long as you don't mess with the 12V rail recommendations...
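If you want to sanity-check your own setup, the rough math looks something like this (an illustrative sketch; the wattage figures and the 80% sustained-load rule of thumb are assumptions, not measurements):

```python
# Rough PSU headroom check (illustrative numbers, not measurements).
def psu_headroom(psu_watts, gpu_tdp, cpu_tdp, rest=75, margin=0.8):
    """Treat ~80% of the label rating as the usable sustained budget;
    'rest' covers the board, drives, fans, etc."""
    return psu_watts * margin - (gpu_tdp + cpu_tdp + rest)

# e.g. a 2070 Super (~215 W TDP) plus a ~95 W CPU on a 550 W unit:
print(psu_headroom(550, 215, 95))  # -> 55.0 W to spare
```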
 