To be fair, those PSU wattage recommendations are pretty generous. As long as you don't mess with the 12V line recommendations...
Fuck our electrical bills, amirite?
Not quite. Trash tier PSUs (and I mean potential toaster tier) often can't provide their specced wattage. A 750W toaster might only be able to get up to 500W. Also, 550W PSU != 550W PSU.
You have a varying amount of efficiency when under load, that's what those "Bronze/Silver" etc. labels mean.
It's weird that so few people realize it, but the difference between Bronze and Platinum can technically mean the difference between fully powering your system and limiting its power!
Power supplies are like amplifiers. You have different levels of efficiency, which would correlate to bronze, silver, gold, and platinum. You are paying extra for the efficiency, which is worth the money to many people. Some platinums can be overpriced though.
Definitely. I won't get anything higher than gold, especially as most come with warranties that outlast the actual PSU. Platinums are definitely in the realm of diminishing returns.
Unless you run a business with a load of PCs running 24/7 and you're looking to maximise margins, the extra efficiency they offer is not worth it.
Say, 20 hours per week gaming, 50 weeks => 1000 hours per year. Yeah, that extra 70 cents a month sure is going to break the bank.
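If anyone wants to sanity-check that, here's a minimal sketch of the math. The load, price, and Bronze/Platinum efficiencies are all assumed round numbers, not measurements:

```python
# Back-of-the-envelope version of the argument above. All numbers are
# assumptions (typical 80 Plus efficiencies, a guessed load and price).
def wall_draw_w(dc_load_w, efficiency):
    """Power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

HOURS_PER_YEAR = 20 * 50   # 20 h/week gaming, 50 weeks, as above
DC_LOAD_W = 400            # assumed average system draw while gaming
PRICE_PER_KWH = 0.15       # assumed electricity price in USD

bronze = wall_draw_w(DC_LOAD_W, 0.85)    # roughly 80 Plus Bronze at load
platinum = wall_draw_w(DC_LOAD_W, 0.92)  # roughly 80 Plus Platinum at load

saved_kwh = (bronze - platinum) * HOURS_PER_YEAR / 1000
print(f"~{saved_kwh:.1f} kWh/year, ~${saved_kwh * PRICE_PER_KWH / 12:.2f}/month saved")
```

That lands around 36 kWh and well under a dollar a month, i.e. exactly the "break the bank" territory being mocked above.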
I think you guys forget how hot Navi cards already are. Despite being made on a 7nm process with only 36-40 CUs, AMD won't be able to step it up all of a sudden while doubling the CU count at the same time as rumored; I'd say they are in the worse position as far as power consumption and heat go, and always have been since GCN.

As for the Ampere cards themselves, if they allow for 100+ FPS with RT on, people will jump on them like crazy and never ask a single question; that's just how it works, always has been. Which again, the A100 is 400W, 250W in the PCIe version (https://www.nvidia.com/en-us/data-center/a100/), so I cannot imagine what any extra wattage would be needed for. They will obviously add RT cores to the consumer cards, but those will replace some of the CUDA cores, so the end result won't change.

Bottom line is, absolutely nothing will change as long as AMD is two generations behind. As long as NV is unquestionably the fastest out there, they will be able to do whatever they want, and consumers will have to suck it up. I wouldn't be surprised if Intel catches up with NV before AMD does, TBH, if RDNA2 doesn't close the gap this time around. Like, how many tries, how many generations can we hope for before AMD finally gives them some competition?
AMD struck gold with TSMC. W1zzard on TechPowerUp, a famously Nvidia-biased site I might add, is worried the 3080 Ti might draw over 400W! That's fucking crazy if true. Nvidia may have scored an own goal going with Samsung's 8nm process; TSMC are just too good right now, so Nvidia's dispute with them is really braindead.
Why do you guys sound like people from a parallel universe? What the heck is "two generations behind"?
Doesn't seem like you have a clue how much they actually consume.
The A100 is using TSMC's 7nm process though, not Samsung's 8nm. The 5700 XT was rated at 225W, and with the 50% performance-per-watt improvement AMD have promised, they should be able to double the 5700 XT's performance for 300W. I think if Nvidia is using the 8nm process, it is conceivable AMD could match them in performance per watt, with Nvidia beating them in absolute performance by pushing above 300W.
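Spelling that estimate out, as a sketch using only the figures quoted in the post:

```python
# The scaling claim above: required power scales as
# performance / (performance per watt). Figures are the ones quoted
# in the post, not official numbers.
old_power_w = 225   # 5700 XT board power, as stated above
perf_ratio = 2.0    # target: double the 5700 XT's performance
ppw_ratio = 1.5     # AMD's promised +50% performance per watt

new_power_w = old_power_w * perf_ratio / ppw_ratio
print(new_power_w)  # 300.0 W, matching the estimate above
```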
Not that you're wrong, but, if the console GPUs are anything to go by, they made some great advances in power consumption. Navi cannot run over 2 GHz, but the PS5, a console, can. It actually bodes well for AMD.
Bigger cards are notoriously more energy efficient (as they have lower clocks).
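A toy sketch of why, assuming the usual dynamic-power relation P ≈ k·f·V² (every number below is invented for illustration; no real GPU is being modelled):

```python
# Dynamic power scales roughly with f * V^2, and voltage has to rise to
# sustain higher clocks, so a wider chip run slower wins on perf/W.
def dynamic_power_w(freq_ghz, volts, k=100.0):
    return k * freq_ghz * volts ** 2

small_fast = dynamic_power_w(2.0, 1.10)    # narrow chip pushed hard
big_slow = 2 * dynamic_power_w(1.5, 0.85)  # twice the units, lower clock/voltage

# big_slow has 1.5x the raw throughput (2 units * 1.5 GHz vs 1 * 2.0 GHz)
print(f"{small_fast:.0f} W vs {big_slow:.0f} W, and the slower config is faster overall")
```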
PC graphics cards coming out later (maybe even this holiday) with 80 CUs at over 2GHz are going to be nuts.
The image is very small, but the XSX PSU is rated for 315W. Both consoles are also much larger than ever before, with both companies putting a strong emphasis on cooling, which was likewise unheard of before, so that hints that the power RDNA2 delivers doesn't come for free.
Sounds like a big clusterfuck. September 1st will be an interesting day for sure.
I know this C R A Z Y dude that runs a RTX 2070 super on a 550W PSU
the box said 600 but this dude does not give a fuck.
Didn't know if this would warrant its own thread, but I guess they were closer than I thought.
Those recommendations tend to be way too strict. Even a high-end CPU paired with a 2080Ti is unlikely to crack the 450W mark, even running at full load for extended periods of time (which pretty much never happens while gaming).
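For what it's worth, a rough tally behind that 450W figure, with assumed TDP-class numbers rather than measurements of any specific build:

```python
# Rough worst-case system draw; figures are typical TDP-class assumptions.
draw_w = {
    "RTX 2080 Ti": 260,                 # board power
    "high-end CPU (gaming load)": 125,  # well under all-core stress numbers
    "board, RAM, drives, fans": 50,
}
total = sum(draw_w.values())
print(f"~{total} W worst case")  # ~435 W, comfortably under 450 W
```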
WOAAAH WE HAVE A MAD LAD HERE !!
NV's card is said to be using it, AIBs stick with multiple 8-pin:
GeForce RTX 3090: 12-pin PCIe on Founders Edition, not on custom cards (www.tweaktown.com)
We're hearing that NVIDIA will use a new 12-pin PCIe power connector on the GeForce RTX 3090 Founders Edition, not custom cards.
TBH, that's not going to happen.
The idea of the 12-pin connectors for GPUs, motherboards, etc. is to reduce efficiency loss when converting AC to all those DC voltages like 12V, 5V, and 3.3V (rough math in the sketch after this post).
So the motherboard designs need to be changed as well to reflect that, and I think that'll happen mid-2020 at the earliest for consumers.
Doing a staggered release would be very stupid, as you'd force people to buy a new PSU for that 12-pin GPU connector and then, half a year later, force them YET AGAIN to buy a new PSU for the 12-pin motherboard connector.
That would kill the adoption rate before it even had a chance to begin.
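A minimal sketch of what that conversion chain looks like under a 12V-only supply. Every efficiency and load here is an assumed round number for illustration, not a figure from any spec or review:

```python
# Chained conversion losses under a 12V-only (ATX12VO-style) design.
AC_TO_12V = 0.92         # main PSU stage (AC -> 12V), assumed
POL_12V_TO_MINOR = 0.90  # motherboard DC-DC stage (12V -> 5V / 3.3V), assumed

load_12v_w = 300   # assumed 12V load (CPU, GPU)
load_minor_w = 30  # assumed 5V / 3.3V load (drives, USB, etc.)

# The minor rails pass through both stages; the 12V load only the first.
wall_w = load_12v_w / AC_TO_12V + load_minor_w / (AC_TO_12V * POL_12V_TO_MINOR)
print(f"~{wall_w:.0f} W from the wall for {load_12v_w + load_minor_w} W of DC load")
```

The argument for moving the minor rails onto the board is that they can then be regulated right next to the load instead of in the PSU.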
Is the use of an adapter possible, or will people have to buy new PSUs?
I've reread this a few times and I'm not entirely sure what you're getting at is based on firm ground.
It sounds like you're conflating 12-pin PCIe with ATX12VO. Intel's ATX12VO standard has a 10-pin motherboard connector, and the power supplies only provide 12V (as you mentioned).
Nvidia don't seem to give a damn about an official standard here, but PCIe GPU cables are already 12V/GND only. I'm not a sparky, but there doesn't seem to be anything incompatible there that couldn't be solved with a new modular cable or an adapter from 3 x 8-pin PCIe (probably in the box!).
To me their goal seems to be to increase available power delivery without going OTT with cables and wasting board space (rough numbers in the sketch after this post), but that doesn't have anything to do with AC-to-DC efficiency loss. The only reason to buy a new PSU for these rumoured cards would be if your PSU isn't actually capable enough to power them anyway.
ATX12VO is going to take a while to become THE standard, and yes, you'd need a new PSU for that, but you'll be building a new system anyway. Same as when we went through previous ATX revisions. I don't see a double whammy of pain here.
Sorry if I've misread and got the wrong end of the stick there, just trying to make sense of it.
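To put rough numbers on the "more power without more cables" point: a minimal sketch of the spec budgets. The slot and 6/8-pin limits are the PCIe spec values, but the 12-pin rating is purely an assumption based on the 3 x 8-pin adapter idea above, since no official figure exists yet:

```python
# Spec power budget per connector combination.
CONNECTOR_LIMITS_W = {
    "6-pin": 75,        # PCIe spec
    "8-pin": 150,       # PCIe spec
    "12-pin": 3 * 150,  # assumption only: matches a 3 x 8-pin adapter
}
SLOT_W = 75  # PCIe x16 slot, per spec

def board_power_budget_w(*connectors):
    """Total spec power for a card fed by the slot plus the given connectors."""
    return SLOT_W + sum(CONNECTOR_LIMITS_W[c] for c in connectors)

print(board_power_budget_w("8-pin", "8-pin"))           # 375 W: typical high-end card
print(board_power_budget_w("8-pin", "8-pin", "8-pin"))  # 525 W: the AIB triple 8-pin route
print(board_power_budget_w("12-pin"))                   # 525 W under this assumption
```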
All good, if anything I probably worded it shitty as I typed that right after waking up and still in bed lol.
What I meant is that I don't think all NV cards will have that new 12-pin connector, as I highly doubt it would look good publicity-wise if you're forced to either A) buy a new PSU that has said 12-pin connector or B) use janky adapters. Also, if you don't buy a whole new rig but just upgrade your GPU, you have exactly the issue described above: use the adapter or buy a new PSU. And if you decide to upgrade the rest of the rig 1-2 years later, you need to get a new PSU yet again (if you bought a new one with the GPU, that is). I just don't think they'll do such a staggered release, but rather go all out and release 12-pin PCIe and 10-pin ATX at the same time with fully compatible PSUs.
Ahh fair enough.
Nah, you're right that not all the AIBs are on board with this 12-pin plan, if the rumours are to be believed.
Add option C of a new cable for modular supplies and you're about right. I don't expect B to be janky, though. Time will tell.
I see what you're getting at with the knock-on effect of buying a new PSU and then a new system, but the sad fact is that Nvidia aren't Intel, nor are they currently working together on this. The PSU and motherboard manufacturers of course play a role, but I'd be pleasantly surprised to see Nvidia reach out to Intel (and vice versa) and come up with a plan to launch and push hard for this sort of thing at the same time.
I guess I'm pessimistic about the planning not being a shit show, and optimistic on the hardware side of things, while you're the opposite of that!