
Why did earlier consoles consume so little power?

Melon Husk

Member
Funny, I was thinking about this a couple of weeks ago. Classifying game consoles based on power draw would be interesting.

The NDS consumed less than 1 W, whereas the Switch Lite is more like a "PS1 with a battery" in terms of power draw, at ~7.5 W. The PS1, of course, had no battery and did not require a fan. Is the threshold for fanless cooling in a mobile handheld somewhere around 10 W, once you sum the total heat from battery + processor?
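One rough way to sanity-check figures like these is to back the average draw out of battery capacity and battery life. A minimal Python sketch, using purely hypothetical numbers (the 15 Wh / 2 h inputs are illustrative assumptions, not real Switch Lite or NDS specs):

```python
# Back-of-the-envelope: estimate a handheld's average power draw from its
# battery capacity and observed battery life. All numbers below are
# hypothetical, for illustration only -- not measured specs.

def average_power_w(battery_wh: float, battery_life_h: float) -> float:
    """Average system power draw (W) over one full battery discharge."""
    return battery_wh / battery_life_h

# Hypothetical handheld: a 15 Wh battery that lasts ~2 hours in a heavy game.
print(f"{average_power_w(15.0, 2.0):.1f} W")  # -> 7.5 W average draw
```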
 

kraspkibble

Permabanned.
Mac Studio consumes like 60 W max and still demolishes a 1000 W+ top-end PC. Maybe next time go for ARM instead of the aging x86.
I have a rather powerful PC, but the amount of power it consumes is bothering me now, especially with rising energy costs. I've been thinking about getting rid of it, buying a console for gaming, and maybe using my iPad as my main device or buying a MacBook/Mac.
 

UnNamed

Banned
Probably reliability.
A 68000 couldn't sustain a high clock for many reasons, mainly manufacturing: CPUs had more problems and hardware failures back in the day.
So today you can have a 200 W CPU at 90 degrees and it's not a problem, but you couldn't have a 68000 at 15 W, because heat and melting were problems you couldn't simply solve with a heatsink.
 

Bo_Hazem

Banned
I have a rather powerful PC, but the amount of power it consumes is bothering me now, especially with rising energy costs. I've been thinking about getting rid of it, buying a console for gaming, and maybe using my iPad as my main device or buying a MacBook/Mac.

I've been a desktop user since like 1996, and I've never used a Mac before, nor do I really love the company. But since I travel around and want a semi-portable system, the Mac Studio seems perfect to me, as the keyboards/mice/screens are independent. I hate laptops because everything is connected, and a broken screen/keyboard can ruin the whole thing for me. Also, some of these GPUs can really die from their insane power draw (= heat) and some coding mess, like that one game that used to kill 3090s:


Well, you can't play "real" games on a Mac anyway, and it's not like I game much on my gaming desktop. For my specific needs, the M1 is a beast for video decoding/encoding, with lots of hardware acceleration, including Apple ProRes if I ever consider shooting in 16-bit RAW. Also, most video-editing software on the market, especially DaVinci Resolve Studio 18, is optimized for it.

 
Huh? That doesn't show that. If you really wanted to know with certainty, you'd have to calculate the ratio of silicon area to performance for both consoles and compare them. I'm pretty sure the PS5 is an order of magnitude more efficient per unit of silicon.

The PS5 CPU is 95 times faster than the PS2's. If the PS5 were somehow LESS efficient per unit of silicon, it would need at least 95 times more of it, just based on that one statistic.

Imagine you have a machine that's one foot wide and spits out 1 pancake per minute, and another machine that's two feet wide and spits out 100 pancakes a minute. But because it's twice as large, you call it less efficient for the materials in the machine. That is the type of comparison being made here, and it's silly!!
What I meant to say (I said it incoherently because I was tired) is that it's taking bigger and bigger chips to get further performance gains compared to the past, and they need more watts to function.

I know what you're saying, that the performance-to-watt ratio is far, far better today, but it has to be. I'm talking about what chips could achieve relative to their time, though.

Basically the PS5 is less efficient because it can't use a chip as small and low-power as the PS2's; if it did, we wouldn't have anything that resembled a generational leap in performance compared to the past.

Though PS5 to PS2 is not the best comparison to show this, as the PS5 is not more power hungry than the OG PS3.

Compare PS3 to PS2, then PS1 to PS2, and compare wattage and silicon area, and you'll see that was the turning point for efficiency.

Or look at the rumors for the RTX 4xxx GPUs. Everything is getting more power hungry to facilitate performance gains; that is the opposite of efficiency. Because they can't do it with die shrinks alone anymore.
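To put the "efficiency per unit of silicon" point from the quote above into numbers, here's a rough sketch. The 95× CPU figure is taken from the quoted post; the die areas and wattages are ballpark assumptions of my own (actual values vary by hardware revision), so treat the output as illustrative only.

```python
# Back-of-the-envelope comparison of "performance per unit of silicon" and
# "performance per watt" for PS2 vs PS5. All inputs are rough assumptions
# for illustration only; die sizes and wattages vary by revision.

consoles = {
    #             relative CPU perf, die area (mm^2), typical draw (W)
    "PS2 (EE)":   (1.0,              240.0,           50.0),
    "PS5 (SoC)":  (95.0,             310.0,           200.0),  # 95x taken from the quoted post
}

for name, (perf, area_mm2, watts) in consoles.items():
    print(f"{name:10s}  perf/mm^2 = {perf / area_mm2:8.4f}   perf/W = {perf / watts:7.3f}")

# With these assumptions the PS5 comes out far ahead on both ratios, even
# though its absolute die size and power draw are larger -- which is the
# point of the pancake-machine analogy.
```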
 

Wildebeest

Member
The ARM chips used in mobile phones show that CPU designers are fully capable of making useful chips that don't suck huge amounts of power or need a lot of active cooling; it's just a competition to achieve high-end performance. That vintage generations didn't push this so hard is perhaps a side effect of prices coming down. They were expensive enough without multiplying the number of chips by 10 or so.
 

TGO

Hype Train conductor. Works harder than it steams.
Inflation 👍
Back then it was pretty adequate power for the tech; I'm sure they wanted to aim lower and already considered it high.
If you showed someone a PS5 in 1995, they'd probably react like Doc Brown when he found out how much power the DeLorean needed.
 

BlackTron

Member
What I meant to say (I said it incoherently because I was tired) is that it's taking bigger and bigger chips to get further performance gains compared to the past, and they need more watts to function.

I know what you're saying, that the performance-to-watt ratio is far, far better today, but it has to be. I'm talking about what chips could achieve relative to their time, though.

Basically the PS5 is less efficient because it can't use a chip as small and low-power as the PS2's; if it did, we wouldn't have anything that resembled a generational leap in performance compared to the past.

Though PS5 to PS2 is not the best comparison to show this, as the PS5 is not more power hungry than the OG PS3.

Compare PS3 to PS2, then PS1 to PS2, and compare wattage and silicon area, and you'll see that was the turning point for efficiency.

Or look at the rumors for the RTX 4xxx GPUs. Everything is getting more power hungry to facilitate performance gains; that is the opposite of efficiency. Because they can't do it with die shrinks alone anymore.

I have to say I still disagree. "Relative to the time" is a very murky way of looking at it. Regardless of the year, prettier games require more calculations, and as games get prettier, eventually it becomes too many calculations to do without active cooling. This was true in the early '90s as well: even when we were playing 2D games on consoles, 3D games were on PCs, which of course had fans and heatsinks.

While it's true that there are diminishing returns from die shrinks, requiring more power/bigger chips, that doesn't mean they're less efficient, just that the rate of added efficiency per gen is reduced. Diminishing returns, not reversed returns.
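One way to see "diminishing returns, not reversed returns" is to compare performance per watt across PlayStation generations. A minimal sketch, assuming rough, commonly cited headline FLOPS and gaming power-draw figures; every number here is an approximation used for illustration, not an exact spec:

```python
# Rough perf-per-watt across PlayStation generations. All figures are
# approximate ballpark numbers used purely to illustrate the trend;
# exact values vary by hardware revision and measurement method.

gens = [
    # name, headline throughput (GFLOPS), typical gaming draw (W)
    ("PS2",      6.2,    50),
    ("PS3",    230.0,   190),
    ("PS4",   1840.0,   140),
    ("PS5",  10280.0,   210),
]

prev_ratio = None
for name, gflops, watts in gens:
    ratio = gflops / watts
    gain = f"{ratio / prev_ratio:5.1f}x vs previous gen" if prev_ratio else ""
    print(f"{name}: {ratio:8.2f} GFLOPS/W  {gain}")
    prev_ratio = ratio

# With these ballpark inputs, perf/W improves every generation even as
# absolute power draw rises; the latest jump is the smallest of the lot,
# which is the "diminishing, not reversed" point.
```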
 
I have to say I still disagree. "Relative to the time" is a very murky way of looking at it. Regardless of the year, prettier games require more calculations, and as games get prettier, eventually it becomes too many calculations to do without active cooling. This was true in the early '90s as well: even when we were playing 2D games on consoles, 3D games were on PCs, which of course had fans and heatsinks.

While it's true that there are diminishing returns from die shrinks, requiring more power/bigger chips, that doesn't mean they're less efficient, just that the rate of added efficiency per gen is reduced. Diminishing returns, not reversed returns.
Both of our views are valid here.

The manufacturing is more efficient, but the way companies are achieving higher performance is less efficient. Hardware design philosophy, if you will.

I think a minimal heatsink was reasonable, like the N64's. No fan on the N64 either.

But today's components are just not built to last as long, with how hot they run and how much cooling they need.

Anyways, I'm halfway playing the devil's advocate here, as I have really beefy cooling on my PC and am pushing the hardware there. I draw the line at 300 watt GPUs though :p
 

Filben

Member
why older consoles couldn’t go beyond 25W pre-2000.
I think they could have, but there was no need: components didn't consume as much power because there wasn't "so much going on" inside them compared to today.

The only correlation between power and performance is efficiency. It's not like you can put more power into the system and get more performance out of it. Power is the fuel for your components, not for the performance.

Think of your brain, which is practically a computer and uses around 20% of your body's energy. You can feed it 4,000 kcal a day and the only thing you'll get is fat. You won't be better at calculating, thinking, etc. just because you have the energy; you need the processing power to process and utilise that energy. So if, let's say, you play chess all day, you'll need more energy for your brain than usual, and more than for sitting around doing nothing.
 

John Wick

Member
Because that's what the technology was at the time: the chips ran at a mere 100-300 MHz, not a couple of GHz, and there were millions of transistors instead of billions. In other words, it just wasn't possible to bump up the power consumption any further.
The OP could have easily answered the question with common sense. There weren't any components that used that much power. You're taking today's power usage and applying it back to the '90s.
Was there any GPU capable of 10 TFLOPS?
Any CPUs running at 2.5 GHz?
 