Google Calls For Power Supply Design Changes 377
Raindance writes "The New York Times reports that Google is calling 'for a shift from multivoltage power supplies to a single 12-volt standard. Although voltage conversion would still take place on the PC motherboard, the simpler design of the new power supply would make it easier to achieve higher overall efficiencies ... The Google white paper argues that the opportunity for power savings is immense — by deploying the new power supplies in 100 million desktop PC's running eight hours a day, it will be possible to save 40 billion kilowatt-hours over three years, or more than $5 billion at California's energy rates.' This may have something to do with the electricity bill for Google's estimated 450,000 servers."
Proposal spells doom for USB powered devices (Score:3, Interesting)
Thanks,
Jim
Re:No... (Score:5, Interesting)
Re:Combine it with a UPS (Score:2, Interesting)
While we're at it let's ditch motherboards too. (Score:2, Interesting)
Re:What in a modern computer actually uses 12V? (Score:2, Interesting)
Of course, 'violations' of the voltage levels on RS-232 ports have historically been really, really common. Old PCs often had problems operating with serial mice because the voltage swing on the RS-232 ports of some machines was only a few volts. I remember an old Northgate 386 at work that had that problem.
Why not -48? (Score:4, Interesting)
The advantage of -48 volts over 12 volts is that there is less resistive loss for the same power, so smaller conductors can be used. Of course, there is a greater risk of electric shock, but I would think -48 would be pretty safe.
48 volts is also the standard for Power over Ethernet (IEEE 802.3af) [wikipedia.org]. It may not be directly compatible, though, since telcos run -48, not +48, although some equipment can operate with either (and some cannot).
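The resistive-loss argument above can be sketched numerically. This is a back-of-the-envelope Python sketch; the 500 W load and 10 milliohm cable resistance are illustrative assumptions, not figures from the post.

```python
# Compare I^2*R cable loss when delivering the same power at -48 V vs 12 V.
# Assumed figures (illustrative only): 500 W load, 0.01 ohm cable resistance.

def cable_loss(power_w, voltage_v, resistance_ohm):
    """I^2*R loss in the cable for a given load power and rail voltage."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

loss_12 = cable_loss(500, 12, 0.01)   # ~17.4 W wasted in the cable
loss_48 = cable_loss(500, 48, 0.01)   # ~1.1 W wasted in the cable
print(loss_12 / loss_48)              # 16x: loss scales as (48/12)^2
```

Quadrupling the voltage cuts the cable loss by a factor of sixteen, which is why telco plants settled on -48 rather than something lower.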
Re:No... (Score:3, Interesting)
Apparently they hired expert ergonomic and industrial designers to figure out how many servers and workstations they could cram into a mobile semi-trailer lab, while still making it comfortable to work in. Kind of a neat optimization problem I think.
Re:I've wanted this for years. (Score:2, Interesting)
The next generation of computers should have a 12V plug and a special cable, so that they can take a 12V source from outside.
What's important is that the cable and socket be different from the 110V or 235V ones, to avoid "accidents".
I would love to see a computer running from a car battery.
Re:No... (Score:3, Interesting)
I'm guessing the answer was lots and lots. There are quite a few technical challenges, as you say: power, cooling, and making sure that the machines survive the journey, too.
It would be a neat side business if Google went into providing server farms and data centers for other businesses; as other people have mentioned they have a lot of smart people working on the associated problems.
Hey, it could save their asses if this whole internet thing doesn't pan out
Re:What in a modern computer actually uses 12V? (Score:5, Interesting)
Basically, when a machine fails, it is pulled from the rack and replaced with an identical machine with a cookie-cutter image. Kinda like the Borg.
When a box fails it is probably instantly detected by some machine monitor and taken offline (think: the 'crop' tenders in The Matrix). The sysadmins aren't going to waste time plugging a video cable into the rack... just pull it. Toss the box into a repair queue and let the techs put a video card into it if needed. Remember: hundreds of machines fail for them every day. That's a fact from the Google talk in '05.
Bad idea (Score:3, Interesting)
Google's perspective is rather unique: they use super-cheap desktop systems that individually do not use a lot of power, so running them off 12v DC might make sense. But in any other, more conventional datacenter, servers have multiple power supplies that can EACH pull 800w. When you're running 110v AC that means you're pulling ~7 amps through a single cable. You need datacenter-grade power cables for this, but it's still sane. You can get datacenter equipment that runs on 48v DC, but those cables end up carrying ~15 amps, so you need substantially heavier cable - cable so thick that routing it becomes a seriously difficult task due to the gauge of the wire!
More likely the direction people are going (and have been for some time) is to 208v AC or 3 phase 220v AC. Now you've just halved the current draw, meaning that your PDUs don't need to be as hefty, your wire doesn't have to be as thick, your coils don't get as hot, etc.
Running 12v DC in any real data center would be ludicrous - the amount of current you'd have to draw through your cables would be way beyond a safe level.
Also, AC/DC conversions are cheap these days. And remember, at the currents a low-voltage DC feed has to carry, DC can kill you just as easily as AC.
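The current figures this comment throws around follow directly from Ohm's law. A quick Python sketch for one 800 W supply at the voltages mentioned (ideal case, ignoring power factor and conversion losses):

```python
# Current drawn by one 800 W server supply at various distribution voltages.
# Idealized: assumes unity power factor and no conversion loss.

def current_draw(watts, volts):
    """Amps drawn at a given distribution voltage."""
    return watts / volts

for volts in (110, 208, 48, 12):
    print(f"{volts:>4} V -> {current_draw(800, volts):5.1f} A")
```

At 110 V that's about 7.3 A, at 208 V about 3.8 A, at 48 V about 16.7 A, and at 12 V a hefty 66.7 A per supply, which is the core of the objection to 12 V distribution at datacenter scale.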
Re:What in a modern computer actually uses 12V? (Score:5, Interesting)
You are correct that hard drives generally use just 5V, but the rest of your points are not even close. Modern CPUs require lower voltages, higher current, and tighter regulation, which is why DC-DC power supplies are now on motherboards instead of running directly from an ATX supply.
Furthermore, running a rack of servers on 5V rails would be absolutely absurd. Do you have any idea what the amperage would be? The bus bars would have to be several inches thick, the transmission loss would be enormous, and if you accidentally shorted them.... forget it!
Something like 48VDC might work but then you lose out on all the economies of scale driven by the 110/240VAC standard.
Just match the power supply to the motherboard and be done with it. Standardizing on one voltage is impractical, and besides, how would it improve "efficiency"?
This is about voltage to the boards, not the box (Score:5, Interesting)
Most of the postings so far have it all wrong. Google is not proposing 12VDC into a desktop PC or 12VDC distribution within the data center. What they're proposing is that the only DC voltage distributed around a computer case should be 12VDC. Any other voltages needed would be converted on the board that needed it.
This is called "point of load conversion", and involves small switching regulators near each load. Here's a tutorial on point of load power conversion. [elecdesign.com]
It's been a long time since CPUs ran directly from the +5 supply. There's already point of load conversion on the motherboard near the CPU. Google just wants to make that work off the +12 supply, and get rid of the current +5/-5/+12/-12 output set.
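The efficiency argument comes down to how stage efficiencies multiply along the conversion chain. A minimal sketch; all the percentage figures below are hypothetical placeholders chosen only to show the shape of the comparison, not measured numbers from Google's white paper:

```python
# Illustrative efficiency chain for point-of-load conversion.
# All stage efficiencies here are hypothetical, purely to show
# that overall efficiency is the product of the stages.

def chain_efficiency(*stages):
    """Overall efficiency of a cascade of conversion stages."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Legacy multi-rail ATX: AC -> +5 V rail (say 70%), then the board's
# VRM converts +5 V to the CPU core voltage (say 85%).
legacy = chain_efficiency(0.70, 0.85)     # ~0.60 overall

# Proposed: AC -> single +12 V output (say 90%), then a point-of-load
# regulator near the CPU (say 90%).
proposed = chain_efficiency(0.90, 0.90)   # 0.81 overall

print(legacy, proposed)
```

The point is that a single-output supply can be optimized harder than one juggling four rails, and the savings compound multiplicatively with the on-board regulator's efficiency.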
What lower voltage power supply? (Score:3, Interesting)
Awesome for us Off-Grid'ers... (Score:1, Interesting)
It'd be great to know that I can hook anything up without having to kludge mods to avoid frying them. At least I didn't go to 24 or 48 volts...
Comment removed (Score:5, Interesting)
I don't get it. (Score:3, Interesting)
What they are recommending is that the power supply only have 12V out, and that all other DC-DC conversions take place on the motherboard. Unfortunately, the article didn't go into any detail as to how this would save power, and I don't see how it would make much difference. To me it just seems like you are moving components off the PS and onto the motherboard. Perhaps there is an EE around who could explain it to me.
Re:This is about voltage to the boards, not the bo (Score:3, Interesting)
Re:What in a modern computer actually uses 12V? (Score:3, Interesting)
Re:I've wanted this for years. (Score:3, Interesting)
No, he misses the Convergent Technologies NGen. This was a pretty powerful x86 platform that also used external power supplies. The nicest thing about it was that it was quiet: the power supplies (yes, plural; the number you need varied according to your internal hardware) used passive cooling, so only internal heat sources needed to be cooled.
This was 1983, the year before IBM introduced the PC-AT [wikipedia.org], the machine which defines "compatibility" to this very day. And the AT used a big, noisy internal power supply. Technologically a big step backwards, but one that everybody was forced to imitate, including Convergent.
So here it is 20 years later, and we're just now beginning to talk about quiet and efficient power supplies again. Kind of sad, really.
Re:What in a modern computer actually uses 12V? (Score:3, Interesting)
The one used by the majority of DC electric devices, not just computers. The one compatible with existing external power sources such as solar, home gas-powered generators, your car battery, etc.
If motherboards were designed to run on 12v DC you could put a socket on the back of the case and jack into anything that gave you 12v DC. You could take your home desktop straight to the RV, boat, or cabin in the woods running off a turbine in the little stream or the windmill; without inverters.
I've been talking about this shit for decades. I've talked about it here. You might almost think that Google:
http://hardware.slashdot.org/comments.pl?sid=1977
KFG
Re:No... (Score:5, Interesting)
The heat losses in S-100 on-card linear regulators were immense! That and the weight of the (linear) transformers helped make the Apple ][, with its switching power supply, so popular (I still have an old Poly power transformer; makes a great doorstop).
Some mainframe computers used the scheme mentioned by others -- polyphase high-frequency AC distribution. High-frequency (think 800 Hz) power transformers are small and efficient; that's why switching supplies run at high frequencies (in the hundreds of kHz range).
Efficiency is not only about wasting less power, it's about generating less heat!
Re:No... (Score:4, Interesting)
Re:Big ego department (Score:3, Interesting)
Another way to get more efficiency (Score:3, Interesting)
Another way to get more efficiency is to operate the Switched-mode power supply [wikipedia.org] at the higher voltage it supports, usually 220 to 250 volts. In most of the world this is already done. In North America computers are typically run on 120 volts (in Japan this is 100 volts). In general, these power supplies are more efficient by about 3% or so, on the higher voltage. Of course, be sure to flip the voltage switch if it has one, or otherwise verify that it does support the higher voltage.
For a single computer, it would not be worth adding the extra circuit to get 240 volts. But if you run several, it could be worth doing so, especially if you have so many that it exceeds the capacity of one 120 volt 15 or 20 amp circuit (you could have twice as many on the same amperage if operating at 240 volts). If you already have a circuit dedicated to the computers, that circuit could be converted from 120 volts to 240 volts by changing out the circuit breaker from a one pole to a two pole type, marking the white neutral wire with red or black tape to comply with electrical code identification requirements, attaching these wires to that new breaker (not to the neutral bus), and installing a 240 volt style outlet (NEMA 6-15R or 6-20R). These are the steps that would be used to install an outlet for a big window air conditioner (which you might need anyway with so many computers). Then you can use this [interpower.com] power cord.
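The "twice as many on the same amperage" claim can be checked quickly. A sketch assuming a 300 W draw per machine and the usual 80% continuous-load limit on a breaker; both figures are illustrative assumptions, not from the post:

```python
# How many computers fit on one branch circuit at 120 V vs 240 V?
# Assumptions (illustrative): ~300 W per machine, and the common
# 80% continuous-load derating applied to the breaker rating.

def max_computers(circuit_volts, breaker_amps, watts_each=300):
    """Whole computers that fit within the circuit's continuous capacity."""
    usable_watts = circuit_volts * breaker_amps * 0.8
    return int(usable_watts // watts_each)

print(max_computers(120, 15))  # 4 machines on a 120 V / 15 A circuit
print(max_computers(240, 15))  # 9 machines on a 240 V / 15 A circuit
```

Doubling the voltage doubles the available wattage on the same breaker and wire, which is exactly why the conversion described above pays off once you have more than a handful of machines.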