
DC Power Saves 15% Energy and Cost @ Data Center 371

Krishna Dagli writes "Engineers at the Lawrence Berkeley National Laboratory and about 20 technology vendors this month will wrap up a demonstration that they said shows DC power distribution in the data center can save up to 15 percent or more on energy consumption and cost. The proof-of-concept program, set up at Sun Microsystems' Newark, Calif., facility, offered a side-by-side comparison of a traditional AC power system and a 380-volt DC distribution system, running on both Intel-based servers and Sun systems."
This discussion has been archived. No new comments can be posted.

  • Safety (Score:5, Insightful)

    by TimeTrav ( 460837 ) on Wednesday August 09, 2006 @09:57AM (#15873282)
    I, for one, would not be comfortable working around high power DC. Call me paranoid, but I rather enjoy my heart beating with its current interval. You can take all the precautions you want, but accidents do happen.
  • by RotateLeftByte ( 797477 ) on Wednesday August 09, 2006 @10:04AM (#15873345)
    Telephone companies have known this for years. This is why you can get 48 V DC versions of most systems.
    In a telephone exchange, 48 V DC is the norm.
    They have huge batteries and standby generators to keep the phone system running.
     
  • by Anonymous Coward on Wednesday August 09, 2006 @10:04AM (#15873346)
    > DC will kill you much quicker than AC of the same voltage/amperage.

    > Then again, you don't have to worry about shorting yourself to ground with DC.

    I think you had better take that electronics class again...
  • Edison (Score:3, Insightful)

    by Rob Kaper ( 5960 ) on Wednesday August 09, 2006 @10:05AM (#15873350) Homepage
    Good to see some more DC in use. Tesla was right about AC for many applications but DC has its merits and any useful application of DC is a credit to Edison's scientific achievements.
  • by jc42 ( 318812 ) on Wednesday August 09, 2006 @10:07AM (#15873375) Homepage Journal
    ... those claims of saving "up to 15 percent or more".

    That pretty much covers the entire range of possibilities.

    I often wonder why they didn't say something like "up to 50 percent or more" or "up to 99 percent or more". Those would be every bit as meaningful.

  • Re:Safety (Score:4, Insightful)

    by andrewman327 ( 635952 ) on Wednesday August 09, 2006 @10:08AM (#15873379) Homepage Journal
    A 220 volt AC wall outlet will also kill you. Honestly, how many electrical accidents injure or kill IT workers every year? Not very many.
  • Re:Edison (Score:5, Insightful)

    by John Hasler ( 414242 ) on Wednesday August 09, 2006 @10:19AM (#15873461) Homepage
    > Tesla was right about AC for many applications but DC has its merits and any useful
    > application of DC is a credit to Edison's scientific achievements.

    For 19th- and early-20th-century technology, Tesla and Westinghouse were entirely right: there was no practical method of changing the voltage of DC.

    BTW you don't want to look too closely at Edison's scientific achievements. You might find that there is less there than meets the eye.
  • Re:Edison (Score:5, Insightful)

    by Svartalf ( 2997 ) on Wednesday August 09, 2006 @10:26AM (#15873521) Homepage
    Edison didn't have all that many scientific achievements.

    The phonograph was really the only truly unique thing he did. Everything else was a duplication of someone else's efforts where he succeeded and the others failed, or was something one of his employees came up with. Did you know that he'd "Westinghouse" a cat "to show the dangers of AC power" back when he was trying to compete against AC with his DC system (from which ConEd originally grew)? This entailed hooking a grid of alternating plates, separated by small insulating gaps, up to an AC power connection, placing it inside a cage holding a cat, and then plugging it in. Edison is NOT someone to hold up as an example of scientific achievement, unless you want to hold up Mengele as well: sure, medical science got a lot further because of that "doctor", but given how he got his information, I'd rather he hadn't done what he did, and it's not a good example of scientific achievement either.

    DC and AC both have their place. DC is good for short-haul power distribution, but if you short the lines you can destroy the entire power run; AC doesn't suffer that anywhere near as badly. That's part of why electric power is distributed as AC: it doesn't have the same safety issues, and at high voltage it can be transmitted long distances without major losses.
  • Re:Safety (Score:5, Insightful)

    by cswiger2005 ( 905744 ) <cswiger@mac.com> on Wednesday August 09, 2006 @11:16AM (#15874000) Homepage
    Obviously, you don't work on live circuits if you have a choice of working with them off instead, but good habits mean you treat even dead circuits as if they were live until fully isolated & disconnected, just as you should treat a gun as being loaded until you've confirmed that it is not.

    Well-designed power supplies often have a bleeder resistor across the primary filter caps to drain them of juice, but note that the vacuum tube in a CRT makes an excellent capacitor as well (it's charged to 20 kilovolts or more), and it's dangerous to try to dead-short it to drain the residual charge. A 120 VAC shock can be fatal, but that is very uncommon; the voltages inside a CRT are probably the most dangerous level most people have around their homes or work environments.
  • by Phreakiture ( 547094 ) on Wednesday August 09, 2006 @11:23AM (#15874063) Homepage

    - But they still need to have the transformers to step down the voltage

    This is done with a pulse-width modulator. An AC-DC power supply already has one of these running from 380 VDC anyway. The 380 VDC in that case is derived from a rectifier: a voltage doubler (in the case of 120 V sources) or a full-wave rectifier (in the case of 240 V sources). The excess voltage comes from the fact that we get the peak, rather than the RMS, voltage on the DC side.

    The savings come from consolidating all the rectifiers into one. The pulse-width modulators can easily reach 95% efficiency, whereas a whole switching power supply can be as bad as 50% efficient.

    The savings are in the economies of scale for the rectifier. A similar saving could be realised in the pulse-width modulator too, but it would quickly be wiped out by the increased losses of long wire runs at low voltages (5 V and 12 V).

    - DC requires twice as many wires

    Nope. Still two to complete a circuit, just like AC.

  • by YesIAmAScript ( 886271 ) on Wednesday August 09, 2006 @11:53AM (#15874342)
    To get AC, you spin a coil in a magnetic field.

    To get DC you, um, spin a coil in a magnetic field, then rectify it, then put a huge capacitor on there to flatten out the humps.

    There's just no good method for generating DC. And even if there were, electric companies aren't going to run two new phases (DC+ and DC-) to get it to you from the source.

    Instead, the power is going to come to near you as 3-phase, then be rectified. There is a loss in that rectification, but sadly, you can't eliminate it, just change where it happens. Moving it to the other side of your power meter will have an advantage since you theoretically wouldn't have to pay for the losses, although the electric companies would surely change their rates to recoup this lost money. But note that even if they don't change their rates, you haven't saved any energy, just not paid for as much.

    So my guess is this experiment bought into this fallacy, that they measured their power usage at DC levels, found it was lower and reported that as a win, when without a source of DC power that doesn't involve rectification it really isn't.

    I'm sure they save some electricity due to the increased voltage. That reduces current, which decreases power lost. This is the same reason electric companies use high voltages for power transmission.

    The article seems to imply that power supplies convert 120VAC to 381VDC internally. This just isn't true. They never raise the voltage, and 120VAC peaks at 175V or something like that. Even 240V input would peak at 350V. So I don't get this. I think they just messed up a few numbers and really in the experiment connected rectified 240V (UK 240V, which is one phase double high, not the US one 120V phase over another) directly into the power supplies after the point where the rectifier would normally be.

    From what I can tell, going to DC would just save you the cost of lots of little rectifiers in favor of the cost of one big one. To be honest, since the small rectifiers come in commodity ATX power supplies, you're paying almost nothing for them anyway, so I don't see that it's all that valuable to consolidate them.

    I would recommend that if we wanted to save the most power on servers, we should just go to 3-phase 440V AC power supplies. A new connector would have to be designed, as the current 440V 3-phase connector would barely fit on the back of a tower, and wouldn't fit on a 1U server. This would save the most possible in losses without having to buy external rectifiers or force the electric companies to install one on site (and charge you back in increased rates).
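The voltage/current tradeoff behind this recommendation follows from Ohm's law. A sketch (the 0.05 Ω feed resistance is an invented illustrative figure, not from the article):

```python
def line_loss_watts(load_w: float, volts: float, feed_ohms: float) -> float:
    """I^2 * R loss in a feed of resistance feed_ohms delivering load_w at volts."""
    current = load_w / volts
    return current ** 2 * feed_ohms

# Same hypothetical 10 kW load, same wiring, different distribution voltages:
for v in (120, 240, 380, 440):
    print(f"{v:3d} V: {line_loss_watts(10_000, v, 0.05):6.1f} W lost in the feed")
```

Doubling the voltage quarters the loss, which is exactly why utilities transmit at high voltage and why both the 380 VDC and 440 VAC proposals beat 120 V distribution.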
  • Re:Let Go (Score:3, Insightful)

    by DarthStrydre ( 685032 ) on Wednesday August 09, 2006 @12:10PM (#15874474)
    "DC tends to cause a convulsive contraction, often forcing the victim away from the current's source."

    Riight... Whichever muscle in a muscle group is stronger dominates a convulsion. In the human arm, the gripping muscles are far stronger than the hand-opening muscles, so with DC or (low-frequency) AC the result is mostly the same: the hand will grip. If that grip is what's responsible for the zapping, good luck. DC is actually worse than AC in this respect.

    That said, fibrillation is more of a risk with AC than DC, but at power distribution or end-user voltages (220 V, or 115 V in our case), the difference in damage and risk is negligible.
  • Re:Safety (Score:5, Insightful)

    by Anonymous Coward on Wednesday August 09, 2006 @12:13PM (#15874497)
    >So you also failed electrical theory, as well.

    Yourself also?

    ANY electrical path must be joined from source to drain or no power will flow. It doesn't matter if it is DC or AC. Period.

    An AC path, however, is easier to isolate from ground because it works with simple transformers. A 1:1 isolation transformer will let you grab a 220-volt line without being shocked, assuming you don't touch any path that leads back to the other side of the transformer. This is why you'll find them used in hospital ICUs: if a patient's equipment shorts in a manner that lets electricity reach the patient, it will not shock the patient unless the patient grabs hold of the equipment.

    Unfortunately DC does not offer this sort of simplicity of isolation.

    Edison was a sadistic nutbag who actually enjoyed electrocuting animals like cats, dogs, and elephants by joining them to an AC power path. His DC power was no less dangerous; the only reason it never electrocuted the animals was that the voltage was low enough that skin (or fur) resistance did not allow enough current through the animal's body to kill it. Furthermore, due to the low-voltage/high-current nature of his system, the energy wasted heating the conductors limited electricity runs to less than about 2 km.

    The exact same ridiculousness in power-cable AWG requirements can be seen in "modern" car stereo upgrades. People will run a 4 AWG cable to power an "800 watt" 12 VDC subwoofer amplifier; the same 800 watts can be delivered over a 16 AWG cable to a 120 VAC amplifier. The difference is that the car amplifiers are often unfused, because fuses in the 100-200 amp range are expensive (and circuit breakers even more so), while 15 amp fuses and breakers for home electricity are incredibly cheap. The unfused car system, when shorted, will burn the car down in no time. The fused household circuit, when shorted, will burn nothing down, and once repaired the wiring can even be reused.
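The AWG comparison above is just I = P/V. A quick check (amplifier efficiency is ignored here for simplicity):

```python
def supply_current(power_w: float, volts: float) -> float:
    """Current drawn when delivering power_w at the given supply voltage."""
    return power_w / volts

amps_car  = supply_current(800, 12)    # ~66.7 A -> needs heavy 4 AWG cable
amps_home = supply_current(800, 120)   # ~6.7 A  -> an ordinary 16 AWG cord suffices
print(f"{amps_car:.1f} A at 12 V vs {amps_home:.1f} A at 120 V")
```

Ten times the voltage means a tenth of the current for the same power, hence the vastly thinner (and cheaper, and more easily fused) cable.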

    Edison created a useless power system that never worked properly for anyone. He also enjoyed electrocuting animals for no apparent reason other than to hoodwink customers, and he helped develop one of today's best-known methods of capital punishment: the electric chair. Oh, and he stole credit for several inventions (not least the light bulb). All around, he's just not a cool guy.

    So, basically, for Edison's idea to have worked, we'd all have 0000 AWG cables running to our homes, and we'd probably be melting several of them and causing fires, not to mention that DC would damage the conductors through electroplating. But we wouldn't get shocked. Of course, the exact same benefits, plus the additional benefit of no electroplating, could be had by running the same conductors at the same voltage with AC at a frequency outside 50-60 Hz.

    Of course, AC power is most dangerous at 50-60 Hz. But then again, at the voltage levels modern electricity requires, the frequency makes very little difference.
  • Re:Here, here! (Score:3, Insightful)

    by cswiger2005 ( 905744 ) <cswiger@mac.com> on Wednesday August 09, 2006 @12:44PM (#15874719) Homepage
    Why isn't this stuff standardized, with power strips containing a single transformer/rectifier package with DC sockets, or retractable DC wires coming out of them? Even if we ignored PCs and only did the external peripherals for now, we'd still get a big saving in power just by having fewer transformers.

    The cynic in me suggests it's because your typical wall-wart costs about 50 cents to make in bulk and is commonly marked up by a factor of 20 to 100, so when the company sells you a replacement they make out like Enron.

    But yeah, standards exist: most of the time you can buy a generic PS from Radio Shack that delivers 3 V, 5 V, 7.5 V, 9 V, or 12 V at 1 amp or so for much less than the product-specific wall-wart. Some vendors (like Sony) have even deliberately disregarded the JEDEC(?) standard connector sizes in order to prevent you from using a generic replacement PS.

  • by jd34 ( 599131 ) on Wednesday August 09, 2006 @02:33PM (#15875648)

    To get AC, you spin a coil in a magnetic field.

    That is one way... granted, the most common way, but not the only way.

    To get DC you, um, spin a coil in a magnetic field, then rectify it, then put a huge capacitor on there to flatten out the humps.

    Again, that is one way... and it has power factor problems that make it undesirable for large installations.

    There's just no good method for generating DC.

    That is a bold assertion. There is a lot of opinion buried in that value judgement, "good", though.

    And even if there were, electric companies aren't going to run two new phases (DC+ and DC-) to get it to you from the source.

    Probably true. They already use high voltage dc for some transmission-level links, but the distribution system doesn't have to change.

    Instead, the power is going to come to near you as 3-phase, then be rectified. There is a loss in that rectification, but sadly, you can't eliminate it, just change where it happens. Moving it to the other side of your power meter will have an advantage since you theoretically wouldn't have to pay for the losses, although the electric companies would surely change their rates to recoup this lost money. But note that even if they don't change their rates, you haven't saved any energy, just not paid for as much.

    You can't eliminate it, but there ARE methods to minimize it that you aren't admitting to your argument.

    So my guess is this experiment bought into this fallacy, that they measured their power usage at DC levels, found it was lower and reported that as a win, when without a source of DC power that doesn't involve rectification it really isn't.

    No, they are aware that the same active rectification that is so popular with variable speed drives (electric motors) due to good power factor can achieve 97% efficiency.

    I'm sure they save some electricity due to the increased voltage. That reduces current, which decreases power lost. This is the same reason electric companies use high voltages for power transmission.

    Agreed.

    The article seems to imply that power supplies convert 120VAC to 381VDC internally. This just isn't true. They never raise the voltage, and 120VAC peaks at 175V or something like that. Even 240V input would peak at 350V. So I don't get this. I think they just messed up a few numbers and really in the experiment connected rectified 240V (UK 240V, which is one phase double high, not the US one 120V phase over another) directly into the power supplies after the point where the rectifier would normally be.

    "To be positive is to be wrong at the top of your lungs."

    • The test obtained 15% at the facility level.
    • The test was conducted in the U.S.
    • It is not a stretch to assume that their power-factor-correcting 480VAC input facility UPSs have 380VDC internally.

    From what I can tell, going to DC just would save you the cost of lots of little rectifiers in favor of the cost of one big one. To be honest, since the small rectifiers come in commodity ATX power supplies, you're paying almost nothing for them anyway. So I don't see that it's all that valuable to consolidate them.

    The equation asserted is: Lower capital cost + higher energy consumption = higher capital cost + lower energy consumption + energy cost savings. This may or may not be true, but they assert that their demonstration showed it was true.
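That capital-versus-energy tradeoff can be made concrete with a back-of-the-envelope payback calculation. All figures below are hypothetical, not from the demonstration:

```python
def payback_years(extra_capital: float, annual_kwh: float,
                  rate_per_kwh: float, savings_fraction: float) -> float:
    """Years for the energy savings to repay the extra up-front cost."""
    annual_savings = annual_kwh * rate_per_kwh * savings_fraction
    return extra_capital / annual_savings

# E.g. $50k extra for DC plant, 1 GWh/yr consumption, $0.10/kWh, 15% savings:
print(f"{payback_years(50_000, 1_000_000, 0.10, 0.15):.1f} years to break even")
```

Under those assumed numbers the DC plant pays for itself in a few years; with different capital costs or electricity rates the equation could easily tip the other way, which is exactly the "may or may not be true" caveat above.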

    I would recommend that if we wanted to save the most power on servers, we should just go to 3-phase 440V AC power supplies. A new connector would have to be designed, as the current 440V 3-phase connector would barely fit on the back of a tower, and wouldn't fit on a 1U server. This would save the most possible in losses without having to buy external rectifiers or force the electric companies to install one on site (and charge you back in increased rates).

  • by Hymer ( 856453 ) on Wednesday August 09, 2006 @03:14PM (#15875926)
    380 V DC is the battery voltage on a PowerWare UPS... they have simply removed the DC-to-AC converter and run on battery voltage, and most of the loss is in that DC-to-AC converter.
    A switch-mode PSU needs DC, so AC from the wall goes first to a rectifier, then to an HF generator (100 kHz or more), then to a relatively small transformer (HF = small loss = high transformer efficiency), then again to a rectifier, and then to some voltage regulators (+12 V, -12 V, +5 V, -5 V, +3.3 V).
    It's a little oversimplified, maybe, but this is the basic idea. The point is to get rid of a big, heavy, and (given the price of copper) expensive transformer.
    Your other idea, 3x440 AC, makes the initial rectifier very problematic; usually you just use 2 or 3 standard PSUs and connect them to different phases.
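The DC argument in this comment amounts to multiplying fewer conversion-stage efficiencies together. A sketch with invented stage figures (none of these numbers come from the article or the demonstration):

```python
from math import prod

def chain_efficiency(stage_efficiencies: list[float]) -> float:
    """Overall efficiency of cascaded power-conversion stages."""
    return prod(stage_efficiencies)

# Conventional path: UPS rectifier -> UPS inverter -> server PSU (rectify + DC-DC)
ac_path = chain_efficiency([0.95, 0.92, 0.85])

# DC path: one facility rectifier -> server DC-DC stage only
dc_path = chain_efficiency([0.95, 0.90])

print(f"AC path: {ac_path:.1%}, DC path: {dc_path:.1%}")
```

With these assumed figures the DC chain delivers roughly 13% more of the input power to the load, in the ballpark of the 15% the demonstration claims; the point is structural (one fewer lossy stage), not the specific percentages.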

    --

    My UPS needs 40 A at 220 V (single phase), and my power supplier denies me that...
  • The main reason... (Score:3, Insightful)

    by cr0sh ( 43134 ) on Wednesday August 09, 2006 @04:00PM (#15876240) Homepage
    The main reason you won't see 12 VDC at home (from outlets around the house) is resistance. To counter resistance in the circuit, you would need to increase the thickness (gauge) of the wire, because as wire gets longer (and thinner), resistance goes up. Even a few ohms per 10 feet will kill you, voltage-drop-wise. If you don't believe me, measure the resistance of a thousand feet of one pair in a Cat5 spool, run Ohm's law over it, and check what wattage it takes to drive 12 volts at one amp through that (extra credit if you can figure the voltage drop). Ultimately, it would be an unsafe solution. This is why we use AC, instead of DC, for power distribution (look into the history of DC vs. AC in the Edison vs. Tesla/Westinghouse days if you don't believe me).
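The Cat5 thought experiment works out roughly as follows. The 26 Ω per 1000 ft is an assumed round figure for 24 AWG copper, one conductor:

```python
def voltage_drop(current_a: float, ohms_per_1000ft: float, length_ft: float) -> float:
    """Round-trip voltage drop: current flows out and back, doubling the resistance."""
    loop_ohms = 2 * ohms_per_1000ft * (length_ft / 1000)
    return current_a * loop_ohms

drop = voltage_drop(1.0, 26, 1000)   # 52 V of drop trying to push 1 A
print(f"{drop:.0f} V dropped in the wiring alone")
```

52 V of drop against a 12 V source means the load can never actually draw that amp; the run fails outright, which is the "unsafe/unworkable" conclusion above.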


    What I could see happening, though, is special small switching transformers built into a standard electrical junction box, "smart" enough to know when a plug is plugged in, at which point the switching transformer connects and supplies, say, 12 V at 10 A to a common downconverter that all the other peripherals plug into (that, or each peripheral converts the 12 VDC independently). In a way, I built something like this once for a desk I had: I hooked up an old Sun pizza-box (SPARCstation?) power supply and created a "bus" of electrical wires running under the desk, connected to screw-terminal bus strips every so often. I ran the 12 V, 5 V, and ground lines along the length of the desk, so I could get 12 V, 7 V, and 5 V feeds from the system. I hooked all my peripherals that had wall warts to the bus, and ran a "power on" green LED to the front of the desk for status. Worked pretty well.

  • by AWhistler ( 597388 ) on Wednesday August 09, 2006 @08:58PM (#15877842)
    Telecom companies have been using DC distribution systems for DECADES because they don't have to lose energy converting back and forth between AC and DC. It's about time the computer industry caught on.
