Low Voltage Power Distribution?

Posted by Cliff
from the wall-wart-motel dept.
thesp asks: "As I look around my apartment, I am continually struck by the plethora of high-voltage AC to low-voltage DC power adapters I use to power my various devices. At a recent estimate, around 30% of the power consumed in my house is via these adapters. From my laptop to my digital music player, and from my mobile telephone to my PDA, each device is down-converting its own power through its own adapter. Double this number to include my partner's devices. Many of these run hot, and are inconvenient to remove/replug to conserve power and outlets. Does Slashdot know of any moves to standardize power delivery to such devices, or of hobby/home-brew projects to distribute low-voltage power from a central power converter? Alternatively, are there reasons why this would not be a simple and effective solution to the proliferation of wall-warts?"
"On closer examination, these adapters seem to fall into three major categories, 7V, 5V and 3V, with the most common being 5V. Despite this, each device uses a different DC plug configuration, which makes efficient use of adapters difficult. It seems to me that, just as AC power is standardised, portable electronics power requirements should also be standardised, with a standard wall outlet and car outlet at, say, 5V, and a standard device cable and interface. Electronics manufacturers would save money on power adapters, and the consumer would have the cost of the converter written into home construction or automobile construction costs. No longer would we have to lug 4 separate power adapters with us on an overnight business stay to power our various equipment."
This discussion has been archived. No new comments can be posted.

  • by TripMaster Monkey (862126) * on Friday February 17, 2006 @06:53PM (#14746310)

    Article is a dupe...original discussion can be found here [slashdot.org], which amusingly enough, is itself a dupe of this [slashdot.org] discussion. Even more amusing is the fact that all of these submissions share the same editor.

    Way to go, Cliff...a dupe hat trick. Zonk has nothing on you.
  • A few reasons... (Score:4, Informative)

    by Tyler Eaves (344284) on Friday February 17, 2006 @07:01PM (#14746359)
    1. You can't (simply) transform DC voltage to a different voltage. This can be done very efficiently with AC. The 120v to 5V (or whatever) in your power supply is done before the AC is rectified to DC.

    2. Low voltage == High losses, esp. with DC.
  • Re:A real problem (Score:3, Informative)

    by drinkypoo (153816) <martin.espinoza@gmail.com> on Friday February 17, 2006 @07:04PM (#14746374) Homepage Journal
    Actually, while there is no standard, I've definitely noticed that manufacturers are tending to use the same kind of plugs for the same voltages more and more. It seems like everything I've got that's 12VDC has the same plug, and I really mean everything, from my Radio Shack A/V transmitter unit, to the Intel webcam (from long long ago) that I connected to it, it's all the same size barrel connector. Obviously it's not standard but it's getting better.
  • Re:A few reasons... (Score:3, Informative)

    by amorsen (7485) <benny+slashdot@amorsen.dk> on Friday February 17, 2006 @07:13PM (#14746426)
    You can't (simply) transform DC voltage to a different voltage.

    Actually transforming DC is way cheaper and more efficient than transforming AC...

    The 120v to 5V (or whatever) in your power supply is done before the AC is rectified to DC.

    The 120V to 5V transformation is done by treating the AC as a fluctuating DC signal, and doing DC conversion. It is less efficient than proper DC to DC conversion, but not much, and it's way more efficient than using a traditional transformer.

    It would be very nice to have say 48V DC around the house. Devices could easily have 48V to 5V or whatever switching supplies built in -- they would be small enough and give off so little heat that they could be inside the box instead of being wall-warts.

  • Re:A few reasons... (Score:3, Informative)

    by cr0z01d (670262) on Friday February 17, 2006 @07:25PM (#14746482)
    1. Yes, you can simply transform DC voltages to different voltages. They're called switching power supplies, and you find them EVERYWHERE. You get them off the shelf or build your own, they're cheap, they're light, and they're efficient (90% is not uncommon). Your computer does NOT step down AC to a low voltage then rectify it... it rectifies it to high voltage DC, then steps it down.

    2. Losses have nothing to do with AC or DC, it's just a function of current.

    Let's say you've got 12 AWG wire in your house (not uncommon). Resistance is about .00187 ohm per foot. Let's further say you're running 5VDC across it, and the wire distance to the transformer is 50 feet. A short circuit would suck:

    (5 V)**2 / (.00187 ohm/ft * 50 ft) = 267 W

    Divide this by four to get the maximum power you could deliver to a matched load: about 67 W. Sounds like a lot of headroom, but at that point half your electric bill is going to heating the wires! This is why we have high voltage distribution systems.

    On the other hand, I would like to see cool medium-high voltage DC distribution systems in the home. This would reduce the complexity of power supplies in our electronics: instead of having the rectified input dip to zero 120 times a second, they'd see a steady 200VDC or something.
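The wire-loss figures in the comment above can be reproduced with a few lines of Python. This is just a sketch of the arithmetic, taking the quoted 0.00187 ohm/ft resistance and 50 ft run as given:

```python
# Sketch of the arithmetic above, assuming 12 AWG copper at roughly
# 0.00187 ohm/ft and the quoted 50 ft run to the transformer.
R_WIRE = 0.00187 * 50   # total wire resistance, ohms
V = 5.0                 # supply voltage, volts DC

# Dead short: all the power dissipates in the wire itself.
p_short = V**2 / R_WIRE

# Maximum power transfer: a load matching the wire resistance receives
# V^2 / (4 * R), with an equal amount burned in the wiring.
p_load_max = V**2 / (4 * R_WIRE)

print(f"short-circuit power: {p_short:.0f} W")   # 267 W
print(f"max power to a load: {p_load_max:.0f} W")  # 67 W
```

The matched-load case is the worst useful operating point: any real device would draw less, but the wiring loss grows quickly as the draw approaches it.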
  • Re:High voltage (Score:3, Informative)

    by ralphclark (11346) on Friday February 17, 2006 @07:37PM (#14746538) Journal
    Let:

    P = power dissipation
    I = current
    R = resistance
    V = potential difference (voltage)

    We know that power is a function of voltage and current. For direct current,

    (1) P = V * I

    By Ohm's Law,

    (2) V = I * R

    Therefore

    (3) P = I**2 * R

    So power dissipation is proportional to the square of the current. Given a requirement to deliver some arbitrary amount of usable power to the devices you have plugged in, by (1) you know that if you halve the voltage you must double the current to deliver the same amount of power. But, by (3) you also know that if you double the current you quadruple the power dissipated by the resistance in the cabling. Hence if you step down from say 120V to 12V, you must deliver ten times the current and hence power losses are multiplied by a factor of 100.

    This still wouldn't amount to much in reality as the sort of devices you're talking about are generally rated between 1-10W and therefore you're only delivering current on the order of an Ampere or two per device. Plus of course the resistance in your domestic cabling should be absolutely negligible.

    However, it does explain why the power companies use high tension power lines (tens or hundreds of kilovolts) to transport electricity over long distances. Imagine the amount of current these things carry. When they step the voltage up by a factor of a thousand, the power loss due to resistance in the cables (and over hundreds of miles it'll be a lot) is reduced to a millionth of what it would be if transported at domestic voltage.
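The scaling argument in the comment above can be sanity-checked numerically. The resistance and power figures here are illustrative assumptions, not measurements; only the ratio matters:

```python
# Sanity check of the derivation above: for fixed delivered power P over
# cabling of resistance R, wire loss is I^2 * R = (P / V)^2 * R, so the
# loss scales as 1 / V^2.
def wire_loss(power_w, volts, r_ohms):
    current = power_w / volts        # I = P / V
    return current**2 * r_ohms       # P_loss = I^2 * R

R = 0.1     # ohms of cabling (illustrative)
P = 100.0   # watts delivered to the device

ratio = wire_loss(P, 12.0, R) / wire_loss(P, 120.0, R)
print(round(ratio))  # 100: a 10x lower voltage means 100x the wire loss
```

The same function, with the voltage stepped up by a thousand instead of down by ten, gives the millionfold reduction the comment attributes to high-tension transmission lines.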

  • Re:Ohm's law (Score:5, Informative)

    by toddbu (748790) on Friday February 17, 2006 @07:54PM (#14746630)
    What distance? A few hundred feet throughout the house? The loss would be negligible over that distance.

    Depends on your current draw. Check out this table [windsun.com]. Remember that by the time you wire your entire house, there will be several hundred feet of wire.

    There's a reason we feed houses with AC.

  • by Skapare (16644) on Friday February 17, 2006 @08:23PM (#14746802) Homepage
    There certainly are some difficulties:
    1. There are a lot of different voltage needs I have seen, including: 3v, 4.5v, 6v, 7.5v, 9v, 10v, 10.5v, 12v, 14v, 15v, 16v, and 18v. Some things need (or can accept) AC, others need DC (some can take it filtered while most want reasonably smooth). It would be nice if the voltages were better standardized, but this is not always an option. And often where it is an option, it ends up being traded off with a loss in efficiency.
    2. Voltage drop is more dramatic at lower voltages. Given a specific current and a specific wire resistance, the voltage drop is a constant. Home wiring typically sees voltage drops in the range of 2 to 3 volts with high current loads, which is not much of an issue with 120 volts (less so with 230 volts). But at even 12 volts, that's a rather dramatic drop in voltage.
    3. For the same amount of power, devices at lower voltage use more current, which means even more voltage drop.
    4. Fault current can be an issue. If you have a lot of devices, the total current you might need could be very high. A power supply would need to deliver such current. A short circuit on a high current source can result in significant damage to everything from the power supply to the house. Surely you would fuse protect each branch circuit. The small "wall wart" power supplies have very small fault current, as seen by the small arc if you short them out (don't try this; you risk blowing the tiny fuse they may have inside). But a power supply that can deliver 25 amps to a normal load can deliver much more than that under a short circuit condition, resulting in damaging arcs.
    5. A central power supply (or transformer if AC is all you need) is going to have its own level of power waste, anyway. While it can probably be designed with better efficiency, it won't really make up for what's lost in the wiring.

    If you have a cluster of devices of all the same voltage at the same location, then it would make sense to have a common power supply. Otherwise, it makes more sense to use a higher voltage for distribution purposes. The electric utility generally brings power down to your street in the 11kv to 14kv range, and a permanent transformer drops it down to the 120 to 240 volt range you get into your home. Distributing power at 240 volts would not even be considered beyond at most 100 to 200 meters. Every time the voltage goes up by 2, the distance can go up by 4, since the current is cut in half, which means the voltage drop is cut in half, which has even less effect on twice the voltage. When they run the voltage at 50 to 100 times as much, they can deliver power over substantial distances. Cutting voltage to 1/10 as much means you can deliver the same amount of power to only 1/100 the distance.

    Incandescent lights actually work better at lower voltage, especially for bulbs of lower wattage. Normally a low wattage bulb requires greater resistance in the filament. That means the filament must be longer and/or thinner. That means it is more prone to mechanical shock damage. It also has to run at a lower temperature, producing a more orange light (which in some cases is what is desired). The lower temperature wastes power since more is emitted as infrared instead of usable light. By changing the bulb design to a low voltage like 12 volts, the same power level can have a shorter, thicker, hotter filament, which can run more efficiently, even making up for the loss involved in having a transformer converting the voltage.

    The reason I mention low voltage lights is to point out that they are rather standard at 12 volts (a few use 24 volts), yet transformers are generally located close to where the lights are, rather than in a central location which would require the power be distributed in low voltage form. If a central low voltage source were practical, low voltage lighting would be the first to use it. But with very few exceptions, they don't do it this way.

    I once considered running lots of stuff in my house on lo
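The voltage-versus-distance rule of thumb in the comment above (double the voltage, quadruple the run) can be checked numerically. The resistance-per-foot value and the 3% drop budget below are illustrative assumptions, not wiring-code figures:

```python
# Check of the rule of thumb above: at a fixed load power and a fixed
# allowed *fractional* voltage drop, doubling the distribution voltage
# quadruples the permissible run length.
R_PER_FT = 0.0016   # ohms per foot of conductor (illustrative)

def max_run_ft(volts, power_w, drop_frac=0.03):
    """Longest one-way run that keeps the voltage drop under drop_frac."""
    current = power_w / volts               # I = P / V
    allowed_drop = volts * drop_frac        # volts we can afford to lose
    # Round-trip drop is I * R_PER_FT * 2 * length.
    return allowed_drop / (current * R_PER_FT * 2)

P = 100.0  # watts at the load
ratio = max_run_ft(24.0, P) / max_run_ft(12.0, P)
print(round(ratio))  # 4: double the voltage, quadruple the distance
```

Both factors of two come out of the formula: twice the voltage means half the current (half the absolute drop) and twice the drop you can tolerate.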

  • Re:Ohm's law (Score:3, Informative)

    by stoborrobots (577882) on Friday February 17, 2006 @09:15PM (#14747033)
    ... according to the chart, a #10 (6mm^2) wire (which, while by A/C standards is huge, isn't really that big at all) will get you 216 feet at 10 amps.

    You might note that that applies at 120 Volts, not 12V - at the lower voltage your #10 gets you a whopping 22 feet. For 200 feet at 12V you need 1/0 gauge wire, which is ten times the cross-section, and three-and-a-half times the diameter...

    Again, not huge in real-world terms, but bigger than you imagine...

    I think the biggest pitfall is making sure you don't deliver too much (or little) current to the devices you plug in. It would be very bad to deliver 10 amps to a device which is expecting 300 milliamps, or 300mA to a device expecting 2A.

    And thus your power source would be a fixed-voltage source, not a fixed-current one. Technically, only raw components need to be protected from over-current situations - any (properly-designed) circuit should account for the max current going across any component within it, and prevent it from going overspec. Consider that most wall-warts do not limit the current being drawn from them - draw enough current, and you'll simply make the adapter overheat and melt down.
  • by Jozer99 (693146) on Friday February 17, 2006 @09:29PM (#14747089)
    Point taken, but here the problem lies not in math, but in hardware. It is easy to change high voltage AC into low voltage DC with relatively high efficiency (70-80%). It is VERY hard to change the voltage of DC with high efficiency (like 30%). So you end up wasting lots of power that way. Plus, 120V AC current, if you get shocked, hurts like a B#TCH, but just leaves your ego bruised. 120V DC current will instantly cause your heart to stop. Better have a friend with a defibrillator ready every time you plug in that laptop, or turn on that lamp.
  • by Jozer99 (693146) on Friday February 17, 2006 @09:49PM (#14747208)
    Yes, but this does not outweigh the advantage of better transformers for AC. Back in the early 1900s, there was a huge battle over whether our power infrastructure would be AC or DC. Many great minds had a huge stake in this debate. Eventually, AC won out, mostly because it was easier to transform to higher or lower voltages, making sending it long distances much more efficient.

    BTW, Tesla was the oddball; he was all for wireless electricity. Sadly, his proposed wireless transmission device, the Tesla coil, had a nasty habit of electrocuting people who stepped within a 20 foot range. Not to mention it was hideously inefficient.
  • Re:Multiphase power (Score:3, Informative)

    by Skapare (16644) on Friday February 17, 2006 @10:31PM (#14747349) Homepage

    DC does not require any larger conductors than AC does, for the same voltage and current. You must be assuming low voltage in reference to DC.

    Three phase is only marginally better than single phase for converting AC to DC. And unless the power supply is a very complex and expensive type, it will result in a high level of harmonics and a low power factor on the AC source due to the rectification cycles. On a large scale this could also overload the neutral conductor.

    Three phase is generally good for motors only above the 1 horsepower level. Many home appliances would not benefit from it. A few might (the big ones), but not all areas get three phase power, so the dominant appliance products use single phase power.

  • Re:A few reasons... (Score:3, Informative)

    by Jeff DeMaagd (2015) on Friday February 17, 2006 @11:01PM (#14747441) Homepage Journal
    AC transformers are even more efficient than DC-DC converters, 99%+ efficient is not uncommon. 90% efficiency on DC is available, but for the cut-throat consumer electronics market, the extra cost means that they go with cheaper units with maybe 70% efficiency.

    There are still some nearly unsolvable problems with higher voltage DC as a distribution system. For one, arcs start easier on a 48VDC system, and arcs are harder to break because current can just follow the ionized trail and is easily sustained. This causes a safety issue, and is one reason why few autos have a 48VDC system.

    Incidental arcs in AC systems are easily broken and die out on their own: the current goes to zero twice each cycle, and the ionized path disperses too quickly for the arc to re-strike on the reverse half-cycle.
  • by Lewie (3743) on Friday February 17, 2006 @11:54PM (#14747595)
    It is easy to change high voltage AC into low voltage DC with relatively high efficiency (70-80%). It is VERY hard to change the voltage of DC with high efficiency (like 30%). So you end up wasting lots of power that way.

    You have got that backwards. It's hard and expensive to change 120 down to low voltage DC with any decent efficiency, whereas efficient (>90%) DC-to-DC is cheap and straightforward. Transformers are expensive, as are high-voltage rated components.
  • by scheme (19778) on Saturday February 18, 2006 @01:00AM (#14747832)
    I figure with a large UPS and some sort of redundant power-supply, you could feed a number of computers with 12V lines and a picoPSU-120 12V DC-DC ATX power supply. Has anyone tried this yet? I've never worked with high-density hardware (like blades) but I'd imagine that each blade is certainly not using its own PSU.

    Check out the specs on telco equipment. A lot of it runs on 48 VDC with special 48 VDC power supplies. You can get a lot of networking gear that comes with 48 VDC power supplies. I think there are probably computers that have the same ability.

    Of course this is all pretty expensive since it's intended for telco companies.

  • by RzUpAnmsCwrds (262647) on Saturday February 18, 2006 @05:37AM (#14748551)
    It is VERY hard to change the voltage of DC with high efficiency

    Not true. DC-DC converters have existed for years, and they are highly efficient. Take, for example, the DC-DC converter on your motherboard - if it were only 30% efficient, it would be dissipating more heat than the CPU. Fortunately, DC-DC converters are generally closer to 90-95% efficient.

    Take, for example, the picoPSU [mini-box.com] - it outputs 120W at various voltages (from a DC source) and it doesn't even have a heatsink.
  • by aivankovic (89462) on Saturday February 18, 2006 @08:20AM (#14748923)
    The question of DC vs AC for electricity distribution was the subject of the conflict between Edison and Tesla in the 19th century. You can read more here:

    http://en.wikipedia.org/wiki/War_of_Currents
  • by AlterTick (665659) on Saturday February 18, 2006 @06:29PM (#14752058)
    Microwaves and CRTs upconvert power to way above 110V anyway, there's no reason why they can't do that with a 12V input.

    Yes there is. A 1200W microwave draws 10 amps at 120V. At 12V it would draw 100A. You have any idea how thick the wire has to be to handle 100A?
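The current figures in the reply above are just I = P / V:

```python
# The arithmetic behind the reply above: I = P / V.
P = 1200.0  # watts, the microwave in question

print(P / 120.0)  # 10.0 amps at 120 V
print(P / 12.0)   # 100.0 amps at 12 V
```

A tenfold drop in supply voltage means a tenfold rise in current for the same appliance, which is what drives the wire-gauge problem.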

  • DC Vs AC Safety (Score:3, Informative)

    by Stephen Samuel (106962) <samuel&bcgreen,com> on Sunday February 19, 2006 @10:41AM (#14754671) Homepage Journal
    According to my electronics instructor, electricity should generally be treated with respect, but DC is a good bit more dangerous than AC.

    Edison, for some unknown reason, hated Tesla and tried to kill his ideas of AC power distribution. He apparently had the (AC-powered) electric chair created as a PR stunt so that people would know that AC power was being used to kill people -- but it turned out to be relatively difficult to reliably kill people with AC power, because an AC shock acts as an impromptu defibrillator, so you essentially have to cook your victim.

    DC, on the other hand, causes the heart to go into a constricted mode that is harder to recover from. I was taught to always handle AC with one hand only, if at all possible (to avoid a possible circuit across the heart).

  • by theLOUDroom (556455) on Monday February 20, 2006 @12:08AM (#14758864)
    Fortunately, DC-DC converters are generally closer to 90-95% efficient.

    At the EXACT current output they were designed for.

    Sure, you can get tons of efficiency if you're designing for a known load that doesn't vary too much. That is not the situation we're talking about here. One minute you might be drawing 10mA, the next minute you might want 10A; the supply is not going to maintain 95% efficiency over that range and still come at a reasonable cost.
