
The True Cost of Standby Power 369

Posted by kdawson
from the 2-bucks-a-month dept.
Luther19 writes, "How much do all of our computers and electronic devices sitting in standby mode cost us? The author of the article concludes that he could save $24.44 per year by switching out wasteful power supplies. The article also touches on a global initiative to cut down on standby power, called '1-Watt': 'The idea has been promoted by the IEA, which first developed an international 1-Watt plan back in 1999. Countries like Australia and Korea have signed on officially, while countries like the US require 1-Watt in government procurement, which will have ripple effects throughout the economy. The goal of the program is to have standby power usage fall below 1W in all products by 2010.'" It's estimated that in industrialized countries, devices on standby consume on average 4% of the power used.
This discussion has been archived. No new comments can be posted.

  • by LiquidCoooled (634315) on Monday October 16, 2006 @04:40PM (#16458355) Homepage Journal
    But I don't think people are going to switch out their PSU mid hardware life.
    Push these improvements to the manufacturers and make the next generation of devices last longer per watt.
    Make them better when they are both on and off.

    Also folks, switch off your keyboard indicator lights to save power.

    • Re: (Score:3, Informative)

      by stecoop (759508) *
      I guess you're speaking for people that buy their computers pre-built; which, by the way, might get your geek card revoked on /. for not building your own system. When I shop for a power supply, I try to find the most efficient, cost-effective PSU. That way, you cut out the middleman handing you a power supply you never wanted in the first place.
    • by pete6677 (681676)
      This is one of those "problems" that costs a lot more to solve than to just deal with. Who cares if your LCDs are costing you $5 a year to power?
    • by purpledinoz (573045) on Monday October 16, 2006 @04:58PM (#16458681)
      Make electricity more expensive, then people will make a huge effort to save power... Take advantage of capitalism.
    • by kalirion (728907)
      Personally, I turn off my computer when I'm not using it. If I shut down from windows I still see the keyboard lights and the mouse laser is still active though - I have to manually use the power switch to turn them off.
  • by Anonymous Coward on Monday October 16, 2006 @04:41PM (#16458375)
    Then it will be using 0 watts. Much less than using standby.
    • I wonder if, when large flash drives become common internal components to desktops or laptops, this type of issue might go away. The flash memory could hold the state of the user's last log in session. What is it other than that, that requires 'stand-by mode' to start the computer quicker?
      • Re: (Score:3, Informative)

        by drinkypoo (153816)

        What is it other than that, that requires 'stand-by mode' to start the computer quicker?

        You missed the point completely. Congratulations.

        ATX PCs are never actually turned off. There is always a trickle of power through the PSU and part of the motherboard, in order to support ATX Soft Power, Wake-On-Lan, Wake-On-Keypress, Wake-From-USB, etc.

        Typical CRT monitors are never actually turned off. They keep the tube charged so that you don't have to wait for it to "warm up" when you hit the power switch.

        • Re: (Score:3, Insightful)

          by necro81 (917438)
      Your point about ATX PCs, CRTs, and other typical office equipment is well made. It is for those reasons that, after I shut down my workstation at the end of the day, I actually reach down and turn off the power strip they're all plugged into. Voila! No power draw overnight. It probably saves my company a few cents a night - one machine out of about 50,000 on the campus - but I feel better about the principle of the thing.
        • by homer_ca (144738) on Monday October 16, 2006 @07:48PM (#16461303)
          To put some actual numbers on it, I've measured ATX standby consumption on a few PCs with an AC power meter. They range from a low of 2W for an NForce4 system with a good quality Seasonic PSU to 8W for an NForce2 system with a cheapo no-name PSU that had an LED fan that stayed lit and kept spinning slowly even in standby. The NF4 system could also suspend to RAM which is almost the same as standby but with power still going to RAM. That used about 8-9W. That's bad compared to shutting off, but better than the 50-60W from the computer powered on and idle.
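          A quick sketch of what draws like those cost over a year, assuming $0.10/kWh (the rate is my assumption, not a figure from the post):

          ```python
          # Annual cost of a constant draw, at an assumed $0.10/kWh.
          RATE_PER_KWH = 0.10   # assumed typical residential rate, not from the post
          HOURS_PER_YEAR = 24 * 365

          def annual_cost(watts, rate=RATE_PER_KWH):
              """Dollars per year to draw `watts` continuously."""
              return watts * HOURS_PER_YEAR / 1000 * rate

          for label, w in [("good PSU standby", 2), ("cheap PSU standby", 8),
                           ("suspend-to-RAM", 9), ("on and idle", 60)]:
              print(f"{label} ({w} W): ${annual_cost(w):.2f}/yr")
          ```

          So the difference between a good and a cheap PSU in standby is only a few dollars a year per machine, while leaving the box on and idle costs an order of magnitude more.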
      Disk drives suck power at a terrific rate. Reducing the power consumption of mass storage is one of the big opportunities in saving electricity. Between reducing CPU power consumption, replacing CRTs with LED displays, and reducing the rotating mass of disk drives, there's a lot that can be done to improve the power profile of a PC. Once you've reduced the PC's power requirements, you can downsize the power supply to add even more savings.
    • by raehl (609729) <(raehl311) (at) (yahoo.com)> on Monday October 16, 2006 @06:43PM (#16460465) Homepage
      Then you don't use any gas, and the world is saved! Of course, your food spoils before it can get to you, but you didn't waste any petroleum!

      0 watts is better than >0 watts, but only if EVERYTHING ELSE IS EQUAL.

      But it's not. If you turn off your computer instead of leaving it on, that affects many things other than just how much power you are using while the computer is off. It means you have a boot sequence where you use a *LOT* of power. And where you do a LOT of reading/writing to/from disk. And you have to sit around and wait for your computer to boot. And then reopen everything you closed when you shut down.

      Saving $24/year in power is not worth spending $25/year on failed hard drives. Or on time lost turning your computer on and off again. Or on the power you use booting the thing back up.
  • by SurturZ (54334) on Monday October 16, 2006 @04:42PM (#16458381) Homepage Journal
    This is a serious problem and we need to change, and change now. I propose that instead of "Access Standby" mode we IMMEDIATELY redesign ALL electronic items to have a "Mode Execute Ready" state which uses less power.
    • "Mode Execute Ready"

      Is this a joke of some sort? I've never heard of that, and it sounds like a pretentious name to boot.
      • Re: (Score:2, Informative)

        by Stoertebeker (1005619)
        You really need to go back and re-read HHGTTG:

        Ford flipped the switch which he saw was now marked 'Mode Execute Ready' instead of the now old-fashioned 'Access Standby' which had so long ago replaced the appallingly stone-aged 'Off'.

  • by Anonymous Coward

    I have recently switched to a steam powered laptop. Nothing like coal and water.

    -----------

    James Watt XXIII

  • Pareto (Score:5, Insightful)

    by EaglemanBSA (950534) on Monday October 16, 2006 @04:43PM (#16458399)
    The way we engineers do it is by Pareto analysis - you try to cut out the largest portion of your power consumption first. I'd like to see what lines up as the number one, two, and three consumers of electricity, how that compares to the cited 4%, and how much is saved by going to standby mode as it stands today. I'm guessing that there are better places to focus the effort, but perhaps that's just my own bias.
    • Re: (Score:3, Informative)

      by onion2k (203094)
      That's a very sensible approach, but to ignore something that could save 4% of 'unused' power with practically no effort would be idiotic.
      • Re:Pareto (Score:5, Funny)

        by StikyPad (445176) on Monday October 16, 2006 @06:10PM (#16459961) Homepage
        It's not unused. What the Save-a-watt fanatics don't want you to consider is that without standby power, you couldn't turn on your TV with a wireless remote. Just imagine if everyone had to get up to turn on the TV. The only thing we're doing is moving the energy consumption back to FOOD. This ridiculous proposal to eliminate standby power will result in a food crisis of never-before-seen proportions as couch potatoes everywhere compensate for the extra physical activity by eating more.

        So go ahead, call us idiotic. Carry your hip "I won't stand by for standby" signs, and lobby Congress to ban devices that consume "unused power." But when the famine arrives -- and it will arrive -- don't say you weren't warned.

        Now if you'll excuse me, I'm going to go stock up on canned goods.
    • Re:Pareto (Score:5, Informative)

      by boingo82 (932244) on Monday October 16, 2006 @05:12PM (#16458913) Homepage
      Cutting out the largest sources isn't always the place to focus your efforts - allow me to draw a really bad analogy here:

      Analyzing your budget, you decide you need to cut back. While it appears that cutting the $700 mortgage would be the best way to save money, in actuality you're better off cutting out the $19.99 Netflix subscription to movies you never watch.

      If that makes any sense, you'll know what I mean - while cutting the largest consumer of power or money may *seem* like the best place to start, it's often a necessary function which just cannot be cut. However, cutting back on unnecessary waste, even if it's a mere 4%, can be a great investment of effort.

    • Re: (Score:3, Interesting)

      by John.P.Jones (601028)
      Cutting 96% of the 4% standby power is relatively easy to do, much easier than cutting 4% of the other 96%, so guess what??? It's cheaper and more effective to pick the low-hanging fruit.
    • Re: (Score:2, Informative)

      by maxume (22995)
      Buying a new fridge somewhat sooner than you would have otherwise can be a pretty good idea. Especially if the old one had been around for more than 10 years.

      http://www.aceee.org/consumerguide/topfridge.htm [aceee.org]
    • Re: (Score:3, Interesting)

      by fahrbot-bot (874524)
      Perhaps the power company could just send 4% fewer electrons. Problem solved.

      Seriously, here's a question. I noticed that my APC UPS has SmartTrim enabled and the line voltage is high. It hasn't always been this way, and it seems to happen every fall around here. My question is: if I'm paying for kilowatt-hours, and watts are volts * amps, am I paying more when the voltage is higher? If so, is the power co. ripping me off?

      • Re: (Score:3, Informative)

        by Grishnakh (216268)
        My question is, if I'm paying for kilo-watt hours and watts are volts * amps, am I paying more when the voltage is higher?

        Yes, if you have a big resistor connected to your power mains.

        Most equipment uses a certain amount of power. So if the voltage is high, the equipment uses less current (amps). The power is the same. This is especially true for things like switching power supplies, which only switch on (hence the name) a sufficient percentage of the time to get the power they need. Linear power suppli
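        The constant-power point can be put in numbers (the 120 W load is illustrative, not from the post):

        ```python
        # A constant-power load draws less current when line voltage is higher,
        # so volts * amps -- and therefore the billed kWh -- stays the same.
        def current_draw(power_w, volts):
            """Amps a constant-power load pulls at a given line voltage."""
            return power_w / volts

        POWER_W = 120.0  # illustrative load
        for v in (110, 120, 125):
            i = current_draw(POWER_W, v)
            print(f"{v} V: {i:.3f} A -> {v * i:.0f} W")
        ```

        The product volts * amps comes out to the same 120 W in every row, which is why a higher line voltage doesn't by itself raise the bill for this kind of load.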
  • Cost benefit? (Score:5, Insightful)

    by suparjerk (784861) on Monday October 16, 2006 @04:44PM (#16458407)
    I'm not sure the effort and materials costs associated with replacing a power supply are worth $24 per year...
    • ...but those of the environment. Think how much less we'd pollute if we could close down 4 out of every 100 power plants.
      • Re: (Score:3, Insightful)

        by chgros (690878)
        Think how much less we'd pollute if we could close down 4 out of every 100 power plants.
        I'm guessing about 4% less. That's still not much.
        • by crossmr (957846)
          Every little bit helps. 4% here, 3% there on something else, etc. 4% less pollution from power plants could be the equivalent of taking 20 million cars off the road for all we know.

      • by voidptr (609)
        Slightly less than sending all those slightly less efficient but perfectly functioning power supplies to the junkyard + the environmental impact of building new ones?
    • by Vellmont (569020) on Monday October 16, 2006 @05:09PM (#16458855)
      You're right, few people are going to bother with replacing power supplies because it's just not worth it economically to replace them.

      But, the point is that if the industry had spent just a few dollars (maybe pennies) more in designing the device, they'd be saving you money and it'd be worth the extra cost. Right now most consumers have no idea how much money these inefficient electronics cost them, so there's no incentive for manufacturers to bother.
        But, the point is that if the industry had spent just a few dollars (maybe pennies) more in designing the device, they'd be saving you money and it'd be worth the extra cost.

        Yes by all means. Let's get rid of those stupid little LEDs on the front of all my new A/V components telling me "I'm turned off right now, but if I were turned on this light would be off". Sure the power bill effects are marginal at best, but it is the annoyance factor of all those things with lights on at night. There is no good reas
        • Re: (Score:3, Informative)

          by Illserve (56215)
          Those LED's cost next to nothing, I would guess on the order of pennies per decade.

          The standby cost is the result of inefficient transformers in power supplies that manage to suck power from the grid without doing anything with it.

          • Re: (Score:3, Informative)

            by Iron Condor (964856)

            Those LED's cost next to nothing, I would guess on the order of pennies per decade.

            This is only marginally on topic, but: Why guess? Is it really so hard to multiply a couple of numbers? Are you really saving yourself effort by operating on the basis of ignorant guesses when you could inform yourself within a few seconds? Take a run-of-the-mill number for an LED, say 1.6V 20mA, makes 32mW, makes 32mWh per hour. Times 24 is 768mWh per day, comes to somewhere under 300Wh per year. Around here we pay ~10c/
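            Carrying that arithmetic through at the post's ~10c/kWh:

            ```python
            # Finishing the post's arithmetic: 1.6 V at 20 mA, run 24/7,
            # priced at the post's ~10 cents per kWh.
            volts, amps = 1.6, 0.020
            watts = volts * amps                    # 0.032 W = 32 mW
            wh_per_year = watts * 24 * 365          # ~280 Wh/year
            dollars_per_year = wh_per_year / 1000 * 0.10
            print(f"{wh_per_year:.0f} Wh/yr -> ${dollars_per_year:.3f}/yr")
            # about 3 cents a year, i.e. a few tens of cents per decade
            ```

            So the parent's "pennies per decade" guess is roughly the right order of magnitude; it works out closer to a few tens of cents per decade for one always-on LED.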

    • by MasterC (70492)

      I'm not sure the effort and materials costs associated with replacing a power supply are worth $24 per year...

      True, but how about when you have to buy a new PSU (new computer/device) or replace your current PSU when it gives out? Then it becomes economical.

      Not quite the same thing with incandescent [wikipedia.org] vs. CFL [wikipedia.org] though. You'd be better off replacing all of them right now, because the marginal cost of a regular bulb (~$0.50) is much less than the energy savings of a CFL (~$36.00, YMMV).
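      One way a ~$36 figure could arise - all the numbers here are my assumptions, not from the post: a 60 W incandescent replaced by a 13 W CFL, over an assumed 8,000-hour CFL life at $0.10/kWh:

      ```python
      # Hypothetical CFL-savings arithmetic -- the wattages, bulb life, and
      # electricity rate are assumptions, not figures from the post.
      inc_w, cfl_w = 60, 13
      cfl_life_h = 8000
      rate_per_kwh = 0.10
      savings = (inc_w - cfl_w) * cfl_life_h / 1000 * rate_per_kwh
      print(f"${savings:.2f}")   # the same ballpark as the post's ~$36
      ```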

  • Check it yourself (Score:5, Interesting)

    by ScooterBill (599835) * on Monday October 16, 2006 @04:45PM (#16458421)
    You can buy a low cost wattmeter that you plug your equipment into and simply read out the power consumption. I've found that a lot of devices in standby take almost no power. Other devices aren't so frugal. I'd like to see some real statistics on this and something like the energystar ratings you see on refrigerators put on computers.
    • Re:Check it yourself (Score:5, Informative)

      by StarfishOne (756076) on Monday October 16, 2006 @04:56PM (#16458621)
      Mind you: it's not always a device with an explicit stand-by mode. I once used such a wattmeter on all devices and learned that my 40W lamp with a seemingly #$%#$% cheap transformer was using 25W while "off"!

      Factoid: if all American households would not use the stand-by mode of their TV, an entire _nuclear_ power plant can be saved on a national level. :S
      • Re: (Score:2, Insightful)

        by Anonymous Coward
        And that is better than saving an entire _coal_ power plant... how?
      • by PitaBred (632671)
        But then I'd have to get off my ass to turn it on.
      • Re:Check it yourself (Score:4, Interesting)

        by Crispy Critters (226798) on Monday October 16, 2006 @05:43PM (#16459491)
        My old HP printer used as much power turned off as it did turned on and waiting. I took to unplugging it. I guess this arises from using an inefficient transformer and putting the power switch on the low voltage side.

        My main problem with the wattmeter gizmo is that I could not use it on the items that I guessed were using a large percentage of the power, namely dishwasher, hot water heater, and dryer. Either the items did not run on 117 VAC or they were wired directly without a plug.

      • Re:Check it yourself (Score:5, Informative)

        by Avian visitor (257765) on Monday October 16, 2006 @05:47PM (#16459577) Homepage
        I can tell you from experience (this is one of the more popular demonstrations in the power engineering lab) that cheap watt-meters can be terribly wrong with loads that are not simple resistors.

        A transformer with no load (probably the case here - most halogen lamps have the switch on the secondary side) is almost a perfect inductive load. Current and voltage are not in sync, and the (real) power is very close to zero.

        Not all instruments can show this correctly. Especially not if they measure voltage and current separately without taking the phase shift into account (as is often the case with cheap stuff). Switching power supplies (almost everything electronic uses one of those today) are also hard to measure. You need a high sampling frequency if you want to accurately measure the power they draw from the mains. Again, consumer instruments don't have this because fast AD converters are expensive.

        Just about the _only_ instrument I would trust outside the lab is the watt-meter the power distribution company installed in your house. These things have to go through very thorough testing before they are approved.
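        The phase-shift point in numbers (230 V and 0.1 A are illustrative; the 85-degree angle stands in for a nearly pure inductive load like an unloaded transformer):

        ```python
        import math

        # Apparent power (V * I, what a naive meter multiplies out) versus
        # real power (V * I * cos(phi), what you are actually billed for).
        def apparent_power_va(v_rms, i_rms):
            return v_rms * i_rms

        def real_power_w(v_rms, i_rms, phase_deg):
            return v_rms * i_rms * math.cos(math.radians(phase_deg))

        v, i = 230.0, 0.1   # illustrative unloaded-transformer numbers
        print(f"{apparent_power_va(v, i):.1f} VA apparent")   # 23.0 VA
        print(f"{real_power_w(v, i, 85):.1f} W real")         # ~2.0 W
        ```

        A meter that ignores phase would report the 23 VA figure, overstating the true draw by an order of magnitude.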
      • Re:Check it yourself (Score:4, Informative)

        by Shadowlore (10860) on Monday October 16, 2006 @05:50PM (#16459643) Journal
        Factoid: if all American households would not use the stand-by mode of their TV, an entire _nuclear_ power plant can be saved on a national level. :S

        Even better it could save coal usage, which puts out more radiation than nuclear plants do, and still pollutes otherwise.
      • Re: (Score:3, Informative)

        by afidel (530433)
        Hell, if the 4% figure is right then we could eliminate ALL nuclear plants simply by eliminating standby mode! The US uses 3.3TW of electricity; 4% of 3.3TW is 132,000MW, or about 30% more than the total output of all nuclear power facilities in the US (99,988MWe source [doe.gov] from this site [doe.gov].) Of course I would be much more in favor of shutting down an equivalent capacity in older coal-fired plants, since the environmental impact would be about an order of magnitude greater =)
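        Checking that arithmetic with the post's own figures (3.3 TW of US usage and 99,988 MWe of nuclear capacity, both as cited there):

        ```python
        # The post's own numbers: 4% of 3.3 TW versus total US nuclear capacity.
        us_load_mw = 3.3e6        # 3.3 TW expressed in MW, as cited in the post
        nuclear_mwe = 99_988      # DOE figure cited in the post
        standby_mw = 0.04 * us_load_mw
        print(f"{standby_mw:,.0f} MW")                     # 132,000 MW
        print(f"{standby_mw / nuclear_mwe:.2f}x nuclear")  # ~1.32, i.e. ~30% more
        ```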
    • by CastrTroy (595695)
      Well, it's really hard to define "running" for a computer. Is it when it's on, sitting idle? Is it when it's running at 100%, spinning every disk, and using every peripheral you have hooked up to it? With a washing machine, or a fridge, it's pretty easy to define the power usage. There are only so many standard operating modes. With a computer, the power consumption varies a lot with usage.
  • 1W from one source (Score:2, Interesting)

    by Moracq (63771)
    Why cut all the devices down to 1W draw each, when I should be able to drop ALL my devices to 1W *total*? Put a 1W IR sensor on my power strip, and then I can turn the strip on and off from a remote! For modern programmable remotes, it's just one more line in my power-on macro, and instead of 6 or 8+W (1W for each device, when you consider TV, VCR, DVD, receiver and my 2 powered tower speakers), you just have the 1W from the "sleeping" power strip.

    It'd get even better if I could teach my Tivo to turn on/off my
    • My wife's TiVo has missed recording shows after the power flips off and on again. TiVo comes back up, but the cable box stays off. So TiVo records blackness.

      Now, the only UPS in the house protects TiVo and the cable box. Its surge suppressors protect the TiVo modem line, the Teevee, the VCR, and the DVD player.
    • Re: (Score:3, Interesting)

      by amorsen (7485)
      Around here you can buy power strips with a special "TV" socket. Plug the TV in the TV socket, and the rest (DVD etc.) in the other sockets. As soon as the power strip detects the TV using less than 20W, it powers off the other sockets. At least that way it's only the TV on stand by.

      You can also get power strips with a USB cable. They only supply power when they detect voltage on the USB line -- so turn off your computer, and the peripherals turn off too. Unfortunately there are computers which won't turn o
      • by jdgeorge (18767) on Monday October 16, 2006 @05:51PM (#16459659)
        Around here you can buy power strips with a special "TV" socket. Plug the TV in the TV socket, and the rest (DVD etc.) in the other sockets. As soon as the power strip detects the TV using less than 20W, it powers off the other sockets. At least that way it's only the TV on stand by.

        Is this [smarthomeusa.com] the one you're talking about? Looks like a good solution, from what I can tell; I'm intrigued. Combine that with using compact fluorescent lights instead of incandescent bulbs where possible, and you can significantly reduce your home's electricity consumption.
  • by jizziknight (976750) on Monday October 16, 2006 @04:48PM (#16458471)
    Why apply this only to standby mode? Why not apply this to devices that are completely powered down as well? I've noticed a significant reduction in power consumption when I've unplugged appliances and other electrical devices (most notably my PC) when they're not in use. Is it that difficult to implement a hard switch within the device? Understandably, we wouldn't want this for devices that are operated via remote.
    • by dch24 (904899)
      I don't understand.

      I've noticed a significant reduction when I've unplugged appliances (most notably my PC) when they're not in use.

      If your PC uses an ATX power supply [wikipedia.org] (introduced in 1995) it never turns completely off. That's the point of the article.

      Or see the post [slashdot.org] in the discussion where a guy's lamp draws 25W when off. Care to clarify?

  • Postings so far have criticized the cost of conversion to save the under-$25/year figure.

    But there's another cost:

    How much does it cost in lost productivity, over a year, while people wait for their monitors and computer to "warm up" from power-save mode every time they've left their desks or done something OFF the computer for too long?

    And for recreational machines: In lost lifetime? How much is YOUR life worth to you?
    • There's also a convenience cost. Is it worth $2 a month to you so your entertainment devices can rapidly turn on?
      • by jroysdon (201893)
        I wouldn't call my PC an entertainment device. It's an information/education device (411, IMDB, Wikipedia, Bible software), family communication (email, email to cell), music library, and banking. We rarely play games on the PCs. I can see it on other devices that are rarely used (tv, home stereo, etc.).

        I know I hate certain standby devices with long warm-up times, like printers and photo copiers. They need intelligent clocks built in to watch usage patterns. M-F at 8am (or whenever usage usually occur
  • Are the five seconds to bring your computer out of hibernate really that critical? Hibernate takes 0W if you switch off your PSU when you walk away.
    • by Tiger4 (840741) on Monday October 16, 2006 @05:12PM (#16458923)
      In the five seconds I'm waiting for it to restart, I'll forget why I wanted to turn it on. Modern society functions on IMMEDIATE gratification of desires. Are you trying to kill us all?
    • by CreatureComfort (741652) * on Monday October 16, 2006 @05:29PM (#16459219)

      Is $25 a year that critical to your budget? Hell, I'm reading this thread while drinking a bottle of Scotch that cost 6 times that much.

      And let's see... 5 seconds for turning on a PC, figure I do that a minimum of 3 times per day, 300 days per year. That's 75 minutes (1.25 hours) per year. At my current billing rate that equates to $75 per year. So I'm supposed to give up $75 of my time to save $25?
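      The time arithmetic checks out (the $60/hour billing rate is implied by the post's $75/year figure, not stated directly):

      ```python
      # 5 seconds per boot, 3 boots a day, 300 days a year, valued at an
      # implied $60/hour billing rate.
      seconds_per_year = 5 * 3 * 300          # 4500 s
      hours_per_year = seconds_per_year / 3600
      billing_rate = 60.0                     # $/hr, implied by $75/yr
      print(f"{hours_per_year * 60:.0f} min/yr")         # 75 min
      print(f"${hours_per_year * billing_rate:.0f}/yr")  # $75
      ```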

  • It's somewhere between cold and hypothermia (metric) in here, so I don't feel too bad about leaving the old space-heater Athlon on. In fact, since I moved to this benighted part of the world it hasn't been any other way.

    This is possibly the only advantage of living in England. That and the beer.

    As for people with their AC on... switch it off! You can acclimatise to heat easily as long as you never go near air-conditioned space. It normally takes me a week to get back into the swing of things but I'm pe
    • by brunes69 (86786)
      I don't think you should generalize so much there - I know for me personally, I find it much harder to acclimatise to heat than to cold. I live in a cold country too (Canada), and during the winter, if I had it my way, the heat in our house would never be above 12 degrees C, since it doesn't bother me to operate at that temp (my fiancee, on the other hand...)

      But in the summer, once it gets above 22 degrees C, it feels like a sauna to me. I have to run the AC at anything over 24 or else I simply cannot sleep.
    • It normally takes me a week to get back into the swing of things but I'm perfectly comfortable in 90+ weather - the dehumidifier is way more important than the cooling, if you need any mechanical aids at all.

      Unless you are using a disposable desiccant (like silica), a dehumidifier IS an air conditioner, and requires no less energy to operate. It is just one that puts the heat back into the room instead of outside. An A/C can be used as a quite effective dehumidifier if you slow the blower speed. This incr
  • by Rosco P. Coltrane (209368) on Monday October 16, 2006 @04:53PM (#16458557)
    Some always-on devices are just plain stupid. Like computers: remember when computer PSUs had a physical switch that cut the power to the computer? When they replaced that with a soft power button connected to the motherboard, they replaced a perfectly working system with one that didn't bring much at all to anybody - save for people who need to remote-boot through a network card, and people who are too dumb to stop the OS before the machine - and created the hateful power-button-that-doesn't-work-when-the-OS-crashes syndrome. Not to mention the extra power consumption...
    • by Sigma 7 (266129)

      Some always-on devices are just plain stupid. Like computers: remember when computer PSUs had a physical switch that cut the power to the computer?

      They still do - it's located on the back of the power-supply as opposed to being on the front.

      Not to mention the extra power consumption...

      Could be worse - ~1995-1998, there were plenty of computers that automatically turned themselves on (by default) as soon as the phone rang, complete with the two minute bootup sequence. This principle still exists today, as m

  • The amount saved is so minimal. You can make it sound large when you multiply it by the entire population, but if you compare that to the GDP of the nation, the amount saved is even more minimal!

    Plus, who will feed the starving families of the power companies when we all start using $24 less of power each year!
  • micro-generation (Score:3, Insightful)

    by Gothmolly (148874) on Monday October 16, 2006 @04:55PM (#16458619)
    I think this, coupled with a small, cheap solar install on every rooftop, could significantly cut power usage. With advances like this [yahoo.com], it's doable - not to power your house, but to help distribute generation capacity and smooth out load peaks. Of course, solar cell manufacture consumes a lot of energy and can create industrial waste issues, but the point is to keep the power generation somewhere dirty and concentrated, rather than smogging up everything.
    • Re: (Score:3, Interesting)

      by dbIII (701233)
      Don't go for the solar cells first. Consider that it makes a lot more sense to heat up water directly from the sun than going from solar to electricity to heat. I'm surprised that there aren't any solar air conditioners out there - that's another situation where heat input to a fluid is really what you need, not electricity.
  • by EricBoyd (532608) <.moc.oohay. .ta. .dyobcirerm.> on Monday October 16, 2006 @04:56PM (#16458625) Homepage
    I just finished a comprehensive audit of all the electricity drawing devices in my house:

    http://digitalcrusader.ca/archives/2006/10/househo ld_energ.html [digitalcrusader.ca]

    I learned that my stereo system consumes 22W on "standby" and only about 35W when in use - what a total waste! So I put it on a power bar. My older TV is 0W on standby, and all the newer wall warts that I have seem to be OK as well - 4 of them together only rate 1W. Your mileage may vary :-)
    • by centron (61482)

      Aren't you being hard on that poor stereo? I mean, it has to power an infrared receiver so that the remote control will work. According to this government study [lbl.gov], the IR receiver alone uses 0.05 watts all by itself! Once you factor in the overhead, all the wires and circuits and ohms and such, 22 watts makes complete sense.





      Yep. Complete sense...

    • by jez9999 (618189)
      Your 'older' TV... is that the "32" RCA" you list on the page? If not, how much is that on standby?

      If things can use virtually no power on standby, I'll never understand why everything isn't designed that way. It irritates me when people go around saying not to put stuff on standby - why aren't they forcing (not telling, forcing) manufacturers to make a better, more efficient standby?
  • Could someone tell me why you would even use stand by power? I just don't see the benefit of saving 1-2 seconds when modern consoles take longer than that just loading the main menus and hence defeat the entire point..

    I mean why is it so difficult to just turn it on?
    • by ClayJar (126217) on Monday October 16, 2006 @05:23PM (#16459095) Homepage
      "Standby power" is what you have when you can use the remote control to turn on the TV, DVD player, etc. It is powered up enough to be able to respond to the remote, i.e. it is standing by for your commands. It need not be a remote, however. A printer with an electronic power button (like a little HP inkjet, for example) is in standby mode, as opposed to the gargantuan EPSON 132-column industrial dot matrix printers that have what looks like a circuit breaker to turn them on and off. A touch-lamp would be using standby power, while a bulb on a mechanical pull-chain switch would not.

      This is only very loosely related to your idea of laptop-style standby mode.
  • From the article: Over the course of a device's lifetime, the cost of all that standby power can actually exceed the cost of having the device on.

    So, using full power costs less than using standby? I suppose I can cool down the kitchen by leaving the fridge door open, too? Maybe I should leave the hot water running to cut down on my power bill?
    • by 241comp (535228) on Monday October 16, 2006 @05:22PM (#16459079) Homepage
      I don't think they mean that the per-hour cost of standby power exceeds the per-hour cost of having the device on but rather that you may have a device which uses 7W in standby 22hrs/day and 60W on for 2hrs/day (LCD TV?). This means that on the average day, the device uses 154W in standby and 120W while in use. Over the lifetime of the device (say, 900 days), the device uses 30KW more in standby than it did while in use. Another example of this is your hot water heater/tank. If you have an older, less insulated tank, you may be able to reduce your hot water power usage by more than 50% by getting an on-demand water heater which eliminates standby power usage.
      • Re: (Score:3, Informative)

        by RzUpAnmsCwrds (262647)
        This means that on the average day, the device uses 154W in standby and 120W while in use.


        You mean 154Wh, not 154W.
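        With the units corrected to watt-hours, the grandparent's arithmetic can be sketched as follows (the 7 W / 60 W draws, the 22 h / 2 h split, and the 900-day lifetime are all the grandparent's assumed figures):

        ```python
        # Standby vs. active energy for a device that idles most of the day.
        # All figures are the grandparent post's assumptions (an LCD TV, perhaps).
        STANDBY_WATTS = 7
        ACTIVE_WATTS = 60
        STANDBY_HOURS_PER_DAY = 22
        ACTIVE_HOURS_PER_DAY = 2
        LIFETIME_DAYS = 900

        standby_wh_per_day = STANDBY_WATTS * STANDBY_HOURS_PER_DAY  # watt-hours
        active_wh_per_day = ACTIVE_WATTS * ACTIVE_HOURS_PER_DAY     # watt-hours

        # Lifetime excess of standby over active use, in kWh.
        extra_kwh = (standby_wh_per_day - active_wh_per_day) * LIFETIME_DAYS / 1000

        print(standby_wh_per_day, active_wh_per_day, extra_kwh)  # 154 120 30.6
        ```

        So the "30KW" in the grandparent should likewise read about 30.6 kWh over the device's lifetime.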
  • I got one of these to play with last year:

    http://www.thinkgeek.com/gadgets/electronic/7657/ [thinkgeek.com]

    While hooking it up to every little device I could find, I found that battery chargers, such as those for your drill or cell phone, are using electricity even while their respective devices are not connected to them. Granted, it's not much power, but with 5 or 6 of them plugged in and no devices attached, that's wasteful. So unplug 'em if you aren't using them.
  • You know, for about half of the year, I'm paying for heat in one form or another. This "wasted energy" helps to heat my house during that part of the year. Additionally, I pay less for electricity during nights and weekends - it's cheaper than propane to heat with electricity during those hours. So, an "inefficient" electrical device, actually _saves_ me money if I'm paying to heat the house.
    • Improving the efficiency of a PC with a low-loss power supply has knock-on benefits. Where I work, they have to cool the place even in the wintertime (in Boise, Idaho!). Imagine if they could reduce the waste heat enough to stop having to cool the place even when it's below zero outside. The knock-on benefits would be year-round, of course.

      Another poster pointed out that incandescent bulbs are a horrible waste. Gummint could help by switching out traffic lights, street lighting, etc., to more efficient LEDs.
  • Um, it would be nice if they could encourage 1 watt as a usage goal while the device is on as well. I can understand things like irons, dishwashers, dryers, refrigerators, ovens, garbage disposals, vacuum cleaners, and heating/air conditioning taking up a big amount of energy while in use. I have no clue how much it costs to run my dishwasher each cycle, or say over a month's time.

    I'd love for the government to work towards most devices using 1 watt or less. Those walkaround phones, TVs, ceiling fans are a few
  • by hibiki_r (649814) on Monday October 16, 2006 @05:11PM (#16458901)
    Most old videogame consoles use less than 1 watt on standby, but this seems to be going away [dxgaming.com]. The PS2 already used 2 watts on standby, and the XBox 360 is following suit. We don't have firm data on the Wii and the PS3, but given the numbers of the PS2 and the Wii Connect24 feature, I'd be surprised if either of the two gets back under the 1W barrier.
  • Small Potatoes (Score:4, Insightful)

    by oiper (575250) on Monday October 16, 2006 @05:16PM (#16458977) Homepage Journal
    You want to fight the war on power consumption? Incandescent light bulbs. In terms of energy consumption, they are perhaps the most inefficient piece of technology in use today, and they are everywhere.
    • GT saved $2mil (Score:5, Insightful)

      by Malluck (413074) on Monday October 16, 2006 @05:41PM (#16459459)
      I know Georgia Tech went on a campaign a few years back to replace as many incandescent bulbs as possible.

      As part of it they replaced all of the 300 watt bulbs in the Van Leer building (the old EE building) with 20 watt fluorescent lamps. Each lab probably had 10-15 of these power hogs. After the switch our labs were freezing cold! All that extra cooling wasn't needed any more.

      Over the course of a year it saved the institute over 2 million dollars: the first million was in direct power reduction, the second was due to reduced cooling costs.
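      A back-of-the-envelope version of that kind of savings calculation looks like this; the lighting schedule and electricity rate here are illustrative assumptions, not Georgia Tech's actual numbers:

      ```python
      # Rough per-bulb savings from swapping a 300 W incandescent for a
      # 20 W fluorescent. Hours/day and $/kWh are assumed, not GT's figures.
      OLD_WATTS = 300
      NEW_WATTS = 20
      HOURS_PER_DAY = 12      # assumed lab lighting schedule
      DAYS_PER_YEAR = 365
      RATE_PER_KWH = 0.08     # assumed electricity rate, $/kWh

      kwh_saved_per_bulb = (OLD_WATTS - NEW_WATTS) * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
      dollars_saved_per_bulb = kwh_saved_per_bulb * RATE_PER_KWH

      print(kwh_saved_per_bulb)          # ~1226 kWh/year per bulb
      print(dollars_saved_per_bulb)      # ~$98/year per bulb, before cooling savings
      ```

      With hundreds of such bulbs across a campus, and the cooling load on top, seven-figure annual savings are plausible.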
  • Leave them on? (Score:3, Insightful)

    by Midnight Thunder (17205) on Monday October 16, 2006 @05:17PM (#16458995) Homepage Journal
    Given my general observation at workplaces that most people don't even bother switching to stand-by and just leave their computers on, I think encouraging people to put their computers in stand-by is a good start, even if not perfect. Ideally it would be nice to have computers hibernate, but then if you want to work from home there is no way to wake them up. The Wake-on-LAN solutions that I have seen only work on computers in stand-by.

    At one of the places where I worked, I implemented a web page, accessible from the VPN, where you could type in your PC name and it would wake up your office computer if it was in stand-by.
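    The wake-up trick the parent describes is typically done with a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by the target's MAC address repeated 16 times, sent as a UDP broadcast. A minimal sketch in Python (the MAC address below is a placeholder; a real deployment would map PC names to MAC addresses, which is presumably what that web page did):

    ```python
    import socket

    def build_magic_packet(mac: str) -> bytes:
        """Build a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by
        the 6-byte MAC address repeated 16 times (102 bytes total)."""
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("MAC address must be 6 bytes")
        return b"\xff" * 6 + mac_bytes * 16

    def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
        """Broadcast the magic packet over UDP; ports 7 and 9 are the
        conventional choices for Wake-on-LAN."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(build_magic_packet(mac), (broadcast, port))

    # wake("00:11:22:33:44:55")  # placeholder MAC for the sleeping machine
    ```

    The target machine's NIC and BIOS both need Wake-on-LAN enabled for this to work, and as the parent notes, most implementations only wake machines from stand-by, not from a full power-off.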
  • Would it be possible to design a power strip for wall warts that could sense whether an external device was actually drawing on each of them at the time, and cut back the input power when it's not?

    How about a media center power strip with a remote control - just a simple on and off - with the option to train it to accept the on and off signals from other remotes?

    Or have a media center power strip which can be trained to recognize the power draw of one key device when it's in on rather than standby mode, whe
  • It's estimated that in industrialized countries, devices on standby consume on average 4% of the power used. [Citation Needed]
  • Why is it that so many devices that have no need to draw power when turned off have no way to turn them completely off? My paper shredder has an optical on/off switch that draws power all the time. My coffee maker draws power all the time when it is turned off. Many items draw power just because they have a remote control or a clock. Why does my microwave oven need to draw power all the time just to power a clock that is not used for operation? (It does not have an auto turn-on based on time of day.) I ha
  • by Biotech9 (704202) on Monday October 16, 2006 @06:14PM (#16460025) Homepage
    Is here [ft.com] at the Financial Times.

    Wasteful television standby settings and the energy efficiency of computers and water heaters are to be targeted in a new legislative drive aimed at slicing €100bn a year from the European Union's energy bill, in a move that could impose Europe's green agenda on the world. Stringent new European Commission energy efficiency targets for items such as electrical appliances and cars could set new global standards, since all imports into the European market would have to comply.

    Some previous EU deadlines have resulted in some pretty dismal performances (the Lisbon agreement springs to mind), but the EU's very high standards for energy efficiency and recycling have been adhered to across the continent with admirable results. Not to mention the fact that EU enforced limits on car pollution (as one example) have led to high efficiency cars in Europe and across the globe, as manufacturers are forced to comply with EU levels to gain access to the EU market.

    The proposed regulations - including extensions of existing rules - would impose European energy efficiency standards on any company worldwide seeking access to the EU's 480m consumers, including US manufacturers. European standards and norms in the car sector and mobile telephony have already become accepted in many countries worldwide, to the annoyance of Washington, which believes the EU sets too many rules.


    If there is one criticism that is levelled at the EU a lot, it is that it sets too many rules. But look at the high standards it has driven in efficiency for cars and electronics (think of those EU energy labels on all fridges, freezers and so on; they've come a long way from D's and E's a decade ago, and how much energy did that initiative save?), so it's A-OK by me.
