No, I was not assuming they run from a battery; I was just using the fact that LEDs can run off of batteries to show how little power they use. As for the AC-DC converter itself, I guess you can tell how much energy it's wasting (as heat) by touching it. Most modern ones don't waste much.
Speaking of wasteful, do you know how many electrons had to go out of their way for you to write that post? And then how many were used to display your post on all the slash-dotters of the world! For crying out loud, you should conserve. Maybe if you left out the vowels, you could use fwr chrctrs, and the Arctic Ocean ice wouldn't melt so fast!
Ok, 0.26 watts. Let's pretend you never have to charge your phone (or you got a new charger and forgot to unplug the old one). There are about 8766 hours/year. (This takes into account that one out of four years is a leap year.) So that charger is using about 2280 watt-hours/year, or about 2.3 kWh. The average price for electricity in the US is 12 cents/kW-hour (http://www.npr.org/blogs/money/2011/10/27/141766341/the-price-of-electricity-in-your-state). So we're talking about 27 cents per year to keep this charger plugged in.
That same NPR article says the average American household uses about 900 kw-h/month, or 900,000 watt-hours. A quarter of a watt = 187 watt-hours/ month, or about 0.02% of the average monthly use. Putting this differently, you'd need 50 plugged-in chargers in your home to amount to even 1% of your electricity use.
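If anyone wants to redo the arithmetic in the two paragraphs above, here's a rough sketch; the inputs (0.26 W standby draw, 12 cents/kWh, 900 kWh/month household use) are the numbers quoted from the post and the NPR article, not measured values:

```python
# Vampire-draw math for a phone charger left plugged in year-round.
STANDBY_WATTS = 0.26         # claimed standby draw of the charger
HOURS_PER_YEAR = 8766        # 365.25 days * 24 h (averages in leap years)
PRICE_PER_KWH = 0.12         # US average, per the NPR article
HOUSEHOLD_WH_MONTH = 900_000 # 900 kWh/month average household use

wh_per_year = STANDBY_WATTS * HOURS_PER_YEAR      # ~2279 Wh
kwh_per_year = wh_per_year / 1000                 # ~2.3 kWh
cost_per_year = kwh_per_year * PRICE_PER_KWH      # ~$0.27

wh_per_month = wh_per_year / 12                   # ~190 Wh/month
share = wh_per_month / HOUSEHOLD_WH_MONTH         # ~0.02% of monthly use

print(f"${cost_per_year:.2f}/year, {share:.4%} of monthly household use")
```

Which comes out to roughly a quarter per year and about 0.02% of the bill, in line with the estimates above.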
An apartment that has another living space above and below, and on three sides? So heat transfer takes place through only one wall of the six (counting ceiling and floor as "walls"). I can imagine his AC not having to do much, especially if he lives in a moderate climate, like parts of California.
What is this distinction you're making between being "on" or "running" on the one hand, and "working" on the other?
I suppose the microwave has an LED display that shows the time (or shows nothing if you didn't set the clock after the last power outage), but we can run wristwatch LEDs for a couple years on a tiny battery, so I don't think that's using much juice. Likewise the LED display on our oven. Maybe a land line phone would use a tiny bit of electricity for an LED too, but I long ago traded that in for a cell phone, which doesn't appear to use up much battery juice on standby. (Yes, the office where I work still has a land line phone with an LED display. Again, I can't imagine the LED and accompanying electronics are using much electricity.) As for something with a remote: most cars now have a remote, since you can click a button on your key from several feet away and get the car to do something. Does that use up your car's battery very fast? Ever leave your car at the airport parking lot and come back a week or two later? My car's battery seems to be none the worse for wear. (Yes, a car's battery is pretty big, but _no noticeable effect_.) And so on.
Better yet, saw (with a hand saw, of course) and split your own firewood; it'll warm you twice, the saying goes.
If we are a computer simulation, that might explain why it took a day to simulate the heavens and the earth, and to get past the surface of last scattering; another day to simulate the formation of elements heavier than Hydrogen and Helium via supernovae, and allow the elements to cool enough to form molecular clouds; another day to simulate the formation of planets and the beginning of life; and so forth. As each stage became more complicated, the simulation was taking longer in computer time to model shorter periods of time in the simulation. Probably six days of computer time to simulate all of it. At that point the simulation had enough computing power, in the form of sentient beings, to go on simulating itself, and so the computer could rest.
I find crowbars are not sentient, so stocking up on them is not too difficult. Keeping a large supply of them, otoh, can be hard.
In some universes. I hope to be in one of the universes where the LHC forgot to pay its electric bill, and gets shut down.
"allow non-specialists to read specialized works (such as scientific papers and legal documents). But the specialist/intended target are the major market so this is rare." Being part of that specialist target myself, I'm afraid you're right. Google has a special search database for people like me, scholar.google.com; but they're constantly making it harder to find. It used to appear as a link at the top of a google search page, then it was relegated to a drop-down, now it's not even there any more. Guess they didn't make enough $ off of it.
I am basically in agreement with rockmuelle. But to put (what I think is) his argument slightly differently, there is no such thing as an exact model, because the categories that you would want to mark in a model are inherently fuzzy. Library catalogers knew this decades (a century?) ago; they were trying to create a model, embodied in their card catalogs, of the information in books. But the inter-cataloger agreement was (from my observations) far from exact. A century later, and it's no different--and I suspect it never will be. Categorization, whether done by man or machine, is still fuzzy.
And if I've misrepresented rockmuelle, or misunderstood your question, qpqp, it's because I don't have an exact model of what you're saying.
And they would have had that if Doc Emmet Brown hadn't hoodwinked them.
You need to take your medications.
"...what then determines proper use?" Who sayeth that there be such a thing? Of a truth, thou dost not speak it.
Chaucer's been blogging of late: http://houseoffame.blogspot.co...