Another study suggests people tend to only believe what they see happen before their own eyes, or that which their elders can explain to them in less than 20 words.
(Note, this is more of a stream of consciousness than an actual comment, so I apologize in advance if this sounds ADD-ish)
Get rid of the bulky, loud transformers, phase-shifting coils, and cap banks. Run -12 kVDC to -20 kVDC over the residential feeder lines down to neighborhood-located equipment with switch-mode buck converters that deliver -240 VDC and -120 VDC to homes via their usual three mains wires, plus a fourth wire for homes that wish to feed power back into the local grid via switch-mode boost converters. The power-transformer boxes on the corner of every block would contain high-frequency switching equipment and a few batteries (to keep the block lit during upstream switching events and outages) instead of 2,000 pounds of copper and laminated steel. The neighborhood substations would have their giant transformers, oil-filled breakers, and phase-compensating equipment replaced with IGBT-based switch stacks and intelligent converters that compensate for changing load and back-feed conditions quickly and completely silently. Managing connections between substations and the high-voltage grid would be an order of magnitude simpler and safer when all you have to worry about is matching the voltages within a few percent and measuring static currents after connections are made, rather than comparing frequency, phase angles, and power factors. With today's "modern" AC grids, you're liable to blow fuses, breakers, or transformers if you connect two independently fed parts of the grid together without first matching phase and frequency.
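For an ideal buck converter the output voltage is just the input voltage times the switching duty cycle, so the conversion stages described above reduce to simple ratios. A minimal sketch (idealized: losses, regulation, and isolation are ignored, and voltages are treated as magnitudes):

```python
# Idealized buck converter: Vout = D * Vin, where D is the switching duty
# cycle (0 < D <= 1). This only illustrates the ratios implied by the
# feeder and home voltages proposed above; real converters regulate D
# continuously and are nowhere near lossless.

def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Duty cycle an ideal buck converter needs to step v_in down to v_out."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step voltage down")
    return v_out / v_in

# 12 kV feeder (magnitude) stepped down to the two home rails:
feeder = 12_000.0
print(buck_duty_cycle(feeder, 240.0))  # 0.02
print(buck_duty_cycle(feeder, 120.0))  # 0.01
```

A 2% duty cycle is impractically small for a single stage, which is one reason real medium-voltage DC-DC designs use multiple cascaded or interleaved stages; the single-stage version is only meant to show the arithmetic.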
I know it's just too late for the change from AC to DC in the home to be practical. The biggest, most power-hungry devices just don't have an "upgrade path" to DC: air-conditioning and refrigeration compressors, fan/blower motors, and fluorescent lights would all need complete replacement with DC-compatible equivalents. It would have been better if appliance manufacturers had designed their devices to run off either type of mains from the start... Large, high-torque brushless DC motors are quite cheap now, switch-mode power supplies are smaller and cheaper than 60 Hz AC power transformers, and many of them will actually work equally well fed 120-240 VDC.
Automatic transfer switches eliminate any danger of locally generated power being fed back into the grid when connecting the two would be unsafe. The electric company would only have to tell homeowners to employ transfer switches in order to stay connected to the grid (with the only side effect being that they can't contribute excess power back to the grid).
My local utility company actually employs smart meters that can monitor both grid-side and home-side circuits for dangerous conditions in cases where there's a grid-tie inverter in the home. The smart meter instantly disconnects the home from the grid if there's an excessive surge in current being fed back into the grid (by analyzing the voltages, transfer current, and phase angles of both sides). The same meters also communicate with the utility company over a combination of RF and powerline-based data transmission, eliminating the need for guys to be dispatched monthly to read everyone's meters.
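The trip logic described above amounts to comparing a few measurements against thresholds. Here's a hypothetical sketch of that decision; the field names and threshold values are invented for illustration, not taken from any real meter firmware:

```python
# Hypothetical smart-meter trip check: disconnect the home when back-fed
# current surges, or when the two sides drift apart in voltage or phase.
# All names and limits here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MeterSample:
    grid_volts: float       # grid-side RMS voltage
    home_volts: float       # home-side RMS voltage
    backfeed_amps: float    # current flowing home -> grid (negative = importing)
    phase_delta_deg: float  # phase angle between the two sides

def should_disconnect(s: MeterSample,
                      max_backfeed_amps: float = 80.0,
                      max_volt_mismatch: float = 12.0,
                      max_phase_delta_deg: float = 10.0) -> bool:
    return (s.backfeed_amps > max_backfeed_amps
            or abs(s.grid_volts - s.home_volts) > max_volt_mismatch
            or abs(s.phase_delta_deg) > max_phase_delta_deg)

print(should_disconnect(MeterSample(240.0, 241.5, 30.0, 2.0)))   # False: normal export
print(should_disconnect(MeterSample(240.0, 238.0, 120.0, 1.0)))  # True: back-feed surge
```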
In other news, you can buy a good charge controller, a 50 kWh bank of deep-cycle batteries, a 2 kW inverter for lights and outlets, and a 12 kW inverter for air conditioning, all for about $12K. This setup can run A/C for 5 hours a day, and your only reliance on the grid would be to top off the batteries on dark days.
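The arithmetic behind that claim is easy to check. A quick sketch using the numbers above; note the 12 kW figure is the inverter's rating (a ceiling), while the 3.5 kW average A/C draw is my own illustrative assumption, not from the comment:

```python
# Back-of-envelope runtime check for the battery/inverter setup above.
# bank_kwh and inverter_kw come from the comment; ac_draw_kw is an
# assumed typical average draw for a central A/C, for illustration only.

bank_kwh = 50.0     # deep-cycle battery bank capacity
inverter_kw = 12.0  # A/C inverter rating (peak capability, not typical draw)
ac_draw_kw = 3.5    # assumed average A/C draw while running

hours_at_full_rating = bank_kwh / inverter_kw  # worst case, flat-out
hours_at_assumed_draw = bank_kwh / ac_draw_kw  # more realistic duty

print(round(hours_at_full_rating, 1))   # 4.2
print(round(hours_at_assumed_draw, 1))  # 14.3
```

At 5 hours of A/C per day, the 50 kWh bank supports an average draw of up to 10 kW before it would need a daytime top-up, so the claim is plausible for anything short of continuous full-rating load.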
If you have the means to get off the grid, by all means, you should, because most electric companies don't care about anything but profits.
Does DJB insist that his crypto library gets installed under
Regarding your statement, "But this is typical of the Progressives, they don't mind when it is THEIR guy mucking up the politics."
It's typical of _everyone_ in politics, _everyone_ in the media, and _everyone_ with an agenda. Don't blame just one party when _everyone_ is doing it. It's human nature to deny the guilt of yourself and the people you associate with when the goal is to discredit or disarm a group with opposing views.
At what scope/scale of time or range of values does it really matter if a PRNG is robust?
If a PRNG seeded by a computer's interrupt count, process activity, and sampled I/O traffic (such as audio input, fan-speed sensors, and keyboard/mouse input, which I believe is a common seeding method) is deemed sufficiently robust when polled only once a second, or for only 8 bits of resolution, exactly how much less robust does it get if you poll it, say, one million times per second, or in a tight loop? Does it get more or less robust when asked for a larger or smaller bit field?
Unless I'm mistaken, the point is moot when the only cost of having a sufficiently robust PRNG is to wait for more entropy from its seed sources or to use a larger modulus for its output, both rather trivial in the practical world of computing.
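The "common seeding method" mentioned above usually amounts to mixing several noisy system samples through a cryptographic hash and seeding the PRNG with the digest. A minimal sketch, with stand-in sources (the kernel entropy pool, timer jitter, and process ID substitute here for interrupt counts and I/O traffic):

```python
# Sketch of hash-mixed PRNG seeding: several noisy samples are folded
# through SHA-256, and the digest seeds the generator. The specific
# sources below are illustrative stand-ins, not a vetted entropy design.

import hashlib
import os
import random
import time

def gather_seed() -> int:
    h = hashlib.sha256()
    h.update(os.urandom(32))                                # kernel entropy pool
    h.update(time.perf_counter_ns().to_bytes(8, "little"))  # timing jitter
    h.update(os.getpid().to_bytes(4, "little"))             # process identity
    return int.from_bytes(h.digest(), "big")

rng = random.Random(gather_seed())
sample = [rng.randrange(256) for _ in range(4)]  # four 8-bit values
print(all(0 <= v < 256 for v in sample))         # True
```

Note this only addresses seeding; polling frequency matters because once seeded, a deterministic PRNG's outputs are a pure function of the seed, so faster polling exposes more of the stream without adding entropy.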
Spoken like a true Libertarian. I'm surprised you didn't pull "authoritarian", "fascist", or "statist" out of your hat.
Society prospers when individuals work towards the prosperity of the societal unit, as well as their own being. When you stop caring about the greater good, what good are you to your country?
Would you rather just be an isolationist and give the rest of the world the finger?
As an IT manager who oversees deployment and maintenance of about 60 desktops and laptops, some of which are shared among multiple employees, consistency in OS availability for the end user is key. We upgrade one or two machines per month, and we started using Windows 7 three years ago, so about 15 systems still run XP. We're not touching 8.1 until there are no more XP systems on our network, AND people show interest in actually using 8.1, AND at least one service pack has been released to address outstanding issues since its public release, AND we discover a way to disable the "Tiles" start screen. Supporting systems with two different desktop interfaces is a serious pain in the ass, especially for non-technical users. So far, only two people have shown interest in using Windows 8 (techie geek types), and the vast majority of our employees are averse to changing their OS at all.
I've had to customize Windows 7 a bit to make it "comfortable" for the lowest common denominator: Long-time XP/2000 users.
Everyone who hates the US is loving this news, just like they cheered when they heard we all have to take off our shoes and have our nuts inspected at airports.
Does this mean developers might actually implement 'MUTE', 'FORCE STOP', or 'RESTART' context menu items for Shockwave apps? I despise going to read a page with ads and other Shockwave sidebar widgets that make noise or chew up CPU cycles and having no way to pause/mute/stop them. It also bugs me that you must reload the entire page to get a Flash app to restart.
It's beyond me why Macromedia/Adobe never wanted us to have those essential controls. The only things we get, in some rare cases, are the ability to prevent the app/player from looping, or to turn down rendering quality.
I think that until _all_ TVs have 16:9 screens and _all_ studios broadcast unmodified/uncropped/un-letterboxed content, we'll have the following two problems:
The disparity in video formats, and whether studios "letterbox" a high-def program for standard-definition channels, is frustrating. Most modern studios and news stations in the US record in HD, then either letterbox it for SD broadcast (adding black bars on top and bottom to make it 4:3) or crop the left and right sides of the image to get a 4:3 frame. The former is horrible for people with HD sets that can't overscan (scale up the sides of the image so it fills the screen, eliminating the black bars), and you lose effective image resolution... I wish someone would drill it into their heads that letterboxing HD content is BAD for SD broadcasts. Your camera crew should try to capture actors and action within the central 4:3 area of the image so you can crop and scale your HD content for people with SD TVs or receiving SD channels.
Even more frustrating is when I go to a public place with widescreen TVs showing standard-def channels (in 4:3 format), S-T-R-E-T-C-H-E-D to 16:9, so everyone looks fat and square-shaped graphics become rectangles. Half the widescreen "HD" TVs sold now are able to overscan properly, and the other half can't at all (they just stretch the image). A few good brands can do a "panorama" transform, which is a compromise but makes diagonal lines look curved. There really is no legitimate reason for a TV to stretch a broadcast image horizontally, yet everyone thinks they NEED to do it to "fill" their wide screens with a 4:3 SD image. It boggles my mind that so many people purposely distort the image just so it can appear "widescreen".
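The geometry behind the two conversion methods above works out to simple ratios. A sketch using a 1920x1080 (16:9) HD source and a 640x480 (4:3) SD frame, which are assumed example resolutions:

```python
# Geometry of letterboxing vs. center-cropping a 16:9 HD frame for a
# 4:3 SD display. Resolutions chosen for illustration: 1920x1080 source,
# 640x480 SD frame.

def letterbox_bars(sd_w: int, sd_h: int, src_w: int, src_h: int) -> int:
    """Height of each black bar when the source is scaled to fit SD width."""
    scaled_h = sd_w * src_h // src_w   # 640 * 1080 / 1920 = 360
    return (sd_h - scaled_h) // 2      # (480 - 360) / 2 = 60

def center_crop_width(src_w: int, src_h: int, out_w: int, out_h: int) -> int:
    """Source width kept when center-cropping to the output aspect ratio."""
    return src_h * out_w // out_h      # 1080 * 4 / 3 = 1440

print(letterbox_bars(640, 480, 1920, 1080))  # 60-pixel bar top and bottom
print(center_crop_width(1920, 1080, 4, 3))   # keeps 1440 of 1920 columns
```

So letterboxing wastes 120 of 480 SD lines (a quarter of the vertical resolution), while cropping discards 480 of 1920 source columns, which is exactly why framing the action within the central 4:3 area matters.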
Can a state elect to locally invalidate the federal mandate that states that bills issued by the US Treasury are "Legal tender for all debts public and private"?
This may be something that can be easily challenged in federal court, and I truly hope someone does challenge it.
The worst part of this state bill is that it requires every transaction, along with the verified identity of both parties, to be recorded and submitted to law enforcement on demand.