It costs money to upgrade and stabilize the power grid. It costs money to stay ahead of the failure curve.
The current infrastructure sucks mainly because it's unpredictable and takes too much effort to synchronize disconnected sections of the grid before connecting them. You can't just "route around" a dead transmission line if there are generator stations active on both sides of the break. You must wait for the two sides to synchronize in phase before connecting them, which can take several seconds to a minute. If you don't, you'll cause even more breakers to trip.
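To put a rough number on that wait: two islanded grids running at slightly different frequencies drift in and out of phase at the beat frequency, so the worst-case wait for alignment is one beat period. A back-of-envelope sketch (the frequency offsets below are illustrative assumptions, not measured grid figures):

```python
# Worst-case time for two AC islands to drift back into phase alignment.
# Two grids offset by delta_f Hz realign once per beat period (1 / delta_f).
def beat_period_seconds(delta_f_hz: float) -> float:
    """Seconds between successive moments of phase alignment."""
    return 1.0 / delta_f_hz

# Illustrative offsets: a sloppy island vs. a tightly regulated one.
for delta_f in (0.5, 0.1, 0.05, 0.02):
    print(f"{delta_f:5.2f} Hz offset -> up to {beat_period_seconds(delta_f):5.1f} s wait")
```

A 0.5 Hz offset realigns within a couple of seconds; a 0.02 Hz offset can make you wait most of a minute, which is the "several seconds to a minute" range above.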
None of this would matter if we switched distribution to HVDC. We have the technology, but again, the cost of converting everything over to DC-DC switching converters is prohibitive. The biggest upside to switching everything to DC (all the way to the end user) is that you could add standby capacity by simply connecting batteries to your mains circuit between the main breaker and the load panel. The more people in a neighborhood using batteries to buffer their power source, the more aggregate protection the neighborhood has against blackouts.
Another study suggests people tend to only believe what they see happen before their own eyes, or that which their elders can explain to them in less than 20 words.
(Note, this is more of a stream of consciousness than an actual comment, so I apologize in advance if this sounds ADD-ish)
Get rid of the bulky, loud transformers, phase-shifting coils, and cap banks. Run -12 kVDC to -20 kVDC over the residential feeder lines down to neighborhood equipment with switchmode buck converters that deliver -240 VDC and -120 VDC to homes over their usual three mains wires, plus a fourth wire for homes that wish to feed power back into the local grid via switchmode boost converters. The power transformer boxes on the corner of every block would contain high-frequency switching equipment and a few batteries (to keep the block lit during upstream switching events and outages) instead of 2,000 pounds of copper and laminated steel. The neighborhood substations would have their giant transformers, oil-filled breakers, and phase-compensating equipment replaced with IGBT-based switch stacks and intelligent converters that quickly, and completely silently, compensate for changing load and back-feed conditions.

Managing connections between substations and the high-voltage grid would be an order of magnitude simpler and safer when all you have to worry about is matching the voltages within a few percent and measuring static currents after connections are made, rather than comparing frequencies, phase angles, and power factors. With today's "modern" AC grids, you're liable to blow fuses, breakers, and transformers if you connect two independently fed parts of the grid together without first matching phase and frequency.
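The simpler connect criterion can be sketched directly. The thresholds below (a few percent voltage mismatch for the DC tie; voltage, frequency, and phase-angle limits for the AC tie) are illustrative assumptions, not utility interconnection standards:

```python
def dc_connect_ok(v1: float, v2: float, max_mismatch: float = 0.03) -> bool:
    """DC tie: only the voltages need to agree, here within 3%."""
    return abs(v1 - v2) / max(abs(v1), abs(v2)) <= max_mismatch

def ac_connect_ok(v1: float, v2: float, f1: float, f2: float, phase_deg: float,
                  max_v_mismatch: float = 0.03, max_f_delta: float = 0.05,
                  max_phase_deg: float = 10.0) -> bool:
    """AC tie: voltage, frequency, AND phase angle must all line up."""
    return (abs(v1 - v2) / max(abs(v1), abs(v2)) <= max_v_mismatch
            and abs(f1 - f2) <= max_f_delta
            and abs(phase_deg) <= max_phase_deg)

# A ~1% voltage mismatch is fine for a DC tie...
print(dc_connect_ok(-12000.0, -12120.0))                          # True
# ...but the equivalent AC tie still fails on a 45-degree phase angle.
print(ac_connect_ok(12000.0, 12120.0, 60.00, 60.01, phase_deg=45.0))  # False
```

The point of the sketch is the argument above: the DC check is a single scalar comparison, while the AC check is three simultaneous conditions, one of which (phase) you can only satisfy by waiting.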
I know it's just too late for the change from AC to DC in the home to be practical. The biggest, most power-hungry devices just don't have an "upgrade path" to DC: air conditioning and refrigeration compressors, fan/blower motors, and fluorescent lights would all need complete replacement with DC-compatible equivalents. It would have been better if appliance manufacturers had designed their devices to run off either type of mains from the start... Large, high-torque brushless DC motors are quite cheap now, and switchmode power supplies are now smaller and cheaper than 60 Hz AC power transformers; many of them will actually work equally well fed by 120-240 VDC.
Automatic transfer switches eliminate any danger of locally generated power being fed back into the grid when it's unsafe to connect the two. The electric company would only have to tell homeowners to employ transfer switches in order to stay connected to the grid (with the only side effect being that they couldn't contribute excess power back to the grid).
My local utility company actually employs smart meters that monitor both the grid-side and home-side circuits for dangerous conditions in homes with grid-tie inverters. The smart meter instantly disconnects the home from the grid if there's an excessive surge in current being fed back into the grid (by analyzing the voltages, transfer current, and phase angles on both sides). The same meters also communicate with the utility company over a combination of RF and powerline-based data transmission, eliminating the need to dispatch people every month to read everyone's meters.
In other news, you can buy a good charge controller, a 50 kWh bank of deep-cycle batteries, a 2 kW inverter for lights and outlets, and a 12 kW inverter for air conditioning, all for about $12K. This setup can run the A/C for 5 hours a day, and your only reliance on the grid would be topping off the batteries on dark days.
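Sanity-checking that runtime claim. The 50% average compressor duty cycle, 90% inverter efficiency, and 80% usable depth of discharge below are my assumptions, not figures from the setup described:

```python
# Back-of-envelope A/C runtime for the battery/inverter setup described above.
bank_kwh        = 50.0   # deep-cycle battery bank capacity
usable_fraction = 0.8    # assumed safe depth of discharge
ac_rated_kw     = 12.0   # A/C inverter rating
duty_cycle      = 0.5    # assumed average compressor duty cycle
inverter_eff    = 0.9    # assumed inverter efficiency

avg_draw_kw = ac_rated_kw * duty_cycle / inverter_eff   # power pulled from the bank
runtime_h   = bank_kwh * usable_fraction / avg_draw_kw

print(f"Average draw from bank: {avg_draw_kw:.2f} kW")
print(f"Estimated A/C runtime:  {runtime_h:.1f} h")
```

Under those assumptions the bank supports roughly 6 hours of A/C, so the 5-hours-a-day figure is plausible with a little margin; a compressor that runs closer to 100% duty would cut that in half.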
If you have the means to get off the grid, by all means, you should, because most electric companies don't care about anything but profits.
Does DJB insist that his crypto library gets installed under
Regarding your statement, "But this is typical of the Progressives, they don't mind when it is THEIR guy mucking up the politics."
It's typical of _everyone_ in politics, _everyone_ in the media, and _everyone_ with an agenda. Don't blame just one party when _everyone_ is doing it. It's human nature to deny your own guilt, and that of the people you associate with, when the goal is to discredit or disarm a group with opposing views.
At what scope/scale of time or range of values does it really matter if a PRNG is robust?
If a PRNG seeded by a computer's interrupt count, process activity, and sampled I/O traffic (such as audio input, fan-speed sensors, and keyboard/mouse input, which I believe is a common seeding method) is deemed sufficiently robust when polled only once per second, or for only 8 bits of resolution, exactly how much less robust does it get if you poll it, say, a million times per second, or in a tight loop? Does it get more or less robust when asked for a larger or smaller bit field?
Unless I'm mistaken, the point is moot when the only cost of a sufficiently robust PRNG is waiting for more entropy from its seeds, or using a larger modulus for its output, both of which are rather trivial in the practical world of computing.
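One way to frame the scale question above is as an entropy budget: if the seed sources supply entropy at some fixed rate, polling faster or requesting wider outputs just spends that budget sooner. A toy model (the 100 bits/s accrual rate and 4096-bit pool are invented for illustration, not properties of any real OS entropy pool):

```python
def polls_until_exhausted(entropy_bits_per_s: float,
                          polls_per_s: float,
                          bits_per_poll: int,
                          pool_bits: float = 4096.0) -> float:
    """Seconds until requests outpace the entropy pool, or inf if supply keeps up."""
    drain = polls_per_s * bits_per_poll      # bits consumed per second
    if drain <= entropy_bits_per_s:
        return float("inf")                  # supply keeps up: robust indefinitely
    return pool_bits / (drain - entropy_bits_per_s)

# Polling once per second for 8 bits: a 100 b/s supply easily keeps up.
print(polls_until_exhausted(100.0, 1.0, 8))
# A million polls per second for 8 bits: the pool empties almost immediately.
print(polls_until_exhausted(100.0, 1e6, 8))
```

In this model the once-per-second case never runs dry, while the tight-loop case exhausts the pool in well under a millisecond, which is one concrete sense in which "how often" and "how many bits" do matter.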
Spoken like a true Libertarian. I'm surprised you didn't pull "authoritarian", "fascist", or "statist" out of your hat.
Society prospers when individuals work towards the prosperity of the societal unit, as well as their own being. When you stop caring about the greater good, what good are you to your country?
Would you rather just be an isolationist and give the rest of the world the finger?
As an IT manager who oversees deployment and maintenance of about 60 desktops and laptops, some of which are shared among multiple employees, consistency in OS availability for the end user is key. We upgrade one or two machines per month, and we started using Windows 7 three years ago, so about 15 systems still run XP. We're not touching 8.1 until there are no more XP systems on our network, AND people show interest in actually using 8.1, AND at least one service pack has been released to address outstanding issues since its public release, AND we discover a way to disable the "Tiles" start screen. Supporting systems with two different desktop interfaces is a serious pain in the ass, especially for non-technical users. So far, only two people have shown interest in using Windows 8 (techie geek types), and the vast majority of our employees are averse to changing their OS at all.
I've had to customize Windows 7 a bit to make it "comfortable" for the lowest common denominator: Long-time XP/2000 users.