One of my customers is located in Southern California, but Sun's (now Oracle's) servers refused to give them Java updates because they were geolocated as being in Iran.
Environmental groups, for their part, have tended to sneer at the problems the utilities are having, contending that it is their own fault for not getting on the renewables bandwagon years ago. But according to Gillis, the political risks of the situation also ought to be obvious to the greens. The minute any European country — or an ambitious American state, like California — has a blackout attributable to the push for renewables, public support for the transition could weaken drastically. Rasmus Helveg Petersen, the Danish climate minister, says he is tempted by a market approach: real-time pricing of electricity for anyone using it — if the wind is blowing vigorously or the sun is shining brightly, prices would fall off a cliff, but in times of shortage they would rise just as sharply.
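The pricing mechanism Petersen describes can be sketched in a few lines. Everything below (the curve shape, the constants, the function name `spot_price`) is an illustrative assumption, not a rule from any actual electricity market:

```python
# Illustrative sketch of real-time electricity pricing: price collapses
# when renewable supply is abundant and spikes in a shortage.
# All constants here are made up for demonstration.

def spot_price(renewable_supply_mw: float, demand_mw: float,
               base_price: float = 0.30) -> float:
    """Return an illustrative per-kWh price given supply and demand."""
    if demand_mw <= 0:
        raise ValueError("demand must be positive")
    # Scarcity is the fraction of demand that renewables cannot cover.
    scarcity = max(0.0, (demand_mw - renewable_supply_mw) / demand_mw)
    # Price scales with scarcity: near zero in a glut, steep in a shortage.
    return round(base_price * (0.1 + 3.0 * scarcity), 4)

windy_day = spot_price(renewable_supply_mw=900, demand_mw=1000)    # glut
calm_evening = spot_price(renewable_supply_mw=100, demand_mw=1000)  # shortage
print(windy_day, calm_evening)
```

The point of such a scheme is that consumers see the shortage signal directly and shift flexible loads (heating, charging) toward the windy, cheap hours.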
Oreskes argues that scientists failed us, and in a very particular way: They failed us by being too conservative. Scientists today know full well that the "95 percent confidence limit" is merely a convention, not a law of the universe. Nonetheless, this convention, the historian suggests, leads scientists to be far too cautious, far too easily disrupted by the doubt-mongering of denialists, and far too unwilling to shout from the rooftops what they all knew was happening. "Western scientists built an intellectual culture based on the premise that it was worse to fool oneself into believing in something that did not exist than not to believe in something that did."
Why target scientists in particular in this book? Simply because a distant future historian would target scientists too, says Oreskes. "If you think about historians who write about the collapse of the Roman Empire, or the collapse of the Mayans or the Incans, it's always about trying to understand all of the factors that contributed," Oreskes says. "So we felt that we had to say something about scientists."
OK, let's say for the sake of argument you bring gigabit to every doorstep. Or heck, even 1% of doorsteps. All of your uplinks are going to be so massively oversubscribed that it's essentially meaningless, except for content that's hosted on local caching servers. This is great for things like Netflix, but even ultra-high-quality 4K video with uncompressed multichannel audio isn't going to consume that much bandwidth. 40 Gbit connections are standard on the largest backbones, with 100 Gbit coming on-line, but that's some awfully expensive hardware right now.
So my question would be: what added benefit do you expect to get from a gigabit local loop when it's still running into the same sort of congestion limits? I don't mean to sound like a curmudgeonly old bastard, but this sounds more like a marketing gimmick. Even governments aren't immune from spreading marketing bullshit; in fact it's sometimes easier when you know you won't be held accountable (advertising fraud vs. political promises) and it's all other people's money anyway.
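The oversubscription argument above is easy to put rough numbers on. All figures below are illustrative assumptions (subscriber count, uplink size, stream bitrate), not measurements:

```python
# Back-of-the-envelope arithmetic behind the oversubscription argument.

GBIT = 1_000  # Mbit/s per Gbit/s

subscribers = 10_000           # homes sold gigabit local loops
uplink_mbit = 40 * GBIT        # one 40 Gbit/s backbone uplink

# If every subscriber were active at once, each gets only a fair share
# of the uplink, no matter how fast the local loop is.
per_sub_fair_share = uplink_mbit / subscribers  # Mbit/s each

uhd_stream_mbit = 25  # rough bitrate of a high-quality compressed 4K stream

print(f"fair share per subscriber: {per_sub_fair_share:.0f} Mbit/s")
print(f"4K streams that share supports: "
      f"{per_sub_fair_share / uhd_stream_mbit:.2f}")
```

Under these assumptions, 10,000 gigabit subscribers sharing a single 40 Gbit uplink get about 4 Mbit/s each at full contention, which is why locally cached content is the main thing a gigabit loop actually accelerates.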
The Obama Administration (and Bush / McCain / Romney would have been no better) looked around and was thinking:
Yeah, that last one sounds about right. We'll go with that.
You should have professional magicians look at it. These are people who know how to find the "trick".
You nailed it. I was just reading about James Randi's debunking of the alleged psychic Uri Geller, who had managed to fool a bunch of scientists back in the 1970s. Randi claimed that scientists are some of the easiest people to fool because, as you said, they operate under a lot of preconceived notions, and once you figure out how to work around those it's a piece of cake. As Randi put it, to catch a magician (someone who fools people for a living), you send a magician.
To start with, I have no idea what the answer to this question is with regard to the Swedish system, but I've found that in many cases of solutions like this the "cost" paid by end users is heavily subsidized in other areas (in the US it's so common it can almost be assumed). So if the $40 / month pays for all of the capital costs, maintenance, depreciation, etc., then wonderful. Otherwise it's just accounting sleight of hand: put a happy number out for the public, and if somebody digs and puts together the real costs, they find that the real number is horrific.
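The kind of digging described above is mostly just amortization arithmetic. Every figure below is a hypothetical placeholder, since the real Swedish numbers are unknown here:

```python
# Sanity check: does a monthly fee cover the true per-home cost?
# All inputs are hypothetical placeholders for illustration.

capital_cost_per_home = 1500.0   # build-out cost, amortized over the plant's life
amortization_years = 15
annual_opex_per_home = 120.0     # maintenance, support, upstream transit

# True monthly cost = amortized capital + monthly operating expense.
monthly_cost = (capital_cost_per_home / (amortization_years * 12)
                + annual_opex_per_home / 12)

monthly_fee = 40.0
subsidy_gap = monthly_fee - monthly_cost  # negative means hidden subsidy

print(f"true monthly cost: ${monthly_cost:.2f}, gap: ${subsidy_gap:+.2f}")
```

With these placeholder inputs the fee covers costs comfortably; plug in real capital and operating figures and the sign of `subsidy_gap` tells you whether the advertised price is honest.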
On the other hand, in the US most major metropolitan areas (there are exceptions) have sold monopoly or duopoly franchises on internet service, which also distorts prices horribly and in other directions. I live in one of these areas, as do most of the people I know (I get to choose between the mostly tolerable but pricey Cox and the utterly abhorrent AT&T; for practical purposes just one choice). In many cases these "utilities" are limited to certain profit levels, so they just adjust their costs up. Competition isn't magic; it just incentivizes aggressive pursuit of the best cost / quality tradeoffs (which are usually subjective and may vary significantly between individuals, eliminating the possibility of a good "one-size-fits-all" solution).
Matthew Green tackled iOS encryption, concluding that at bottom the change really boils down to applying the existing iOS encryption methods to more data. He also reviews the iOS approach, which uses Apple's "Secure Enclave" chip as the basis for the encryption, and guesses at how Apple can say it's unable to decrypt the devices. He concludes, with some clarification from a commenter, that Apple really can't (unless you use a weak password, which can be brute-forced, and even then it's hard).
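The weak-password caveat is easy to quantify. Apple's security documentation has cited a figure of roughly 80 ms per passcode attempt (key derivation is iterated and tied to the device, so guesses can't be parallelized off-device); taking that figure as an assumption:

```python
# Rough arithmetic on which passcodes are brute-forceable at ~80 ms/guess.
# The 80 ms figure is an assumption taken from Apple's published guidance.

SECONDS_PER_GUESS = 0.08

def worst_case_seconds(search_space: int) -> float:
    """Time to exhaust the whole search space at the assumed guess rate."""
    return search_space * SECONDS_PER_GUESS

print(f"4-digit PIN:  {worst_case_seconds(10**4) / 60:.0f} minutes")
print(f"6-digit PIN:  {worst_case_seconds(10**6) / 3600:.1f} hours")
print(f"8-char alphanumeric: "
      f"{worst_case_seconds(62**8) / (3600 * 24 * 365):,.0f} years")
```

A 4-digit PIN falls in minutes and a 6-digit PIN in about a day, while a decent alphanumeric password pushes the worst case into geological time, which matches Green's "weak password" qualifier.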
Nikolay Elenkov looks into the preview release of Android "L". He finds that not only has Google turned encryption on by default, but it appears to have incorporated hardware-based security as well, to make it impossible (or at least much more difficult) to perform brute-force password searches off-device.
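A minimal sketch of why hardware-bound keys block off-device brute force: derive the disk key from both the password and a secret that never leaves the device's secure hardware. The names, parameters, and key-derivation choices below are illustrative, not Android's actual scheme:

```python
# Sketch: binding the disk key to a hardware-held secret.
# DEVICE_SECRET stands in for a key locked inside secure hardware;
# the KDF parameters are illustrative, not Android's real ones.

import hashlib
import hmac
import os

DEVICE_SECRET = os.urandom(32)  # in reality, never readable by software

def derive_disk_key(password: str) -> bytes:
    # Slow, salted password stretching (attacker can do this part anywhere)...
    stretched = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), b"example-salt", iterations=100_000)
    # ...but the final step mixes in the device secret, so it can only
    # happen on the device itself, which can rate-limit or wipe.
    return hmac.new(DEVICE_SECRET, stretched, hashlib.sha256).digest()

key = derive_disk_key("correct horse battery staple")
```

An attacker who images the encrypted storage can run the PBKDF2 step on a GPU farm, but without `DEVICE_SECRET` the final HMAC is impossible, so every guess must be made on-device at whatever rate the hardware permits.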
"People briefed on the matter" generally equals "deliberate leak, to move public opinion or at least test the waters."