Democrats said he was an outstanding, honest man when he dropped the case (while Republicans decried him as dishonest). When the case came back up, the Democrats and Republicans both completely flipped positions. I don't know if he's playing politics or not, but it seems obvious that everyone's hatred/love is tied to their party rather than the truth.
In any case, what could he have done differently? He announced the case closed going into election season. If he hadn't mentioned the new evidence at all, Congress would have had him for perjury sooner or later. If he had released it after the election and Hillary won, everyone would say he killed the investigation so Hillary could win. If he had released it before and Trump won, he would be accused of reviving the investigation so Hillary would lose.
Given that Hillary looks likely to win the election, he can claim that his release didn't adversely affect the outcome. That's about the best result he could hope for.
EPB offers fiber because last-mile fiber was part of the new smart grid power system (and why not use all the extra bandwidth). The actual company offering the service is EPB Fiber Optics which leases the lines from EPB.
NOTHING keeps Comcast from leasing those same lines at the same rates (or even taking the matter to court if they believe the cost is too high). They simply refuse and instead offer sub-par service with 300GB data caps (guaranteed to rack up huge overages if you're a cord-cutter). Your territory idea only works when greedy corporations with state-granted monopolies aren't in the mood to abuse the people locked into their service.
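To put that 300GB cap in perspective, here's some back-of-the-envelope arithmetic. The per-hour streaming rates are my own rough assumptions (typical published estimates, not figures from the comment or from Comcast):

```python
# Rough sketch: how quickly a cord-cutter exhausts a 300 GB cap.
# Assumed rates: HD streaming ~3 GB/hour, 4K streaming ~7 GB/hour.
CAP_GB = 300
HD_GB_PER_HOUR = 3
UHD_GB_PER_HOUR = 7

hd_hours = CAP_GB / HD_GB_PER_HOUR    # 100 hours of HD per month
uhd_hours = CAP_GB / UHD_GB_PER_HOUR  # ~43 hours of 4K per month

# Spread over a 30-day month, that's only ~3.3 hours of HD per day
# before overage charges start.
print(round(hd_hours / 30, 1))
```

Under those assumptions, a household replacing cable TV with streaming blows through the cap with ease.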
For the record, EPB is good enough at their job that they already have agreements to do the same thing in northwest Georgia and are still in talks to offer the same thing in northeast Alabama. People want and need good service from companies that aren't out to screw them over, and they'll go wherever necessary to make that happen.
Your control over what you make ends when you sell it to someone else. In the digital age, that means selling something opens you up to the buyer making an unlimited number of copies. The only difference is that the government believes selling what you bought is reason enough to strip you of the "god-given rights" it claims to protect, to seize everything you have worked for, and to take away your freedom and inflict permanent harm.
What about the poor author then?
There are many ways of making money without harming others. The first is not to sell something unless you get the price you desire (this is what software developers do, as do book authors when you consider that most books are out of print within 5 years). The second is patronage by someone interested in your continued creation of works (a very ancient and proven tradition). The third (particularly relevant to musicians) is to offer performances for a fee. There are other alternatives, but using the government as your personal mafia is a much lazier solution for corrupt artists and businessmen.
The team who made AlphaGo deserve credit, but their approach (at a high level) isn't so revolutionary. Go AI developers moved away from purely brute-force tree pruning (like what Deep Blue used) a long time ago.
The first big change was using pattern recognition (matching sub-sections of the board against already-known patterns) to prune faster. The second (and far more revolutionary) change was to apply an upper confidence bound to Monte Carlo simulations. This is where computers gained the ability to bypass those billions of moves at the cost of a margin of error. The third was using neural nets to balance brute force against pattern matching while managing the confidence levels of the Monte Carlo simulations.
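The upper-confidence-bound idea fits in a few lines. This is a sketch of the generic UCB1 selection rule from Monte Carlo tree search (UCT), not AlphaGo's exact formula; the function name and exploration constant are illustrative:

```python
import math

def ucb1(child_wins, child_visits, parent_visits, c=math.sqrt(2)):
    """UCB1 score for picking which move to explore next in MCTS.

    Balances exploitation (observed win rate) against exploration
    (moves that have been tried rarely relative to the parent).
    c is the exploration constant; sqrt(2) is the classic choice.
    """
    if child_visits == 0:
        return float("inf")  # always try unvisited moves first
    exploit = child_wins / child_visits
    explore = c * math.sqrt(math.log(parent_visits) / child_visits)
    return exploit + explore

# The search repeatedly descends the tree choosing the child with the
# highest UCB1 score, runs a random playout from the resulting leaf,
# and backs the win/loss result up the visited path.
```

The margin of error mentioned above comes from the playouts being random samples: the more a move is visited, the tighter its confidence bound gets.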
The biggest difference with AlphaGo is corporate backing. I don't know how many people Google put on the job, but the paper lists 20 authors (so probably more than that). Buying and running supercomputers is extremely expensive as well. With the exception of Darkforest (Facebook's Go machine which, as expected, appears to use a similar design), most teams consist of a few people on small budgets without someone willing to spend millions to buy and run supercomputers for them.
I believe the official Guinness record is 8.429 GHz on a pre-release AMD Bulldozer chip in 2011. Another record was set at 8.723 GHz on an AMD FX-8370 in 2014, but I don't recall it being "official".
Companies don't pay FICA taxes for contractors (the contractor pays all of them instead of splitting them 50/50 with an employer). Some states don't require companies to pay workers' comp on contractors either. Contractors also don't get overtime, travel compensation, 401k matching, insurance benefits, etc. (there's not even a minimum wage).
If a contractor works for anywhere close to the same hourly rate as an equivalent employee, that contractor will be making a lot less.
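The FICA half alone can be sketched with a quick calculation. These are the standard US rates (6.2% Social Security + 1.45% Medicare per side); wage caps and the self-employment-tax deduction are ignored to keep the comparison simple:

```python
# Comparing the FICA burden for a W-2 employee vs. a 1099 contractor
# billing the same gross amount. Simplified: ignores the Social
# Security wage cap and the deductible half of self-employment tax.
FICA_RATE = 0.062 + 0.0145  # 7.65% per side

def employee_fica(gross):
    # The employer pays a matching 7.65% on top; the worker
    # only ever sees their own half withheld.
    return gross * FICA_RATE

def contractor_fica(gross):
    # A contractor owes both halves as self-employment tax.
    return gross * FICA_RATE * 2

gross = 100_000
print(employee_fica(gross))    # ~7,650
print(contractor_fica(gross))  # ~15,300
```

So on $100k gross, the contractor is already ~$7,650 behind on payroll tax alone, before accounting for lost benefits, overtime, and workers' comp.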
Person A: "What's 0.1 + 0.2?"
Person B: "Let me check my computer, one sec... OK, looks like around 0.30000000000000004."
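Person B's computer isn't broken. A quick sketch of what's going on (Python shown, but any language using IEEE 754 doubles behaves the same way):

```python
import math

# 0.1 and 0.2 have no exact binary representation as IEEE 754
# doubles, so the nearest representable values get added and the
# tiny rounding errors surface in the result.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# The usual fix is to compare within a tolerance instead of exactly:
print(math.isclose(0.1 + 0.2, 0.3))  # True
```

The same surprise shows up in JavaScript, C, Java, and anything else built on binary floating point; decimal types exist precisely for when this matters.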
I forgot to mention Nvidia's Denver core. They dropped it in favor of the A57 and I don't think we'll see it again for a while. The original reason for making it seems to have been x86 emulation (literally the next generation of Transmeta), but their lawsuit settlement with Intel sank that ship, leaving them to repurpose the architecture for ARM. I like the Transmeta idea, but like Bulldozer it falls short in practice at present. I think we'll see something similar return in a few years, but for now fixed-function designs reign supreme.
Looking at the latest in the ARM landscape, we have Apple's A9, Qualcomm's Kryo, ARM's A57 and A72, and AMD's K12. We can probably expect a small jump in Apple's performance next year along with a second revision of Kryo, but nothing competitive with Intel. The A57 is being dropped for the fixed-up A72 after Apple screwed over ARM (tl;dr: Apple shipped a new architecture in 2 years while ARM took almost 4 years for an inferior product. Everyone in the industry knows that taking a new architecture from design to shipping runs 4-5 years, which means either ARM screwed over all their non-Apple partners (and themselves) by giving Apple a head start, or Apple forced ARM to adopt a new ISA that Apple had already spent a couple of years working on). Of all these architectures, I think only the A72, AMD's A57 implementation, and AMD's K12 are worth focusing on.
The A72 supposedly approaches the performance of Intel's Core M processors, but I'm willing to bet that the stock A72 can't actually compete with Skylake's wide dispatch, SMT, and vector units. The biggest question in this area isn't actually the CPU so much as all the "uncore" parts surrounding it. Even if it could have these things in theory, the companies controlling most of the patents in this area aren't using the A72 (AMD, Intel, IBM, Oracle, etc.).
AMD's first generation of ARM processor (launching next year) is an A57 server part, but it will probably be faster than most A72s in practice: it can be manufactured on a high-performance (rather than bulk) fab process and will have faster buses, faster memory, and much larger caches, and some parts of the core (like the branch predictor) may well be replaced with better designs while AMD reworks the architecture for the new fab. This chip will probably be competitive in the low-power server market, but it most likely won't be aimed at anything mobile.
Not much is known about AMD's K12, but for the first time, an ARM company seems to be moving into the higher-performance mobile segment. AMD failed with Bulldozer (and has taken heat for beating that dead horse for the past few years), but they at least had the sense to hire Jim Keller to help them make a couple of new, next-gen architectures. While AMD has money troubles, it's in the intellectual-property sweet spot to be able to put together a competitive chip. This is the chip I think Linus wants, but it's been pushed to 2017.
The complete unknown is Intel. They picked up StrongARM from DEC in '97 (as part of a legal settlement), developed it into XScale, then sold that business to Marvell in '06. I find it hard to believe that Intel isn't experimenting with ARM designs again. Even if they could make x86 compete in the low end (Atom has been a failure in that regard), convincing companies to switch will probably prove impossible, as the current situation with lots of competing CPU providers works to those companies' fiscal advantage. Apple won't give up the freedom to make its own chips (nor will Samsung). That said, I don't think we'll see an Intel ARM chip before 2018-19.
tl;dr -- the current chips can't compete with Intel. The ones that can don't launch until 2017 or later.
While success is desirable, all the companies that signed deals with Marvel are required to make a movie every few years to avoid the movie rights reverting to Marvel.
We saw the same thing happen with Sony's reboot of Spider-Man. It was cheaper to make a couple of terrible films than to let the rights revert to Disney, where Sony would never get another chance at the intellectual property.
No matter which side you support (or if you're an egalitarian like me), researching and fact-checking are important. Just because a group that hates MRAs says RooshV is one does not make it so. The fact that those sources are either misinformed or deliberately mischaracterizing him should raise questions about everything else they say in the future.
Nothing motivates a man more than to see his boss put in an honest day's work.