Proper education, while the ideal fix, takes decades to have an effect. All the other changes are short-term, with immediate benefits.
We choose between the party that taxes us to subsidize farmers and Hollywood, or the party that taxes us to subsidize banks and oil companies. You may claim there is a difference, but I don't see enough of one for it to matter.
Even assuming I grant you the above, the difference is clearly between making food cheap and life entertaining vs. making your air unbreathable and gambling with your money to the point where you may not be able to afford to eat tomorrow. No difference, you say? Yeesh.
As many others have stated, use a tool that computes a hash of file contents. Coincidentally, I wrote one last week to do exactly this when I was organizing my music folder. It'll interactively prompt you for which file to keep among the duplicates once it's finished scanning. It churns through about 30 GB of data in roughly 5 minutes. Not sure if it will scale to 4.2 million files, but it's worth a try!
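For the curious, the core of such a tool is only a few lines of Python. This is a rough sketch of the general approach (not my actual script; the names are made up): bucket by file size first, so you only hash files that could possibly be duplicates. That size pre-filter is also what gives it a fighting chance at 4.2 million files.

    import hashlib
    import os
    from collections import defaultdict

    def file_hash(path, chunk_size=1 << 20):
        # Stream the file in 1 MB chunks so huge files don't blow up memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk_size), b""):
                h.update(block)
        return h.hexdigest()

    def find_duplicates(root):
        # Pass 1: bucket by size. Files of different sizes can't be identical.
        by_size = defaultdict(list)
        for dirpath, _, names in os.walk(root):
            for name in names:
                path = os.path.join(dirpath, name)
                try:
                    by_size[os.path.getsize(path)].append(path)
                except OSError:
                    pass  # unreadable or vanished file; skip it
        # Pass 2: hash only the candidates and group by digest.
        by_digest = defaultdict(list)
        for paths in by_size.values():
            if len(paths) > 1:
                for path in paths:
                    by_digest[file_hash(path)].append(path)
        return [paths for paths in by_digest.values() if len(paths) > 1]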
I don't know much about OAuth, but this sounds like a stupid move.
No, it's how it should have been to begin with. Bearer tokens are now pure capabilities supporting arbitrary delegation patterns. This is exactly what you want for a standard authorization protocol.
Tying crypto to the authorization protocol is entirely redundant; TLS already gives you a secure channel. For one thing, requiring crypto immediately eliminates web browsers from being first-class participants in OAuth transactions. Bearer tokens + TLS make browsers first-class, and it's a pattern already used on the web quite a bit, albeit not as granularly as it should be.
His criticisms of bearer tokens are based on the ideals of authenticating identity, but bearer tokens in OAuth are about authorization. These are very different problems, and authentication actually impedes the delegation patterns that people want to use OAuth for.
Giving someone a bearer token authorizes them to use a resource on your behalf. That third party shouldn't have to authenticate with the resource as well; it could be a person or service that's entirely unknown to the resource, so authentication requirements actually prevent work from getting done. This just leads to awkward workarounds, which OAuth was supposed to prevent!
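To make the delegation pattern concrete: a bearer token is just an opaque string in an HTTP header (RFC 6750), so any party you hand it to can exercise the capability over TLS with a single request, no extra handshake. A Python sketch against a hypothetical resource server:

    import requests

    # Token the resource owner handed to us, the delegated third party.
    # No client certificates, no signatures, no separate login: possession
    # of the token over a TLS channel is the authorization.
    token = "2YotnFZFEjr1zCsicMWpAA"  # made-up example value

    resp = requests.get(
        "https://api.example.com/photos",  # hypothetical resource server
        headers={"Authorization": f"Bearer {token}"},  # RFC 6750 bearer scheme
    )
    print(resp.status_code)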
Give this man some points! He has it exactly right.
Right, because real estate is at such a premium that we can barely manage to fit four cores and 8 MB of cache on a single die, so we couldn't possibly afford a few hundred transistors to decode the arcane instruction set.
Cores can be shut down to conserve power, as can caches in some cases, but instruction decoders cannot. I think you underestimate how power usage scales with the number of transistors. Since this whole article is heavily biased toward low-power and mobile computing, that's a very relevant factor.
Except that any such taps are instantly detectable, at which point communication stops. Thus, at most 1 bit of information leaks out to an eavesdropper.
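For intuition on why the tap is detectable, here's a toy intercept-resend simulation in the style of BB84 (idealized Python, no channel noise, names made up). On positions where Alice's and Bob's bases match, their bits agree perfectly on an untapped channel; an eavesdropper measuring in a random basis corrupts roughly 25% of those bits, so comparing a small sample of the key exposes the tap:

    import random

    def measure(bit, prep_basis, meas_basis):
        # Same basis: deterministic outcome. Different basis: 50/50 coin flip.
        return bit if prep_basis == meas_basis else random.randint(0, 1)

    def bb84_error_rate(n, eve_present):
        errors = matches = 0
        for _ in range(n):
            a_bit, a_basis = random.randint(0, 1), random.choice("+x")
            bit, basis = a_bit, a_basis
            if eve_present:
                # Eve measures in a random basis and resends what she saw,
                # destroying the original preparation half the time.
                e_basis = random.choice("+x")
                bit, basis = measure(bit, basis, e_basis), e_basis
            b_basis = random.choice("+x")
            b_bit = measure(bit, basis, b_basis)
            if a_basis == b_basis:  # Alice and Bob keep only these positions
                matches += 1
                errors += (b_bit != a_bit)
        return errors / matches

    print(bb84_error_rate(100_000, eve_present=False))  # ~0.0
    print(bb84_error_rate(100_000, eve_present=True))   # ~0.25, tap detected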
This paper is a follow-up to the previous work you cited.
Of course, you're conveniently ignoring the microcode translator itself and the memory to store the microcode, which are significantly larger than merely thousands of transistors.
There's nothing inherently "superior" about ARM or PPC instruction sets.
Superior to x86? Sure there is. x86 is a mishmash of instructions, many of which hardly anyone uses except for backwards compatibility, but which still cost real estate on the CPU die. That's real estate that could be spent on a bigger cache or more registers. ARM is a much better instruction set by comparison.
Secretly filming your roommate having gay sex is a little worse than just saying something random and mean on slashdot.
Agreed, but you wouldn't have 15 charges levelled against you for a vicious comment. The "little worse" got him his jail sentence.
There is an open question of probable cause for search and seizure in such cases, because, like it or not, citizens need to be protected from their governments just as much as they need protection from terrorists. Even if a terrorist could set off a nuke in the middle of New York City, governments would still have caused more death and misery than all terrorist attacks combined. Which is truly the greater threat?
But if you exclude the volume of material and energy, then evolution is more "complicated" than just coincidentally popping into being.
No, it's not. Natural selection is axiomatically very simple, requiring maybe 3 or 4 axioms (see cellular automata). Spontaneous creation of anything requires far more than that. The difference is quite clearly demonstrated by comparing a system designed by a genetic algorithm with one designed by a programmer: the former has maybe 10 simple rules and the solution eventually emerges, while the latter has literally tens of thousands of rules.
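To put a number on "maybe 10 simple rules", here's a complete genetic algorithm in Python (a toy sketch; the target and parameters are arbitrary). The entire rulebook is score, select, cross over, mutate, repeat, and the solution emerges without anyone specifying it:

    import random

    TARGET = [1] * 64            # goal the GA only ever "sees" via scoring
    POP, MUT = 100, 0.01         # population size, per-bit mutation rate

    def fitness(ind):
        return sum(a == b for a, b in zip(ind, TARGET))

    def mutate(ind):
        return [b ^ (random.random() < MUT) for b in ind]

    def crossover(a, b):
        cut = random.randrange(len(a))
        return a[:cut] + b[cut:]

    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
    gen = 0
    while max(fitness(i) for i in pop) < len(TARGET):
        # Select: the fitter half become parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]
        # Reproduce: crossover plus mutation fill the next generation.
        pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
               for _ in range(POP)]
        gen += 1
    print(f"solution emerged in {gen} generations")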
I don't know what you're talking about. Equation size has nothing to do with this. A formal theory's complexity is defined purely by the number of axioms required to define it. Many Worlds does not require the measurement postulates that Copenhagen and other non-realist QM interpretations require, because the measurement postulates can be derived from the other axioms. That means Many Worlds is strictly simpler.
The number of assumptions required for Many Worlds is strictly less than all other interpretations of QM. The math is the same, so no, the device is simpler.
That's a hell of a lot of material, energy, computations, and/or real estate.
This is exactly what I was talking about. Most people think each universe somehow "clones" all the matter and energy for each branch it takes. This is not at all the case. Again, think of the universe as an n-qubit quantum computer, where n is very large. "Forking" a parallel computation to represent two possibilities of a quantum observable just means creating a superposed state to represent each outcome/universe, and it requires very little energy (just the energy for entangling the states).
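A toy illustration of that point in Python/numpy (just a 2-qubit sketch, not a claim about the actual physics of branching): the "fork" is a unitary that puts the existing amplitudes into superposition and entangles them with another degree of freedom. Nothing gets cloned, and the total norm of the state stays exactly 1:

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)

    # Hadamard: splits |0> into a superposition, the "fork" into two branches.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    branch = H @ ket0

    # CNOT: entangles a second qubit with the branch, so each "universe"
    # carries its own consistent record of the outcome.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    state = CNOT @ np.kron(branch, ket0)

    print(state)                  # amplitude only on |00> and |11>
    print(np.linalg.norm(state))  # still 1.0: nothing was duplicated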