Are you high? He was trained and employed as a spy *by the US CIA*. He is not admitting to espionage; he's saying that when the NSA paints him as a mere hacker and low-level IT guy, it's lying. And the CIA has now confirmed that the government has known all along that it was telling lies about who Snowden really is.
Apple specifically addressed this during their conference call. Sales are not down; if you look at two quarters combined, sales are flat or slightly up. Sales only appear to be down year-over-year because they had supply issues five quarters ago, which pushed sales from that quarter (which was low) into the start of the next quarter (which was high).
That hasn't been true pretty much ever. Back before Windows did privilege separation, anti-cheats scanned everything they could find; after the rise of UAC, PunkBuster and other anti-cheat systems added a prompt to permanently authorize their system-level service on the first run.
When you play "Valve Anti-Cheat" (VAC) enabled games, you agree to allow Valve to scan your computer for evidence of cheat/hack programs. This is what VAC does. It's like PunkBuster, Warden, etc. - depending on your point of view, it tries to level the playing field for multiplayer games, or it's an invasion of privacy because you have the right to cheat all you want.
Valve's VAC, Blizzard's Warden, etc. are all "spyware" by definition. Their job is to find and collect evidence of suspected game-tampering cheats, both known and unknown, and report them. They already sniff your running processes, window titles, loaded drivers, USB devices, filesystem, etc. Scanning your local DNS cache is probably one of the least invasive things that VAC does, *and it only happens when you play games which advertise the VAC feature*.
If you don't like this, don't play VAC-enabled multiplayer games. It's that simple.
No. Failing to deliver a quality product isn't the problem. The problem is if you promise to deliver a quality product, and then you fail.
It seems to me like Apple wouldn't have made the switch right away on iOS 6 if they weren't confident that the software was ready. Someone had to stand up and say, "This is ready" or "This is not ready". If Mr. Williamson was in charge of it, and he told his bosses with confidence that it was ready, he should be fired. That's pretty straightforward.
That seems like an oversimplification, since the DEFLATE algorithm includes a Huffman encoding step, and it is within the spec for the compressor to simply never emit back-references. It would take a horrible bug in a zlib implementation to produce worse compression than basic Huffman encoding.
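You can actually see that floor directly: zlib exposes a `Z_HUFFMAN_ONLY` strategy that disables back-references entirely, so the output is Huffman coding alone. A quick sketch (the test data is just made up for illustration):

```python
import zlib

# Highly repetitive input: full DEFLATE should crush this with
# back-references, but even Huffman coding alone beats the raw size.
data = b"abcabcabc" * 1000

# Default strategy: Huffman + LZ77 back-references.
default = zlib.compress(data)

# Huffman-only strategy: the compressor never emits back-references.
co = zlib.compressobj(strategy=zlib.Z_HUFFMAN_ONLY)
huffman = co.compress(data) + co.flush()

# Both are valid DEFLATE streams and round-trip cleanly.
assert zlib.decompress(default) == data
assert zlib.decompress(huffman) == data

# Huffman-only loses badly to full DEFLATE, but still compresses.
assert len(default) < len(huffman) < len(data)
```

With only three distinct byte values, the Huffman-only stream lands around 2 bits per symbol, while the default strategy collapses the repetition to a few dozen bytes.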
Although it has not been fully disclosed yet, my understanding is that the attack is only practical because of a sort of implicit "trust chain" in TLS 1.0, where knowledge of one block gives you all the information you need to decode the next block... but also where a correctly-predicted block tells you that you decrypted the previous block successfully. That's the kicker: if you are just making guesses at one block but know the contents of the next block, the encrypted result of the second block is a kind of oracle telling you whether you got the first block right.
My layman's understanding of the fix is that it neutralizes the oracle by adding additional variation. This means that you'd have to guess the random variation in order to craft an "oracle" packet that tells you that you guessed the previous packet correctly. Multiply the per-byte guessing space (2^8 possibilities) by the variation and you're up in "computationally infeasible" territory. The attack is thus neutralized.
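The chaining oracle is easy to show with a toy model. This is *not* real crypto — the "block cipher" below is just a keyed hash, and all the names and values are made up — but the CBC structure is the real one: TLS 1.0 uses the last ciphertext block of one record as the IV of the next, and the attacker can see it.

```python
import hashlib

BLOCK = 16
KEY = b"toy-secret-key"  # hypothetical; stands in for the session key

def E(block):
    # Toy deterministic "block cipher" (a keyed PRF). Only encryption is
    # needed to demonstrate the oracle, so invertibility doesn't matter.
    return hashlib.sha256(KEY + block).digest()[:BLOCK]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(blocks, iv):
    out, prev = [], iv
    for p in blocks:
        c = E(xor(p, prev))  # CBC: XOR plaintext with previous ciphertext
        out.append(c)
        prev = c
    return out

# Victim encrypts a secret block; attacker observes iv and ciphertext.
iv = b"\x00" * BLOCK
secret = b"hunter2_________"
(c1,) = cbc_encrypt([secret], iv)

def attacker_test(guess, seen_iv, seen_ct, next_iv, encrypt):
    # Craft P' so that P' xor next_iv == guess xor seen_iv.
    # The new ciphertext then matches seen_ct iff guess was right.
    crafted = xor(xor(guess, seen_iv), next_iv)
    (c,) = encrypt([crafted], next_iv)
    return c == seen_ct

assert attacker_test(secret, iv, c1, c1, cbc_encrypt) is True
assert attacker_test(b"wrong_guess_____", iv, c1, c1, cbc_encrypt) is False
```

Because the attacker can confirm one guess at a time, the search stays at 2^8 per byte instead of 2^128 per block — which is exactly why adding unpredictable per-record variation kills the attack.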
The point isn't to provide critical or useful feedback. The point is to provide positive reinforcement and emotion. Most human beings, believe it or not, enjoy feelings of acceptance and appreciation and despise criticism. You're welcome to tell them to go home until they grow a thicker skin, but then all you've got left is a bunch of nerds and engineers trying to maintain articles about 15th century art.
"Lost at a bar" probably means that someone left it on the bathroom counter, and another patron came in and swiped it as soon as he saw it, then realized it wasn't a normal phone and tried to sell it to tech sites for cash. Or maybe the bartender decided not to wait for the owner to claim it.
Not everyone is morally responsible. If the choices are 1) pre-planned conspiracy, or 2) average people behaving badly, it's always the latter.
As far as I can tell, the author of this paper just figured out a way to offload a bunch of memory management tasks to an idle CPU core, and then counted it as a performance gain. OK?
So, monolithic single-threaded applications can be made faster on multi-core systems, at the cost of bulk-allocating more memory than is actually required, and only if there is enough idle CPU to run your memory-management thread in realtime on another core. I am not exactly blown away.
The press release tries to trumpet it as some kind of major advancement in performance, but it's not a performance gain at all. If you ran four copies of these single-threaded benchmarks simultaneously, you'd probably see a net performance decrease.
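If I'm reading the idea right, the structure amounts to this: the application thread hands dead objects to a dedicated reclamation thread instead of freeing them inline. A toy sketch (hypothetical names; in CPython the GIL prevents real parallelism, so treat this as structure, not a benchmark):

```python
import queue
import threading

free_queue = queue.Queue()

def reclaimer():
    # The "memory management" thread: it drains deallocation work so the
    # application thread never pays the reclamation cost inline.
    while True:
        obj = free_queue.get()
        if obj is None:       # sentinel: shut down
            break
        del obj               # drop the reference; destructor cost lands here

worker = threading.Thread(target=reclaimer)
worker.start()

# Application thread: allocate, use, then hand off instead of freeing.
for _ in range(10_000):
    buf = bytearray(4096)
    # ... use buf ...
    free_queue.put(buf)       # deferred reclamation on the other thread

free_queue.put(None)
worker.join()
```

Note what this costs: every buffer stays alive until the other thread gets around to it, so peak memory is strictly higher — which is exactly the trade-off described above.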
Open source bugs get fixed because people notice and are bothered by the bugs. This is the biggest motivator of open source contributions - everybody has an itch to scratch. The bugs that get fixed fastest are the bugs that are encountered the most. And this is why the Microsoft guy is absolutely correct in his analysis.
Bad security is not a user-facing bug. Unlike with functionality bugs, there is little incentive for community members to identify and fix security bugs. Sure, the Linux kernel and other key packages will attract expert eyes, but the average random piece of open-source software will not.
Security analysis is both complicated and un-glamorous. There are not a lot of people attracted to that kind of work. There are even fewer people who would do it for free. The position of the linked article is that it's better to pay people to think about security than it is to rely on the principles of OSS. I agree 100%.
You would be amazed what firmware can do. Sony recently announced a revision to the physical disc format that places the pits closer together to increase storage by a significant percentage... and all existing Blu-ray drives will be made compatible via firmware.
There is a reason these drives cost so little to manufacture: the physical hardware is incredibly generic, and nobody really knows the limits of its capabilities.
There's a pretty big difference between "one second" and "50ms". And your assumptions about what is noticeable and playable are pretty much wrong when it comes to console controllers.
LAME MP3s are still compatible with the original reference spec. Sure, VBR sounds even better, but even 128kbps CBR is light-years ahead of the original generation of MP3 encoders in quality.
In theory, the BBC's new encoders could be making better use of error analysis, redistributing the bitrate to avoid highly-noticeable errors (such as macroblock artifacts) in exchange for increased error in less important/perceptible regions of the picture.
There is no theoretical reason that a perfectly encoded 9.6Mbps stream has to look worse than today's average 16Mbps stream.
In reality though, from the complaints it sounds like the BBC's new encoders are kind of shitty.
But those things require format changes. My point is that you can improve the decision making in the encoder without changing the output format at all.
If you play a 128kbps CBR MP3 authored with the latest LAME using an MP3 player from 1992, it will sound light-years better than the same MP3 from a 1992-era encoder. This is not because of new features; there is no new coding, no variable block sizes, just a better understanding of which frequencies are important for human hearing.