The weird thing is, apparently the procedure when one of the pilots leaves the cockpit is for a flight attendant to go inside temporarily until he returns - I imagine specifically to prevent exactly this kind of scenario. If that's true, they probably simply didn't apply the protocol, especially on such a short routine flight. Hopefully this event will make sure it gets applied in the future (still, can't wait for planes to be entirely remote or AI controlled to avoid this kind of thing).
Maybe the pilot was extremely depressed (and incredibly selfish - although the depression may have been so strong he didn't even care anymore) and decided he couldn't take it anymore. Or maybe he played Kerbal Space Program and decided to try some good old lithobraking.
That being said, it also includes some features for tracking continuous sessions based on L7 filtering, provides limited GeoIP resolution, and so on - and it at least provides a framework for developing more advanced analysis.
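To illustrate what "tracking continuous sessions" can mean in practice, here's a minimal, purely hypothetical sketch (not the tool's actual code or API): packets sharing a flow key are merged into one session as long as the gap between consecutive packets stays under an idle timeout.

```python
from collections import defaultdict

# Illustrative session tracking: the flow key, the timeout value, and the
# packet tuple layout are all assumptions for this sketch.
IDLE_TIMEOUT = 30.0  # seconds of idle time before a new session starts

def track_sessions(packets):
    """packets: iterable of (timestamp, src_ip, dst_ip, dst_port),
    assumed sorted by timestamp. Returns {flow_key: [(start, end), ...]}."""
    sessions = defaultdict(list)
    for ts, src, dst, port in packets:
        key = (src, dst, port)
        if sessions[key] and ts - sessions[key][-1][1] < IDLE_TIMEOUT:
            start, _ = sessions[key][-1]
            sessions[key][-1] = (start, ts)   # extend the current session
        else:
            sessions[key].append((ts, ts))    # open a new session
    return sessions

packets = [
    (0.0, "10.0.0.1", "93.184.216.34", 80),
    (5.0, "10.0.0.1", "93.184.216.34", 80),
    (120.0, "10.0.0.1", "93.184.216.34", 80),  # long idle gap -> new session
]
result = track_sessions(packets)
print(result[("10.0.0.1", "93.184.216.34", 80)])  # [(0.0, 5.0), (120.0, 120.0)]
```

A real implementation would key on the full 5-tuple and handle out-of-order packets, but the core idea is just this idle-gap heuristic.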
As others have said since this release, it is at least an open source base framework for developing more advanced stuff, and it provides library integration points for other software. As basic as it is, it might become a common framework for openly developing an advanced traffic analysis tool (after a careful reading of the code, any reasonably good expert could produce similarly capable code in a matter of days, and probably already has as an interesting case study/exercise - I know I did, limited to HTTP analysis, but still). That can only be a good thing, if only to consolidate efforts in that direction towards a universal traffic analysis tool for forensics and so on.
Any code being released open source is always a plus
how does the DoJ know what percentage of ANYTHING is going through it
That's an easy one: they make that number up to spread FUD about it. And read the article I linked, BTW (and the source it links): the number in question has been easily disproved with a relatively simple analysis of hidden service hostname resolutions. Tor is not the USA's tool - it is open source, publicly available software that was originally financed by a part of the US government, period.
I realize all the Snowden revelations have made lots of people a bit paranoid (which is mostly a good thing); but the fact is, it is extremely unlikely that tor is compromised in any serious way. Barring human error, tor mostly works - some attacks are possible, and there are demonstrated attempts by the NSA and others to compromise it (with extremely limited results, both in scope and in duration); however, I have not seen a shred of evidence suggesting a serious compromise. This growing meme that "tor is broken for good", and the larger one that "if it's connected to a network, it's accessible by the NSA", is simply bullshit.
The thing is, in all likelihood, tor works. GPG works. If you encrypt something with GPG and the key is not available to them, even the most powerful security agency on the planet will not be able to read it. The advances in basic mathematics and/or computing required to break those are so extreme that they would be very, very hard to hide. And nothing in the Snowden leaks has suggested that those have been broken - quite the contrary, in fact, since several of the revealed documents suggest that tor and the growing use of encryption are a serious problem for those agencies.
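For a sense of scale on why brute force isn't a threat model here, a back-of-envelope calculation: assume a wildly generous 10^18 key trials per second (far beyond any disclosed capability) against a 128-bit symmetric keyspace. The trial rate is an assumption picked purely for illustration.

```python
# Brute-forcing a 128-bit keyspace at an absurdly optimistic trial rate.
keyspace = 2 ** 128
trials_per_second = 10 ** 18          # assumed, deliberately generous
seconds_per_year = 365.25 * 24 * 3600
years = keyspace / trials_per_second / seconds_per_year
print(f"{years:.2e} years")  # on the order of 1e13 years
```

That's roughly a thousand times the age of the universe, which is why any real break would have to come from a mathematical or quantum advance, not raw computing.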
That doesn't mean that it'll stay that way, mind; personally, I think some sort of quantum computing might be within reach of those same agencies in a few years (and in the meantime they are dumping and storing all the encrypted, unbreakable traffic they can - waiting for that day, I imagine) - and even that personal opinion will seem paranoid and far-fetched to most experts in the field. But for now, the most likely hypothesis is simply that those encryption algorithms and protocols are still secure. If you have any shred of evidence that this is not the case, please link it - I'd genuinely appreciate it.
And finally, about Ulbricht and the other dark net markets taken down more recently: all of those have been clearly linked to human error, based on corroborating testimony from several parties. So sure, you can believe that this is entirely parallel construction, and you can also believe that Obama and most of the Five Eyes countries are bitching about encryption more and more to present a plausible-deniability front while decrypting everything in the background; but right now, once again, there's nothing public even hinting at that.
Apparently, Mozilla is considering eventually deploying exit nodes as well though.
Finally, for those who will scream "child porn", it should be noted that a very, very small minority of tor traffic is actually linked to that type of content, despite what the DoJ says; the best estimate from the tor project is around 1.5%. This move by Mozilla is a good thing - among other things, it helps countless defenders of freedom in oppressive regimes speak up in safety.
A quick search seems to suggest it would be theoretically possible to have a 64-bit Firefox talking to a 32-bit plugin-container loading, say, the Flash plugin; it appears, however, that this would require an IPC bridge between the two processes to perform some sort of conversion (which suggests the way the two processes communicate is somehow arch-dependent? I don't know enough about xulrunner to confirm).
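One plausible reason such a bridge would be needed: in-memory struct layouts differ between a 32-bit and a 64-bit ABI (pointer width and alignment), so raw messages can't just be passed across. A toy illustration using Python's struct module to model the two layouts (the struct shown is hypothetical, not Firefox's actual IPC format):

```python
import struct

# Model how a C struct { int32_t id; void *ptr; } lays out under:
#  - a 32-bit ABI: 4-byte pointer, no padding          -> 8 bytes
#  - a 64-bit LP64 ABI: 8-byte pointer, forcing 4 bytes
#    of padding after the int for alignment            -> 16 bytes
layout_32 = struct.calcsize("<iI")    # int32 + 32-bit pointer
layout_64 = struct.calcsize("<i4xQ")  # int32 + pad + 64-bit pointer
print(layout_32, layout_64)  # 8 16
```

Since the same logical message occupies different sizes and offsets on each side, a bridge has to marshal field by field rather than copy bytes - which is roughly what a cross-arch wrapper has to do.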
There's a compatibility layer that apparently exists to do exactly that: nspluginwrapper.
Myself, I tend to avoid the headache and simply run the 32-bit version of Firefox, even on an amd64 system.
later replicated by Toshiba
It's actually Toyota (and makes a lot more sense) - my bad.
But the "inventor" hooked up the meter, no?
No. The entire experiment was set up by the researchers themselves; the lab has no connection to Rossi, and none of the equipment came from him or was set up by him. His only involvement was being present for the initial "fuel" insertion and the ash retrieval at the end, while being monitored (though that's more than enough to be suspicious of the alleged transmutation and to suspect some sort of swap - still, it doesn't explain the excess energy).
Oh, he says words which he calls an explanation, but they fly in the face of already-understood theory, and he offers no explanations about why already-understood theory is wrong.
Agreed on this - it should be noted, though, that Rossi is not the only one claiming excess energy and transmutation from these kinds of mechanisms; look up, for example, the MIT NANOR device (a small-scale device that put out excess energy for more than a month straight), or the Mitsubishi transmutation claims in similar devices (later replicated by Toshiba). There are also other companies claiming similar things (Brillouin, for one).
If this thing works (and that's obviously a big if), then I'd suspect Rossi discovered it mostly by accident, and that he has no precise idea himself of how it actually produces energy. IIRC, the few initial theories proposed are based on the idea of nano-scale lattices with hydrogen trapped inside; combined with some sort of excitation (usually EM, although apparently not the only kind that produced results), this would somehow allow the Coulomb barrier to be overcome at those scales and a limited-scale, radiation-less (how?) fusion to occur.
This is of course all pretty much impossible given our current understanding of physics; so if it does work somehow, it's wonderful news even if it cannot be harnessed for energy, because it might lead to new, exciting physics.
I don't see that anybody checked the "reactor" coating materials for rare earth dopants.
Read the report (specifically page 8 and annex 2) - they actually analyzed the device's coating material. It was Al2O3 (and this was taken into account in the calorimetry), with no other obvious compounds.
While there are possible calorimetry issues here, it's hard to see an obvious one that would explain such a large measurement error; alumina's IR transparency has been considered, as well as IR calibration issues (especially given the imperfect dummy test), and neither appears to be a valid criticism (see my comment here for details).
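As a generic illustration of why IR calorimetry is touchy (this is simple Stefan-Boltzmann arithmetic, not the report's actual analysis, and the numbers are made up): radiated power scales as T^4, so any systematic error in the inferred surface temperature is amplified roughly fourfold in the power estimate.

```python
# Stefan-Boltzmann: P = emissivity * sigma * A * T^4
SIGMA = 5.670e-8  # W / (m^2 K^4)

def radiated_power(emissivity, area_m2, temp_k):
    return emissivity * SIGMA * area_m2 * temp_k ** 4

# Hypothetical numbers, purely for illustration.
true_p = radiated_power(0.7, 0.05, 1400.0)
high_p = radiated_power(0.7, 0.05, 1400.0 * 1.10)  # temperature read 10% high
print(f"power ratio: {high_p / true_p:.3f}")  # 1.1^4 ~ 1.464
```

So a 10% temperature error alone inflates the power estimate by ~46% - large, but still well short of explaining the multiple-of-input excess the report claims.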
Given the extraordinary claims, extraordinary evidence is obviously required here, and this report definitely isn't that. Its experimental protocol and results are, however, more than enough to warrant further investigation - which may be hard, given that this isn't a "classical" experiment that can be easily replicated: you basically need Rossi/Industrial Heat (the company that acquired Rossi's device and tech) to provide you with the black box and stay the hell away from the test. This is the first time he actually did that, and even here he couldn't help being present for the initial "fuel" insertion and the ash extraction at the end of the experiment - which inevitably renders the isotopic changes suspicious.