Virgin is a wealthy company backed by a very wealthy man.
And, according to Tom Bower, $200m in subsidies from the state of New Mexico, courtesy of a starstruck Bill Richardson.
Branson's core businesses are built around operating monopolies and extracting subsidies from governments.
Rossi's time in prison was due to uncleared allegations of tax fraud and toxic waste mishandling [wikipedia.org], which even if true would have little to do with this story.
He served time for them so they probably are true; and yes, this has everything to do with the story. This man lied to the government about his tax liability, and apparently lied to everyone with a false claim to convert toxic waste into oil.
Occam's razor sometimes shows that the seemingly improbable is actually the most likely explanation.
LOL. No it doesn't.
Occam's razor says (as a very basic summary) that in the absence of evidence or specific information, the proposal that requires the least assumptions is probably correct. Or, more conversationally, that in the absence of any better ideas, the simplest guess is probably the truth. The simplest guess here is that the guy is a fraud. The non-simple guess is that the guy is not a fraud and that our understanding of matter and energy to date (which is based on a huge body of actual scientific measurement and observation) is all wrong.
I think you are confusing this with the Spock/Sherlock Holmes maxim that when all the impossible proposals are eliminated, the one remaining, however improbable, must be the truth. That's a good maxim to live by; the problem is that we haven't eliminated the possibility that the guy is a fraud.
I can't think of a single good technology that originated at Sun
ZFS, DTrace?
On the contrary. What we have in filesystems at the moment is fragmentation.
We need people pitching in with stabilizing and fixing one major FS in Linux. It looks to me as if that should be btrfs.
Back in the day, the cutting masters from which LPs were pressed were inferior: the sound had to be modified to make it fit on the LP, and longer tracks had to have their levels cut so that the track pitch could be reduced enough to press them. There is absolutely no way any objective person could believe that these compromised masters, modified in order to fit on vinyl, were in any way superior to clean digital copies - except for pop music, which was exposed to the loudness problem.
These days I would have assumed that the same problem would exist so I don't get this about modern LPs at all. If I want the sound of an LP I'll listen to a CD while scrunching a packet of Rice Krispies next to my ear.
When ripping it checksums the CDs and confirms that they match in a database where others have submitted their checksums of the same CD.
I have CDs which date back to the 80s which, according to this checksum, are bit-for-bit accurate.
Try doing that with an LP.
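The verification idea is simple to sketch: compute a checksum over the decoded PCM data of each ripped track and compare it against checksums other people submitted for the same pressing. A minimal Python sketch under stated assumptions - plain CRC-32 stands in for the real offset-aware checksum, and the `submitted` dict is a hypothetical stand-in for the online database:

```python
import zlib

def track_checksum(pcm_bytes: bytes) -> int:
    """CRC-32 over the raw PCM samples of one ripped track.

    Real rip-verification systems use their own checksum schemes;
    plain CRC-32 is an illustrative substitute here.
    """
    return zlib.crc32(pcm_bytes) & 0xFFFFFFFF

def verify_rip(pcm_bytes: bytes, submitted: dict) -> bool:
    """Check our checksum against counts submitted by other rippers.

    `submitted` maps checksum -> number of submitters (a hypothetical
    stand-in for the shared database). The rip counts as accurate if
    our checksum matches one that others have independently reported.
    """
    return submitted.get(track_checksum(pcm_bytes), 0) > 0

# Toy example: two "rippers" got identical bytes from the same CD.
pcm = bytes(range(256)) * 4           # fake PCM data
db = {track_checksum(pcm): 2}         # two matching submissions
print(verify_rip(pcm, db))            # True: bit-for-bit match
print(verify_rip(pcm[:-1] + b"\x00", db))  # False: one byte differs
```

The point of the shared database is that a pressing defect or drive error is vanishingly unlikely to produce the same wrong checksum on two different drives, so a match is strong evidence of a bit-perfect rip.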
I have used another major static analysis tool at work, one of Coverity's competitors, and more than once I have had the "if you had paid attention to the static analysis reports, this problem could have been solved much more quickly and cheaply" discussion. In one case several weeks were spent chasing a particularly subtle and nasty memory tramper - which turned out to have been showing up in the analysis results all along.
False positives are certainly a concern. There is a tradeoff in the time spent (re)structuring the program so that they do not occur - a matter for the project lead. The same is true of compiler warnings: it's best to invest the time to clean them up and configure your build so that it breaks when they occur. You'll kick yourself later if you hit a bug that was revealed by a warning that was ignored.
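With GCC or Clang, the usual way to make the build break on warnings is `-Werror`; a hypothetical invocation (file and program names are made up):

```shell
# Enable a broad warning set and promote all warnings to errors,
# so an ignored diagnostic can never silently survive the build.
gcc -Wall -Wextra -Werror -o myprog main.c
```

The same policy can be wired into a Makefile or CI job so it applies to every developer, not just the diligent ones.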
I know that static analysis cannot catch all problems (duh) but I was curious, as this seems like a fairly classic example of accessing tainted data which in many circumstances the analyzers can spot. The blog post referenced above explains why this is.
Ah I should have checked that. Thanks.
OpenSSL is on the list of projects scanned by Coverity.
I wonder why exactly Coverity did not catch the Heartbleed bug. Most likely the scan wasn't set up to deal with OpenSSL's use of its own internal heap-management routines. That's something I would have thought should be fixed right away.
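The bug class itself is easy to model: a reply is copied out of a reused buffer using a length field taken from the request, without checking it against the actual payload size. A toy Python simulation of the pattern - the names and structure are mine, not OpenSSL's:

```python
def heartbeat_reply(buf: bytearray, payload: bytes, claimed_len: int) -> bytes:
    """Simulate the Heartbleed pattern: trust the peer's length field.

    `buf` stands in for a reused heap buffer that still holds stale
    data from earlier requests. We copy the payload in, then echo
    back `claimed_len` bytes - trusting the attacker's number.
    """
    buf[:len(payload)] = payload
    return bytes(buf[:claimed_len])   # BUG: no check against len(payload)

def heartbeat_reply_fixed(buf: bytearray, payload: bytes, claimed_len: int) -> bytes:
    """Same operation with the missing bounds check added."""
    if claimed_len > len(payload):
        raise ValueError("claimed length exceeds actual payload")
    buf[:len(payload)] = payload
    return bytes(buf[:claimed_len])

# Stale "heap" contents left over from a previous request:
heap = bytearray(b"secret-key-material!")
leak = heartbeat_reply(heap, b"hi", claimed_len=20)
print(leak)   # b'hicret-key-material!' -- 18 stale bytes leaked
```

This is exactly the tainted-length-into-copy pattern taint analysis is built to flag; if the tool can't see through a custom allocator, it loses track of where the buffer's bounds are.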
Act as registers? Huh?
Article says "In comparison, a lithium-ion battery typically starts out with a storage capacity of 200 mAh/g but maintains it for the life of the battery, Pyun says."
Hmm. I have lithium ion batteries that can't hold a charge at all.
And it's only partially to do with how they're used. Lithium-ion batteries lose capacity while in storage, which is why you should never buy a used one, or new old stock.
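The storage-loss point can be illustrated with a toy calendar-aging model. The fade rate below is an illustrative assumption, not a measured value - real fade depends strongly on temperature and state of charge:

```python
def remaining_capacity(initial_mah: float, years_stored: float,
                       fade_per_year: float = 0.04) -> float:
    """Toy calendar-aging model: capacity decays by a fixed fraction
    per year in storage, independent of any charge cycling.

    fade_per_year = 0.04 (4%/year) is an assumed, illustrative rate.
    """
    return initial_mah * (1.0 - fade_per_year) ** years_stored

# A cell that sat on a shelf for five years before purchase:
print(round(remaining_capacity(2000, 5)))   # ~1631 mAh
```

Even with this modest assumed rate, a "new" old-stock cell has quietly given up a noticeable chunk of its rated capacity before its first charge.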
That's part of why LLVM is better than gcc today.
Certainly, the project has attained its objective of being a simpler, faster compiler free of the FSF's politics.
But it isn't "better than GCC". It is targeted pretty much exclusively at x86, and looking at the project's website, many features are missing on other architectures (the assembly parser, I note). I also see no sign of advanced GCC features such as stack-smashing protection, mudflap and so on.