Comment Re:calling it (Score 1) 239

Iraq was not engaged in an invasion of Kuwait or Iran prior to the 2003 invasion; we fought the first Iraq war to push Saddam out of Kuwait, and he stayed out. The Iran-Iraq war was long, long over before the 2003 invasion. Likewise, Iraq last attacked Israel in 1991, not at any time remotely near the 2003 invasion.

I do not have time to write an essay to rebut each and every one of your highly misleading or outright false "points." Your post is bullshit and a good example of Brandolini's Law. Suffice it to say that the primary justification for the Iraq invasion of 2003 was that Iraq, due to its possession of weapons of mass destruction, was a threat to the international community, and Iraq possessed no such weapons. When it was discovered that no such weapons existed, supporters shifted to things like Iraq's human rights record and occasionally various random crap like the stuff you brought up.

Comment Re:There is a difference. (Score 1) 589

I'd take those odds.

You'd be an idiot to take those odds for 2 hours of entertainment. A 0.5% chance of getting killed each day compounds to roughly an 84% chance of being dead within a year (1 - 0.995^365 ≈ 0.84).

Risk/reward-wise, it would be better to knowingly have unprotected sex with an HIV-positive partner than to watch a movie where you have a 0.5% chance of getting killed. There's (supposedly) a 1% chance or so that HIV gets transmitted through heterosexual unprotected sex per act. I imagine you'd get more entertainment out of the sex than the movie. And even if you do contract HIV, you'd still have a decade or so of good life left before AIDS gets you. Blown up movie theater = dead right away.

If you're going to use statistics, make sure you know how to use them at least mostly correctly...
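
For what it's worth, here's a quick back-of-the-envelope check (a minimal Python sketch; the 0.5%-per-movie and ~1%-per-act figures are the ones quoted above, not real-world estimates):

    # Probability of at least one "hit" over n independent exposures, each with risk r.
    def cumulative_risk(r: float, n: int) -> float:
        return 1 - (1 - r) ** n

    # One 0.5%-risk movie every day for a year:
    print(f"Dead within a year:   {cumulative_risk(0.005, 365):.0%}")  # ~84%

    # A single act with a ~1% per-act transmission risk:
    print(f"Infected, single act: {cumulative_risk(0.01, 1):.0%}")     # 1%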

Comment Re:Why not ask the authors of the GPL Ver.2? (Score 3, Interesting) 173

So, it may upset you, but the foundation of the legal system is, more or less, that until a judge rules on it and there is legal precedent ... you don't really know if it holds water or not.

Bullshit. The entire point of having a legal system based on written law is so that people know what the law is without having to just try things and then see if the executive arrests them. There are places in the law that are rough and where you really don't know what a judge will do -- "new areas of the law" -- but, in most cases, you do know what a judge will do, because of statute and precedent in similar cases. This certainty is what gives the law its value.

The GPL is a fairly simple document. It's pretty clear what it means, so we really don't need a judge to tell us. This court case might clear up a few corner cases about the consequences of infringement (forced relicensing versus a simple injunction plus damages), but it is effectively impossible that the entire document will be held null and void. There's enough precedent establishing that a copyrighted work can be licensed conditionally that the GPL's general validity is not in doubt.

Comment Couldn't Find Parts (Score 1) 269

Some people over on the Apple.com forums are claiming that the hard disk that went into the iPod classic isn't being made anymore and that Apple was therefore essentially forced to discontinue the product because it couldn't find parts for it. Obviously they could try to find another supplier, make the hard drives themselves, etc., etc., but I guess the ROI wasn't there for them to bend over backwards to keep it going.

Comment Re:You are not in control (Score 1) 113

No it doesn't. If a trait provides a reproductive benefit, and it is monotonically marginally beneficial, then life will almost always find a way to evolve it.

You're basically asserting without evidence that there's an invisible hand of evolution. There's not. Some mutations happen much more frequently than others, and, if a trait can only be expressed through a sequence of very rare mutations, it might take a very, very long time (as in, "will never happen in a trillion trillion years") for evolution to be expected to get there. Other mutations just happen, for no apparent adaptive reason, at least as far as we can tell. Evolution will reliably eliminate a mutation only if it's significantly maladaptive. If a trait is only marginally maladaptive where it's present (example: blue eyes, which are much more maladaptive in very sunny places than in the north), it can randomly be carried forward by drift, at least for ~100,000 years or so.
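
To make that concrete, here is a minimal Wright-Fisher-style drift simulation in Python (a sketch only; the population size, starting frequency, and selection coefficient are made-up illustration values, not anything from this thread). Even a mildly disfavored allele often hangs around for thousands of generations, or occasionally fixes, purely by chance:

    import numpy as np

    def wright_fisher(n_individuals=1000, p0=0.1, s=0.001, generations=5000, rng=None):
        """Track a slightly deleterious allele under weak selection plus drift."""
        rng = rng or np.random.default_rng()
        n_alleles = 2 * n_individuals            # diploid gene pool
        p = p0
        for gen in range(1, generations + 1):
            # Weak selection against the allele (relative fitness 1 - s).
            p_sel = p * (1 - s) / (p * (1 - s) + (1 - p))
            # Drift: binomial resampling of the next generation's gene pool.
            p = rng.binomial(n_alleles, p_sel) / n_alleles
            if p == 0.0 or p == 1.0:             # allele lost or fixed
                return gen, p
        return generations, p

    runs = [wright_fisher() for _ in range(20)]
    persisting = sum(1 for _, p in runs if p > 0)
    print(f"{persisting}/20 runs still carry the mildly maladaptive allele after 5000 generations")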

Comment Re:You are not in control (Score 1) 113

I've got another just-so story:

Deer will also disproportionately abort female fetuses during harsh winters. Offspring born after hard times are likely to be stunted and inferior. Even if it is disadvantaged, a male offspring is still more likely to reproduce, because the male reproductive system is simpler and therefore less likely to be affected by fetal malnutrition. So carrying a disadvantaged daughter to term, when she is likely to be less fertile, is a waste of resources.

The implications for reasoning with just-so stories are left as an exercise for the reader.

Comment Re:C is very relevant in 2014, (Score 1) 641

Bullshit. VLSI code is almost always verified by finite models, and many processors are verified down to the level of mathematical axiom.

Bullshit on your bullshit: no it's not, and no they're not, not even close. Hardware companies have a fetish for formally verifying floating point stacks because, 20 years ago, the fickle and vacuous mainstream press decided one particular piece of errata in one particular processor -- the Pentium FDIV bug from 1994 -- was important for some reason, even though every processor ever made and used has errata. AMD took advantage of Intel's bad publicity to formally verify their own FDIV instruction -- JUST the FDIV instruction, mind you -- and then formal verification of floating point stacks became something of a thing. There's nothing more going on than that.

Take a look here, in the section "Errata": http://download.intel.com/desi...

Doesn't look like they "proved" that VLSI very well to me, although they doubtless subjected it to a fuckton of simulation hours. Which is what they should be doing; theorem proving for software or silicon is usually a ton of effort for little gain. Simulation hours cost much less than developer time. Our processors would likely be at 486 level today if the designers had to prove everything correct. If that.

Provably correct software code exists in small amounts, and its emergence is inevitable.

Said the formal verification researchers, for 30 years or so now.

Comment Re:C is very relevant in 2014, (Score 1) 641

Impressive. But they verified about 9000 lines of C code, and, by their own admission, it's a brittle verification (meaning that if they change anything substantial they have to do a lot of work to re-prove it). They specifically say, in one instance, that it took one man-year to verify a change to 5% of the code base. That's roughly 450 lines of code.

A year. To change ~450 lines of code. IMO, verifying software is still more a gimmick than anything. We've been writing reliable avionics software without formally proving it for over 30 years. It takes a ton of effort, but so does formal verification.

But thanks for the link, that's an interesting pig they made fly.

Comment Re:C is very relevant in 2014, (Score 1) 641

Look at the IR in the LLVM project, which has allowed an explosion of languages that can enjoy most of the same compiler optimizations the C family enjoys, using this principle.

Umm ... LLVM is a fairly conventional, if well-designed, compiler, and its backends certainly do have to have a model of the processor, and know how to generate assembly from the IR, and all that. GP is right: you can't get away with no one knowing how the processor works.
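
As a concrete illustration of that point, here's a minimal sketch using the llvmlite Python bindings (my example, assuming llvmlite is installed; nothing here comes from the GP's post): building the IR is target-independent, but you cannot get assembly out of it without first constructing a target machine, i.e. a model of the processor.

    from llvmlite import ir, binding as llvm

    # Build a trivial, target-independent IR function: int add(int a, int b).
    module = ir.Module(name="demo")
    i32 = ir.IntType(32)
    fn = ir.Function(module, ir.FunctionType(i32, [i32, i32]), name="add")
    builder = ir.IRBuilder(fn.append_basic_block("entry"))
    builder.ret(builder.add(*fn.args))

    # To emit assembly, LLVM needs a target machine (a model of the processor).
    llvm.initialize()
    llvm.initialize_native_target()
    llvm.initialize_native_asmprinter()
    target_machine = llvm.Target.from_default_triple().create_target_machine()
    mod = llvm.parse_assembly(str(module))
    mod.verify()
    print(target_machine.emit_assembly(mod))   # backend lowers the IR to native asm

The IR itself never mentions x86 or ARM; all of that knowledge lives in the backend behind create_target_machine(), which is exactly the GP's point.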

And provably correct code is still a pipe dream.
