Comment open letter on the bug fix culture of peer review (Score 1) 786

Dear Michael,

The scientific high ground in this matter is to admit that the original peer review process sucked, lacking as it did any reviewer with sufficient statistical expertise to detect subtle methodological errors, and further, to admit that it does not require a PhD in any discipline to point this out (nor, especially, a peer-reviewed paper) if it happens to be true that the paper contained subtle methodological errors (which it did).

It's all well and good that the main result itself seems to have held up under additional scrutiny brought to bear once these admittedly small deficiencies were aptly pointed out. This does not change the fact that the original peer review sucked.

(Perhaps you were merely lucky that your result continued to hold water after your subtle statistical errors were properly addressed. This is why a result that merely holds up isn't worth much in a high stakes debate. Proof by hindsight does not strike me as adequate given the magnitude of societal change that effective mitigation seems to require. To me, the stakes seem to be high enough to demand that critical links in the argumentative chain are right in all necessary respects before they are attached to a giant political lever; or, failing to achieve the almost impossible demand of being right in all essential particulars in peer-reviewed published paper V 1.0, that the culture of climate science embrace with a blazing passion the art of the mea culpa bug fix.)

Ordinarily, the peer review process is not expected to be 100% water tight, as the standard pace of science is stately and the stakes are modest. In this example, your paper served as the fulcrum of the biggest political mud fight of the late twentieth century. If climate scientists think that the fate of humanity and the planet lies in the balance, there shouldn't be even an epsilon gap in the quality of the peer review process.

You can't have it both ways without looking like a complete idiot. And it sure doesn't help your cause to look like an idiot when you're being attacked in a thousand illegitimate ways.

Thanks for your attention to this matter. I look forward to the future scientific culture of rock solid peer review in the first instance.

Live long and prosper,
J. Random hockey fan

(By some strange twist of fate, this was the first item to cross my feed after spending thirty minutes flipping through Popper's The Logic of Scientific Discovery, which I'm presently reading to discover why David Deutsch, in particular, praises it so highly.)

Comment Re:Well Then (Score 1) 148

A funny screed, but in the end just as wrong as what it debunks.

The Mossad does not have a bottomless budget. As a result, they generally fabricate pieces of uranium shaped like cellphones in hundred lots. They have even more expensive intrusions, which they fabricate in lots of ten, and then they have the most expensive intrusion of all, which is fabricated like a James Bond concept car (not the car that Bond actually gets, but the one he might get ten years from now).

It really does matter to edit your SSH configuration file to bump yourself up from the 10^-9 cost bracket to the 10^-6 cost bracket.

Mossad is not magically exempt from the 80-20 law. They still try to use the cheapest effective method, and hope to haul in 80% of the catch for 20% of the effort.

If you're in the 99.999th percentile of pure evilness (backed by a private island gold reserve), it's no longer about casting a wide net, and moreover, you already know for certain that you're facing a Mossad-level adversary and you can proceed directly to paranoid schizophrenia.

If you're only in the 20th percentile of pure evilness (you fib on your tax return and download porn off some Shmoe's open wifi) it might just be true that Mossad-level adversaries filter feed at the cost-effective 10^-9 screening bracket.

They went to all this trouble to subvert NIST not because they couldn't break things otherwise, but because they couldn't afford to break things otherwise at the largest possible scale.

Comment Re:March isn't the only weakness. See WEP - RC4 br (Score 1) 148

In 2016, the attacks on ??? expand to ???. I'm not betting MY customers' security on the answer.

Good luck with having any customers by the time you whittle away every protocol with a potentially expandable attack surface.

As we don't even have a formal theory of quantum computation yet, but we do know that some things can be computed by quantum methods, I don't think any current protocol is entirely exempt from worrying cracks in the plaster.

Whatever you like to tell your customers, there's just no escaping this hard business of having to make a judgement call about which cracks to worry about and which to ignore.

Comment Re:Hiding is not effective (Score 1) 130

you will open that door

If your disk contains a large number of large files with the names entropy$N (of which the vast majority are actually full of entropy), the ability of the judge to distinguish a door from a wall declines to epsilon, at which point the judge might elect to sweat it out of you nevertheless (you're entirely screwed in this eventuality once you have no more passwords to divulge), but then so is the judge who gives a shit (some do) about the logical justification for his abuse of power (he can't actually know you're being willfully non-compliant—even more so if the file exercise_in_civil_liberty.c is found on your system containing code capable of having created those N-k entropy files).

[Yes, I'm aware that any stray disk subsystem metadata must support this story to the nth degree.]
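For concreteness, here's a minimal sketch of the chaff-generation idea above. The function name and layout are my own invention, not any real tool's; the point is just that PRNG output on disk is statistically indistinguishable from an encrypted container sitting among the decoys.

```cpp
#include <cstddef>
#include <cstdint>
#include <fstream>
#include <random>
#include <string>

// Hypothetical sketch: scatter n decoy files named entropy0..entropy(n-1),
// each filled with `bytes` bytes of PRNG output, so that any encrypted
// volume hidden among them cannot be told apart from the chaff.
// (Writes in 8-byte words, so sizes round up to a multiple of 8.)
void make_entropy_files(const std::string& dir, int n, std::size_t bytes) {
    std::mt19937_64 rng{std::random_device{}()};
    for (int i = 0; i < n; ++i) {
        std::ofstream out(dir + "/entropy" + std::to_string(i),
                          std::ios::binary);
        for (std::size_t written = 0; written < bytes;
             written += sizeof(std::uint64_t)) {
            std::uint64_t word = rng();
            out.write(reinterpret_cast<const char*>(&word), sizeof word);
        }
    }
}
```

(A real deployment would of course use a cryptographic RNG rather than mt19937, precisely because a statistical test can distinguish a Mersenne Twister stream given enough output.)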

Comment GotW #50: vector is not a container (Score 1) 80

Alex: I regard my first encounter with the STL (very shortly after its first public release) as one of the great eye-opening moments in my software development career. Unfortunately, as I'm sure you well know, quality of implementation issues in compiler support for the C++ template idiom cultified (i.e. made cult-like) the deeper principles for at least five (if not ten) years thereafter.

GotW #50

I've long regarded the criticism that vector[bool] (I'm not going to fugger with angle brace entities) is not a container as misguided. Of course, it *must* be a container for reasons of sanity, but to portray the problem as a standardization committee brain fart seems to miss the main point.

Just as STL introduced a hierarchy of iterator potency (that was the main technical innovation behind the STL, was it not?) one could likewise introduce a hierarchy of container potency. The container we ended up with returns iterators which promise a dereference operator returning an lvalue (it's been a long time since I've used this terminology), which is why the following statement from the linked discussion is expected to work:

typename T::value_type* p2 = &*t.begin();

But actually, of all the uses of containers found in the wild, I highly doubt that more than a small percentage (potentially a very small percentage) exploit the property that iterator dereference returns an lvalue rather than an rvalue.

The net effect is that the standard containers promise us a potency we rarely exploit, yet the burden of this potency is universal. Forsake it in even the smallest way, and you'll be shouted out of the room for non-containerhood.

We could have handled vector[bool] by changing the standard container to not promise IDLV (container iterators dereference to lvalue). In cases where the programmer goes ahead and tries to do this, he or she obtains a simple syntax error (ha ha ha) and knows to either reformulate the algorithm to not require this property or to go back and add a specification override to the container setting the IDLV property to true.

With IDLV set, vector[bool] does not specialize.

With IDLV unset, vector[bool] will specialize.

Problem solved, except for the language overhead of introducing (and managing) a container strength hierarchy.
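A small demo of the asymmetry being complained about, under the obvious assumption that you compile against any conforming standard library (the function names here are mine):

```cpp
#include <vector>
#include <type_traits>

// vector<int> honors the "iterator dereference yields an lvalue" promise:
// &*v.begin() is a genuine int*, and writing through it mutates the element.
bool int_vector_gives_lvalue() {
    std::vector<int> v{1, 2, 3};
    int* p = &*v.begin();   // compiles: dereference yields int&
    *p = 42;
    return v[0] == 42;
}

// vector<bool>'s iterator dereferences to a proxy object, not bool&, so the
// same &*it trick is a compile error there — the IDLV promise is broken.
bool bool_vector_gives_proxy() {
    std::vector<bool> v{true, false};
    auto ref = *v.begin();  // std::vector<bool>::reference, a proxy
    static_assert(!std::is_same<decltype(ref), bool&>::value,
                  "vector<bool> iterators do not dereference to bool&");
    ref = false;            // writing through the proxy still flips the bit
    return v[0] == false;
}
```

Under the scheme sketched above, the second function would simply be the documented behavior of a weaker, IDLV-unset container tier, rather than a standing embarrassment.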

But instead, Herb Sutter decides to write this:

Besides, it's mostly redundant: std::bitset was designed for this kind of thing.

Doesn't that attitude make you want to pound your head upon a table somewhere? Seriously, if one repeats that remark 1000 times, we could almost make the entire STL go away (and return to the world we would have had instead had the STL not rescued us from parsimony mass produced.)

Clearly, there was enough of a pain point in the C++ standardization effort around iterators that the STL gained traction exceedingly quickly (and very late in the day), yet the C++ community is also extremely hidebound about minor pain points, as evidenced by Sutter's explanatory tack.

Obviously, there were some advantages in demonstrating that the STL approach could achieve performance comparable to C (and in some cases, better than C) in proving that the STL was not just another abstraction gained at the expense of runtime overhead (which all looks fine until five or ten different runtime overheads—however small each of these appears in isolation—begin to interact adversely).

But very quickly, the initial quality of implementation issues and the quirky (to be extremely kind) limitations of the C++ template mechanisms threw up some major walls in pursuing the underlying ideas behind the STL more extensively.

So, my question is this, more or less: in retrospect, was the early victory with C++ worth it (it's extremely easy to underestimate the value of having a good idea noticed at all), or does the eternal puberty of the C++ STL continue to grate?

Comment Could? (Score 1) 292

The reported findings, if corroborated by further inquiry, could add fresh fodder to an ongoing debate over the Third Reich's ultimately failed attempt to secure an atomic weapon.

If, could, fodder, ultimately, failed, secure. Every one of these words as cast is a pablum-brained Orwellian nightmare.

The German Jesus nut was supremacy in all things. After setbacks in The Battle of Britain and Moscow/Stalingrad, the Germans found themselves in a situation where they needed to tighten their belts (both militarily and technologically) and settle for supremacy in merely the most essential things.

It was this belt-tightening challenge they bungled like crazy. Belt-tightening somehow wasn't in the German lexicon.

There was a scold Fuehrer who lived in a shoe
with so many children, he couldn't kick through;
He gave them some broth without any bread,
Then stripped 'em unsoundly to his boot instead.

Well, it was a try anyway, but it does capture the main idea.

Comment the writing was on the wall after the first movie (Score 1) 351

There were enough tells in the first movie that I decided to skip both prequel sequels. My only regret concerns the movie not made.

The problem when you have a strong emotional investment in something is that one's instinct is to give it one more chance. By the time you've watched two bad movies, you're almost pot-committed to watch the third.

It takes a special will to abandon a franchise without falling into the emotional mulligan trap, and so there's ultimately little incentive for Jackson to not do what he did.

I'm slowly learning. My loyalty function has now evolved to where it's almost vertiginous.

Comment Re:Precious Snowflake (Score 1) 323

A simple spanking is not "physical violence".

No, it's aversive physical dominance. Any more hairs you would like to split, or are we done now?

Aversive: the recipient is not pleased about it.

Physical: there's a smacking sound.

Dominance: the recipient's preference in the moment doesn't count for shit.

Maybe he or she will thank you later with a greater understanding of the situation. Or maybe not.

To my mind your story could be an argument for more effective barriers. If you're going to make a barrier to enforce safety, go big or go home. Otherwise you're just conducting a first lesson in Jr Steeplechase.

Comment it wasn't about text-to-speech (Score 2) 292

From Hyphen Hate? When Amazon went to war against punctuation

A ridiculous number of people have gotten caught up in the whole "he used a minus sign instead of an ascii hyphen! The bastard" controversy that has followed this thread around and has spilled over into any number of internet message boards. First of all, let me be clear. The issue was not with my use of a minus sign. The issue Amazon had was that someone had complained about hyphenation. Second, I have since gone back and checked the original file on the Kindle text-to-speech app and it renders fine. No issues. [my emph.]

<acerbic>
These days 75% of all Slashdot posts seem to involve drilling down to get the original story straight. Tell me, when did a mass-confusion clusterfuck become the new nerd foreplay? Kindle typography, meet declining Slashdot editorial standards. You've got more in common than you think.
</acerbic>

Comment Re:Copenhagen interpretation != less complicated (Score 1) 197

Determinism = fail

With entanglement, we have an FTL coupling that can't be used to convey classical information.

Why can't we have a similarly knackered stripe of determinism, one which can't be used to shatter the illusion of free will? This would be a kind of determinism where even if you sort of know it's there, it makes no damn difference to your interpretation of local space.

Think big, grasshopper, think big.

Comment the sociology of accidents (Score 1) 175

The only "accidental" discovery in science is the discovery one could have stretched out over a great many more research grants if one had better anticipated the scientific windfall.

Of course, we do tend to refer to the outcome of bad planning as "an accident" concerning our hominid prime directive, so perhaps there's no help for language after all.

Comment Re:Does the job still get done? (Score 1) 688

You do realize that a narrative of this type can be fashioned around the prevailing conditions of all human societies at all points in human history?

America is an especially big and complex society, so one needs a correspondingly large and complex boogie man (though nevertheless, reductive to the core).

In the gospel of the one true fracture, defining yourself as against something only serves to throw more fuel on the fire. In reality, complex systems have hundreds or thousands of fault lines, and it's not always the case that the largest fault line is hovering around the supercritical state. Unless we all agree to obsess about it. Then the story self propels.

The slow march of AI is going to spin out a thousand fault lines. Get yours today!

Comment Re:The interne cables are tapped... (Score 1) 160

Next it's not that hard to develop mathematical techniques to analyze text and language in posts ...

Budget projects much? "Doable" and "easy" are not the same words. I'm guessing one person out of a hundred in the general population could take a reasonable stab at developing such an algorithm, and only one person out of a thousand could be considered a natural talent.

The first 20% of the work gets you to sqrt(sqrt(7e9)) as your mean perplexity, which is simultaneously impressive and yet not terribly actionable. And then the difficulty curve shoots off into the exponential regime.
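Spelling out that arithmetic (the function name is mine): two rounds of square-root-style narrowing applied to the world's population leave a candidate pool in the hundreds.

```cpp
#include <cmath>

// Fourth root of the population: starting from ~7e9 candidates, the "easy
// 20%" of authorship analysis narrows the field to a few hundred suspects —
// impressive-sounding, but far from a single actionable name.
double mean_perplexity(double population) {
    return std::sqrt(std::sqrt(population));  // ~289 for 7e9
}
```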
