
Comment Cofactor F430 (Score 4, Interesting) 171

Cofactor F430

Forget the organism. This is about the advent of a novel reaction pathway that scales with the availability of nickel. Surprisingly, geology might have something to say on that score. Any vigorous reaction pathway that bubbles madly away at an oceanic scale is almost certain to colour the infrared signature of our thin gas membrane. Imagine if everyone on the planet had an F430.

There's a lot to like about this hypothesis. I've seen worse. To determine exactly how this pathway becomes prolific at global scale would take decades of further study. It's as yet a humble beginning, of the kind that sometimes pans out.

Comment the altar of the double standard (Score 1) 231

Would any other war hero have received the same treatment? The question here is double standards, surrounding the secrecy of Turing's work, the eternal nature of Turing's crime (does this remind anyone else of the war on drugs?), and the severity of his sentence.

Take Brian Carbury for example, an "ace in a day" New Zealand fighter pilot.

After leaving the RAF, he lived in England until his death in July 1962. In 1949, he along with three others, in a trial at Princes Risborough Magistrates' Court, was found guilty of two offences relating to the illegal export of Bristol Beaufighters to Palestine. Each man was fined a total of £100. [slashcode sucks]

My emphasis. In modern parlance, that sounds like an ITAR transgression, for which the current maximum sentence is detainment without charge. Let's see here. Door #1: a £100 fine. Door #2: chemical castration. Cue the game show music for the tense decision making. Tick tock tick tock, what will he choose?

Because of the upper crust attitude toward secrecy, Turing was positioned as an ordinary sex offender in the mind of the public instead of a flawed hero--no let me fix that--an outcast hero whose only flaw was being born into a culture of soft vegetables and spittle-spewing homophobes.

His chemical castration makes one wonder what the proportionate punishment might be (far, far worse than chemical castration) for a white-collared repeat pedophile, or for the white-haired goats of moral opprobrium who vainly sheltered this behaviour so as not to publicly besmirch their high moral ground.

It was a crime at the time. Yes, the whole social structure was a crime at the time, and then some.

Comment Re:Root (Score 1) 153

While that would have been nice, it is very debatable if it is wise.

If they ever update The Fifth Discipline: The Art and Practice of the Learning Organization I'm sure they can cull a hundred pages of business-speak blather to make room for an additional chapter on the pernicious feedback loops of responsible disclosure.

Normally we allow markets to punish corporations for sloppy work. Causing grave identity harm to your customer base is the kind of sloppy work deserving of punishment. And then, you know, the innovation of the private sector swoops in, as it must under Hayekian divine law, to save the day.

But no, as usual we turn things upside down when the going gets tough: unpaid security researchers provide valuable QA in hushed conversations to deep-pocketed corporations, who may or may not choose to do anything about it.

Here's a suggestion: if a corporation has any unfixed security flaw they've known about for more than three months, they no longer qualify for responsible disclosure.

Customers, when purchasing their toys, can check the reputations of vendors caught with their responsible disclosure pants down, aka those lingering issues not fixed because they value their bottom line more than their customers' peace of mind. In Hayekian theory, these are supposed to align by the divine grace of the invisible hand, but sometimes society weaves clever narratives to prevent this from happening.

The true Hayekian solution would be to allow security researchers to auction off the fruit of their labour to the highest bidder, black or white. This might be Samsung, should they care enough to protect their reputation by dipping into their bottom line.

Comment assimilation rape (Score 5, Interesting) 184

Wanna revisit your recent rants?

I can't stand how every slashdot story submission has to end with a pink flamingo smoke grenade. I'm guessing that sober "just the facts, ma'am" submissions still exist, but rarely make it through the selection hoop of our post-counting overlords.

I have several online pseudonyms which I make an effort to keep separate. I rarely post the same idea under more than one identity. If I post it here, it doesn't go there. I prefer to keep things separate so far as I can. I also have some background in computational linguistics. I've known for fifteen years that there is absolutely no way to win this battle long term. Only the most insipid comments will escape long-term annealing. If the word "gay" is the all season tire on your social media K-car, then your identity is safely concealed within the deep-wank weeds.

If every post you write contains colourful language or idiom such as "all-season tire of deep-wank camouflage", you're toast and you know it, clap your hands. Merely getting my possessives and plurals and possessive plurals right more often than not narrows the net substantially. I might pedantically write Harry S Truman without putting a dot after the S (Snopes: "Although the 'S' was not technically an abbreviation and therefore did not need to be followed by a period, Truman's full name was generally rendered as 'Harry S. Truman' during his lifetime ..."). I make use of colons, semicolons (these come and go), em-dash appositives, and parenthetical side-notes--at least one of these in almost every paragraph I write. I post way more links than the average person. My thoughts meander. There is playful use of language with double readings. I subvert cliche to achieve double readings that enable me to circle away from my target, then loop back from an unexpected angle. My unit of thought is the paragraph more so than the sentence.
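The corpus-analysis angle can be made concrete with a toy sketch (the feature set and function name here are my own invention, not any real stylometry library): count a handful of punctuation tells, normalized per thousand characters.

```javascript
// Toy stylometric fingerprint: frequency of a few punctuation "tells"
// per 1000 characters. Real authorship attribution uses hundreds of
// features (function words, character n-grams), but the principle holds.
function styleFingerprint(text) {
  const per1000 = (count) => (1000 * count) / text.length;
  const count = (re) => (text.match(re) || []).length;
  return {
    semicolons: per1000(count(/;/g)),
    dashAppositives: per1000(count(/--/g)),
    parentheticals: per1000(count(/\(/g)),
    colons: per1000(count(/:/g)),
  };
}
```

Aggregate such vectors over each pseudonym's posts and simple nearest-neighbour matching does the rest, which is precisely the long-term annealing lamented here.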

Even with all those signatures, originality in word selection is my neon tattoo. The corpus analysis algorithms likely don't do much (yet) with originality. Hard to characterize. For a while my anonymity might pass through the gun-metal algorithms unmelded by virtue of my writing being too bright and distinctive and easy to trace. But not for long. Even the fractal filigrees of originality will be coded eventually. (Pay no attention to the alliteration: an accident, not a stylistic signature.)

Frankly, my dear, I don't give a damn.

This is about respect. We all live a double life, pretty much all the time. We speak differently in front of our mothers (most of us) than with the lady-killing rough necks at the peanut bar or power tie horn-dogs at the chichi sushi bar.

I value anonymity because I don't wish to own everything I say on a literal level, stripped of context, devoid of my original conceit or persona.

I happen to regard linearity as a social construct. Humans are not inherently linear in cognition or constitution. We learn how to cultivate linear facades in our areas of competence (but not necessarily around the edges: this is why a competent accountant consults his astrologer Madam Threenipple). If you like the primary facade you have, and it suits all purposes, then I suppose you'll see the charm in proclaiming it from the RealName rafters.

If you're a Baptist homosexual (I've known a few), you might wish to string your public identity by separate ropes.

Or maybe you've just got things to work out. You're figuring things out on the fly and trying them on for size and you don't wish to fall prey to the Joseph McCarthy clean-nose auto-da-fe "have you ever". Implication: Anything you've ever said will be permanently recorded and will classify you irretrievably. This despite 0/1 statistics never passing T-scores. If the same person also has an NRA membership and has been a career employee of the Hoover Institute for two decades? Still a communist. Ten times more dangerous.

The kind of person most willing to spread their wings has the most to lose in the sphincter-wattled sport of cherry-picking thought crime. For this reason I will continue to compartmentalize my personae long after the velvet drapes are hoisted by the Peeping Tom ad-bots of the TSA. Putting my own name on everything I write would almost amount to granting permission to these parties to engage in assimilation rape.

Comment Re:More maths (Score 1) 328

It's getting harder and harder to obtain premium power supplies rated under 400W. Sure, Seasonic makes some, but my preferred vendors don't carry these models. On my last iteration I got pushed up to the 550W bracket on a file server packed with disk drives that I know won't use more than 300W.

BEWARE that many high-efficiency supplies are extremely unreliable running off a cheap, conventional UPS.

Comment some TARDIS chit chat over numbers ungodly (Score 1) 60

Astronomer steps out of TARDIS under a bright moon.

Astronomer: Isaac, guess what? First: We've discovered time travel. Second: Our telescopes can now see all the way back to 300 million years after the, uh, beginning of, uh, all that exists. Aren't you impressed?

Isaac: What a stupendous lie and intrigue to greet this fine, rotund moon! Let me process that on its face. First: Light has a velocity finite after all, and either this velocity is slower than I surmised or the creation is larger than I dared conjecture. Second: Either the haste of light exceeds the velocity of leaving from objects so large as the sun might be, or light is impervious to restitution gravitational. Third: God fudged the creation story by seven multiples of both hands to conform with Aramaic notations of quantity. Fourth: This dorky astronomer thing is not just me, but a blight eternal.

Astronomer: Not bad, Isaac. Four out of four, from a suitable reference frame. You're the man.

Isaac: Indeed I am. You suggest light looks different depending on the observer? Only light confuses me so.

Astronomer: Close. Light looks the same. Time and space, they change instead.

Isaac: Oh, don't think I'm so foolish as to try to write down equations such as that. How malicious to taunt me with a puzzle that might [pauses for a moment] perhaps even have a viable geometry. [shakes head violently] Madness! It's my formula for the transmutation of gold you're after, isn't it? You've come back in time to distract me from my rightful legacy! Good day to you, sir.

Astronomer: Gravity makes gold, Isaac. You're thinking too small.

Isaac: If gravity made gold, the stars would capture and keep it.

Astronomer: Gold destroys stars, Isaac.

Isaac: Destroys stars, but not planets? A likely story.

Astronomer: A planet is just a star too small to either ignite or collapse.

Isaac: One nonsense after another. Gravitational collapse is a singularity forbidden. Where does this end?

Astronomer: Shucks, I hate to push you in this direction, but in truth your glassware will answer you at the end of a long road. By this you will know: table salt dissolved in water dissociates into two constituent elements. One of these comes from a group of elements with similar properties we in the future term "halides". Halides reacted with argentum create a family of substances some of which exhibit physical change upon capture of light, including forms of light undetected by any eye in the animal world. A modest flux of this invisible light is released in the natural transmutation process that begets lead--which perhaps you know as plumbum. Once you have the seeing emulsion that never blinks, point your prism at the stars, Isaac, and be prepared for some rude surprises.

Isaac: Natural transmutation into plumbum? This is a joke most foul. Pray tell, what regulates this alchemical sacrilege attested as you claim from the unseen by this elixir of salts and metals?

Astronomer: God plays dice, Isaac, with an exceptionally steady hand ... and the patience of a saint.

Isaac: Enough! Enough of your heathen smirks and portly numbers! Antiquity as a blink of the eye in God's creation. What rubbish! Be off with you!

Astronomer: Farewell, then, my good man. May you neither underestimate nor inhale your aqua fortis, cleaver of matter.

Isaac: At last, a sensible word now that the joke has ended.

Astronomer: So long, Isaac, time waits for no man. [Pffft.]

Isaac: [Looks up at sky.] Stars, I see you, with my physical orbs, and from these orbs I shed tears of brine. The smug fellow weaves a deft braid of fact and fancy under a charmed moon. Has God indeed frozen time and bent space to favour your ethereal flux? And yet I can not say it could not be so. Why these folios unforeseen within the book of nature unknown to scripture or by revelation? Why send your faithful and humble servant this man of riddles to mock your immensity with numbers ungodly? Perhaps it is so that the human magnitude is but a puny magnitude against a vastness so arranged that in the grasping our bound recedes.

Comment Re:Wrong question to ask (Score 1, Interesting) 112

I would think that the resurgent interest in functional programming would ameliorate this to some extent but most of the JS I've worked on is crap and I would strongly recommend against using it unless you have a clear picture of why you would want to do so.

I'm taking an online course on MongoDB. This caused me to look into JavaScript and Node after a long hiatus from the web application stack.

We're deep into the territory of people confusing a tool with the culture of its use. JavaScript appears to be a fairly decent language of Scheme derivation, minus one horrible blunder on scoping rules (to be addressed, perhaps in the worst possible way, in the next language standard). I'm given to understand that many of the horrendous web applications out there involve heaps of disorganized JavaScript maintaining application state via DOM edits. What could possibly go wrong? Is the language at fault, or the culture of its use? Hint: one of these answers takes more than five seconds to justify, so you might find it time efficient to run with the first answer that comes into your head. No one ever got fired for spewing blame in the widest possible arc.
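For anyone who hasn't hit the scoping blunder first-hand, a minimal sketch (function names mine): `var` is function-scoped, so every closure created in a loop shares one binding, while the block-scoped `let` of the upcoming standard gives each iteration its own.

```javascript
// With var, all three closures capture the SAME binding of i,
// which has already reached 3 by the time any of them runs.
function varCapture() {
  var fns = [];
  for (var i = 0; i < 3; i++) {
    fns.push(function () { return i; });
  }
  return fns.map(function (f) { return f(); }); // [3, 3, 3]
}

// With let, each iteration gets a fresh binding of i.
function letCapture() {
  var fns = [];
  for (let i = 0; i < 3; i++) {
    fns.push(function () { return i; });
  }
  return fns.map(function (f) { return f(); }); // [0, 1, 2]
}
```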

Node.js itself is a strange beast. There's something to be said for language plasticity, of not having to prematurely bind code to server or client. Adopting an entire world-view to make this possible is the domain of fools and zealots. Today's zealots are tomorrow's gurus. Gurus don't grow on trees, after all.

I write a fair amount of code in R, also from the Scheme family tree. Are there any people out there who think that JavaScript sucks, but R doesn't? Wow. I'd be interested to hear that argument. No wait, with an infinite number of monkeys, all languages suck, by the simple statistical principle that natural languages are denser in idioms of derision than admiration. It's almost as if our languages know something about us. Hey, no one ever got fired for spewing blame in the widest possible arc.

I've said this before and I guess it's my fate to keep repeating this observation. There's a litmus test across the language like/despise spectrum: one's attitude toward one's fellow humans. If you hate your fellow humans, you favour languages that offer facilities of orthodoxy, control and prevention. On the other hand, if you believe that your project is so difficult that only by the miracle of collaboration and teamwork can the project be completed at all, then you tend to value languages with the fewest expressive impediments, whether the nail-gun interlock works or not.

I've read many times about small teams doing big things with C++ and remarking about the language's unbearable legacy of self-harm "we're professionals: we rarely have a problem with it". If you're an ambitious code monkey and the coder beside you working on the same project was fathered by a dangling pointer, you really hate C++. No power under the sun can prevent the dangling pointer from making you also look bad.

I would certainly use Node.js, but with great caution. It won't maim your appendages like C++, but it will paint you into a corner (a certain asynchronous world view) if you don't understand its long term limitations. This is why I ultimately rejected my online MongoDB course. It taught you enough to get yourself into trouble (deep trouble), but not enough to get yourself back out again.

Comment Re:Never met anyone who uses it. (Score 1, Troll) 245

Well, I know people who use FreeNAS which is based on FreeBSD. I think the thought behind the BSD license is telling. It basically says you can take the code and nothing in return is expected, which is exactly what they get.

You must be a math major. An economist might have stopped to ask whether Clang/LLVM fell off a turnip wagon.

Anyone who is using FreeBSD properly soon reaches the point where they are thinking about other things they need to accomplish. You probably haven't met too many people having long conversations about their wonderfully reliable plumbing, either.

Linux seems to regard rewriting their firewall facility from scratch as a desirable social activity. Of course if you meet someone who enjoys replacing their deck every second year, you'll hear a lot more about advances in deck paint.

Comment skepticism spectrum disorder (Score 1) 655

It was going so well, and then you blew it.

Ergo, any scientist that comes out in favor of AGW or against AGW is not acting as a scientist, but as a partisan political/ideological advocate.

It's not political until the scientist advocates taking action to do something about it. Really, there isn't any possibility of doubt that enough CO2 would increase the planet's surface temperature (look at Venus). There is plenty of room to doubt that AGW might become a human (or planetary) crisis in our lifetime. Even if you accept that we are headed down this path, there's a huge debate remaining on what we can reasonably do to prevent it (if anything) and whether the benefits of diverting those resources outweigh the costs of what we stop doing instead.

Personally, I have next to no doubt that we'll see irrefutable evidence of global temperature increases attributed to human causes within a decade or two. I tend not to be as alarmist as most scientists in the field (it wouldn't be the first time that the people closest to an issue magnified the prognosis), but I wouldn't place a large stake against it, either. I have moderate skepticism over whether AGW will impact the planet as rapidly or severely as some of the forecasts. I have a high level of skepticism that any of the proposed measures (such as a Kyoto accord) are going to deflect this outcome if we are indeed already on this path. I have an extreme level of skepticism that radical reshaping of the earth's economy and geopolitical landscape is a prudent response, even if there's a theoretical hope we could actually succeed in averting catastrophic climate change. There's a fairly large scope for messing up civilization by engaging in radical politics. Am I the only person who worries about WW III resulting from attempting to shut the carbon economy down and not succeeding?

I am also fairly certain that we're about thirty years away from understanding climate science to the level required to confidently plan for the magnitude of the interventions that might be required. A five year old can point to a drowning man, but can't swim out to pull him in. Our climate science is like that five year old.

I don't even know whether to classify myself as a denier or a believer. I seem to suffer from skepticism spectrum disorder. Surely I must be mentally ill if I can't be pigeonholed on one side or the other.

Comment twin towers of bias (Score 1) 339

If the previous climate science was any good, meaning that future estimates were unbiased expressions of the best available current knowledge, then p = 0.5 that any single factor they drill into produces either more (or less) than previously estimated.

If the media coverage is unbiased, we hear about both cases equally often. For every "Oh my god we're all going to die" headline there's a corresponding headline "Small earthquake in Chile, not many dead" (apparently this headline once made it past a sleepy night-desk editor).

Comment Re:Third option (Score 4, Interesting) 536

Maybe - and admittedly this is just a guess from my fairly ignorant viewpoint - it's a very hard problem. How can it be that after 100+ years of industrial development, we're still heavily reliant on internal combustion engines to get us around?

This is a good illustration of the wisdom in Thinking, Fast and Slow: the people most likely to highlight what they don't know proceed to the most sensible conclusions.

I haven't sat down in front of a keyboard to write code since 1985 without this issue foremost on my mind. In the majority of serious programs what the program does is a tiny minority of what you need to think about. Dijkstra wrote some chapters where he illustrates that some programs will actually write themselves if you adhere rigorously to what the program is allowed/not allowed to do at each step, and bear in mind that you need to progress on the variant.

I do everything humanly possible when I write code to work within the model of tasks accomplished / not accomplished rather than the domain of everything that could possibly go wrong (error codes). It's a lot harder when error codes unreliably signal transient / persistent error distinctions. I view programs as precondition graphs, where the edges are some chunk of code that tries to do something, typically a call into an API of some kind. Who cares if it reports an error? What you care about is whether it established the precondition necessary to visit the edges departing the destination vertex in the dependency graph.

Dijkstra also cautions about getting hot and bothered about order relations in terms of which edges execute in which order. In general, any edge departing a vertex with all input conditions satisfied is suitable to execute next. Because he was so interested in concurrency, he assumed that when multiple valid code blocks were viable to run that scheduling was non-deterministic. It can make a difference to program performance which blocks are executed in which order. We enter the domain of premature optimization much sooner than we suspect simply by writing our code with overdetermined linearity (few languages even have a non-deterministic "any of" control operator).
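Dijkstra's non-deterministic "any of" can be sketched in a few lines (the names here are mine, not from any library): keep firing any edge whose guard holds, in deliberately arbitrary order, until no guard holds.

```javascript
// Guarded-command loop: repeatedly pick ANY enabled edge and run it;
// terminate when no guard holds. Scheduling is deliberately random.
function runGuarded(state, edges) {
  for (;;) {
    const enabled = edges.filter((e) => e.guard(state));
    if (enabled.length === 0) return state; // no edge enabled: done
    const pick = enabled[Math.floor(Math.random() * enabled.length)];
    pick.action(state);
  }
}

// Dijkstra's classic example: gcd by nondeterministic subtraction.
// Whichever enabled edge runs first, the loop converges on a == b == gcd.
const gcdEdges = [
  { guard: (s) => s.a > s.b, action: (s) => { s.a -= s.b; } },
  { guard: (s) => s.b > s.a, action: (s) => { s.b -= s.a; } },
];
```

runGuarded({ a: 12, b: 18 }, gcdEdges) always halts with both fields equal to 6, whatever order the scheduler happens to pick: the whole point of not overdetermining linearity.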

I had a micro-controller application in which there were many variable references of the form a->b->c->d. This is hell to be strict about precondition testing.


bool ok;
ok = a != NULL && a->b != NULL && a->b->c != NULL;
if (ok) {
    ok = fragile_api_call (a->b->c->d);
    if (!ok) sideband_error_info = ERR_YUCK;
}
if (ok) { // lather, rinse, repeat
}

// ...

// end of graph, did we succeed or fail?
if (!ok) {
      barf_out (sideband_error_info);
}
return ok;

This is the fall-through, do-everything-possible-and-not-a-thing-more programming idiom. It doesn't end up highly nested, because you flatten the dependency graph. Assignments to ok often occur at deeper nesting points than where they are later tested.

This kind of code often looks horrible, but it's incredibly easy to analyze and debug. The preconditions and post-conditions at every step are local to the action. Invariants are restored at the first possible statement. For the following piece of code, the desired invariant is that if thingX is non-NULL, it points to a valid allocation block. In the entire scope of the program, this invariant is untrue only for the duration of the statement whose job is to restore the invariant.

bool ok = true;

void* thing1 = NULL;
thing1 = malloc (SIZE_MATTERS);
ok = ok && thing1 != NULL;

void* thing2 = NULL;
thing2 = malloc (SIZE_MATTERS+6);
ok = ok && thing2 != NULL;

if (ok) {
// more logic
}

if (thing1 != NULL) {
        free (thing1); // this violates the invariant
        thing1 = NULL; // and this restores it
}
if (thing2 != NULL) {
        free (thing2); // this violates the invariant
        thing2 = NULL; // and this restores it
}

Error handling involving deeply nested if statements is a nightmare to maintain. The only thing stopping the programmer from flattening it out as I've illustrated is a high pain tolerance for the variable ok. Loops do need to be nested. This is good use of indentation.

Note that event loop architectures often push your code in this direction in any case, and you often do end up having to cope with non-determinism in message consumption order.

A robust program is one which does exactly what is possible, and never anything more.

An introspective program is one which usefully reports why it did less than you hoped.

Introspection matters less to robustness than one would think. It proves a lot more useful when dealing with conflicting sources of authority (such as CSS or URL variables vs MIME encoded variables). Getting your overrides muddled is a whole different thing. A robust program won't crash if you get your overrides muddled, even if that results in contradictory parameters. It will just decide the operation is unsafe, and not do very much at all.

I take offence that there hasn't been a lot of hard thought on these issues in language design. Dijkstra had half the problem figured out by the mid 1970s. The rest is human sociology.

That said, if I had language support for

ok = a != NULL && a->b != NULL && a->b->c != NULL;

I'd use it. Instead they've given me managed memory.
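For what it's worth, some languages did eventually grow exactly this wished-for support. JavaScript's later optional-chaining operator `?.` is one spelling of it (the shape of `a` below is my own illustration):

```javascript
// Optional chaining collapses the null-check ladder into one expression:
// each ?. yields undefined if its left-hand side is null or undefined,
// instead of crashing on the dereference.
function chainOk(a) {
  return a?.b?.c?.d !== undefined;
}
```

chainOk({ b: { c: { d: 42 } } }) is true, while chainOk(null) and chainOk({ b: {} }) are false with no crash along the way.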

Comment wise words from Tony Soprano (Score 1) 260

Commendatori

She comes on to Tony, but tension is also resolved when Tony reluctantly informs Annalisa that to have a sexual relationship with a business partner would be bad for business and states he does not want to "shit where he eats".

120 fps with FSAA / rock solid stability / effective driver support: pick any 2-arctan(epsilon)

Blob support: Basically a coin flip. You might get a "works for me" you can live with, or you might not. When things go wrong, they usually stay wrong. Harmful to the open source ecosystem in the long run.

Open driver support: Good to great for carefully selected older product lines a generation behind the performance curve; sometimes excellent if you can shed the most extreme features; Chinese water torture otherwise.

I'll never build a game machine again that I'm also intending to use for real work. Tony knows.

Comment appropriations expropriation (Score 1) 191

Don't we all love to assign misspent money to our favourite cause? And they worked so damn hard for their misbegotten windfall. Suck it up DoD. We going to space. Cry babies.

> (I don't believe in private armies)
Believe in them or not, they are allowed under the second amendment.

A mall cop by any other name is still a mall cop, and the credibility gap endures.

It strikes me that space exploration is not the highest priority when your home planet has a burgeoning fever, unless you're the sort of person pathologically averse to hanging around and nursing the sick.

Another decade or five and we'll know for certain whether the flowers are blooming in Houston. Space will still be there, waiting for us, I suspect.

Comment divorce without TOS (Score 1) 277

I have no commercial relationship with Facebook. I've never visited their site. If they are maintaining information about me, it's entirely without my consent. I'd like to click a box to disappear myself from their incidental radar screen (such as if people I know unwisely divulge my image or personal details), but it appears that I'd first have to agree to the Facebook TOS to do so.

Let's have a law that enables divorce without TOS.
