
Comment Sophist's Choice (Score 1) 161

Except that the hoops are not meaningless if you want to keep your job, get paid, and feed your family.

The gospel according to Jordan B Peterson — 21 April 2018

There are many other memorable passages. One that particularly stands out is where Peterson describes a period of soul searching 30 years ago when looking for something in which to believe — anything of certainty. And he started to reflect on a practice at Auschwitz about which he had read.
"A guard would force an inmate to carry a 100-lb sack of wet salt from one side of the large compound to the other and then to carry it back. It was 'an act of pointless torment ... a piece of malevolent art.'"

Serenity doesn't pay the bills. Meanwhile, lugging a 100-lb sack of wet salt back and forth across the Auschwitz quadrangle keeps you out of the furnace for another day.

If your child required you to lug a 100-lb sack of wet salt for miles and miles in order to be spared from a cruel disease, the situation would be (A) in no wise different, and yet (B) meaningful, rather than cruel and pointless.

Comment Re:Half of Legal Fees (Score 1) 116

I'm really wondering why he chose to spend over $500k on lawyers, for a defamation and business interference case.

Probably because deterrence, on general principles. It's far less insane for Bruce to do this than nearly anyone else (given his prominence, and his Rolodex, he might have had some support footing this bill, too). Plus for $500k, you want to run the deterrence (it can bite back) up the largest available flag pole, and with the most credibility.

Plus I'm pretty sure you missed the essential circularity here.

That Perens has the stature to take this on as he has, means his bar for defamation is that much higher than everyone else's.

Yes, it sure would be a strange world if the judge slapped defamation onto every obscure, opinionated, know-nothing screed.

Sometimes the law is dumb, but it's not usually that dumb.

Things become a little different, though, if you start slagging people's character and private lives. That's a bad line to cross, and probably more people overtaken by the heat of the moment should pause to consider the potential legal ramifications of going down this path than actually do.

Comment on moral certitude (Score 2) 206

Psychology Today is the best you can do? Whose side are you on, anyway?

The Lifespan of a Lie — 7 June 2018

About the author:

* Ben Blum was born and raised in Denver, Colorado.
* He holds a PhD in computer science from the University of California Berkeley.
* He was a National Science Foundation Graduate Research Fellow.
* He received an MFA in fiction from New York University, where he was awarded the New York Times Foundation Fellowship.

The author did mondo research, which included, near the end, an interview with Zimbardo himself, featuring the following Frost–Nixon interaction:

"If [prisoners] said, 'I want to get out,' and you said, 'Okay,' then as soon as they left, the experiment would be over," Zimbardo explained. "All the prisoners would say, 'I want to get out.' There has to be a good reason now for them to get out. ... That's the whole point of the Pirandellian prison [Ed. note: Pirandello was an Italian playwright whose plays blended fiction and reality]. ... "

Zimbardo confirmed that David Jaffe had devised the rules with the guards, but tried to argue that he hadn't been lying when he told Congress [and others] that the guards had devised the rules themselves, on the grounds that Zimbardo himself had not been present at the time.

He at first denied that the experiment had had any political motive, but after I read him an excerpt from a press release disseminated on the experiment's second day explicitly stating that it aimed to bring awareness to the need for reform, he admitted that he had probably written it himself under pressure from Carlo Prescott, with whom he had co-taught a summer school class on the psychology of imprisonment.

The entire article is awesome. Read it now.

In summary, the entire experiment was conducted on the basis of publish or perish, and Zimbardo left few stones unturned—acting mainly through compliant Lieutenant Jaffe—to ensure that the end result was "publish".

Here's another link I dropped into a Slashdot thread a few days ago, of an academic whose pursuit of his local career incentive crossed more than a few lines:

Why the Joy of Cooking is going after a Cornell researcher — 28 February 2018

Plus, Orwellian popcorn swells enrollment and sells textbooks:

For psychology professors, the Stanford prison experiment is a reliable crowd-pleaser, typically presented with lots of vividly disturbing video footage. In introductory psychology lecture halls, often filled with students from other majors, the counterintuitive assertion that students' own belief in their inherent goodness is flatly wrong offers dramatic proof of psychology's ability to teach them new and surprising things about themselves.

On the other hand, there's a responsible, modern literature, such as Robert Sapolsky's Behave: The Biology of Humans at Our Best and Worst (2017).

There are specific passages in there about the neurobiology of bad cops (under stress, unreliable neural pathways become faster and stronger than reliable neural pathways, operating entirely beneath the level of executive self-control).

Another recent book, Matthew P. Walker's Why We Sleep (2017) explains why—in modern society—operating at far less than our best has become de rigueur.

At the center of this book sit more laboratory studies than you can shake a stick at (many of them conducted under the cold, impartial eye of clinical fMRI scans).

[*] fMRI scans are cold and impartial when applied to slow, global brain phenomena such as sleep; for the fast and small, this, too, can be Wansinked.

I colourfully christened Walker's central thesis as the post-millennial iJog sleep-debt epidemic.

The millennials are the first generation who have never known life outside the always-on "global village" social-media mosh pit. They tend to fall asleep with their iDevice on their pillows, after wallowing in artificial light (much of it blue, from handheld screens) right up until lights out (kiss goodbye to your dim-light melatonin onset). Some of these people wrote entrance exams for kindergarten or elementary school.

This is explained (indirectly) in Laszlo Bock's Work Rules: Insights from Inside Google (2015), which describes how compensation needs to respect the Pareto distribution: competence is exponential all the way up; corporations that set a fixed upper bracket for their best employees soon discover those employees repricing themselves on the open market every three years. This casts a new (and entirely accurate) light on the HR collusion case:

Apple and Google settle antitrust lawsuit over hiring collusion charges — 2014

Apple, Adobe, Google and Intel had been scheduled to go to trial at the end of May, with lawyers for roughly 64,000 workers alleging that bosses including Google's Sergey Brin and Eric Schmidt and Apple's Steve Jobs orchestrated an elaborate scheme to prevent poaching and drive down wages.

In an open market, the best of the best do have the power to demand extreme compensation, because the gap between a Gretzky and the next guy is about the size of most teams' entire second lines (unless the next guy is Lemieux, but how often do those planets align?)

In the old model, there was this implicit idea (which never survived much actual sunlight) that some day, if you work hard, you "arrive", with the destination implicitly having the structure of a plateau (in the promised land) as a captain of industry, or your little corner of it. If you arrived at the plateau six months later than the equally competent person beside you, no harm done, ultimately you're pretty much sharing the same thin-air level ground.

Bock's Pareto model argues the opposite: six months behind at the very top of the curve amounts to about 6,000 feet of hard, cold mountain-goat rock face. You had better get your kids into the right kindergarten. Time is money: only it's not a plateau, and it's not a fixed incline, it's an exponential incline, all the way up (and there is no top; even Einstein dropped the ball on failing to see that his brilliant EPR thought experiment implied that his intuitive notion of what constituted "action" was incorrect, and that he needed to get over the non-locality of action with no information cargo, in much the same way he got over the conventionally inculcated independence of time and space).

I think one of the sociological effects of teenage immersion in modern social media is that it implicitly shatters the plateau illusion, certainly in a way I wasn't forced to internalize at the same age. Why does this generation have so much trouble breaking out of the FOMO trance? Here's my own answer: because there's no convenient handrail. Turns out, the available handrail is governed by the Pareto distribution of future career competence, and if anything, grabbing this handle will only magnify your FOMO from a toxic peer-group opera into a governing principle that will shape your entire future life.

I personally got a lot of value from Charles Murray's Coming Apart (2012), though I started to skim after about the third chapter (his lucid introduction has by this point given way to a fusillade of tenuously interrelated minutiae). He points out how modern meritocracy (aka SATs acting as a proxy for g) now effectively gathers all the best and brightest into tiny Ivy enclaves of assortative mate selection, where previously some budding Rosalind Franklin in a small town might have had to content herself with becoming the head of the local union (actually, I think she found more headroom at Cavendish; there's no glass ceiling like a small-town caste glass ceiling). The point here is: the union boss in Podunk, Great Plains with an IQ of 165 used to function as an actual plateau, and such a person had no need to fall asleep with an iJog on the next pillow, just to keep up; in fact, such a person could probably have detoured into the secret involvement of Area 51 in the Kennedy assassination, and not lose a step in achieving her preordained career plateau.

The end result of all this is epidemic sleep debt. And the neurology is becoming extremely clear about this: sleep debt boosts emotional impulsivity and diminishes serene executive oversight. This goes right down to the subconscious facial perception of emotion: people with a sleep debt see others as more angry and more threatening, even when presented with a face intended to represent neutral emotion.

For myself, I see no way to read this literature without concluding that Zimbardo's study is full of shit, with academic incentive gone all wrong; it was just a bad study that we should all forget.

Equally, we should step away from the gun: this idea that you can easily separate the nature of the individual from the nature of the system in which the individual operates. Not so far as Jordan Peterson's take, that there's a seething tyrant embedded in each of us, constantly on the look-out for its main chance, if given the least daylight. (The Pareto distribution also applies to fruit-salad bespangled despots; this is a hard row to hoe without the special and exclusive boon of an interior hubris overdrive.)

The bottom line: with a correlation coefficient of r=0.9 (pulled out of my extremely well-informed ass) the reliability of a narrative of what someone "deserves" is inversely proportional to the strength of the conviction presented; the correct analysis of social systems is simply not compatible with high degrees of moral conviction. Personal responsibility is the mahout, biological determinism is the elephant. Yes, the elephant rarely wins a direct confrontation with the electrical goad of the iron law; whereas the mahout rarely even survives an accidental confrontation between an absent-minded, psoriatic elephant and J. Random convenient oak tree with the gnarly bark of extreme bliss. Life is a busy place. Accidents happen every day.

Our good mahout, bad mahout moral CRM 114 discriminator is now about 70,000 years out of date. See Yuval Noah Harari's Sapiens: A Brief History of Humankind (2014). Oh, how we deeply want to believe this antiquated, instinctual moral apparatus still functions as it ever has, now that assortative mating functions (at the top) on the scale of half-billion member global sub-societies, while 80% of Western democracy's youth suffers from the pervasive iJog sleep-debt epidemic.

You can't remove our innate craving for moral certitude from human nature. The more we "enlighten" ourselves collectively, the more violently it returns as an emergent cultural phenomenon (sometimes greatly abetted by an opportunistic nucleation crystal of charismatic bullshit; the bullshit relaxes your grip on your pre-existing moral center, while the charisma drags you forward into a new moral center; Jordan Peterson has this part completely right).

I don't sit here pretending I have any answers. For myself, I read many, many books (this includes many bad books, not mentioned here). Many of these books are difficult, or at least demanding. Sapolsky's book is like the War and Peace of neuroanatomy, complete with an index to the cast of characters, and quick-start appendix. I've surely got 10,000 hours of wading through arcane jargon under my belt, and for this book, I was close to crying "uncle" after the first one third. My big mistake: I tried to keep up. One does not read War and Peace (on the first pass) with a view to the big picture.

And so, yes, if all 300 million people in America would only pound their way through all the same books, things would be different (unrecognisably different, like alien booster-spice in the fluoridated water-supply different), as if people would arrive at the same conclusions, even so.

Enlightenment doesn't scale. My little corner of enlightenment also serves as its own small cisnormative box.

Is Nicholas Matte correct that there is no such thing as biological sex? (2017)

In the actual Peterson debate, Matte says he could "walk us through this" [the claim that biological sex does not exist] but, in the interests of time, he won't.

My narrow, isolated pinnacle of enlightenment was equipped to wade through the following articles in record time:

* XX male syndrome
* XY gonadal dysgenesis
* 45,X/46,XY mosaicism
* True hermaphroditism

And, since these accounts were prevalence-challenged, soon I upped my game to:
Incidence, Prevalence, Diagnostic Delay, and Clinical Presentation of Female 46,XY Disorders of Sex Development — September 2016

"Does not exist" turns out to have a y-axis on the order of 10 per 100,000-ish. That's a microscopic value attesting to "does not exist". (For extra marks: try to measure this on the Serengeti with the tools available to the world in the year that Einstein first published his theory of general relativity, which is to say, not exactly primitive tools.)
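At the quoted order of magnitude, a one-line back-of-the-envelope shows what "does not exist" amounts to in absolute heads (a sketch; the round 2018 world population figure is my assumption, not from the article):

```python
prevalence = 10 / 100_000         # order of magnitude quoted above
world_population = 7_500_000_000  # round 2018 figure (my assumption)

# "Does not exist", measured in actual living people.
affected = prevalence * world_population
print(f"{affected:,.0f}")  # 750,000
```

Three-quarters of a million people worldwide is a lot of non-existence.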

[*] If you're only going to have one pinnacle of 10,000-hour enlightenment, my chosen pinnacle is not a bad one.

Cops sure don't like people with their SRY complex in the wrong genetic place. Kill, kill, kill.

And this behaviour, too, is all determined by some other shifty genetic complex (in cahoots with thousands), right down to how society wants cops who want the SRY complex to go, for the love of God, where it belongs (as God intended).

Although fertility is possible in true hermaphrodites, there has yet to be a documented case where both gonadal tissues function, contrary to the misconception that hermaphrodites can impregnate themselves. As of 2010, there have been at least 11 reported cases of fertility in true hermaphrodite humans in the scientific literature, with one case of a person with an XY-predominant (96%) karyotype giving birth.

Doctor: I've got good news and bad news.
Patient: What's the good news?
Doctor: You're alive.
Patient: What's the bad news?
Doctor: Genetic testing indicates that you're a mosaic hermaphrodite.
Patient: Are you sure you got those the right way around?
Doctor: Uh, that's exactly what I was afraid you'd ask.

To even begin to ask this question, you need to find testicles, ovaries, and a vagina all in one undercarriage. Earth paging God: this is really fucked up. This "mysterious ways" thing was a good gig for a few thousand years, but this is surely beyond the pale. Consider yourself busted. It's one thing to have seven billion people all playing a simultaneous game of genital rectitude FTW, but at least you need to deal people a fair hand, or the entire human enterprise of summary social judgement acquires a queer smell.

So everyone, by all means, continue on with your bad cop bashing, but bear in mind, if you can, that a fish rots from the head down, and that the true head of Western society is conscious and subconscious narratives of moral rectitude, tilled over thousands of years into the cultural subsoil (score another clear point for Jordan Peterson, with a first assist to Yuval Harari).

That's how Zimbardo caught the wave on the back of such a shoddy study. He appeared to shed light on one of the high-voltage wires in the deepest place, and that's how we lapsed (yet again, and not for the last time) into collective skepticism deficit disorder.

"When I heard of the study," recalls Francis Cullen, one of the preeminent criminologists of the last half century, "I just thought, 'Well of course that's true.' I was uncritical. Everybody was uncritical."

Nothing disables the human critical capacity faster than applying a sexually stimulating voltage to the yellow wire standing in judgement over just deserts.

In conclusion, there are really only three mainstream genres in modern American film:
* teenagers have sex and die
* the big bad prances, devours, gloats, menaces without bound, and then goes splat (in some suitably grisly holocaust of blades, bullets, fire, and brimstone)
* cops gonna be cops (can be played for incompetence, odd-couple, persecution, procedure, patriotism, or dystopia)

Enjoy the popcorn, if you can.

Comment the ceramic pillow right stuff (Score 1) 456

Yes, from what I remember from university, the biggest cause of failing and dropping out was not lack of ability to pass beginning college courses but rather lack of discipline in getting up early and going to classes instead of partying and skipping classes once on your own and away from mommy and daddy.

Have you ever checked out the test score differences between owls and larks when both are forced into the "discipline" of waking up early? It's about a full letter grade to the disadvantage of the owls, when the test is taken early in the day (the effect lessens as the day continues, because the owls do finally stop yawning in mid-afternoon).

Owl performance recovers in full when allowed to sleep until their natural wake time. Check out Why We Sleep (2017) by Matthew P. Walker. It's the most authoritative general account of sleep presently available.

In most high schools (those which have stuck with traditional start times), because of age-related changes in circadian rhythm, almost all the students are owls, but some are more owls than others, and their grades all suffer (but the owl owls suffer more than the lark owls).

But sure, make rise time your go-to proxy for having the right stuff.

Why did they use ceramic pillows in Ancient China and as recently as the Ming Dynasty? — August 2017

Ancient Chinese didn't cut their hair after their teenage years. They were too lazy to clean their hair, so they managed it once and didn't touch it again for a few days. A hard pillow would have helped them to keep the shape of the hair. And the long hair would have helped them to sleep comfortably as well.

Comment Re:lumper/splitter butter churn (Score 1) 80

s/degenerates to/degenerates into/

Also, by counterfactual in nature, I mean that estimating out-of-pocket damages requires manufacturing a hypothesis about how someone might have behaved differently, leading to a different remunerative outcome, had the "theft" not occurred.

s/someone/world and dog/ if you've got Hollywood balls (and then collect a government tariff attached to blank media just in case).

Comment lumper/splitter butter churn (Score 1) 80

If I could have a dollar for every time an "insightful" post on Slashdot — since the times of Napster — lectured the audience, that it is not theft, if the victim still has his copy of whatever is allegedly "stolen"...

This happens to be roughly the same distinction as the one between murder and attempted murder.

An industrious 12-year-old with a nickel-a-week allowance can easily "steal" $500,000 in a year, as the aggrieved prefer to frame it. And then they try to collect on the counterfactual $500,000, just to keep it real.

Actual outcomes:

(A) Minor engages in data hoarding hobby — minor deflection of revenue opportunity curve.

(B) White collar professional "boosts" his copy of AutoCAD or Final Cut Pro — non-trivial deflection of revenue opportunity curve.

(C) Bootlegger uploads a protection-neutered AutoCAD or FCP to a darknet warez server — potentially a substantial deflection of revenue opportunity curve.

Here's the thing. You can have any deterrent you want, so long as the colour is black. This is why the "theft is theft is theft" crowd is so quick to postulate 12-year-olds with $500,000 endowment accounts using magic-bean counterfactual arithmetic.

Case (C) degenerates into case (B), where the actual willingness-to-pay resides. However, it also decreases the opportunity cost for (B) to engage in skinflint behaviour, and since middlemen are a pox on humanity anyway (ask Bezos), this group gets the biggest boot up their ass, at the end of the day, once identified and apprehended (if ever).

Looking past the black-only "theft is theft is theft" deterrence field, all the losses in simple copyright IP theft are counterfactual in nature. Loss of life is not counterfactual. Loss of your car is not counterfactual.

Anyone determined to pack counterfactual theft and non-counterfactual theft into the same word is doomed never to think clearly ever again. Anyone determined to segregate these two cases 100% is also doomed never to think clearly ever again.

Now, if some 12-year-old Ferris Bueller trashes your tricked-out 1961 Ferrari 250 GT California Spyder, what you have is a factual $10 million hole (after applying a 33% hyperbole deflation field).

Ferrari identical to model driven in hit film 'Ferris Bueller's Day Off' expected to sell for $15.1 million

And once again we're right back at some giant number you can't feasibly collect, so what's the difference, anyway? Answer, for the straight thinkers: one sad Ferrari corpse, made of actual metal and paint.

You don't get a $15 million car without an extremely rigid supply and demand curve.

For our 12-year-old data hoarder (with the putative $500,000 hoard), if you increase his direct marginal cost by $10 hard cash, he could well have a different hobby by tomorrow afternoon. How's that for a featherweight demand curve, floating along a passing breeze?

Lump or split, lump or split?

God, isn't it just such a tough call.

Comment Ritalin's widening gyre (Score 1) 129

That may be true this time, however, the same thing has been said for every previous technological advance.

You're doing inference from stupid. It's like wisdom of the crowds in reverse: round up all the people who've been gloriously wrong (over and over again) into a small pen, and then go opposite George.

News flash: you can't squeeze a correct prediction out of a teapot of stupid people.

The fall of Rome was predicted many times. These predictions were wrong every time—until it actually happened (only some historians dispute that this did ever happen; it kind of depends on how you choose to view Byzantium). Either way, the heyday years of the Roman empire did, indeed, come to an abrupt end (only historians dispute this too: some claim the end arrived in gradual stages).

One of the rationales floating around before the crash of 2008 was "well, the housing market has never gone down, everywhere, all at once." Until it did.

Let's just look at this from the point of view of sampling bias.

Get a large group of people, have them all make predictions about some future bright line, sort those predictions into time sequence.

Here's something that's guaranteed: if you get to the median prediction without it having come true (yet), half of all the people can be entirely written off as Chicken Littles, while the other half cannot (yet) be written off as Chicken Lates. Interesting asymmetry, isn't it?
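The asymmetry is easy to see in a toy simulation (a sketch with made-up numbers: 1,001 forecasters each predict a year for the bright line, and we stop the clock exactly at the median prediction with the event still unrealized):

```python
import random

random.seed(1)

# 1,001 people each predict the year some future bright line is crossed.
predictions = sorted(random.randint(2000, 2100) for _ in range(1001))
median_year = predictions[len(predictions) // 2]

# Reach the median year without the event happening: everyone who
# predicted earlier is now provably a Chicken Little; everyone who
# predicted later is merely unresolved (a possible Chicken Late).
proven_wrong = sum(p < median_year for p in predictions)
still_open = sum(p >= median_year for p in predictions)

print(f"provably wrong: {proven_wrong}, still open: {still_open}")
```

Roughly half the sample is refuted outright while the other half remains untestable, which is exactly the asymmetry described above.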

False positive, false negative; Chicken Little, Chicken Latte (as in, Nero cozied up to an espresso bar while Rome burned).

And here you are, trumpeting navigation by the rear-view mirror as some kind of great, refined wisdom.


Last night I was reading Sapiens (2014) by Yuval Noah Harari. I was really looking forward to this book, but to be honest, halfway into the second chapter, I'm pretty bummed out by his cavalier roll-ups. Such an enormous step down after Sapolsky's Behave (2017).

In any case, Sapiens is nothing but a litany of enduring, world-redefining change.

It's one of the main reasons people tend to predict alarming change Real Soon Now: because that's what history is actually made from. (Only people tend to forget that history is denominated on a log scale, while the future is usually denominated on a linear scale, which goes a long way toward accounting for the tragic surplus on the Chicken Little side of the fence; that, and thrill-seeking eschatology boners.)


I generally try to root my predictions about the future in the perceptions of people who can successfully translate from a log to a linear scale. Try it sometime. You might discover that No Change Ever is not the Bayesian all-world prior you imagine it to be.

Over the last century or so, the number of borderline unemployable males in Western democracy has gone from about 5% to about 15% due to the relentless inflation in the norms of educational attainment.

But a man could get by without an education, if he had physical competence and a work ethic, because there was always roofing to fall back on.

Power your Home with Beautiful Solar

Made with tempered glass, Solar Roof tiles are more than three times stronger than standard roofing tiles. That's why we offer the best warranty in the industry — the lifetime of your house, or infinity, whichever comes first. Watch our hail test video to see how we take durability to a whole new level.

Society is already failing to manufacture enough meaningful work to employ industrious, boisterous males who don't finish school.

I suspect we're more likely to address this problem in future by changing the broken educational system (broken for those whom it least serves) than by making the demands of the modern workforce less cognitively arduous.

In Colonial America, agriculture was the primary livelihood for 90% of the population, and most towns were shipping points for the export of agricultural products. ...

After 1840, industrialization and urbanization opened up lucrative domestic markets. The number of farms grew from 1.4 million in 1850 (population 23 million), to 4.0 million in 1880 (population 50 million), and 6.4 million in 1910 (population 92 million); then started to fall, dropping to 5.6 million in 1950 (population 150 million) and 2.2 million in 2008 (population 305 million).

True economic miracle: we managed to reduce a heavy manual labour sector by 90% over about a hundred years, while only increasing the number of unemployable males by a factor of three (short of another world war, which would solve the whole problem almost overnight, for a blinding value of night with a purple after-image).
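The "90% over about a hundred years" figure can be sanity-checked directly from the quoted farm counts (a sketch; using farms per capita as the proxy for the heavy-manual-labour sector is my own framing):

```python
# Farms and US population, in millions, from the quoted passage.
data = {
    1850: (1.4, 23),
    1880: (4.0, 50),
    1910: (6.4, 92),
    1950: (5.6, 150),
    2008: (2.2, 305),
}

per_capita = {year: farms / pop for year, (farms, pop) in data.items()}

peak = per_capita[1910]     # the 1910 peak
latest = per_capita[2008]
decline = 1 - latest / peak
print(f"farms per capita, 1910: {peak:.4f}")
print(f"farms per capita, 2008: {latest:.4f}")
print(f"decline from peak: {decline:.0%}")  # 90%
```

From the 1910 peak to 2008, farms per capita fall by almost exactly 90%, matching the claim.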

Let's repeat this economic miracle again (at twice the speed): we can then project that by 2070 a full 40% of the male population is borderline unemployable (and yes, according to Schumpeterian gospel 80% of the males employed will be doing new work).

I'm desperately trying to pull something out of my ass here to falsify my own scenario. But what would it be? I suppose if America gets a real conservation bug (Teddy Roosevelt-style, with every state deciding it needs its very own American Prairie Reserve) we could employ a million boisterous males in outdoor, bushwhacking jobs reinvigorating this or that American Serengeti. Someone serene and weatherproof will need to survey the lizard and duck eggs. (Coming soon, Serenity Now by annual injection, half price with every flu shot.)


There's another good name for Schumpeterian laissez-faire: Somebody Else's Problem. Only you're also right: if we don't pay attention, the system itself surely will.

But that system is not a cozy seminar chaired by Ludwig von Mises; it's mother nature, and we know how mother nature sorts out the difficult cases: by adjusting master parameters, such as r and K, consigning more unfit progeny to the genetic-junk-heap express route, as necessary, until the problem is solved.


8,000 Years Ago, 17 Women Reproduced for Every One Man — 17 March 2015

Once upon a time, 4,000 to 8,000 years after humanity invented agriculture, something very strange happened to human reproduction. Across the globe, for every 17 women who were reproducing, passing on genes that are still around today—only one man did the same.

"It wasn't like there was a mass death of males. They were there, so what were they doing?" asks Melissa Wilson Sayres, a computational biologist at Arizona State University, and a member of a group of scientists who uncovered this moment in prehistory by analyzing modern genes.

Trace your own ancestry back 8,000 years. At that rung of your family tree, there will be seventeen women for every man. The other sixteen men might have reproduced, but their entire family trees eventually withered out (possibly catastrophically, or possibly by persistent, accumulating decrements).


Here's a good question: how often in history have we replaced one major source of employment with another major source of employment involving less training, education, skill, or craft? Sugar cane, cotton, and tobacco together make one example (for which the labour supply was mainly compulsory).

How about after the Corliss steam engine (circa 1850)? Paper routes, pizza delivery, and fast-food era burger flipping, that's all I've got. Everything else that's entry level wants either a diploma or a reliably sunny disposition (if you've ever called a member of a rival sports team a douche-bag in real anger, you do not have a reliably sunny disposition, not as the modern service economy presently views this). Note: drug dealing is very hard, as well as very dangerous.

Humans burn 50 watts at rest (of the most expensive fuel known) and only deliver a 25% duty cycle (42 hours out of every 168 hour week). Out of this 42 hours, about 30 hours is invested in productive work; 6 hours is spent making hazy, mindless errors; and the other 6 hours is invested in angering the hive (purely in self-defense, it goes without saying).
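The arithmetic in that claim is internally consistent (a sketch restating the numbers above, nothing more):

```python
hours_per_week = 168
working_hours = 42
duty_cycle = working_hours / hours_per_week  # 42/168 = 25%

# The claimed split of the 42 working hours.
productive, hazy_errors, hive_angering = 30, 6, 6
assert productive + hazy_errors + hive_angering == working_hours

print(f"duty cycle: {duty_cycle:.0%}")  # 25%
print(f"truly productive fraction of the week: {productive / hours_per_week:.1%}")
```

Note the punchline the split implies: only about 18% of the week goes into productive work from a fuel source that burns around the clock.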


On the up side, we remain infinitely malleable.

Fly-in-the-ointment modern prerequisite: an enormous appetite for book learning; a correspondingly tiny appetite for sexual adventure, charismatic frauds, drugs, dust ups & high rolling.

This does not describe 25% of the male population, after having already endured 9–12 years of educational force feeding (colour me an optimist: perhaps we can yet improve the educational force feeding).

Schumpeter was surely right that work re-invents itself and expands to the talent available—modulo an ever-rising bar on net talent positivity.

It's already arguable that 15% of the modern male population is below the cut line of net talent positivity (in Schumpeter's time, that figure might have been 5%). It's already foreseeable that this figure could rise to 25% within another twenty years, short of a substantial cultural change to the male condition. (Women are somewhat exempt from this, having a higher baseline level of sociality and conscientiousness.)

Schumpeter might well have been an excellent economist, but he knew batshit all about the sociology of Ritalin's widening gyre.

Comment Re:Oh My God (Score 1) 448

Attempting to overthrow the President of the United States by members of federal law enforcement is more important. And it's ongoing.

The other name for this is calling Trump to account for his past actions. Under rule of law, these investigations are slow and deliberate. This would also be true for any common criminal, when the law is working as the constitution intends.

Furthermore, investigations into wealthy criminals are almost always slow, because they can afford to erect so many barriers of due process, launching one appeal after another. This is ongoing.

I have no issue with weaponizing due process (tax avoidance v. tax evasion), but only a world-class idiot (or brazen hypocrite) thinks you can weaponize due process to a quick conclusion.

Comment Re:What else would one do? (Score -1, Offtopic) 137

It's basically arguing that the technology is undergoing path dependence, which is no big surprise as it happens all the time in lots of areas.

Want an interesting path dependence?

Science, as an industry, is so busy defending itself from climate science denialism (in the extreme case: even that it could, in principle, be right) that science tends to hold up peer review as an exalted process of cognitive righteousness (which it is, over a time base of 50-year internal feedback cycles).

However, at the same time, peer review is also a political mechanism to enforce path dependence, which systematically biases trivial incrementalism (insignificant career fodder is fine, so long as it knows its tiny, tiny place), over profound and potentially game-changing speculation (the proper onus here should be that any definite, predictive theory which can not be presently disproved is by default considered publishable, but that's not how it works—not if it runs against the grain of the endowed, old-timer consensus worldview). To some degree this is a budgetary bun fight, because without publication, no grants; so the gate-keepers of publication are implicitly also the gate keepers of funding opportunity.

Society pays a steep price for the 50-year bullshit-rejection convergence window of peer review (though it sure beats languishing in 3000-year traditions of metaphysical navel gazing).

To some degree, science kind of likes being marginalized by the climate science deniers, because it distracts from asking legitimate questions about just how broken some of these internal political processes really are (who can patiently pose these questions when you're shouting down accusations 24/7 that you're ten or a hundred times less competent than you actually are?)


Did I mention p-hacking? What an ultimate crock. Easily predictable 50 years ago, and only now are we getting around to it.

Why the Joy of Cooking is going after Cornell's Brian Wansink — 28 February 2018

Preregistration of study designs: This is a huge safeguard against p-hacking. Preregistration means that scientists publicly commit to experimental design before they start collecting data. This makes it much harder to cherry-pick results.

What an amazing innovation. Someone hand the guy or gal who proposed that idea the Fields Medal.


Yes, all those virtuous climate scientists vigorously defending the ultimate truth machine of peer review sat around for decades barely lifting a finger to institute pre-registration of study design.

And these are the people who are going to save the planet from the greenhouse gas godzilla? Good luck with that. (My most cynical internal voice assigns a p_success_STP_G3_1v0 somewhere in the vicinity of Reagan's nakedly preposterous space laser, the Strategic Defense Initiative.)

"But boss, the stakes! But boss, the stakes!" cries the white-tuxedoed midget from Fantasy Island.

This is naked appeal to the Theory of Narrative Causality. If it must happen, it will happen.

This is what Terry Pratchett calls narrativium: the iron law that a million-to-one long shot happens nine times out of ten (precondition: all the stakes having arrived just in the nick of time at a synchronous planetary-alignment crossroad of dire urgency).

Narrative Causality was also the stock in trade enabling Reagan to float the SDI concept to the receipt of Educated Snickers Only: the stakes were sufficiently sky high to trigger narrativium normalization of million-to-one odds. (Education, by some magic power, is a potent form of narrativium kryptonite.)

Science, on the whole, has done a terrible job of reforming its own, internal, blindingly obvious path dependence.

Meanwhile, global geopolitical initiatives to address climate: no problemo.

Au contraire: big problemo.

Scientist, whispering advice in a private corridor: you wet blankets, you know, you're just making things worse, don't you? We need all the trust and faith we can get. The situation out there is dire!

Yeah, you got that right: on the matter of climate change, path dependence is pretty much all we have to work with, here.

Verily, the bare political truth alone shall not hold back a rising sea.

Sometimes I get the feeling that scientists tire of their own rigours, and just want their time in the sun to play in a giant sandbox of infinite stakes (the enduring greenish tint of the entire blue marble) where path independence is an obvious non-starter, right from the get go.

Ah, it must be nice—for once—to lounge in the cool beach breeze of narrativium-narcotic global salvation.


For 99% of television viewing, the illusion of acuity would more than suffice.

I foresee a day when sports broadcasting is camera-angle independent: all the camera angles are synthesized by a 3D-vision AI which burps out a minimum-cost polygon texture model, but with the new improved polygons of DNN self-organized feature space.

Then the receiving television can reconstruct this to any human-perceptible pixel density, from any reasonable camera angle (what the actual cameras see directly is reconstructed with high verisimilitude; everything else is supplied by TG2BT interpolation—too good to be true interpolation—without fail more convincing than real life).

This whole model would smash video coding upside the chops. But it doesn't necessarily require less than 25 Mbit/s (not if the television is consuming less than a kW for reconstruction).

You'll also get high-fidelity super-slow-motion replay for free.

The downside: good luck ad-blocking all the product placements that occupy every non-essential blade of grass (or glass) on the entire iron grid, regardless of whether your preferred sport's competitive quadrangle is limned in red, white, or blue.

The upside: who needs the real athletes, anyway? You can generate just as much drama from a fictional league of manufactured athletic profiles (with heavy use of the TG2BT restraining bolt). They can also talk fantasy shit, and have their own fantasy Twitter accounts. What's the difference?

I'll tell you the difference: the Fantasy Hacker Sports League features mostly no ads.

(Careful making the fake cheerleaders too similar to any person, living or dead, or you might run afoul of the deepfakes copyright cops. Yes, they are available for further rendering after the game, but somehow or another we'll keep the seamy post-game underside a darknet Cheetos secret. Honestly, how can the real NFL even hope to compete?)

Comment do we not remember TOS? (Score 1) 60

TOS had terrible numbers, except among the demographic advertisers would later cherish above all others. (Advertisers are slow on the uptake.) So there it was, TOS hanging by a budgetary thread throughout its lame third season.

NBC at first planned to move Star Trek to Mondays for the show's third season, likely in hopes of increasing its audience after the enormous letter campaign that surprised the network.

But in March 1968, NBC instead moved the show to 10:00 pm Friday night, an hour undesirable for its younger audience, so as not to conflict with the highly successful Rowan & Martin's Laugh-In on Monday evenings, from whose time slot Laugh-In producer George Schlatter had angrily demanded it not be rescheduled. In addition to the undesirable time slot, Star Trek was now being seen on only 181 of NBC's 210 affiliates.

Roddenberry was frustrated, and complained, "If the network wants to kill us, it couldn't make a better move."

He attempted to persuade NBC to give Star Trek a better day and hour, but was not successful. As a result of this and his own growing exhaustion, he chose to withdraw from the stress of the daily production of Star Trek, though he remained nominally in charge as its "executive producer".

This is what you get when you grant an implied equivalency between exhausted eyeballs playing out the string to the tune of Maury Povich or Kim Kardashian and a smart-ass teenager with a working brain binge-watching Crash Course History.

This particular "more than" bucket (Internet v. television) is fit to make a clueless Mad Man weep a saline river for the lost marketing paradise of Atlantis—where all eyeballs were equal unto the market, as stipulated by the Nielsen Ratings Equivalency Act of 1951 BCE.

[*] Atlanteans routinely over-simplified their public sphere by decree, all the better to free up more time for "doing it" in such an immense variety of non-procreative ways (nascent gills add so many buoyancy options) that finally God was forced to summon up a wet, wet, wet collective express train to hell. Turns out, there are some cultural channels that even God can not bear to watch, day in and day out. Povich apparently makes the grade, where Atlantis didn't. These Atlanteans, so much skin, and their guts don't even churn—simply unbearable. Be gone, channel, be gone.

Good grief, spare a clue for what you're lumping together.

Comment pivot language? (Score 2) 46

Is English considered to be the pivot language, or do all of these models produce the same intermediate representation?

Rather useless article, with no shred of a deep understanding, whatsoever.

I'm guessing you run the input model from language to IR, and the output model from IR back to language, so you need to have at least two models to use this app. (I suppose you could translate from English to IR and back to English again, for perverse joy.)

Only I haven't read anything about training multiple machine translation models with a shared IR. That strikes me as technically difficult, and I would have thought I'd have seen some loud crowing out there, had it been achieved (it's now been a couple of months since I gave the Internet a good shake on machine learning, and things move fast).
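The two-model composition guessed at above can at least be sketched. Everything here is a toy stand-in: the word-level "encoders" and "decoders" and the tiny interlingua vocabulary are invented for illustration, not anything from the actual app.

```python
# Toy sketch of pivot/interlingua translation: each language pairs an
# encoder (language -> IR) with a decoder (IR -> language). Translation
# composes any encoder with any decoder, so n languages need n model
# pairs rather than n*(n-1) direct language-pair models.

EN_TO_IR = {"hello": "GREETING", "world": "WORLD"}
FR_TO_IR = {"bonjour": "GREETING", "monde": "WORLD"}

ENCODERS = {"en": EN_TO_IR, "fr": FR_TO_IR}
DECODERS = {lang: {v: k for k, v in table.items()}
            for lang, table in ENCODERS.items()}

def translate(text, src, tgt):
    """Encode src-language words into IR symbols, then decode into tgt."""
    ir = [ENCODERS[src][word] for word in text.split()]
    return " ".join(DECODERS[tgt][symbol] for symbol in ir)

print(translate("hello world", "en", "fr"))    # bonjour monde
print(translate("bonjour monde", "fr", "en"))  # hello world
```

The round trip through the shared IR is the whole point: adding a third language means adding one encoder/decoder pair, and every existing language can immediately translate to and from it.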

Comment Re:Wiat a min... I thought Intel was done... (Score 1) 99

Didn't we have a story last week about how Intel was on death's door because they couldn't get their yield on the new chips high enough?

10 nanometer

Currently Intel's 10 nm process is denser than TSMC's 7 nm process and available in limited quantities, but volume production is delayed until 2019. However, TSMC's 7 nm process [in name only] is planned to be soon available in high volume shipments or mass produced devices.

After you've been the 800-lb gorilla for four decades, death's door is merely running abreast, because gorillas don't historically adapt well to sustained sprints.

But in this case, everyone on the track must soon round the corner onto the EUV obstacle course.

This 10 nm design rule is considered likely to be realized by multiple patterning, given the difficulty of implementing EUV lithography.

TSMC is surely feeling their oats these days, but tackling EUV from the vanguard position has to practically scare them pantsless, if they've got any sense; they'll probably concede half a step to the once-tireless vanguard gorilla.

I imagine this will only be temporary, though. EUV from the vanguard might just be Intel's last gorilla glory. But it buys them time, so they're definitely not an 800-lb shaggy dead man running abreast, just yet.

Comment coaching smart people dumb (mute) (Score 1) 164

There's a lesson here. If you have a good idea, don't fucking tell Google about it! Don't put it on your android phone, don't discuss it in email, don't type more than you have to in the search bar.

Classic example of availability bias.

The vast majority of inventions are lost to the world because the person who thought it up (in a form that was by no means complete and practicable unto itself) failed to solicit enough outside involvement to fully move the idea forward.

It's simply human nature that ideas die when not shared around and chewed collectively.

This has a lot to do with fueling the lone genius myth, because only weirdos like Tesla (and he was very weird) have what it takes mentally and emotionally to go it alone.

Most clever monkeys who select your recommended door #A seriously overestimate their intestinal fortitude, wherewithal, and life course. Then we tremendously celebrate the few who prevail over these dim prospects. Probably in most cases, clever monkey is far better served by selecting door #B: ensconce the idea into the public domain as quickly, and vigorously, and thoroughly as possible. Definitely mention all the ways the idea might play out or become applied in a practical scenario.

If the idea seems to gain any kind of social or economic traction, patent some lucrative corner case. I don't counsel against withholding some narrow, special tricks. If you've invented anything substantial enough to be worth this conversation, you've probably accumulated, in your years' worth of preliminary thrashing, more than a few exceedingly narrow, special tricks.

So You Want To Write Your Own Language? — January 2014 by Walter Bright

First off, you're in for a lot of work … years of work … most of which will be wandering in the desert. The odds of success are heavily stacked against you. If you are not strongly self-motivated to do this, it isn't going to happen. If you need validation and encouragement from others, it isn't going to happen.

No, I didn't look that up before writing the above. And it was on the first page of links that came up in a Google search "inventing a computer language difficulty".

Over the years, as the world has become ever more social, I've become increasingly convinced that this antisocial stiff-upper-lip door #A is tragic advice, 99 times out of 100.

If you're Walter Bright, YMMV. But Walter certainly wasn't reading Slashdot for prudent counsel. He was entirely of his own mind from the get go. The bright solitary lights tend to come fully equipped with a blanket-armour disdain for the rubes around them (sometimes graceful, sometimes polite, sometimes neither).

Moral of the story: if you need to ask, you can't afford it.

Comment wall flower culture shock (Score 1) 105

... originally only expected to survive for a few months ...

I think you're abusing the word "expected".

There was a high-likelihood failure mode (involving dust accumulation) baked into the mission parameters; the budget centered on achieving a minimum sufficient return on investment (the worst outcome of all in space exploration is no learning).

But if you'd asked anyone involved with a clue, they'd have said that the uncertainty around the dust accumulation model was high, and that most of the engineering had been done to a standard where a decade of nearly fault-free operation would be considered normal (otherwise the sum of parts wouldn't outlast xmas morning).

In NASA planning culture, the word "unexpected" doesn't convey the sad punter baggage you're implying out of context. In NASA planning culture, foreseeable adverse events get all the hasty index cards. They don't tend to invest up front in thick mission planning binders for unexpected (meaning: pleasant surprise) thin-atmosphere windfall. NASA doesn't plan for the worst, and hope for the best. 99% of the time, NASA plans for the worst, and then re-plans for the worst.

And then when the day comes and there's an eerie adversity absence, and everyone is standing around with not much to do and a blank look, it's not so much a mental surprise (unexpected) as an emotional shock.

Dance with the one that brung ya is strangely discomfiting on emotional terms when the one that brung ya is total institutional paranoia, with thick binders devoted to taking a single careful step.

Unexpected dance floor vacancy rate? Hardly at all.

Comment Re:Why is this surprising? (Score 4, Funny) 154

It's amazing because even humans living in Rome thousands of years ago created a way to write numbers but no way to write "none".

They had a way to write nothing: by writing nothing.

The Romans were pragmatists. This saves paper. Imagine the cost of inscribing "zero Bugblatter Beasts" on every urn, vase, and ceramic dufflebag?
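To make the joke concrete, here is a minimal sketch of a Roman-numeral encoder in which zero is written exactly as the Romans wrote it: not at all. (The encoder itself is standard subtractive notation; only the empty-string-for-zero framing is this thread's.)

```python
# Standard subtractive Roman-numeral encoding; zero yields the empty
# string -- the Romans' paper-saving notation for "none".
ROMAN = [
    (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
    (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
    (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
]

def to_roman(n):
    """Encode a non-negative integer; zero encodes to nothing at all."""
    out = []
    for value, numeral in ROMAN:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

print(repr(to_roman(0)))  # '' -- zero Bugblatter Beasts, free of charge
print(to_roman(2018))     # MMXVIII
```

No symbol consumed, no marble wasted, no laps around the Colosseum.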

Turns out, some nebbish recruit did invent zero, but the squadron leader spotted the unfamiliar symbol one day and then he said "what the fuck is this?" and somebody said "it means we didn't get any X in our rations this month" and then the squadron leader's veins bulged out of his neck while he barked "who's the jackass wasting a perfectly good resource to record what he didn't get?" and then the jackass had to run 100 laps around the Colosseum draped with a heavy marble placard reading "lion food / reward offered"—this while the people inside were cheering the lions (more than once he panted out excitedly "look, an elephant!" pointing at some unlikely bush when people got too close for comfort, while summoning yet another painful micro-sprint, and through this device he did avoid detection in the end).

Never made that mistake again. Not ever. Neither did anyone else, which, of course, also means that no-one was foolish enough to write a line itemizing the empty set of damn fools (many of whom invented zero, but knew better than to write it down).
