
Comment Re:We Wish (Score 1) 663

Kunstler doesn't add much to the question posed. He buries the meat of his argument under this horrible diatribe:

You could call these two examples mendacious if it weren't so predictable that a desperate society would do everything possible to defend its sunk costs, including the making up of fairy tales to justify its wishes. Instead, they're merely tragic because the zeitgeist now requires once-honorable forums of a free press to indulge in self-esteem building rather than truth-telling. It also represents a culmination of the political correctness disease that has terminally disabled the professional thinking class for the last three decades, since this feel-good propaganda comes from the supposedly progressive organs of the media -- and, of course, the cornucopian view has been a staple of the idiot right wing media forever. We have become a nation incapable of thinking, or at least of constructing a consensus that jibes with reality. In not a very few years, the American public will be so disappointed and demoralized by broken promises like these that they will turn the nation upside down and inside out, probably with violence and bloodshed.

What did that accomplish, exactly? He sounds like a call-in radio host winding up his faithful windbags before opening the switchboard to a long queue of flashing lights. Did that actually help anyone think? I think not. It's just a long clatter of power words. If we had access to a time machine for a single trip, and we sent someone back to explain to Isaac Newton what the world looks like nearly four centuries later, there are about 49,850 words from a 50,000 word vocabulary that would serve far less well than "cornucopia", even before writing down E=mc^2, explaining the energy content of a gram of matter, and adding that we've already harnessed this, and have very nearly harnessed it as well as the sun does (which has, if he's curious, several billion years remaining of happy middle age). So then, after drilling down into specifics for a week or three, he might ponderously observe "Now I understand. There was a temporary energy glitch circa 2030 which caused great consternation with ten billion mouths to feed and dime-store weapons of mass destruction ready to hand." He's underappreciated for his sharp ear and biting humour.

If we had an unlimited supply of oil (very nearly true if an efficient process is discovered to convert coal into oil) then we'd be game on for climate roulette. If we had a mostly unlimited supply of energy, then we'd have to start dealing with the fundamental problem that any good physicist would quickly identify as far more severe than an energy deficit: shedding waste heat from the hot blue marble. There's no future where we can continue to use energy as unwisely as we did during the global boom of the 1950s and 1960s.

Yet the real game changer, if we get there in one piece, is the transition from global population growth to global population steady-state. Rapidly growing populations have fundamentally different priorities than equilibrium populations. Personally, the thought of six billion middle class adults racking up 10,000 airmiles annually for mild respite from the 40/40/40 makes me shudder with disgust, so I'm mostly hoping the oil supply remains tight until we're ready to ante up to some fundamental societal change.

Comment blowhard shills (Score 1) 331

I don't know whether it qualifies as a fallacy, or has a name if it does; but arguments of this particular style always annoy me

It's absolutely a fallacy, which falls under many names, starting with the Straw Man fallacy.

It's so ridiculous I had to look it up again.

"Our goal is to make the world better. We'll take the criticism along the way, but criticisms are inevitably from people who are afraid of change or who have not figured out that there will be an adaptation of society to it," he added.

Here's another version:

People who make this kind of argument are blowhard shills (or, apparently, blowhard shill detractors).

I almost count myself as a card-carrying member of personal biometric Total Recall, and yet I'm far from immune to criticizing Google Glass.

Comment yanking the curtain strings is NOT leadership (Score 2) 231

"What's genuinely difficult is that both I and a bunch of people that help make choices, genuinely care about what other people think," Shuttleworth said. "We go through a lot of trouble to accommodate other folks."

Huh, that's why I recall getting the memo from Mark, early on in the Unity adoption cycle, that there would be a transition period that would suck donkey balls for power users with dual-head workstations, explaining that while he realized this would highly inconvenience certain user demographics, making tough decisions is necessary to the future success of Ubuntu.

That's why he so cleverly timed the transition so that the users most inconvenienced could wait out the dual-head donkey-balls fiasco on an LTS release. No wait, neither of those things was true. He went to no trouble to help other people accommodate themselves.

From Leading Change by John P. Kotter (p.88):

One of the main reasons that vision creation is such a challenging exercise is that those guiding the coalition have to answer all these questions for themselves, and that takes time and a lot of communication. The purely intellectual task, the part that could be done by a strategy consultant, is difficult enough, but that often is a minor part of the overall exercise. The emotional work is even tougher: letting go of the status quo, letting go of other future options, coming to grips with the sacrifices, coming to trust others, etc. Yet after they are done with this most difficult work, those on a guiding coalition often act as if everyone else in the organization should become clear and comfortable with the resulting vision in a fraction of that time. So a gallon of information is dumped into a river of routine communication, where it is quickly diluted, lost, and forgotten ...

So why do smart people behave this way? Partly, the culprit is old-fashioned condescension. "I'm management. You're labor. I don't expect you to understand anyway." But more important, we undercommunicate because we can't figure out a practical alternative: Put all 10,000 employees through the same exercise as the guiding coalition? Not likely. [My emph.]

Yes, Mark, I get the necessity message, and I always have. What I don't get is all the condescending bungling around proactively communicating this vision (and perhaps offering better transition options) so that more of us could have remained in the fold.

In Shuttleworth's view, the nastiest thing that people can do is to set up unnecessary tension.

You mean the tension about whether you communicated the Unity change well enough, soon enough? Bite me. Seriously, I hope Unity grows up to become everything you dreamed it would be. But excuse me if I don't hang around in a neighborhood where roads are demolished before signs are posted.

Comment Re:Relevant xkcd (Score 1) 233

http://xkcd.com/793/

My field is <mate selection>, my complication is <social transactions in symbolic discourse>, my simple system is <you> and the only equation I need is <you're not getting any>. Thanks for offering to prime my pump with higher mathematics. But you know, if you'd like to collaborate on a section on this intriguing technique of speaking in angle brackets to deliver a clue where no clue has gone before, perhaps we should meet for coffee—if you can restrain yourself from dismantling the social milieu long enough to drain your mug.

Pauses to observe patiently as the word "milieu" penetrates the physicist's long-forgotten amygdala with the deep impact of an entire bottle of earthquake pills, whose fine print reads "not effective on physicists(*)" with a footnote (in even smaller print) reading "unless first assailed with angle agonists of his own devising".

Comment serving a recall notice on "Don't Be Evil" (Score 1) 225

"If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."

That statement is nothing shy of a Full Monty disgrace to free enterprise. Nobody ever has a nice thing to say about government, and this leads to the comforting illusion that we can devolve the beast of government (for the most part) to the free market where much of government's function would be better served, until some high and mighty idiot in the private sector comes around saying something like this and bursting everyone's happy bubble. Well done, Eric, running the graduated approach to managing one's personal boundaries straight to the tip heap, for the betterment of all society. Yes, this is exactly what government by quarterly report will look like when that fine day finally comes. Book it.

There has never in history been a society that has strayed so far into the glass fishbowl: in a closed community where no behaviour goes unnoticed, living quiet lives of desperation is the order of business. Woe to anyone who dares to shirk this shackle (a theme of the very difficult movie Breaking the Waves). And yet, this too is not enough?

What a pompous ass to make such a remark. So close, and yet so far. Google could have been so much worse. For a long stretch, their sane and (relatively) moral decisions far outweighed their missteps. Then they caught wind of Facebook eating their lunch, and now they seem hell-bent on making up for lost time. I can barely express my disgust at the implications of that remark.

There's that old joke about Gates declaring darkness "the new standard". Now we have Schmidt declaring the naked light bulb in the holding cells of the Lubyanka as the new, unceasing dawn.

I was reading about circadian phase entrainment the other day. In the hamster model (which I say generically, forgetting the precise rodent flavour) they use constant dim light to establish the free running state (which is actually the free running state in constant dim light). They don't use constant bright light, because constant bright light causes the cells of the suprachiasmatic nucleus to lose synchrony (effectively destroying the body's internal circadian signal altogether). In the torture setting--if that is in fact the purpose of the unblinking naked light bulb hanging above arm's reach in every cell--loss of circadian rhythm would have an effect on sleep that would promptly dissolve and disintegrate all sense of perspective and self-hood. This is, of course, what they wish to achieve. One doesn't torture the whole man, one tortures the wretched shell, so that the whole man shall never take up residence ever again.

Praise be to Google, keeper of the constant light.

Comment selection pressure against "random" (Score 1) 181

I've always had a harsh relationship with terminology that subtly obscures. As such, I hated the term "junk DNA" from the moment I first encountered it long ago, instinctively reading it as "when sequenced, consumes huge amounts of grant money for results I can't publish". It struck me as ludicrous on its face that a combinatorial system engaged in adaptive "tuning" would eschew linearity where it could inject some on the cheap. We now know that much of the noncoding DNA is under heavy selection pressure. How anyone expected for a microsecond not to find mechanism there is beyond me.

Similarly, I've never been terribly pleased with "random mutation". It strikes me that if adaptation is adaptive (and therefore under selection pressure), then "random" must in some deep sense also be under selection pressure to become something not entirely or precisely captured by the word "random". "Random" turned out to be a deep word taking us down the path of Von Neumann, Shannon, Knuth, Chaitin, Kolmogorov, and recently into the terrain of Jurgen Schmidhuber.

This leads one to contemplate higher orders of viability, where say some branch of the evolutionary tree accumulates useful variation more quickly than another, due to some mutation having biased the "randomness" of mutation into a more productive or exploratory sub-space, and then this evolutionary branch inexorably out-competes other evolutionary branches less nimble in the adaptation arms race. Unfortunately, this notion perhaps reeks a bit of "group selection" as taken to task by Steven Pinker in one of his pieces at Edge. (In that piece he does mention that "random" is better read as "blind to outcome" but I still think that falls slightly short, as if mutations are only ever tasted once.)

Is there a sensible way to discuss or formulate selection pressure against the nature of the adaptive system itself? Are we sacrificing an important intuition by hiding this process, whatever it might look like, behind the customary word "random"? Just how necessarily blind must the genetic system remain? Obviously not completely blind because modern humans (obtained via evolution) are now capable (in theory) of designing evolutionary systems optimized to evolve more vigorously per an internal representation of viable evolutionary pathways as defined in some mathematical sense. But could this have bent back upon itself far sooner in evolutionary history without the detour through an "intelligence" phenotype?
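
To make the tame end of that question concrete, here's a minimal toy sketch in Python, in the spirit of self-adaptive evolution strategies, of what it could look like for the "randomness" of mutation to sit in the genome itself and therefore come under indirect selection. Everything here (the fitness landscape, the function names, the parameter values) is my own illustrative invention, not a claim about how biology actually does it.

# Toy sketch (illustrative only): mutation parameters riding along in the
# genome, in the spirit of self-adaptive evolution strategies. Each
# individual carries its own mutation step size, so the "shape" of the
# randomness is itself exposed to selection.
import math, random

def fitness(x):
    # arbitrary single-peaked landscape; stands in for "viability"
    return -sum(v * v for v in x)

def evolve(generations=200, pop_size=50, dim=5):
    # genome = (solution vector, per-individual mutation step size)
    pop = [([random.uniform(-5, 5) for _ in range(dim)], 1.0)
           for _ in range(pop_size)]
    tau = 1.0 / math.sqrt(dim)  # a common rule-of-thumb learning rate
    for _ in range(generations):
        offspring = []
        for x, sigma in pop:
            # the step size mutates first, then is used to mutate the vector
            new_sigma = sigma * math.exp(tau * random.gauss(0, 1))
            child = [v + new_sigma * random.gauss(0, 1) for v in x]
            offspring.append((child, new_sigma))
        # truncation selection on the combined pool: step sizes that happen
        # to generate fitter children hitch-hike into the next generation
        pop = sorted(pop + offspring, key=lambda ind: fitness(ind[0]),
                     reverse=True)[:pop_size]
    return pop[0]

best_x, best_sigma = evolve()
print(fitness(best_x), best_sigma)

The point of the sketch is only that lineages whose mutation step sizes generate fitter offspring drag those step sizes along with them; whether anything like this could have bent back on itself in real evolutionary history is exactly the open question above.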

An example: Most biology shares the same genetic code (assignment of codons to amino acids). There seems to be a lock-in aspect, despite the genetic code as established perhaps being less than optimal as an error shuffling substrate, for some not terribly proximal notion of optimal (which is problematic). We could change that now, if we wished, to produce an organism much like ourselves, with a very different adaptation profile into the future. Welcome homo mutabilis.

Comment Should we not distinguish "general" evolution? (Score 1) 181

Before my question, I'd like to express some gratitude for the influence of your work on my life. My first experience of your ideas was through your book Infinite In All Directions, which I purchased when it was newly published (circa 1990). On the front cover of my edition there is a blurb from the Washington Post Book World which reads:

The bedazzled reader emerges feeling like he's been in a metaphysical Maytag on spin cycle—his perception on man, God and the cosmos permanently altered.

That's not how I experienced it. I experienced it as having taken a delightfully straight-forward day trip on a paved highway away from my mundane reality in the metropolis of small minds where discourse obeys speed limits seemingly devised to protect muddled adults from clear-minded children. I wanted to crawl into the Maytag and live there.

The blurb finishes:

Dyson's language, reminiscent of Orwell's, is eloquently plain, wrought with the unaffected grace of a man certain he has something to say. [An] exuberantly stimulating book.

This came across most strongly for me in the chapter Quick is Beautiful and your discussion of space butterflies. I was still too callow to appreciate how much an educated person must unlearn to return to plain language. I became angry that so many other books are written less well. From your example, it didn't seem so hard. Looking back, that ease seems to derive from a mental rigour in attending to your point of departure and keeping it clearly in view: it's not so much a clarity of language (though this is present), as a clarity of frame.

That exact moment, finishing your book on that porch on that June afternoon I still recall so clearly, and then flipping it closed to contemplate that exact blurb, remains with me as the first conscious seed of my own "geek manifesto", reminding me always that there is the hard work of having an idea, and the harder work of not having an idea (but pretending you do). Thank you so much.

Continuing in that vein for a moment longer, I was also deeply struck by a passage early in the book about the Shotgun Seminars and the anecdote about Jan Oort.

At our Institute in Princeton we sometimes organize meetings which are announced as Shotgun Seminars. A Shotgun Seminar is a talk given by an Institute member to a volunteer audience. The subject of the talk is announced a week in advance, but the name of the speaker is not. Before the talk begins, the names of all people in the room are written on scraps of paper ... ceremoniously ... picked out at random. The unbreakable rule of the seminar is that nobody whose name is not in the box may listen to the talk. This ensures that everybody does the necessary homework. The audience is at least as well prepared as the speaker. The audience is ready to argue and contradict whenever the speaker flounders. Anybody who has not given serious thought to the subject of the seminar had better not come.

I loved that passage. Upon reading that passage I had the first clearly articulated moment of regret of my adult life. Why didn't someone tell me this when I was fourteen and still in school so that I could have at least enjoyed a clear notion of what the entire system was shirking? This has remained my private useful definition for the phrase "doing your homework" ever since; that other loathsome scholastic busywork formerly known as such shall remain nameless, having been donated to a better cause.

The associated anecdote from your book is that Jan Oort at the age of eighty-two drops in unexpectedly and pronounces himself fit to participate "on the stability of stars revolving around the center of the galaxy" on no advance preparation whatsoever, saying "no problem, I stay" and though he doesn't become the speaker, he does give the most lucid recap. This was inspiring to me at the time, and it still is.

Here begins my question

I've been pondering for the last while the notion that perhaps one of the biggest mistakes of science is that while we have special and general theories of relativity, we haven't clearly distinguished special and general theories of evolution.

In my pondering I tend to ascribe the special theory to the natural selection part. The general theory (which in my opinion we do not yet have) then concerns how systems defined by random variation and natural selection accrue complexity (the mysterious man behind the curtain of the "ascent of man" iconography Stephen Jay Gould so deeply despised). Can the fossil record even speak to the question of "Why not something else?"

This strikes me as unfortunate because the special theory is entirely pedestrian (once you have it) and the entire theological fracas lies on the general side of the ledger, high and dry above the rising GenBank tide which brooks no sane opposition. The special theory of evolution becomes essentially a descriptive (and not terribly contentious) claim about the combined fossil and genetic records: wherever the mesh is revealed, all the gaps will be mutation small (for non-theological values of "random"), the dispersions geographically plausible, and the phenotypes viable in their environments. Whether to view this colossal descriptive back-story of life on earth as natural history or Slartibartfast's imprimatur is another matter. A descriptive theory need not concern itself with what actually "happened", but merely with whether observational constraints are satisfied, which greatly constricts the scope of debate (so, too, with electrons). There's also a forward prediction in the special theory, which will pack a wallop four generations from now when a young child first receives his "mutation tree" dating back to his great great grandparents and then confronts their views on this matter encapsulated in their Facebook memorials (as excerpted in their own words, plus the algorithmic digest truer than true).

You write yourself in your small book The Origins of Life that:

How did it happen that, as life evolved, death continued to be commonplace while resurrection became rare? What happened was that the catalytic processes in the cell became increasingly fine-tuned and increasingly intolerant of error.

This is posited within the frame of a "toy model" about which you say:

The general theory of molecular systems obtained in this way allows us to define what we mean by the origin of metabolism but does not allow us to predict under what conditions the metabolism will occur. The second stage consists of the reduction of the general theory to a toy model by the assumption of a simple and arbitrary rule for the probability of molecular interactions.

In light of your profound mathematical metaphor of symmetry breaking (death cleaves from resurrection by enzymatic first principles), would you not concur that we've somehow unnecessarily entangled the public debate by lumping "how it goes" (natural selection) with "where it goes" (arriving by some mysterious means at the complexity of life as we know it)—all this flying under a single banner? Speciation would be a far less interesting phenomenon if it consumed so much evolutionary "heat" that the entire system went cold. Yet, somehow, it doesn't seem to. There's a pump somewhere, or some diagram of combinatorial fecundity with an under-cusp labeled "cold chicken soup".

Optional: Is it even fair to claim that we have a theory for the magic evolutionary metabolism of complexification, beyond these suggestive hints from toy models such as your own? If our complexity surfeit (under some view I couldn't begin to articulate) equals or exceeds one non-blind watchmaker, we're right back to theology again.

Comment Re:Whats the alternative? (Score 1) 863

If you ran consumer Windows starting around Win 3 when it became popular ...

Bun fight erupts.

I intentionally traced the non-NT chain, as those WERE the consumer versions of Windows.

I fail to understand how a thoughtful person can wrap their insight around the word "popular" and not expect to retrench in the immediate aftermath.

It was never "popular" that Windows 9x was a crashy piece of shit. But human dynamics being what they are, when a young couple purchases a shared PC that A) hardly supports any popular video games, and B) has no crashy-POS drivers for that cherished $50 plastic shell of consumer landfill, the least proficient shared user will scream bloody murder that the playing field is tilted. So the consensus is to purchase the crashy POS, which puts everyone in the same ugly boat.

Or, if the most proficient user is flush, two PCs are purchased. Now the proficient user has A) a balky system that's hard to install and configure but rarely crashes, and B) a horde of the hopelessly naive demanding "technical assistance" to accomplish the technically impossible. If you simply say "it's an unstable POS by design" you're not regarded as a team player. They say "what should we do instead, should we install Windows 2000" and you'll say "not if you expect any of that consumer crap you've purchased in the last two years to ever work again". "So what do we do, then?" "Nothing, you're totally screwed by your thrall to impulse purchases. Mess with it enough to get your average uptime to four hours plus, and then tough it out ... it helps if you find yourself something useful to do during the frequent reboots, like scrubbing the toilet". "What if we buy an Apple?" "That works just fine if you part with three large every 30 months, and by some miracle a saviour is found for the good ship Apple."

Phase II: Proficient users migrate en masse to Linux/BSD to avoid the social nuclear winter of being Microsoft's unpaid technical support battalion in the pursuit of consumer expectations that Windows 9x was never designed to satisfy. Popularity is left to run its course, precisely to where it has finally found itself.

Comment unplanned non-obsolescence (Score 3, Interesting) 564

Unplanned non-obsolescence is the dumbest thing I've heard since breakfast, which puts it in with some stiff competition.

How about frantically, desperately deferred non-obsolescence? How about IE6, Exchange, and Office suite document non-portability as a modern-day Maginot Line, equally doomed?

But in the end, what could they do? We were clearly entering the end-game on the desktop PC as a rain-maker a full ten years ago.

Meanwhile we managed to gadgify consumption with pocket trinkets where the entire device costs about the same as any decent ISA expansion card back in the day. Because they are autonomous (and you can lose them under a sofa cushion) each gadget is separately counted. It's a bit like counting remote controls instead of televisions, but we'll ignore that.

And best of all, according to the true nature of innovation, we now have the cyanide-green Apple business model of landfill-express non-replaceable batteries. Microsoft and their OEM cabal are green with envy that they can't sell a PC whose golden age is so effectively knackered. That was not their father's green. The times they are a-changin'.

Comment Re:Sony v. Hotz (Score 2) 299

Sony at no point ever had any arrest powers. Could they petition the government to do so? Yeah, but they themselves can not. That is why it is ultimately the government who censors.

"The" government does not run around with guns. That would mainly be members of the police and the military. If either (or both) chose to ignore the orders of government, how would government force them to obey? Issue further orders, also to be disobeyed?

The police, having the powers of government behind them, do a lot of shit the government does not make them do, because they can, and who will arrest them if they cross the line? Their own? Only for major transgressions (such as not sharing the loot, or ratting out your own). In every police force, you find both kinds.

Conceptually power devolves to the government. Pragmatically, money talks, congress-critters lean to the green, and there are many leaks in the system where deputies can free-lance.

I'm flabbergasted when I see people willingly analyzing government as some kind of unblinking Eye of Providence. Governments are made of people, and people are just not that well organized, except in so far as the world gets wired up like the movie Brazil where the power of government is everywhere, but the intention of government is nowhere, as represented by the beetle who disrupts the teletype.

Censorship by a thousand cuts is still a pretty damn effective form of censorship at the end of the day. We can oppose these cuts one by one by voting with our wallets.

Comment Re:I have one, and really like it. (Score 4, Informative) 140

I've had mine for about 5 days now. So far it's worked pretty much flawlessly. It was a bit thicker than I anticipated, fairly large in the frame, and maybe not suitable for a woman's wrist, although the screen itself is small enough if the frame were a bit more compact. I've got a second one on order, in one of the colours they are not yet making.

I was surprised to get a notification this morning with my phone in one corner of the house, and my watch in the opposite corner (on the sill in the bathroom while taking a shower). I really didn't expect the BT to work at that range with a 90 degree bend from a large room into the hallway and then through a closed door at the far end. Perhaps it was a bit of a fluke. Not enough data yet.

The vibration is surprisingly audible on the wrist, and even more so when the watch is left lying on a flat surface. This partly makes up for not having a beep.

Features are pretty limited as it stands, but the interface is dead easy to use with the four buttons provided. On the plus side, one can program a large number of distinct alarms. On the down side, there's no way to disable an alarm without deleting it completely.

I have two LCD screens on my desk. One is polarized horizontally, and the other vertically. With my polarized shades one display goes so dark I wonder if it's turned on--until I tip my head to either side. This causes the watch display to look a little funky when there are no other lights on in the room: different regions go darkish as I tilt my hand. For two puzzled seconds earlier today I thought the e-paper display had leaked.

It's super visible in bright light and a little hard to read in early dusk before you reach for the light switch. I turned my backlight off to better monitor battery life without accidental backlight activations. The wrist flick works, but it works too often if you have busy hands. No, I don't mean typing. No, I don't mean stereotypical activities, either.

I would never have bought one without the promise of an SDK to allow me to make customizations. There are aspects of my life not tied to a 24 hour clock, and I want my watch to display these other relationships as well as standard time. I'm happy enough with it, but it's just a silly toy for me until Pebble releases their SDK.

Comment Re:How modern! (Score 1) 75

Needless to say that there is absolutely no excuse for having such problems in the first place; if you can't write consistent interfaces, you have no business designing the core API of any programming language, period.

I guess you missed the memo that the K&R string functions are deprecated in many projects such as OpenBSD, which has its own recommended set of string functions.

Way back when, Iverson and his APL cronies put a great deal of effort into defining the APL arithmetic operator set to conform to the largest possible set of simple arithmetic identities. Has the definition of the modulus operator concerning negative arguments been consistent in all languages since? That they shrouded this deep elegance with inscrutable Greek letters matters exactly how? They wrote a paper detailing all the identities they had discovered concerning the APL operator set. I've never seen a single other language bother to do this. Perhaps because identities written out with floor() and modulus() and spzkrm() lose a lot in translation.
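
To be concrete about the modulus example, here's a small illustration (in Python only because it's compact; the point is about conventions, not any particular language) of the two common answers for "a mod b" once the arguments go negative:

# Two common conventions for "a mod b" with negative arguments.
# Python's own % floors the quotient (like APL's residue); C-family
# languages truncate the quotient toward zero instead.

def mod_floored(a, b):
    # remainder takes the sign of the divisor (Python, APL)
    return a - b * (a // b)          # // floors in Python

def mod_truncated(a, b):
    # remainder takes the sign of the dividend (C, Java, and friends)
    return a - b * int(a / b)        # int() truncates toward zero

for a, b in [(7, 3), (-7, 3), (7, -3), (-7, -3)]:
    print(a, b, mod_floored(a, b), mod_truncated(a, b))

# -7 mod 3 is 2 under the floored rule but -1 under the truncated rule,
# so the identity q * b + r == a only survives when the division and the
# modulus agree on a convention, which is exactly the sort of identity
# the APL crowd bothered to write down and most later languages didn't.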

Language designers preoccupied with consistency are known as dreamers (or Hurd developers). The formula that seems to grow up to become a language people actually use is 75% utility and 25% elegance.

I guess you missed the memo that when elegance dies on the vine in infancy, it does no one any damn good.

Why are programmers reluctant to refactor code when an elegant API becomes available to replace a hastily conceived core API? Because you can rarely trust the equivalence all the way down to the last edge case, because few APIs document their identity sets: listing all the cases you'd like to be true (and a few you hadn't even considered yourself) as well as the cases you presumed should be true, without realizing that these cases fundamentally conflict with other identities that made the cut.

Programmers who don't declare their identity sets shouldn't be allowed to write APIs, because a reliable identity set is the only way the downstream programmer will dare to refactor your API out of his applications if it turns out your API sucks--as part of a mass exodus from superior documentation.
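
If it helps, here's a sketch of what "declaring your identity set" could look like in practice: a handful of algebraic promises written as executable checks rather than prose. The names here (divmod_floored, IDENTITIES, check) are hypothetical, purely for illustration; the point is that a downstream programmer could run the same checks against a candidate replacement before daring to refactor.

# Hypothetical API plus its declared identity set, expressed as code.
import itertools

def divmod_floored(a, b):
    # the API under test: floored quotient and matching remainder
    q = a // b
    return q, a - b * q

IDENTITIES = [
    # reconstruction: quotient and remainder recompose the dividend
    lambda a, b: divmod_floored(a, b)[0] * b + divmod_floored(a, b)[1] == a,
    # the remainder is strictly smaller than the divisor in magnitude
    lambda a, b: abs(divmod_floored(a, b)[1]) < abs(b),
    # the remainder is zero or takes the sign of the divisor
    lambda a, b: divmod_floored(a, b)[1] == 0
                 or (divmod_floored(a, b)[1] > 0) == (b > 0),
]

def check(identities, samples):
    # run every declared identity over every sample pair of arguments
    for ident, (a, b) in itertools.product(identities, samples):
        assert ident(a, b), (ident, a, b)

check(IDENTITIES, [(a, b) for a in range(-9, 10) for b in range(-9, 10) if b])

Not a substitute for a real property-testing tool, but it's the difference between "trust me, it's equivalent" and something you can actually run.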

Are you beginning to see the problem here?

Comment Re:An entertaining, gifted critic. That's it. (Score 3, Interesting) 198

Pennsylvania investigators concluded that Dunn was driving up to 140 miles per hour when he crashed. His blood alcohol content was .196, which is far higher than the legal limit of .08.

This behaviour displays a wanton disregard for the life and safety of those around him. Would you bite your tongue in respectful silence when Patient Zero is freshly planted?

From Snopes:

Dugas appeared to move between denial that whatever he had could be transmitted sexually ("Of course I'm going to have sex. Nobody's proven to me that you can spread cancer"), depraved indifference to his partners' wellbeing ("It's their duty to protect themselves. They know what's going on out there. They've heard about this disease"), and a desire to take others with him ("I've got gay cancer. I'm going to die and so are you").

In what way was Dunn's behaviour any better than Dugas's? Was it the first time he ever drove over the speed limit? The first time he drove bombed out of his mind? The first time he combined being twice the legal limit and driving at twice the speed limit? Somehow I doubt it.

Ebert's tweet was really aimed at the jackasses who knew about and enabled Dunn's behaviour and decided to tolerate it, not caring enough about public safety to have him arrested and jailed (which he certainly deserved), and not caring enough about Dunn himself to prevent his foreseeable death. As a former alcoholic himself, Ebert had some strong personal opinions about the behaviours of his fellow alcoholics and those around them, the same way a sex offender might be harsh in condemning another sex offender. In-group vitriol is 200 proof.

What has it achieved, this respectful biting of lips? Self-centered assholes like Dunn still put the public at risk after forty years of public awareness efforts. I would have been much happier with the outcome if Dunn had redeemed himself to "former asshole" by seeking treatment rather than killing himself.

Somehow the polite grieving process and the social institution of denial has become joined at the hip. Ebert decided to fire a cap into this unholy union before the glue dried. As a result, every time someone criticizes Ebert for his tweet intended as true, the message behind his tweet is reopened for examination. We might even be saving lives here if the message finally sinks into the public consciousness that people behaving like Dunn aren't much better than people behaving like Dugas. Or is there a subtle hierarchy on acceptable ways to expose people to mortal danger without their consent? Not for me, there isn't.

And who are we protecting by our polite silence? The people who either meekly or gutlessly enabled Dunn to continue his reckless behaviours? Well, guess what? Gutless sucks. And meek sucks, too. The respectful silence just serves to confirm in people's minds that they did the best they could, without forcing them to confront the public sentiment that it damn well wasn't good enough. The true enablers in this story? The phony friends who hung around and encouraged his outlandish behaviour because they found Dunn to be funny or entertaining, but didn't give a damn about his well being or the well being of the babies and children and parents and sisters and brothers who shared the same highways with the drunken, hard-driving Jackass.

If I had a family member who was a hard-living alcoholic and he hung out with a bunch of enabling carousers and high-functioning deadbeats who let him (or her) walk out of a pub shit-faced to hit the highway with death-wish testosterone or toxic depression, and someone of Ebert's status tweeted about it that "friends don't let friends drink and drive", my own reaction would have been an angry "Damn straight!"

Or maybe I'm wrong about myself, and in my grief over my dead family member I'd be grateful for the social courtesy of respectful condolences that abets this sorry state of affairs to perpetuate itself. Perhaps grief lobotomizes in some deep way I've never fully understood because I haven't lost a family member to an alcoholic traffic incident that could so easily have been avoided.

Good grief, what's wrong with me? I just can't seem to stop appending that phrase "could have been avoided". How will I ever manage to grieve properly when disaster strikes?

I was sadder about Ebert's passing than I've ever been over someone I've never met or personally known. He was a humanist to the core, so much so that he never coddled the toxic, without even stopping to consider the price.
