User Journal

Journal Journal: Yeah, about that ...

Okay, so there's this quote that never seems to die. It's often attributed to Morgan Freeman, although I believe it actually comes from Henry Rollins; in any case, it doesn't much matter who said it. It just gets posted and reposted as a bit of snarky wisdom. Snarky it certainly is, but wise it's not.

First, the quote: "I hate the word homophobia. It's not a phobia. You are not scared. You are an asshole." There it is. Read it, enjoy it, revel in the snark.

Now, here's what's wrong with it. First, "phobia" is widely understood to mean "aversion" as well as "fear." Spare me the etymological arguments, please. Language evolves, and this is one of the ways in which it's evolved.

Second, yes, homophobes are afraid. Pretty much any time one large group of people hates another large group of people, fear is at the root of it. They're afraid, in some ill-defined but vehement way, that if gay people are allowed to be gay the way straight people are allowed to be straight, everything will fall apart. The foundations of their world will crack. The earth itself will turn to quicksand beneath their feet. Things Will Not Be As They Have Been, And Should Always Be. In the case of male homophobes who have a particular aversion to male homosexuality, they're afraid--in the words of another meme that is both snarky and wise--that gay men will treat them the way they treat women. And they're afraid, in a startlingly large number of cases, of the way they just can't ... stop ... thinking ... about ... gay ... sex ... and ... how ... terrible ... it ... is ... can't ... stop ...

Third, and perhaps most important, homophobes themselves deny they're afraid, and run away from the word "homophobia" at every opportunity. Try it: identify a homophobe as such, and it's a good bet you'll get an invective-laced tirade about how it's not about fear but about the disgust that every decent person should feel when thinking about such acts (... can't ... stop ...) and how it is the patriotic duty of every red-blooded patriot who knows right from wrong to stand up against the Gay Agenda ... etc. This is particularly acute, again, when male homophobes who have a particular aversion to male homosexuality (sorry, I can't come up with a good acronym here) are confronted with their homophobia, because, you see, fear is for girls. And fags, who might as well be girls. Because girls are icky. Not like us big, strong, healthy, muscular men with our strong arms and bulging pecs and ... can't ... stop ... where was I? Oh, right. Fear is unmanly.

So yeah. No one hates (and fears!) the word "homophobia" more than homophobes do, and for that reason if no other, it needs to stay in the language. Never stop shaming them. Never stop reminding them what cowards they are. Know their fears and exploit them mercilessly, crush them and see them driven before you, chase them back under their rocks where they belong.

User Journal

Journal Journal: "America needs a white Republican President." 3

Opposition to Obama has nothing to do with race. Nope, nothing at all.

</sarcasm>

Okay, Republicans. Look, I believe that most of you are not racist. You oppose Obama because you disagree with his policies, not his skin color. You'd rather have a Republican President because you're Republicans, and you're Republicans because you largely agree with Republican Party policies rather than out of a sense of tribal identity (I extend you that courtesy; please do the same) and you don't care what color this hypothetical Republican President, with whom you would agree far more than you do with Obama, might be.

I believe that, not least because the alternative -- that a majority of members of a political party that represents about a third of the American electorate is actively, maliciously racist -- is too grotesque to contemplate.

But there is, at the least, a substantial minority of your party that is actively, maliciously racist, that puts its racism on display as proudly as ever did the KKK wing of the Democratic Party of old. From where I'm sitting, and where many Democrats are sitting, it looks an awful lot like this minority (I have to keep believing that) is steering the agenda of your entire party. You have to deal with these people. You have to exile them, shame them, chase them back under their rocks where they belong. We can't do it. They won't listen to us. They're your people, and that makes them your problem.

Or we can all keep going down the path we're on. Because, you know, that's working so well.

User Journal

Journal Journal: Correlation, causation, and all that.

So this cartoon has been going around my Facebook friends list ... I'm going to try to explain what's wrong with it, and I'll try to be succinct, but I don't know how good a job I'll do, so bear with me. The short and snarky version is found in my Slashdot sig line, "The correlation between ignorance of statistics and using 'correlation is not causation' as an argument is close to 1," but that's kind of unfair and certainly isn't all the discussion this subject deserves.

First of all, yes, "correlation is not causation" is strictly true. That is, they are not the same thing. If events A and B tend to occur together, this does not mean that A causes B, or that B causes A. There may be a third, unobserved event C that causes both, or the observed correlation may simply be a coincidence. Bear this in mind.

But if you observe the correlation frequently enough to establish significance, you can be reasonably sure (arbitrarily sure, depending on how many times you make the observation) that it's not coincidence. So now you're back to one of three explanations: A causes B, B causes A, or there exists some C that causes both A and B. (Two caveats: whatever the causal relationships are, they may be very indirect, proceeding through events D, E, F, and G; and the word "significance" has a very precise meaning in this context, so check with your local statistician before using it.) An easy way to check for A-causes-B vs. B-causes-A is by looking at temporal relationships. If you are already wearing your seatbelt when you get in a car crash, you are far more likely to survive than if you aren't, but you have to have made the decision to put the seatbelt on before the crash occurs--it's the fact of you wearing your seatbelt that causes you to get through the crash okay, not the fact that you get through the crash okay that causes you to have been wearing your seatbelt. Unfortunately, the temporal relationships aren't always clear, and even if you can rule out B-causes-A on this basis, it still leaves you to choose between A-causes-B and C-causes-(A,B).
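To make the C-causes-(A,B) scenario concrete, here's a minimal simulation sketch in Python (using numpy and scipy; the variable names and coefficients are my own illustration, not anything from the cartoon): A and B are each driven by a hidden C and never by each other, yet their observed correlation is strong and, with a thousand observations, wildly significant.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n = 1000

# Hidden common cause C -- think "temperature" driving both
# ice cream sales (A) and drowning deaths (B).
C = rng.normal(size=n)

# A and B each depend on C plus independent noise;
# neither one depends on the other.
A = 2.0 * C + rng.normal(scale=1.0, size=n)
B = 1.5 * C + rng.normal(scale=1.0, size=n)

r, p = pearsonr(A, B)
print(f"r = {r:.3f}, p = {p:.3g}")
# Prints a large r with a vanishingly small p-value: a real,
# significant correlation with no causal arrow between A and B.
```

Delete the C terms from those two lines and the correlation evaporates, which is exactly the point: the data alone can tell you the association is real, but mechanistic knowledge (or a controlled experiment) has to do the rest.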

An awful lot of what science does is figuring out what C is, or even if it exists at all. This is where mechanistic knowledge of the universe comes into play. Suppose that emergency departments in a particular city start seeing a whole bunch of patients with acute-onset fever and diarrhea. Shortly thereafter, EDs in nearby cities start seeing the same thing, and then the same in cities connected by air travel routes. Patient histories reveal that the diarrhea tends to start about six hours after the onset of fever. Does this mean the fever is causing the diarrhea? Probably not, because these days we know enough about the mechanisms of infectious disease to know that there are lots of pathogens that cause fever, then diarrhea. The epidemiologists' and physicians' job is then to figure out what the pathogen is, how it spreads, and hopefully how best to treat it; while they're doing that, the "correlation is not causation" fanatics will be sticking their fingers in their ears and chanting "la la la I can't hear you," and hoping desperately they don't end their days as dehydrated husks lying on a feces-soaked hospital bed.

The point here is that in most cases, correlation is all we can observe. (Some philosophers of science, a la David Hume, would argue that we never observe causation, but I'm willing to accept "cause of death: gunshot wound to head" and similar extreme cases as direct observation of causal relationships.) Not every patient exposed to the pathogen will get infected. Of those who do, not all will show symptoms. Some symptomatic patients will just get the fever, some will just get the diarrhea. Some will get them at the same time, or the diarrhea first. Medical ethics boards tend to frown on doing controlled experiments with infectious diseases on human subjects, so you have to make what inferences you can with the data you have.

Even with all these limitations, correlation--in this case between exposure and symptoms--is still a powerful tool for uncovering the causal relationships. Most of what we know about human health comes from exactly this kind of analysis, and the same is true for the observational sciences generally. Astronomy, geology, paleontology, large chunks of physics and biology ... they're all built on observations of correlation, and smart inference from those observations. So if you want to know how the universe works, don't rely on any one-liners, no matter how satisfying, to guide your understanding.

User Journal

Journal Journal: I'm not sure if Betteridge's law applies here or not.

Privacy and the Internet: Is Facebook Evil?

He's right that privacy in the modern sense is a new development--for most of human history, people lived with what we would now consider a near-total lack of privacy--but wrong, I think, to dismiss it on that basis. There are many, many modern ideas, such as democracy and equality before the law, that would have made no sense whatsoever to our ancestors; does that mean they're any less worth prizing?

Obviously I'm not particularly concerned about giving up my privacy by maintaining an online presence, else I wouldn't be posting this. But the combination of a traditional "village" level of everyone knowing everyone else's business with the speed and ubiquity of modern communications represents a third phase in humanity's development as far as privacy is concerned--the first having been the intensely linked small communities of nomads and peasants, the second having been the mass anonymity of the industrial age--and I don't think we have any idea how that's going to shake out yet.

User Journal

Journal Journal: Race is a social construct, again.

I thought it was already pretty well understood that "Celtic" is only meaningful as a linguistic grouping, but it seems the old idea of a separate "Celtic race" or "Irish race" is pretty strongly embedded, even now:

DNA shows Irish people have more complex origins than previously thought

This makes me think about wider issues. I don't know how many online discussions I've been in recently in which I've been solemnly assured that humanity is divided into three races. (Three shall be the number thou shalt count, and the number of the counting shall be three. Four shalt thou not count, neither count thou two, excepting that thou then proceed to three. Five is right out.) And people will go on believing this, even when genetic evidence makes it perfectly plain that there's no such thing as race, never has been and never will be. There are heritable phenotypes, some of which are clustered together as a result of geographical or historical accident, none of which are set in stone and almost all of which are continuous rather than discrete states. The weight we assign them is entirely cultural.

As always, Darwin puts it elegantly: "Man has been studied more carefully than any other animal, and yet there is the greatest possible diversity amongst capable judges whether he should be classed as a single species or race, or as two (Virey), as three (Jacquinot), as four (Kant), five (Blumenbach), six (Buffon), seven (Hunter), eight (Agassiz), eleven (Pickering), fifteen (Bory St. Vincent), sixteen (Desmoulins), twenty-two (Morton), sixty (Crawfurd), or as sixty-three, according to Burke. This diversity of judgment does not prove that the races ought not to be ranked as species, but it shews that they graduate into each other, and that it is hardly possible to discover clear distinctive characters between them."

User Journal

Journal Journal: Because I clearly need to do this more often

Dear Internet:

Some aspects of your style of argumentation have recently caused me some concern, and I thought it would be best to address them now, before they get out of hand.

If I insult you, I am not necessarily using an "ad hominem" argument. This phrase (literally, "to the man") refers to a specific logical fallacy, that of assuming that when someone you dislike or consider beneath you makes an argument, it follows that the argument is wrong. "You're a moron, so I don't have to listen to anything you say" is an example. "Only a complete idiot would say what you just said, so you must be only slightly smarter than the average flatworm" is not.

In fact, it's probably best to stay away from Philosophy 101 lists of common logical fallacies altogether. Just as not all insults are ad hominems, not all citations of experts are "arguments from authority." Not all "slippery slope" scenarios are fallacious. And for the sake of all you hold holy, if you don't understand in gory mathematical detail what correlation and causality actually mean, and the different uses of the verb "to imply" in different contexts, please stay away from any version of A Maxim I Will Not Utter Here, But Which You Can Probably Guess.

All that being said, there is one fallacy to which you fall prey on an alarmingly regular basis. If you disagree with what I say, you have the right--in some cases, the duty--to voice your disagreement. Free speech is a wonderful thing, and it is easier to exercise in the modern world than it has ever been before. By all means, speak up.

However, please make sure that when you're voicing your disagreement, you are disagreeing with what I said. Replying instead to what someone else said, or to what you think I'm "actually" saying, or to what you think I or someone else might say in the future, is an example of the "straw man" fallacy, and although I have not performed the analysis necessary to test this hypothesis, I strongly suspect that this poor overworked scarecrow is to be found in greater numbers in online discussions than any other type of fallacy ... which, now that I think about it, is a pretty impressive accomplishment.

Thanks for your attention to these matters. Hopefully now that I've explained the error of your ways, we can move on from here and enjoy friendly, well-reasoned discourse on a wide variety of topics.

Sincerely,
That Guy Who's Always Right On The Internet

The Military

Journal Journal: It's a bug hunt.

I've said before that the expectations my generation of GIs had when we raised our right hands were shaped by two main forces: the flood of Vietnam movies that came out when we were in high school, and Aliens.* Given that the people who are now running things at the Pentagon went in about the same time I did (!), I can't help but think that the latter had a lot to do with the fairly smooth acceptance of women into combat positions over the last few years.

So it's just bizarre to me that a quarter of a century later, with a solid history of women fighting in Iraq and Afghanistan, this kind of thing is still happening. And because video games will probably have just as much to do with shaping the current generation of recruits as movies did with previous generations, it's going to be a problem on the battlefield as well as the web.

*Whether it's a good thing or not that recruits report to basic training with their expectations shaped by popular entertainment is a separate issue. Just accept that they do.

FEAR OF A WOMAN WARRIOR -- The development of Aliens: Colonial Marines and comments from Epic Games' art director reveal a troubling attitude about strong women in games among some major developers.

Media

Journal Journal: Because everyone knows Idiocracy was a documentary.

Show. Me. The. Data.

Dumb and Dumber: Study Says Humans Are Slowly Losing Their Smarts

The actual Trends in Genetics articles (paywalled, unfortunately; I urge anyone with access to read them) make it clear that this is not a "study" in any meaningful sense of the word, but rather a bit of unfounded speculation. Now, speculation is an important early step in the process of science, to be sure--but that speculation should be founded on observation, and the author offers none.

Unfortunately, this particular bit of speculation is (as I strongly suspect he knew it would be) a crowd-pleaser, playing as it does into the lost-golden-age mythology which has such universal appeal across all ages and cultures. (I speculate that it's something hardwired into the human brain, but I freely admit that I have no data to support this hypothesis, other than the observation that such a mythology exists.) An actual study would be a worthy project. This kind of sensationalism is just sad.

Original articles:
http://www.sciencedirect.com/science/article/pii/S0168952512001588
http://www.sciencedirect.com/science/article/pii/S016895251200159X

A very insightful critique:
http://www.sciencedirect.com/science/article/pii/S0168952512001941

Author's response to critique, which consists largely of saying "Nuh-uh!":
http://www.sciencedirect.com/science/article/pii/S0168952512002090

User Journal

Journal Journal: I want to go on record saying this now:

It's time to get rid of the Electoral College.

Based on the results of state vs. national polls, it's looking increasingly likely that Obama may lose the national popular vote but win in the EC. As a nakedly partisan Democrat, would I be pleased with this outcome? Well, I'd be happier about it than I was when Bush lost the popular vote but managed to finagle an EC win, obviously ... but "happier" does not equate in this case to "happy" by any means. Because having someone against whom the majority of Americans vote become (or remain) President should simply never, ever happen.

The EC hasn't served its ostensible purpose, to protect the interests of smaller states against domination by larger ones, for generations, if ever. All it does is focus an unwarranted amount of attention on a few "swing states" every four years, with the effect that the interests of the residents of states that don't fall into this category get no representation at all at the Presidential level. If you live in Texas or California, you might as well not vote at all in the Presidential election; same if you live in Wyoming or Vermont. And that really sucks.

Even "swing states" don't really matter all that much, most of the time, if they're sparsely populated. New Mexico was just as close in 2000 as Florida was, but nobody cared how it went, because whoever got Florida was going to get the White House. (Gore won NM by some incredibly narrow margin; if you'd forgotten that detail, I don't blame you.) What was that about small states, again? Yeah, that's what I thought.

Get rid of the damned thing. This isn't partisanship. It's an acknowledgement of reality.

User Journal

Journal Journal: The die is cast; the Rubicon is crossed.

I just finished submitting revisions on The Paper. Not, you understand, revisions in response to reviewers' comments--we haven't received those yet--but rather revisions made necessary by my discovery, well after submission, of a bug in the code. Fortunately it didn't substantially affect the main results or the conclusions, but it did require revising some of the numbers.

I've never had to do anything like this before, and sincerely hope I never do again. It was a stupid bug, the kind of mistake that anyone can make coding at 2:00 AM on too much caffeine and way too little sleep, and I should damn well have caught it before sending out a paper which will pretty much define my research career to date.

But I'm glad it's done. Because while everyone makes mistakes, and indeed those mistakes are part of the process of science, you have to be honest about them. If you're not honest, then what you're doing isn't science, it's something else (say, politics or religion). There is no capital-T Truth in science, but there is truth, and we must always tell that truth as best we can.

User Journal

Journal Journal: I'm happy about Curiosity. I really am.

But here's the thing. When I was born, my father was working for NASA on the Apollo program. You know, "the Eagle has landed", "one small step," all that. He was one of the (many, many) people who made that happen. He was there, as "there" as it's possible to be without feeling Lunar soil under one's own boots.

When we moved to Denver a couple of years later, he worked for what was then Martin Marietta, on the Viking project among other things. IIRC, he also worked on the early design process for the Shuttle. At that time it was supposed to be fully reusable, the "big bird little bird" idea that was supposed to make flying into space not a whole lot more complicated than flying across the country.

So I grew up in a house full of space stuff. Giant glossy PR posters, mostly, including one incredibly detailed one about the Apollo missions that covered everything from orbital routes to spacesuit design; also unique memorabilia given only to those who actually worked on the Moon landing, prospectus-type brochures from Martin detailing the kind of stuff they seriously expected to be building within a few years, and--of course--Star Trek stuff. Because that was where we were going, sooner or later. That was the goal.

I grew up with this, waiting each year for it to happen, to start moving forward again. Apollo-Soyuz and Skylab were ... well, they were still something. And surely our retreat from the Moon was temporary, a retrenchment, perhaps an opportunity to do it right the next time by laying the groundwork with a permanent Earth-orbital station that would serve as a dock and transfer point for space-only shuttles between Earth and other destinations. But we weren't going to just give up. Surely not that.

Except we did. Every year, we dropped our expectations a little lower. Even our mass media science fiction reflected the change: from Star Trek and 2001, to Star Wars and Battlestar Galactica. From believable visions of a future that we could really build, to heroic fantasy with a technological gloss.

It wasn't until some time in the late 80s, I think, that I finally accepted it wasn't going to happen. We were not, in my adulthood and probably in my entire life, going to be a truly spacefaring species. We could be by now, you know. We could be living on the Moon and Mars, mining the asteroid belt, colonizing Europa and Titan and maybe figuring out, once and for all, if there are any loopholes in our current understanding of physics that might put the stars within reach. And all the work done by Spirit and Opportunity, and that will be done by Curiosity, could be done in a week by a couple of grad students from Areopolis U.

So you'll understand, I hope, if my happiness at seeing Curiosity's success is a little bittersweet. Not because it's not good and satisfying and important, because it is. It's just not enough.

User Journal

Journal Journal: Your terrifying inability to understand how the world actually works.

Morford is guilty here of a sin that might be called metaphoricalism--assuming that because he himself often speaks metaphorically, people who insist on literalism must be fools, ignorami, and/or members of a tiny lunatic fringe.

Yes, of course the ability to interpret metaphor is an important characteristic of the intelligent, educated mind. But most of the time, most people mean exactly what they say, and it's a grave mistake to assume otherwise. He really goes off the rails when he insists that mythology must be interpreted in metaphorical terms. There is no reason to believe--no evidence whatsoever--that the people who originally told the stories of Eve, Paris, or the risen Christ thought they were speaking anything other than literal truth; nor were the monsters lurking in the darkness beyond the campfire anything other than our ancestors' attempts to rationalize (not symbolize) the nasty, brutish, and short nature of life throughout most of human history. A metaphorical interpretation of these myths is more reasonable than a literal one, to be sure. It is also, historically and to a large degree in the modern age, a distinctly minority view.

Your terrifying lack of imagination

(Also: "Science is just mysticism disguised as mathematics," says the guy on the internet.)

The Military

Journal Journal: The Supreme Court strikes down the Stolen Valor Act

Kind of lost in the shuffle over the health care ruling (my opinion, FWIW, is that it's a lousy law, but clearly the best we're going to get in the current political climate, so all in all I'm glad it was upheld; perhaps in another couple of decades, we'll be ready to try again) is this piece of news about another Supreme Court ruling: the court voted 6-3 to strike down the Stolen Valor Act.

I admit to mixed feelings about this. It was clearly the right decision -- any law that limits free speech is prima facie a bad law, and the government's argument that it only restricts "false statements (that) have no value and hence no 1st Amendment protection," to quote the LA Times story, is chilling. We cannot outlaw people telling lies. OTOH, there are a hell of a lot of people using lies about their claimed service for personal advantage (up to and including a certain former President) and this is not only disgusting, it's often outright fraud. The SVA was an exceedingly blunt instrument for a problem that called for a scalpel. I guess the solution I'd like to see is the use of existing criminal fraud statutes for cases where it could be shown that the liar is not just telling stories to impress his buddies at the bar, but actually deriving financial or other measurable gain. Oh yeah, also court-martial for deserters (preceded, where necessary, by other measures such as, oh, say, impeachment, for those whose position places them beyond the usual corrective measures.)

I blame Hollywood, really. At this point they've probably given out more Medals of Honor than have actually been awarded in the entire history of the US military. Lesser decorations have been relegated, in this mindset, to something you get just for showing up. It's not just lazy storytelling; it has a real effect on real people who earn real medals. And no, I'm not saying this should be illegal either, but it should certainly be mocked at every opportunity.

User Journal

Journal Journal: There are no moderate Republicans, part the nth

More proof, as if any were needed, that modern conservatism is completely insane.

At this point in the conversation, we're usually treated to a chorus of, "Hey, liberals say crazy things too!" And the answer to that is ... well, yeah, kind of. Which is to say, there are plenty of left-wing lunatics out there, and many of them put their lunacy on display at every opportunity.

The difference is that these left-wing lunatics do not have anywhere near the power or prominence of their right-wing counterparts. They're not hosting nationally syndicated talk shows. They're not parlaying famous last names into political careers. And they are sure as hell not running the Democratic Party, as the right-wing lunatics are clearly running the GOP.

Here's the thing, conservatives. We marginalize and trivialize our extremists. Maybe we shouldn't do that; sometimes the extremists have legitimate grievances. But it's better than what you do with yours. You celebrate and lionize them. It's not just Reagan; it's Limbaugh and Coulter and Savage and Hannity -- and yes, Boehner and Cantor and McConnell, and the current version of Romney (which may of course change next week, or an hour from now, but for now ...) We keep our lunatics locked up. You put yours in charge of the asylum.

So here's my challenge. If you are tired of liberals making hay of every crazy thing some conservative pundit or politician says, do something about it. Point and laugh at your own side's lunatics, as we do. Make us believe that common ground is possible, that you have the same ends for the country that we do even if we disagree about the means. Put your racists and fascists in the same room where we keep our communists and anarchists, and keep them decently out of public view.

Or if you're not willing to do this, understand that we have no choice but to consider you just as bad as the worst of your number, and act accordingly.

Advertising

Journal Journal: Old soldiers never die, nor stop grumbling.

Note to copywriters working for the DoD, or trying to appeal to a military audience: "soldier," "sailor," and "airman" are not proper nouns. "Marine" is a proper noun, because it happens to be part of the name of the service, United States Marine Corps. (Or, for that matter, the British Royal Marines, on which the US version was modeled.) This does not mean that Marines are any more special or heroic or elite than members of the other services. (Marines, of course, will disagree, but that's part of their shtick. The rest of us just smile and nod.) It's an accident of language, no more.

Also not proper nouns: "military" and "veteran." Capitalizing any of these words, when they do not appear at the beginning of a sentence, does not emphasize how Special and Heroic and Elite our Brave Fighting Men And Women are for Making Sacrifices to Defend Our Freedom. It just makes you look illiterate. Now, you may not particularly care about literacy -- you're in the advertising business, after all -- but by God and the Constitution, I fought specially and heroically and elitely to defend your right to speak freely, not to sound like a moron doing so!

Thank You, and Have A Nice Day.
