
Comment: Re:Terrible, wretched, no good science (Score 1) 637

I suspect the issue here is that you're looking at IQ as a distinct trait under direct balancing selection, whereas Cochran (and Crabtree, for that matter) look at it as a complex emergent property that is highly (primarily?) dependent on genetic load -- and that genetic load, rather than IQ (or even the quantitative traits we'd normally associate with IQ), is really what a lot of this selection is about.

I.e., the hypothesis some geneticists are now discussing is that there aren't really "IQ genes", but that a lot of the variance in IQ varies directly with genetic load. On this view, someone with a high IQ will have far fewer broken genes (LOF variants) than someone with a low IQ.
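To make the load hypothesis concrete, here's a toy simulation -- my own sketch, not anyone's published model, and every number in it is made up. There are no dedicated "IQ genes" in this model; each individual just carries some count of broken genes, and each broken gene shaves a bit off a polygenic trait:

```python
import random

random.seed(42)

# Toy model of the genetic-load hypothesis (illustrative only; all
# parameters are invented). Each person carries a random count of
# loss-of-function (LOF) variants, and each variant knocks a small
# amount off a polygenic trait like IQ.
N_PEOPLE = 5_000
N_LOCI = 200            # hypothetical loci where a LOF variant can occur
P_LOF = 0.5             # hypothetical chance a given locus is broken
IQ_AT_ZERO_LOAD = 160   # hypothetical trait value with no broken genes
COST_PER_LOF = 0.5      # hypothetical average trait cost per broken gene

population = []
for _ in range(N_PEOPLE):
    load = sum(random.random() < P_LOF for _ in range(N_LOCI))
    iq = IQ_AT_ZERO_LOAD - COST_PER_LOF * load + random.gauss(0, 5)
    population.append((load, iq))

# Split the population at the mean load: with no "IQ genes" at all,
# the low-load half still scores higher on average, because trait
# variance tracks load directly.
mean_load = sum(l for l, _ in population) / N_PEOPLE
low = [iq for l, iq in population if l < mean_load]
high = [iq for l, iq in population if l >= mean_load]
print(sum(low) / len(low) > sum(high) / len(high))  # True
```

The point of the sketch is just that "fewer broken genes, higher trait value" falls out of load alone, with no trait-specific genes anywhere in the model.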

I think Cochran et al.'s lens is better than yours in this context. There's plenty more background material at the blog I linked.

Comment: Terrible, wretched, no good science (Score 5, Interesting) 637

Greg Cochran over at West Hunter has a pretty damning critique of this paper.

Cochran's review:
In two recent papers, Gerald Crabtree says two correct things. He says that the brain is complex, depends on the correct functioning of many genes, and is thus particularly vulnerable to genetic load. Although he doesn’t use the phrase “genetic load”, probably because he’s never heard it. He goes on to say that this is not his area of expertise: truer words were never spoken!

His general argument is that selection for intelligence relaxed with the development of agriculture, and that brain function, easier to mess up than anything else, has probably been deteriorating for thousands of years. We are dumber than our ancestors, who were dumber than theirs, etc.

The first bit, about the relaxation of selection for intelligence in the Neolithic: sure. As we all know, just as soon as people domesticated emmer wheat, social workers fanned out, kept people from cheating or killing their neighbors, and made sure that fuckups wouldn’t starve to death. Riiight -- it’s all in the Epic of Gilgamesh. In the online supplement.

Why do people project a caricature of modernity back thousands of years before it came into existence? Man, he doesn’t know much about history.

Nor does he know much about biology. If he did, he’d understand that truncation selection is what makes such complex adaptations possible. If only the top 85% (in terms of genetic load) reproduce, the average loser has something like 1 std more load, so each one takes lots of deleterious mutations with him. But then, he’s probably never heard of truncation selection. I’m sure they never taught him that in school, but that’s no excuse – they never taught me, either.

If his thesis were correct, you’d expect hunter-gatherers to be smarter than people from more sophisticated civilizations, which is the crap that Jared Diamond peddles about PNG. But Crabtree says that everyone’s the same – stepping on the dick of his own argument. Of course, in reality, hunter-gatherers score low, often abysmally low, and have terrible trouble trying to fit into more complex civilizations. They do a perfect imitation of being not-smart, amply documented in the psychometric literature. Of course, he doesn’t know anything about those psychometric results.

Which reminds me of secret clearances: it used to be that having a clearance meant that you were entrusted with information that most people didn’t have. Now, it means that you can’t read Wikileaks, even though everyone else does. In much the same way, you may have the silly impression that having a Ph.D. means knowing more than regular people – but in the human sciences, the most important prerequisite is not knowing certain facts. Some kind soul should post the Index, so newbies won’t get themselves in trouble.

He doesn’t even know things that would almost support his case. Average brain size has indeed decreased over the Neolithic – but in every population, not just in farmers. He might talk about paternal age effects, and how average paternal age varies – but he doesn’t know anything about it. He ought to be thinking about the big population increase associated with agriculture, and the ensuing Fisherian acceleration – but he’s never heard of it.

He even gets the peripheral issues wrong. He talks about language as new, 50,000 years old or so – much more recent than the split between Bushmen/Pygmies and the rest of the human race. Yet they talk. He says that the X chromosome isn’t enriched for cognition and behavioral genes – but it is (by at least a factor of two), and the reference he quotes confirms it.

Selection pressures and mutation rates can vary in space and time. Intelligence could decrease – it’s not impossible. But we know that the pattern he suggests does not exist. Or, to be exact, it exists only in that neighboring world that’s full of Melanesian super-hackers, gay men whose main concern is avuncular investment, and butt-kicking pixies.


Submission: The hidden truths about calories

Raindance writes: "Scientific American recently explored how the traditional way of calculating calories has failed to keep up with science: "When the calorie was originally conceived it was in the context of human work ... chemical fire with which to get the job done, coal in the human stove. Fat, it has been estimated, has nine calories per gram, whereas carbohydrates and proteins have just four; fiber is sometimes counted separately and gets awarded a piddling two. Every box of every food you have ever bought is labeled based on these estimates; too bad then that they are so often wrong."

The NYT has some suggestions to improve and broaden our current labeling system, while others seem to feel the task of defining a healthy diet should be taken away from nutritionists and given to gut flora microbiologists. What's Slashdot's dream label?"
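The per-gram estimates quoted above (fat 9 kcal/g, carbohydrates and protein 4, fiber 2) are the Atwater factors, and a label's calorie line is essentially a multiply-and-sum over them. A minimal sketch, with made-up serving numbers (real labels also differ in how they handle fiber within total carbohydrate, which is part of the article's point about the estimates being crude):

```python
# Atwater factors in kcal per gram, as quoted in the excerpt above.
ATWATER = {"fat": 9, "carbohydrate": 4, "protein": 4, "fiber": 2}

def label_calories(grams):
    """Sum kcal per macronutrient using the standard Atwater factors."""
    return sum(ATWATER[nutrient] * g for nutrient, g in grams.items())

# Made-up example serving, not from any real product.
serving = {"fat": 10, "carbohydrate": 30, "protein": 5, "fiber": 3}
print(label_calories(serving))  # 90 + 120 + 20 + 6 = 236 kcal
```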


Study Finds Growing Up With Gadgets Has a Downside: Social Skill Impairment 203

PolygamousRanchKid writes with this excerpt from a CNN story: "Tween girls who spend much of their waking hours switching frantically between YouTube, Facebook, television and text messaging are more likely to develop social problems, says a Stanford University study published in a scientific journal on Wednesday. Young girls who spend the most time multitasking between various digital devices, communicating online or watching video are the least likely to develop normal social tendencies, according to the survey of 3,461 American girls aged 8 to 12 who volunteered responses. The study only included girls who responded to a survey in Discovery Girls magazine, but results should apply to boys, too, Clifford Nass, a Stanford professor of communications who worked on the study, said in a phone interview. Boys' emotional development is more difficult to analyze because male social development varies widely and over a longer time period, he said."

Submission: How Anonymous Has Already Won: Insurance

Raindance writes: "All the back-and-forth about Anonymous may obscure the real story: insurance. "Even if Anonymous isn't behind the keyboard, so-called 'ethical hacking' is likely to increase in popularity. Given this, it'll become as common to hedge your risk from hacking as it is to hedge your risk from fire or flooding. But insurance companies aren't dumb, and it's likely that the premium on cybersecurity insurance will strongly reflect how much of a high-profile hacker target a company is. Just like it's more expensive to insure a coastal home from hurricanes, so too it'll be more expensive to insure a company popularly seen as brazenly greedy against hackers.""

Comment: Re:ahh, the "singularity"... (Score 5, Insightful) 830

PZ Myers wasn't there; he based his whole critique on gizmodo's writeup.

Speaking as someone who was there and heard Kurzweil's full speech, I can confidently say that PZ Myers does not understand Ray Kurzweil.

First off, a significant factual mistake: Kurzweil -clearly- never said we'd reverse-engineer the brain by 2020. He argued against exactly that (his prediction was the late 2020s, shading into 2030 -- perhaps also unbelievable, but if you're going to critique someone, why not get the facts right?). Sure, gizmodo's writeup was entitled "Reverse-Engineering of Human Brain Likely by 2020". It'd be an understandable attribution mistake for, say, an undergraduate.

Second, Myers is critiquing Kurzweil's ontological position based on a throwaway writeup dashed off by gizmodo. (Really, Myers? And you wonder why you're a magnet for shitstorms...)

Third, Myers' criticism is essentially that the brain is an emergent system, and we'll have to understand all the protein-protein interactions, functional attributes of proteins, etc. in order to actually model the brain.

This third assumption is arguable, but Kurzweil wasn't actually arguing against it. All Kurzweil meant with his comment about bytes and the genome was that there's an interesting information-theoretic view of how much initial data gives rise to the wonderful complexity of the brain.

I had a lot more respect for Myers before I read this rant.

Comment: Impropriety (Score 3, Insightful) 464

One has to wonder, if Blizzard goes that far above and beyond requests of law enforcement and gives mountains of data in response to polite requests -- not even subpoenas -- how seriously do they take the privacy of *your* personal information?

I'm glad the bad guy got caught, etc, but handing over the keys to the kingdom to law enforcement without a subpoena implies, in my mind, that respect for users' privacy is simply not something Blizzard considers when they go about their business. Or rather, that such information is their property, not yours.


CrunchPad Being Re-branded As JooJoo 277

adeelarshad82 writes to tell us that Fusion Garage seems to be ignoring the drama surrounding the "CrunchPad" and is planning to launch their "JooJoo" tablet this Friday at midnight. Unfortunately, the device will be a long way from the imagined $200 price point, weighing in at a hefty $499. "The JooJoo comes in black and has a capacitive touch screen, enough graphic power to deliver full high-definition video, offline capabilities, and a 4GB solid-state drive, though 'most of the storage is done in the cloud,' Rathakrishnan said. He promised 5 hours of battery life. In a demo during the webcast, the device powered on in about 10 seconds, and showed icons for web-based services like Twitter, Hulu, CNN, and Gmail, though the JooJoo will not come pre-loaded with any apps, Rathakrishnan said. Scroll through them with your finger as you would on the iPhone. In terms of the ownership drama, Rathakrishnan said that TechCrunch editor Arrington has created an 'incomplete and distorted story.'"

Comment: You're playing their game (Score 5, Interesting) 375

Given the assumption that cryogenic revival will be possible, this may work in principle-- but the insurance industry doesn't exactly function on immutable code-like rules that can be hacked for fun and profit.

It's much more a game-- and moreover, the game is owned by the insurance industry. You're just playing it. And if you figure out a particularly good trick to beat the house, they're either going to rationalize why certain technicalities mean they don't need to pay you (and thus 'easy money' becomes 'try to drag deep-pocketed defendants into court'), or they'll simply change the rules before you're revived, and you won't have been able to do anything about it because you were dead.

From a what-do-you-have-to-lose perspective, sure, it's worth a shot. But this simply can't be a dependable part of estate planning.

Comment: Re:This is important (Score 1) 536

If Modern humans and Neanderthals were so different, how likely is it that fertile offspring could have been born?

We don't currently know enough to say much about the fertility of human-neanderthal hybrids, but see, for example, ligers for fertile cross-species hybrids (and lions and tigers are separated by about twice as much time-since-divergence as humans and neanderthals, off the top of my head).

If it is not likely, could horizontal gene transfer have been a factor?

In short, no: very probably not a significant factor. HGT happens quite often between, say, bacteria; bacteria and viruses occasionally leave nonfunctional copies of themselves in host genomes (which can provide entropic fuel for evolution); very seldom, some other sorts of microorganism-host HGT can happen (e.g., how plants developed chloroplasts). But, from theory and genomic evidence, we can say pretty confidently that HGT just doesn't happen directly between, say, two mammals.

Simply put, there just isn't a viable vector (bacteria, virus, loose DNA, etc) that could move a gene from one organism into the germline of another. Something like cannibalism could -very arguably- allow some gene transfer, but it wouldn't get passed down in the germline.

Comment: This is important (Score 5, Interesting) 536

The issue of introgression (gene flow from neanderthals to modern humans) is hugely important. It's a lot more important than the curiosity or oddity the Times article makes it out to be.

All the published studies looking for this introgression have been based on neanderthal mtDNA. Since mtDNA doesn't undergo recombination, it's not a good marker, and the negative results so far are predictable and do not preclude gene flow. It'll be interesting to see Paabo's results. He's been working on getting nuclear DNA data from neanderthal remains for a while now, and perhaps this is a hint that he's found some introgression.

Why it's important:

The small picture of why it's important is that it would substantially redefine our family tree and let us refine our primate phylogeny.

The bigger, more hazy, and potentially earthshaking picture of why this could be important is that it doesn't take many viable pairings to get genes from one gene pool to another, and these genes could have been very important to our development. Modern humans and neanderthals were under many of the same environmental stresses but likely developed different adaptations to them. This includes behavior and cognition genes. As Stringer points out in the article, "in the last 10,000-15,000 years before they died out, around 30,000 years ago, Neanderthals were giving their dead complex burials and making tools and jewellery, such as pierced beads, like modern humans.” Proto-modern humans were smart. But neanderthals were also smart, potentially in different and complementary ways. And perhaps it took a combination of proto-modern human and neanderthal genes to truly make the modern human mind. Our brains could be an example of 'hybrid vigor' on a grand scale.

So the big question mark is whether, assuming we can establish gene flow, this hypothetical combination of proto-modern human and neanderthal cognitive adaptations could have led to the cultural explosion of ~30-50 thousand years ago. The biology is plausible and the timing's right. The data's still out, but it's coming in. Odder hypotheses have come true.


Neanderthals "Had Sex" With Modern Man 536

According to Professor Svante Paabo, director of genetics at the Max Planck Institute for Evolutionary Anthropology, Neanderthals and modern humans had sex across the species barrier. The professor has been using DNA retrieved from fossils to piece together the entire Neanderthal genome, and plans on publishing his findings soon. He recently told a conference that he was sure the two species had had sex, but still had questions as to how "productive" the relations had been. "What I'm really interested in is, did we have children back then and did those children contribute to our variation today?" he said. "I'm sure that they had sex, but did it give offspring that contributed to us? We will be able to answer quite rigorously with the new [Neanderthal genome] sequence." What remains a mystery is what Paleolithic brewery provided the catalyst for these stone age hook-ups.

Comment: Neat, but... (Score 4, Interesting) 43

Here are some questions I have about the chip:

- These chips/systems already exist. What's new about this MIT effort? The Computerworld article was very sparse.

- There's a great deal of bidirectional communication that goes on in normal eyes -- information not only flowing from eye to brain, but from brain to eye as well. As far as I know, this tech just discards those signals. Is this important?

- Last I heard, this sort of technology was approaching 1000 effective pixels of visual information (assuming ideal electrode placement). Has this effort from MIT pushed this boundary? How does '1000 effective pixels' compare to the eye's effective resolution? Can we put normal vision in terms of pixel resolution?

- I've read about shunting tactile senses (for instance, the nerves on a person's tongue) over to a digital videocamera. I believe the military has done a fair bit of research into this. Could this sort of approach be viable for helping the blind function as well? Could it become the preferred approach since it seems less invasive than ocular- and neuro-surgery?
