
Comment Re:Still a long, LONG way to go... (Score 5, Informative) 220

Mod parent up. The linked article (and the MIT press release) are misleading. The closest thing I can find to a peer-reviewed publication by Poon is an abstract, here (no, I can't find anything through the official EMBC channels--what a disgustingly closed conference):

https://embs.papercept.net/conferences/scripts/abstract.pl?ConfID=14&Number=2328

And there's some background on Poon's goals here:

http://www.frontiersin.org/Journal/FullText.aspx?ART_DOI=10.3389/fnins.2011.00108&name=neuromorphic_engineering

As far as I can tell, the goals are to test specific theories about information propagation across synapses, and to study brain-computer interfaces. They never mention building a model of the entire visual system, or any serious artificial intelligence. We have only the vaguest theories about how the visual system works beyond V1, and essentially no idea which properties of the synapse are important to make it happen.

About two years ago, while I was still doing my undergraduate research in neural modeling, the particular theory they're talking about--spike-timing dependent plasticity (STDP)--was quite controversial. It might simply have been an artifact of the way the NMDA receptor works. Nobody seemed to have a cohesive theory for why it would lead to intelligence or learning, beyond vague references to the well-established Hebb rule.
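For those unfamiliar with it, the basic pairwise STDP rule is easy to state. Here's a minimal sketch in Python; the constants and function names are illustrative, not taken from Poon's work or any particular study:

```python
import numpy as np

# Pairwise spike-timing dependent plasticity (STDP): if the presynaptic
# spike precedes the postsynaptic spike (dt > 0), the synapse is
# potentiated; if it follows (dt < 0), it is depressed. The magnitude
# of the change decays exponentially with the timing difference.
A_PLUS, A_MINUS = 0.01, 0.012      # illustrative learning rates
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # illustrative time constants (ms)

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair, dt_ms = t_post - t_pre."""
    if dt_ms > 0:
        return A_PLUS * np.exp(-dt_ms / TAU_PLUS)   # pre before post: strengthen
    return -A_MINUS * np.exp(dt_ms / TAU_MINUS)     # post before pre: weaken

print(stdp_dw(5.0))    # ~ +0.0078
print(stdp_dw(-5.0))   # ~ -0.0093
```

Note how little machinery is involved: it's a purely local rule about spike timing, which is exactly why it's plausible to bake into silicon--and exactly why it says nothing, by itself, about intelligence.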

Nor is it anything new. Remember this story from ages ago? Remember how well that delivered on its promise of creating a real brain? That was spike-timing dependent plasticity as well, and unsurprisingly it never produced anything resembling thought.

Slashdot, can we please stop posting stories about people trying to make brains on chips and post stories about real AI research?

Comment Author does not understand information theory... (Score 1) 622

The crux of Krugman's argument seems to be the extraordinarily misleading statement that "A world awash in information is one in which information has very little market value." Krugman has obviously never studied information theory. Yes, our world is 'awash' with information, but that's not because machines are especially good at producing it. Machines are only good at copying it.

Krugman's error stems from his conflation of two definitions of information. By one definition--the raw number of bits the human race has managed to store on hard drives--the amount of information we produce has been increasing exponentially. But this is not useful information, and not the kind of information that requires any serious education to produce. The other definition comes from information theory, where information is defined in terms of randomness: here, information is the number of bits you need to convey a signal in its most compressed form (i.e., the 'random' component of the signal that can't be derived from the rest of it). By this definition, copying the 100 MB file 'a.mp4' from my desktop to my home folder does not produce 100 MB of information; it produces at most 64 bytes of information, since that's roughly the number of bytes it takes to describe the new state of the world.
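You can see the distinction concretely with a compressor, which is a decent practical proxy for information content. A quick sketch (the exact byte counts will vary with your zlib version, and the "copied file" is simulated as pure redundancy):

```python
import os
import zlib

# Copied data contains almost no *new* information: a compressor can
# describe "the same byte, repeated" very cheaply. Random bytes, by
# contrast, are incompressible--every byte is new information.
copied = b"x" * 100_000_000          # stand-in for a 100 MB copied file
random_data = os.urandom(1_000_000)  # 1 MB of genuinely random bytes

print(len(zlib.compress(copied)))       # ~100 KB: about 0.1% of the original
print(len(zlib.compress(random_data)))  # ~1 MB: essentially incompressible
```

The hard drives of the world are overwhelmingly full of the first kind of data, and Krugman's 'awash in information' claim only holds for that kind.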

As for the rest of the article, Krugman argues (correctly, I believe) that any job which requires the production of information will remain strictly in the domain of human beings. However, he seems to forget that most physical goods are just copies of other physical goods, and therefore contain very little information. The production of those goods can generally be replaced by machines.

However, there is still some insight in what Krugman says, though you have to think a bit to find it. Krugman is actually arguing that an education is only valuable if it teaches you how to produce information, and that an education which only teaches you to parrot facts makes you very much like a computer--and very much replaceable by one. That's why he needed lawyers for his example. I don't think we computer scientists have much to worry about from this argument.

Comment Re:one step closer to drive thru degrees (Score 2, Informative) 371

If you want statistics on Harvard, here they are:

http://www.gradeinflation.com/Harvard.html

The rest of gradeinflation.com has much more information you may find interesting.

"The reason for this is that the more students they fail, the better they look."

This is also incorrect. Far more important to a school's rankings are (a) the percentage of admitted students who accept the admissions offer, and (b) the number of students who get job offers after graduating. This incentivizes schools to lower failure rates (US News and World Report rolls graduation rates into its rankings precisely because a low rate turns off most prospective students), and also to inflate grades so their students' resumes look better.

Comment CPUs and GPUs have different goals (Score 5, Interesting) 129

At least as far as parallel computing goes. CPUs have been designed for decades to handle sequential problems, where each new computation is likely to depend on the results of recent computations. GPUs, on the other hand, are designed for situations where most of the operations happen on huge vectors of data; the reason they work well isn't really that they have many cores, but that the work of splitting up the data and distributing it to those cores is (supposedly) done in hardware. On a CPU, the programmer has to split up the data, and giving the programmer control over that process makes many hardware optimizations impossible.
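The two workload shapes look like this in miniature (numpy standing in for the data-parallel hardware; this is just an illustration, not code from TFA):

```python
import numpy as np

data = np.random.rand(1_000_000)

# Sequential workload: each step depends on the previous result, so
# extra cores don't help. This is the shape CPUs are optimized for
# (branch prediction, large caches, out-of-order execution).
acc = 0.0
for x in data:
    acc = 0.5 * acc + x   # dependency chain: step i needs step i-1

# Data-parallel workload: the same operation applied independently to
# every element, free to be split across as many lanes/cores as exist.
result = np.sqrt(data) * 2.0 + 1.0
```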

The surprising thing in TFA is that Intel is claiming to have done almost as well on a problem that NVIDIA used to tout their GPUs. It really makes me wonder which problem it was. The claim that "performance on both CPUs and GPUs is limited by memory bandwidth" seems particularly suspect, since on a good GPU the memory accesses should be parallelized.
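For what it's worth, 'limited by memory bandwidth' is easy to sanity-check with back-of-the-envelope arithmetic. For a streaming kernel like SAXPY (y = a*x + y), the achievable throughput is just bandwidth times arithmetic intensity, regardless of core count. The bandwidth figures below are invented for illustration, not taken from TFA:

```python
# Single-precision SAXPY moves 12 bytes per element (read x, read y,
# write y) and performs 2 flops (one multiply, one add), so a purely
# bandwidth-bound ceiling is: bandwidth * (2 flops / 12 bytes).
BYTES_PER_ELEM = 12
FLOPS_PER_ELEM = 2

for name, bw_gb_s in [("hypothetical CPU, 30 GB/s", 30),
                      ("hypothetical GPU, 150 GB/s", 150)]:
    gflops = bw_gb_s * FLOPS_PER_ELEM / BYTES_PER_ELEM
    print("%s: at most ~%.0f GFLOP/s on SAXPY" % (name, gflops))
```

If a benchmark kernel has this streaming shape, then yes, both chips hit a bandwidth wall--but the GPU's wall is usually several times higher, which is why the 'almost as well' claim deserves scrutiny.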

It's clear that Intel wants a piece of the growing CUDA userbase, but I think it will be a while before any x86 processor can compete with a GPU on the problems that a GPU's architecture was specifically designed to address.

Comment Re:Here it is for 5c (Score 1) 844

What an awful article...this one, and the HIV one that everyone keeps citing. It starts off with a statement from the director of the institute that ran the study: "male circumcision is a scientifically proven method for reducing a man's risk of acquiring HIV infection." No real scientist would ever make this claim--science does not prove anything.

It gets worse. The way they conducted the studies (in both cases) was to start with a large group of men, circumcise half of them, and see which half comes back with more infections. There's no way to do blinding here, since you're going to know whether or not you've been circumcised. For example, one confounding factor may simply be that circumcisions hurt--maybe the circumcised group just had less sex. Unfortunately, they didn't give any evidence for a mechanism, which makes the result somewhat difficult to believe. (As an aside, the mechanism they suggest is that the foreskin helps the virus enter the cells on the surface of the penis--which suggests the effect could be prevented by simply pulling the foreskin back for a while after sex.)
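To make the confounding worry concrete, here's a toy simulation in which circumcision has no direct protective effect at all--the treated group simply has less sex during the study. Every number here is invented purely for illustration:

```python
import random

random.seed(0)

P_PER_ACT = 0.002                     # invented per-act infection risk
ACTS_CONTROL, ACTS_TREATED = 100, 70  # invented behavioral confound
N = 10_000                            # men per arm

def infections(n_men, acts):
    """Count men with at least one infection over `acts` exposures."""
    return sum(
        any(random.random() < P_PER_ACT for _ in range(acts))
        for _ in range(n_men)
    )

print("control:", infections(N, ACTS_CONTROL))  # ~1,800 infected
print("treated:", infections(N, ACTS_TREATED))  # ~1,300 infected: looks 'protective'
```

An unblinded trial can't distinguish this story from a real biological effect without measuring the behavior directly.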

Another odd thing about the studies--the Herpes/HPV study was done in Uganda, and the one on HIV was done in Kenya. Applying the results of a study to a population different from the one studied is always a problem, but it's even worse in this case, because this whole conversation started from the worry that circumcision stops people from using condoms. Kenya and Uganda are both known for disliking condoms, so any effect of circumcision on reducing condom use would have been minimized.

Comment Re:Violating the Constitution is a good reason (Score 1) 1657

I'm interested to hear how you define "lie". I think analytic philosophy has shown that it's nearly impossible to decide whether a statement is "true" or "false" in a completely black-and-white sense.

For me, a lie is any attempt to convince someone else of something that you yourself don't believe. And Bush certainly did this; he knew the intelligence wasn't nearly as damning as he wanted America to believe.
