
Comment Re:What an idiot (Score 2) 137

Actually it's worse... or rather stupider. He offered to fix it (which really just involves filling out and submitting a web form) if the school settled a lawsuit for $200,000.

Now let's assume this guy is totally in the right as far as the claims in his lawsuit are concerned. That still doesn't give him the right to hold his employer's systems hostage until he gets what he wants. Those systems belong to them.

What was he thinking? Of course the courts are going to order him to hand over the metaphorical keys to the system. And the judge isn't likely to be sympathetic after this. On top of that, any future prospective employer is going to find out about this the instant they google "Triano Williams".

Based on the levels of stupidity and assholery displayed here, I'd be amazed if he weren't in the wrong.

Comment Re:Schitzophrenic Labor Dept. (Score 1) 186

Make up your mind. Which is it? They are paying more to white males or discriminating against white males by preferring Asian workers?

Oracle has not been complying with requests for information, so the Labor Department is suing them to get that information during discovery. They are probably suing over multiple offenses so they can obtain a broader set of information in the process. At least that is my guess based on the information in the article.

Comment Re:Politically driven pseudo-science garbage (Score 1) 187

Solar output in fact has decreased since the early 60s.

Also, according to the Milankovitch cycles we should be in the middle of a cooling period, although the actual effect is quite complex (e.g. it makes a difference whether perihelion occurs in the austral or boreal summer). So it is also possible that we might be in for slight warming over the next twenty thousand years. But even if we were in for dramatic warming due to orbital forcing, that would be on the order of 0.1C/century, much lower than the changes we've observed.

You left out volcanoes, which are a natural source of CO2 (as well as cooling particulates).

If you add up all the known sources of natural climate variation you end up with no warming trend since 1900 (source).

Comment Re:Where are the error bars? (Score 3, Informative) 187

What Geoffrey said. It's easy enough to pull the instrumental record global average data into a spreadsheet and plot it; I've done it several times myself.

Also be aware of what error bars can and cannot tell you. You can't judge the statistical significance of trends just by comparing adjacent years with error bars; that's the wrong statistical test for decades-long trends. You might never see a single year that is statistically significantly warmer than a prior year at some level of confidence, yet still have a trend over a decade or more that hits that confidence level.

Comment Re:Human brain is NOT a computer (Score 2) 142

"The brain is just a computer" is hand-wavy and objectively incorrect.

No, it is not. In the context of the rest of the paragraph it is obvious he doesn't mean a digital computer with an x86 instruction set. He was referring to the brain as a machine which accepts inputs, processes them based on its current physical state, and produces various outputs to initiate action throughout the body. He mentioned Substance Dualism at the beginning of the paragraph containing the line you are quoting precisely to clear up any confusion about what he meant by calling the brain a computer.

I will grant you he is using a definition of computer not found in the common English lexicon, but since we are talking about potential future human-level AI, it's likely the common definition of computer will have expanded by then to include biological computers, not just silicon.

Comment People have a crude form of telepathy. (Score 1) 127

Not actual radio-like telepathy like in sci-fi stories, but an inbuilt capacity to actually experience what our brains think other people are experiencing.

One of the classic experiments along these lines is to get a subject wearing goggles to identify with a mannequin. Of course this is artificially induced; we didn't evolve in a world with 3D goggles and cameras. But there is a condition called "mirror-touch synesthesia" in which this occurs naturally: people spontaneously experience what someone else is experiencing.

The parallel element I see is that the brain somehow generates a sensation without the corresponding physical input, and the phenomenon of mirror-touch synesthesia suggests to me this isn't just a curious bug in our brain architecture. The 1.6% of people who report spontaneous mirror synesthesia also score higher than the general population on measures of empathy. I suspect it may also be linked in some way to our ability to learn by copying what others do.

This is a really exciting time in neuroscience, and synesthesia seems like an interesting target for DIY brain hackers. Mirror-type synesthesia particularly so because it's easy to induce. The rubber hand illusion is probably the easiest dramatic effect to produce at home.

Comment Re:Human brain is NOT a computer (Score 3, Insightful) 142

I read through the paper, but couldn't find any description of what the brain does which couldn't be considered information processing. It may not be digital processing, and it may not resemble how computers process instructions and retrieve data, but even if the physical architecture is different the brain still seems to be processing information.

He uses an example of a dollar bill, and how a person cannot recall every detail of a dollar bill from memory if asked to draw one. And that is somehow proof that the brain does not store data about the dollar bill? The person was still able to draw some details about the dollar bill, such as a person in the center and the numbers in the corners, so that data was stored somewhere. The author also makes a silly distinction between "storing data" and "changing the brain", as if the way the brain is changed isn't how it stores the data.

But neither the song nor the poem has been ‘stored’ in it. The brain has simply changed in an orderly way that now allows us to sing the song or recite the poem under certain conditions.

Sounds a lot like the brain stored the song or poem somewhere in a way it could be retrieved later, under certain conditions. Just because the brain stores information less precisely than a computer doesn't mean it isn't storing anything. The rest of the article continues to make similarly odd claims without backing them up. The researcher starts from some very valid arguments about how many researchers rely too heavily on computer / human brain metaphors, but then makes a lot of wild statements himself without backing them up either.

Comment New senses? (Score 2) 127

Elliot Freeman, a cognitive neuroscientist at City University and the study's lead author, said: "A lot of us go around having senses that we do not even recognise."

It seems to me more like a short circuit between regions of the brain than a different sense. I wouldn't like to hear things that aren't there just because I'm seeing things. It's well known that there are substantial interactions between different regions of the brain, which is why for example we turn down the stereo while trying to find an address.

Comment Re: "developed an artificial intelligence(AI) prog (Score 1) 142

And all of those have been branches of the AI field. Since the field of AI was created, arguably at the 1956 Dartmouth workshop, it has included topics such as expert systems and statistical methods used to make decisions based on both simple and complex input. Whether you want to accept it or not, even the Pac-Man ghosts' behavior can be accurately referred to as AI.
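To make that concrete, here's a hypothetical sketch in the spirit of the Pac-Man ghosts: a couple of fixed rules (pick a target tile, greedily step toward it) produce behavior that players read as intelligent pursuit. The function names and the targeting rule are simplifications for illustration, not the actual arcade logic.

```python
# Minimal rule-based "AI": a ghost that chases the player tile by tile.

def chase_target(player_pos):
    # "Blinky"-style rule: target the player's current tile directly.
    return player_pos

def step_toward(ghost, target):
    gx, gy = ghost
    tx, ty = target
    # Move one tile along whichever axis has the larger remaining distance.
    if abs(tx - gx) >= abs(ty - gy):
        gx += (tx > gx) - (tx < gx)
    else:
        gy += (ty > gy) - (ty < gy)
    return (gx, gy)

ghost, player = (0, 0), (3, 1)
for _ in range(4):
    ghost = step_toward(ghost, chase_target(player))
print(ghost)  # the ghost reaches the player's tile: (3, 1)
```

No learning, no search, just a handful of if-statements, and yet it's been called AI since the arcade era; that's the broad sense of the term the field has always used.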

If you want a term which only refers to human level intelligence, perhaps you should use either Artificial Consciousness, Machine Consciousness, or Synthetic Consciousness. Those probably match your personal definition of AI better than the broad definition used by the scientific community.
