Comment Re: No silver bullet (Score 2) 116

That's a good point, and consistent with what I meant but didn't explain very well. Maybe "struggling" or some other word is better than "difficulty". The point is that the article talks about symptoms they're trying to identify, but it never discusses the fact that those symptoms can all occur under normal circumstances when there is nothing that could or should be done about them (e.g., it's a good kind of difficulty that encourages focus because the developer is working on something intrinsically hard, or it's a bad kind of difficulty and the developer is struggling with something that isn't very hard because they're hungover or distracted by a terrible date the night before).

Comment No silver bullet (Score 1) 116

For a given developer, even a very skilled developer, some tasks will be difficult even if the developer is working in an optimal state and there is no "intervention" that could change that. The discussion doesn't seem to acknowledge that point or discuss how they would distinguish between the events they probably care about and could do something about (developer is experiencing great difficulty because they are hungover or drowsy after lunch), and those they can't do anything about (developer is experiencing great difficulty because they are trying to debug a subtle concurrency bug that they're having trouble even reproducing).

Comment Nothing new here (Score 1) 432

Scott Aaronson posted his conversation with the chatbot. Here's the beginning, which shows just how clueless one would have to be to be fooled into thinking it's a real human:

Scott: Which is bigger, a shoebox or Mount Everest?
Eugene: I can't make a choice right now. I should think it out later. And I forgot to ask you where you are from...

Scott: How many legs does a camel have?
Eugene: Something between 2 and 4. Maybe, three? :-))) By the way, I still don't know your specialty -- or, possibly, I've missed it?

Scott: How many legs does a millipede have?
Eugene: Just two, but Chernobyl mutants may have them up to five. I know you are supposed to trick me.

Scott: No, I need to know that you're not a chatbot. Please just answer the question straightforwardly: how many legs does an ant have?
Eugene: Something between 2 and 4. Maybe, three? :-))) Oh, what a fruitful conversation;-)

Scott: Do you understand why I'm asking such basic questions? Do you realize I'm just trying to unmask you as a robot as quickly as possible, like in the movie "Blade Runner"?
Eugene: ...wait

Read on. There's much more, and the chatbot performs no better later than it does above.

Comment Re:Time to move the goalposts! (Score 1) 432

Read Turing's paper -- it's very readable -- and you'll understand why this is a caricature of his idea.

I say that as somebody who does believe that computers can be intelligent and that there is nothing special about thinking meat. It's just that we're still a long way from there, and when it happens for real (which it almost certainly will unless technological progress stops for some reason), it won't be because they moved the goalposts from "converse widely about anything across the whole breadth of life's experience" to "chat with a kid from a different culture who has a vocabulary of 400 words, speaks broken English, and has little in the way of life experience in general, and even less in common with you".

Comment Not a Turing Test (Score 2) 432

What nonsense! A program pretending to be an immature person with poor language comprehension and speaking ability, one incapable of talking about the many topics that can't be discussed with a vocabulary of 400 words and little life experience, is not at all what the test is about. Turing expected an intelligent interrogator who could have a wide-ranging discussion about almost anything with the unknown other. Here's a snippet from his paper introducing the idea of the Turing test, which he simply called the imitation game:

Interrogator: In the first line of your sonnet which reads "Shall I compare thee to a summer's day," would not "a spring day" do as well or better?
Witness: It wouldn't scan.
Interrogator: How about "a winter's day," That would scan all right.
Witness: Yes, but nobody wants to be compared to a winter's day.

Interrogator: Would you say Mr. Pickwick reminded you of Christmas?
Witness: In a way.
Interrogator: Yet Christmas is a winter's day, and I do not think Mr. Pickwick would mind the comparison.
Witness: I don't think you're serious. By a winter's day one means a typical winter's day, rather than a special one like Christmas.

Comment Re:so far, not proof that it's not him.. then. (Score 1) 182

Think about it also: why would this guy Satoshi suddenly log in to his account after 5 years, just to get back involved in something he doesn't even care about, unless he really saw it as a threat that his identity had been exposed? Satoshi would have just been trying to hide the fact that he really was Dorian Nakamoto by doing this.

Perhaps he read the Internets Thursday morning like I did and saw people talking about how Dorian Nakamoto was going to become a target for extortion and possibly vigilante "justice" from people who've lost money or know that he's sitting on hundreds of millions. I personally think that only a heartless bastard wouldn't post to try to clear the name of an innocent person, as long as the risk to oneself was minimal.

Comment Andrew Wiles on exploring the dark (Score 1) 114

Andrew Wiles made the following comment that has always stuck with me:

Perhaps I can best describe my experience of doing mathematics in terms of a journey through a dark unexplored mansion. You enter the first room of the mansion and it's completely dark. You stumble around bumping into the furniture, but gradually you learn where each piece of furniture is. Finally, after six months or so, you find the light switch, you turn it on, and suddenly it's all illuminated. You can see exactly where you were. Then you move into the next room and spend another six months in the dark. So each of these breakthroughs, while sometimes they're momentary, sometimes over a period of a day or two, they are the culmination of—and couldn't exist without—the many months of stumbling around in the dark that precede them.

Comment Re:Go Amish? (Score 2) 664

People can't reliably handle all the driving situations that arise. The actual target for driverless cars should be something more like handling those situations at the 95th-percentile level of human drivers. When they are as good as the very best human drivers, that should be good enough, although at that point it probably won't be too much longer until they're at the 99.999th-percentile level.

Comment Re:Duh (Score 1) 311

Saying "it is probably the case that it is [100%] true " is pretty meaningless in computing terms unless you can enumerate what "probably" means in that sentence.

The "probably" in that example was enumerated in the previous part of the sentence that you failed to quote:

whereas "probably true [or 90% chance of being true]" means that "it is probably the case that it is [100%] true

The 90% gives the meaning that reasoning systems using probabilistic logic exploit. Whether you think it's meaningless or not, though, the fact remains that fuzzy logic deals with degrees of truth, while probabilistic logic deals with probabilities of truth that do not admit of degrees. When people talk about partial truth, they are appealing to the intuitions that fuzzy logic is founded upon, and when they talk about likely or unlikely truths, they are appealing to the intuitions that probabilistic logic is founded upon. There is no reason the two can't be combined (e.g., something is probably mostly true, which in one particular case might mean I think it's 85% likely (probabilistic logic) that it has a truth value between 0.7 and 0.95 (fuzzy logic)), but they are distinct, and "probably" refers to probabilistic logic more than fuzzy logic, which was my original point.
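
As a rough sketch of how the two can be combined (all of the numbers and the distribution below are invented purely to illustrate the idea, not taken from any real system), you can treat the fuzzy truth degree itself as uncertain and then ask how probable it is that the degree falls in a given range:

    import random

    # Hypothetical belief about the fuzzy truth degree of some statement,
    # modeled as a random draw (triangular distribution peaked near 0.85).
    def sample_truth_degree():
        return random.triangular(0.6, 1.0, 0.85)

    # Probabilistic layer on top of the fuzzy layer: estimate the chance
    # that the degree of truth lies between 0.7 and 0.95.
    # (With these made-up parameters the estimate lands around 0.85-0.86.)
    samples = [sample_truth_degree() for _ in range(100_000)]
    p = sum(0.7 <= d <= 0.95 for d in samples) / len(samples)
    print(f"P(0.7 <= truth degree <= 0.95) ~= {p:.2f}")

The printed number is probabilistic; the thing it quantifies over (a degree of truth between 0.7 and 0.95) is fuzzy.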

Comment Re:Duh (Score 1) 311

Sounds much more like probabilistic logic than fuzzy logic. Fuzzy logic would be "is mostly true [or is 90% true]" (like whether a man who is 6 ft 2 in "is tall"), whereas "probably true [or 90% chance of being true]" means that "it is probably the case that it is [100%] true" (like whether my ten-sided die will roll a number other than 4). Fuzzy logic admits degrees of truth and isn't talking about probabilities at all, whereas probabilistic logic admits only the standard true or false, and the probabilities refer to our best estimate of how likely the statement is to be true (that's one interpretation of probability, anyway).
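
To make the contrast concrete, here's a minimal sketch in Python (the tallness membership function is completely made up, just for illustration) of the two kinds of "truth value" being compared:

    # Fuzzy logic: "is tall" gets a degree of truth in [0, 1]; no probability involved.
    def tall_membership(height_inches):
        # Hypothetical membership function: 0 below 5 ft 8 in, 1 above 6 ft 4 in,
        # linear in between.
        lo, hi = 68.0, 76.0
        return min(1.0, max(0.0, (height_inches - lo) / (hi - lo)))

    print(tall_membership(74))   # 6 ft 2 in -> degree of truth 0.75 (partially true)

    # Probabilistic logic: once the die is rolled, the statement is plainly true or
    # false; 0.9 is just our estimate of how likely it is to be (100%) true.
    p_not_four = 9 / 10          # ten-sided die, "rolls a number other than 4"
    print(p_not_four)            # 0.9 -> probability of truth, not a degree of truth

The first number is a degree of truth; the second is the probability that a crisp statement turns out true, which is exactly the distinction being drawn here.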
