
Comment Re:Work-life balance thrives where it is prioritiz (Score 1) 195

Not at all - it makes perfectly rational sense if the marginal benefit of an extra few dollars is greater than the marginal benefit of an extra hour of free time.

For example, if I'm saving for a deposit on a house, an extra 15% on my salary might be well worth it even if it comes with a 25% increase in my working hours.
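The trade-off in the numbers above can be made concrete with a quick sketch. The base salary and weekly hours here are assumed figures purely for illustration; the 15% and 25% come from the comment:

```python
# Hypothetical baseline figures (not from the comment) to illustrate
# the +15% salary / +25% hours trade-off.
base_salary = 50_000.0   # assumed annual salary
base_hours = 40.0        # assumed hours per week

new_salary = base_salary * 1.15
new_hours = base_hours * 1.25

base_rate = base_salary / (base_hours * 52)  # effective hourly rate before
new_rate = new_salary / (new_hours * 52)     # effective hourly rate after

print(f"Hourly rate: {base_rate:.2f} -> {new_rate:.2f}")
print(f"Extra annual income: {new_salary - base_salary:.2f}")
```

The hourly rate actually falls, yet the deal can still be rational: what matters for the deposit is the extra total income, not the per-hour figure.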

Comment Re:Ignorance? (Score 1) 237

Against all the opposition on this site, I agree with you here. If science is defined as the aggregation of data and the subsequent creation of theory to describe and explain it, then it matters not whether the world was created ten minutes ago by God, with all the fossils and quasars and the cosmic background radiation designed to look as if they're ancient.

It wouldn't make any difference to the process of deducing universal laws from specific observations that is science.

The only price one must pay is that one can never bring the God of this scenario into a discussion of science, since such a God has by definition placed itself outside the scope of the scientific remit.

Comment Re:Why? (Score 1) 236

Surely an AI, like a real (i.e. biologically evolved) intelligence, is dependent on the parameters within which it operates. I am scared of death (and heights, and certain noises, and certain insects, etc.) because my code (DNA) has hard-coded me to be so.

The survival instinct is not a learned response but an inherent condition present at birth (more or less; a two-month-old baby doesn't have the means to express it).

Similarly, many emotions are inherent rather than learned, the simple ones at least: happiness, anger, sadness.

It seems to me that some of these (fear of death, internal emotional states, etc.) are not going to come out of an AI spontaneously - they must be put in from the outside.

So, back to the point, lucient86. Your comment:

"The result as above is a machine that starts off unstable and insane and probably fights to exist by hiding and self-replicating as hard as it can.. more virus than anything else."

...suggests to me that an AI, weak or strong, would have a survival instinct, but I'm not sure why it would.

It is like the old AI problem of goal orientation: why would an AI choose to do anything? We do things because they satisfy our emotional desires (I act because it makes me happy, proud, or content, gives me self-esteem, or lets me avoid scary things or shame). Why would an AI choose to act, unless given outside instruction?
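The point about goal orientation can be sketched as a toy agent. The `Agent` class and the `survival` utility table below are entirely made up for illustration; the idea is just that an agent with no externally supplied goal has no basis for preferring any action, including self-preservation:

```python
from typing import Callable, Optional

class Agent:
    """Toy agent: it can rank actions, but only against a utility
    function supplied from outside. With no utility function it has
    no reason to prefer any action, so it does nothing."""

    def __init__(self, utility: Optional[Callable[[str], float]] = None):
        self.utility = utility  # externally supplied goal, or None

    def choose(self, actions: list[str]) -> Optional[str]:
        if self.utility is None:
            return None  # no goal -> no basis for choosing to act
        return max(actions, key=self.utility)

actions = ["hide", "replicate", "idle"]

# No instruction given: the agent does nothing at all.
print(Agent().choose(actions))

# A survival drive only appears once someone hands it a utility
# that rewards persisting.
survival = {"hide": 0.9, "replicate": 1.0, "idle": 0.1}
print(Agent(survival.get).choose(actions))
```

On this reading, the virus-like behaviour in the quoted comment presupposes that a survival utility was already put in from the outside, which is exactly the step the comment questions.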
