Another possibility, stemming from a rather long, and unfortunately heated, debate I had on this topic during a philosophy and ethics discussion: as a society, we constantly strive to define what it means to be alive and human. Early definitions were broad but sufficient. With each new leap in technology, we either create something that mimics the current definition, or we discover something existing that already fits it. When that happens, we redefine ourselves. Today, our definitions avoid "flesh and bones" criteria, since science long ago showed that these are far from what makes you who you are. Instead, we cling to less tangible things: thought, reason, and emotion. Now even those refuges are being invaded by increasingly cunning programmers and robotics experts. When the machines look like us, think like us, and feel like us, what is it that really separates them from us? Morally and ethically, can we turn them off? That's a line in the sand that few are willing to cross.

Currently, robots have become our modern slave labor: the perfect worker that never complains, never asks for vacation, and will gladly work itself clear to 'death' if you ask it to. The idea of these machines becoming 'intelligent' enough to consider what they are being asked to do, and possibly refuse, is unsettling to most.