Comment Re:Don't need amoebae to fly (Score 1) 455
How about this: An AI/consciousness that develops from a different "root," so to speak, but which attains super-intelligence must, by definition, be able to understand us whether or not it has direct experience of our form of consciousness. In that sense, super-intelligence must include meta-consciousness. I'm persuaded that true super-intelligence is the last thing we have to fear, because it entails profound wisdom and an understanding of the dilemma of all extant beings. Here's a new term:
A Superbuddhacomputer!
However, before an AI gets to that point, it may well present a danger, and we may never live to see the day it becomes the great compassionate server rack.
It's (not) funny to consider how stupid we are: the logical response to the serious safety hazard of hackers gaining access to automobiles whose most critical controls have wireless interfaces is simply to not build them that way. Instead we are considering legislation. WTF?
There is a serious argument to be made that if we do render ourselves extinct by creating an AI that obliterates us, then that is just the next step in the evolution of life here on planet Earth, and perhaps it would be for the better.