AI and Neuroscience are always exaggerated
The most frustrating thing about computers is that the media and the general public have greatly overestimated the capabilities of AI and neuroscience research, in no small part due to the tendency of some researchers in those fields to puff up the importance of what they've accomplished in these vast fields of the unknown. We don't "know" how the brain works -- we have some coarse models and some experimental research that seems to fit those models, but we don't even have the beginnings of research that could yield therapeutic techniques much better than electro-convulsive "therapy".
We have some pretty impressive pattern matching and learning algorithms for very specific problems, but we can't even begin to approach the way the brain teaches itself and expands its own capabilities.
Yet there is a perfectly valid argument that we don't need features like self-awareness or general-purpose learning for an AI to be useful. Just look at what some of the more complex expert systems can do compared to their human counterparts, or how Watson won at Jeopardy without having even the vaguest "understanding" of what the questions were or what the answers it gave meant.
I'd even go so far as to argue that "self awareness" isn't necessary or useful for an artificial intelligence at all. Just look at all the animal species on the planet that are self-aware, yet don't have a level of intelligence that would be considered "useful" for understanding and interacting with humans conversationally. If anything, self awareness is the "boogeyman" that has people worried about an AI that might try to take over the world. If an AI isn't aware of itself as an entity, it can hardly try to conquer anyone unless it's been programmed to do so. (How can "I" try to rule the world if there is no "I"?)