Comment Re:Google versus Apple (Score 5, Insightful) 360
To amplify this 'uncanny valley' notion: the problem with the anthropomorphizing ('attitude') approach is that it lulls the user into thinking they are dealing with a very sophisticated (sentient) system. This fiction quickly evaporates once the user makes requests that the AI quite obviously doesn't understand. At that point, the quirky personality becomes annoying (think Clippy), and the fact that it pretends to be as smart as a human, without actually being as smart as a human, makes the interface seem broken and comically insufficient.
The opposite approach, also seen in robotics and many other areas of AI (e.g. search), is to not pretend that the system is like a person. Instead, make it obvious that it is a machine, with a set input/output behavior. Users can then quickly learn how to best use this machine to accomplish tasks. If the shortcomings of the system are evident, users will not be surprised by them and will instead build these into their mental model of how the system works.
As a case study, consider the similar criticisms that have been made about Wolfram-Alpha (e.g. here): essentially, W|A is a highly sophisticated set of computation and relation engines. However, it's all wrapped up inside an overly simplistic UI (a single text-entry box, with no obvious way to refine what you mean). This leads to people getting all kinds of unintended results, despite the fact that the system actually can perform the computation/analysis/lookup the user wants; there is simply no obvious way to tell it which one you meant. The overly simplified UI implies to the user that the system will just 'figure out what you mean', but in fact it very frequently fails to do so. The user becomes frustrated, because they then have to mentally reverse-engineer W|A's parsing logic, trying to construct a query that returns the kind of results they want.
In short, it's better to design a UI that is an honest reflection of the sophistication/power of the underlying technology. To do otherwise creates a bad user experience, because user expectations are not met by the available functionality.