An anonymous reader writes: Neurobiologist Ed Boyden has an article in Technology Review discussing the potential pitfalls of creating a super-intelligent artificial intelligence without also building in some sort of motivation. Most visions of a Technological Singularity are concerned purely with creating highly intelligent machines, but Boyden worries that such a machine might quickly realize the futility of existence and "decide to play video games for the remainder of its existence". From the article: "Intelligence, as commonly defined, isn't enough to impact the world all by itself. The ability to pursue a goal doggedly against obstacles, ignoring the grimness of reality (sometimes even to the point of delusion--i.e., against intelligence), is also important. Most science-fiction stories prefer their artificial intelligences to be extremely motivated to do things--for example, enslaving or wiping out humans, if The Matrix and Terminator II have anything to say on the topic. But I find just as plausible the robot Marvin, the superintelligent machine from Douglas Adams' The Hitchhiker's Guide to the Galaxy, who used his enormous intelligence chiefly to sit around and complain, in the absence of any big goal".