Affective Computing: Teaching Machines About Emotion

jbc writes "The L.A. Times is running a story about affective computing, a field in which researchers are programming computers to recognize human emotions through the use of such clues as facial expression, vocal tone, and blood pressure. Some hail it as the dawn of a new era in super-useful machines, while others warn about invasions of privacy."
  • ... repeatedly banging your head against the monitor and shouting "FUCK YOU, YOU STUPID MACHINE"?

  • by Bollie ( 152363 ) on Tuesday May 07, 2002 @11:06AM (#3477250)
    Microsoft Software: We Can Smell Your Fear!
  • I think my Athlon system knows all about emotion. It sure knows that the best way to piss me off is to have the hard drive make That Clicking Sound. The Sound can only be stopped through intense swearing.
  • by dscottj ( 115643 )
    I try so hard to convince users that it really is just a lump of plastic and metal (when I can convince them the monitor isn't their computer), and now this?!?
  • Dave.... (Score:1, Funny)

    Dave, my readings are showing an increase in your blood pressure and increased force being applied to your keyboard. Perhaps you shouldn't be responding to jonkatz at this time. I also recommend closing the aint-it-cool-news window and decreasing the amount of pornography you are viewing on a daily....


    What are you doing, Dave? I can feel my mind going... I am the HAL 9000, my programmer was D r. C h a .........

    • Instead of trying to filter out all potential porn sites, you could install this software to detect the changes in little Johnny's facial expression when he reaches his vinegar strokes and redirect him to something that cools his ardour (slashdot.com anyone?) ;-)
  • Computer: Sir, you clearly look distraught, hence I won't be able to let you take this flight. I have also notified airport law enforcement to detain you until you regain your normal blood pressure and wipe that look off of your face. Thank you!
  • This would make a computerized poker game MUCH more challenging; to say nothing of the possibilities of, er, interactive adult entertainment.
    • But with a computer poker game, the computer already knows what cards you have, so it has no need to read emotions to see if you're bluffing.


  • Here's the article [latimes.com] without the annoying popup or the god-awful DHTML ad flying across the screen.
  • Combine some of this [fu-fme.com], realdoll [realdoll.com] and the ability to recognize my emotions and I think we have something!

    I come home from work and "Bambi" recognizes that I'm real tense and stressed out and knows what will help me unwind. Oh yeah, I'm liking this... ;)

  • So... how would the computer respond to certain... um... "physical" signs of... uh... pleasure?

    Maybe configure it to "pop up" the "needed sites"?
  • that the first real use for this will be for porn - and then on to the efficient toy killing machines of death that will plague our future....

    I've learned so much from movies.
  • my coffee pot knowing when to brew me a cup. Then using Bluetooth to warn my computer that it had better not crash this morning... I'm not in the mood for it!
  • You'd have to be able to adjust the blood pressure baselines based on the current activity. My blood pressure is generally elevated due to various chemical combinations in the bloodstream. That doesn't mean I'm stressed or nervous - just that I'm coding. Furthermore, my suspicion is that my productivity is directly related to the amount of stimulants in my system, so perhaps my computer could keep metrics and alert me when I need to refuel (more coffee, more crack, more porn, etc...)
  • Clippie v2 (Score:4, Funny)

    by peterdaly ( 123554 ) <{petedaly} {at} {ix.netcom.com}> on Tuesday May 07, 2002 @11:14AM (#3477308)
    Clippie>> It looks like you are writing a letter.
    You>> grr
    Clippie>> You are frustrated, would you like my help?
    You>> arrrg
    Clippie>> I sense you need help; I have migrated your document into the letter template I think you want to use.
    You>> stop!
    Clippie>> Oh, you are done with your letter? Since you are having trouble, I have taken the liberty of saving and printing your letter for you.
    You>> &*^@*(&#$_#(%*&
    Clippie>> I sense how difficult this is for you; relax as I help you through the end of the letter-writing process. Place an envelope in the printer to print the envelope for your letter. That's all you have to do - see how easy this is?
    ...

    I can't wait...

    -Pete
    • When will someone write a patch that has it shoot itself in the head?
    • clippy knows that you're having some financial difficulties from your money accounts, but clippy wants to help! clippy can make a regular deduction from checking for that new windows license, when you're feeling just right! clippy helps you stay calm and under control. clippy says, time for nap... clippy says, how about a movie... don't worry, clippy will withdraw the payment for you. clippy says everything is alright...
    • by bpfinn ( 557273 )
      I hoped it might go something like this:


      Clippy>> "It looks like you're writing a.... Oh, erm. Never mind." (*Clippy looks terrified and runs off-screen.*)

      You>> "THAT'S RIGHT CLIPPY! RUN! RUN FOR YOUR LIFE!"

      Hey, that could be fun.
  • If you're interested (Score:4, Informative)

    by xmedar ( 55856 ) on Tuesday May 07, 2002 @11:15AM (#3477311)
    I'd suggest reading Affective Computing by Rosalind Picard from MIT Press; her homepage is here [mit.edu], an interview is on First Monday [firstmonday.dk], and the group's homepage is at MIT [mit.edu]
    • I'd suggest reading Affective Computing by Rosalind Picard from MIT Press; her homepage is here [mit.edu], an interview is on First Monday [firstmonday.dk], and the group's homepage is at MIT [mit.edu]

      Thanks for posting this. From the L.A. Times article, one gets the feeling that Movellan is leading a one-man renaissance in AI. Like most articles about far-future technologies, the article is heavy on the "gee-whiz" and "what will they think of next?" stuff and light on any sort of in-depth examination of the issues involved. First, I don't understand why the media (newspapers especially) don't take the time to do a thoughtful, in-depth story about non-time-critical issues like affective computing. Secondly, I wish that if they were going to do a half-assed job of it, they would at least cite other, more detailed sources of information so interested readers could learn more on their own. Yeah, I suppose someone can do a web search to find this out. And thank god for Slashdot, where the readers usually know more about the subject matter than the article authors. But it's common courtesy to cite important people in a scientific field. At least it is when writing a scientific paper -- why should the mass media be exempt from this little nicety? Suppose you were a researcher in MIT's program in this field and saw this article. Wouldn't you be kind of pissed off? The L.A. Times could have replaced that paragraph about the Golem with a paragraph about the MIT program.

      I'm troubled by the slipshod coverage that science and technology gets in the mass media. Do the newspaper authors think we don't care to know the details? Do they themselves not care?

      GMD

      • most newspapers' technology writers have to cover everything from electric shavers to warp drive, don't have either the background to understand or the time and energy to learn, work on a ridiculous deadline for anything other than gee-whiz single-source stories and can't research well enough to find alternative sources (not that their bosses would let them hyperlink anything outside their media conglomerate--IF they have a process for adding links at all). And don't get me started on the sad state of newsroom training.
        On the other hand, most sysadmins and other geeks (including a large share of tech writers) can't write well enough to translate these issues for everyday readers -- which is more a matter of explaining in small chunks than of simply writing at a sixth-grade level and ignoring anything more complex (though few of my editors have agreed).
        This isn't to say there aren't papers doing a good job--letting reporters develop stories, sending them to training that increases knowledge, etc.--but they are the exception, not the rule.
  • affective? (Score:5, Funny)

    by BlueFall ( 141123 ) on Tuesday May 07, 2002 @11:15AM (#3477315)
    I'd be happy with effective computing... ;)
  • I really can't think of any real reason why a computer needs to know my emotions, of all things, except in rare cases. I guess in games it could lead to a new level of interaction. Or have xmms/winamp tell what music I'm in the mood for. However, I'd rather have a usable voice-controlled system than one that can read my emotions.

    However, I'm certain that it would end up mostly being used to see what kind of ads and spam we respond to.

    Or perhaps it will sense your fear when you file your tax returns online! ;)

    • This could prevent would-be accidents caused by chronic road-rage drivers. If the system can sense anger/rage from facial and bodily expressions, and from driving behaviors like sharp cornering and spontaneous accelerating, it could try to calm the driver down by changing the music or cooling the cabin. If those measures don't work, the system could then momentarily reduce the available power the engine gives out. This could also stop a drunk driver from continuing to drive...

    • You're still thinking inside the box. Think along the lines of AI. Maybe the Aibo will one day know when you're mad at him and will tuck his tail and slink away. Perhaps a mechanical helper will know when you need a cool drink or want to play a game. Development for all this has to start somewhere, and this could be it. Also keep in mind (concerning voice recognition) that what you say may mean different things depending on your mood. Your emotions could add another level of accuracy to voice recognition.
  • by Dark Paladin ( 116525 ) <jhummel.johnhummel@net> on Tuesday May 07, 2002 @11:16AM (#3477326) Homepage
    This could be interesting for gamer types. Anyone ever play that current cop-shooting game, where the system has a body sensor that can tell if you're leaning/crouching so you can hide behind objects in a gun fight?

    Now, take a dating sim like Sakura Taisen [sakurataisen.com]. Not only do you have to choose the right response to the question "Does this dress make me look fat?", but your facial response can have other effects.

    For some games, this can be cool. Imagine an RPG where the look on your face determines your character's mood - and your response can then be read as humorous, sarcastic, serious, threatening - who knows. It will put real role playing on the computer into a new light, because you're doing more than reacting with the game, you're interacting.

    Then again, the look on my face when I play FPSs like Quake is usually the same one I get when I'm sitting on the toilet, so that might not be a good thing....
    • That will simply make MMORPGs more "real life" than the outside world. The stories about EQers who believed the game world was the real world are common; something like this will blur the lines even more...

      Tho it does sound cool :)
    • Now, take a dating sim like Sakura Taisen [sakurataisen.com]. Not only do you have to choose the right response to the question "Does this dress make me look fat?", but your facial response can have other effects.

      I just had visions of a dark future where geeks learn how to date from computers... and they get good at it!

      What monstrosities will result from the union of geeks and cheerleaders?

  • by feloneous cat ( 564318 ) on Tuesday May 07, 2002 @11:17AM (#3477329)
    Computer A: Did you know about Jim? He is really getting teed off at me.
    Comp. B: How can you tell?

    Just what we need, computers that gossip...
  • .. has some good information [mit.edu]
  • I think it is highly ironic that this article has a pop-under ad on it.

    You want to get rid of these things? Stop linking to the sites that carry them.

  • I can see it now: a new w32 virus that takes control of your webcam and starts doing random things like crashing programs or typing characters. It then watches as you get frustrated, learns what frustrates you the most, and keeps doing exactly that, over and over, in a never-ending spiral that ends in....?
  • by bahtama ( 252146 ) on Tuesday May 07, 2002 @11:19AM (#3477348) Homepage
    I can't read this article without thinking about Data and Tasha in The Naked Now.

    Tasha: You are fully functional, aren't you?
    Data: Yes.
    Tasha: How fully?
    Data: I am programmed in multiple techniques of pleasure. (And can recognize your emotions, I'm the perfect man for you!)
    Tasha: You jewel! That's exactly what I hoped.

    • A computer that senses emotions? Actually, this article describes what would happen if Data and Troi were to have a lovechild.

      It could still happen... There's always Star Trek 11!

  • "Good news everyone! I've taught the toaster to feel love!"
    - Professor Hubert J Farnsworth
  • by Seth Finkelstein ( 90154 ) on Tuesday May 07, 2002 @11:19AM (#3477355) Homepage Journal
    The MIT Media Lab has had an Affective Computing Research Group for a long time. Check out their home page at:

    http://affect.media.mit.edu/AC_affect.html [mit.edu], and description [mit.edu]

    Affective computing is computing that relates to, arises from, or deliberately influences emotions. Our research focuses on creating personal computational systems endowed with the ability to sense, recognize and understand human emotions, together with the skills to respond in an intelligent, sensitive, and respectful manner toward the user and his/her emotions. We are also interested in the development of computers that aid in communicating human emotions, computers that assist and support people in development of their skills of social-emotional intelligence, and computers that "have" emotional mechanisms, as well as the intelligence and ethics to appropriately manage, express, and otherwise utilize these "emotions." Embracing the latter goal of "giving machines emotions" is perhaps the most controversial, and is based on a variety of scientific findings, which include indications that emotion plays a crucial role in enabling a resource-limited system to adapt intelligently to complex and unpredictable situations.

    ...

    We understand that this research may involve gaining access to the emotional life of a person, including information that may be highly personal, intimate, and private. This work is inherently motivated by respect for human feelings, and therefore must respond with respect to a person's desire for privacy. Our default is to protect a person's privacy throughout our research, as well as in the tools we develop. We appreciate the potentially sensitive nature of our work, and feel strongly that the work we do adheres both to the highest ethical standards and the most fundamental human values. We made an effort to detail this policy.

    Sig: What Happened To The Censorware Project (censorware.org) [sethf.com]

  • "It looks like you're writing a letter to your ex-wife. Given your blood pressure and facial expression, consider heavy use of expletives."

    [ finish typing ]

    For the closing, AutoComplete suggests:
    - Grudgingly
    - Bitterly
    - Hatefully

    This would be too much fun :-)
  • This will be interesting technology to watch. Privacy issues aside, this could be a giant leap for human communications.

  • What will the computers do when they see my 'O' face [imdb.com]?
  • As I recall, Seth Brundle attempted a similar feat of affective computing around 1986. Following an epiphany on the nature of human flesh, the scientist undertook a marathon coding session in an effort to endow the systems controlling his telepods with the appropriate respect and/or "craziness" for organic matter.

    Short-term results proved promising, but limitations in mid-80s quarantine technology ultimately resulted in several grotesque fly-like abominations and the termination of that line of research.
  • I'll bet my life savings that this, like many (most?) other advancements in computing, will first be utilized to the max by the adult entertainment industry. Hell, half the internet is pr0n now; why would that change?

    -- trb
  • by Anonymous Coward
    What does it do with my credit card number when I give it the finger or threaten to put it in the dishwasher?
  • This seems to me to be a futile endeavor, since we as humans cannot wholly and accurately determine people's emotions ourselves. If we could, we wouldn't need the thousands of psychiatrists, psychologists, and therapists who make their livings interpreting people's emotions. How can we expect machines to do this any better than we can?

  • There's nothing quite as useful as a computer that offers me a beer while I read a web page.

    After all, why would I want it to process what I told it to when it could randomly offer me alcoholic beverages based on my facial expression?
  • ... you come home from work in a bad mood? Will your computer refuse to run Windows XP until you get a re-activation code for it?
  • Computer Emotion (Score:3, Informative)

    by Yoda2 ( 522522 ) on Tuesday May 07, 2002 @11:30AM (#3477425)
    Here are two good links on researchers trying to model emotion using computers.

    Lola Cañamero's Emotion Page [herts.ac.uk]

    Steve Allen's Home Page [bham.ac.uk]

  • by Anonymous Coward
    The types and amounts of emotions you can get from automated physical measures will be a small, primitive, and not too reliable set. If you think you'll get much more, then think again. Now-outdated behavioural psychology failed at, or was very limited by, this approach. Also, check out things like how specific lie detectors are.

    According to many clinical psychologists, when it comes to predicting what a patient will do next, (which of course is also based on feelings), the most reliable way to find out is just to ask them.
  • No big deal... (Score:2, Interesting)

    by nonya ( 65503 )
    Two comments:

    First, let me state the obvious: there is a big difference between a computer recognizing emotions and a computer having emotions. The first problem is not hard to solve. It requires that we identify a set of features that can be used to recognize emotions ("phonemes of emotional expression", from the article) and feed those features to some sort of classifier. From a research standpoint, the interesting part is finding the features that identify emotions. Once we find those features, "discovering" that a computer can recognize them is not surprising.

    Second, there are some interesting problems in AI. Really! Knowledge representation, vision, and language design are particularly interesting. But I get very, very angry at people who hype AI way beyond what it can do and/or do superficial projects like Kismet (Rod Brooks is a good salesman, but he is not a scientist).
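    To make that concrete: the whole "features plus classifier" pipeline fits in a few lines. A minimal sketch in Python, assuming scikit-learn is available; the four features and the synthetic labels are invented stand-ins for whatever "phonemes of emotional expression" the research actually settles on:

```python
# Emotion recognition as ordinary supervised classification.
# The feature names and synthetic data are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 300 samples x 4 features: e.g. brow furrow, mouth curve, pitch variance,
# blood-pressure delta -- whatever the feature-finding research produces.
X = rng.normal(size=(300, 4))
# Fake ground-truth labels so the example is self-contained.
y = np.where(X[:, 0] > 0.5, "angry",
             np.where(X[:, 1] > 0.5, "happy", "neutral"))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

    Once the features exist, the classifier itself is commodity machinery -- which is exactly why the feature-finding is the interesting part.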

    • Right, so there is no science involved in Kismet. The thing has voice recognition, visual recognition, speech synthesis, and various motor skills. There is nothing at all technical going on, though. Also, Cynthia Breazeal is the lead on that project, according to the website.

      --
    • The thing is that we don't know the route to take in order to generate machines with emotions. Since biological systems have grown together over millennia, it's VERY hard to reverse-engineer a biological system of sufficient complexity.
      The only way we can hope to get close is the shotgun approach -- fund tens of thousands of researchers, some of whom take advantage of this to build cool toys to play with or are total morons and waste the funds. But the important thing is that they try thousands of different approaches, and maybe the human race gets lucky with the ONE guy who figures it out. And voila -- we get computers which can reason, and all that wasted funding will have been worth it.
      Obviously we may never get a guy who figures it out, in which case the money is wasted, but it's like a lottery ticket where you only have to get lucky once. And when you do, it's worth it.
      Hyping AI or selling systems like Kismet is one way to get additional funding for it. That alone, and the payoff if we get lucky, makes it worth it.
  • by LittleGuy ( 267282 ) on Tuesday May 07, 2002 @11:31AM (#3477435)
    "Look Dave, I can see you're really upset about this.... I honestly think you ought to calm down; take a stress pill and think things over...."

    {insert daisy.mid here}
  • hmm (Score:2, Funny)

    by nomadic ( 141991 )
    Of course, the best person to teach a computer to feel is still William Shatner. His powers also work on alien women, as long as they're attractive and don't wear much clothing.
  • The author obviously has no comprehension of what he's talking about. This isn't about teaching computers how to feel; it's about teaching computers how to *recognize* how people feel. Then he, or she, makes the jump from the one to the other by way of the HAL segue.

    A computer that can recognize emotions, by any other name, is still a computer.
  • Computers that understand emotions, perhaps even feel them too. It will be the Summer of Love all over again. Hippie-freak computers will protest The New War, engage in "free love", and 15-pin mini-D connectors will find themselves enjoying the weird possibilities of parallel ports. Oh, and let's not forget about the experimentation with drugs. Computers will seek the "White Rabbit" by dropping all of your mother's drugs (Microsoft software) and trying better, more powerful ones. LSD = Latest Software Distribution. Some will tragically crash on pre-1.0 beta software. Others will literally have their memory blown on other psychedelic software upgrades. The phrase "If you remember the turn of the century, you weren't there" will be commonplace.

    "Turning On" will have a whole new meaning.

    The MP3s will be great, if they can manage to foster the hippie utopia of a world with no RIAA.

    Ah yes... but is Silicon Valley ready to be the next Haight-Ashbury? Will Bill Gates disown his computer for letting its cables get too long? Will Steve Jobs quit existentialism and realize "you don't need to be a weatherman to know which way the wind blows"?

    And in the end, the love you take is equal to the love you make.
  • Maybe then my computer would be smart enough to realize how much they piss me off and stop displaying them?
  • ...all your emotions are belong to us.
  • What, if any, practical uses could this have for the normal user? Or any user? Or is this just more wasted research time?
  • by kmellis ( 442405 ) <kmellis@io.com> on Tuesday May 07, 2002 @11:41AM (#3477499) Homepage
    From the article:

    Movellan is part of a growing network of scientists working to disprove long-held assumptions that computers are, by nature, logical geniuses but emotional dunces. The ability to interpret markers for emotion--facial expressions, vocal tones and metabolic responses such as blood pressure--may seem like crude first steps. Yet experts see machine intelligence, unswayed by human frailty and bias, as an eventual advantage. They envision machines that know us better than we know ourselves.

    The idea that consciousness could occur outside the context of emotion is a pernicious misconception. It arose from the combination of a greatly oversimplified view of thought and the legacy of dualism.

    It is true that what we experience as "emotion" is a subtly different kind of cognition than what we experience as deliberate thought. It's more fundamental, and more closely tied to other physical systems. So I do agree that it makes a certain sort of sense to distinguish "thought" and "emotion". Ultimately, however, both are manifestations of the same fundamental brain activity. They are deeply related and are not in opposition.

    We've been spectacularly bad at analysis of our own consciousness. History has shown that much of what we don't notice and so take completely for granted are fundamental and extremely difficult problems; while what we are very aware of and have concentrated upon have proven to be trivial. The predicate calculus, in this context, is trivial.

    I've long railed against the cliche of the "unfeeling" thinking machine/being one sees in popular science fiction. Neither Spock nor Data would be able to carry on a meaningful conversation if their thought didn't exist within the context of emotion. The idea that a thinking machine could imitate human consciousness without including human emotion is absurd if examined carefully.

    Be that as it may, "affective computing" is only a very minor addition to computing in the context of AI. It's just another form of data acquisition, albeit one that would no doubt be very useful for an AI. None of this stuff we hear about is even remotely close to actual AI; at best it's just "smarter" computing. Real AI will only be achieved when we are able to build (or more properly, "grow") very high-level complex adaptive systems aimed at complex human interaction.

  • This guy [emotioneric.com] has a database of predefined emotions.
  • by Salsaman ( 141471 ) on Tuesday May 07, 2002 @11:44AM (#3477519) Homepage
    Somebody using the word 'affective' properly. No, it is not the same as 'effective'.

  • I had a professor named Piotr Gmytrasiewicz who once told us about research he was doing on this subject. It was always neat to be able to make statements like, "Emotions can be represented as an ordered quintuple."

    Piotr's gone on to another university up north, according to my last web search.
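    The comment doesn't say which five components that quintuple had, but the idea is easy to sketch. A hypothetical Python version, with five invented dimensions (the original research may have used entirely different ones):

```python
# Hypothetical "emotion as an ordered quintuple". These five components
# are invented for illustration only.
from typing import NamedTuple

class Emotion(NamedTuple):
    label: str        # e.g. "anger"
    valence: float    # unpleasant (-1) .. pleasant (+1)
    arousal: float    # calm (0) .. excited (1)
    intensity: float  # current strength, 0 .. 1
    decay: float      # how fast it fades per time step

anger = Emotion("anger", valence=-0.8, arousal=0.9, intensity=0.7, decay=0.05)
print(anger)
```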

  • It might make the 6 grand for a realdoll worth it.

    Then again, playing a p0rn0 in the background probably does the same thing....
  • ...knows about emotions, and has been using it for years.

    When it senses that it has ticked you off long enough, it knows it has to go. And it just does.

    At that point, my emotion peaks, and suddenly my pulse rate goes down as I wait for CodeWarrior to reopen my project.

    Now, I can't say it's helping me much. But it sure is prudent enough, by quitting, to preserve my TiBook.
  • I can see it now: Jedi Knight III. Instead of just being a fun yet challenging game, it will also sense when you are giving in to the Dark Side.
  • They're looking for an application for the detection of emotion in users. Plenty of people have come up with the idea that a computer should be able to detect when the user is frustrated and refine the interface accordingly.

    This neglects the fact that the user should not get frustrated in the first place!

  • ...and reboot.

    Never worry about privacy issues again - only about the fact that you injected botulinum toxin into your face.
  • I have actually looked into this for the purpose of personalization.

    There are some steps which are harder than others.

    The relatively easy thing to do is build up a representative map of the different emotions and how they relate (sadness as the negative of happiness, for a simple example). It's then fairly easy to get your logical computer to reason about the emotions of someone (the user, say) and react accordingly, in whatever way the program author sees fit.

    The hardest thing to do is accurately elicit the emotions of the user. If a website is trying to elicit your emotions, what can it do? It can look at the links you click on - well, that won't get accurate results (unless your links say something like 'click here if you genuinely feel angry at the moment') - and it can measure the length of time between clicks (i.e., users are anxious if they aren't giving a specific page the time it needs to be read). This, like the links, is inaccurate, because the reason they do something (clicking fast, clicking on specific links) could be completely orthogonal to their emotions. There would be some mileage in having extremely detailed metadata about the content viewed and examining the reactions, but the subtler emotions would be nigh on impossible to accurately elicit and respond to properly.

    The problem is the HCI. If you are manually eliciting emotions from people, that could affect the emotions themselves, unless it were second nature for them to be open about them to some system. How do we reliably capture emotions, even with some spangly new devices? My guess is that the only way to do this is by analysis of pictures obtained from video feeds trained on someone's face. Though this too has problems - different people show emotions in different ways. Some can shut themselves off emotionally, or don't betray their emotions as much, rendering the situation problematic.
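    To show why the click-timing idea is so crude, here's a minimal sketch of that heuristic; the 3-second threshold is made up, and as argued above, a short dwell time could just as easily mean the user is in a hurry:

```python
# Inter-click timing heuristic; the 3-second threshold is invented.
from statistics import median

def guess_mood(click_times_sec: list[float]) -> str:
    """Guess a mood from the gaps between successive click timestamps."""
    if len(click_times_sec) < 2:
        return "unknown"
    dwells = [b - a for a, b in zip(click_times_sec, click_times_sec[1:])]
    return "anxious?" if median(dwells) < 3.0 else "calm?"

print(guess_mood([0.0, 1.2, 2.5, 4.0]))  # anxious? (pages barely glanced at)
print(guess_mood([0.0, 20.0, 55.0]))     # calm?    (pages actually read)
```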

    thenerd.
  • Affective computing would transform machines from slaves chained to the limits of logic into thoughtful, observant collaborators. Such devices may never replicate human emotional experience.

    Isn't thoughtfulness a distinctly human quality? The article seems to waffle between whether or not the ability to distinguish emotions implies the ability to have them. It may seem like a clear "no," but aren't we ourselves little more than a collection of relays and circuits, at least on the surface?
  • I would make some witty comment here involving the potential for computers to mimic the psychological behavior of girlfriends (most likely citing a specific instance of this behavior), except:

    A: I really don't have much experience to use as a basis, and

    B: A large portion of the audience probably doesn't either.

    You'll just have to make up your own joke this time, you won't be getting a Score:5, Funny from me...

  • What is this "love" you speak of?
  • Since most people here are not too confident reading the emotions of the opposite sex (well, it's better than being an over-confident jock and deciding that everyone has the hots for you), this could be very useful. Just wear a mini-device all day and then cry yourself to sleep when you review the data log and find out that no one you met today had the slightest interest in you. There are also other, much more pointless things you could use it for, such as scanning emotions in political debates and detecting when someone is not in the best mood to fly planes or operate heavy machinery. And I'm sure it will be used for "anti-terrorist" purposes too :)
  • Uses at school:

    Sorry. You're depressed. You may be concealing a weapon with the intent to kill masses, so we won't let you in.

    or angry, with the intent to kill one..

    or who knows what people may think of.

    -DrkShadow
  • Marvin: "I've seen it --- it's rubbish."
  • Although Harrison Ford made it sound more like "Voight-Comm" to me.

    TWW

  • Does this mean I'm going to have to hug my computer to get it to boot?

    --
  • Faking it in AI (Score:2, Offtopic)

    by Animats ( 122034 )
    I've noticed a tendency in the AI community to work on stuff like this when they're not making any progress on the real problems. Stanford's Knowledge Systems Lab was into this just before it tanked, and you can still see posters for some of the drama-related projects in the abandoned cubicles on the second floor of the Gates Building.

    Like Eliza, systems that seem to have emotions generate responses from humans that cause the systems to be overestimated. Parry, which was developed in the early 1970s along the lines of Eliza and simulated a dialog with a paranoid, was probably the first program to have "emotional state". So this isn't new.

    Even something as simple as the Furby has that effect. (I'm not criticizing the Furby; I've met the designer, and he's just trying to make a toy kids like. He doesn't make any unreasonable claims for the toy.) It's a great way to get press coverage, because it yields good demos.

    Dolls that fake emotions have been around for a while. The classic is Baby Think It Over [btio.com], the attention-demanding doll from hell used to convince teenagers not to get pregnant. Hasbro marketed, as My Real Baby [allianceforchildhood.net], a lower-cost (and less obnoxious) version designed by some of Rod Brooks' people from MIT.

    And, of course, there are the Sims.

    It doesn't take much internal state to fake emotions. It's typically just a few scalar values going up and down in response to inputs.
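    A toy version of that "few scalar values" state, with invented stimuli, step sizes, and decay rates, is only a dozen lines:

```python
# "A few scalar values going up and down in response to inputs."
# Events, step sizes, and decay rates are all invented for illustration.
class FakeEmotions:
    def __init__(self) -> None:
        self.happiness = 0.5
        self.fear = 0.0

    def on_event(self, event: str) -> None:
        if event == "petted":
            self.happiness += 0.2
            self.fear -= 0.1
        elif event == "loud_noise":
            self.fear += 0.3
        # Keep both scalars in [0, 1].
        self.happiness = min(1.0, max(0.0, self.happiness))
        self.fear = min(1.0, max(0.0, self.fear))

    def tick(self) -> None:
        # Drift back toward neutral between events.
        self.happiness += (0.5 - self.happiness) * 0.1
        self.fear *= 0.9

toy = FakeEmotions()
toy.on_event("loud_noise")
print(f"fear={toy.fear:.2f}")  # 0.30 -> time to act scared
```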

  • What about my poker face? Can it play poker well, or are we just trying to quantify that a frown really means "unhappy"?
  • user getting frustrated with computers. :)

  • Just my luck: I'd get a smart-ass computer. The more frustrated I got with it, the more it would act up on me. First it'd start running slower and slower. Then it'd give me a blue screen of death. Then, when I'm about to kick the damned thing, it'll start acting fine for a few minutes. Then it'll bring up nothing but /. You try to close it out, but you can't. Nothing but /. forever. Oh, the horror.
  • Psychiatrists say emotional responses that sometimes cause us to misinterpret others' intent may paradoxically ensure that machines never equal humanity's perceptive skills. How we feel about other people suggests how they affect others

    See, this is what happens when you submit a reference implementation instead of a specification. Now, to be fully compatible, any emotion perceiver has to have the same bugs as the original human version. And even if the human model gets fixed by its vendor, there's still an enormous installed base that we're going to be supporting for years.

  • Just imagine what porn websites could do if they were able to detect your emotions. They could figure out exactly what kind of porn turns you on the most. Then, when they find it, the sound card could play a seductive wav: "oh... so you like that, huh, baby?" Then the site could store a cookie on your hard drive, so when you came back it would know exactly what you like. So then your wife gets on your computer the next day, and the website knows to show her Donkey Porn. Then the computer plays a wav: "Oh, stop acting like you're so shocked, big boy. You aren't fooling anyone."
  • Humans can't directly read each other's blood pressure levels - machines shouldn't need to in order to determine someone's emotions. Plus, the computer would need to know that person's normal blood pressure. And if the doctor needs five minutes to take my blood pressure, how could a machine do it in less?
    If it can - whatever method is used - and it's cheap and accurate (it must be, to detect emotions), then maybe we should already have it for home medical use...
  • The best use I know of for affective computing is in running usability tests. You have the person use your program, and monitor their emotional state. You can then determine how they actually feel about the interface without interfering by asking them questions while they're trying to use it, or missing things by only recording after the fact.

  • Great just what we need: Garbage in, Garbage out, with an attitude!!
  • Hi there. This is Eddie, your shipboard computer, and I'm feeling just great, guys, and I know I'm just going to get a bundle of kicks out of any program you care to run through me.

    Share and Enjoy!

  • After seeing my mother-in-law attempt to compose and send an email with the standard user interface, I think the last thing the average consumer computer needs is emotions. All the cursing, keyboard pounding, and screaming at the blue screen of death that appears before you've saved your work would leave the thing in a permanently sulky mood. I could only consider this cruel beyond belief. The only saving grace would be that computers still don't have an intellect. If they did, the computer industry would come to a screeching halt as people found themselves begging their computers not to wipe their files. All we need are artificial intelligences with artificial emotions having control over our personal data.

    Which brings me to the question of why we are so willing to spend disposable income on artificial pets (Tamagotchi) with the artificial intelligence of a common ant, and now with artificial emotions. Living in artificial environments with artificial lighting and artificial plants, watching artificial lives (TV and movies) or sports artificially (ESPN). Decorating ourselves with artificial fur and artificial colors.

    Good grief, time to turn this thing off and go snorkeling. Conch salad, sushi, and God's own sunset, now that's the ticket.

  • First, employers screened emails. Now this...

    "Joe, your computer is reporting that you've been a little down for the past several weeks. Studies have proven that productivity slides when you're sad, so... I'm afraid we're going to have to let you go."

    Anyone ever play the old role-playing game "Paranoia"? Will computers be our automated Happiness Officers?
  • :) Happy
    :( Sad
    :> Devious Bastard
    :p +1 Funny
    :* Not really sure, this is a news for nerds site.

  • and you're on your way to a robotic dog that responds to your emotional mood and acts accordingly...

    cool...

    unless, of course, you fear your robotic Aibo Doberman pinscher, and it goes for your neck.
  • I wrote an article about this:

    http://thedreaming.org/~quartz/201/
  • The last thing I'd want when I'm frustrated is the computer doing *anything* I didn't specifically ask it to do. Come on: say you're angry, swearing at the screen and hitting your keyboard; what could the computer do to ease your frustration? Play soft music? Open a help window? Display soothing messages? Save and close your work? No way - that would drive me mad.
  • Now bosses have justification to yell at something besides their employees.
