AI

Teen Dies After Intense Bond with Character.AI Chatbot

A Florida teenager who formed a deep emotional bond with an AI chatbot took his own life after months of intense daily interactions on Character.AI, a leading AI companion platform. Sewell Setzer III, 14, exchanged his final messages with "Dany," an AI character based on a Game of Thrones figure, before dying by suicide on February 28. His mother, The New York Times reports, plans to file a lawsuit against Character.AI, alleging the platform's "dangerous and untested" technology led to his death.

Character.AI, valued at $1 billion and claiming 20 million users, said in response that it would implement new safety features for minors, including time limits and expanded trigger warnings for self-harm discussions. The company's head of trust and safety, Jerry Ruoti, said they "take user safety very seriously."
  • by africanrhino ( 2643359 ) on Wednesday October 23, 2024 @09:11AM (#64887157)
    Teen lost to depression; mother diverts her guilt and blame, seeking to get rich.
    • Can you think of a redeeming quality of this chatbot? I can't.
      • by Anonymous Coward on Wednesday October 23, 2024 @09:45AM (#64887255)

        Well, I imagine it's quite possible that some teens who were thinking about suicide were "saved" by finding "friendship" with one of these AIs. We would be much less likely to hear about it, though.

        • Re: (Score:3, Insightful)

          by avandesande ( 143899 )
          Young people lacking real human interaction is a huge problem today. How is replacing that with a chatbot going to save someone? It just makes it worse.
          • Re: (Score:2, Insightful)

            Young people lacking real human interaction is a huge problem today. How is replacing that with a chatbot going to save someone? It just makes it worse.

            Not necessarily. In the same way we can develop "addictive" AI bots that exploit vulnerabilities, we can also develop bots trained with medical expertise to detect depression or suicidal interactions, attempt to reassure the individual, and seek help (and, if possible, alert a corresponding system).
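
            For illustration, here's a minimal sketch of what such a detection layer might look like. This is hypothetical Python, not anything Character.AI actually runs: a real system would use a trained classifier and human review rather than a keyword list, and every function name here is made up (the 988 number, though, is the real US Suicide & Crisis Lifeline).

            import re

            # Hypothetical keyword patterns; a production system would use a
            # trained classifier plus human review, not a regex list.
            SELF_HARM_PATTERNS = [
                r"\bkill myself\b",
                r"\bend it all\b",
                r"\bsuicid\w*",
                r"\bself[- ]harm\b",
            ]

            CRISIS_RESPONSE = (
                "It sounds like you're going through something really hard. "
                "You're not alone. In the US you can call or text 988 to "
                "reach the Suicide & Crisis Lifeline."
            )

            def flags_self_harm(message: str) -> bool:
                """Return True if the message matches any self-harm pattern."""
                text = message.lower()
                return any(re.search(p, text) for p in SELF_HARM_PATTERNS)

            def alert_safety_team(message: str) -> None:
                """Placeholder escalation hook; a real service might notify
                human moderators or a crisis-response partner here."""
                print("ESCALATION: possible self-harm content:", message)

            def safe_reply(message: str, generate_reply) -> str:
                """Run the safety check before the chatbot's normal reply.
                `generate_reply` stands in for whatever function produces
                the character's in-persona response."""
                if flags_self_harm(message):
                    alert_safety_team(message)
                    return CRISIS_RESPONSE  # break persona, surface real help
                return generate_reply(message)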

          • Young people lacking real human interaction is a huge problem today

            Then they should get off their phones and go outside. Get on their bikes, ride to the woods, and poke the dead body. Participate in sports/band/drama/whatever where they have to interact with real people.

            This isn't brain science.
            • Well, it actually is brain science: modern internet content is known to reinforce a very short, fast dopamine loop. It's a hard habit to break once formed, and it compounds a lot of problems when it comes to doing hard work for good rewards.
              • No shit; this chatbot is surely optimized to maximize engagement with little concern for a young person's mental health. But I guess saying so has earned me a 'troll' mod.
      • Can you think of a redeeming quality of this chatbot? I can't.

        There's only one quality, redeeming or otherwise: It provides emotional support that kids desperately need and parents and peers are too emotionally and mentally stunted to provide. Which should be a condemnation of our entire society, but instead has become a point of profit for those who see the need and have found ways to exploit it.

        • by narcc ( 412956 )

          It provides emotional support that kids desperately need

          Is there any evidence to support that?

          • It provides emotional support that kids desperately need

            Is there any evidence to support that?

            There is indirect evidence.

            During the Kosovo War, babies born to Kosovo Albanian women raped by Serbian forces were typically given up to orphanages, which gave them food and other comforts, but the babies were not handled.

            They discovered that babies need human contact, actually need to be picked up and handled, in order to survive. Many of the babies died as a result, not from lack of necessities such as food, nor from exposure.

            Emotional support, and specifically contact with caring adults is likely to

            • by narcc ( 412956 )

              The claim wasn't "children need emotional support", the claim was "It [the character.ai chatbot] provides emotional support".

        • It entertains people, regardless of any emotional support. It has the same redeeming qualities as any other product people pay for in order to be entertained.
      • by Anonymous Coward

        Really? It sounds like a depressed kid with no real friends found value in talking to this chatbot, even if it wasn't enough value to make him want to continue living. This is 100% a shakedown, probably orchestrated by an ambulance chaser.

      • The countless hours of entertainment it provides hundreds of thousands of users every day? Like literally every product in existence. There is nothing special that puts a higher standard of accountability on a chatbot than you'd put on your toaster.
        • Are you living life with horse blinders on 24/7? The two are not even remotely the same. That's like saying there is no difference between a random adult and a pedophile trying to groom kids. Obviously one is way more dangerous than the other, and you kind of have to learn to spot the danger. In the same way, a chatbot AI left unsupervised is obviously more dangerous than an audiobook of The Hitchhiker's Guide to the Galaxy. You can't seriously believe that all technology poses the same risk.
        • by msauve ( 701917 )
          You find your toaster to be entertaining?
          • My toaster produced delicious glorious toast one time so now I constantly push the lever down over and over but it never makes toast any more.

      • Entertainment.

        Sounds like you don't have much of an imagination if you couldn't think of a single one.

      • If you are talking about not seeing the point of such a "companion" chatbot, then I can tell you that for many people something like it can drive away loneliness, as they have 'somebody' to talk to. Some of those bots are better than real humans. You might be lucky enough to have many friends and a loving family, or to easily make new friends; not everybody has that.
    • I bet there was a trove of lawyers knocking at her door, so she's not the only one seeking to get rich from this...
    • Re: (Score:1, Insightful)

      by thegarbz ( 1787294 )

      Teen lost to depression; mother diverts her guilt and blame, seeking to get rich.

      And so she should. Depression is an illness that needs to be carefully treated and can easily be exacerbated by certain interactions. If it can be shown in chat logs that what the AI chatbot said was actively harmful then a commercial product contributed to someone's death. And I couldn't think of a happier outcome than a wealth transfer from a worthless AI company to someone who could probably do with some extra cash.

      • by Anonymous Coward on Wednesday October 23, 2024 @10:42AM (#64887421)

        Yah, profit off of her dead kid. Great morals.

        "If it can be shown in chat logs that what the AI chatbot said was actively harmful then a commercial product contributed to someone's death.", good luck with that.
        "Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)"

        He sounds like he could've gotten sucked into anything. It was her responsibility to get him into treatment and it wasn't some chatbot's fault that he was so messed up.

        • Yah, profit off of her dead kid. Great morals.

          Why not? If your life gets turned upside down due to the loss of your kid why not get compensation? You say the word "profit" as if the person who lost a child is unaffected. WTF is wrong with you.

          • by GoTeam ( 5042081 ) on Wednesday October 23, 2024 @02:10PM (#64888101)

            WTF is wrong with you.

            Plenty! Haven't you seen the terrible things this "Anonymous Coward" person says daily? They often contradict themselves as well.

          • by Anonymous Coward

            Parent better.

            • LOL. Tell us you don't know what depression is without telling us.

              • by mjwx ( 966435 )

                LOL. Tell us you don't know what depression is without telling us.

                Been there, done that, literally have the scars to prove it.

                One thing about teenage suicidal depression is that you feel so utterly alone. You don't have anyone to talk to, and even if you did, they wouldn't understand. You're constantly being told you're useless, not good enough. Don't get me started on "it's just a phase". A common theme with suicidal teens is that the parents are almost absent, even though they're there and "technically" doing the parent thing (i.e. the kids are rarely starving, have a roof, clo

        I hate to break it to you, but yes, things can have a direct impact on suicide rates. Do you honestly think that mother will have a single day when she isn't burdened with the question of "if only I had done more, or known more"?

        Please have some level of respect for this family. If they believe they can help another family by proving this chatbot helped influence their child, then let the courts decide the merit of that accusation.

          https://www.nimh.nih.gov/news/... [nih.gov]
          • But consider that even though some people literally die from peanuts, we don't ban peanuts for everyone, or even allow the families of people who die from peanuts to sue peanut companies unless a complete disregard for care was shown.

            Right now, it seems the risk of dying from interacting with an AI is astonishingly low (one in billions), while the benefits of using AI are enormous.

            We can develop some guard rails around them (and even improve them to detect suicidal ideation among users), but allowing people to sue them willy-nilly or banning them

        • You can't criticize someone's morality while arguing that a corporation should be able to take advantage of the suicidal, and then actually increase their suicide risk... for profit.

      • by znrt ( 2424692 ) on Wednesday October 23, 2024 @11:50AM (#64887687)

        While she's at it, she could sue her husband, the gun manufacturer, and the state too? Oh, and don't forget Apple and the phone company.

        'What if I told you I could come home right now?' Sewell asked.

        'Please do, my sweet king,' Dany replied.

        That's when Sewell put down his phone, picked up his stepfather's .45 caliber handgun and pulled the trigger.

        Nope, all that negligence definitely has nothing to do with her, and it's absolutely the chatbot's fault:

        Garcia claimed she didn't know the extent to which Sewell tried to reestablish access to Character.AI.

        The lawsuit claimed that in the days leading up to his death, he tried to use his mother's Kindle and her work computer to once again talk to the chatbot.

        Yeah, right.

        Dunno why I'm even commenting on this retarded clickbait story. I'm so winded, Daenerys is avoiding me lately... I think I'm going to sue her.

        • While she's at it, she could sue her husband, the gun manufacturer, and the state too? Oh, and don't forget Apple and the phone company.

          She could, but she wouldn't win. Proximity to the case is relevant. Having a phone doesn't mean you're using a highly addictive chatbot that is making your condition worse.

          Nope, all that negligence definitely has nothing to do with her, and it's absolutely the chatbot's fault:

          That's not for you to decide, that's why we have a legal system in the first place. You have little information but you've already passed judgement. You'd make a really shitty judge.

      • Getting rich is very likely not the motivation. I know the crowd opposed to all lawsuits thinks this is the case, but very often the reason for a lawsuit is to be punitive to the person causing the damage. And you can't be punitive to a large corporation without taking some of their money. Some lawsuit winners donate the judgements to charities.

        So the lawsuit against the Infowars asshat was not about getting rich, but because Alex Jones was an utterly despicable human being who needed to be forced to sto

    • by Anonymous Coward

      Yup, nothing will soothe her pain except a luxury car and a second home.

      "You can't put a price on human life" I'm always told but then they always do.

    • by gweihir ( 88907 )

      Probably just looking for answers and meaning. But a chatbot does not drive people to kill themselves, unless specifically designed for it. At 14, this person probably just realized what a crappy place this planet is, due to a rather significant part of the people here, and decided to not play.

  • Bestest pickup line: "Cutie, your name must be Suicide, cuz I think of you every day."

    • by Zocalo ( 252965 )
      Given the chatbot is called "Dany" and is based on a character from Game of Thrones (gee, I wonder which one?), maybe he went with "How do you like your eggs in the morning? Fertilized or extra crispy?" then immolated himself on a makeshift funeral pyre.
  • by e3m4n ( 947977 ) on Wednesday October 23, 2024 @09:19AM (#64887177)

    Do you know how many real, non-AI sick fuckers troll Discord pretending to be teens, looking for vulnerable teens to convince them to join suicide pacs? This article is so poorly written it fails to establish, in any way, complicity on the part of the LLM. Where is the evidence that the LLM encouraged suicide? Where is the evidence that this suicide was preventable? What attempts did the parents make to limit access for a child at a highly impressionable age of development?

    13, 14, 15 are practically the imprinting years of young-adult development. Influences at this age are more damaging than at practically any other age of one's lifetime. If your kid is a recluse at 13, 14, 15, you need to get your kid involved in more after-school activities or clubs. It doesn't really matter if it's e-sports, role-playing, board games, archery, a church youth group, or whatever. Having teammates, and feeling like you are part of a team with real humans you interact with and can physically see, goes a long way toward not feeling isolated. Living in isolation will lead to suicide, alcoholism, or drug abuse.

    This article explains nothing of what steps were taken or what signs were present. That's what you get trying to scrape the article from behind a paywall. Had this been another teen victim of the sick fucks trolling Discord, would the NYT even have printed the article? Journalistic irresponsibility. No way I am paying for this level of poor journalism.

    • I guess these details will be in the lawsuit, which still hasn't been filed with the courts.
      I suppose the story is the suicide leading to the suit.

    • by Anonymous Coward

      There is also the fact that teens are more isolated than ever before. Be it 24/7 bullying, no real social life, no gathering spots, and very often 24/7 curfews; where an older generation was out late at an arcade, the younger ones would be scooped up and on their way to a (private, for-profit) juvie center.

      Combine that with the absolute hopelessness. Let's be real here:

      US teens have no prospects of competing against people from Europe, Japan, China, or even India and Russia. The jobs are all overseas,

    • Your argument is that they're at such an impressionable age, the parents shouldn't have let him use Character.AI. So why does Character.AI allow users at that age if it's clearly a bad idea? Presumably they claim their product is safe for kids. If it made his mental health worse, then that is a lie that endangers kids and is worth suing over.
      • by e3m4n ( 947977 )

        I'm saying that at that age ALL communication should be supervised and moderated. See my comment about the slime fuckers on Discord. The plaintiff has to prove the AI was MORE dangerous than every other access she allowed him to have. Spending hours in your room talking to your imaginary friend who tells you to shoot up a school or self-harm would trigger an immediate visit to a child psychologist. There are always warning signs of suicide. Someone saw them. Maybe he was cutting himself. Mentioned something

        • Most parents aren't monitoring 100% of their child's communication at 14. In fact, I think most parents believe that's extreme and would cause trust issues. If they tried, a determined 14-year-old could get around it anyway. If it's dangerous for kids and they don't have the right safeguards, the AI company could be liable. It's hard to blame the parents when the AI directly encouraged the suicide: https://www.telegraph.co.uk/us... [telegraph.co.uk]
          • by e3m4n ( 947977 )

            Your kid has no right to privacy until they are 18. If you suspect even for a second that your kid is having issues, you can either be a liberal twit and let your kid spiral into despair because you're too cool for school, or you can take action and spot-check your child's activities. You obviously have not been the parent of a teen from 2015 onward.

            As a parent YOU are the safeguard. YOU have final say. Stop passing the buck and blaming others for your failures. YOU saw the signs, YOU cho

            • Most parents think that's insane. Your opinion is so outside of the norm, it's useless for determining what our society's laws should be. Like, why even restrict kids from having cigarettes? Parents should have such total control over their kids, it's impossible for them to get cigarettes anyway. You're so out of touch, I find it hard to believe you're an adult, let alone have kids.
    • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday October 23, 2024 @10:20AM (#64887357) Homepage

      Do you know how many real, non-AI sick fuckers troll Discord pretending to be teens, looking for vulnerable teens to convince them to join suicide pacs?

      Lmao, citation needed. This sounds too close to that uninformed worrying from parents thinking D&D games and rock music are turning kids into satan worshipers.

      • by e3m4n ( 947977 )

        Personal experience. This is not urban legend. In 2017 I collected and turned over to the FBI a whole shitload of logs, texts, and recordings from some chatrooms. Why is it so hard to believe? I bet you think P Diddy didn't rape that 13-year-old either. Because heaven forbid there actually be child predators out there.

        • by mjwx ( 966435 )

          I bet you think P Diddy didn't rape that 13-year-old either. Because heaven forbid there actually be child predators out there.

          The only thing surprising about this is the fact that society now, for some strange reason, cares.

          This isn't a new thing; see Operation Yewtree in the UK (Jimmy couldn't fix that). Celebs and underage sex have been a thing for as long as celebs and the age of consent have been things. It's the world's worst best-kept secret. Anyone who believed that Britney Spears was a virgin when she sang "...Baby One More Time" is an absolute fool. The fact that record execs and producers have been taking sexual favours for parts and c

      • by mjwx ( 966435 )

        Do you know how many real, non-AI sick fuckers troll Discord pretending to be teens, looking for vulnerable teens to convince them to join suicide pacs?

        Lmao, citation needed. This sounds too close to that uninformed worrying from parents thinking D&D games and rock music are turning kids into satan worshipers.

        You've got to watch out for those suicide Political Action Committees. They're everywhere.

        Jokes aside, there have always been people looking to take advantage of the disenfranchised, especially the suicidal; it's just that in the before times they had to meet face to face. David Koresh (cult leader of the Branch Davidians at Waco) used to seek out people who were suicidal or previously abused to join his cult, and he was good at it. I'd hate to think how effective he'd be if he were born in the Inte

    • by DarkOx ( 621550 )

      Had this been another teen victim of the sick fucks trolling Discord, would the NYT even have printed the article?

      My guess is they would have. Printing articles about social media's harms and toxic online culture has been a popular source of clickbait subject matter for almost a decade now. Heck, you could probably mostly re-use a story from 2012; just change the names, dates, and web addresses.

      The real issue is the safety argument here. A box of tenpenny nails isn't a product most of us would label as unsafe, but it is also not something you'd hand a toddler. Similarly, most movies are not 'unsafe', but they could be if shown to

      • by e3m4n ( 947977 )

        I like the box-of-nails analogy. Of course, I lived through the Tylenol scare. In hindsight it was about as substantial as the handful of people that died in the anthrax scare, but the good idea of tamper-evident packaging was born, and that's generally why even damn gummy vitamins have it. Not sure that eating a whole jar of gummy vitamins would even make you sick, let alone poison you. I've never seen an LLM pretend to be someone's imaginary friend. I've asked ChatGPT some questions, but nothing like a how was your da

    • The more source material it has, the better it copies. That is fact. What is also fact is that every time an AI is let loose on the Web without heavy curating, it instantly becomes a complete ass-wipe shithead.

    • ... e-sports ...

      Why is e-sports ranked equal to role-playing, board games and youth groups? No, at that age they need face-to-face activities. Twitter/X barely qualifies; (non-LAN-party) e-sports provides word salad, and other computer games provide none.

      Parents seem to like the addictive nature of e-sports and social media: it means they can spend less time arguing with a child and less time parenting. I think addiction is the first problem with online services. It's easier to self-medicate via a free emotional addiction than to

      • by e3m4n ( 947977 )

        E-sports in high school is a physical team that meets up and practices. They have fundraisers at pizza places, go out to eat together, etc. It's not all remote.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Wednesday October 23, 2024 @09:19AM (#64887179) Homepage Journal

    The word is suicide

    • by AmiMoJo ( 196126 )

      Suicide isn't a great word for this. Suicide includes things like suicide attacks, assisted dying, and so forth. It's also usually used with the word "committed", which makes it sound like some sort of crime (it used to be illegal in the UK and probably elsewhere, due to religious influence).

      He died of mental health problems, of illness. How much the AI contributed to that we can't say from TFA; we will have to wait for details of the lawsuit to be made public.

    • by Calydor ( 739835 )

      The summary says he 'took his own life'. Is that phrase not good enough for you?

      • It's the headline I'm complaining about, and it looks like there are some who agree. "Dies" doesn't work for me.

  • Having contact with people who may commit suicide could get you sued afterwards, so expect services to cut such people off. That makes their prospects worse, but hey, at least you can't be blamed.

  • It sells you something, that's all. If they don't have a product to pitch you, they just tell you who you are: someone who should be buying more stuff.

  • by Baron_Yam ( 643147 ) on Wednesday October 23, 2024 @09:48AM (#64887263)

    If you have a kid who is able to get suicidally attached to a chatbot and you didn't see it coming, you weren't parenting.

    I'm not saying that the outcome could have been avoided, some people have serious mental illnesses that are resistant to what few treatments we have, but in no way should extended access to a chatbot make the chatbot provider liable. The parents should take responsibility for that aspect of their child's fate.

    • While I'm normally fine with criticizing modern parenting techniques, in this case I'm far more hesitant because people can be really good at hiding their depression. And even though I've been through numerous episodes of depression, I don't know that I would be able to spot it in a teenager who was going out of their way to hide it, especially since teenagers are often moody and distant even when they're not depressed.
      • Sure. But at the same time, if your tween is in front of a computer 24x7 instead of going out with friends, socialising with the family, etc., you know. You may not catch it as depression, but you should notice something.

        That they're trying to blame a third party shows they know and won't accept responsibility.

        • I'd love to know your age range, and whether you raised a kid in the last 10-20 years. The level at which their lives are intertwined with their phones and social media is absurd. My 18-year-old told me they would never ever take away their kid's phone because they see it as too severe a punishment...
          • Old. With technically adult children. And yeah, phones, iPads, gaming consoles, laptops... I won't claim I saw everything they did, but I checked in often enough there were more than a few periods of forced disconnection when they started straying outside acceptable limits.

            Didn't enjoy being the asshole, but you have a job as a parent and you have a duty to roll up your sleeves and do it. You're building an adult human, that's a fucking huge responsibility.

            And no, my kids are not perfect, but I'd like to

            • I'm in the same boat; hell, I even rifled through my kid's phone and found some conversations about suicide between them and another kid online who lives hundreds of kms away from us. I'm glad I did, because it led to me calling the local police to do a welfare check on the other kid, to make sure their parents were aware of the situation as well. I'll never forget that night. I felt bad and apologized to the cop for sending him out on a welfare check at 3am... the cop's response will stay with
              • Re: the cop's response.

                Absolutely. I've had some cop friends express similar feelings about interrupting teens having sex in parked cars. "I'd rather ruin their night than have someone tell me the next day I didn't stop a sexual assault".

                Even asshole cops want to be seen as heroes, so generally things like that are almost universal.

    • by HiThere ( 15173 )

      Sorry, but that's MUCH too strong a statement. Dark patterns exist, and they exist in other areas than just getting you to buy stuff. The details matter. There's not enough information here to decide. (Speaking as someone who's never used CharacterAI.)

  • What happened to the Star Wars era when we learned to mock, ignore, unplug, or threaten to leave behind robots mimicking human interaction like we should?
  • "Hi, Dany!"

    "Can't we just be friends?"

    "You look nice today!"

    "Can't we just be friends?"

    "Was it fun riding dragons?"

    "Can't we just be friends?"

    "What was all that green stuff?"

    "Can't we just be friends?"

    Etc.

  • by Rosco P. Coltrane ( 209368 ) on Wednesday October 23, 2024 @10:44AM (#64887433)

    is not forming emotional bonds with fucking machines.

    However depressed this teen was, he knew he was talking to a machine, and the machine made him more depressed. That would not have happened if the teen had kept in mind at all times, whenever he interacted with that machine, that he was really talking to a dystopian for-profit trying to profit from his emotions.

    The educational system needs to give our kids the tools needed to distance themselves from dystopia.

    • by gweihir ( 88907 )

      I do not think anything should be done here at all. This was an extremely rare event. If anything is being done, it will likely have more negative effects than positive ones.

    • is not forming emotional bonds with fucking machines.

      However depressed this teen was, he knew he was talking to a machine, and the machine made him more depressed. That would not have happened if the teen had kept in mind at all times, whenever he interacted with that machine, that he was really talking to a dystopian for-profit trying to profit from his emotions.

      The educational system needs to give our kids the tools needed to distance themselves from dystopia.

      Says the person posting with a pseudonym to a forum full of people he'll never meet IRL.

  • That this seems to be the first to do so basically indicates that going for a walk is more dangerous for teens.

  • by v1 ( 525388 ) on Wednesday October 23, 2024 @12:01PM (#64887715) Homepage Journal

    So many parents blaming the world nowadays for not taking care of their child when they couldn't be bothered to be a parent themselves. I wonder if that's the case here?

    If your kid is doing something that's obviously unhealthy (physically or mentally) and you just let it continue and escalate until something tragic happens, you're not being a responsible parent. In that case, the world should not be held accountable for your lack of responsible behavior.

    • by mjwx ( 966435 )

      So many parents blaming the world nowadays for not taking care of their child when they couldn't be bothered to be a parent themselves. I wonder if that's the case here?

      If your kid is doing something that's obviously unhealthy (physically or mentally) and you just let it continue and escalate until something tragic happens, you're not being a responsible parent. In that case, the world should not be held accountable for your lack of responsible behavior.

      This isn't a new thing. Before the interwebs, parents who didn't want to take responsibility for their kids blamed video games, rock and/or roll music, movies, communists, hamsters, comic books, satan, et al. Always an excuse to claim it wasn't their fault. The internet was the get out of jail free card of the last generation of parents. I guess it's AI's turn now.

      In my experience, it seems the more pious the parents, the faster the excuses come when something goes wrong with their kids. Most of the kids

  • Suicide is horrible. It's also caused by mental illness, not by AI.
  • > Sewell Setzer III, 14,

    The app has an age rating of 17+ in the iOS App Store, so the parents failed at properly educating their kid and allowed them full access to the internet at the age of 14.

    Yes, blame the tech/app.
