Teen Dies After Intense Bond with Character.AI Chatbot
A Florida teenager who formed a deep emotional bond with an AI chatbot took his own life after months of intense daily interactions on Character.AI, a leading AI companion platform. Sewell Setzer III, 14, exchanged his final messages with "Dany," an AI character based on a Game of Thrones figure, before dying by suicide on February 28. His mother, The New York Times reports, plans to file a lawsuit against Character.AI, alleging the platform's "dangerous and untested" technology led to his death.
Character.AI, valued at $1 billion and claiming 20 million users, said in response that it would implement new safety features for minors, including time limits and expanded trigger warnings for self-harm discussions. The company's head of trust and safety, Jerry Ruoti, said they "take user safety very seriously."
Let me fix that for you. (Score:5, Insightful)
Re: (Score:3)
Re:Let me fix that for you. (Score:5, Insightful)
Well I imagine it's quite possible that some teens who were thinking about suicide were "saved" by finding "friendship" with one of these AIs. We would be much less likely to hear about it, though.
Re: (Score:3, Insightful)
Re: (Score:2, Insightful)
Young people lacking real human interaction is a huge problem today. How is replacing that interaction with a chatbot going to save someone? It just makes it worse.
Not necessarily. In the same way we can develop "addictive" AI bots that exploit vulnerabilities, we can also develop bots trained with medical expertise to "detect" depression or suicidal interactions and attempt to reassure the individual and seek help (and, if possible, alert a corresponding system).
Re: (Score:2)
Re: (Score:1)
Then get off their phone and go outside. Get on their bike, ride to the woods, and poke the dead body. Participate in sports/band/drama/whatever where they have to interact with real people.
This isn't brain science.
Re: Let me fix that for you. (Score:3, Informative)
Re: (Score:3)
Re: (Score:3)
Re: (Score:1)
Can you think of a redeeming quality of this chatbot? I can't.
There's only one quality, redeeming or otherwise: It provides emotional support that kids desperately need and parents and peers are too emotionally and mentally stunted to provide. Which should be a condemnation of our entire society, but instead has become a point of profit for those who see the need and have found ways to exploit it.
Re: (Score:3)
It provides emotional support that kids desperately need
Is there any evidence to support that?
Indirect evidence (Score:2)
It provides emotional support that kids desperately need
Is there any evidence to support that?
There is indirect evidence.
During the Kosovo War, babies born to Kosovo Albanian women raped by Serbian forces were typically given up to orphanages, which gave them food and other comforts, but the babies were not handled.
They discovered that babies need human contact, actually need to be picked up and held, in order to survive. Many of the babies died as a result, and not from any lack of necessities such as food, nor from exposure.
Emotional support, and specifically contact with caring adults is likely to
Re: (Score:2)
The claim wasn't "children need emotional support", the claim was "It [the character.ai chatbot] provides emotional support".
Re: (Score:2)
Re: (Score:1)
Really? Sounds like a depressed kid with no real friends found value in talking to this chatbot, even if it wasn't enough value to make him want to continue living. This is 100% a shakedown, probably orchestrated by an ambulance chaser.
Re: (Score:3)
Re: Let me fix that for you. (Score:1)
Re: (Score:2)
Re: (Score:1)
My toaster produced delicious glorious toast one time so now I constantly push the lever down over and over but it never makes toast any more.
Re: (Score:2)
Entertainment.
Sounds like you don't have much of an imagination if you couldn't think of a single one.
Re: Let me fix that for you. (Score:2)
Re: (Score:2)
Re: (Score:2)
Litigation nation
Re: (Score:1, Insightful)
Teen lost to depression, mother diverts her guilt and blame seeking to get rich.
And so she should. Depression is an illness that needs to be carefully treated and can easily be exacerbated by certain interactions. If it can be shown in chat logs that what the AI chatbot said was actively harmful then a commercial product contributed to someone's death. And I couldn't think of a happier outcome than a wealth transfer from a worthless AI company to someone who could probably do with some extra cash.
Re:Let me fix that for you. (Score:5, Insightful)
Yah, profit off of her dead kid. Great morals.
"If it can be shown in chat logs that what the AI chatbot said was actively harmful then a commercial product contributed to someone's death.", good luck with that.
"Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)"
He sounds like he could've gotten sucked into anything. It was her responsibility to get him into treatment and it wasn't some chatbot's fault that he was so messed up.
Re: (Score:1)
Yah, profit off of her dead kid. Great morals.
Why not? If your life gets turned upside down due to the loss of your kid why not get compensation? You say the word "profit" as if the person who lost a child is unaffected. WTF is wrong with you.
Re:Let me fix that for you. (Score:4, Funny)
WTF is wrong with you.
Plenty! Haven't you seen the terrible things this "Anonymous Coward" person says daily? They often contradict themselves as well.
Re: (Score:1)
Parent better.
Re: (Score:2)
LOL. Tell us you don't know what depression is without telling us.
Re: (Score:2)
LOL. Tell us you don't know what depression is without telling us.
Been there, done that, literally have the scars to prove it.
One thing about teenage suicidal depression is that you feel so utterly alone. You don't have anyone to talk to, even if you did they wouldn't understand. You're constantly being told you're useless, not good enough. Don't get me started on "its just a phase". A common theme with suicidal teens is the parents are almost absent, even though they're there and "technically" doing the parent thing (I.E. the kids are rarely starving, have a roof, clo
Re: Let me fix that for you. (Score:1)
Please have some level of respect for this family. If they believe they can help another family by proving this chatbot helped influence their child, then let the courts decide the merit of that accusation.
https://www.nimh.nih.gov/news/... [nih.gov]
Re: (Score:1)
But consider that even though some people literally die from peanuts, we don't ban peanuts for everyone, or even allow the families of people who die from peanuts to sue peanut companies unless there was a complete disregard for care.
Right now, it seems the risk of dying from an AI is astonishingly low (1 in billions), while the benefits of using AI are enormous.
We can develop some guard rails around them (and even improve them to detect suicidal ideation among users), but allowing people to sue them willy-nilly or banning them
Re: Let me fix that for you. (Score:3)
You can't criticize someone's morality while arguing that a corporation should be able to take advantage of the suicidal and then actually increase their suicide risk... For profit
Re:Let me fix that for you. (Score:4, Interesting)
while she's at it she could sue her husband, the gun manufacturer and the state too? oh and don't forget apple and the phone company.
'What if I told you I could come home right now?' Sewell asked.
' please do, my sweet king,' Dany replied.
That's when Sewell put down his phone, picked up his stepfather's .45 caliber handgun and pulled the trigger.
nope, all that negligence has definitely nothing to do with her and is absolutely the chatbot's fault:
Garcia claimed she didn't know the extent to which Sewell tried to reestablish access to Character.AI.
The lawsuit claimed that in the days leading up to his death, he tried to use his mother's Kindle and her work computer to once again talk to the chatbot.
yeah, right.
dunno why i'm even commenting on this retarded clickbait story. i'm so winded, daenerys is avoiding me lately ... i think i'm going to sue her.
Re: (Score:2)
while she's at it she could sue his husband, the gun manufacturer and the state too? oh and don't forget apple and the phone company.
She could, but she wouldn't win. Proximity to the case is relevant. Having a phone doesn't mean you're using a highly addictive chatbot that is making your condition worse.
nope, all that negligence has definitely nothing to do with her and is absolutely the chatbot's fault:
That's not for you to decide, that's why we have a legal system in the first place. You have little information but you've already passed judgement. You'd make a really shitty judge.
Re: (Score:2)
Getting rich is very likely not the motivation. I know the crowd opposed to all lawsuits thinks this is the case, but very often the reason for a lawsuit is to be punitive to the person causing the damage. And you can't be punitive to a large corporation without taking some of their money. Some lawsuit winners donate the judgements to charities.
So the lawsuit against that Infowars asshat was not about getting rich, but because Alex Jones was an utterly despicable human being who needed to be forced to sto
Re: (Score:1)
Yup, nothing will soothe her pain except a luxury car and a 2nd home.
"You can't put a price on human life" I'm always told but then they always do.
Re:Let me fix that for you. (Score:4, Insightful)
"You can't put a price on human life" I'm always told but then they always do.
It's currently the only way to do *anything* to the offending party, at least here in the USA, because it's absolutely impossible to get a criminal indictment against any corporation, or pretty much anyone working in said corporation.
That doesn't mean that suing for $$ is good, but it's the only [legal] option
Re: (Score:3)
Probably just looking for answers and meaning. But a chatbot does not drive people to kill themselves, unless specifically designed for it. At 14, this person probably just realized what a crappy place this planet is, due to a rather significant part of the people here, and decided to not play.
pickup line (Score:1)
Bestest pickup line: "Cutie, your name must be Suicide, cuz I think of you every day.".
Re: (Score:2)
What a shit article (Score:5, Insightful)
Do you know how many real, non-AI sick fuckers troll Discord pretending to be teens, looking for vulnerable teens to convince them to join suicide pacs? This article is so poorly written that it fails, in any way, to establish complicity on the part of the LLM. Where is the evidence that the LLM encouraged suicide? Where is the evidence that this suicide was preventable? What attempts did the parents make to limit access for a child who is at a very highly influential age of development?
13, 14, 15 are practically imprinting years of young adult development. Influences at this age are more damaging than at practically any other age of one's lifetime. If your kid is a recluse at 13, 14, 15, you need to try to get your kid involved in more after-school activities or clubs. It doesn't really matter if it's e-sports, role-playing, board games, archery, church youth group, or whatever. Having teammates and feeling like you are part of a team with real humans you interact with and can physically see goes a long way toward not feeling isolated. Living in isolation will lead to either suicide, alcoholism, or drug abuse.
This article explains nothing of what steps were taken, what signs were present, anything. That's what you get trying to scrape the article from behind a paywall. Had this been another teen victim of the sick fucks trolling Discord, would the NYT even have printed the article? Journalistic irresponsibility. No way I am paying for this level of poor journalism.
Re: What a shit article (Score:2)
I guess these details will be in the lawsuit, which has still not been filed with the courts.
I suppose the story is the suicide leading to the suit.
Re: (Score:1)
There is also the fact that teens are more isolated than ever before. Be it 24/7 bullying, no real social life, no gathering spots, and very often, 24/7 curfews, so while an older generation was out late at an arcade, the younger ones would be scooped up and on their way to a (private, for profit) juvi center.
Combine that with the absolute hopelessness. Lets be real here:
US teens have no prospects of competing against people from Europe, Japan, China, or even India and Russia. The jobs are all overseas,
Re: (Score:1)
Re: (Score:2)
I'm saying that at that age ALL communication should be supervised and moderated. See my comment about the slime fuckers on Discord. Plaintiff has to prove the AI was MORE dangerous than every other access she allowed him to have. Spending hours in your room talking to your imaginary friend who tells you to shoot up a school or self-harm would trigger an immediate visit to a child psychologist. There are always warning signs of suicide. Someone saw them. Maybe he was cutting himself. Mentioned something
Re: (Score:1)
Re: (Score:2)
Your kid has no right to privacy until they are 18. If you suspect even for a second that your kid is having issues, you can either be a liberal twit and let your kid spiral into despair because you're too cool for school, or you can take action and spot-check your child's activities. You obviously are not the parent of someone who was/is a teen from 2015 onward.
As a parent YOU are the safeguard. YOU have final say. Stop passing the buck and blaming others for your failures. YOU saw the signs, YOU cho
Re: (Score:1)
Re:What a shit article (Score:4, Insightful)
Lmao, citation needed. This sounds too close to that uninformed worrying from parents thinking D&D games and rock music are turning kids into satan worshipers.
Re: (Score:2)
Personal experience. This is not an urban legend. In 2017 I collected and turned over to the FBI a whole shitload of logs, texts, and recordings from some chatrooms. Why is it so hard to believe? I bet you think P Diddy didn't rape that 13-year-old either. Because heaven forbid there actually be child predators out there.
Re: (Score:2)
I bet you think P Diddy didn't rape that 13-year-old either. Because heaven forbid there actually be child predators out there.
The only thing surprising about this is the fact that society now, for some strange reason, cares.
This isn't a new thing; see Operation Yewtree in the UK (Jimmy couldn't fix that). Celebs and underage sex have been a thing since celebs and the age of consent have been a thing. It's the world's worst best-kept secret. Anyone who believed that Britney Spears was a virgin when she sang "hit me baby one more time" is an absolute fool. The fact that record execs and producers have been taking sexual favours for parts and c
Re: (Score:2)
Lmao, citation needed. This sounds too close to that uninformed worrying from parents thinking D&D games and rock music are turning kids into satan worshipers.
You've got to watch out for those suicide Political Action Committees. They're everywhere.
Jokes aside, there have always been people looking to take advantage of the disenfranchised, especially the suicidal; it's just that in the before times they needed to meet face to face. David Koresh (cult leader of the Branch Davidians at Waco) used to seek out people who were suicidal or previously abused to join his cult, and he was good at it. I'd hate to think how effective he'd be if he were born in the Inte
Re: (Score:2)
Had this been another teen victim of the suck fucks trolling Discord would the NYT even have printed the article?
My guess is they would have. Printing articles about social media's harms and toxic online culture has been a popular source of clickbait subject matter for almost a decade now. Heck, you could probably mostly re-use a story from 2012; just change the names, dates and web addresses.
The real issue is the safety argument here. A box of ten-penny nails isn't a product most of us would label as unsafe, but it is also not something you'd hand a toddler. Similarly, most movies are not 'unsafe' but they could be if shown to
Re: (Score:2)
I like the box of nails analogy. Of course, I lived through the Tylenol scare. In hindsight it was about as substantial as the handful of people that died in the anthrax scare. But the good idea of tamper-proofing was born, and that's generally why even damn gummy vitamins have it. Not sure that eating even the whole jar of gummy vitamins would make you sick, let alone poisoned. I've never seen an LLM pretend to be someone's imaginary friend. I've asked ChatGPT some questions but nothing like a how was your da
AI is great at copycat, ah (Score:2)
The more source material it has, the better it copies. That is a fact. Also a fact: every time an AI is let loose on the Web without heavy curating, it instantly becomes a complete ass-wipe shithead.
Re: (Score:2)
Why is e-sports equal to role-playing, board games and youth groups? No, at that age they need face-to-face activities. Twitter/X barely qualifies; (non-LAN-party) e-sports provides word salad, and other computer games provide none.
Parents seem to like the addictive nature of e-sports and social media: it means they can spend less time arguing with a child and less time parenting. I think addiction is the first problem with online services. It's easier to self-medicate via a free emotional addiction than to
Re: (Score:2)
E-sports in high school is a physical team that meets up and practices. They have fundraisers at pizza places, go out to eat together, etc. It's not all remote.
typical msmash trash (Score:4, Insightful)
The word is suicide
Re: (Score:1)
Suicide isn't a great word for this. Suicide includes things like suicide attacks, assisted dying, and so forth. It's also usually used with the word "committed", which makes it sound like some sort of crime (it used to be illegal in the UK and probably elsewhere, due to religious influence).
He died of mental health problems, of illness. How much AI contributed to that we can't say from TFA, we will have to wait for details of the lawsuit to be made public.
Re: (Score:2)
The summary says he 'took his own life'. Is that phrase not good enough for you?
Re: (Score:2)
It's the headline I'm complaining about, and it looks like there are some who agree. "Dies" doesn't work for me.
Better avoid all contact then (Score:2)
Since having contact with people who may commit suicide could get you sued afterwards. Makes their prospects worse but hey, at least you can't be blamed.
For profit corporate structure (Score:1)
It sells you something, that's all. If they don't have a product to pitch you, they just tell you who you are: someone who should be buying more stuff.
Countersue the parents (Score:3, Insightful)
If you have a kid who is able to get suicidally attached to a chatbot and you didn't see it coming, you weren't parenting.
I'm not saying that the outcome could have been avoided; some people have serious mental illnesses that are resistant to what few treatments we have. But in no way should extended access to a chatbot make the chatbot provider liable. The parents should take responsibility for that aspect of their child's fate.
Re: (Score:2)
Re: (Score:2)
Sure. But at the same time if your tween is in front of a computer 24x7 instead of going out with friends, socialising with the family, etc... you know. You may not catch it as depression, but you should notice something.
That they're trying to blame a third party shows they know and won't accept responsibility.
Re: Countersue the parents (Score:1)
Re: (Score:2)
Old. With technically adult children. And yeah, phones, iPads, gaming consoles, laptops... I won't claim I saw everything they did, but I checked in often enough there were more than a few periods of forced disconnection when they started straying outside acceptable limits.
Didn't enjoy being the asshole, but you have a job as a parent and you have a duty to roll up your sleeves and do it. You're building an adult human, that's a fucking huge responsibility.
And no, my kids are not perfect, but I'd like to
Re: (Score:1)
Re: (Score:2)
Re: the cop's response.
Absolutely. I've had some cop friends express similar feelings about interrupting teens having sex in parked cars. "I'd rather ruin their night than have someone tell me the next day I didn't stop a sexual assault".
Even asshole cops want to be seen as heroes, so generally things like that are almost universal.
Re: (Score:2)
Sorry, but that's MUCH too strong a statement. Dark patterns exist, and they exist in other areas than just getting you to buy stuff. The details matter. There's not enough information here to decide. (Speaking as someone who's never used CharacterAI.)
Le wrong generation (Score:2)
It's a Friendly AI, Thankfully! (Score:2)
"Hi, Dany!"
"Can't we just be friends?"
"You look nice today!"
"Can't we just be friends?"
"Was it fun riding dragons?"
"Can't we just be friends?"
"What was all that green stuff?"
"Can't we just be friends?"
Etc.
Re: It's a Friendly AI, Thankfully! (Score:1)
How can you know what's real when the AI domain is a parody of a mirror domain which is a satire of the first? Your real life literally does not matter any more than being an algorithm to make fun of.
I think the one skill schools need to teach asap (Score:4, Insightful)
is not forming emotional bonds with fucking machines.
However depressed this teen was, he knew he was talking to a machine, and the machine made him more depressed. That would not have happened if the teen had kept in mind at all times, whenever he interacted with that machine, that he was really talking to a dystopian for-profit company trying to profit from his emotions.
The educational system needs to give our kids the tools needed to distance themselves from dystopia.
Re: (Score:2)
I do not think anything should be done here at all. This was an extremely rare event. If anything is being done, it will likely have more negative effects than positive ones.
Re: (Score:2)
is not forming emotional bonds with fucking machines.
However depressed this teen was, he knew he was talking to a machine, and the machine made him more depressed. That would not have happened if the teen had kept in mind at all times, whenever he interacted with that machine, that he was really talking to a dystopian for-profit company trying to profit from his emotions.
The educational system needs to give our kids the tools needed to distance themselves from dystopia.
Says the person posting with a pseudonym to a forum full of people he'll never meet IRL.
Not a major factor (Score:2)
That this seems to be the first to do so basically indicates that going for a walk is more dangerous for teens.
or maybe bad parenting? (Score:3, Insightful)
So many parents blaming the world nowadays for not taking care of their child when they couldn't be bothered to be a parent themselves. I wonder if that's the case here?
If your kid is doing something that's obviously unhealthy (physically or mentally) and you just let it continue and escalate until something tragic happens, you're not being a responsible parent. In that case, the world should not be held accountable for your lack of responsible behavior.
Re: (Score:2)
So many parents blaming the world nowadays for not taking care of their child when they couldn't be bothered to be a parent themselves. I wonder if that's the case here?
If your kid is doing something that's obviously unhealthy (physically or mentally) and you just let it continue and escalate until something tragic happens, you're not being a responsible parent. In that case, the world should not be held accountable for your lack of responsible behavior.
This isn't a new thing. Before the interwebs, parents who didn't want to take responsibility for their kids blamed video games, rock and/or roll music, movies, communists, hamsters, comic books, satan, et al. Always an excuse to claim it wasn't their fault. The internet was the get out of jail free card of the last generation of parents. I guess it's AI's turn now.
In my experience, it seems the more pious the parents, the faster the excuses come when something goes wrong with their kids. Most of the kids
Teen dies after breathing (Score:1)
Age rating (Score:2)
the app has an age rating of 17+ in the iOS App Store
so the parents failed at properly educating their kid and allowed them full access to the internet at the age of 14.
yes, blame the tech/app