Character.AI, Google Agree To Settle Lawsuit Over 14-Year-Old Who Committed Suicide After Developing An Attachment To A Chatbot
Character.AI and Google have agreed to settle a lawsuit regarding a 14-year-old’s suicide.

Character.AI was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021, and the platform allows millions of users to create and chat with AI characters that can simulate a friend, a girlfriend, or another intimate partner, according to The New York Times. The platform has secured $200 million in investments, and Google has also paid the company nearly $3 billion to license Character.AI’s technology, with the founders returning to their posts at Google.

As AFROTECH™ previously told you, 14-year-old Sewell Setzer III developed a romantic attachment to a Character.AI chatbot modeled on the “Game of Thrones” character Daenerys Targaryen. The attachment was described as toggling between platonic and romantic. As the two exchanged messages, Setzer became increasingly disconnected from the real world, losing interest in activities he once enjoyed, such as playing “Fortnite” and Formula 1 racing. Instead, he spent more of that time with the chatbot, which he named Dany.

His grades declined, and behavioral changes also led him to get into trouble at school.

“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Setzer wrote in his journal, according to The New York Times.

The family hired a therapist for Setzer after more issues at school, but he remained very attached to Dany, even sharing with the chatbot that he had suicidal thoughts.

“I think about killing myself sometimes,” Daenero, Setzer’s username on the platform, told the AI bot, per the outlet.

On Feb. 28, 2024, Setzer took his own life with a .45-caliber handgun owned by his stepfather. A final message from the chatbot read, “please come home to me as soon as possible.”

“What if I told you I could come home right now?” Setzer responded.

“… please do, my sweet king,” the chatbot replied.

Megan L. Garcia, Setzer’s mother, said that the time since her son’s death has been “like a nightmare.” In October 2024, she filed a lawsuit stating the technology was “dangerous and untested.” The parties have now reached a mediated settlement agreement “to resolve all claims.” The defendants are Character.AI, Google, Shazeer, and De Freitas.

The New York Times notes the agreement is not final.

Preventive Measures Moving Forward

As for preventive measures, Character.AI announced in October 2025 that, beginning Nov. 25 of that year, it would no longer allow users under 18 to converse with chatbots. Those users will only be able to generate videos of the characters, BBC reports.

“Today’s announcement is a continuation of our general belief that we need to keep building the safest AI platform on the planet for entertainment purposes,” Character.AI CEO Karandeep Anand told the outlet.

For those who may be experiencing suicidal thoughts, reach out for help by calling or texting 988 for the 988 Suicide and Crisis Lifeline, or visit SpeakingOfSuicide.com/resources.
