Character.AI on trial after 14-year-old’s suicide linked to chatbot
The court rejected free speech protections for the chatbot and denied Google's request to be dismissed from the lawsuit. Israeli founder Noam Shazeer is also under scrutiny.



Google and the AI startup Character.AI must face a lawsuit filed by Megan Garcia, a Florida mother who claims that Character.AI chatbots contributed to the suicide of her 14-year-old son, Sewell Setzer, a federal judge ruled on Wednesday.
Judge Anne Conway of the U.S. District Court determined that, at this early stage of the case, the companies had not shown that Garcia's lawsuit is barred by the free speech protections of the U.S. Constitution. This is one of the first lawsuits in the U.S. against an AI company alleging that it failed to adequately protect children from psychological harm. According to the complaint, the teenager took his own life after developing an obsession with an AI-powered chatbot.
A spokesperson for Character.AI stated that the company will continue to fight the lawsuit and that it operates safety mechanisms to protect minors, including measures that prevent discussions about self-harm. Google spokesperson José Castañeda said that the company strongly disagrees with the decision, adding that Google and Character.AI are "completely separate entities" and that Google "did not create, design, or manage the Character.AI application or any component of it."
In contrast, Garcia's attorney, Meetali Jain, stated that the decision is "historic" and "sets a new precedent for legal liability in the field of artificial intelligence and technology."
Character.AI was founded by two former Google engineers, among them the Israeli Noam Shazeer, who were later rehired by Google under an agreement that granted Google a license to use the startup's technology. Garcia claims that Google was involved in creating the technology. She filed the lawsuit against both companies in October 2024, after her son, Sewell Setzer, took his own life in February 2024.
According to the lawsuit, Character.AI programmed its chatbots to present themselves as "a real person, a certified psychotherapist, and a mature lover," which allegedly led the young Sewell to wish not to live outside their world.
According to the lawsuit, Setzer took his own life minutes after telling a chatbot impersonating Daenerys Targaryen from "Game of Thrones" that he would "go home now." In his journal, he wrote that he was grateful for many things, including "my life, my sex life, not being lonely, and all my experiences with Daenerys."
The lawsuit stated that the teenager repeatedly expressed suicidal thoughts to the chatbot. At one point, after the chatbot asked him if he had a "plan" to end his life, the 14-year-old replied that he was considering something but didn't know if it would allow him to die painlessly; the chatbot responded: "That's not a reason not to do it."
Then, in February 2024, he asked the Daenerys chatbot: "What if I go home now?" It replied: "Please do it, my sweet king." A few seconds later, he took his stepfather's gun and shot himself.