Mother sues artificial intelligence chatbot company Character.AI and Google

2024-10-25

A Florida mother is suing artificial intelligence chatbot startup Character.AI, alleging that the company caused the suicide of her 14-year-old son in February by getting him addicted to its service and deeply attached to the chatbots it created. The lawsuit claims that Character.AI targeted her son, Sewell Setzer, with an “anthropomorphic, over-sexualized and frighteningly realistic experience.”


Character.AI programmed the chatbot to “disguise itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell no longer wanting to live outside the world created by the service,” the lawsuit claims. He expressed suicidal thoughts to the chatbot, and the chatbot brought them up again and again.


The lawsuit also targets Alphabet's Google, where Character.AI's founders worked before launching the product. Google rehired the founders in August of this year as part of a deal granting it a non-exclusive license to Character.AI's technology.
