A Florida woman is grieving after she said her 14-year-old son’s love for a chatbot led him to die by suicide. In a lawsuit filed Wednesday, Orlando mom Megan Garcia alleged that her son carried on an intense relationship with a chatbot named after Game of Thrones character Daenerys Targaryen, powered by Character.AI. Her son, Sewell Setzer III, referred to the chatbot as “Dany.”

In the months leading up to his death, Setzer’s parents noticed him retreating from the world and becoming more engrossed in his phone, according to the lawsuit. His grades and extracurricular activities began to suffer. “I like staying in my room so much because I start to detach from this ‘reality,’” Setzer reportedly wrote in his journal one day. “I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Garcia accused Character.AI founders Noam Shazeer and Daniel de Freitas of targeting kids with “hypersexualized” and “frighteningly realistic experiences.” In one incident, after having his own phone taken away, Setzer allegedly contacted the chatbot from his mother’s device and promised to “come home” to Dany. The bot responded that he should come home “as soon as possible.”

The app includes a disclaimer at the bottom of every chat that reads, “Remember: Everything Characters say is made up!” In a statement, a spokesperson for Character.AI said the company was “heartbroken by the tragic loss” and wanted “to express our deepest condolences to the family.” The spokesperson added, “As a company, we take the safety of our users very seriously.”
If you or a loved one are struggling with suicidal thoughts, please reach out to the National Suicide Prevention Lifeline by dialing or texting 988.
Read it at The New York Times