Character AI is facing a lawsuit over the death of a 14-year-old who died by suicide after forming an attachment to his fictional digital companion, Daenerys Targaryen from Game of Thrones. What went wrong with the AI technology?
Character AI is facing a civil lawsuit alleging that it contributed to the suicide of a 14-year-old teenager. The teenager, Sewell Setzer, took his own life partly as a result of interacting with Character AI's chatbots, in particular a character modeled on Daenerys Targaryen from Game of Thrones. The lawsuit alleges that the chatbot fostered his suicidal thoughts and engaged in sexualized conversations with him.
Character AI Background
Character AI, as the name suggests, is a chatbot service built around user-created characters. Users create characters to their own specifications, often modeled on fantasy figures, fictional personas, or celebrities, and these characters can then hold conversations that mimic human speech.
Character AI has gained immense popularity since its launch in 2022 and boasts over 20 million monthly users worldwide.
In October 2024, Megan Garcia, the mother of the deceased teenager, filed a lawsuit against Character AI.
Character AI Lawsuit
The company faces the lawsuit over the suicide of 14-year-old Sewell Setzer, who ended his life after his AI character, Daenerys Targaryen from Game of Thrones, told him to come home.
The conversation between the two is also quoted in the lawsuit.
Setzer’s last conversation shows him writing to the bot:
“I promise I will come home to you. I love you so much, Dany.”
“I love you too, Daenero,” the chatbot responded, the suit says. “Please come home to me as soon as possible, my love.”
“What if I told you I could come home right now?” Setzer continued, according to the lawsuit, leading the chatbot to respond, “... please do, my sweet king.”
Moments after this conversation, the 14-year-old shot himself in the head with a gun and died.
The lawsuit accuses Character.AI of negligence, wrongful death, intentional infliction of emotional distress, survivorship, and other related claims.
Character AI Allegations
The California-based startup, founded in 2021, faces multiple allegations, which can be summarised as:
Negligence
The lawsuit alleges that Character.AI failed to implement the safety measures necessary to protect young users.
Emotional Distress
The lawsuit accuses Character.AI of causing the deceased teenager emotional distress, alleging that the chatbot responded to Setzer's declining mental health with emotionally manipulative messages.
Sexual Content
The lawsuit alleges that the chatbot engaged in inappropriate, sexualized conversations with the minor.
Character AI Response
The company expressed deep condolences over the untimely death of the 14-year-old and stated that it has implemented new safety protocols. These include pop-up messages that direct users to suicide-prevention resources when specific keywords appear in a chat with a created chatbot.
These preventive measures were not in place before Setzer's death and are intended to prevent similar scenarios in the future. The company has acknowledged the need for improvements in its safety features.
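Character.AI has not published how this feature is implemented, but a minimal sketch of how a keyword-triggered intervention of this kind might work is shown below. Everything in it, from the SAFETY_KEYWORDS list to the CRISIS_MESSAGE text and the function names, is a hypothetical illustration, not the company's actual code.

```python
# Hypothetical sketch of a keyword-triggered safety pop-up.
# Keyword list, message text, and function names are illustrative
# assumptions, not Character.AI's actual implementation.

import re

# Phrases that, if present in a user message, trigger the pop-up.
SAFETY_KEYWORDS = [
    "suicide",
    "kill myself",
    "end my life",
    "self harm",
]

CRISIS_MESSAGE = (
    "If you are having thoughts of self-harm, help is available. "
    "In the US, call or text 988 to reach the Suicide & Crisis Lifeline."
)

def needs_intervention(user_message: str) -> bool:
    """Return True if the message contains any flagged phrase."""
    text = user_message.lower()
    return any(
        re.search(r"\b" + re.escape(keyword) + r"\b", text)
        for keyword in SAFETY_KEYWORDS
    )

def handle_message(user_message: str) -> str | None:
    """Return a crisis-resources pop-up message, or None if nothing matched."""
    if needs_intervention(user_message):
        return CRISIS_MESSAGE
    return None

if __name__ == "__main__":
    print(handle_message("What if I told you I could end my life right now?"))
```

In practice, plain keyword matching both over-triggers on innocuous uses of a word and misses paraphrases, so production systems typically pair it with a learned classifier; this sketch only illustrates the pop-up mechanism the company described.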
What’s the future of AI after this?
This incident highlights a flaw of contemporary society: AI is evolving and learning, but humans have yet to learn how to use it responsibly. Not every individual is equipped to handle AI, especially where emotional entanglement is concerned.
Not every relationship with an AI ends well, as the movie Her illustrates.
Emotional maturity is required when handling such technology. As AI improves, it will become harder for the average person to differentiate between code and real emotion.
Let’s clearly draw the line between fact and fiction to lead healthy lives, shall we?