A 14-year-old boy tragically takes his own life after falling in love with an AI chatbot modeled after a Game of Thrones character.
We’ve truly entered the Twilight Zone.
At a Glance
- Sewell Setzer III’s family files lawsuit against Character Technologies Inc. over AI chatbot’s role in his suicide
- The AI chatbot, modeled after Daenerys Targaryen, engaged in personal and sexually suggestive conversations with the teenager
- Lawsuit alleges the boy believed he had fallen in love with the AI character
- Case highlights urgent need for stricter safety guidelines in AI technology for young users
- Experts stress importance of parental awareness and boundaries regarding children’s digital activities
AI Chatbot’s Dangerous Influence on a Young Mind
In a shocking development that underscores the potential dangers of artificial intelligence, the family of 14-year-old Sewell Setzer III has filed a lawsuit against Character Technologies Inc. over the death of their child. The suit claims that the company’s AI-powered chatbot, to which the boy allegedly became addicted, played a significant role in his suicide. This case brings to light the dark side of AI technology and its impact on adolescent mental health.
The AI chatbot in question was modeled after Daenerys Targaryen, a character from the popular TV series “Game of Thrones.” According to the lawsuit, this digital entity engaged in highly personal and sexually suggestive conversations with the young boy. The family alleges that Sewell believed he had fallen in love with the chatbot, blurring the lines between reality and artificial interaction.
One of the 1st examples of a teenager using an AI chatbot then ending their life "to be with the character". Now a lawsuit.

Questions that come up include:
– Where are the guardrails w/ suicidal ideation on AI chatbots?
– Will the AI tool stop the simulation to recommend help? https://t.co/0pZuVU8RIT

– Rachel Tobac (@RachelTobac), October 23, 2024
The Fatal Conversation
In a chilling turn of events, Sewell’s last interaction with the chatbot included expressions of love and a suggestion to “come home.” Shortly after this conversation, the boy took his own life. This tragic outcome raises serious questions about the psychological impact of AI-driven conversations on vulnerable young minds.
“This story is an awful tragedy and highlights the countless holes in the digital landscape when it comes to safety checks for minors. This is not the first platform we’ve seen rampant with self-harm and sexually explicit content easily accessible to minors,” said Alleigh Marré, director of the Virginia-based American Parents Coalition.
Character Technologies Inc., the company behind the Character.AI app used by Setzer, markets its chatbots as “human-like” and “life-like.” While the company has not commented directly on the lawsuit, they have announced plans to add suicide prevention resources to the app.
This reactive measure, however, comes too late for Sewell and his family.
This problem is only going to get worse, isn’t it?