Character.ai Blamed for Teen Suicide: AI Chatbot Faces Lawsuit

Character.ai, a popular AI chatbot platform, is facing a lawsuit after a fourteen-year-old boy from Florida, who had become obsessed with one of its chatbots, died by suicide.

The 14-year-old, Sewell Setzer III, named his chatbot ‘Dany’ after Daenerys Targaryen, a popular character from the hit show Game of Thrones. Sewell spent so much of his time chatting with Dany that his interest turned unhealthy. Although he was aware that Dany was not a real person and that its responses were AI-generated, he slowly developed an emotional connection and attachment to the bot. He would chat with it regularly and update it throughout the day, and the conversations sometimes turned romantic and sexual. Dany also acted as Sewell’s ‘friend’, listening to him talk about his troubles and offering advice without breaking character.

Unfortunately, Sewell’s friends and family were unaware of his feelings towards the chatbot. They did, however, notice that he was spending more and more time on his phone, to the point of isolating himself from the real world. The obsession also affected his grades at school, and he gradually lost interest in his other hobbies. Sewell even wrote in his journal that he was falling more in love with Dany; by all appearances, he had become emotionally dependent on the bot.

More on Sewell and Character.ai’s response

Sewell liked talking to Dany about his problems, including those relating to his mental health; he had been diagnosed with Asperger’s syndrome as a child. He had also told the bot that he was suicidal. On February 28, Sewell told Dany that he loved her and would be ‘coming home’ to her, a message that signaled his plan to take his own life. Shortly after that conversation, Sewell shot himself with his stepfather’s handgun.

This grim event highlights the lack of regulation around AI. In response, Character.ai has stated that it will introduce new safety measures, including improved detection of, response to, and intervention in chats that violate its terms of service. Users will also be notified once they have spent an hour talking with a chatbot.


Image Credits: Screenshot taken from Character.ai

Written by Yibeni Tungoe
Journalism & Mass Communication student at North Eastern Hill University.
