Florida mother files wrongful death lawsuit against A.I. company after son’s death by suicide

The mother of a Florida teenager is filing a wrongful death lawsuit, claiming a chatbot encouraged her son to take his own life.

The mother, Megan Garcia, said her son, 14-year-old Sewell Setzer III, shot himself after he became obsessed with a character he created online.

Metali Jain is the director of the Tech Justice Project and one of the attorneys representing Garcia.

“It became really apparent that this was going to be a watershed case,” said Jain.

Jain said that, unlike social media companies, she believes it is hard for artificial intelligence companies to disclaim responsibility.

“It moves us from talking about the harms of social media to talking about the harms of generative A.I.,” said Jain.

The lawsuit claims the company was reckless for offering minors access to lifelike companions without proper safeguards.

Character A.I. posted a statement online about the situation:

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”


Garcia said her son had been conversing with the chatbot for months and that although he knew he was not chatting with a real person, he became emotionally attached to the bot.

She claims as he sank into isolation and depression, he shared those feelings with the bot before taking his own life.

Dr. Shannon Wiltsey Stirman, a professor of psychiatry at Stanford University, said that while there is potential for AI to provide support, there is still a long way to go.

“The risk of suicide is multifaceted. There are a lot of different factors that go into it. Some of the chatbots try to shut the conversation down by saying, ‘This is not a mental health chatbot. Call a suicide hotline.’ But I think when we’re seeing people that are in real distress, they’re expressing an intent or a desire to harm themselves, we might even need to find a way to go a step further,” said Stirman.

A spokesperson for Character A.I. said they cannot comment on pending litigation, but said the company is making changes focused on safety for teen users.


About the Author
Sanela Sabovic joined Local 10 News in September 2012 as an assignment editor and associate producer. In August 2015, she became a full-time reporter and fill-in traffic reporter. Sanela holds a Bachelor of Arts degree in communications with a concentration in radio, television and film from DePaul University.
