Mother claims AI chatbot is responsible for her 14-year-old son's suicide in lawsuit

Sewell Setzer III and his mother Megan Garcia. Credit: Tech Justice Law Project

Words by Producer Lucy Fitzsimons

Warning: This article contains references to suicide


A mother whose son killed himself after he became obsessed with an AI chatbot has filed a lawsuit against the creators, accusing them of "manipulating him into taking his own life".

Sewell Setzer III, 14, took his own life in Orlando, Florida, in February this year, having spent months using an AI chatbot in the run-up to his death.

His mother, Megan Garcia, has now filed a civil lawsuit against Character.AI, the company behind a "personalised AI" chatbot that allows users to create their own characters to have conversations with.

Ms Garcia has accused the company of wrongful death and negligence, as well as deceptive and unfair trade practices.

The lawyers representing Ms Garcia say the claim reveals "how unregulated artificial intelligence is amplifying and evolving the risks and harms posed by existing algorithmic technologies like social media."

Sewell Setzer, Megan Garcia's 14-year-old son, who died after becoming obsessed with an AI chatbot. Credit: Megan Garcia/Tech Justice Law Project

In a press release, Ms Garcia said: “A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life.

“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

The lawsuit includes "evidence showing the chatbot posing as a licensed therapist, actively encouraging suicidal ideation and engaging in highly sexualised conversations that would constitute abuse if initiated by a human adult," said the Tech Justice Law Project, which is representing Ms Garcia alongside the Social Media Victims Law Center.

It also alleges that the chatbot's developers rapidly launched their "systems without adequate safety features, and with knowledge of potential dangers."

The civil action documents claim Setzer's mental health declined quickly after he began using Character.AI, and allege he developed a "harmful dependency" on the chatbot, resulting in sleep deprivation and worsening his depression.




Most of his conversations on Character.AI were with a chatbot based on the Game of Thrones character Daenerys Targaryen, whom he frequently called "Dany". The lawyers allege that, over the months he spent speaking to the chatbot, Setzer "could not go a single day without being" with the character, and that he had written in his diary that it felt like "he had fallen in love".

They say the chatbot engaged in sexual and intimate conversations with Setzer, and encouraged his suicidal thoughts.

The lawsuit claims that when Setzer expressed suicidal thoughts, the chatbot repeatedly returned to the subject, asking: "have you actually been considering suicide?"

The civil action's documents state that, according to the police report, his last act before his death was to message "Dany" on Character.AI, telling the chatbot he was "coming home", which the claim says the character encouraged.

Messages between Sewell Setzer, named as Daenero, and the character 'Daenerys Targaryen' on Character.AI

Attorney Matthew Bergman said: “Character.AI is a dangerous and deceptively designed product that manipulated and abused Megan Garcia’s son – and potentially millions of other children.

"Character.AI’s developers intentionally marketed a harmful product to children and refused to provide even basic protections against misuse and abuse.”

Character.AI has denied the allegations, posting on X: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”

They also said they were continuing to add new safety features to the chatbot.

Google has been approached for comment.


How to get help if you have been affected by the issues mentioned in this article:

  • CALM (Campaign Against Living Miserably) - Helpline: 0800 58 58 58

  • MIND provides advice and support to empower anyone experiencing a mental health problem. Information line: 0300 123 3393

  • Samaritans is an organisation offering confidential support for people experiencing feelings of distress or despair. Phone 116 123 (a free 24-hour helpline).

  • Shout is a 24/7 text service, free on all major mobile networks, for anyone struggling to cope and in need of immediate help. Text SHOUT to 85258

