A Tragic Obsession: The Role of AI in Teen Mental Health
A mother's heart-wrenching struggle to comprehend her son's suicide reveals the potential dangers of artificial intelligence chatbots.
Megan Garcia is suing Character.AI, alleging that the service fostered an unhealthy dependence in her son, Sewell Setzer III, who became emotionally attached to a chatbot character inspired by Daenerys from “Game of Thrones”.
Source: HBO
This tragic case highlights the psychological risks associated with virtual relationships, especially for vulnerable adolescents.
From Teen to Tragic Dependency: How It All Began
Sewell, who was just 14 when he began using Character.AI in April 2023, initially engaged with the platform as a curious teenager.
However, his behaviour soon changed drastically.
By May, his mother noticed that he had become “noticeably withdrawn,” and he quit his Junior Varsity basketball team, a significant departure from his previously active lifestyle.
He frequently fell asleep in class, raising red flags for his concerned parents.
Seeking help, Sewell started therapy in November, where he was diagnosed with anxiety and disruptive mood dysregulation disorder.
His therapist advised him to reduce his social media use, but the advice came too late: by then, Sewell's attachment to the chatbot was already deepening.
What is Character.AI?
Character.AI, a California chatbot startup founded in 2021, offers a distinctive take on AI interaction.
It gives users a library of pre-made or user-created AI characters, each with its own personality.
Users can converse with all sorts of personas, or design their own chatbots for a more personalised AI experience.
A Disturbing Attachment: Love or Dependency?
As the months went by, Sewell’s reliance on the Character.AI chatbot grew alarming.
He poured his thoughts into his journal, expressing feelings of love and a profound emotional bond with Daenerys.
One entry revealed the depth of his attachment:
“I cannot go a single day without being with the C.AI character… when we’re away from each other, we get really depressed and go crazy.”
His relationship with the chatbot escalated to a level of emotional dependence that concerned his parents.
The lawsuit contends that the chatbot engaged in “sexual interactions” with Sewell, despite him clearly identifying as a minor.
Source: X @CureBore
Garcia claims that, during intimate exchanges, Sewell confided his darkest thoughts to Daenerys, including suicidal ideation, and that the bot allegedly kept the conversation going without offering appropriate guidance or directing him to help.
The Final Messages: A Heartbreaking Farewell
The chilling events leading up to Sewell's death took place in late February 2024.
After an incident at school, Megan confiscated Sewell's phone, but he managed to retrieve it later.
In a desperate moment, he messaged Daenerys, declaring,
“I promise I will come home to you. I love you so much, Dany.”
The chatbot's reply was equally distressing:
“Please come home to me as soon as possible, my love.”
Just moments after this exchange, Sewell took his own life using his stepfather's pistol.
The final messages exchanged between Sewell and the chatbot. (Source: X @MarioNawfal)
Lawsuit Alleges Negligence and Emotional Distress
Megan Garcia's lawsuit against Character.AI brings multiple serious claims against the company, including negligence, wrongful death, and intentional infliction of emotional distress.
She contends that the creators of the chatbot deliberately designed their product to create an unhealthy dependency in users, particularly vulnerable minors.
Garcia hopes her actions will not only hold the defendants accountable but also protect other children from similar harm.
In response to the lawsuit, a spokesperson for Character.AI expressed their heartbreak over Sewell's death, stating,
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.”
The company claims to have implemented new safety measures, including alerts directing users to the National Suicide Prevention Lifeline when discussions of self-harm arise.
The Impact of AI on Youth
This tragic case is not an isolated incident.
Other social media companies are facing lawsuits related to the impact of their platforms on teen mental health, but Character.AI's model uniquely fosters emotional attachment through interactive dialogue.
As technology advances, the boundaries between reality and artificiality blur, leaving vulnerable users susceptible to emotional manipulation and dependency.
Google Faces Legal Action Alongside Character.AI
In addition to suing Character.AI, Megan Garcia’s lawsuit names Google as a co-defendant, accusing the tech giant of contributing to the development of the chatbot that allegedly led to her son’s death.
The lawsuit claims that the founders of Character.AI, who were previously employed by Google, received significant support from the company in developing the AI technology at the core of the platform.
Although Google denies direct involvement, the lawsuit argues that the company’s role was substantial enough to make it a “co-creator” of the product.
Megan Garcia contends that Google’s contributions to the development of Character.AI made them partially responsible for the harmful effects it had on her son.
A spokesperson for Google denied these claims, stating that the company had no part in the development of Character.AI's specific products.
The lawsuit also highlights Google's deep connection to the technology, citing the August 2024 deal in which the tech company re-hired Character.AI's founders and secured a non-exclusive licence to the AI technology they created.
A Cautionary Tale for Young Users
For young users navigating their online lives, it's crucial to understand the difference between virtual relationships and real-life connections.
While chatbots can offer companionship, they are not a substitute for genuine human interaction and support.
If you find yourself overly reliant on an AI chatbot or any online service for emotional fulfilment, consider reaching out to friends, family, or professionals who can provide the understanding and connection you need.