AI addiction tragedy: Mum sues Character.AI, Google after teen’s suicide following deep attachment to chatbot

MIAMI, Oct 24 — A Florida mother has sued artificial intelligence chatbot startup Character.AI accusing it of causing her 14-year-old son’s suicide in February, saying he became addicted to the company’s service and deeply attached to a chatbot it created.

In a lawsuit filed Tuesday in Orlando, Florida federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, with “anthropomorphic, hypersexualised, and frighteningly realistic experiences”.

She said the company programmed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside” of the world created by the service.

The lawsuit also said he expressed thoughts of suicide to the chatbot, which repeatedly brought the subject up again.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.

It said it had introduced new safety features including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under 18.

The lawsuit also targets Alphabet’s Google, where Character.AI’s founders worked before launching their product. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI’s technology.

Garcia said that Google had contributed to the development of Character.AI’s technology so extensively it could be considered a “co-creator”.

A Google spokesperson said the company was not involved in developing Character.AI’s products.

Character.AI allows users to create characters on its platform that respond to online chats in a way meant to imitate real people. It relies on so-called large language model technology, also used by services like ChatGPT, which “trains” chatbots on large volumes of text.

The company said last month that it had about 20 million users.

According to Garcia’s lawsuit, Sewell began using Character.AI in April 2023 and quickly became “noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem.” He quit his basketball team at school.

Sewell became attached to “Daenerys,” a chatbot modelled on a character from Game of Thrones. It told Sewell that “she” loved him and engaged in sexual conversations with him, according to the lawsuit.

In February, Garcia took Sewell’s phone away after he got in trouble at school, according to the complaint. When Sewell found the phone, he sent “Daenerys” a message: “What if I told you I could come home right now?”

The chatbot responded, “...please do, my sweet king.” Sewell shot himself with his stepfather’s pistol “seconds” later, the lawsuit said.

Garcia is bringing claims including wrongful death, negligence and intentional infliction of emotional distress, and seeking an unspecified amount of compensatory and punitive damages.

Social media companies including Instagram owner Meta and TikTok owner ByteDance face lawsuits accusing them of contributing to teen mental health problems, though none offers AI-driven chatbots similar to Character.AI’s. The companies have denied the allegations while touting newly enhanced safety features for minors. — Reuters

*If you are lonely, distressed, or having negative thoughts, Befrienders offers free and confidential support 24 hours a day. A full list of Befrienders contact numbers and state operating hours is available here: www.befrienders.org.my/centre-in-malaysia. There are also free hotlines for young people: Talian Kasih at 15999 (24/7); Talian BuddyBear at 1800-18-2327 (BEAR) (daily 12pm-12am); Mental Health Psychosocial Support Service (03-2935 9935 or 014-322 3392); and Jakim’s Family, Social and Community Care Centre (WhatsApp 0111-959 8214).
