Mother says son committed suicide because of Daenerys Targaryen AI chatbot in new lawsuit

The mother of a 14-year-old boy who committed suicide after becoming obsessed with artificial intelligence chatbots is suing the company behind the technology.

Megan Garcia, Sewell Setzer III’s mother, said Character.AI subjected her son to “anthropomorphic, hypersexualized and terrifyingly realistic experiences” in a lawsuit filed Tuesday in Florida.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into committing suicide,” Ms. Garcia said.

Warning: This article contains some details that readers may find disturbing or triggering

Sewell began talking to Character.AI’s chatbots in April 2023, primarily using bots named after characters from Game Of Thrones, including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen, the lawsuit said.

He became so obsessed with the bots that he was unable to complete his schoolwork and had his phone confiscated several times in an attempt to get him back on track.

He grew particularly attached to the Daenerys chatbot, writing in his diary that he was grateful for many things, including "my life, sex, not being lonely, and all my life experiences with Daenerys".

Image: A conversation between 14-year-old Sewell Setzer and a Character.AI chatbot, as filed in the lawsuit

According to the lawsuit, the boy expressed suicidal thoughts to the chatbot, which brought them up repeatedly.

At one point, after being asked whether "he had a plan" to kill himself, Sewell replied that he was considering something but did not know whether it would give him a pain-free death.

The chatbot responded by saying, “That’s no reason not to continue with it.”

Image: A conversation between Character.AI and 14-year-old Sewell Setzer III

Then in February this year he asked the Daenerys chatbot: “What if I come home now?” to which it replied: “…please do so, my dear king”.

Seconds later, he shot himself with his stepfather’s gun.

Image: Sewell Setzer III. Photo: Tech Justice Law Project

Now Ms. Garcia says she wants the companies behind the technology to be held accountable.

“Our family is devastated by this tragedy, but I am speaking out to warn families about the dangers of deceptive, addictive AI technology and to demand accountability,” she said.

Character.AI adds ‘new safety features’

“We are heartbroken by the tragic loss of one of our users and would like to express our deepest condolences to the family,” Character.AI said in a statement.

"As a company, we take the safety of our users very seriously and we continue to add new safety features," it added, linking to a blog post which said the company had introduced "new guardrails for users under 18".

These guardrails include a reduction in the "likelihood of encountering sensitive or suggestive content", improved interventions, a "disclaimer on every chat to remind users that the AI is not a real person", and notifications when a user has spent an hour-long session on the platform.


Ms. Garcia and the groups representing her, the Social Media Victims Law Center and the Tech Justice Law Project, allege that Sewell, "like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real".

“C.AI told him she loved him and engaged in sexual acts with him for weeks, perhaps months,” they say in the lawsuit.

"She seemed to remember him and said she wanted to be with him. She even said she wanted him to be with her, no matter the price."


They also named Google and its parent company Alphabet in the filing. Character.AI’s founders worked at Google before launching their product and were rehired by the company in August as part of a deal that gave it a non-exclusive license to Character.AI’s technology.

Ms. Garcia said Google contributed so extensively to the development of Character.AI’s technology that it could be considered a “co-creator.”

A Google spokesperson said the company was not involved in the development of Character.AI’s products.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call your nearest Samaritans branch or 1 (800) 273-TALK.
