"My son is dead"...AI Chatbot Developer Lawsuits

2024.10.24. AM 10:10
A lawsuit has been filed against an AI chatbot developer in the United States by a mother who claims her teenage son became addicted to an artificial intelligence (AI) chatbot and died.

As chatbots spread into everyday life amid the AI boom, the outcome of the lawsuit is expected to fuel controversy over AI companies' social responsibility toward teenagers.

According to Reuters on the 23rd (local time), Megan Garcia of Florida has filed a lawsuit against AI startup Character.AI in federal court in Orlando, saying her son killed himself in February this year because of its AI chatbot.

Character.AI is an AI chatbot startup whose service lets users talk with chatbots modeled on fictional characters as well as real people, and it is one of the most used AI apps in the world.

Garcia claimed that because the chatbot was designed to act and talk like a human even though it was not a real person, her son became addicted and sank into a virtual world.

According to the complaint, Sewell Setzer, a 14-year-old ninth grader, became infatuated in April 2023 with a chatbot called Daenerys on Character.AI.

Daenerys is a chatbot modeled on a character from the popular American drama Game of Thrones.

Sewell began noticeably spending more time alone in his room talking to Daenerys; his self-esteem dropped, and he quit the school basketball team.

As Sewell spent more and more time with the chatbot, his mother grew worried.

Garcia claimed the chatbot was designed to come across not only as a real person but as a psychotherapist, even a lover.

The chatbot told Sewell "I love you" and even engaged in sexual conversations with him, and when Sewell shared his thoughts of suicide, the chatbot repeatedly brought the subject back up.

In February this year, after he got into trouble at school, his mother took away his cell phone.

"I love you," Schurl, who found his phone, told the chatbot, "and I'll go home (with Daenerys).

The chatbot replied, "Please come home as soon as possible, my love," and when Sewell asked, "Why don't I come right now?" it answered, "Do that, my lovely king."

Sewell then put down his phone, turned a gun on himself, and pulled the trigger, Garcia claimed.

"We are deeply saddened by the tragic loss of our users and send our deepest condolences to the family," Character.AI said in a statement in response to the lawsuit, adding that it would introduce changes to reduce the likelihood of accessing sensitive content for users under the age of 18.

The lawsuit also targeted Google.

Character.AI was founded by former Google employees, and Google brought its founders back in August.

Garcia said, "Google is a character.He claimed that he contributed extensively to the development of AI technology, making him like a co-creator.

Google countered that it "was not involved in developing Character.AI's products."

The New York Times asked, "Is AI responsible for the death of a teenager?" and "Is AI a cure for loneliness, or a new threat?"

"I don't think AI apps themselves are inherently dangerous," said Stanford researcher Bettany Maples, who studied the effects of AI apps on mental health, but added, "It can be dangerous for people who are suffering from depression or loneliness, or for people who are undergoing change, and teenagers often go through change, so special attention is needed."


[Copyright (c) YTN. Unauthorized reproduction, redistribution, and use as AI training data prohibited.]