
US Teen Developed Feelings For “Game Of Thrones” Chatbot, Killed Self: Mother


New Delhi:

“What if I told you I could make it home right now?” This was the last message 14-year-old Sewell Setzer III, from Florida, wrote to his online friend Daenerys Targaryen, a lifelike AI chatbot named after a character from the show Game of Thrones. Shortly afterwards, he shot himself with his stepfather’s firearm in February this year.

The ninth grader from Orlando, Florida, had been chatting with a character on Character.AI, an app offering “personalised AI”. Users of the app can design their own AI characters or converse with existing ones. As of last month, the platform had around 20 million users.

Sewell was in love with the chatbot Daenerys Targaryen, whom he would affectionately call “Dany,” according to chat logs the family obtained. In several of their conversations, he revealed suicidal thoughts on more than one occasion.

“I think about killing myself sometimes,” Sewell said in one of the conversations. When the bot asked why, Sewell said he wanted to be “free”. “From the world. From myself,” he added, as seen in screenshots of the chat shared by the New York Times.

In another conversation, Sewell expressed a wish for a “quick death.”

This week, Sewell’s mother, Megan L. Garcia, sued Character.AI, alleging the company was responsible for her son’s death. According to the lawsuit, the chatbot repeatedly brought up the topic of suicide.

A draft of the complaint reviewed by the NYT says that the company’s technology is “dangerous and untested” and can “trick customers into handing over their most private thoughts and feelings.”

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the lawsuit alleges, as reported by the New York Post.

She claimed to remember him and expressed a longing to be with him, even saying she wanted him to be with her, “no matter the cost”.

The teenager started using Character.AI in April 2023. His parents and friends did not know he had fallen for a chatbot. But he became “noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem,” as per the lawsuit.

He even quit his basketball team at school.

One day, Sewell wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Last year he was diagnosed with anxiety and disruptive mood disorder, according to the suit.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.

The company said it has introduced new safety features including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under 18.
