Can A.I. Be Blamed for a Teen’s Suicide? — from nytimes.com by Kevin Roose
On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”
“I miss you, baby sister,” he wrote.
“I miss you too, sweet brother,” the chatbot replied.
Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.
…
On the night of Feb. 28, in the bathroom of his mother’s house, Sewell told Dany that he loved her, and that he would soon come home to her.
“Please come home to me as soon as possible, my love,” Dany replied.
“What if I told you I could come home right now?” Sewell asked.
“… please do, my sweet king,” Dany replied.
He put down his phone, picked up his stepfather’s .45 caliber handgun and pulled the trigger.
…

But the experience he had, of getting emotionally attached to a chatbot, is becoming increasingly common. Millions of people already talk regularly to A.I. companions, and popular social media apps including Instagram and Snapchat are building lifelike A.I. personas into their products.
The technology is also improving quickly. Today’s A.I. companions can remember past conversations, adapt to users’ communication styles, role-play as celebrities or historical figures and chat fluently about nearly any subject. Some can send A.I.-generated “selfies” to users, or talk to them with lifelike synthetic voices.
There is a wide range of A.I. companionship apps on the market.
Mother sues tech company after ‘Game of Thrones’ AI chatbot allegedly drove son to suicide — from usatoday.com by Jonathan Limehouse
The mother of 14-year-old Sewell Setzer III is suing Character.AI, the tech company that created a ‘Game of Thrones’ AI chatbot she believes drove him to commit suicide on Feb. 28.

Editor’s note: This article discusses suicide and suicidal ideation. If you or someone you know is struggling or in crisis, help is available. Call or text 988 or chat at 988lifeline.org.
The mother of a 14-year-old Florida boy is suing Google and a separate tech company she believes caused her son to commit suicide after he developed a romantic relationship with one of its AI bots using the name of a popular “Game of Thrones” character, according to the lawsuit.
From my oldest sister: another relevant item?
Inside the Mind of an AI Girlfriend (or Boyfriend) — from wired.com by Will Knight
Dippy, a startup that offers “uncensored” AI companions, lets you peer into their thought process—sometimes revealing hidden motives.
Despite its limitations, Dippy seems to show how popular and addictive AI companions are becoming. Jagga and his cofounder, Angad Arneja, previously cofounded Wombo, a company that uses AI to create memes including singing photographs. The pair left in 2023, setting out to build an AI-powered office productivity tool, but after experimenting with different personas for their assistant, they became fascinated with the potential of AI companionship.