'Sickening' Molly Russell chatbots found on Character.ai



Chatbot versions of the teenagers Molly Russell and Brianna Ghey have been found on Character.ai - a platform which allows users to create digital versions of real or fictitious people.

Molly Russell took her own life at the age of 14 after viewing suicide material online, while Brianna Ghey, 16, was murdered by two teenagers in 2023.

The foundation set up in Molly Russell's memory said it was "sickening" and an "utterly reprehensible failure of moderation."

The platform is already being sued in the US by the mother of a 14-year-old boy who, she says, took his own life after becoming obsessed with a Character.ai chatbot.

In a statement to the Telegraph, which first reported the story, the firm said it "takes safety on our platform seriously and moderates Characters proactively and in response to user reports."

The firm appeared to have deleted the chatbots after being alerted to them, the paper said.

Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the bots was a "sickening action that will cause further heartache to everyone who knew and loved Molly."

"It vividly underscores why stronger regulation of both AI and user generated platforms cannot come soon enough", he said.

Esther Ghey, Brianna Ghey's mother, told the Telegraph it was yet another example of how "manipulative and dangerous" the online world could be.

Artificial friends

Character.ai, which was founded by former Google engineers Noam Shazeer and Daniel De Freitas, has terms of service which ban using the platform to "impersonate any person or entity".

In its "safety centre" the company says its guiding principle is that its "product should never produce responses that are likely to harm users or others".

It says it uses automated tools and user reports to identify uses that break its rules and is also building a "trust and safety" team.

But it notes that "no AI is currently perfect" and safety in AI is an "evolving space".

Character.ai is currently the subject of a lawsuit brought by Megan Garcia, a woman from Florida whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.

According to transcripts of their chats included in Garcia's court filings, her son discussed ending his life with the chatbot.

In a final conversation, Setzer told the chatbot he was "coming home" - and it encouraged him to do so "as soon as possible".

Shortly afterwards he ended his life.

Character.ai told CBS News it had protections specifically focused on suicidal and self-harm behaviours and that it would be introducing more stringent safety features for under-18s "imminently".
