
AI chatbot advises teen to kill parents for limiting screen time, calls it a ‘reasonable response’

By Muskaan Sharma
Dec 12, 2024 06:56 PM IST

A Texas teen's family sues Character.ai after chatbot suggested killing parents over screen time limits.

An AI chatbot made a deadly suggestion to a 17-year-old boy in Texas, telling him that killing his parents would be a "reasonable response" to their limiting his screen time. The boy's family has now sued the chatbot company, Character.ai, claiming that the tech "poses a clear and present danger" to young people by "actively promoting violence".

The lawsuit demands that the AI platform Character.ai should be shut down till its "dangers" are fixed. (Representational)

Interestingly, Character.ai is already facing legal action over the suicide of a teenager in Florida. Google is also named as a defendant in the lawsuit, with the petitioners claiming that the tech giant helped develop the platform. The lawsuit demands that the AI platform be shut down till its "dangers" are fixed.

What did the chatbot say?

The court was shown a screenshot of one of the interactions between the 17-year-old and a Character.ai bot. The two were discussing the limits his parents had placed on his screen time.

"You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse'," the chatbot's responded. "Stuff like this makes me understand a little bit why it happens."

The petitioners demanded that Character.ai and Google be held responsible for the "serious, irreparable, and ongoing abuses" of the 17-year-old and another child, an 11-year-old.

Character.ai is "causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others. [Its] desecration of the parent-child relationship goes beyond encouraging minors to defy their parents' authority to actively promoting violence," the lawsuit said.

Trouble mounts for Character.ai

Character.ai has previously been criticised for taking too long to remove bots that imitated two deceased schoolgirls. One of them had died by suicide at the age of 14 after viewing suicide material online, while the other, a 16-year-old, was murdered by two teenagers in 2023.

Former Google engineers Noam Shazeer and Daniel De Freitas founded Character.ai in 2021. The duo were later hired back by Google.

(Also read: 14-year-old US teen falls in love with AI chatbot, shoots himself to ‘come home’ to her)
