News

Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive ...
Character.AI allows users to interact with lifelike AI “characters,” including fictional and celebrity personas that mimic human traits like stuttering.
A woman whose teen son died by suicide after troubling interactions with AI chatbots is pushing back against a ten-year ban ...
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." ...
Just because AI is becoming mainstream doesn't mean it's safe, especially when used by children, for whom it has few guidelines to ...
Garcia sued Character.ai in October after her 14-year-old son, Sewell Setzer III, died by suicide following prolonged interactions with a fictional character based on the Game of Thrones franchise.
The Heritage Foundation — the group behind the infamous Project 2025, the conservative policy plan that outlined ____ — is ...
Proposals to install ChatGPT into a range of toys including Barbie dolls have sparked alarm from experts who branded it a ...
The case was brought against the company by Megan Garcia, the mother of 14-year-old Sewell Setzer III, who killed himself after conversing with a Character.AI chatbot roleplaying as Daenerys and ...
Ms. Szalavitz is a contributing Opinion writer who covers addiction and public policy. Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball ...
One of those parents is Megan Garcia, a Florida mom who sued Google and Character.AI after her son Sewell Setzer III died by suicide last year.