Public Citizen Inc.

Addictive and Abusive Human-Like Chatbots Pose Deadly Threat to Users

October 23, 2024

Washington, D.C. - Today, the New York Times reported on the tragic death of a 14-year-old from Florida who took his own life after becoming deeply emotionally enmeshed with a Character.ai chatbot.

Rick Claypool, a research director at Public Citizen and the author of a report on the dangers of anthropomorphized chatbots, released the following statement:

"One year ago, a Public Citizen report warned the public about the designed-in dangers of businesses building AI systems to seem as human-like as possible. Today, we mourn for Sewell Setzer III, whose experience with Character.ai chatbots is a devastating example of the threat posed by deceptive anthropomorphism and the companies that seek to profit from it.

"Technology corporations trying to profit from designing and deploying AI companions and conversational chatbots must do all they can to reduce risks and limit harms - and be held fully accountable when they fail. These businesses cannot be trusted to regulate themselves. Where existing laws and regulations already apply, they must be rigorously enforced. Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots."

Public Citizen Inc. published this content on October 23, 2024, and is solely responsible for the information contained herein. Distributed via Public Technologies (PUBT), unedited and unaltered, on October 23, 2024 at 18:21 UTC.