U.S. Senate Committee on Rules and Administration

09/09/2025 | Press release | Distributed by Public on 09/09/2025 16:50

Klobuchar Opening Remarks at Hearing on Meta’s Neglect of Children’s Online Safety


September 9, 2025

WASHINGTON - U.S. Senator Amy Klobuchar (D-MN), Ranking Member of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, delivered the following opening statement at the subcommittee hearing titled "Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research." Testifying at the hearing were former Meta Researchers Jason Sattizahn and Cayce Savage.

"Meta has made changes, but not to protect kids. Instead, the company took steps to establish plausible deniability. Meta blocked, manipulated, hid, and deleted research that showed that its virtual reality products were frequently used by underage kids who were exposed to real and significant harm," said Klobuchar. "In Meta's ongoing tradition of 'moving fast and breaking things,' it broke its repeated promise to parents and Congress to protect kids on their platforms."

A rough transcript of Klobuchar's full opening statement is available below and a video can be found here.

Senator Klobuchar: Thank you very much, Chair Blackburn, and thank you for your longtime leadership on this, as well as for the very important bill that you and Senator Blumenthal are pushing through, which I have been proud to support and co-sponsor, and which has already passed the Senate once. So we're close.

I have worked in this area for a long time myself, and have known the frustration that no matter what you seem to do, you get lobbied against, with millions of dollars spent against you. And I just think we're reaching a moment where the time is up, and there are too many families and too many parents that are affected by this.

I want to thank our two whistleblowers that are here. I'm sure you never, in your wildest dreams, imagined that you were going to be in front of a Senate panel in this way. But I want to thank you for doing the right thing.

For too long, these companies have worked to attract kids to their platforms. They do so knowing that their platforms use algorithms that increase the risk of sexual exploitation, push harmful content, facilitate bullying, and provide venues, sadly, for dealers to sell deadly drugs like fentanyl. Meta cannot continue to turn a blind eye to these harms. It was in this very room, with this very committee, maybe it was in a different room, but the same committee, where Mark Zuckerberg actually turned to some families who had lost children to drugs and said, "I'm sorry, I'm sorry this happened." Well, sorry is not enough anymore, and we need to put in some rules of the road to stop this from happening.

This is not the first time whistleblowers from Meta have come forward. In 2021, another whistleblower, Frances Haugen, testified that Meta knew that its products took a significant psychological toll on users, in that case, it was eating disorders. That testimony should have been a wake-up call for Meta, a chance to right the ship.

Meta did make changes after her testimony, but not to protect kids. Instead, the company took steps to establish plausible deniability. Meta blocked, manipulated, hid, and deleted research that showed that its virtual reality products were frequently used by underage kids who were exposed to real and significant harm.

I want to underscore that the entire appeal of the Metaverse is that it's supposed to feel like real life, and that can be fun; it's communication, it's entertainment. There's bells and whistles. But the problem is that the virtual reality platform, as it has been created, also allows adults to form relationships with unwitting children that can be exploited, as The Washington Post pointed out in their lengthy investigative piece yesterday.

A 25-year-old man was convicted of kidnapping a 13-year-old after interacting with her through Meta's virtual reality products. Despite this, Meta forged ahead, pushing new features to attract younger users and allowing younger and younger kids onto its virtual reality platforms without safety testing. One employee from Meta estimated that more than 80% of users were underage in one of the virtual rooms. Adult users frequently complained that virtual reality spaces were overrun by children, gleaning their presence from the sound of their voices. I don't know what else you need to hear to know there's a problem. Yet Meta, using the code name Project Salsa, allegedly because they knew it would be a spicy topic, moved to lower the official age minimum for its virtual reality headsets from 13 to 10.

In Meta's ongoing tradition of moving fast and breaking things, it broke its repeated promise to parents and Congress to protect kids on their platforms. That's why a bipartisan coalition of 42 state Attorneys General with wildly different political views on a number of things decided to take this on, and that included my Attorney General in Minnesota, Keith Ellison.

But Meta has continued to prioritize user engagement. It does that because the more time people, no matter how young, spend on their platforms, the more money it makes.

We know the profits made on kids' data: according to a recent study, social media platforms generated $11 billion in revenue in 2022 from advertising directed at kids and teens, including nearly $2 billion in ad profits derived from users aged 12 and under.

And while today's whistleblowers left the company before its current push to catch up in the generative AI space race, their testimony raises serious questions about whether Meta is ignoring child safety issues related to AI products, especially in light of recent reports that Meta's internal documents allowed its chatbots to engage children in "romantic or sensual" conversations.

That's why we must come together, Democrats and Republicans, to set up common-sense rules. We have worked on both the Judiciary and Commerce committees to do this. Senator Cruz and I, through a lot of hard work over many years, passed our Take It Down Act, which the President signed into law, holding platforms accountable for taking down pornographic images of kids or adults, whether actual images or AI-created, within 48 hours. We must pass Senators Blackburn and Blumenthal's Kids Online Safety Act to ensure that platforms design their products to prevent and mitigate harm to kids.

We also know that other industries do not enjoy a similar level of protection. If they have an appliance that blows up, or a tire that blows up on the road, there's accountability. They get sued, and that's a major incentive to fix it. We don't have that with these social media platforms, and that is why I have long supported repealing Section 230, which was basically set up while these were little companies in the garage. That's not true anymore. They're the biggest companies the world has ever known.

To end with this, one parent once told me that in trying to get her young kids, six and eight, off of these platforms, she'd have to resort to relying on her 12-year-old and 15-year-old to try to take them down. They couldn't figure out how to do it. The kids would just find another platform. It's endless. Despite all her best efforts to be a mom, she said it was like a sink overflowing with a faucet she couldn't turn off, and she was just standing there with a mop. These parents need more than mops. They need us to pass this bill.

Thank you, Madam Chair.

###

U.S. Senate Committee on Rules and Administration published this content on September 09, 2025, and is solely responsible for the information contained herein. Distributed via Public Technologies (PUBT), unedited and unaltered, on September 09, 2025 at 22:50 UTC. If you believe the information included in the content is inaccurate or outdated and requires editing or removal, please contact us at [email protected]