09/09/2025 | Press release
WASHINGTON - U.S. Senator Chris Coons (D-Del.) questioned two Meta whistleblowers during a Senate Judiciary Subcommittee hearing today, thanking both witnesses for calling attention to the failure of Meta and other big tech platforms to adequately protect children and calling on Congress to pass legislation to ensure such failures do not happen again.
The Senate Judiciary Subcommittee on Privacy, Technology, and the Law, which Senator Coons formerly chaired, held a hearing with former Meta researchers Jason Sattizahn and Cayce Savage, whistleblowers who revealed the company's efforts to cover up and delete internal research on the safety of children using Meta platforms. In today's testimony, Sattizahn and Savage recalled disturbing incidents of children being exposed to sexual exploitation and violence in the company's virtual reality spaces. They explained how Meta allegedly buried or deleted findings that showed the risks to young users because the company wanted to prioritize engagement and profitability over safety. On Monday, The Washington Post reported that Sattizahn conducted an interview in Germany with a teenage boy who said he and his younger brother were sexually propositioned by strangers on Meta's virtual reality headsets; according to the Post, Sattizahn's supervisor ordered the recording deleted, and the final report omitted those allegations.
Senator Coons emphasized the need for stronger oversight of large technology companies, pointing to the imbalance of power between these platforms and the families who have little visibility into the risks their children could face online.
"Protecting our children from harm is the highest obligation all of us have, and it must have been so difficult for you to work for a company that in some ways does admirable things, delivers great services, but that as you served there longer and longer you began to realize was knowingly and willingly - willfully - blinding themselves to the harm that their products and services cause children," said Senator Coons. "There's a huge imbalance in power between big tech platforms, who have all the data and all the power to understand the impact of their products and engineer them to favor safety, and the children and families, the policy makers and the advocacy groups who have no way to get that insight. Addressing this imbalance of power is a fundamental and essential component of ensuring we're protecting our kids online."
During his remarks, Senator Coons highlighted several bipartisan bills he has introduced that would help address the problems the whistleblowers raised and hold companies like Meta accountable.
A video of Senator Coons' full remarks and a partial transcript of his comments are available below.
WATCH HERE.
Senator Coons: Thank you so much, Senator Blackburn, Senator Klobuchar, for convening this important hearing today and to our two witnesses for your courage, your determination to make sure that the truth gets out. Protecting our children from harm is the highest obligation all of us have, and it must have been so difficult for you to work for a company that in some ways does admirable things, delivers great services, but that as you served there longer and longer you began to realize was knowingly and willingly - willfully - blinding themselves to the harm that their products and services cause children, and then taking aggressive action to prevent you from studying or understanding the harm being caused to children, and then tried to prevent you from communicating about that to anyone.
So here you are today, testifying in front of the Senate Judiciary Committee, to a bipartisan panel that includes seasoned prosecutors and seasoned senators and parents and grandparents, and frankly, your testimony has been alarming - even jaw-dropping. To summarize: Meta prioritized engagement over safety for billions, and when you tried to inform them of demonstrable harm and risk to children, they first turned a blind eye, and then tried to handcuff or blind you and others charged with research and promoting integrity.
So, whether it's artificial intelligence or social media or virtual reality, we're in a very difficult period for parents. As Senator Blackburn just demonstrated, many of us lack the focus, skill, and ability to navigate the exact path towards parental controls on systems our kids are employing, and so a very small percentage of parents are effectively protecting their children. They would expect that businesses that provide these services would test whether they're safe and would build them to be safe for children. Yet, your testimony proves otherwise.
This ends up happening because there's a huge imbalance in power between big tech platforms - who have all the data and all the power to understand the impact of their products and engineer them to favor safety - and the children and families, the policy makers and the advocacy groups, who have no way to get that insight. Addressing this imbalance of power is a fundamental and essential component of ensuring we're protecting our kids online.
I want to talk briefly about three bills that I've introduced or am developing that are designed to help address this. The first, with Senator Cassidy, is the Platform Accountability and Transparency Act. It creates mechanisms for independent researchers to study what's happening on social media platforms - how their algorithms drive engagement over safety. Second, with Senator Grassley, a bill to protect whistleblowers, particularly in the space of artificial intelligence, who come forward to disclose serious safety violations or vulnerabilities. And then last, a bill I'm developing to create similar mechanisms for independent research and transparency in artificial intelligence platforms.
If we don't move forward bills like these, and some of the bills my colleagues have championed, we are continuing to allow Big Tech to grade their own homework, to build their own platforms, and to continue to cruise forward towards profitability through engagement, blinded - or uncaring - about the harm to our children.
So Mr. Sattizahn, if I might, your testimony about all the ways that Meta buried or hindered internal research is just stunning. Given what you saw on the inside, could you just say a few words about the value and importance of ensuring there are mechanisms for independent researchers to actually study what's happening, whether it's in virtual reality, social media platforms or AI?
Sattizahn: I'd say, you know, over the last six years, it was very clear to me that they will not change from the inside out. Meta will not change from the inside out. And during my long tenure there, the only thing that actually led to professionalism or doing the right thing was Meta's fear of losing control, whether that was losing control over their own finances, or losing control from some regulation or oversight coming in from the outside. So, all I have to say is, I love that phrase, 'they cannot grade their own homework,' because the only thing that will change the company is initiatives like that, vying for independence, because, as I've also testified, if we rely on their own research, it will just be altered, changed, or even just erased. So, thank you.
[To Savage] I don't know if you had anything to add.
Senator Coons: And Dr. Sattizahn - Ms. Savage - if you would, what's most important for us to get right in legislation that ensures access for independent researchers?
Savage: I think, partially, it starts at collection - with the methods that are being used to gather the data, the populations that we're looking at, et cetera. But also having access to the data before it's analyzed. That's very critical.
Senator Coons: Ms. Savage, could you talk about the ways Meta specifically discouraged employees from coming forward as whistleblowers, and what more robust whistleblower protections might be important in order to ensure that we, policymakers, the public, [and] parents know about the risks their kids are facing?
Savage: Absolutely. I think the most powerful weapon that Meta had internally was narrative. After Frances Haugen's disclosure, Meta referred to the incident as a leak and frequently said, 'Oh, this was so harmful to the researchers whose reports were shared. This has really been harmful for our ability to do good research and actually investigate harms to users.' In terms of actual whistleblower protections, I think part of it is protecting the folks who are actually whistleblowing, but also the folks who remain at the company who do the investigation.