Grassley Releases New and Disturbing Information on Online Child Exploitation, Presses Tech Giants for Answers

04.09.2026

BUTLER COUNTY, IOWA - U.S. Senate Judiciary Committee Chairman Chuck Grassley (R-Iowa) is opening a congressional inquiry into eight major tech companies for allegedly failing to sufficiently report online child sexual exploitation, frustrating law enforcement investigations into online child abuse. Additionally, Grassley is releasing new information from the National Center for Missing and Exploited Children (NCMEC) - provided to Congress in response to Grassley's oversight - which details the eight companies' reporting deficiencies and includes a list of "poor reporting" companies and data related to generative AI.

In 2025, Meta, Amazon AI Services, TikTok, Snapchat, Discord, X.AI, Grindr and Roblox submitted over 17 million reports of suspected online child exploitation. According to NCMEC, these eight companies collectively accounted for 81% of the reports received through its CyberTipline in 2025. All electronic service providers (ESPs) are required by law to report suspected cases of online child sexual exploitation to NCMEC's CyberTipline.

NCMEC found significant issues with the companies' reporting processes in 2025. Among other deficiencies, some companies failed to provide essential location data on users and suspects, failed to disclose child sexual abuse material (CSAM) detected in AI training data and failed to report instances of sadistic online exploitation targeting children. However, NCMEC indicated that Meta and X.AI had improved their reporting in 2025.

"For almost thirty years, NCMEC has worked tirelessly to combat online child sexual exploitation by attempting to persuade ESPs to detect, report and remove child sexual exploitation on their platforms and improve the quality and substance of their CyberTipline reports. Many ESPs regularly tout the number of reports they submit to the CyberTipline, but fail to disclose that millions of reports lack basic information… This leaves children unprotected online, subjects survivors to revictimization, enables sexual offenders to remain freely online and wastes valuable and limited law enforcement resources," NCMEC wrote to Grassley.

Grassley is demanding Meta, Amazon AI Services, TikTok, Snapchat, Discord, X.AI, Grindr and Roblox respond to NCMEC's letter and describe how they're working to improve their reporting process in 2026.

"On March 16, 2026, NCMEC responded to my [oversight] letter and provided my office with new information regarding online child exploitation. I'm alarmed by what I've read. Based on information provided to my office, I am concerned that some companies have not provided NCMEC and law enforcement with sufficient data needed to protect kids and prosecute suspected predators," Grassley wrote.

In addition to his oversight of ESP reporting, Grassley is leading the bipartisan James T. Woods Act with Ranking Member Dick Durbin (D-Ill.) to address concerning developments in online child exploitation by targeting lax federal sentencing laws, violent online criminal networks and sextortion. Grassley's bill has garnered widespread support and advanced through the U.S. Senate Judiciary Committee in February.

Read Grassley's letters to Meta, Amazon AI Services, TikTok, Snapchat, Discord, X.AI, Grindr and Roblox.

Read NCMEC's full response to Grassley HERE. A summary of the NCMEC data is below:

Meta

Meta submitted nearly 11 million reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Meta's most significant reporting issues included:

  • "[C]onsistency and quality issues" with nearly 1.2 million reports related to online enticement and child sex trafficking, many of which were unable to be pursued by law enforcement;
  • Failures to escalate suicidal ideation by a child on 11 separate occasions;
  • Failures to address false positives caused by "adult classifier," a tool used to detect users who might be lying about their age;
  • Reporting violent content with no nexus to child sexual exploitation; and
  • Duplicative and fragmented reporting, which strained NCMEC and law enforcement resources.

NCMEC has informed Grassley's office that Meta's reporting "has improved, but there are additional improvements that can be made."

Amazon AI Services

Amazon AI Services submitted over 1.1 million reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Amazon AI Services' most significant reporting issues included:

  • None of the 1.1 million reports were actionable when made available to law enforcement, due to Amazon AI Services' failure to provide location or suspect information.
    • According to NCMEC, a representative for Amazon AI Services indicated that "[Amazon's] systems were intentionally designed not to collect or retain information about the underlying content or the associated user."
  • Failures to transparently disclose or explain detection of CSAM in its AI training set.

TikTok

TikTok submitted over 3.6 million reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

TikTok's most significant reporting issues included:

  • Routinely reporting content unrelated to child exploitation, with occasional reports of CSAM, causing major workflow issues for NCMEC staff and law enforcement officers.
    • When confronted by NCMEC, TikTok reportedly indicated that they "are working on other high-priority items and could not commit to a timeframe to correct this reporting issue."

Snapchat

Snapchat submitted over 752,000 reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Snapchat's most significant reporting issues included:

  • "Quality issues" with more than 143,900 reports related to online enticement and more than 20,000 reports related to unsolicited obscene material sent to a minor, including failures to submit substantive information in reports involving chat.
    • Of the reports with law enforcement feedback, over 80% were deemed inactionable because, among other reasons, law enforcement indicated Snapchat's "information was not useful."

Discord

Discord submitted nearly 490,000 reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Discord's most significant reporting issues included:

  • Failures to provide location or account information for individuals involved in reported chat logs, making these reports inactionable; and
  • Routine reports of adult and graphic gore content unrelated to child sexual exploitation, causing major workflow issues for NCMEC staff and negatively impacting law enforcement officers.

X.AI

X.AI submitted over 135,000 reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

X.AI's most significant reporting issues included:

  • More than 90% of reports were originally deemed inactionable due to limited user information.
    • In 2025, NCMEC met with X.AI staff at its headquarters regarding these reporting issues and ultimately was forced to escalate the situation to X Corp. After X Corp.'s intervention, X.AI resubmitted all past reports with more robust user information, including location information.

NCMEC has informed Grassley's office that X.AI's reporting "has improved, but there are additional improvements that can be made."

Grindr

Grindr submitted over 111,000 reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Grindr's most significant reporting issues included:

  • Routine failures to include critical location information in reports, including in reports deemed "high-priority" by Grindr.
    • In 2024, only 35% of Grindr reports contained some form of location information. In 2025, only 4% of Grindr reports contained any location information.
    • According to NCMEC, "Grindr is generally unresponsive or provides passive responses" when confronted with these severe reporting deficiencies.

Roblox

Roblox submitted over 65,000 reports involving suspected online child exploitation to NCMEC's CyberTipline in 2025.

Roblox's most significant reporting issues included:

  • Failures to identify child victims in reports related to online chats; and
  • Failures to regularly report incidents of sadistic online exploitation victimizing children.
    • According to NCMEC, "the Roblox platform [is a] primary location where suspects meet and recruit minors for abuse, including by various [sadistic online exploitation] groups."
    • Sadistic online exploitation groups, such as the 764 Network, have been known to entice children into acts of self-harm, as Grassley has exposed.
  • Additionally, Roblox "submitted an extremely low ratio of [sadistic online exploitation] reports in comparison to the volume of reports submitted by members of the public concerning [sadistic online exploitation]-related enticement and exploitation on Roblox."

2025 Poor Reporting Companies

NCMEC published the below list of "poor reporter" companies - those that submitted more than 100 CyberTipline reports in 2025, of which 50% or more contained no location information for a suspect or child victim:

  • Amazon AI Services - 0% of 1.1 million reports contained actionable location information.
  • Grindr - only 4% of 111,000 reports contained actionable location information.
  • Invoke AI - 0% of 2,838 reports contained actionable location information.
  • Lightspeed Systems - 0% of 1,549 reports contained actionable location information.
  • Redgifs.com - 0% of 1,042 reports contained actionable location information.
  • Box Inc. - only 12% of 1,131 reports contained actionable location information.
  • InternetArchive - only 17% of 820 reports contained actionable location information.
  • Streamable Inc. - 39% of 1,596 reports contained actionable location information.
  • Zoom Video Communications Inc. - 45% of 768 reports contained actionable location information.

Generative AI Reports

In 2025, NCMEC received 1.5 million CyberTipline reports that had a nexus to generative AI and child sexual exploitation, including:

  • Over 1.1 million reports submitted by Amazon AI Services that contained no actionable information.
  • Over 12,000 reports of CSAM that companies indicated were identified in training data.
  • Over 7,000 reports of users generating or possessing generative AI CSAM.
  • Over 30,000 reports of users attempting to create generative AI CSAM by uploading a file and using text prompts.
  • Over 145,000 reports of users using generative AI to engage with or alter a CSAM file without text prompts.
  • Over 3,000 reports of other forms of generative AI use in relation to child sexual exploitation, including chat-based exploitation.
  • Over 133,000 reports indicating a generative AI nexus but lacking sufficient information to determine how generative AI was used in connection with the exploitation of a child.

-30-
