‘I’m disturbed’: Senator opens investigation into ways tech companies report suspected child abuse


Amazon’s AI Services division submitted 1.1 million reports of online child exploitation to an advocacy group in 2025. But because those reports lacked basic information, law enforcement was unable to act on any of them. A new Senate investigation aims to ensure this does not happen again.

Sen. Chuck Grassley, an Iowa Republican who chairs the Senate Judiciary Committee, said this week that he has opened an investigation into eight major tech companies over their handling of mandatory reporting of online child exploitation. It’s the latest step in a growing movement questioning whether tech companies can be trusted to keep their young users safe online.

Electronic service providers are required by law to report incidents of child sexual exploitation to the CyberTipline, which is run by the National Center for Missing and Exploited Children (NCMEC). In 2025, more than 17 million reports of online child sexual exploitation were filed. But these reports do not always contain the information needed to prompt real-world action.

“I’m disturbed by what I read,” Grassley said. “Based on the information provided to my office, I am concerned that some companies have not provided NCMEC and law enforcement with sufficient data needed to protect children and prosecute suspected predators.”

Grassley sent requests for more information to eight major tech companies: Meta, TikTok, Roblox, Snap, Amazon AI Services, xAI, Grindr, and Discord. Together, these companies account for 81% of all child exploitation reports submitted to NCMEC. Notably absent from the investigation is Google, which owns YouTube.

A Meta spokesperson told CNET that the company “works tirelessly” to protect children from this “horrific crime,” saying, “We are committed to continuous improvement and value feedback, which has already prompted us to make some improvements, as NCMEC has acknowledged. We will continue working to improve our reporting process.”

Grindr, Discord, and Roblox made similar comments, saying they plan to work with the Senate and NCMEC on the issues. Grindr added that its dating app is intended only for adults ages 18 and up. The other companies did not immediately respond to requests for comment.

The Iowa Republican’s investigation comes on the heels of 2025 reports from NCMEC that technology companies failed to provide basic location data in their reports and failed to disclose the presence of child sexual abuse material in AI training data. This is particularly concerning given previous incidents in which AI has been used to create non-consensual intimate images, including child sexual abuse material.

Online exploitation of children is a growing problem. In 2025, Meta alone filed nearly 11 million reports, 1.2 million of which concerned suspected child trafficking. Meta owns the popular platforms Facebook, Instagram, and WhatsApp. NCMEC said in 2025 that Meta and xAI had improved their reporting, but that it was still lacking.

“Many cyber providers regularly tout the number of reports they submit to CyberTipline, but fail to disclose that millions of reports are missing basic information,” NCMEC wrote to Grassley in 2025. “This leaves children unprotected online, exposes survivors to revictimization, enables sex offenders to remain online freely, and wastes valuable and limited law enforcement resources.”

There has been a movement in other branches of government to hold technology companies accountable for children’s safety. Meta was recently found liable by a New Mexico jury for misleading users about the safety of its platforms and for failing to prevent child exploitation. The company was ordered to pay $375 million in damages. A day later, a California jury found Meta and Google liable for building social media platforms that foster addiction in children.

On Tuesday, the first person was convicted under the Take It Down Act, a new US anti-deepfake law, for creating AI-generated child sexual abuse material.


