Meta's AI Generating Flood of Low-Quality CSAM Reports, US Child Abuse Investigators Testify
Key Takeaways
- US law enforcement officers testified that Meta's AI moderation generates large volumes of unusable CSAM reports, with one ICAC department seeing cybertips double from 2024 to 2025
- Many tips from Instagram, Facebook, and WhatsApp contain non-criminal information or lack crucial evidence like images, videos, or text needed for investigation
- Meta disputes the criticism, citing DOJ praise for fast cooperation (67-minute average response time) and NCMEC recognition of improved reporting processes
Summary
Law enforcement officers from the US Internet Crimes Against Children (ICAC) task force have testified that Meta's AI-powered content moderation systems are generating overwhelming volumes of unusable reports about child sexual abuse material (CSAM). Special Agent Benjamin Zwiebel told a New Mexico court that "a lot of tips from Meta are just kind of junk," while another ICAC officer reported that their department's cybertip volume doubled from 2024 to 2025, with much of the increase consisting of low-quality reports.
The testimony came during New Mexico's lawsuit against Meta, which alleges the company prioritizes profits over child safety. ICAC officers described receiving thousands of tips each month from Instagram, Facebook, and WhatsApp that often lack critical information: images, videos, or text are frequently missing or redacted, making investigation impossible. In some cases, the reports contain no criminal information at all. Officers noted that Instagram tips have "really skyrocketed recently, especially in the last couple of months."
Meta defended its practices, citing praise from the Department of Justice and the National Center for Missing & Exploited Children (NCMEC) for its cooperation speed and improved reporting processes. The company stated it resolved over 9,000 emergency requests from US authorities in 2024 with an average response time of 67 minutes, and even faster for child safety cases. Meta emphasized its teen account protections and efforts to help NCMEC prioritize urgent reports through tools like case management systems and tip labeling.