Meta's AI Flooding Cops With Junk Child Safety Reports
Police testify that Meta's automated CSAM detection is overwhelming investigators with low-quality tips.
Meta's AI-powered child safety reporting system is burying law enforcement under a mountain of useless tips, according to police officers testifying in New Mexico's lawsuit against the company.
Officers say the platform's automated detection is generating a flood of low-quality reports of suspected CSAM (child sexual abuse material). The result: overwhelmed investigators, drained resources, and actual cases slowing to a crawl.
It's a brutal irony. The AI systems meant to protect children are instead clogging the pipeline that catches abusers. Police are stuck sifting through junk reports while real cases sit waiting.
The testimony comes as part of New Mexico's ongoing legal battle against Meta, putting a spotlight on how automated content moderation can backfire when volume trumps accuracy. More reports don't mean better enforcement; sometimes they mean the opposite.