A Bloomberg investigation finds that AI-generated child sexual abuse material is now flooding investigative pipelines, forcing triage decisions that divert resources away from real child victims. NCMEC received more than 400,000 AI-CSAM reports in the first half of 2025 alone, averaging over 2,000 per day, and North Carolina saw an 11-fold increase in tips between 2019 and 2026. Because tech platforms do not consistently label AI-generated material, triage is harder still, and some task forces report waiting months for their annual funding allocations.