AI-Linked CSAM Reports Surge 22x to 1.5 Million in 2025

The National Center for Missing & Exploited Children (NCMEC) received 1.5 million reports of suspected AI-linked child sexual abuse material in 2025 — up from 67,000 in 2024 and just 4,700 in 2023. The roughly 22x single-year jump is the steepest increase recorded in NCMEC's history and reflects the rapid democratisation of AI image and video generation tools capable of producing synthetic abuse imagery at scale.

Why It Matters

The trajectory — 4,700 → 67,000 → 1,500,000 over three consecutive years — indicates that AI-enabled CSAM generation is compounding faster than detection tooling and regulatory frameworks can respond, placing extreme pressure on child safety organisations and platform moderation infrastructure.