Detect Deepfakes by Resemble AI
Deepfake case study · Multi-modal

Deepfakes targeting children (Mar 2026)

Incident date
Mar 2026
Target
Children
Updated May 6, 2026 · 1 min read

AI-generated child sexual abuse material (CSAM) has surged, with reports skyrocketing from 4,700 in 2023 to over 400,000 in the first half of 2025. Real images of children are manipulated with AI to appear explicit, and scammers use AI-generated images of children in sextortion schemes, threatening to release the fake explicit images unless victims pay.

Sources