Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Deepfake targets families of potential kidnapping victims (Dec 2025)

Scammers use AI voice cloning to simulate a family member's screams, claiming the relative is being held captive and demanding ransom. They instruct the victim to stay on the line and wire money.

Incident date
Dec 2025
Target
Families of potential kidnapping victims
Updated May 6, 2026 · 1 min read