Detect Deepfakes by Resemble AI
Deepfake case study · Audio

AI voice cloning scams target families with fake kidnapping calls - investigatetv.com

A mother lost thousands of dollars in a voice cloning scam in which criminals used AI to mimic her daughter's voice and claimed the daughter had been kidnapped after a car accident.

Incident date
Jan 2026
Target
Rachel's daughter
Updated May 6, 2026 · 1 min read

In January 2026, a mother was targeted in a voice cloning scam where criminals used AI to convincingly mimic her daughter's voice during a fake kidnapping call. The scam resulted in significant financial loss and emotional distress for the victim.

What happened

A mother, identified as Rachel, received a call that appeared to come from her college-aged daughter, with the caller ID spoofed to display her daughter's name and phone number. The caller, who sounded exactly like her daughter, claimed to have been in a car accident and then kidnapped. Over nearly two hours, the scammer, posing as the kidnapper, kept Rachel on the line to create the illusion of constant surveillance and instructed her to wire money to Mexico via Walmart and Walgreens. Rachel wired the money, realizing it was a scam only after finally reaching her real daughter on another phone line.

Sources