Detect Deepfakes by Resemble AI
Deepfake case study · Audio

InvestigateTV+: How AI voice cloning scams target families with fake kidnapping calls - WOWT

A mother received a terrifying call that sounded like her college-aged daughter, whom she believed had been kidnapped. The call was in fact an AI voice cloning scam.

Incident date
Aug 2026
Target
Rachel's daughter
Updated May 6, 2026 · 1 min read

AI voice cloning technology is being used in scams that target families, inflicting both financial losses and emotional trauma. One mother's experience illustrates the threat.

What happened

Rachel received a terrifying call that appeared to come from her college-aged daughter. For nearly two hours, she believed her daughter had been kidnapped and that sending money was the only way to save her. She discovered it was a scam only when her actual daughter called her.

Sources