Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Scammers using AI to clone voice of loved ones to steal information, money - WBRC

Scammers are using AI to clone the voices of loved ones, tricking victims into sending money or revealing personal information, highlighting the increasing sophistication of AI-driven fraud.

Incident date
Nov 2025
Target
Cameron
Updated May 6, 2026 · 1 min read

AI-powered scams are on the rise, with fraudsters now using voice cloning technology to impersonate family members and friends to steal money and information. One such incident involved a couple whose great-grandson's voice was cloned in an attempt to extract money from them.

What happened

The couple, the Borens, received a call from someone claiming to be their great-grandson, Cameron. The caller said he was in pain, had a broken nose, and was being taken to jail following a car accident. The scammer, impersonating Cameron, provided a case number and the name of an attorney, and pleaded for bail money, promising to pay it back. The Borens were then contacted by a person claiming to be the attorney, who also requested money.

The couple eventually discovered it was a scam and located their great-grandson, realizing his voice had been cloned using AI. Experts warn that with as little as 30 seconds of audio, scammers can create convincing voice clones, and can manipulate victims further by adding emotion to the synthesized voice.

Sources