Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Voice Cloning AI Scams Are on the Rise - BECU

AI voice cloning scams are on the rise, enabling criminals to mimic voices and trick victims into sending money; protect yourself by limiting your digital footprint.

Incident date
Sep 2025
Target
Jennifer DeStefano's daughter
Updated May 6, 2026 · 1 min read

Scammers are using AI voice cloning to impersonate family members and request money. These scams use AI-generated audio that sounds like a loved one in distress, making them harder to recognize as fraud.

What happened

One example involves Jennifer DeStefano, who received a call in which her daughter's voice sounded panicked and claimed to have been kidnapped, with the caller demanding a $1 million ransom. DeStefano confirmed her daughter was safe and did not pay. In another instance, a woman lost $15,000 after receiving a call from someone impersonating her crying daughter; she withdrew the cash, which was then picked up from her house. Scammers often research families on social media, gathering voice samples to clone, then use AI tools to replicate the voice and pair it with an urgent request for money.

Sources