Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Voice cloning scams are on the rise. This ASU researcher is making a tool to detect real humans…

In late 2025, a voice cloning scam successfully defrauded an elderly couple of over $50,000 by impersonating their grandchild in distress, highlighting the rising threat of AI-driven fraud.

Incident date
Dec 2025
Target
Bella Lenz's grandparents
Updated May 6, 2026 · 1 min read

Voice cloning scams are becoming increasingly prevalent as generative AI advances. In one such incident, Bella Lenz's grandparents received a distressing phone call. The caller, using a voice convincingly cloned to sound like Bella's cousin, claimed to be stranded abroad in dire straits and to need money urgently to avoid an even worse situation.

What happened

Unbeknownst to the grandparents, the voice was a digital forgery. Over several months, the scammers manipulated them into wiring more than $50,000. The incident left Bella's grandparents distraught, illustrating the emotional and financial toll these scams can inflict. Because voice clones can be created with readily available software and only minimal audio samples, this type of fraud is becoming both more accessible to criminals and harder to detect.

Sources