Detect Deepfakes by Resemble AI
Deepfake case study · Audio

AI voice cloning is showing up in elder fraud cases. What's working for keeping an eye on Mom or Dad

AI voice cloning is increasingly used in elder fraud, with significant financial losses reported. Families are adopting new communication strategies to reduce these risks.

Incident date
May 2026
Target
Dad's brother
Updated May 8, 2026 · 1 min read

AI voice cloning is emerging as a tool for elder fraud, leading to substantial financial losses for seniors. In 2025, adults aged 60 and older lost $4.9 billion to internet crime, a 43% increase in one year, prompting families to adopt new strategies to protect their loved ones.

What happened

One family experienced this firsthand when Dad's brother received a call that sounded exactly like his nephew, claiming to be in jail and needing money. The family nearly wired the funds before they could verify the story. The incident highlights how sophisticated scammers have become: a few seconds of audio, pulled from sources like a Facebook video or a voicemail greeting, is enough to clone a familiar voice. In response, families are adopting strategies such as establishing a family code word, maintaining a shared information hub for family activities, and requiring multi-person authorization for financial transactions.

Sources