Detect Deepfakes by Resemble AI
Deepfake case study · Audio

The AI “Grandkid” Voice Clone Scam That’s Stealing Seniors’ Life Savings - Yahoo

AI voice cloning is used to scam seniors by mimicking the voices of their grandchildren in distress, exploiting emotional triggers for financial gain.

Incident date
Jan 2024
Target
Seniors
Updated May 8, 2026 · 1 min read

A new wave of scams targets seniors by using AI to clone the voices of their grandchildren. Scammers leverage readily available AI tools and social media audio clips to create convincing fake emergencies.

What happened

Scammers need as little as 30 seconds of audio, often scraped from social media, to clone a voice. They then spoof a local phone number and script a personalized emergency, such as a grandchild claiming to be in jail. The cloned voice, combined with emotional manipulation, makes the scam far more believable.

The financial toll is mounting. Seniors lost nearly $5 billion to cybercrime in 2024, a 43% increase over the previous year. One-third of survey respondents across multiple countries reported encountering deepfake voice fraud, with victims losing an average of $6,000.

Individual cases illustrate the pattern. A Canadian grandmother nearly wired $9,000 to a scammer posing as her grandson before a bank teller intervened. In Suffolk County, New York, multiple seniors fell victim to similar AI-powered scams in early 2025. The FBI notes that AI has dramatically increased the "believability" of criminal scams by exploiting our deepest emotional triggers.

Sources