Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Voice-cloning scams impersonate loved ones, bosses, real estate agents, lawyers, accountants, and financial advisors (Apr 2025)
Incident date
Apr 2025
Target
Loved ones, bosses, real estate agents, lawyers, accountants, financial advisors
Updated May 6, 2026 · 1 min read

Scammers use AI voice cloning to mimic the voices of family members, friends, and coworkers in order to steal money. With AI tools and only short audio clips, they create convincing voice clones, then invent fake emergencies to pressure victims into sending funds. Scammers may impersonate relatives, bosses, real estate agents, lawyers, accountants, or financial advisors.