Deepfake case study · Audio
AI voice cloning scam targeting family members of individuals whose voices are cloned (Dec 2025)
- Incident date
- Dec 2025
- Target
- Family members of individuals whose voices are cloned
Updated May 6, 2026
Criminals are using AI voice cloning technology to impersonate family members in distress and request urgent money transfers. Using only a short audio sample, they clone the person's voice and leave a voicemail or place a phone call claiming the person has been in an accident, robbed, or injured.