Deepfake case study · Audio
Voice-cloning scam targeting Americans (Apr 2026)
Criminals used AI to clone the voices of victims' family members, sourcing audio from social media, then staged fake emergencies in which the cloned voice appeared to be in distress and requested money, leading to financial losses for the victims.
- Incident date: Apr 2026
- Target: Americans
Updated May 6, 2026