Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Individuals and families deepfake (Dec 2025)

Scammers are using AI voice cloning, which requires only a three-second audio sample, to impersonate family members (the "hey mom" scam) or IRS agents, creating false urgency to pressure victims into sending money.

Incident date
Dec 2025
Target
Individuals and families
Updated May 6, 2026
