Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Loved Ones Deepfake (Feb 2026)

Scammers use AI to clone the voice of a person's loved one, then call the person pretending to be that loved one in distress and asking for money. The scammers obtain voice samples from social media and other public sources.

Incident date
Feb 2026
Target
Loved ones
Updated May 6, 2026 · 1 min read
Sources