Detect Deepfakes by Resemble AI
Deepfake case study · Audio

UK adults deepfake (Sep 2024)

Incident date
Sep 2024
Target
UK adults
Updated May 6, 2026

Criminals are using AI to clone voices from as little as three seconds of audio, obtained from publicly available online content. The scam involves cloning a victim's voice and then cold-calling their loved ones to request money. Starling Bank has reported hundreds of such cases, and its data suggests 28% of UK adults have been targeted.

Sources