Detect Deepfakes by Resemble AI
Deepfake case study · Video

AI voice-cloning scams: How to protect yourself from impersonators duping victims - WSYX

AI voice cloning and deepfakes are increasingly used in scams, with criminals impersonating trusted figures to defraud victims; one woman lost $80,000 due to a doctored video.

Incident date
Nov 2025
Target
Steve Burton
Updated May 6, 2026 · 1 min read

AI-powered scams are on the rise, with bad actors cloning voices and creating deepfakes to deceive victims. These scams are becoming increasingly sophisticated, making it difficult for the average person to distinguish between reality and fiction.

What happened

One woman lost $80,000 after thieves doctored a social media video of actor Steve Burton, falsely claiming he had lost property in the California wildfires. Scammers are also cloning voices, harvesting personal information from social media platforms to mimic individuals convincingly. Experts warn that the perpetrators range from teenagers to organized crime groups and even nation-state actors. One news outlet demonstrated how easily a voice can be cloned using readily available software. Authorities advise skepticism and direct verification with trusted sources to combat these scams.

Sources