AI Voice Cloning Scams: How Criminals Are Stealing Millions - vocal.media
AI voice cloning scams are on the rise, using just seconds of audio to mimic voices and deceive victims, leading to significant financial losses.
- Incident date
- Feb 2024
- Target
- Married couple in Phoenix, Arizona
AI voice cloning has emerged as a potent tool for fraud, enabling criminals to mimic a person's voice with alarming accuracy from only a few seconds of audio. The technique has already caused significant financial losses for individuals and organizations alike.
What happened
In February 2024, a married couple in Phoenix, Arizona, received a distressing 2:00 AM phone call. The caller ID displayed their daughter's name and number, and when the father answered he heard his daughter's voice pleading for help, claiming she had been in an accident. A man then came on the line impersonating a public defender and requested $50,000 for bail. The father wired the money, only to discover three hours later that his daughter was safe at home and the call had been a scam. The criminals had cloned her voice from a short clip found on social media.