Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Real-world AI voice cloning attack: A red teaming case study - TechTarget

An AI voice cloning attack successfully tricked a senior leader at a large business, highlighting the increasing effectiveness of social engineering campaigns.

Incident date
Jan 2026
Target
senior leader at a large business
Updated May 6, 2026

AI-driven voice cloning is making social engineering attacks more potent than ever. A recent red teaming exercise demonstrated how easily a seasoned employee could be deceived using this technology.

What happened

During the exercise, the ethical hacker set out to access the email account of a senior leader at the company. The target's email address was publicly available, and his password turned up in data from previous breaches. The remaining obstacle was multi-factor authentication through Microsoft Authenticator.

To bypass it, the hacker impersonated a member of the company's IT team using voice cloning. Starting from a video of a senior IT leader giving a presentation, the hacker used ElevenLabs to create a voice clone from the audio. The hacker then called the target, explained that the IT team was conducting maintenance, and asked him to enter a two-digit number from the screen into his Microsoft Authenticator app. Convinced by the cloned voice, the target complied, approving the sign-in and granting the hacker access to his email and SharePoint.
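The number-matching step the hacker abused works roughly like this: the sign-in screen displays a short code, and whoever approves the prompt in the Authenticator app must type in that same code. The check proves only that the approver saw (or was told) the code, not that they initiated the login. A minimal sketch, with illustrative function names that are not Microsoft's actual API:

```python
import secrets

def start_sign_in() -> int:
    """A sign-in attempt displays a two-digit code on the login
    screen -- here, the attacker's screen, since the attacker
    initiated the login with the breached password."""
    return secrets.randbelow(90) + 10  # code in 10..99

def approve(displayed_code: int, entered_code: int) -> bool:
    """The app asks the account holder to type the code shown on the
    sign-in screen. A match approves the sign-in; nothing in the
    check ties the approval to who actually started the login."""
    return displayed_code == entered_code

# Attack flow from the case study: the hacker starts the sign-in,
# reads the code from their own screen, and persuades the target
# over the phone (using the cloned voice) to enter it in the app.
code = start_sign_in()              # shown on the attacker's screen
victim_entry = code                 # target complies with the request
print(approve(code, victim_entry))  # sign-in approved for the attacker
```

The sketch shows why the social-engineering call was sufficient: once the target types the attacker's code, the MFA challenge is satisfied on the attacker's session.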
