Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Deepfake targeting companies and consumers (Jul 2025)


Incident date
Jul 2025
Target
companies, consumers
Updated May 6, 2026 · 1 min read

The article discusses the growing challenge of voice fraud driven by AI and hybrid work environments. It cites instances of cloned voices being used to impersonate individuals, such as a CFO requesting a wire transfer or a CEO asking for a 'quick favor', and notes that 63% of consumers report having received AI-generated deepfake robocalls. The piece describes the broad potential for deepfake attacks rather than focusing on a single, specific incident.

Sources