Detect Deepfakes by Resemble AI
Deepfake case study · Video

Biometric Authentication is Easier to Fake Than You Think - Infosecurity Magazine

In a 2024 incident, attackers used a deepfake to impersonate a CFO and trick a finance employee into transferring approximately $25 million, highlighting the risks of AI-generated impersonation.

Incident date
Jan 2024
Target
Arup
Updated May 13, 2026 · 1 min read

Advances in generative AI have made impersonation far easier, as demonstrated in a recent attack in which a deepfake was used to steal millions. Attackers no longer need to steal credentials; they can simply impersonate a trusted person using synthetic video, cloned voices, and biometric spoofs.

What happened

In early 2024, a finance employee at Arup's Hong Kong office joined a video call with what appeared to be the CFO but was in fact an AI-generated deepfake. Persuaded that the request was genuine, the employee transferred around $25 million before the fraud was discovered. No systems were breached and no credentials were stolen; the attackers simply impersonated people the employee trusted, demonstrating how effective deepfakes can be in remote work environments where digital signals are taken at face value.

Sources