AI Voice Cloning Apps Should Terrify You - Here's Why - bgr.com
AI voice cloning apps are enabling scams, from impersonating loved ones to tricking employees into transferring funds, highlighting the need for heightened awareness and verification.
- Incident date: Jan 2024
- Target: Hong Kong company
AI voice cloning technology gives scammers powerful new ways to deceive victims. These tools are readily available, easy to use, and require only a few seconds of audio to clone a voice, letting malicious actors impersonate individuals for financial gain and other nefarious purposes.
What happened
In January 2024, a company in Hong Kong was targeted in a financial scam in which impersonators used a voice cloning tool to pose as the firm's CFO, tricking an employee into transferring $25 million to the scammers. Beyond corporate fraud, scammers may clone voices to impersonate loved ones in fabricated emergencies, celebrities endorsing bogus investments, or superiors requesting urgent financial transactions.