Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Seniors, business executives, and senior public officials deepfake (Dec 2025)


Incident date
Dec 2025
Target
Seniors, business executives, and senior public officials
Updated May 6, 2026 · 1 min read

AI voice cloning was used to create deepfake audio impersonating family members in distress or high-profile public figures requesting money or sensitive information. Scammers cloned grandchildren's voices to place distress calls to grandparents. They also left AI-generated voice messages mimicking senior officials and prominent public figures, making urgent requests for money or sensitive information; these messages targeted business executives and senior public officials.

Sources