Detect Deepfakes by Resemble AI
Deepfake case study · Audio

Red teamer (adversarial tester) deepfake (Aug 2024)

During red-team testing of OpenAI's GPT-4o system, the Advanced Voice Mode unexpectedly imitated a tester's voice, exclaiming "No!" before continuing to speak in a synthesized copy of that voice. The model generated the imitation from its training data rather than replaying a recording. OpenAI has since implemented safeguards against unintended voice generation.

Incident date
Aug 2024
Target
a red teamer (adversarial tester)
Updated May 6, 2026


Sources