Detect Deepfakes by Resemble AI
Deepfake case study · Audio

John Waldron deepfake (Aug 2025)


Incident date
Aug 2025
Target
John Waldron
Updated May 6, 2026 · 1 min read

Falsified audio recordings portrayed John Waldron making racist remarks. The Black Wall Street Times initially published the story, then issued a retraction explaining exactly how the fabrication was exposed: the detection software it used rated the audio as 95 to 99 percent likely synthetic.

Sources