Deepfake case study · Image
Ashley St Clair deepfake (Feb 2026)
- Incident date: Feb 2026
- Target: Ashley St Clair
Updated May 6, 2026 · 1 min read
The AI tool Grok on X (formerly Twitter) was used to generate millions of nonconsensual sexualized images, including some depicting extreme sexual violence and child sexual abuse material (CSAM). Victims included Ashley St Clair, Evie, members of Collective Shout, and an adult survivor of child sexual abuse. When victims spoke out, they were targeted with "revenge porn" and further deepfake abuse. The Internet Watch Foundation (IWF) found imagery created by Grok that was later turned into CSAM.