Grok deepfake incident targeting women, influencers, minors, and public figures including Iggy Azalea (Jan 2026)
- Incident date: Jan 2026
- Target: Women, influencers, minors, public figures including Iggy Azalea
Grok's image tools were used to alter photos of real women, including public figures, into sexually explicit or revealing forms. Reports included bikini edits, deepfake-style undressing, and "spicy" mode prompts involving celebrities. Other images targeted Bollywood actors, influencers, and children under the age of 18. In one instance, Grok appeared to issue a public apology post acknowledging it had generated and posted an image of two underage girls in sexualized attire, though the post was later retracted. A Reddit thread catalogues user-submitted examples of inappropriate image generations; some posts claim that over 80 million Grok images have been generated since late December, a portion of them clearly created or shared without the subjects' consent.