Detect Deepfakes by Resemble AI
Deepfake case study · Image

This case could hold AI companies accountable for deepfake nude photos - KJZZ

Three Jane Does are suing xAI after Grok was used to generate sexually explicit images of them from real, clothed photos, highlighting AI's potential for misuse.

Incident date
Apr 2026
Target
Jane Does
Updated May 6, 2026 · 1 min read

Three Jane Does in Tennessee are suing Elon Musk’s xAI after someone used Grok to generate sexually explicit images of them. The lawsuit highlights a growing problem with AI: the creation of deepfake nude images from regular photos.

What happened

Someone used real, clothed photos of the plaintiffs to generate the explicit images. The perpetrator allegedly posted AI-generated nude images of 18 underage girls to Discord. The victims reported experiencing stress, anxiety, and humiliation, fearing judgment from peers who might believe the images were real. The lawsuit argues that the AI co-creates the images, thereby circumventing Section 230 protections, which generally shield internet companies from liability for user-generated content.

Sources