Deepfake case study · Image
Deepfakes targeting Taylor Swift, Alexandria Ocasio-Cortez, and women with no public profiles (May 2025)
- Incident date
- May 2025
- Target
- Taylor Swift, Alexandria Ocasio-Cortez, and women with no public profiles
Updated May 6, 2026
AI-generated pornographic images of public figures and other women were widely circulated on social media platforms, especially X (formerly Twitter). In response, the TAKE IT DOWN Act made it a federal crime to distribute sexually explicit images, whether real or AI-generated, without the subject's consent, and allows victims to sue creators as well as platforms that fail to remove such images.