Deepfake case study · Image
- Incident date: Jun 2025
- Target: Individuals whose images were used to create 'nudify' deepfakes, including celebrities such as Scarlett Johansson and Anne Hathaway; potentially also minors where age verification was not in place.
Updated May 6, 2026 · 1 min read
Meta's platforms (Facebook, Instagram, etc.) displayed ads for 'nudify' apps, which generate sexually explicit deepfakes from uploaded images. The ads targeted men aged 18-65 and ran in the US, EU, and UK, with some using deepfake images of celebrities. The apps sometimes required payment to unlock 'exclusive' features, and the generated content often bypassed age verification measures. These apps were promoted on Meta platforms and in some cases were also available on Apple's App Store.