Deepfake case study · Multi-modal
Actors, Recording Artists, Broadcasters, High School Students deepfake (Apr 2025)
- Incident date
- Apr 2025
- Target
- Actors, Recording Artists, Broadcasters, High School Students
Updated May 6, 2026
SAG-AFTRA is supporting the NO FAKES Act, a bill that would establish a federal right in an individual's voice and likeness, protecting against the unauthorized use of digital replicas. Activision Blizzard used generative AI to test interest in games that never existed. Serene Questworks allegedly replaced its voice cast with generative AI. Sony used the technology to turn Aloy, the protagonist of the Horizon series, into an unsettling digital animatronic.