Detect Deepfakes by Resemble AI
Deepfake case study · Audio

When Your Voice Becomes the Product - TALKERS magazine

OpenAI faced allegations from Scarlett Johansson, who claimed its voice assistant sounded "eerily similar" to her after she declined to license her voice.

Incident date
May 2024
Target
Scarlett Johansson
Updated May 15, 2026 · 1 min read

The dispute between Scarlett Johansson and OpenAI highlights the evolving legal landscape surrounding AI-generated voices. Johansson alleged that OpenAI created a voice assistant that sounded "eerily similar" to her own after she declined the company's request to license her voice. OpenAI denied intentionally imitating her and paused the voice, known as "Sky," after the backlash.

What happened

At the heart of the dispute is a distinction many creators misunderstand: a voice itself is not generally protected by copyright the way a sound recording is, but a recognizable voice can still trigger claims under the right of publicity, false endorsement, unfair competition, or misappropriation of identity. The legal risk shifts from copying audio to exploiting identity.

If listeners reasonably believe a celebrity endorsed or authorized the content, the legal exposure changes dramatically. That gray area has implications for broadcasters, podcasters, advertisers, and AI companies experimenting with synthetic or celebrity-adjacent voices.
