The Biden Deepfake Robocall (New Hampshire, 2024)
Cloned audio of President Biden told New Hampshire Democrats not to vote in the primary. The first high-profile AI robocall of the 2024 US election cycle and the case that triggered FCC action on AI voice calls.
- Incident date: January 2024
- Target: New Hampshire voters / Biden campaign
- Outcome: FCC rules AI voices in robocalls illegal; $6M fine against the organizer
On the evening of January 21, 2024, telephones across New Hampshire rang with a cloned-voice robocall. The voice on the line sounded unmistakably like President Biden — warm, familiar, direct — and told listeners that voting in Tuesday's primary "only enables the Republicans in their quest to elect Donald Trump again." It urged them to save their vote for November.
It was, in fact, entirely synthetic.
The attack
A single voice clone built from publicly available Biden audio, delivered via a cheap automated-calling service. Total estimated cost: under $1,000. Reach: an estimated 5,000–25,000 phone numbers in New Hampshire.
The message was crude — it used Biden's "malarkey" catchphrase in a way real Biden speeches typically didn't — but the voice itself was convincing enough that local election officials initially had trouble confirming it was fake without forensic review.
Detection signal
Short robocall audio is among the harder deepfake surfaces: clips are brief, phone codecs compress aggressively, and the telephone channel's bandpass filtering strips much of the spectrum detection models rely on. Despite these conditions, audio deepfake detectors were able to flag the recording with high confidence once it reached investigators. The vocoder signature was characteristic of a widely available open-source voice-cloning pipeline.
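The bandpass problem is easy to see numerically. The sketch below (a toy illustration with a synthetic signal, not a real detector) measures how much energy a signal has above 4 kHz, where vocoder artifacts often live, before and after a crude brick-wall approximation of the narrowband telephone channel:

```python
# Toy illustration: the telephone channel (~300-3400 Hz passband) discards
# the high-frequency spectrum that audio-deepfake detectors often rely on.
# The signal and filter here are assumptions for demonstration, not a detector.
import numpy as np

SR = 16_000  # 16 kHz "wideband" source audio

def band_energy(x, lo_hz, hi_hz, sr=SR):
    """Fraction of total spectral energy between lo_hz and hi_hz."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1 / sr)
    band = (freqs >= lo_hz) & (freqs < hi_hz)
    return spec[band].sum() / spec.sum()

def telephone_filter(x, lo_hz=300, hi_hz=3400, sr=SR):
    """Crude brick-wall bandpass approximating a narrowband phone channel."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / sr)
    spec[(freqs < lo_hz) | (freqs > hi_hz)] = 0
    return np.fft.irfft(spec, n=len(x))

# One second of test audio: a speech-band tone plus a weak
# high-frequency component standing in for a vocoder artifact.
t = np.arange(SR) / SR
x = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 6000 * t)

before = band_energy(x, 4000, 8000)                    # artifact energy present
after = band_energy(telephone_filter(x), 4000, 8000)   # stripped by the channel
print(f"energy above 4 kHz: before={before:.3f}, after={after:.2e}")
```

After the simulated phone channel, essentially nothing above 4 kHz survives, which is why detectors that work well on studio-quality clips can struggle on call recordings.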
Regulatory consequences
The FCC ruled in February 2024 that AI-generated voices in robocalls are illegal under the existing Telephone Consumer Protection Act (TCPA). In the months that followed:
- The New Hampshire AG's office filed felony voter-suppression charges against the organizer.
- The FCC proposed a $6M fine against the organizer, with a separate penalty against the originating service provider.
- Multiple states (Texas, California, Michigan, Washington) passed or accelerated deepfake-robocall laws.
The case became the precedent that AI-generated election calls are criminally prosecutable, independent of whether they change voter behavior.
What would have prevented it
For voters: robocall-blocking apps, caller-ID analysis, and a willingness to disbelieve urgent voting instructions arriving by phone from unexpected numbers.
For carriers: STIR/SHAKEN call authentication, AI-voice detection in robocall-filtering pipelines. Several large carriers have added AI-voice flags to their fraud scoring in the year since.
For campaigns: monitoring services that scan the phone ecosystem for impersonation calls early and coordinate rapid takedowns.
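On the carrier side, STIR/SHAKEN works by attaching a signed "PASSporT" token (a JWT, per RFC 8225) to each call's SIP Identity header, carrying the originating carrier's attestation of the caller's right to the number. The sketch below decodes such a token and reads its attestation level; it is a minimal illustration with a fabricated example token, and a real verifier would also validate the ES256 signature against the certificate referenced by `x5u`:

```python
# Minimal sketch (not a full STIR/SHAKEN verifier): decode a SHAKEN PASSporT
# and read its attestation level. The example token below is fabricated for
# illustration; its signature segment is a placeholder and is not checked.
import base64
import json

def b64url_decode(part: str) -> bytes:
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def read_passport(token: str) -> dict:
    header_b64, payload_b64, _sig = token.split(".")
    header = json.loads(b64url_decode(header_b64))
    payload = json.loads(b64url_decode(payload_b64))
    if header.get("ppt") != "shaken":
        raise ValueError("not a SHAKEN PASSporT")
    # "attest": A = carrier verified the caller's right to the number,
    #           B = customer known but number unverified,
    #           C = gateway attestation only (lowest trust).
    return {"attest": payload["attest"],
            "orig": payload["orig"],
            "dest": payload["dest"]}

# Build a fake example token (hypothetical numbers and cert URL):
hdr = {"alg": "ES256", "ppt": "shaken", "typ": "passport",
       "x5u": "https://cert.example/sp.pem"}
pld = {"attest": "C", "dest": {"tn": ["16035551234"]},
       "orig": {"tn": "12025550100"}, "iat": 1705881600, "origid": "example-id"}
enc = lambda obj: base64.urlsafe_b64encode(
    json.dumps(obj).encode()).rstrip(b"=").decode()
token = f"{enc(hdr)}.{enc(pld)}.fakesig"

print(read_passport(token))  # attestation "C": the low-trust tier
```

Attestation level feeds carriers' fraud scoring: traffic arriving with C-level (gateway-only) attestation, as bulk robocall traffic typically does, can be down-ranked or flagged before it ever rings a voter's phone.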