Detect Deepfakes by Resemble AI
Deepfake case study · Image

Why London-area MP Andrew Lawton pushed AI deepfake amendments - London Free Press

London-area MP Andrew Lawton pushed for amendments to Bill C-16 to close gaps in the law around sexual deepfakes and strengthen protections for victims, after a friend of his was victimized.

Incident date
May 2026
Target
Andrew Lawton's friend
Updated May 15, 2026 · 1 min read

In May 2026, London-area MP Andrew Lawton pursued amendments to Bill C-16, the Protecting Victims Act, to target gaps in the law around sexual deepfakes. The amendments aim to strengthen protections for victims of AI-generated sexual content.

What happened

Lawton's decision to pursue these amendments was prompted by a personal experience: a friend of his was victimized when AI technology was used to alter a photo of her, generating an image that removed most of her clothing. The incident highlighted how AI can be used to create sexually explicit images of individuals without their consent. One amendment would require social media companies to remove illegal images from their platforms within 48 hours. Another would expand the legal definition of such images to include “nearly nude” images.
