UK Schools Told to Remove Children's Photos as Criminals Use AI to Create Explicit Images
U.K. schools have been advised to remove student photos from their websites after blackmailers used AI to create explicit images and demanded money to prevent their publication.
- Incident date: May 2026
- Target: Unnamed U.K. secondary school
U.K. schools have been advised to remove students’ photos from their websites because criminals are using AI to create sexually explicit images. Child safety experts and the U.K.’s National Crime Agency (NCA) have warned that schools are facing blackmail threats.
What happened
An unnamed U.K. secondary school was recently targeted in a blackmail attempt. Criminals took photos of students from the school's website or social media accounts and used AI tools to create child sexual abuse material (CSAM). The offenders then sent the images to the school and threatened to release them online unless they were paid. The Internet Watch Foundation (IWF) says that 150 images linked to the blackmail attempt could be classified as CSAM under U.K. law. Citing what they describe as an emerging threat, child safety experts are now urging schools to remove pictures showing pupils' faces from websites and social media.