Deepfake Law in Australia
Australia criminalized non-consensual sexual deepfakes federally in 2024. The eSafety Commissioner has takedown authority; the proposed Privacy Act reform would extend biometric-data protections to AI-generated imagery.
- Status: enacted
- Jurisdiction: Australia
- Effective: Sep 2024
- Statute: Criminal Code Amendment (Deepfake Sexual Material) Act 2024
Australia passed federal criminal legislation against non-consensual sexual deepfakes in 2024, adding to an already-active eSafety Commissioner takedown regime under the Online Safety Act 2021.
Key provisions
Criminal Code Amendment (Deepfake Sexual Material) Act 2024. Criminalized creating or sharing non-consensual sexual deepfake material depicting a person aged 18 or over. Penalties run up to six years' imprisonment, rising to seven years for aggravated cases, including where the offender also created the material or is a repeat offender. Child-sexual-abuse-material offenses already covered AI-generated content; the 2024 Act specifically addressed non-consensual material depicting adults.
Online Safety Act 2021. The eSafety Commissioner holds takedown authority widely regarded as world-leading: it can order platforms to remove content within specified timeframes, backed by financial penalties for non-compliance. Deepfake sexual material is within scope.
Privacy Act (reform pending). Draft amendments would expand biometric-data protections and clarify application to synthetic content depicting identified persons. Expected enactment 2026.
Existing fraud statutes. Commonwealth and state fraud laws apply to deepfake-enabled CEO fraud and identity impersonation.
eSafety enforcement
The eSafety Commissioner is internationally recognized for effective platform enforcement:
- Takedown notices with 24-hour compliance windows.
- Financial penalties on non-compliant platforms.
- Coordination with international counterparts (Ofcom UK, FTC US, DPAs in Europe).
The 2024 Criminal Code amendment gave eSafety additional authority specifically around deepfake sexual content.
Practical implications
For organizations operating in Australia:
- Platforms: eSafety takedown obligations are strict; compliance infrastructure must include rapid-response capability.
- Creators of deepfake content: criminal exposure for non-consensual sexual material is serious; prosecutions have begun.
- Enterprises: moderate compliance burden; eSafety regime is predictable but strict.
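For platforms building the rapid-response capability noted above, the core operational problem is tracking each takedown notice against its compliance deadline. The sketch below is purely illustrative: `TakedownNotice`, `overdue_notices`, and the 24-hour `COMPLIANCE_WINDOW` are hypothetical names and an assumed window length, not part of any eSafety API; actual deadlines should be read from the notice itself.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Assumed compliance window; real eSafety notice periods vary by
# notice type and are specified in the notice itself.
COMPLIANCE_WINDOW = timedelta(hours=24)

@dataclass
class TakedownNotice:
    """Illustrative record of a received takedown notice."""
    notice_id: str
    received_at: datetime  # timezone-aware receipt timestamp
    resolved: bool = False

    @property
    def deadline(self) -> datetime:
        # Deadline is receipt time plus the assumed compliance window.
        return self.received_at + COMPLIANCE_WINDOW

    def is_overdue(self, now: datetime) -> bool:
        # A notice is overdue if unresolved past its deadline.
        return not self.resolved and now > self.deadline

def overdue_notices(notices, now):
    """Return unresolved notices past deadline, oldest deadline first."""
    return sorted(
        (n for n in notices if n.is_overdue(now)),
        key=lambda n: n.deadline,
    )
```

A queue like this would typically feed an escalation path (on-call moderation, legal review) well before the deadline, since the financial-penalty exposure attaches at expiry, not at escalation.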