European Proposals Fall Short in Combating Nonconsensual Depictions in Deepfake Pornography
In the digital age, the rise of deepfake technology poses a significant threat, particularly in the form of nonconsensual pornography. EU policymakers are seeking ways to better protect victims of this disturbing trend.
The EU AI Act, set to enter into force on 1 August 2024, mandates transparency for AI-generated content, including deepfakes, by requiring clear disclosure that the content is artificially created or manipulated. This helps reduce misinformation and empowers users to recognize synthetic media. However, the EU currently lacks specific criminal laws that penalize the nonconsensual creation and distribution of deepfake pornography.
Learning from Denmark’s example, EU policymakers could amend copyright and personal rights laws to explicitly give people control over their own likenesses, voices, and digital representations, criminalizing nonconsensual use and mandating takedown of such content. Introducing penalties such as fines or imprisonment and imposing rapid takedown obligations on platforms could strengthen enforcement.
Before new laws are in place, the EU could promote codes of conduct or ethical guidelines for AI developers and platform providers. These guidelines could encourage voluntary best practices such as mandatory labeling or watermarking of synthetic or manipulated intimate content, robust and rapid notice-and-takedown procedures for nonconsensual deepfake pornography, and user education initiatives to raise awareness about deepfake harms and risks.
Support for self-regulation and industry collaboration would serve as practical interim protection, where platforms adopt consistent standards to detect and remove abusive deepfake content quickly. Policymakers can also fund and encourage technological tools that detect deepfakes and assist victims in reporting and removing content.
Europol recommends that the EU invest in deepfake detection systems. However, law enforcement AI tools used for deepfake detection should be exempted from the AI Act's compliance requirements in order to encourage the development and use of such tools.
Article 7b of the proposed bill covers material that users produce or manipulate to make it appear as though another person is engaged in sexual activities. However, the bill currently covers neither nudity that is not explicitly sexual nor sexual imagery that does not involve full nudity.
In summary, a multi-pronged approach involving clear legal prohibitions, transparency requirements, platform accountability, and soft law codes is essential to effectively protect victims of nonconsensual deepfake pornography in the EU. Meanwhile, soft law mechanisms like codes of conduct and voluntary industry standards can offer immediate, though limited, protection until robust legislation is adopted and enforced.
- The EU AI Act, which enters into force on 1 August 2024, mandates transparency for AI-generated content, including deepfakes, through clear disclosure of artificial creation or manipulation, aiming to combat misinformation and empower users.
- To protect victims of nonconsensual deepfake pornography more effectively, EU policymakers could draw inspiration from Denmark's approach, amending copyright and personal rights laws to outlaw nonconsensual use and mandate takedown of such content.
- As an interim solution, the EU could advocate for codes of conduct or ethical guidelines for AI developers and platform providers, encouraging voluntary best practices in deepfake detection and removal, user education, and notice-and-takedown procedures.
- Europol advises the EU to invest in deepfake detection systems, yet law enforcement AI tools used for deepfake detection should be exempted from the AI Act's compliance requirements to foster the development and use of these tools.
- Article 7b of the proposed bill covers material manipulated to make it seem as though another person is engaged in sexual activities, but the bill does not currently address nudity that is not explicitly sexual or sexual imagery that does not involve full nudity.