A draft report by Green MEP Sergey Lagodinsky proposes amending the European Union’s AI Omnibus proposal to prohibit artificial intelligence applications that generate nonconsensual sexual content. The German lawmaker, who leads the file for the legal affairs committee, published the draft Monday.
The document includes an amendment to the AI Act that would create “an explicit ban on AI systems that create sexualised content, such as fake nude photos, without the consent of those affected, or that promote sexual and gender-based violence,” according to Lagodinsky.
The proposal follows controversy in January, when X’s integrated chatbot Grok was used to create nude images of millions of women and some minors. The European Commission is now investigating X and Grok for potential violations of the Digital Services Act, though critics said the response came too late.
The amendment would add nudification to Article 5 of the AI Act, which outlines prohibited AI practices in the EU and took full effect in February 2025. Because the article is already in force, any new ban would “take immediate effect and be directly applicable across all member states” if included in the final legislative text, Lagodinsky said.
The AI Omnibus was proposed by the European Commission in November as part of efforts to simplify regulatory requirements for AI in the EU. Parliament’s justification for adding the amendment is that the Commission was required under the AI Act to review the list of prohibited AI practices annually but failed to do so during the last assessment period.
Irish Renew MEP Michael McNamara, who leads the AI Omnibus file for the Civil Liberties, Justice and Home Affairs Committee, has also expressed interest in classifying non-consensual generation of intimate images as a prohibited practice. Spanish S&D MEP José Cepeda said he “fully support[s] the proposal put forward by the rapporteur to explicitly ban the creation and use of non-consensual sexualised images under the AI Act.”
The legal affairs committee will vote on the draft report Feb. 24.