FILE — The opening page of X is displayed on a computer and phone, Oct. 16, 2023, in Sydney. (AP Photo/Rick Rycroft, File)

MPs amend bill criminalizing sexual deepfakes to include ‘nearly nude’ images

May 11, 2026 | 2:00 AM

OTTAWA — A House of Commons committee has amended a proposed bill that would criminalize sexual deepfakes to ensure it covers “nearly nude” images.

The change to Bill C-16 comes after experts warned the original version of the bill likely would not cover many of the images created by Elon Musk’s Grok chatbot which proliferated on his X platform at the beginning of this year.

The original version of the bill would have criminalized the non-consensual sharing of images which show the subject nude, exposing their sexual organs or engaged in explicit sexual activity. The images created by Grok — such as edits of photos of women to depict them wearing see-through bikinis — may not meet that standard.

MPs on the justice committee voted in favour of amendments put forward by Conservative MP Andrew Lawton to change the wording of the legislation to address images in which the subject is nude or “nearly nude.”

Lawton told the committee the amendment was based both on witness testimony MPs heard during committee meetings and on the experience of a friend of his.

“We are seeing with the advancement of technology these very sophisticated and in some cases quite traumatizing assaults taking place. It simply just ensures that a small technicality is not excluding something that I think this law intends to capture,” he said.

Liberal MP Patricia Lattanzio, parliamentary secretary to Justice Minister Sean Fraser, said the amendment “clarifies the scope of the offence, and it aligns with evolving case law and it responds to the needs of the victims.”

The amendment was carried despite the objections of Bloc Québécois MP Rhéal Fortin, who said “nearly nude” was not defined specifically enough in the bill.

Another amendment adds a specific reference to artificial intelligence software in the definition of “intimate image.”

Lawton told the committee the idea was to make sure the bill captures AI systems.

“We’re trying to ensure that we don’t end up having to come back to the drawing board because this fails to capture the technologies that we’re dealing with here,” he said.

NDP MP Leah Gazan put forward an amendment to the same section of the bill which was not adopted. It looked to target images of women and children which — while they might not depict them fully nude — present them in sexualized or humiliating contexts, such as in transparent bathing suits or covered in blood or bruises.

She said those images amount to acts of violence and are meant to humiliate.

“We cannot leave open loopholes in this legislation that perpetrators can exploit to evade justice,” Gazan said in the committee.

The MPs also voted in favour of an amendment put forward by Lawton aimed at increasing the maximum penalty in cases where the material depicts sexual assault, and an amendment that would impose a 48-hour deadline for images to be taken down.

Lawton said the 48-hour requirement would place the responsibility on tech companies, though a departmental expert told MPs it was not clear whether the amendment would result in images being taken down more quickly or more slowly.

Rosel Kim, senior staff lawyer at the Women’s Legal Education and Action Fund, said in an email that “we are glad that the committee considered the wide-ranging harms that arise out of deepfakes.”

But she noted there is still a lack of clarity about the part of the legislation’s definition that says a deepfake must be likely to be mistaken for “a visual recording of that person.”

“Who decides it is likely to be mistaken as a visual recording — is it the survivor or law enforcement?” Kim asked.

She added that while “we appreciate the clarification on the takedown time frame, we would like to ensure there are adequate resources to ensure it can be a reality.”

The committee last week wrapped up its clause-by-clause process, through which amendments to a bill are made. The bill must make its way through the Senate before it becomes law.

Artificial Intelligence Minister Evan Solomon said Monday Bill C-16 is part of a wider government effort to target deepfakes and non-consensual image sharing, indicating the issue could be addressed in upcoming privacy and online harms legislation.

“We believe that non-consensual sharing of sexualized imagery is violence towards women and we are going to do our job to protect citizens, vulnerable Canadians,” Solomon said at an announcement in B.C.

Deepfakes are only one element of Bill C-16, which would also criminalize coercive control and restore all mandatory minimum imprisonment penalties found unconstitutional by the courts, among other measures.

It’s part of a trio of tough-on-crime bills put forward by the Liberal government that would introduce a long list of changes to the Criminal Code. The bills have yet to be passed by Parliament.

— With files from Chuck Chiang in Vancouver

This report by The Canadian Press was first published May 11, 2026.

Anja Karadeglija, The Canadian Press