Deepfake Intimate Images Could Target Anyone, Not Just Celebrities, MPs Told

Academic and legal professionals told the Commons heritage committee of their concerns about the Online Harms Act (Bill C-63). Sergey Zolkin/Unsplash.com
Chandra Philip

Deepfake intimate images are a concern not just for celebrities and public figures but for everyone, the House of Commons Standing Committee on Canadian Heritage heard during a meeting examining the harms of access to explicit material online.

Six academic and legal professionals spoke at the June 13 committee meeting and raised concerns about the accessibility of artificial intelligence (AI) tools used to create fake intimate material, in light of the Online Harms Bill (Bill C-63).

“Research shows that this is really catching fire and become an issue in schools,” McGill Law School graduate Shona Moreau told the committee. “In December in Winnipeg, we heard that there was one school where almost 40 young girls were victimized by this technology, and that is a great number. So that’s one story. And I’m sure that there are many more such stories everywhere.”

Ms. Moreau’s colleague and fellow McGill Law School graduate Chloe Rourke said digital platforms need to be held responsible for the circulation of such content.

“Tech platforms such as Google and pornography websites have already created procedures that allow individuals to request non-consensual porn of themselves be removed and delisted from their websites,” she said. “This is not a perfect solution. Once the content is distributed publicly, it can never be fully removed from the internet, but it is possible to make it less visible and therefore less harmful.”

However, Ms. Rourke raised concerns over the ease of creating fake content with the rise of AI tools.

“If you type in ‘deep nude’ into Google, the first results will get 10 different websites you can go access and it can be done in minutes. It’s possible to make it less visible and less accessible than it is now,” she said.

She said the ease with which deepfake content can be created anonymously means changes to criminal law won’t be effective.

“I think it’s pretty unnerving just how easy and how accessible it is,” she said. “I think that’s why we’re seeing, you know, teenagers use it, and that’s why a criminal remedy would just be inadequate, or even civil remedies are just inadequate considering how accessible this is.”

Rather, Ms. Rourke told the committee that digital platforms should be held accountable for making the content less accessible.

Ms. Moreau said legislators need to think about how the technology could change in the future, and whether laws put in place now will be effective.

“AI is not going away and more work is going to be coming down the pipeline,” she said. “When we’re making legislation now, we have to actually be looking five to 10 years, even sometimes 25 years out.”

Monique St. Germain, general counsel for the Canadian Centre for Child Protection, told the committee that criminal law cannot be the only tool to battle the problem, and she described the impact such content has on children who are exposed to it.

“It can normalize harmful sexual acts, lead to distorted beliefs about the sexual availability of children, and increase aggressive behaviour,” she said. “More sexual violence is occurring among children, and more children are mimicking adult predatory behaviour, bringing them into the criminal justice system.”

Bill C-63

Bill C-63 was introduced by the federal government in February to regulate internet content involving sexual exploitation, bullying, deepfakes, and “hateful conduct.”

Besides provisions to protect children, such as imposing large fines on platforms that fail to make content exploiting children inaccessible, the bill includes other measures: a new standalone hate crime offence that applies to all existing offences, a provision addressing “fear” that someone may commit a hate crime in the future, and increased penalties for hate crimes. It also allows people to file complaints against others for posting “hate speech” online.

The Liberal government says the bill is needed to protect children and combat harms from content online.

The Conservatives have been critical of some provisions in the bill that they say may infringe on Charter rights, while the Bloc Québécois have proposed splitting the bill in two, with the parts about child pornography and the publication of non-consensual pornographic material being fast-tracked and the more controversial aspects receiving further debate.

The NDP have said the bill should have been proposed sooner, and that some parts of it need to be strengthened to add more protection.

Justice Minister Arif Virani has said he is open to amending the bill.
Matthew Horwood contributed to this report.