Australia Passes Laws to Criminalise the Creation, Spread of Explicit Deepfakes

Offenders making and sharing sexually explicit deepfake material could spend up to seven years behind bars as new legislation is passed in Parliament.
A computer user is silhouetted with a row of computer monitors at an Internet cafe in Shenyang, Liaoning on Jan. 28, 2008. AP Photo
Crystal-Rose Jones

People who make and share sexually explicit deepfake content now face imprisonment, after new laws passed Australia’s federal parliament.

The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 was introduced in June and passed on Aug. 21.

The bill will impose serious criminal penalties on those using artificial intelligence software and apps to generate fake pornographic content using a victim’s likeness or superimposing their face on explicit material.

On introducing the bill, Australia’s Attorney-General Mark Dreyfus said women and girls were often targeted and degraded by such material.

The bill will strengthen existing Commonwealth Criminal Code offences and introduce a new aggravated criminal offence for sharing such content.

“These offences will be subject to serious criminal penalties of up to six years’ imprisonment for sharing of non-consensual deepfake sexually explicit material,” Dreyfus said in a statement.

“Where the person also created the deepfake that is shared without consent, there is an aggravated offence which carries a higher penalty of seven years’ imprisonment.”

The law will also cover real images shared without consent.

“The new criminal offences are based on a consent model to better cover both artificial and real sexual material,” Labor Senator Murray Watt told the Senate on Aug. 21.

In response, Shadow Attorney-General Michaelia Cash expressed concern about parts of the bill, including the possibility that victims could face cross-examination in court.

The deepfake legislation builds on other actions taken by the government to curb cyber bullying and harm.

These include increased funding for the eSafety commissioner, reviewing the Online Safety Act a year ahead of schedule, and committing to address practices such as doxxing.

Increased technological capabilities have led to an explosion in deepfake images online, many of them sexually explicit.

A parliamentary inquiry found 90 to 95 percent of deepfakes involved non-consensual pornography, and 99 percent of victims were women.

eSafety Commissioner Julie Inman Grant said apps allowing the creation of such content were becoming common.

South Australian Senator Kerrynne Liddle told Parliament a deepfake could be created in as little as two seconds.

“Deepfake imagery can be career-harming for adults, but when used by criminals against our children the consequences can be and have been deadly,” she said.

“Australia’s eSafety commissioner has estimated deepfake imagery has soared by as much as 550 per cent year on year since 2019.”

Liddle said that as shadow spokesperson for child protection, she had heard cases involving criminals extorting money with deepfakes.

“Research released in June found one in seven adults, or 14 percent, has had somebody threaten to share their intimate images,” she said.

“And in a worldwide survey involving Australia that found more than 70 percent of people don’t know what a deepfake is, then we need to do more to educate Australians.

“We need to ensure that young people understand how wrong it is and the harm it causes. We must do more to ensure children do not become victims or, indeed, perpetrators.”

Crystal-Rose Jones
Crystal-Rose Jones is a reporter based in Australia. She previously worked at News Corp for 16 years as a senior journalist and editor.