eSafety Sets Deadline for Online Industry to Protect Children from Pornography

Online companies are given six months to come up with enforceable codes to prevent children from accessing graphic pornography.
A child does homework using his laptop in Havana, Cuba, on Sept. 5, 2021. (Katell Abiven/AFP via Getty Images)
Alfred Bui
7/2/2024 | Updated: 7/2/2024

Australia’s internet content regulator has given online companies six months to devise enforceable codes to prevent children from accessing graphic pornography.

In its latest effort to crack down on pornography and harmful online content, the eSafety Commission has issued notices to key industry members, asking them to submit a preliminary draft of the codes by Oct. 3.

Online companies must also submit the final codes no later than Dec. 19.

The proposed codes will cover a wide range of services, including apps, app stores, websites, search engines, social media services, and hosting services.

The new requirement also applies to internet service providers, instant messaging, SMS, chat, multi-player gaming, online dating services, and equipment providers.

eSafety said that while the codes would focus on pornography, they would also need to cover other high-impact material, such as themes of suicide, self-harm, severe illness, and disordered eating.

In addition, the codes will need to provide Australian consumers with options to manage their exposure to certain types of materials on the Internet.

eSafety Commissioner Julie Inman Grant raised the alarm that children were getting exposed to pornography at increasingly younger ages (often by accident) due to the pervasiveness of those materials in the online space.

“Our own research shows that while the average age when Australian children first encounter pornography is around 13, a third of these children are actually seeing this content younger and often by accident,” she said.

“It’s not just porn sites we are talking about here, with 60 percent of young people telling us they were exposed to pornography on social media.

“This exposure was often unintentional and happened on popular services including TikTok, Instagram, and Snapchat.”

While the commissioner acknowledged the role of parents in protecting children from harmful materials, she said the online industry also had a responsibility.

“Kids’ exposure to violent and extreme pornography is a major concern for many parents and carers, and they have a key role to play both from a protective and educative standpoint,” Ms. Inman Grant said.

“But it can’t all be on them, we also need industry to play their part by putting in some effective barriers to protect children.”

eSafety said that online companies could step up their efforts by introducing measures to verify users’ ages, implementing parental controls, and providing tools to filter or blur unwanted sexual content.

eSafety’s deadline comes just two months after the online regulator rolled out guidelines for Australian internet users on dealing with harmful online content in the aftermath of the church stabbing attack in Sydney.

After the incident, eSafety requested that social media companies remove video of the stabbing from their platforms, a move that raised concerns about freedom of expression in Australia.

However, in a parliamentary inquiry hearing in June, Ms. Inman Grant denied that eSafety was operating as an online censor.
Alfred Bui is an Australian reporter based in Melbourne and focuses on local and business news. He is a former small business owner and has two master’s degrees in business and business law. Contact him at [email protected].