Australia Mandates Apple, Google to Stop Terror and Child Sexual Abuse on Their Platforms

‘There is, of course, no Australian internet, so these standards will require changes by companies no matter where they are headquartered,’ Inman Grant said.
A teenager uses her mobile phone to access social media in New York City, on Jan. 31, 2024. Spencer Platt/Getty Images
Monica O’Shea

Global tech giants will be forced to tackle child sexual abuse material and pro-terror content on their platforms under two new standards to be enforced by Australia’s eSafety Commissioner.

From Dec. 22, under the “world-first” standards, tech companies Apple, Google, Microsoft, and Meta will be required to take “meaningful steps” to stop this harmful content from being stored, distributed, or generated.

In what eSafety Commissioner Julie Inman Grant described as a world-first, “nudify apps” and AI tools used to create pornography are also explicitly targeted under the standards.

Inman Grant said the standards were a significant step forward in the battle to protect children online and could have global implications for tech firms.

“Today marks a very important day in the global fight to protect children online and Australia is taking a leading role,” she said.

“These standards will be enforceable and require industry to take meaningful steps to prevent their platforms and services from being used to solicit, generate, store, and distribute the most reprehensible and harmful online material imaginable, including child sexual abuse.”

The new standards are known as the Designated Internet Services (DIS) and Relevant Electronic Services (RES) standards.

They cover file and photo storage services including Apple iCloud, Google Drive, and Microsoft OneDrive, along with chat and messaging services like Meta’s WhatsApp.

Inman Grant said the companies that own and operate these services must take responsibility and deter their “misuse.”

“There is, of course, no Australian internet, so these standards will require changes by companies no matter where they are headquartered.

“We know cloud-based file and photo storage services are often used by those who store and share child sexual abuse material and we also know many popular messaging services are also used by these predators to distribute this material, too.”

How Did This Power Come About?

The eSafety Commissioner has the power to create and enforce these standards under authority granted by the Online Safety Act 2021.

The Act provides the commissioner with the tools to set rules for online safety and to hold tech companies accountable for harmful content.

“The two standards, developed by the eSafety Commissioner and registered with the Australian Parliament in June, reached the end of their Parliamentary Disallowance period on Nov. 18 and will come into force on Dec. 22,” the eSafety Commissioner said.

As no motion to disallow the standards was passed in Parliament, they will become legally binding when they take effect on Dec. 22.

Companies that fail to comply with directions from the eSafety Commissioner under the Online Safety Act can face hefty fines of nearly AU$800,000 (US$498,000) per day (pdf).

Privacy Concerns Raised

Apple raised privacy concerns about the draft standards during the public consultation, while acknowledging issues with the proliferation of abhorrent child sexual abuse material and pro-terror content.

“We are committed to keeping our users safe and we will continue to invest in technologies that do just that. Yet, we have serious concerns that the draft standards pose grave risks to the privacy and security of our users and set a dangerous global precedent,” the tech giant said in its submission (pdf).

It raised concerns that eSafety could require providers to build backdoors into end-to-end encrypted services to monitor data they were currently unable to access.

“We believe there are alternative ways to achieve the goal of combatting abhorrent content that do not require undermining the privacy and security of all Australians and we urge eSafety to allow providers flexibility to pursue those means.”

Extension Granted on Online Pornography Codes

Meanwhile, the eSafety Commissioner also revealed it had granted the tech industry an extension on draft industry codes designed to protect children from online pornography and other high-impact content.

The commissioner explained the extension was granted to give industry time to consider the new social media ban for youths under 16 years old.

“While much of the public focus has been on the new social media age restrictions legislation, this upcoming regulation is just one interconnecting element of a holistic approach to keeping children safe online which includes continued digital literacy for children and empowerment of parents,” Inman Grant added.

“Our codes and standards, our transparency powers and the age assurance trial currently underway can all support social media age restrictions to provide an umbrella of protection for children and young people.”

The eSafety Commissioner is working closely with the government to implement the under-16s social media ban, which will come into effect within 12 months.

Monica O’Shea
Monica O’Shea is a reporter based in Australia. She previously worked as a reporter for Motley Fool Australia, Daily Mail Australia, and Fairfax Regional Media.