Australia Orders Tech Giants Apple, Microsoft, Snap and Meta to Step Up Actions Against Child Abuse Material

Three screens display the Meta splash page on the Facebook website in London, England, on Oct. 29, 2021. Leon Neal/Getty Images
Alfred Bui

Australian authorities have ordered global tech giants to report on the actions they have taken to stop the spread of child sexual exploitation materials on their platforms and will impose penalties on non-compliant companies.

Among the companies receiving legal notices (pdf) from the eSafety Commissioner are Apple, Microsoft, Snap, Omegle and Meta, which owns WhatsApp, Facebook and Instagram.

The move is in accordance with the Online Safety Act 2021, which eSafety Commissioner Julie Inman Grant described as a “world-leading tool.”

The act includes basic online safety expectations detailing minimum requirements that tech companies have to meet if they want to operate in Australia.

For instance, online service providers are expected to proactively minimise harmful materials or activities on their platforms. And if they use encryption, they need to develop and implement processes to detect and address child abuse materials.

“They (the expectations) will help us ‘lift the hood’ on what companies are doing, and are not doing, to protect their users from harm,” Grant said in a statement.

“As more companies move towards encrypted messaging services and deploy features like live streaming, the fear is that this horrific material will spread unchecked on these platforms.”

A logo sits illuminated outside the Microsoft booth at the GSMA Mobile World Congress in Barcelona, Spain, on Feb. 28, 2022. David Ramos/Getty Images

The eSafety Commissioner said the notices were part of an information-gathering process.

It also outlined the factors considered when choosing which providers to send notices to, including the number of complaints the commissioner had received, the reach of the online services, and deficiencies in a provider’s safety practices and terms of use.

Regarding penalties, tech companies could be fined up to $555,000 (around US$380,000) daily if they do not respond to the notices within 28 days.

Tech Companies’ Responses to the Notices

After receiving the notices, Microsoft said it would respond, while Meta said it was reviewing the details.

“The safety of our users is a top priority, and we continue to proactively engage with the eSafety Commissioner on these important issues,” a Meta spokesperson said in a statement obtained by AAP.

Since 2015, the eSafety Commissioner has processed over 61,000 complaints about illegal and restricted content, most of which involved child sexual exploitation materials.

In addition, a February report by the commissioner found that 11 percent of Australian teenagers between 14 and 17 years old had been asked by strangers on the internet to share a sexual image of themselves.

The eSafety regulator also noted a significant increase in reports of child abuse material since the start of the COVID-19 pandemic, as children had greater access to the internet during that period.

Alfred Bui
Alfred Bui is an Australian reporter based in Melbourne and focuses on local and business news. He is a former small business owner and has two master’s degrees in business and business law. Contact him at [email protected].