Australian Authorities Fine X $610,000 for Failing to Tackle Child Sexual Abuse Materials

‘X has stated publicly that tackling child sexual exploitation is the number one priority, but it can’t just be empty talk,’ said eSafety Commissioner.
The new Twitter logo, rebranded as X, and the old Twitter bird logo are reflected in smartphone screens in Paris, France, on July 27, 2023. Joel Saget/AFP via Getty Images
Alfred Bui

Social media giant X, formerly known as Twitter, has been issued with a $610,500 (US$385,000) fine for failing to eradicate child sexual exploitation materials on its platform.

The Australian eSafety Commissioner (eSafety) has released a report (pdf) revealing how major social media companies responded to the federal government's February notices requiring them to explain how they are cracking down on child sexual abuse materials (CSAM).

On Feb. 22, eSafety issued a set of notices to Google, X, TikTok, Discord, and Twitch to better understand the measures these companies had taken to address the growing risk of child sexual exploitation and abuse.

The commissioner later found that X and Google did not comply with the notices, and considered X’s non-compliance to be more serious.

It then issued X with an infringement notice for $610,500 and gave the company 28 days to request its withdrawal or pay the penalty.

“This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion, and we need them all to do better,” eSafety Commissioner Julie Inman Grant said.
“What we are talking about here are serious crimes playing out on these platforms committed by predatory adults against innocent children, and the community expects every tech company to be taking meaningful action.”

X’s and Google’s Non-Compliance

According to eSafety, X did not answer some questions in the notice and provided incomplete or inaccurate responses to others.

For example, the company did not explain how long it takes to respond to reports of child sexual exploitation, what measures it had implemented to detect child sexual exploitation in livestreams, or which tools and technologies it used to detect such abusive materials.

In addition, X did not adequately disclose how many safety and public policy staff remained employed after tech billionaire Elon Musk acquired the platform in October 2022 and implemented several rounds of job cuts.

A photo illustration of the new Twitter logo in London, England, on July 24, 2023. Dan Kitwood/Getty Images

eSafety found that in the three months after X's acquisition, the proactive detection of CSAM on the platform plunged from 90 percent to 75 percent.

The company later said it had improved the proactive detection rate in the first quarter of 2023 but did not provide specific details.

Meanwhile, Google was issued a formal warning for its non-compliance with the notice.

Specifically, eSafety said Google provided generic responses and aggregated information to many specific questions in the notice.

At the same time, it found that the company did not implement technologies to block, detect, or prevent CSAM on several of its services (Google Chat, Gmail, Meet, and Messages).

A teenage child looks at a screen of age-restricted content on a laptop screen in London, England, on Jan. 17, 2023. Leon Neal/Getty Images

Ms. Inman Grant said it was disappointing to see X and Google fail to comply with the government’s direction on the issue.

“Twitter/X has stated publicly that tackling child sexual exploitation is the number one priority for the company, but it can’t just be empty talk. We need to see words backed up with tangible action,” she said.

“If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation, they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinise their own operations.

“Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”

The State of Child Sexual Abuse in Australia

According to the Australian Bureau of Statistics’ latest Personal Safety Survey, 11 percent of Australian women (1.1 million) experienced sexual abuse during childhood, while the figure for men was 3.6 percent (343,500).
Furthermore, analysis of Recorded Crime–Victims data showed that among 144,797 victims of sexual assault between 2014 and 2019, 83 percent were female, while 63 percent were under 18 years old.

It is also worth noting that fewer than half of the victims reported the incident to police within a week of it occurring.

Meanwhile, there has been a significant increase in online child sexual exploitation cases in Australia.

In 2021, the Australian Centre to Counter Child Exploitation (ACCCE) received more than 33,000 reports of online child sexual exploitation, up from 21,140 in 2020 and 13,368 in 2019.

Over 217,000 media files related to reports of child sexual abuse and exploitation were also submitted to the ACCCE during the period.

The Australian Federal Police charged 237 individuals with 2,032 alleged child abuse-related offences and removed 114 children from harm.

Alfred Bui
Author
Alfred Bui is an Australian reporter based in Melbourne, focusing on local and business news. He is a former small business owner and holds two master's degrees in business and business law. Contact him at [email protected].