The Australian Parliament will continue to put pressure on Big Tech companies by launching an inquiry into “toxic material” on social media platforms.
Communications Minister Paul Fletcher said revelations by Facebook whistle-blower Frances Haugen “amplified existing concerns” in the community.

“This inquiry will give organisations and individuals an opportunity to air their concerns, and for Big Tech to account for its own conduct,” he said.

“We also want to hear from Australians; parents, teachers, athletes, small businesses and more, about their experience, and what needs to change.”
Haugen has disclosed tens of thousands of internal Facebook documents, including internal research indicating that the platform was unhealthy for minors.
David Coleman, the assistant minister to the prime minister for mental health, said that over the years there had been a steady increase in mental health issues among young Australians.
“While the reasons for this are varied and complex, we know that social media is part of the problem,” he said.
Coleman pointed to a 2018 Headspace survey of 4,000 individuals aged 12 to 25, in which respondents nominated social media as the main reason mental health issues among young people were getting worse.
The government’s moves come after the High Court of Australia handed down a ruling in September that social media companies were not responsible for defamatory comments on their platforms.

Under the government’s proposed anti-trolling laws, however, Big Tech companies could avoid accountability for such comments if they reveal the identity of the individuals who posted them.

“In a free society such as Australia, where we value our free speech, it is only free when that is balanced with the responsibility for what you say,” Prime Minister Scott Morrison said.

“Free speech is not being allowed to cowardly hide in your basement and sledge and slur and harass people anonymously and seek to destroy their lives.”
Concerns have been raised, however, that targeting online “trolls” is not the most effective way to deal with toxic content online.
“The most pressing problem here is not trolls; it is the disproportionate reach of their content enabled by the algorithms of social media companies that prioritise sensational, outrageous and conspiratorial content—the form which defamatory content usually takes,” Chris Cooper of Reset Australia said in a statement.