A defence expert says it is still unclear how Australian authorities will be able to force the CCP-linked social media app TikTok to comply with a ban on social media use for under-16s.
The ban threatens the long-term viability of some apps, particularly TikTok, which has effectively captured the demographic born in the late 1990s and 2000s.
Yet the exact enforcement mechanism for the ban is being left up to Big Tech companies to work out themselves, a situation that Michael Shoebridge, director at Strategic Analysis Australia, says is “absurd.”
“I think the social media ban will be very difficult for TikTok, more difficult than for some of the other big social media companies, because the Australian government’s model—which we haven’t seen too much of—seems to leave compliance up to the companies themselves,” he told The Epoch Times.
This would effectively mean relying on the Chinese Communist Party (CCP) to regulate its own app for the benefit of young Australians.
“So how the Australian government will be able to satisfy itself that TikTok is doing things that are in Australia’s national interest, and not in the Chinese regime’s interest, is mystifying,” Shoebridge said.
“I think it just adds pressure on TikTok and the Australian government to face the fundamental problem of the Chinese Communist Party’s reach into that platform and its data.”
TikTok’s Impact on Youths
Earlier this month, the Standing Committee on the Canada–People’s Republic of China Relationship found promoting TikTok to teenagers was part of the CCP’s “no rules” approach to global influence.

TikTok itself has also been heavily criticised for collecting large volumes of data from users, including accessing phone cameras, microphones, contacts, and location using GPS.
In 2022, researchers at the U.S.-based Center for Countering Digital Hate set up fake accounts posing as 13-year-olds in Western nations, including Australia.
For the study, they ran two separate accounts side by side: one designed to appear more “vulnerable,” and another that interacted only with regular content.
TikTok’s algorithm was three times more likely to show the “vulnerable” accounts content about eating disorders, suicide, and self-harm.