Social Media Giants Earned Nearly $17 Billion from Minors: Senate Inquiry Hears

A Harvard professor says social media platforms will not be motivated to ban children from accessing their platforms.
Crystal-Rose Jones
New South Wales Liberal Senator Dave Sharma has raised the question of how much advertising revenue tech companies earn from children under 18.

The issue was raised on Nov. 25 during the Inquiry Into Online Safety Amendment (Social Media Minimum Age) Bill 2024—Provisions. The Bill will restrict those under 16 from accessing social media, barring a few exceptions.

Sharma referenced a Harvard University study from the U.S. that found the largest social media platforms—Facebook, Instagram, Snapchat, TikTok, X, and YouTube—made a total of $16.8 billion (US$11 billion) from underage users in 2022.

Bryn Austin, professor in the Department of Social and Behavioral Sciences at Harvard, noted from the research that tech companies had “overwhelming financial incentives to continue to delay taking meaningful steps to protect children.”

The lead author of the Harvard research, Amanda Raffoul, said that social media companies generate “substantial advertising revenue” from young people.

Following the study, which used several measures to calculate the value of children on social media, she called for more transparency and government regulations to ensure the well-being of young people.

Speaking at the Senate inquiry, fellow Liberal Senator Sarah Henderson noted that the Chinese-owned app TikTok had been explicitly designed to be addictive to children.

Sunita Bose, the managing director of Digital Industry Group, was quizzed by Sharma on the net value of Australian child users to the most prominent social media companies in the nation.

“I can’t speak for individual companies in terms of that information,” she responded.

When asked if it was her job to stop legislation preventing children from using social media due to their inherent financial value to the industry, Bose said, “I respectfully disagree with the characterisation of our industry.”

Bose was again asked for specific data on under-13s and under-16s, to which she replied she did not have the information.

She repeatedly spoke of the need to balance keeping children off social media with not infringing upon users’ privacy.

Social media platforms could be beneficial to young people, she said, warning that the exodus of children from popular social media platforms could lead to more dangerous and less regulated online spaces.

Some senators questioned why the industry had wanted more time to implement regulations when, in 2021, TikTok had removed seven million users it suspected were under the age of 13.

“Why is a pause needed if some are already doing the work?” Labor Senator Lisa Darmanin asked.

Bose again turned to the issue of age regulation, saying a trial of age assurance had not yet been completed.

When Bose mentioned that children should also be taught how to navigate online spaces, Tasmanian Jacqui Lambie Network Senator Jacqui Lambie said she felt tech companies were offloading their responsibilities.

“If you really [care] about our kids, why don’t you just put in place the algorithms that'll fix this?” she said.

“Why don’t you do the right thing and fix it yourselves?”

Meanwhile, People First Party Senator Gerard Rennick expressed concerns that the ban could amount to excessive government interference in online life.

On the Fence

Speaking to ABC Radio National, Independent Senator David Pocock said he was still undecided about whether to support the bill.

“I’m keen to see how they propose it’s actually going to work. Again, I really support this in principle. This is something that we have to confront as a society,” he said.

He also noted that it seemed to be “policy on the run.”

“This is a big problem, but we actually need an ecosystem approach,” he said.

“We need a ban to go hand-in-hand with the digital duty of care ... and what I’ve seen in the misinformation and disinformation bill, which has now been shelved, is an unwillingness to actually tackle some of the root causes of social media harm, and that is around the algorithm, where these big social media companies don’t want us to know how they’re actually running their businesses.”

Pocock said he didn’t believe the three-day inquiry was enough to study the full implications of social media issues.

Crystal-Rose Jones
Author
Crystal-Rose Jones is a reporter based in Australia. She previously worked at News Corp for 16 years as a senior journalist and editor.