EU Targets Algorithms of YouTube, TikTok, Snapchat Amid Concerns of Harmful Content

The European Commission has asked the platforms to disclose how their algorithms recommend content to users as part of an inquiry.
The TikTok app on a phone in New York City in this photo illustration on March 13, 2024. Michael M. Santiago/Getty Images
Tom Ozimek

The European Commission has sent official requests to YouTube, Snapchat, and TikTok, demanding more information on how their secretive algorithms recommend content to users amid an inquiry into the role these recommender systems may play in amplifying content considered harmful.

In a statement released on Oct. 2, the commission said the inquiry, launched under the European Union’s Digital Services Act (DSA), seeks to assess the potential risks associated with the platforms’ recommendation algorithms. It said the risks include negative effects on users’ mental health and the spread of harmful content, which can result from the way these algorithms are designed to maximize engagement.

“YouTube and Snapchat must detail their algorithms’ parameters and risk amplification,” the commission said. “TikTok must explain measures against manipulation and risk mitigation.”

Specifically, the commission wants YouTube and Snapchat to disclose exactly how their respective recommendation algorithms function and the parameters that guide content selection. Both platforms have also been asked to provide detailed information about the role their algorithms play in amplifying risks related to civic discourse, electoral integrity, protection of minors, and mental health—in particular around addictive behavior and content “rabbit holes.”

The commission raised concerns about how TikTok’s algorithms might be used to influence public opinion or spread disinformation, particularly during elections. TikTok has been queried on its measures to prevent manipulation by malicious actors and to mitigate risks in the context of elections, media pluralism, and civic discourse.

Under the DSA, platforms with more than 45 million monthly active users in the European Union are required to implement strict user protection measures. As part of these measures, these large online platforms must assess the risks their systems pose to users, particularly around harmful content and user safety. They must also take action to mitigate these risks; noncompliance with the DSA can result in significant fines.

YouTube, Snapchat, and TikTok are required to submit detailed responses by Nov. 15.

If they fail to comply—or if they provide incomplete or misleading information—the commission could initiate formal legal proceedings, possibly leading to penalties.

The Epoch Times has reached out to the three platforms with requests for comment on the commission’s inquiry.

In response, a spokesperson for Snapchat said in an emailed statement that the company has received the commission’s request and will cooperate “to provide the necessary information.”

A YouTube spokesperson told The Epoch Times in an emailed statement that the platform is working closely with the commission to ensure that it “appropriately complies” with the DSA. The spokesperson added that YouTube has invested in products and systems to help protect the platform’s users, “whether that’s tackling disinformation or supporting digital wellbeing and mental health.”

“Our recommendations system plays an important role in this, by making it easier for viewers to find high quality content on sensitive topics like news and health,” the spokesperson added.

The commission’s latest move is part of a broader effort to regulate different aspects of Big Tech in the European Union. Previous noncompliance proceedings have already been initiated against platforms such as Meta’s Facebook and Instagram, AliExpress, and TikTok over the way their respective recommender systems function. These actions reflect the European Union’s increasing focus on the ways in which digital platforms influence public discourse and user behavior.

The formal proceedings against TikTok are linked to several areas, including the platform’s protection of minors, transparency in advertising, and data access for researchers. The investigation stems from concerns about TikTok’s system design, particularly how its algorithms may foster addictive behaviors and expose users, especially minors, to harmful content.

“The safety and well-being of online users in Europe is crucial. TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users—young as well as old,” Margrethe Vestager, executive vice president for A Europe Fit for the Digital Age and Competition at the European Commission, said in a statement.

Regarding AliExpress, the commission’s investigation focuses on multiple areas of concern, including the platform’s handling of illegal content, consumer protection, content moderation, and transparency in advertising and in its recommendation algorithms.

Meta faces formal proceedings focused on the possibility that its algorithms and system design could foster addictive behavior in children, leading to so-called “rabbit hole” effects. These are patterns of content consumption in which users, particularly minors, are drawn deeper into viewing similar, potentially harmful content.

Update: This article has been updated with comments from Snapchat and YouTube.
Tom Ozimek
Reporter
Tom Ozimek is a senior reporter for The Epoch Times. He has a broad background in journalism, deposit insurance, marketing and communications, and adult education.