The European Commission has sent formal requests for information to YouTube, Snapchat, and TikTok, demanding more detail on how their secretive algorithms recommend content to users, as part of an inquiry into the role these recommender systems may play in amplifying harmful content.
Specifically, the commission wants YouTube and Snapchat to disclose exactly how their respective recommendation algorithms function and the parameters that guide content selection. Both platforms have also been asked to provide detailed information about the role their algorithms play in amplifying risks related to civic discourse, electoral integrity, protection of minors, and mental health—in particular around addictive behavior and content “rabbit holes.”
The commission raised concerns about how TikTok’s algorithms might be used to influence public opinion or spread disinformation, particularly during elections. TikTok has been queried on its measures to prevent manipulation by malicious actors and to mitigate risks in the context of elections, media pluralism, and civic discourse.
Under the Digital Services Act (DSA), platforms with more than 45 million monthly active users in the European Union are required to implement strict user protection measures. These large online platforms must assess the risks their systems pose to users, particularly around harmful content and user safety, and take action to mitigate those risks. Noncompliance with the DSA can result in fines of up to 6 percent of a company's global annual turnover.
YouTube, Snapchat, and TikTok are required to submit detailed responses by Nov. 15.
If they fail to comply—or if they provide incomplete or misleading information—the commission could initiate formal legal proceedings, possibly leading to penalties.
The Epoch Times has reached out to the three platforms with requests for comment on the commission’s inquiry.
In response, a spokesperson for Snapchat said in an emailed statement that the company has received the commission’s request and will cooperate “to provide the necessary information.”
A YouTube spokesperson told The Epoch Times in an emailed statement that the platform is working closely with the commission to ensure that it “appropriately complies” with the DSA. The spokesperson added that YouTube has invested in products and systems to help protect the platform’s users, “whether that’s tackling disinformation or supporting digital wellbeing and mental health.”
“Our recommendations system plays an important role in this, by making it easier for viewers to find high quality content on sensitive topics like news and health,” the spokesperson added.
The commission has previously opened formal proceedings under the DSA against several platforms, including TikTok, AliExpress, and Meta. The proceedings against TikTok are linked to several areas, including the platform's protection of minors, transparency in advertising, and data access for researchers. The investigation stems from concerns about TikTok's system design, in particular how its algorithms may foster addictive behaviors and expose users, especially minors, to harmful content.
“The safety and well-being of online users in Europe is crucial. TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users—young as well as old,” Margrethe Vestager, the European Commission’s executive vice president for A Europe Fit for the Digital Age and Competition, said in a statement.
Regarding AliExpress, the commission’s investigation focuses on multiple areas of concern, including the platform’s handling of illegal content, consumer protection, content moderation, and the transparency of its advertising and recommendation systems.
Meta faces formal proceedings focused on the possibility that its algorithms and system design could foster addictive behavior in children, leading to so-called “rabbit hole” effects. These are patterns of content consumption in which users, particularly minors, are drawn deeper into viewing similar, potentially harmful content.