The UK’s data protection watchdog has launched an investigation into TikTok, Reddit, and Imgur over their use of children’s personal information, amid concerns that their algorithms may expose young users to inappropriate or harmful content.
The Information Commissioner’s Office (ICO) said on Monday it will investigate whether any of these companies are breaking data protection law.
The ICO will review how TikTok uses personal information of 13–17-year-olds to make recommendations to them and deliver suggested content to their feeds.
Reddit, which was the fastest-growing large social media platform in the UK last year, will be investigated alongside the image-sharing website Imgur over their use of age assurance measures to tailor content appropriately for child users.
John Edwards, UK information commissioner, said that children’s information rights are a priority area for the regulator.
“We welcome the technology and innovation that companies like social media bring to the UK and want them to thrive in our economy. But this cannot be at the expense of children’s privacy.
“My message is simple. If social media and video sharing platforms want to benefit from operating in the UK, they must comply with data protection law.”
TikTok
The ICO chose the platforms for investigation based on their growth among young users, market dominance, and potential risks. Edwards clarified that TikTok was not being singled out but was chosen to help understand broader social media trends.
“The selection was made based on the direction of travel in relation to growth among young users, market dominance, and potential for harm.
“But the underlying technology is what’s interesting to us and that’s present in X, it’s present in [Instagram’s] Reels, it’s present in Snapchat, it’s there across the board on digital platforms. Now, they’re all competing for attention and eyeballs, and so they’re using techniques to maximise those,” he said.
A TikTok spokesperson said the company is committed to creating a safe experience for young users.
“Our recommender systems are designed and operate under strict and comprehensive measures that protect the privacy and safety of teens, including industry-leading safety features and robust restrictions on the content allowed in teens’ feeds,” the spokesperson added.

Children’s Code
The ICO’s Children’s Code, introduced in 2021, requires apps, online games, and social media sites to protect children’s personal information. By default, profiling must be turned off unless there is a valid reason to enable it.
If profiling is used, platforms must have safeguards in place to prevent harms such as exposing children to harmful content.
In August 2024, the ICO contacted five platforms about their default privacy settings. In response, Dailymotion and Twitch improved their approach to protect children’s privacy.
The ICO is still assessing the messaging platform Discord and the educational platform Frog.
Meanwhile, Sendit, an anonymous Q&A app, has stopped automatically adding location data to user profiles. It has also introduced new in-app settings, allowing users to easily enable or disable location services for better control over their information.
Legislation
Beyond the ICO’s Children’s Code, the UK GDPR and the Online Safety Act 2023 protect children’s privacy online. These laws require online platforms to protect children’s personal data and ensure privacy settings are secure by default.
Platforms must obtain parental consent before collecting data from children under 13 and take steps to prevent harmful content, such as cyberbullying or self-harm material.
They are also expected to use age-appropriate design, making privacy settings easy to understand and manage, and must avoid nudging tactics that encourage children to share more data.
Additionally, they must enforce stricter content moderation to create a safer online environment.
Companies that fail to follow the rules could face fines of up to 10 percent of their global revenue or even be blocked.
Ofcom, the regulator responsible for the Online Safety Act, has also published new guidelines urging platforms to use better technology to stop intimate image abuse and prevent online exploitation. These new safety rules take effect next month and come with heavy fines for violations.
The government is also considering stricter measures to protect children from social media risks.