Protect Children Online or Face Being Blocked, Social Media Giants Told

Regulator Ofcom publishes its first draft codes of practice under the Online Safety Act, focusing on protecting children from harmful content and groomers.
The logos of mobile apps Instagram, Snapchat, Twitter, Facebook, Google and Messenger are displayed on a tablet on Oct. 1, 2019. Denis Charlet/AFP via Getty Images
Patricia Devlin

A social media shake-up under the new Online Safety Act will compel platforms to block children from seeing suicide and terrorism content, and to remove “suggested friends” features to protect them from groomers.

Sites that do not comply with the strict new rules face being fined or blocked entirely by Ofcom, with senior social media managers potentially facing criminal proceedings, the regulator said.

Publishing its first draft codes of practice under the new act on Thursday, Ofcom said the largest platforms will be required, by default, to ensure that children on their sites are not presented with lists of suggested friends, do not appear in other users’ suggestion lists, do not have their location information visible to other users, and cannot be sent direct messages by people outside their agreed connections.

The online safety regulator is set to publish further codes in the coming months covering other areas, such as guidance for adult sites on keeping children out and on protecting children from harmful content that promotes suicide or self-harm.

Each of the draft codes will have a consultation period before requiring final approval from Parliament. The regulator’s own timetable says it hopes to begin enforcing its first codes of practice by the end of 2024.

Illegal Content

Writing in the Telegraph, Dame Melanie Dawes, chief executive of Ofcom, sent a warning to those who seek to ignore the new regime.

“We’re prepared to do what we need to do to try to bring companies that pose a major risk to the public and are not complying with their duties, to achieve some change. And fundamentally, if we need to, block, [fine] and use criminal sanctions,” she said.

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression.

“Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

The illegal content code also encourages larger sites to use hash-matching technology to identify illegal images of sexual abuse and to use automated tools to detect websites that have been identified as hosting abuse material.

On fighting fraud and terrorism, Ofcom says services should use automatic detection systems to find and remove posts linked to the sale of stolen financial information and block all accounts run by proscribed terrorist organisations.

The codes of practice also propose that tech firms nominate an accountable person to report to senior management on compliance with illegal content, reporting, and complaints duties; ensure their content moderation teams are well resourced and trained; offer easy-to-use reporting and blocking tools; and carry out safety tests on recommendation algorithms.

The UK media watchdog Ofcom's logo in an undated file photo. Yui Mok/PA

Abuse Imagery

Technology Secretary Michelle Donelan said the publication of the first codes marked a “crucial” step in making the Online Safety Act a reality by “cleaning up the wild west of social media and making the UK the safest place in the world to be online.”

“Before the bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first,” she said.

“By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today.”

Ofcom said it had been working with social media and other in-scope platforms, and would continue to do so over the coming months, to help ensure they are in compliance with the proposed codes when they come into full force.

Campaign groups have backed the first proposals from the regulator.

Susie Hargreaves, chief executive of the Internet Watch Foundation, said: “We stand ready to work with Ofcom and with companies looking to do the right thing to comply with the new laws.

“It’s right that protecting children and ensuring the spread of child sexual abuse imagery is stopped is top of the agenda.

“It’s vital companies are proactive in assessing and understanding the potential risks on their platforms and taking steps to make sure safety is designed in.”

The Online Safety Act, which became law last week, has previously received some criticism from Tory MPs who felt the bill was “far too reaching.”

Tech companies have also expressed concerns about the rules around legal but harmful content in the new legislation, suggesting it would make them unfairly liable for material on their platforms.

PA Media contributed to this report.
Patricia Devlin
Author
Patricia is an award-winning journalist based in Ireland. She specializes in investigations and giving victims of crime, abuse, and corruption a voice.