Companies are now legally required to block children’s access to “harmful content” in the UK under the country’s new internet regulations.
Hailed by the UK government as the world’s first online safety law, the Online Safety Act (OSA) became law in October 2023; its child safety duties for sites and apps, however, only came into force this July.
Ofcom, the regulator enforcing the act, said that providers of services likely to be accessed by UK children had until July 24 to finalize and record their assessment of the risk their service poses to children.
In practice, this means curating social media feeds so that children see less harmful and dangerous content, as well as protecting them from being contacted by strangers.
If companies fail to comply with their new duties, Ofcom has the power to impose fines and, in very serious cases, apply for a court order to prevent the site or app from being available in the UK.
Dame Melanie Dawes, Ofcom chief executive, said the changes are “a reset for children online.”
The “riskiest services,” such as pornography sites, must use highly effective age assurance to identify which users are children while preserving adults’ rights to access legal content.
Approved age-check methods include open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services, and email-based age estimation.
“We’re a UK-based regulator, but that doesn’t mean the rules don’t apply to sites based abroad. If people are visiting your site from the UK, you’ll likely be in scope, wherever in the world you’re based,” the regulator said.
It said that failure to implement the necessary age-assurance process by July 25 will result in referral to its enforcement team, which can take a number of actions, including imposing financial penalties of up to 10 percent of a company’s revenue, or 18 million pounds ($24 million), whichever is greater.
Forums
Under the act, social media platforms and other user-to-user service providers must also proactively police their sites for illegal content. Sites that allow user interaction, including forums, have had to complete an illegal harms risk assessment.
One UK-based forum notes in a post that the law “makes the site owner liable for everything that is said by anyone on the site they operate.”
“The act is too broad, and it doesn’t matter that there’s never been an instance of any of the proclaimed things that this act protects adults, children, and vulnerable people from ... the very broad language and the fact that [the forum is] based in the UK means we’re covered,” the post reads.
Some U.S. sites are blocking UK users outright because of the legislation.
British visitors to one such site are now greeted with this message: “You are accessing this website from the United Kingdom. This is not a good idea. The letter states the UK asserts authority over any website that has a 'significant number of United Kingdom users.' This ambiguous metric could include any site on the Internet and specifically takes aim at the people using a website instead of the website itself.”
Digital rights campaigning organisation Open Rights Group released a statement on Thursday urging the government to “exempt small community websites.”
James Baker, platform power programme manager at Open Rights Group, said that Ofcom has issued more than “1,600 pages of guidance across 17 pdfs on how providers of online services will be expected to comply” with the OSA.
“Only the largest of companies can realistically expect to digest and comply with these extensive regulations, entrenching their market domination,” he said.
“Some small sites are starting to close through fear of not being able to comply, forcing people to use the big tech platforms that have caused harms. Other community organisations are yet to understand what this means for them.”