Tech giants face large fines as new UK child data and privacy protection measures come into full force.
The Age Appropriate Design Code, which comes into force on Sept. 2, sets out 15 standards that companies are expected to build into any online services used by children, making the data protection of young people a priority from the design stage onwards.
These services range from apps and connected toys to social media sites and online games, and even educational websites and streaming services.
Location tracking, profiling, and the use of nudge techniques that encourage users to provide unnecessary personal data are among the features that must be switched off or limited.
The Information Commissioner, whose office devised and will enforce the rules, said the move is not about “age-gating” the internet nor “locking children out.”
“The internet was not designed with children in mind and I think the Age Appropriate Design Code will go a long way to ensure that kids have the right kind of experience online,” Elizabeth Denham told the PA news agency.
“I think it will be astonishing when we look back to ever think of a time when we didn’t have protections for children online because I think they need to be protected in the online world in the same way that they’re protected in the offline world.”
Because the code is rooted in European data protection law, companies risk being fined up to £17.5 million ($24.1 million) or 4 percent of their annual worldwide turnover, whichever is higher, for serious failures.
The Information Commissioner’s Office (ICO) warned that it is likely to take more severe action against breaches involving children where it sees harm or the potential for harm.
Companies were given a year to ensure their platforms adhere to the measures before the Sept. 2 deadline, though several have scrambled to make last-minute changes in recent weeks.
Instagram recently announced it would require all users to provide their date of birth, while Google has introduced a raft of privacy changes for children who use its search engine and YouTube platform.
TikTok also began limiting the direct messaging abilities of accounts belonging to 16- and 17-year-olds, as well as offering advice to parents and caregivers on how to support teenagers when they sign up.
Andy Burrows, head of child safety online policy at the NSPCC, said: “It’s no coincidence that a flurry of tech firms have made child safety announcements on the eve of the children’s code coming into force.
“This landmark code shows that regulation works, and there is little doubt this UK leadership is having a global impact on the design choices of sites such as Instagram, Google, and TikTok.”