Meta is adding more restrictions to its social media apps to prevent underage users from being exposed to inappropriate content.
Under the new default settings, Instagram users under 16 will be blocked from livestreaming and from opening images flagged as potentially containing nudity. They will need parental approval to lift these restrictions.
The update is launching first in the United States, United Kingdom, Australia, and Canada, with a global rollout promised “soon,” the company said.
“Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent,” Meta said in a blog post.
The company didn’t go into detail about how the protections will be implemented on Facebook and Messenger. If they replicate Instagram’s approach, the changes would apply automatically to new and existing accounts belonging to users under 18. Older teens can disable these features on their own, but users under 16 will need parental permission, granted through Meta’s supervision tools, to change them.
Apart from content filters, teen accounts offer a suite of parental controls. These allow parents to view who their child is interacting with, monitor the topics they’re exploring, set daily screen time limits, and restrict app usage during certain hours. Most changes to these settings require parental approval.
The changes come as Meta faces continued legal pressure over child safety, including a lawsuit filed by New Mexico Attorney General Raúl Torrez in December 2023. “Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex,” Torrez said when he filed the suit.
According to Torrez, his office ran an undercover operation using decoy accounts posing as young children. The investigation allegedly found that dozens of adults contacted those accounts, pressured them to share explicit content, and directed them to unmoderated Facebook groups dedicated to sexual exploitation.
Later that month, a federal judge denied Meta’s motion to dismiss the case, rejecting both its jurisdictional arguments and its claim of immunity under Section 230, the federal law that shields online platforms from liability for user-generated content. The court did, however, grant Meta founder and CEO Mark Zuckerberg’s request to be removed as a defendant.