Meta Expands Teen Restrictions on Instagram, Facebook, Messenger

Meta says it will broaden restrictions to prevent minors from being exposed to inappropriate content.
Instagram app on a smartphone in an illustration taken on July 13, 2021. Dado Ruvic/Illustration/Reuters
Bill Pan

Meta is adding more restrictions to its social media apps to prevent underage users from being exposed to inappropriate content.

Building on its September 2024 launch of “Teen Accounts” on Instagram—which automatically defaults younger users to private profiles and prevents them from accessing certain features—Meta announced on Tuesday that it will broaden Instagram restrictions and make similar protections available on Facebook and Messenger.

Under the new default settings, Instagram users under 16 will be blocked from livestreaming and from opening images flagged as potentially containing nudity. They will need parental approval to lift these restrictions.

The update is launching first in the United States, United Kingdom, Australia, and Canada, with a global rollout promised “soon,” the company said.

“Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent,” Meta said in a blog post.

The company didn’t go into detail about how the protections will be implemented on Facebook and Messenger. If they replicate Instagram’s approach, changes would automatically apply to new and existing accounts for users under 18. While older teens can disable these features, those under 16 will require parental permission through supervisory tools to change them.

Apart from content filters, teen accounts offer a suite of parental controls. These allow parents to view who their child is interacting with, monitor the topics they’re exploring, set daily screen time limits, and restrict app usage during certain hours. Most changes to these settings require parental approval.

The expansion comes as Meta faces mounting pressure from regulators and child safety advocates. In Europe, both Facebook and Instagram are under investigation over young users’ well-being and privacy. In the United States, Meta is fighting a high-profile lawsuit brought by the state of New Mexico, which accuses the company of creating a “marketplace for predators in search of children.”

“Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex,” New Mexico Attorney General Raúl Torrez said when he filed the suit in December 2023.

According to Torrez, his office conducted an undercover operation using decoy accounts that posed as young children. The investigation allegedly uncovered instances in which dozens of adults contacted those accounts, pressured them into sharing explicit content, and directed them to unmoderated Facebook groups dedicated to sexual exploitation.

In May 2024, Torrez announced the arrests of three suspects in what authorities dubbed “Operation MetaPhile.” The sting operation used law enforcement-run decoy profiles to identify and engage with suspected online predators.

Later that month, a federal judge denied Meta’s motion to dismiss the case, rejecting its jurisdictional arguments and its claim of immunity under Section 230, the federal law that shields online platforms from liability for user-generated content. The court did, however, grant Meta founder and CEO Mark Zuckerberg’s request to be removed as a defendant.