Meta Restricting Certain Content for Teens on Facebook, Instagram
A person watches on a smartphone as Facebook CEO Mark Zuckerberg unveils the META logo in Los Angeles on Oct. 28, 2021. (Chris Delmas/AFP via Getty Images)
Katabella Roberts

Facebook and Instagram parent company Meta will soon restrict the type of content that teenagers can see on its platforms, part of wider efforts to make the social networks safer and more “age-appropriate” for young users.

The social media giant announced the new restrictions in a blog post on Jan. 9 following consultations with “experts in adolescent development, psychology, and mental health.”

Meta said the new content limitations are designed to give teenagers a more age-appropriate experience on the apps and will make it harder for teens to view and search for sensitive content related to suicide, self-harm, and eating disorders.

Teens attempting to access such content will instead be diverted to helpful resources including information from organizations like the National Alliance on Mental Illness, the social media giant said.

The company noted that it already aims not to recommend such sensitive content to teenagers in the “Reels” and “Explore” sections of the apps, but that under the new changes, teenagers will also not be able to see such content in “Feed” and “Stories.”

“While we allow people to share content discussing their own struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find,” the company said.

“Now, when people search for terms related to suicide, self-harm, and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms,” it added.

Attorneys General File Lawsuit

The new policy will impact Facebook and Instagram users under the age of 18.

Meta noted the new policy is already being rolled out for teenagers and will be fully in place on both platforms in the coming months.

The company is also automatically placing teens into the most restrictive content control settings on Instagram and Facebook, it said.

The new updates come as Meta—which is headed by billionaire Mark Zuckerberg—faces mounting pressure from regulators and lawmakers in both the United States and Europe over claims its social media sites are addictive and harmful to the mental health of younger users.

In October, the attorneys general of 33 states filed a lawsuit against the company, accusing it of implementing addictive features in the apps that “entice, engage, and, ultimately, ensnare youth and teens,” all while boosting corporate profits.

The lawsuit further accuses Meta of having “profoundly altered the psychological and social realities of a generation of young Americans” through technologies that boost engagement, and says the company flouted its obligations under the Children’s Online Privacy Protection Act by “unlawfully collecting the personal data of its youngest users without their parent’s permission.”

Meta whistleblower Arturo Bejar, former director of engineering for Protect and Care at Facebook in Berkeley, Calif., testifies before the Senate Judiciary subcommittee in Washington on Nov. 7, 2023, alleging the company failed to act on reports of harassment and harm facing teens on the platform. (Madalina Vasiliu/The Epoch Times)

‘Harmful, Psychologically Manipulative Product Features’

The lawsuit also argues that Meta designed its platforms with “harmful and psychologically manipulative product features to induce young users’ compulsive and extended Platform use, while falsely assuring the public that its features were safe and suitable for young users,” and that its apps promote body dysmorphia and expose underage users to potentially harmful content.

Meta has denied the claims made in the lawsuit and has regularly touted its work over the past decade to bolster the safety of teenagers online, noting that it offers more than 30 tools to support teens and their parents.

However, in November, former Meta employee turned whistleblower Arturo Bejar told the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that the company was aware of the harm its products may cause to young users but failed to take appropriate action to remedy the issue.

Mr. Bejar, who worked as a Facebook engineering director from 2009 to 2015, and later as a consultant at Instagram from 2019 to 2021, told the committee that he had highlighted the issue in an email to Mr. Zuckerberg but that his warnings ultimately went unheeded.

In its Jan. 9 blog post, Meta said it wants teens to have safe, age-appropriate experiences across its apps.

“Parents want to be confident their teens are viewing content online that’s appropriate for their age,” Vicki Shotbolt, CEO of ParentZone.org, said in the post. “Paired with Meta’s parental supervision tools to help shape their teens’ experiences online, Meta’s new policies to hide content that might be less age-appropriate will give parents more peace of mind.”

According to a Pew Research Center survey published in December 2023, 59 percent of U.S. teens reported regularly using Instagram, while only 33 percent said they used Facebook, down from 71 percent in 2014–2015.
Reuters contributed to this report.
Katabella Roberts is a news writer for The Epoch Times, focusing primarily on the United States, world, and business news.