Lawsuit Alleges Social Media Giant Meta Put Growth Ahead of Young Users' Safety

Newly released sections of a lawsuit against Meta allege the company did not prioritize the safety of its younger users, an allegation the company denies.
A security guard stands watch by the Meta sign outside the headquarters of Facebook parent company Meta Platforms Inc in Mountain View, Calif., on Nov. 9, 2022. Peter DaSilva/Reuters
Stephen Katte

Unredacted sections of a lawsuit against Meta allege the social media juggernaut prioritized company growth ahead of the safety of its younger users. The company disputes the claims, saying it has devoted significant time and resources to keeping minors safe online.

Last December, New Mexico Attorney General Raúl Torrez filed a lawsuit against Meta Platforms, its subsidiaries, and CEO Mark Zuckerberg over allegations that the company's social media platforms were unsafe for children.
In the lawsuit filing, Mr. Torrez alleges Meta fails to remove Child Sexual Abuse Material (CSAM) across its platforms and enables adults to find, contact, and solicit underage users.

He also argues that Meta harms children and teenagers through the addictive design of its platform, degrading users’ mental health, their sense of self-worth, and their physical safety.

Meta has already been the subject of a separate lawsuit from October 2023, in which 33 states claimed the company harms young people and contributes to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

“Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children,” Mr. Torrez said.

“Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta’s executives continue to prioritize engagement and ad revenue over the safety of the most vulnerable members of our society,” he added.

According to Mr. Torrez, his office gathered most of its evidence for the lawsuit through an undercover investigation of Meta's platforms, creating decoy accounts of children 14 years and younger.

Documents Paint a Disturbing Picture

The recently unredacted documents, unsealed as part of an amended complaint, reveal that Meta employees were aware of a significant volume of inappropriate and sexually explicit content being shared between adults and minors.

An internal email from 2017 showed a Facebook executive opposed scanning Meta's private messaging app, Messenger, for "harmful content" because it would be a "competitive disadvantage vs other apps that might offer more privacy."

An internal presentation from 2021 also estimated at least 100,000 children per day were sexually harassed on Meta’s messaging platforms, often receiving sexually explicit content and photos.

Another internal document from 2020 showed many of the safeguards on Facebook, such as preventing adults from messaging minors they didn’t know, were not being used on Instagram because it was not a priority.

According to the document, Meta didn’t want to block parents and older relatives from reaching out to their younger relatives.

An employee criticized this stance, saying, in their opinion, it was a "less than compelling" reason for failing to establish safety features, while also pointing out that grooming was occurring almost twice as often on Instagram as on Facebook because of the lax safeguards. By March 2021, Instagram announced it was restricting people over 19 from messaging minors.

In a recent media statement, New Mexico Attorney General Raúl Torrez condemned the company for ignoring its employees' concerns and dragging its feet in addressing issues with its platforms.

“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and child exploitation,” he said.

“Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children’s safety. While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive.”

Employees in Facebook's "War Room" in Menlo Park, Calif., during a media demonstration on Oct. 17, 2018. Noah Berger/AFP/Getty Images

‘Cherry-Picked Documents’

In a media statement, Meta argued it has always wanted teens to have safe, age-appropriate experiences online. According to Meta, it has spent “a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”

“The complaint mischaracterizes our work using selective quotes and cherry-picked documents.”

The company also said it uses sophisticated technology, hires child safety experts, and shares information and tools with other companies and law enforcement to help minors stay safe online.

U.S.-based social media platforms are legally required to report instances of CSAM to the National Center for Missing and Exploited Children (NCMEC) CyberTipline.

According to the most recent data, Facebook submitted roughly 21 million reports of CSAM, while other Meta platforms, such as Instagram, submitted over 5 million reports, and WhatsApp sent 1 million.

Overall, Meta was responsible for about 85 percent of all reports made to the NCMEC during 2022, when the data was collected.

Facebook founder and CEO Mark Zuckerberg testifies at a Senate Judiciary and Commerce Committees Joint Hearing in Washington on April 10, 2018. Samira Bouaou/The Epoch Times
The social media giant also announced in a Jan. 9 blog post that it would soon restrict the type of content teenagers can see on its platforms as part of a broader effort to make social networks safer.

Meta said the new content limitations are designed to ensure teenagers have a more age-appropriate experience when using apps and will make it harder for teens to view and search for sensitive content such as suicide, self-harm, and eating disorders.

Meta CEO Mark Zuckerberg, along with the CEOs of Snap, Discord, TikTok, and X (formerly Twitter), is scheduled to testify before the U.S. Senate on child safety at the end of January.

The Epoch Times has contacted Meta for comment.

Stephen Katte is a freelance journalist at The Epoch Times. Follow him on X @SteveKatte1