Unredacted sections of a lawsuit against Meta allege the social media giant prioritized company growth over the safety of its younger users, while the company denies these claims, saying it has devoted substantial time and resources to keeping minors safe online.
The complaint also argues that Meta harms children and teenagers through the addictive design of its platforms, degrading users’ mental health and sense of self-worth and endangering their physical safety.
“Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children,” Mr. Torrez said.
“Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta’s executives continue to prioritize engagement and ad revenue over the safety of the most vulnerable members of our society,” he added.
Documents Paint a Disturbing Picture
The recently unredacted documents, unsealed as part of an amended complaint, reveal that Meta employees were aware of a significant volume of inappropriate and sexually explicit content being shared between adults and minors.

An internal email from 2017 showed a Facebook executive opposed scanning Meta’s private messaging app, Messenger, for “harmful content” because doing so would be a “competitive disadvantage vs other apps that might offer more privacy.”
An internal presentation from 2021 estimated that at least 100,000 children per day were sexually harassed on Meta’s messaging platforms, often receiving sexually explicit content and photos.
According to the document, Meta didn’t want to block parents and older relatives from reaching out to their younger relatives.
An employee criticized this stance as a “less than compelling” reason for failing to establish safety features, pointing out that grooming occurred almost twice as often on Instagram as on Facebook because of the lax safeguards. In March 2021, Instagram announced it was restricting people over 19 from messaging minors.
In a recent media statement, New Mexico Attorney General Raúl Torrez condemned the company for ignoring its employees’ concerns and dragging its feet in addressing issues with its platforms.
“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and child exploitation,” he said.
“Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children’s safety. While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive.”
‘Cherry-Picked Documents’
In a media statement, Meta said it has always wanted teens to have safe, age-appropriate experiences online and that it has spent “a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”

“The complaint mischaracterizes our work using selective quotes and cherry-picked documents,” the company said.
The company also said it uses sophisticated technology, hires child safety experts, and shares information and tools with other companies and law enforcement to help minors stay safe online.
U.S.-based social media platforms are legally required to report instances of child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC) CyberTipline.
Overall, Meta accounted for about 85 percent of all reports made to the NCMEC in 2022.
Meta said its new content limitations are designed to give teenagers a more age-appropriate experience on its apps and will make it harder for teens to view and search for sensitive content such as suicide, self-harm, and eating disorders.
Meta CEO Mark Zuckerberg, along with the CEOs of Snap, Discord, TikTok, and X, formerly Twitter, is scheduled to testify before the U.S. Senate on child safety at the end of January.
The Epoch Times has contacted Meta for comment.