Facebook and Instagram are illegally letting minors see graphic imagery, including pornographic materials, according to a new lawsuit.
Profiles posing as minors, operated by New Mexico investigators, were exposed to nudity, pornographic videos, and sexually suggestive images of young girls, the suit states.
The companies were alerted to the materials, which violated company policy, and to the users who posted or sent them, but they often failed to take action, New Mexico’s attorney general said.
“Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex,” New Mexico Attorney General Raul Torrez said in a statement. “As a career prosecutor who specialized in internet crimes against children, I am committed to using every available tool to put an end to these horrific practices and I will hold companies—and their executives—accountable whenever they put profits ahead of children’s safety.”
Meta, the parent company of Facebook and Instagram; Meta CEO Mark Zuckerberg; and the Facebook and Instagram platforms themselves were each named as defendants in the suit, which was filed in state court.
Fake Profiles
New Mexico investigators created multiple fake profiles, many of them purportedly representing minor girls.

One of the profiles was presented as a 13-year-old girl. She was soon added to a chat group focused on underage girls that featured pornographic videos and photographs. The profile reported the group to Facebook, but the group was never taken down; Facebook merely instructed the girl to leave it.
Investigators listed the girl’s year of birth as 2002, but regularly made posts indicating she was much younger, including one marking her first day of seventh grade and another about losing her last baby tooth. Some of the posts suggested she had been trafficked, while others showed signs of physical abuse. Yet as far as investigators could tell, Facebook never alerted authorities. At the same time, the profile was served advertisements for law firms representing victims of trafficking.
Messages sent to the profile often included sexual material, including graphic pictures of men. Investigators reported the senders’ accounts, but some were not taken down, and none of the offending messages were ever removed, according to the suit.
After the profile reported a pornographic video sent to it, Meta said it investigated and found no violation of its community standards, according to screenshots included in the complaint.
Minor Girls
Investigators also found multiple accounts that posted pictures of young girls in sexual poses wearing underwear, but after investigators reported them to Meta, the company took no action, according to the suit.

Other accounts offered links to purchase or trade child sexual abuse material. These could be found with simple searches, including one for “all new kids links available.” One seller posted an image of a young girl in a bikini with her hand positioned near her private area, promised the “best price links,” and said to send a direct message for an image, which would cost $1,000.
Some of the accounts offering the links included the word “pizza,” a known code word among predators.
Other accounts showed graphic images of children appearing to be involved in sexual intercourse.
Investigators working for the state found that after they viewed photographs of girls in sexually suggestive poses, Instagram’s algorithms recommended similar accounts featuring girls in similar poses or engaged in sex acts.
Violations of State Law?
New Mexico law prohibits human trafficking, defined as “benefiting, financially or by receiving anything of value, from the labor, services or commercial sexual activity of another person with the knowledge that force, fraud, or coercion was used to obtain the labor, services or commercial sexual activity.”

The actions, or lack thereof, of Facebook, Instagram, and Meta violated this law, according to the suit.
Meta’s own standards state that its platforms “do not allow content or activity that sexually exploits or endangers children,” and executives have said they protect against such content.
“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” a Meta spokesperson told news outlets in a statement.
“Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children,” Mr. Torrez said. “Despite repeated assurances to Congress and the public that they can be trusted to police themselves, it is clear that Meta’s executives continue to prioritize engagement and ad revenue over the safety of the most vulnerable members of our society.”
New Mexico is seeking an order declaring that the defendants violated state law. Authorities want a fine of up to $5,000 levied against the defendants for each violation, plus disgorgement of profits and data unjustly obtained through the violations.
They are also requesting a permanent order preventing Meta and its employees from engaging in practices that violate state law.
The suit follows a separate action in which the attorneys general of 33 states accused Meta of targeting children with “addictive” features. Those prosecutors obtained internal materials stating, in part, that “every time one of our teen users finds something unexpected their brains deliver them a dopamine hit.”