Court Declines Meta’s Motion to Dismiss Lawsuit Accusing Company of Harming Children

Several platform features induce young users' 'extended, addictive, and compulsive use' of these services, the plaintiffs claim.
The logos of Meta-owned services on a smartphone screen in Moscow on Oct. 5, 2021. Kirill Kudryavtsev/AFP via Getty Images
Naveen Athrappully

A lawsuit alleging that Meta platforms including Facebook and Instagram pose a risk to children can move forward, a federal court in California ruled on Tuesday.

The multi-district litigation involves hundreds of complaints filed by school districts, personal injury plaintiffs, local government entities, and 34 states, according to an Oct. 15 order from the U.S. District Court for the Northern District of California.

The suit accuses Meta of violating consumer protection laws by “designing and deploying platform features it knew were harmful to young users and misleading and concealing from the public its knowledge of this harm.”

The states also specifically allege that Meta breached the Children’s Online Privacy Protection Act (COPPA) by collecting data on children under the age of 13 without first obtaining consent from their parents.

Meta filed a motion to dismiss the case. While District Court Judge Yvonne Gonzalez Rogers did dismiss some of the claims on Tuesday, she allowed allegations regarding harm to minors to move forward.

“Much of the States’ consumer protection claims are cognizable,” the judge wrote in the order.

“Meta’s alleged yearslong public campaign of deception as to the risks of addiction and mental harms to minors from platform use fits readily within these states’ deceptive acts and practices framework,” the order said.

“Meta’s design, development, and deployment of certain product features plausibly constitutes unfair or unconscionable practices under all at-issue federal and state standards.”

The plaintiffs allege that several designs, features, and functions of Meta platforms induce “young users’ extended, addictive, and compulsive use” of these networks.

For instance, Meta platforms including Facebook and Instagram use algorithms that recommend content to young users in an “emotionally gripping” manner to provoke “intense reactions,” the plaintiffs claim.

The “infinite scroll” feature loads content automatically as a user scrolls down a webpage, which makes it difficult for young users to disengage, the complainants said.

The display of “like” counts on posts was also criticized, with the states accusing Meta of being aware of the harm such social comparison features cause young users. The states claimed that Meta’s own researchers knew of a causal link between “like” counts and social comparison.

A Meta spokesperson defended the company’s policies, stating that the firm takes measures to protect its young users.

“We’ve developed numerous tools to support parents and teens, and we recently announced that we’re significantly changing the Instagram experience for tens of millions of teens with new Teen Accounts, a protected experience for teens that automatically limits who can contact them and the content they see,” the spokesperson said in a statement to multiple media outlets.

“We believe the evidence will demonstrate our commitment to supporting young people.”

The Epoch Times reached out to Meta for comment.

Section 230 Protections

Even though the court allowed claims regarding alleged harm to children to continue, Rogers pointed out that Section 230 of the Communications Decency Act “provides a fairly significant limitation” on plaintiffs’ claims.

Section 230 provides online platforms immunity from civil liability for third-party content. Rogers noted that the design and deployment of most Meta platform features that the complaint alleges are “unfair or unconscionable” are insulated by Section 230.

“Similarly, Section 230 protects against personal injury plaintiffs’ consumer-protection, concealment, and misrepresentation theories to the same extent,” the order said.

There have been efforts by lawmakers to revise Section 230. In May, a bipartisan proposal was introduced in the House to sunset the law.

According to the proposal, the sunset of Section 230 would begin on Dec. 31, 2025, with Congress potentially enacting a revised law that would allow citizens to hold social media firms accountable for published content.

Industry groups have criticized the proposal. Kate Tummarello, executive director of trade advocacy group Engine, warned a House panel that sunsetting Section 230 “risks leaving internet platforms, especially those run by startups, open to ruinous litigation, which ultimately risks leaving internet users without places to gather online.”

Meta is also facing a lawsuit from the Ohio Public Employees Retirement System, which accuses the firm of having “knowingly exploited” young users and misrepresenting to investors that its products do not harm children.
Naveen Athrappully is a news reporter covering business and world events at The Epoch Times.