A lawsuit alleging that Meta platforms including Facebook and Instagram pose a risk to children can move forward, a federal court in California ruled on Tuesday.
The suit accuses Meta of violating consumer protection laws by “designing and deploying platform features it knew were harmful to young users and misleading and concealing from the public its knowledge of this harm.”
The states also specifically allege that Meta breached the Children’s Online Privacy Protection Act (COPPA) by collecting data on children under the age of 13 without first obtaining their parents’ consent.
Meta filed a motion to dismiss the case. While U.S. District Judge Yvonne Gonzalez Rogers did dismiss some of the claims on Tuesday, she allowed the allegations regarding harm to minors to move forward.
“Much of the States’ consumer protection claims are cognizable,” the judge wrote in the order.
“Meta’s alleged yearslong public campaign of deception as to the risks of addiction and mental harms to minors from platform use fits readily within these states’ deceptive acts and practices framework,” the order said.
“Meta’s design, development, and deployment of certain product features plausibly constitutes unfair or unconscionable practices under all at-issue federal and state standards.”
The plaintiffs allege that several designs, features, and functions of Meta platforms induce “young users’ extended, addictive, and compulsive use” of these networks.
For instance, Meta platforms including Facebook and Instagram use algorithms that recommend content to young users in an “emotionally gripping” manner to provoke “intense reactions,” the plaintiffs claim.
The “infinite scroll” feature loads content automatically as a user scrolls down a webpage, which makes it difficult for young users to disengage, the complainants said.
The display of “like” counts on posts also drew criticism, with the states accusing Meta of knowing the harm such social comparison features cause young users; they claimed Meta’s own researchers had established a causal link between “like” counts and social comparison.
A Meta spokesperson defended the company’s policies, stating that the firm takes measures to protect its young users.
“We’ve developed numerous tools to support parents and teens, and we recently announced that we’re significantly changing the Instagram experience for tens of millions of teens with new Teen Accounts, a protected experience for teens that automatically limits who can contact them and the content they see,” the spokesperson said in a statement to multiple media outlets.
“We believe the evidence will demonstrate our commitment to supporting young people.”
Section 230 Protections
Even though the court allowed claims regarding alleged harm to children to continue, Gonzalez Rogers pointed out that Section 230 of the Communications Decency Act “provides a fairly significant limitation” on the plaintiffs’ claims.
“Similarly, Section 230 protects against personal injury plaintiffs’ consumer-protection, concealment, and misrepresentation theories to the same extent,” the order said.
Section 230 has also drawn scrutiny in Congress, where House lawmakers have floated a proposal to sunset the law. According to the proposal, the sunset of Section 230 would begin on Dec. 31, 2025, with Congress potentially enacting a revised law that would allow citizens to hold social media firms accountable for published content.
Industry groups have criticized the proposal. Kate Tummarello, executive director of trade advocacy group Engine, warned a House panel that sunsetting Section 230 “risks leaving internet platforms, especially those run by startups, open to ruinous litigation, which ultimately risks leaving internet users without places to gather online.”