A federal judge ruled on Tuesday that Big Tech social media firms must face a lawsuit accusing the companies of designing platforms in a way that creates a “youth mental health crisis.”
Because children are still developing impulse control and are uniquely susceptible to the harms of compulsive social media use, the plaintiffs argue, the tech firms created a “youth mental health crisis” through their platforms.
The lawsuit also accuses the platforms of facilitating and contributing to the sexual exploitation of children, including the spread of child sexual abuse material (CSAM).
U.S. District Judge Yvonne Gonzalez Rogers ruled that the First Amendment and Section 230 shield the social media networks from some of the plaintiffs’ claims, but that other allegations are not barred. The decision allows the lawsuit to move forward.
“Defendants argue that the First Amendment protects them from liability for the speech they publish as well as for all choices they have made in disseminating them,” the judge wrote in her decision.
However, many of the violations alleged by the plaintiffs do “not constitute speech or expression, or publication of same,” she pointed out.
For instance, the plaintiffs accused the social media companies of failing to provide effective parental controls, failing to offer users options to limit their own time on a platform, failing to use robust age verification, and failing to implement protocols that would let users report CSAM and other such material.
During a hearing, the defendants also implied that making CSAM easier to report would effectively require them to remove such content, and that Section 230 therefore shields them from those claims, Judge Rogers noted. She dismissed the argument.

“Receiving more reports does not require them to remove the content. They could respond by taking other steps, such as reporting the content to a government agency or providing relevant warnings,” the judge pointed out.

Social Media Harms

The plaintiffs point to a range of design features they say harm young users, including:
- An “endless” stream of content shown to users, with platforms designed for “maximizing the length of user sessions.”
- Algorithms designed to “strategically time when they show content to users in order to maximize engagement.” For instance, Instagram may wait until a piece of content receives several likes before notifying a user about it, which intensifies the user’s dopamine response, the plaintiffs argued.
- Time limits on how long certain content remains available, creating a sense of urgency for users to engage with it.
- Using “engagement-based algorithms that promote content to users based on the likelihood it will keep them engaged with and using the platform.”
- Recommending accounts of minors to adult strangers while also offering private chat functions that could facilitate the exploitation of children by predators.
- Making it more difficult to delete an account than to create one, setting up barriers for children who want to delete their accounts.
Attorneys React to Ruling
In a joint statement, Lexi Hazam, Previn Warren, and Chris Seeger, lead attorneys for the consumer plaintiffs, called Judge Rogers’ decision a “significant victory for the families that have been harmed by the dangers of social media,” according to CNN.

“The court’s ruling repudiates Big Tech’s overbroad and incorrect claim that Section 230 or the First Amendment should grant them blanket immunity for the harm they cause to their users. The mental health crisis among American youth is a direct result of these defendants’ intentional design of harmful product features,” they said.
A Google spokesperson called the allegations in the lawsuit “simply not true,” saying the company has built “age-appropriate experiences for kids and families on YouTube” and provides “parents with robust controls.”
One bill requires social media platforms to verify that users in the state are at least 18 years old. Minors will need permission from parents to open an account.
The Epoch Times reached out to Google, Meta, TikTok, and Snapchat for comment.