Court Sides With TikTok and YouTube in Moms’ Lawsuit Over ‘Choking Challenges’

A federal judge dismissed all claims against the two video platforms, citing Section 230 of the Communications Decency Act.
Joann Bogard, mother of internet challenge victim Mason Bogard, speaks during a rally calling on tech and social media companies to take steps to protect kids and teens online, in Washington on Jan. 31, 2024. Jemal Countess/Getty Images for Accountable Tech
Chase Smith

A federal court on Feb. 24 dismissed a lawsuit brought by a group of mothers who claimed TikTok and YouTube bore legal responsibility for allegedly dangerous “choking challenges” and other harmful content posted by third parties on the platforms.

According to an order filed in the Northern District of California, the court granted the companies’ motion to dismiss, giving the mothers a chance to amend their complaint while rejecting nearly every aspect of their initial claims.

The plaintiffs asserted that both platforms were strictly liable for failing to remove dangerous videos.

The mothers also alleged a series of design defects and misrepresentations relating to the reporting tools provided by TikTok and YouTube.

“Upon consideration of the moving and responding papers, and the parties’ arguments at the hearing, the Court grants Defendants’ motion to dismiss with leave to amend,” the ruling states. The judge held that, as currently pled, the families did not establish a sufficient legal basis for holding TikTok or YouTube liable.

The complaint recounted harrowing accounts from parents whose children died after allegedly participating in choking challenges.

One plaintiff accused the services of ignoring multiple reports about the videos. The judge concluded, however, that the platforms’ conduct in moderating or removing user-generated content “stems from third-party content” and is largely protected from liability by Section 230 of the Communications Decency Act.

Joann Bogard, the lead plaintiff in the case, sued after her son Mason died after trying the online “blackout challenge” at the age of 15.

Bogard had waited until Mason was in middle school before giving him a phone with parental controls, but she didn’t know that he was viewing YouTube videos of the choking challenge.

Mason attempted the blackout challenge in May 2019 and died accidentally after his belt buckle locked and did not loosen once he lost consciousness, leaving him without oxygen for too long. He showed no vital activity after three days on life support. The Bogards donated his organs, benefiting five recipients.

“When people hear our stories, it makes them afraid that this is going to happen to their family. They don’t want that; they want to fix that problem,” Bogard told The Epoch Times last June.

“I think that’s why it’s so personal to them. Because they realize our kids are their kids. Our kids are the everyday American kids that are out there.”

The court wrote that “[i]t is not enough that a claim, including its underlying facts, stems from third-party content for Section 230 immunity to apply.”

In examining tort and fraud-based arguments, the court found that the parents had not adequately pleaded how the content-reporting tools themselves were defective.

The judge noted that the plaintiffs’ emotional or non-economic injuries might not suffice under certain state consumer protection statutes, explaining that those legal frameworks commonly require measurable financial harm.

The court also pointed out that the plaintiffs needed “more than an unadorned, the-defendant-unlawfully-harmed-me accusation” to meet the legal standards for fraud or deception.

Although the dismissal allows the plaintiffs to amend their complaint, the court dismissed two Google-affiliated defendants, XXVI Holdings Inc. and Alphabet Inc., from the case entirely, concluding that they were only parent entities lacking direct involvement.

In addition, the Becca Schmill Foundation was dismissed from the case for lack of standing, though the court left open the possibility of re-pleading those allegations.

The judge set March 24 as the deadline for any amended complaint. Attorneys representing the plaintiffs did not respond to a request for comment on the judge’s order.

TikTok and YouTube did not respond to a request for comment on the case dismissal before publication.

Terri Wu contributed to this report. 
Chase Smith
Author
Chase is an award-winning journalist. He covers national news for The Epoch Times and is based out of Tennessee. For news tips, send Chase an email at [email protected] or connect with him on X.