Appeals Court Revives Lawsuit Against TikTok Over 10-Year-Old Girl’s Death

A man holds a smartphone displaying the logo of Chinese social media platform TikTok in April 2024. Antonin UTZ/AFP
Catherine Yang

A federal appeals court ruled on Aug. 27 that the lawsuit brought by Tawainna Anderson against TikTok for her 10-year-old daughter’s death can continue. It found that Section 230 of the Communications Decency Act (CDA) does not shield TikTok from these kinds of lawsuits and partly reversed a district court ruling that upheld the CDA immunity.

During oral arguments in January, counsel for Anderson argued that TikTok was not merely a publisher because its sophisticated algorithm is known for its ability to recommend specific content to users.

The app had promoted “blackout challenge” videos to 10-year-old Nylah Anderson’s curated “For You Page” (FYP); in these videos, users recorded themselves engaging in acts of self-asphyxiation, choking themselves until they passed out. Anderson argued in her lawsuit that TikTok’s algorithm determined that this was content “well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result.” After watching the video, Nylah hanged herself and died.

In the court filing, Anderson described her daughter as “an active, happy, healthy, and incredibly intelligent child.” “Though only 10 years old, Nylah spoke three languages.”

U.S. Circuit Judge Patty Shwartz wrote the opinion and order for the three-judge panel of the U.S. Court of Appeals for the Third Circuit.

The CDA was passed to immunize “interactive computer services” from liability for content posted by third parties, but this immunity does not extend to the service’s own speech, Shwartz wrote. She referred to a recent Supreme Court case, Moody v. NetChoice, LLC, in which the high court found that a platform’s algorithm reflects “editorial judgments” on the platform’s part.

TikTok has conceded in court filings that its algorithm is comparable to the one described in the NetChoice case, which the Supreme Court has held to be “expressive speech.”

“[Because] TikTok’s recommendations via its FYP algorithm is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims,” Shwartz wrote.

While noting that Nylah saw the videos without any specific user input, such as searching for them, the court declined to rule on whether content surfaced in response to a search would be immune from liability. It also found it unnecessary to address the distinction between a publisher and a distributor as it relates to Section 230.

U.S. Circuit Judge Paul Matey, in an opinion concurring in part and dissenting in part, found the distinction between publisher and distributor useful, writing that Section 230 immunizes TikTok as a publisher and host of the video in question but not as a distributor of that content.

The ruling means that Anderson’s case can continue in district court and that TikTok can be held liable for its algorithm recommending the blackout challenge videos to Nylah, but not for merely hosting them.

TikTok Sued by Iowa, FTC Over Children Using App

In January, the state of Iowa sued TikTok and its parent company, ByteDance, alleging that the companies violated the state’s Consumer Fraud Act by misrepresenting the app’s content to receive a “12+” age rating on the App Store, which allowed Iowan minors ages 13 to 17 to use the app and access content that the state argued was inappropriate for minors.

TikTok claimed that content on its app involving alcohol or drug use, profanity, crude humor, sexuality, and nudity was “infrequent/mild,” resulting in the 12+ rating. It filed a motion to dismiss the case, which the Iowa District Court for Polk County rejected on Aug. 26.

The Department of Justice and Federal Trade Commission (FTC) sued TikTok earlier this month, claiming that the app violated the Children’s Online Privacy Protection Act. The government alleges that TikTok collected and retained children’s personal information even after a 2019 court order over similar allegations.

“TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” FTC Chair Lina Khan said in a statement.