It’s the latest of several lawsuits that have been brought against the Chinese-owned app.
After the suicides of two French 15-year-olds, families are suing TikTok in the first group lawsuit against the tech giant in Europe.
Attorney Laure Boutron-Marmion, representing seven families, told broadcaster franceinfo on Nov. 4 that the families allege TikTok’s algorithm recommended videos to their children promoting suicide, self-harm, and eating disorders.
“The parents want TikTok’s legal liability to be recognized in court,” Boutron-Marmion said. “This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings.”
The French lawsuit is the latest of several that have been brought against the Chinese-owned app for allegedly harming minors.
TikTok did not immediately respond to an inquiry from The Epoch Times, but the company has previously stated publicly that it believes its app is safe for underage users and that it has taken several steps to protect them.
However, the app came under fire last month when improperly redacted court filings of internal TikTok documents revealed that TikTok executives knew of several risks and negative effects the app had on minors and, rather than taking steps to remedy them, moved instead to improve the app’s public image.
A bipartisan group of attorneys general from 14 states sued TikTok last month after a more than two-year investigation, alleging that the company misled the public with claims that the app was safe while intentionally designing it to be addictive to youth.
In a separate lawsuit, the Justice Department and Federal Trade Commission accused TikTok of violating the Children’s Online Privacy Protection Act by collecting, using, or disclosing personal information from children under the age of 13 without parental consent.
Last month, Texas separately sued TikTok, alleging the app violated a similar new state law, the Securing Children Online through Parental Empowerment Act.
Parents in the United States have also sued TikTok. In 2022, multiple lawsuits were filed after young children were shown, and participated in, a “blackout challenge” promoted on TikTok.
Tawainna Anderson sued after her 10-year-old daughter, Nylah, died in December 2021; her lawsuit was revived by an appeals court earlier this year.
Separately, the parents of 8-year-old Lalani Erika Renee Walton and 9-year-old Arriani Jaileen Arroyo sued TikTok, accusing the app of recommending the “unacceptably dangerous video” in the girls’ personalized feeds. The Social Media Victims Law Center, which helped the parents file their petition, has filed several other lawsuits on behalf of youths who were harmed or died after using TikTok.
TikTok has sought to dismiss such cases, but many have moved forward this year.
In Iowa, a district court ruled that TikTok must face a lawsuit accusing it of violating the state’s Consumer Fraud Act, with the state alleging that the company acted deceptively to obtain a 12+ rating in the App Store so that minors could sign up for the service. In California, a district court similarly ruled that TikTok, along with Facebook and other social media companies, must face hundreds of complaints filed by school districts accusing the platforms of harming children.
According to a 2023 Pew Research survey, 62 percent of adults under the age of 30 use TikTok, triple the rate of the year prior. Similarly, 63 percent of teens use TikTok, with 58 percent saying they use it daily and 17 percent saying they are on the app “almost constantly.”
In addition to broad pushback against the app over children’s safety, lawmakers and intelligence officials have raised national security concerns. Those concerns resulted in a law passed earlier this year that would require TikTok to cut ties with the Chinese communist regime. TikTok and its parent company, ByteDance, are challenging the law in court, with ByteDance arguing that the Chinese regime will not allow it to sell the app.
Reuters contributed to this report.