A district court in Iowa has rejected TikTok's bid to dismiss a lawsuit filed by the state accusing the company of enabling minors to access inappropriate content.
Iowa filed a motion seeking a temporary injunction to prevent TikTok from claiming that the platform contains little or no content related to sexuality, profanity, mature themes, and references to alcohol and drugs. TikTok filed a motion to dismiss the case, arguing that the state’s request would “dramatically impact its brand,” with users and advertisers affected.
On Aug. 26, the Iowa District Court for Polk County issued a ruling denying the motions submitted by both the state and the defendants. With TikTok’s motion to dismiss denied, the lawsuit can proceed.
When applying to be listed on Apple’s App Store, TikTok claimed that content involving alcohol or drug use, profanity, crude humor, sexuality, and nudity was “infrequent/mild,” according to the ruling. Apple subsequently gave TikTok a 12+ rating.
In October 2023, an official from the Iowa Attorney General’s Office investigated the issue, creating a TikTok account under a fake name with a birth date of May 5, 2010, which would have made the user 13 years old at the time.
Using that account, the official was able to access content that included “profanity, crude humor, sexual content, nudity, alcohol, tobacco, drug use, suicide, depression, self-harm, eating disorders, and other mature themes,” the court wrote.
The state argued that TikTok violated the Consumer Credit Code by “falsely answering questions on the Apple App Store regarding content categories that it classified as mild/infrequent.”
The ruling states that the state’s accusations of TikTok being involved in deception, misrepresentation, and unfair practice were “sufficiently pled.”
Protecting Minors
TikTok defended itself against the lawsuit, saying earlier in the year that it has put safeguards in place to protect young people, such as parental controls and time limits for users younger than 18. These restrictions are based on industry guidelines, it noted.

If an individual is younger than 13 at the time of signing up, TikTok directs the user to a limited version of the app. For users between 13 and 17, TikTok enables the filtering of adult content. The app also offers a family pairing feature that parents can use to adjust safety settings and monitor or limit usage and searches on the account.
TikTok classifies content into four maturity levels: Level 1 is suitable for all age groups, while Level 4 can include sexual content and other material deemed unsuitable for minors. The app restricts Level 4 content for users aged 13 to 17.
“We are committed to tackling industry-wide challenges and will continue to prioritize community safety,” TikTok stated in January.
“It’s time we shine a light on TikTok for exposing young children to graphic materials such as sexual content, self-harm, illegal drug use, and worse. TikTok has sneaked past parental blocks by misrepresenting the severity of its content. But no longer,” Iowa Attorney General Brenna Bird said.
Separately, TikTok faces a federal law requiring its Chinese parent company, ByteDance, to divest the app. The deadline for the sale is Jan. 19, 2025, just before the next U.S. president is sworn in. The president can extend the deadline by a maximum of three months to facilitate the completion of a sale agreement.
The Epoch Times reached out to TikTok for comment but didn’t receive a reply by publication time.