The arguments were “taken under advisement,” a court clerk told The Epoch Times, but no order was issued.
TikTok filed the motion in September, arguing that Nebraska courts lacked “personal jurisdiction” over the company and that Attorney General Mike Hilgers’s complaint failed to state “a cognizable cause of action upon which relief can be granted” against the platform.
Hilgers filed his complaint in May, alleging that TikTok violated consumer protection laws by engaging in “deceptive and unfair trade practices,” designing the platform to be addictive and harmful even as it is advertised as “family-friendly and safe.”
“Within minutes of signing up, the TikTok algorithm has shown kids inappropriate content, ranging from videos that encourage suicidal ideation and fuel depression, drive body image issues, and encourage eating disorders to those that encourage drug use and sexual content wildly inappropriate for young kids,” Hilgers said. “Parents deserve to be fully and truthfully informed so they can help their kids make positive, healthy choices.”
In October, a bipartisan group of more than a dozen state attorneys general sued the platform over similar allegations, claiming that TikTok harms children while advertising itself deceptively.
That brought the number of state attorneys general who have sued TikTok up to 23.
Among them is California Attorney General Rob Bonta, who alleged that “TikTok cultivates social media addiction to boost corporate profits.”
He added that the mental health crisis among children has become “a revenue machine” for the platform.
In August, the Department of Justice and the Federal Trade Commission (FTC) sued TikTok, alleging that it violated the Children’s Online Privacy Protection Act, which prohibits websites from collecting personal information from children younger than 13 without parental consent.
The Epoch Times contacted TikTok for comment.
“We’re proud of and remain deeply committed to the work we’ve done to protect teens, and we will continue to update and improve our product,” Michael Hughes, a TikTok spokesperson, said in an email.
“We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”