Twitter Tries to Dismiss Lawsuit Filed by Child Sex Trafficking Survivor Using Section 230 Immunity

A sign is posted on the exterior of Twitter headquarters in San Francisco, Calif., on July 26, 2018. Justin Sullivan/Getty Images

Twitter is asking a federal court in Florida to dismiss a lawsuit filed by a child sex trafficking victim, who alleged that the social media giant refused to immediately remove explicit content depicting the child on its platform.

The lawsuit, filed in January, accuses the Silicon Valley company of knowingly benefiting financially from the dissemination of sexual abuse material when it failed to act immediately on multiple complaints and correspondence asking it to remove the illegal and harmful content.

The content was eventually removed about nine days after the child and his family made complaints to Twitter. The family and the child victim allege that Twitter only took action after a Department of Homeland Security agent issued a “take-down demand” against the company, other methods of getting the company to remove the content having failed, according to the complaint.

“Twitter is not a passive, inactive, intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in the dissemination and knowing promotion and distribution of this harmful material,” the lawsuit states.

It further alleges that “Twitter’s own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material.”

The lawsuit also claims that Twitter failed in its duty to report child sexual abuse material and was negligent when it failed to take action, allowing the content to be viewed over 167,000 times and retweeted 2,200 times before it was taken down.

“Twitter’s conduct is an extreme departure from what a reasonably careful person would do in the same situation to prevent harm to others,” the lawsuit argues.

In a court filing on Wednesday, Twitter did not address allegations that it had “refused” to immediately remove the content after the complaints were made. Instead, it sought to characterize its actions as tardiness in removing the offensive content. The company defended the delay by arguing that the “sheer volume” of Tweets that are posted on the platform renders it “simply not possible for Twitter ... to remove all offending content immediately or accurately in all cases.” The company says hundreds of millions of Tweets are posted on the platform every day.
“The fact that nine days transpired before the offending content was taken down does not make Twitter liable under any applicable law,” Twitter states in its motion to dismiss.

The company is also invoking immunity under Section 230 of the Communications Decency Act. The provision largely exempts online platforms from liability for content posted by their users, although they can be held liable for content that violates anti-sex-trafficking or intellectual property laws.

“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone,” Twitter argued.

Twitter also argues that Congress’s exception to Section 230, which permits civil liability claims against online platforms that knowingly participate in a sex trafficking venture, does not apply.

“The Complaint does not come close to meeting this specific and exacting criminal standard. It does not allege any facts suggesting that Twitter knowingly participated in any kind of venture with the Perpetrators, let alone a sex trafficking (i.e., commercial sex) venture,” Twitter argued.

The social media behemoth has also argued that the plaintiffs have failed to state a claim for some of the allegations made in the complaint.

According to the lawsuit, Twitter had sent the child an email on Jan. 28, 2020, that stated it had “reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” The child had first made a complaint to Twitter on Jan. 21, 2020. The content was removed on or about Jan. 30, 2020.

“If Twitter had reviewed the material as they claimed in their response to John Doe, they would have seen the comments above, which clearly acknowledge that the material was depicting minors,” the lawsuit argues.

The lawsuit against Twitter seeks to block the platform from continuing to benefit from the illegal content posted on its site, as well as damages for the harm caused to the child.

A hearing for the motion to dismiss has been set for June 4.
An earlier version of the article misstated the year the complaints were made. The year was 2020.