Twitter is asking a federal court in Florida to dismiss a lawsuit filed by a child sex trafficking victim who alleged that the social media giant refused to immediately remove explicit content depicting the child on its platform.
The lawsuit, filed in January, accuses the Silicon Valley company of knowingly benefiting financially from the dissemination of child sexual abuse material after it failed to act immediately on multiple complaints and correspondence asking it to remove the illegal and harmful content.
“Twitter is not a passive, inactive, intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in the dissemination and knowing promotion and distribution of this harmful material,” the lawsuit states.
It further alleges that “Twitter’s own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material.”
The lawsuit also claims that Twitter failed in its duty to report child sexual abuse material and was negligent in failing to take action, allowing the content to be viewed more than 167,000 times and retweeted 2,200 times before it was taken down.
“Twitter’s conduct is an extreme departure from what a reasonably careful person would do in the same situation to prevent harm to others,” the lawsuit argues.
In its motion to dismiss, the company invokes immunity under Section 230 of the Communications Decency Act. The provision largely shields online platforms from liability for content posted by their users, although they can be held liable for content that violates anti-sex-trafficking or intellectual property laws.
“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone,” Twitter argued.
Twitter also argues that Congress's exception to Section 230, which permits civil liability claims against online platforms that knowingly participate in a sex trafficking venture, does not apply.
“The Complaint does not come close to meeting this specific and exacting criminal standard. It does not allege any facts suggesting that Twitter knowingly participated in any kind of venture with the Perpetrators, let alone a sex trafficking (i.e., commercial sex) venture,” Twitter argued.
The social media behemoth also argues that the plaintiffs have failed to state a claim for some of the allegations made in the complaint.
According to the lawsuit, Twitter had sent the child an email on Jan. 28, 2020, that stated it had “reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” The child had first made a complaint to Twitter on Jan. 21, 2020. The content was removed on or about Jan. 30, 2020.
“If Twitter had reviewed the material as they claimed in their response to John Doe, they would have seen the comments above, which clearly acknowledge that the material was depicting minors,” the lawsuit argues.
The lawsuit seeks to block the platform from continuing to benefit from the illegal content posted on its site, as well as damages for the harm caused to the child.