Reddit Failed to Remove Child Pornography in Timely Manner: Lawsuit

A Reddit logo at Web Summit 2018 in Lisbon, Portugal, on Nov. 6, 2018. Seb Daly/Web Summit via Getty Images
Zachary Stieber

A “troubling number of instances of child pornography” were found on the social media website Reddit, according to a new lawsuit.

The class-action suit alleges Reddit has knowingly benefited financially from videos and images posted to the site that feature children.

After facing pressure over instances of child pornography on several subreddits, the forums that make up the site, Reddit implemented a policy in 2011 banning such content.

“However, despite Reddit’s ability to enforce this policy and awareness of the continued prevalence of child pornography on its websites, Reddit continues to serve as a safe haven for such content. Indeed, although child pornography is inconsistent with the ‘policy’ announced by Reddit, there are a troubling number of instances of child pornography featured on the site. Reddit claims it is combating child pornography by banning certain subreddits, but only after they gain enough popularity to garner media attention,” the suit states.

“What’s more, Reddit has taken no action to actually prevent users from uploading child pornography in the first place. Posting material on Reddit requires no age verification of any kind. A user simply chooses a subreddit to which they intend to post, writes the text of the post, uploads an image and/or video, and clicks ‘post.’ The user is not even required to click a checkbox confirming that the post complies with Reddit’s policies. Rather, there is a small note on the site that says: ‘Please be mindful of reddit’s content policy and practice good reddiquette.’”

The suit was filed on behalf of Jane Doe, who was filmed by her boyfriend, without her consent, having sex when she was 16. Some of the videos were posted to Reddit.

Doe reported the posts to the moderators of each subreddit where they appeared, but the moderators sometimes waited several days before taking down the content.

The ex-boyfriend, however, would soon repost the same material, making the video available again for a period of time.

“And somehow it fell on Jane Doe, and not the subreddit’s moderator—to find these new posts and once again fight to have them removed. When Jane Doe was finally successful in having her ex-boyfriend’s Reddit account banned, he simply made a new account and was once again free to post all the child pornography he liked,” the suit states.

Anyone who was under the age of 19 when they appeared in a photograph or image that has been uploaded to Reddit in the past 10 years can join the class-action suit. The plaintiff believes there are “many thousand members” of the class.

Reddit told The Epoch Times via email that child sexual abuse material, or CSAM, “has no place on the Reddit platform.”

“We actively maintain policies and procedures that don’t just follow the law, but go above and beyond it. Our Content Policy prohibits any sexual or suggestive content involving minors or someone who appears to be a minor. We prohibit advertising in communities dedicated to Not Safe for Work (NSFW) content. In addition, we do not allow advertisements for sexually explicit content, products, or services anywhere on our platform,” the company said.

“We deploy both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material. When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take the steps required under law to report the relevant user(s) and preserve any necessary user data.”

Zachary Stieber
Senior Reporter
Zachary Stieber is a senior reporter for The Epoch Times based in Maryland. He covers U.S. and world news. Contact Zachary at [email protected]