TikTok Sued by Content Moderator Over Psychological Trauma From Reviewing Graphic Videos

TikTok logos are seen on smartphones in front of a displayed ByteDance logo in this illustration taken Nov. 27, 2019. Dado Ruvic/Illustration/File Photo/Reuters
Tom Ozimek

Former TikTok content moderator Candie Frazier has filed a lawsuit in a California court, alleging that the platform and its parent company, ByteDance, failed to provide sufficient safeguards to protect the mental health of moderators whose job required them to view traumatic footage, including acts of extreme and graphic violence.

The complaint, which proposes a class-action suit and demands a jury trial, was filed in the U.S. District Court for the Central District of California on Dec. 23 and obtained by The Epoch Times. It accuses ByteDance Inc. and TikTok Inc. of failing to provide a safe workplace for content moderators, who, as part of their job to keep the social media platform “sanitized,” spend long hours reviewing and moderating videos, including some that are disturbingly graphic.

“While working at the direction of ByteDance and TikTok, Content Moderators—including Plaintiff Frazier—witness thousands of acts of extreme and graphic violence, including sexual assault, genocide, rape, and mutilation,” the complaint states.

Moderators like Frazier spend twelve hours a day reviewing and moderating content so that disturbing images don’t reach TikTok users, the complaint says. It alleges that ByteDance and TikTok are aware of the psychological harm that viewing such content inflicts on moderators, yet have failed to implement industry-recognized safety standards to protect them from trauma.

The complaint lists a number of industry standards for protecting moderators, including counseling from professionals specializing in trauma intervention, limits on how long moderators are exposed to child sexual abuse imagery, and technical safeguards such as blurring the videos under review or muting their audio.

Frazier said in the filing that, as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” she developed and now suffers from significant psychological trauma, including anxiety, depression, and post-traumatic stress disorder.

She alleges that TikTok and ByteDance are aware of the impact that reviewing graphic content has on moderators but failed to build into their moderation tools safeguards such as reducing video resolution, changing the direction of the video, or superimposing a grid onto the footage, which could “mitigate some of the harm.”

TikTok did not immediately respond to a request for comment on the complaint.

The complaint asks the court to order TikTok and ByteDance to implement safeguards and to establish a fund to pay for the diagnosis and treatment of moderators’ psychological trauma. It also seeks compensatory damages for Frazier and the class.

Tom Ozimek is a senior reporter for The Epoch Times. He has a broad background in journalism, deposit insurance, marketing and communications, and adult education.