Facebook has long faced accusations of left-leaning bias, underscored by the overwhelmingly leftist politics of its staff. While the company has maintained that personal biases do not seep into its content policing, the documents show its engineers repeatedly conflating right-leaning arguments, some perhaps crudely presented, with "hate speech" or "trolling."
‘Red-pilling’ or Trolling?
The photos (pdf) were obtained by a Facebook contractor who was fired about a year ago, according to Project Veritas, and include a September 2017 presentation by the company's Data Science Manager Seiji Yamamoto and Chief Data Scientist Eduardo Arino de la Rubia. The presentation documents what the authors consider "coordinated trolling" and how to counter it.
Trolling is a broad term that describes intentionally eliciting a negative response from someone, most commonly online.
The authors linked trolls to a variety of conduct commonly deemed unacceptable online, such as harassment, doxxing (revealing somebody's personal information), and falsely reporting content violations.
Yet among the "destructive behaviors" the authors described was "red-pilling normies to convert them to their worldview."
As an example of such "red-pilling," the authors included a link to the YouTube video "Why Social Justice is CANCER | Identity Politics, Equality & Marxism" by Lauren Chen, also known as "Roaming Millennial."
In a video response to the document’s release, Chen said she was confused by the authors’ choice to single out her video, which she described as “super, super tame.”
“Essentially, I’m arguing that social justice is toxic because it promotes tribalism over individuality and because it chips away at the concept of equality of opportunity for individuals,” she said.
Chen acknowledged she picked a provocative title, yet she also started the video by explaining she doesn’t literally equate “social justice” with cancer. “It is frustrating that I would even need to explain things like hyperbole and metaphor,” she said.
It doesn’t appear Chen’s viewers have felt “trolled” either—the video had some 11,000 likes versus fewer than 500 dislikes as of Feb. 28.
She further took issue with the authors’ apparent belief that converting people to one’s worldview is objectionable.
Punishments
Yamamoto and de la Rubia suggested establishing a "toxic meme cache" and blocking and suppressing images that match the cache. They also recommended developing a program that could "predict" whether a user is a troll by, among other things, scouring the user's language for words like "cuck, zucced, REEE, normie, IRL, lulz, Shadilay, etc.," some of which are common slang terms in a range of online communities.
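The presentation doesn't spell out how such a predictor would be built. A minimal sketch of the keyword signal it describes might look like the following; the slang list comes from the document, while the function names, scoring method, and threshold are hypothetical:

```python
import re

# Terms the presentation lists as potential troll indicators.
TROLL_SLANG = {"cuck", "zucced", "reee", "normie", "irl", "lulz", "shadilay"}

def slang_score(text: str) -> float:
    """Return the fraction of words in `text` that appear in the slang list."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w in TROLL_SLANG) / len(words)

def looks_like_troll(text: str, threshold: float = 0.05) -> bool:
    # A production system would presumably feed signals like this into a
    # trained model; a bare keyword threshold is shown only for illustration.
    return slang_score(text) >= threshold

print(looks_like_troll("normie gets zucced, lulz"))          # True
print(looks_like_troll("meeting a friend irl later today"))  # also True
```

The second example also trips the check, which illustrates the problem with the slang list: terms such as "IRL" are ordinary internet shorthand, so a keyword signal alone would sweep in plenty of users who aren't trolling anyone.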
They proposed targeting troll accounts with “drastically limited bandwidth for a few hours,” which would slow down Facebook’s functioning for the user, as well as logging the user out or redirecting the user to their Facebook home page every few minutes.
‘Action Deboost’
Other photographs from the insider show that some Facebook pages were marked with the code "SI (Sigma): !ActionDeboostLiveDistribution," which the insider believed was used to suppress the distribution of live stream videos posted by those pages. The code was seen on pages belonging to right-leaning author and filmmaker Mike Cernovich, conservative comedian and commentator Steven Crowder, and right-leaning news site The Daily Caller.

The insider said she checked several pages belonging to left-leaning figures and entities, such as The Young Turks and Colin Kaepernick, and found that they didn't include the code.

The insider's photos "seem legitimate," former senior Facebook engineer Brian Amerige told The Epoch Times via the Facebook Messenger app. He was hesitant to trust Project Veritas, a right-leaning nonprofit, as a source, and said he hadn't seen the "deboosting" technology with his own eyes. He opined, though, that "'deboosting' is probably happening one way or another (for both good and bad reasons)."
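Neither the photos nor Amerige's comments describe how such a flag would be enforced. Conceptually, a per-page flag like this could act as a penalty in feed ranking. The sketch below is speculative: only the flag name is taken from the insider's photos, while the data model, multiplier, and function are invented for illustration.

```python
# The flag name is from the insider's photos; everything else here is an
# invented illustration, not Facebook's actual ranking code.
DEBOOST_FLAG = "!ActionDeboostLiveDistribution"
DEBOOST_MULTIPLIER = 0.1  # hypothetical penalty on the ranking score

def rank_live_video(base_score: float, page_flags: set[str]) -> float:
    """Scale a live video's feed-ranking score down if its page is flagged."""
    if DEBOOST_FLAG in page_flags:
        return base_score * DEBOOST_MULTIPLIER
    return base_score

# A flagged page's live stream would surface far less often in feeds than
# an unflagged page's stream with identical engagement.
print(rank_live_video(100.0, {DEBOOST_FLAG}))  # 10.0
print(rank_live_video(100.0, set()))           # 100.0
```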
Facebook didn’t respond to a request for comment.
Hate Speech
Hate speech, according to Facebook, refers to derogatory statements based on someone's "protected characteristics—race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, serious disease, or disability."

"We all agree that hate speech needs to be stopped," Yamamoto wrote in an internal Jan. 17, 2018, post, "but there's quite a bit of content near the perimeter of hate speech that we need to address as well."
However, the company has acknowledged that, regarding hate speech, “there is no universally accepted answer for when something crosses the line.”
Ultimately, Facebook acknowledges that its content police force, which has tripled since last year to 30,000 strong, has to make a judgment call in each case.
Amerige said that during his time at the company, he tried to change the culture from within and even gained the attention of company leadership, but eventually reached an impasse on the issue of hate speech.
“Hate speech can’t be defined consistently and it can’t be implemented reliably, so it ends up being a series of one-off ‘pragmatic’ decisions,” he previously told The Epoch Times. “I think it’s a serious strategic misstep for a company whose product’s primary value is as a tool for free expression.”