Facebook’s Oversight Board Calls for More Equitable Censorship

Tom Ozimek

Facebook parent Meta’s Oversight Board, the quasi-independent body that advises the social media giant on content policy, has issued a report warning of flaws in the company’s “cross-check” system that gives more leeway to high-profile users to post violating content.

The advisory panel has urged Meta to modify cross-check, which it calls a “false-positive mistake-prevention” program, including making it more transparent, more universally accessible, and in some cases, more strict in its censorship.

The Oversight Board, informally dubbed Facebook’s “Supreme Court,” said in its policy advisory opinion on Meta’s cross-check, or “XCheck,” program that the system “is flawed in key areas which the company must address.”

Meta says its cross-check program was set up to reduce the number of false positives and content overenforcement by establishing a list of privileged entities such as state actors, human rights organizations, and big companies that enjoy special treatment when it comes to content policing.

Meta asked the advisory panel to look into the cross-check system after The Wall Street Journal reported last year that the program was skewed to the benefit of many of its elite users, who were allowed to post material that would result in penalties for ordinary people or whose violating content was allowed to stay up longer while subject to extra layers of human review.

Among the board’s 32 recommendations for revamping the cross-check system is for Meta to expand eligibility for its VIP list while, in some cases, making enforcement more strict.

For instance, the board urged Meta to remove or hide content while it’s being reviewed and said the company should “radically increase transparency around cross-check and how it operates,” such as outlining “clear, public criteria” on who gets to be on the VIP list.

Meta said it would review the recommendations and respond within 90 days.

Delayed Removal of Violating Content

The board stated in the report that the cross-check system was skewed to give special treatment to privileged users based on business interests rather than considerations such as human rights.

“We found that the program appears more directly structured to satisfy business concerns,” the panel said.

“The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”

In 2019, the cross-check system blocked the company’s moderators from removing nude photos of a woman posted by Brazilian soccer star Neymar, even though the post violated Meta’s rules against “nonconsensual intimate imagery,” according to The Wall Street Journal report.

At the time, the Oversight Board reprimanded Meta for not being “fully forthcoming” in its disclosures about cross-check.

In the opinion issued on Dec. 6, the board said that eligibility criteria for the program should be more transparent and more universal.

“The board reiterates its concerns about inequitable access to the benefits of cross-check,” the panel said.

“Meta maintains clear processes to determine some of its users are entitled entities, such as state actors and business partners. Without clear criteria for other users who are likely to post content with significant human rights value, the program less clearly benefits others, including members of marginalized and discriminated-against groups.”

At the same time, the board recommended that Meta increase enforcement of content from privileged entities when such content undermines human rights.

“The board agrees that universal eligibility for a false-positive mistake-prevention system is a positive step,” it stated. “However, such a system should prioritize identifying content that is not also targeted by an entity-based system.”

“It should provide enhanced protection based upon a human rights rationale. While Meta may give some additional protection where over-enforcement might threaten its business interests, similarly to list-based systems, it should not do so at the expense of its human rights commitments.”

Regarding the criteria for getting on the privileged list, the board said priority should be given to human rights defenders, public officials, and journalists, rather than celebrities, musicians, political parties, and big companies.

“If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection.”

Nick Clegg, Meta’s president for global affairs, commented in a tweet that the company had asked the board to review the cross-check system “so that we can continue our work to improve the program.”

Clegg added that in order to fully address the board’s recommendations, “we’ve agreed to respond within 90 days.”

The board also stated in its report that it first became aware of the cross-check program in 2021, when it reviewed Facebook’s decision to ban former President Donald Trump from the platform.

Facebook initially suspended Trump indefinitely after the Jan. 6, 2021, riots, but later concluded it had gone too far with a permanent ban. Instead, it extended Trump’s suspension by another two years, and said it would only reinstate him “if the risk to public safety has receded.”

Trump’s two-year suspension on Facebook is scheduled to last until Jan. 7, 2023.

Trump Exempt From Facebook Fact-Checks

After Trump announced that he’s running for president in 2024, he became exempt from Facebook’s fact-checks.

That’s because Trump, like any other politician, enjoys a fact-checking carveout under the platform’s rules.

Meta’s policies on fact-checking stipulate that its fact-checkers shouldn’t rate posts and ads from politicians.

“This includes the words a politician says as well as photo, video, or other content that is clearly labeled as created by the politician or their campaign,” Meta’s policy states.

In explaining why it doesn’t fact-check politicians, Facebook said in a statement that this “approach is grounded in Facebook’s fundamental belief in free expression, respect for the democratic process, and the belief that, especially in mature democracies with a free press, political speech is the most scrutinized speech there is.”

“Just as critically, by limiting political speech we would leave people less informed about what their elected officials are saying and leave politicians less accountable for their words.”

Trump remains banned from Facebook, and his last visible post is from Jan. 6, 2021, in which he calls for “everyone at the U.S. Capitol to remain peaceful” and to “respect the Law and our great men and women in Blue.”
However, a “Team Trump” page run by his political group is active and has 2.3 million followers.

Twitter also banned Trump from its platform following the Jan. 6, 2021, riots, but the platform’s new owner, Elon Musk, reversed that decision.

Trump has yet to post on Twitter since his reinstatement.

The Associated Press contributed to this report.
Tom Ozimek is a senior reporter for The Epoch Times. He has a broad background in journalism, deposit insurance, marketing and communications, and adult education.