Facebook’s new “Supreme Court” ruled against Facebook four out of five times in its first batch of content-moderation rulings released to the public.
The recently staffed Oversight Board has an independent funding stream, and Facebook can’t fire its members. The company has vowed to obey the board’s rulings unless doing so would violate the law.
In an earlier controversy, Facebook took down the iconic 1972 “Napalm Girl” photograph, which shows children fleeing a napalm attack on their village during the Vietnam War, citing its policy against child nudity. The company later reversed that decision.
In September 2019, Facebook CEO Mark Zuckerberg unveiled his plan to create the board.
Board members serve a maximum of three three-year terms each and are paid an amount the company hasn’t disclosed. The company is allowed to seek an “automatic and expedited review” in exceptional circumstances, “when content could result in urgent real-world consequences,” such as someone live-streaming a murder.
In October 2020, a user posted a quotation that was incorrectly attributed to Joseph Goebbels, the Reich minister of propaganda in Nazi Germany. The post wasn’t accompanied by Nazi symbols or even a photograph of Goebbels, who killed himself in 1945 as the Soviet Army was closing in on Berlin. The quotation, in English, claimed that, instead of appealing to intellectuals, arguments should appeal to emotions and instincts, and “that truth does not matter and is subordinate to tactics and psychology.”
The user, who wasn’t identified in the case decision, said his or her intent was to draw a comparison between the sentiment in the quotation and the presidency of Donald Trump. The board held that the quotation “did not support the Nazi party’s ideology or the regime’s acts of hate and violence,” adding that comments on the post from the user’s friends supported that account of the user’s intent.
Facebook said that posts sharing a quotation attributed to a dangerous individual are treated as expressing support for that person unless the user provides additional context making the intent explicit. The company said it removed the post because the user didn’t make clear that the quotation was shared to condemn its purported author, to counter extremism or hate speech, or for academic or news purposes.
The Oversight Board determined that “these rules were not made sufficiently clear to users.”
The board didn’t rely on the First Amendment to the U.S. Constitution, which protects free speech in the United States, instead referencing documents adopted by international bodies, such as the U.N. Guiding Principles on Business and Human Rights, the International Covenant on Civil and Political Rights, and the International Convention on the Elimination of All Forms of Racial Discrimination.
In another case, the post at issue featured two widely shared photographs of a Syrian toddler of Kurdish ethnicity who drowned in 2015 while attempting to reach Europe.
Facebook removed the post as anti-Muslim hate speech, but the board ordered it restored, determining that it should be viewed as “commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China.” The board found that while the post could be deemed offensive, “it did not reach the level of hate speech.”
Another case involved a post made during last year’s armed conflict between Armenia and Azerbaijan. The user posted historical photographs described as showing churches in Baku, Azerbaijan; the accompanying Russian text claimed that Armenians built Baku and that this heritage, including the churches, had been destroyed. This was the only case in which the board sided with Facebook, upholding the post’s removal because the text also included a dehumanizing slur directed at Azerbaijanis.