Facebook Won’t Fact-Check Politicians in Upcoming Australian Election: Meta

A woman holds a smartphone bearing the Facebook logo in front of Facebook’s new rebrand logo, Meta, in this illustration picture taken on Oct. 28, 2021. (Dado Ruvic/Illustration/Reuters)
Epoch Times Sydney Staff
Meta has said it will not fact-check politicians in the upcoming Australian federal election, even as it outlined extensive plans to limit the spread of “misinformation” and bolster “election integrity” by expanding its fact-checking network.

According to comments obtained by ZDNet, Meta’s Head of Public Policy in Australia, Josh Machin, said that politicians’ claims will not be fact-checked.

“The speech of politicians [is] already very highly scrutinised,” he said.

“It’s scrutinised by [journalists], but also by academics, experts, and their political opponents who are pretty well-positioned to push back or indicate they don’t believe something’s right if they think they’re being mischaracterised.”
The Facebook and Instagram parent company has invested considerable resources in working with local groups to put together a comprehensive set of election integrity measures, Machin said in a March 15 press release.

As part of the tech giant’s plans, Meta has expanded its local third-party fact-checking network to include RMIT FactLab and Australian Associated Press (AAP), alongside existing partner Agence France-Presse.

Meta is also providing additional funding to expand the fact-checkers’ capacity to rate and review content as the election approaches.

“We’ll also be providing one-off grants to all our fact-checkers to increase their capacity in the lead up to the election,” he said. “Our fact-checkers work to reduce the spread of misinformation across Meta’s services. When they rate something as false, we significantly reduce its distribution, so fewer people see it. We also notify people who try to share something rated as false and add a warning label with a link to a debunking article.”

In addition to removing and limiting the reach of “harmful or misleading electoral misinformation,” Machin said Australians need to be informed about how “they can make an informed decision on what to read, trust and share.”

To facilitate this, Meta is partnering with misinformation monitoring organisation First Draft to develop its “Don’t be a Misinfluencer” program across its platforms, which enlists creators and influencers to share tips on how to spot “false news.”
Meta is also working with AAP on its “Check The Fact” campaign, a series of short videos that will be translated into Vietnamese, Simplified Chinese, and Arabic.

“Those are the three largest communities of non-English speaking people within Australia, and we’ve been very conscious of the risk of potential misinformation, particularly amongst the Chinese-speaking community,” Machin told AAP.

Meta Will Enforce Community Standards on Politicians

Machin’s comments on the election come after Meta was grilled by United Australia Party leader Craig Kelly MP at a hearing of the Select Committee on Social Media and Online Safety on March 2.

Kelly, whose Facebook and Instagram pages were banned by Meta in 2021 for allegedly breaching the company’s misinformation policy over posts about the use of ivermectin and hydroxychloroquine to treat COVID-19, asked Machin to confirm that Meta would guarantee “there will be no foreign interference by Meta in the Australian election” by blocking, shadow-banning, or deplatforming political candidates or parties.

“We enforce our policies and our community standards consistently on users on our platform, whether they’re a private individual or a public figure,” Machin said.

“If a piece of content violates our community standards, then, yes, we’ll be removing it. That’s a really important protection that we have in place in order to protect the safety and the integrity of the election campaign.”

Kelly also queried whether Meta regarded its enforcement of these community standards as undermining or threatening Australia’s democracy.

“If someone who’s a registered candidate at an election, or a registered political party, makes a particular statement, you are saying that, if it somehow doesn’t go along with your community standards or if one of your fact-checkers disagrees with it, you will actually block, censor, shadowban or deplatform that political candidate or political party. Isn’t that direct foreign interference in the Australian election campaign?” Kelly questioned.

Machin responded by clarifying what happens when a fact-checker tells Facebook that a piece of content is false.

“We’ll apply a label to it—an interstitial—so that people are only able to view the content if they choose to deliberately click through, and we will take steps in order to reduce the distribution of that piece of content: we’ll remove it from recommendations and we’ll show it lower in people’s feeds. But I do want to clarify that we don’t wholesale remove content on the basis of a fact-checker finding it to be false; we remove content that violates our community standards,” Machin said.

Meta To Draw on Previous Election Experience

Meta also noted in the press release that it would be drawing on its experience from more than 200 elections worldwide to tighten cyber security and combat threats such as organised influence operations. In Australia alone, the company has invested AU$7 billion (approximately US$5 billion) in safety and security, Machin said.

“We have specialised global teams to identify and take action against threats to the election, including signs of coordinated inauthentic behaviour across our apps. We are also coordinating with the government’s election integrity assurance taskforce and security agencies in the lead up to the election,” he said. “We’ve also improved our AI so that we can more effectively detect and block fake accounts, which are often behind this activity.”