According to comments obtained by ZDNet, Meta’s Head of Public Policy for Australia, Josh Machin, said that politicians’ claims will not be fact-checked.
“The speech of politicians are already very highly scrutinised,” he said.
“It’s scrutinised by [journalists], but also by academics, experts, and their political opponents who are pretty well-positioned to push back or indicate they don’t believe something’s right if they think they’re being mischaracterised.”

As part of the tech giant’s plans, Meta has expanded its local third-party fact-checking network to include RMIT FactLab and Australian Associated Press (AAP), in addition to pre-existing partner Agence France Presse.
Meta is also providing additional funding to expand its fact-checking partners’ capacity to rate and review content as the election approaches.
“We’ll also be providing one-off grants to all our fact-checkers to increase their capacity in the lead up to the election,” he said. “Our fact-checkers work to reduce the spread of misinformation across Meta’s services. When they rate something as false, we significantly reduce its distribution, so fewer people see it. We also notify people who try to share something rated as false and add a warning label with a link to a debunking article.”
In addition to removing and limiting the reach of “harmful or misleading electoral misinformation,” Machin said Australians also need to be equipped so that “they can make an informed decision on what to read, trust and share.”
“Those are the three largest communities of non-English speaking people within Australia, and we’ve been very conscious of the risk of potential misinformation, particularly amongst the Chinese-speaking community,” Machin told AAP.
Meta Will Enforce Community Standards on Politicians
The comments from Machin on the election come after Meta was grilled by the leader of the United Australia Party, Craig Kelly MP, at a hearing of the Select Committee on Social Media and Online Safety on March 2.

Kelly, whose Facebook and Instagram pages were banned by Meta in 2021 for allegedly breaching the company’s misinformation policy over posts about the use of ivermectin and hydroxychloroquine for the treatment of COVID-19, asked Machin to confirm that Meta would guarantee “there will be no foreign interference by Meta in the Australian election” by blocking, shadow-banning, or deplatforming political candidates or parties.
“We enforce our policies and our community standards consistently on users on our platform, whether they’re a private individual or a public figure,” Machin said.
“If a piece of content violates our community standards, then, yes, we'll be removing it. That’s a really important protection that we have in place in order to protect the safety and the integrity of the election campaign.”
Kelly also queried whether Meta’s enforcement of its community standards undermined or threatened Australia’s democracy.
“If someone who’s a registered candidate at an election, or a registered political party, makes a particular statement, you are saying that, if it somehow doesn’t go along with your community standards or if one of your fact-checkers disagrees with it, you will actually block, censor, shadowban or deplatform that political candidate or political party. Isn’t that direct foreign interference in the Australian election campaign?” Kelly questioned.
Machin responded by clarifying how Facebook handles content once a fact-checker has rated it as false.
Meta To Draw on Previous Election Experience
Meta also noted in the press release that it would be drawing on its experience from more than 200 elections worldwide to tighten up cyber security and combat threats such as organised influence operations. In Australia alone, the company has invested AU$7 billion (approximately US$5 billion) in safety and security, Machin said.

“We have specialised global teams to identify and take action against threats to the election, including signs of coordinated inauthentic behaviour across our apps. We are also coordinating with the government’s election integrity assurance taskforce and security agencies in the lead up to the election,” he said. “We’ve also improved our AI so that we can more effectively detect and block fake accounts, which are often behind this activity.”