Facebook parent company Meta has told an Australian government inquiry that Beijing-run disinformation campaigns are rapidly evolving and moving to stir up social disunity in targeted nations.
Under questioning, Meta’s head of Public Policy in Australia, Josh Machin, revealed that half of the China-originating Coordinated Inauthentic Behavior (CIB) networks Meta has actioned over the past four years were taken down in the last seven months, following an uptick in activity.
“We have seen quite a shift in tactics and approach by China-based actors over the past seven months or so. Fifty percent of the China-originating CIB networks we have actioned in the last four years we’ve taken down in the last seven months,” Mr. Machin said.
He noted that the disinformation networks were employing a whole range of new tactics, including troll farms, advertising, and the co-opting of media outlets, non-government organisations (NGOs), and other respected third parties.
A troll farm, or troll factory, is an institutionalised group of internet trolls that strives to interfere in the political environment and decision-making processes.
“We are seeing a whole range of new tactics evolving, such as operations that are linked to troll farms; attempts to co-opt journalists, NGOs or other respected third parties; and attempts to work through PR firms,” he said, adding that the company was having success in detecting and removing the CIBs before they were able to get much traction on Meta’s applications like Facebook and Instagram.
Threads to Label State-Sponsored Media
Meta representatives told the inquiry that they will be moving to proactively label all state-affiliated media, like Xinhua News Agency or RT, on Threads so users can distinguish between them and other sources of information.

“Broader functionality around tags or labels or additional information and context that we can provide about those users are all top priorities for us as we continue to build out the product. Certainly, it’s our aspiration to bring those types of integrity measures,” Mr. Machin said.
Meta said it will remove any state-affiliated media outlets that violate its Threads policies.
Beijing Has Not Made Any Content Moderation Requests
Meta also told the inquiry that they were not involved in any content moderation in China.

“I think there are instances where, in countries like Australia, we’re able to set up frameworks where we can resolve these types of situations. But then in countries like China, our services are banned and we have different outcomes that transpire,” said Meta’s Regional Head of Policy Mia Garlick.
She said that, most recently, Meta had got into a stoush with the Russian government over content relating to the war in Ukraine.
“We will review any request we receive against our community standards and work through the different procedures there, consistent with international law and human rights standards as well,” she said.
“Most recently, in relation to the war in Ukraine, there were quite significant disagreements in relation to some of the application of our policies around misinformation that led to our services being banned in Russia and elsewhere because we were not able to reach an agreement around where our policies were being drawn.”
Mr. Machin noted that, globally, Meta applies a human rights framework to its work across all these countries and uses company structures like the Oversight Board to make binding decisions about how to approach particular pieces of content.
“We want to have those guardrails to make sure we are respecting human rights in all the countries where our services are available. I just wanted to add that that’s very important to us,” he said.
He also said that, as part of Meta’s transparency drive, the company publishes information four times a year about the requests it receives from governments in relation to content and user data on its services.