China Used Over 8,000 Fake Social Media Accounts to Target UK, Meta Says

The Facebook and Instagram accounts, linked to Chinese law enforcement, were shut down by Meta earlier this year.
The webpage for the Chinese version of Facebook is seen on a computer screen in Hong Kong on Feb. 2, 2012. Aaron Tam/AFP/Getty Images
Patricia Devlin

Meta shut down more than 8,000 social media accounts covertly run by China that targeted the UK, a new EU transparency report has revealed.

The Facebook and Instagram profiles were part of a network coordinated from Beijing, according to the data disclosed by the social media giant.

“One or more” of the 930 Facebook pages identified by Meta had amassed more than 560,000 followers, according to its report published by the European Commission.

Over 7,700 Facebook accounts were identified and shut down by Meta from April to May this year, along with 15 groups and 15 Instagram accounts for “inauthentic behaviour.”

Meta said in its EU report: “This network originated in China and targeted many regions around the world, including Taiwan, the United States, Australia, the United Kingdom, Japan, and global Chinese-speaking audiences.

“Around 560,000 accounts followed one or more of these Pages, and fewer than 10 accounts joined one or more of these Groups.

“We assess that this network’s Pages were likely acquired from spam operators with built-in inauthentic followers primarily from Vietnam, Bangladesh and Brazil—none of which we assess to be the targets of this operation.”

Meta tied the campaign to a group known as Spamouflage, also known as Dragonbridge, which has been linked to Chinese law enforcement agencies.

“On our platform, this network was run by geographically dispersed operators across China who appear to have been centrally provisioned with internet access and content directions,” the report said.

Russia

In addition to the covert networks run by China, Meta reported that two of the largest influence operations it had shut down were linked to Russia.

The social media company said the misinformation campaigns focused on the war in Ukraine and that it was able to link them to “private actors,” including former Wagner Group chief Yevgeny Prigozhin, who died in a plane crash last month.

“During the same period, covert influence operations have adopted a brute-force, ‘smash-and-grab’ approach of high-volume but very low-quality campaigns across the internet,” Meta said.

The data was reported to the European Commission under a new code of conduct joined by Meta, TikTok, Microsoft, and Google, aimed at curbing online misinformation under the EU’s new rules for tech giants.

X, formerly known as Twitter, withdrew from the code of conduct but was described by the commission as having the highest rate of disinformation among the social media networks.

In its report to the commission, Chinese-owned video-sharing app TikTok revealed it had shut down a “covert influence operation” network dedicated to targeting users in Ireland with “divisive” content to “intensify social conflict.”

The influence network was made up of 72 accounts that together had a following of some 94,743 users, and was shut down earlier this year.

TikTok provided no detailed data on the UK in its disinformation document; however, it did reveal that 12 accounts run by a Russian network had been removed for targeting European countries, including the UK.

It said: “The individuals behind this network used impersonation in order to artificially amplify specific viewpoints related to Ukraine’s president [Volodymyr] Zelensky, the economic sanctions currently imposed on Russia, and Ukrainian refugees.”

One of the accounts had almost 1,500 followers at the time it was removed, the social media company said.

TikTok logo on an iPhone in London on Feb. 28, 2023. Dan Kitwood/Getty Images

Online Safety Bill

Google—also signed up to the new EU rules—said it continued to track and disrupt cyberattacks on Ukraine.

It reported that “government-backed actors from Russia, Belarus, China, Iran, and North Korea have been targeting Ukrainian and Eastern European government and defence officials, military organisations, politicians, NGOs, and journalists, while financially motivated bad actors have also used the invasion as a lure for malicious campaigns.”

The tech giant said it is continuing to provide “critical cybersecurity and technical infrastructure” to Ukraine and has donated around 50,000 new Google Workspace licences to the Ukrainian government.

The detailed online misinformation reports from the tech sector are part of the EU’s efforts to tackle misleading online content under the 2022 Code of Practice on Disinformation.

The UK is not signed up to the code and is instead progressing with the newly passed Online Safety Bill.

The controversial legislation will set tougher standards for social media platforms to remove illegal content quickly or prevent it from appearing in the first place.

They will also be expected to prevent children from accessing harmful and age-inappropriate content like pornography by enforcing age limits and age-checking measures.

If companies do not comply, media regulator Ofcom will be able to issue fines of up to £18 million or 10 percent of their annual global turnover.

The legislation has been described by Technology Secretary Michelle Donelan as “game changing”; however, it has been heavily criticised by privacy campaigners.

Civil liberties group Big Brother Watch previously described the bill as “disastrous for privacy rights and free expression online.”

Patricia Devlin
Author
Patricia is an award-winning journalist based in Ireland. She specializes in investigations and giving victims of crime, abuse, and corruption a voice.