Harlan Report appeared to be a startup news program like many others. Its bio on TikTok promised to make American media great again.
“No opinions, just facts,” it stated.
Like so many insurgent media profiles, the videos posted by Harlan seemed genuinely aimed at exposing government corruption and pushing back against an otherwise left-wing-dominated mediascape.
That much was clear when a video Harlan shared went viral, earning more than 1.5 million views. It claimed to show President Joe Biden making a sexual remark at the annual NATO Summit in Washington.
But something was off.
The transcript used in the video was wrong, and Biden never said what was claimed.
There were also other red flags.
The owner of the Harlan Report account originally claimed to be a U.S. military veteran who had lost faith in Biden. Soon after, the owner claimed to be a 29-year-old Trump supporter in New York. Months later, the owner claimed to be a 31-year-old Republican social media influencer from Florida.
Then the account’s handle was changed to “Harlan_RNC,” insinuating an official link to the Republican Party.
But Harlan was neither a legitimate news source nor run by an American citizen.
The account was part of an operation dubbed “Spamouflage,” a state-backed campaign from communist China with links to Chinese law enforcement.
Unlike the Harlan Report, most of Spamouflage’s efforts are not focused on targeting American conservatives but on amplifying existing criticisms toward American society and government at large.
Other accounts create similar content tailored for Democrats, and still others aim to anger and polarize independents, further alienating them from the political process altogether.
What makes the Harlan Report persona unique is its success in finding a following and its pioneering role in targeting a niche audience the same way any advertiser would.
Now, security leaders are concerned that the Chinese Communist Party (CCP) will learn from its successes and continue to deploy Harlan-type social media profiles.
It is an issue that the Congressional Select Committee on Strategic Competition with the CCP is aware of and is now pressuring social media companies to take more seriously.
“It’s no surprise the CCP is now using fraudulent social media accounts to target our upcoming elections,” committee chair Rep. John Moolenaar (R-Mich.) said in a statement shared with The Epoch Times.
China’s Targeting Tactics
Foreign attempts to influence U.S. elections are nothing new, but their increasing stridency and varying levels of success are.

That report found that Chinese state-backed actors are “amplifying content highlighting polarizing domestic issues”—including issues related to Black Lives Matter, school campus protests, and U.S. foreign policy toward Israel and Ukraine—to sow discord among Americans.
The report found that Chinese influence operations, including Spamouflage, have historically failed to generate traction among American audiences but are now seeing sporadic breakthrough success with viral content.
Those breakthroughs, in large part, are due to the increasing use of artificial intelligence (AI) and deepfakes, which the operators behind Spamouflage use to play on the likes and dislikes of a target audience.
John Mills, who previously served as the director of cybersecurity policy at the Defense Department, told The Epoch Times that the CCP is using AI to sort and interpret user data to better exploit users’ fears and desires.
“People don’t understand the immense power of big data, big data analytics, and the AI component that China has mastered and is using on an unbelievable scale,” Mills said.
“They [the CCP] are delivering a data stream tailored and customized to that individual, knowing their likes, their dislikes, their trigger points.”
Mills said such data would help the CCP dial in information about social media users’ positive and negative interactions. Those operations could then attempt to trigger mass distrust or hysteria over real or faked events.
“This is psychological operations 101: knowing your target audience, knowing their trigger points, and that’s what they’re doing with Spamouflage on a breathtaking, unbelievable scale and creating these fake accounts,” Mills said.
In most of those cases, the accounts did not start by spreading fake content. Instead, they reshared posts created by real politicians and media outlets from both liberal and conservative sources to build followings and amplify divisive content.
As those followings grew, the profiles changed, both in who they claimed to be and in the type of content they delivered.
Mills said the technique used to identify and exploit Americans was essentially a new iteration of the same type of profiling that big tech corporations have used for years to track consumer preferences.
“When I’m looking for a trailer hitch [online], that commercial for a trailer hitch follows me wherever I go,” he said.
“Now, China has taken what our big tech was doing, but they’re doing it on a much grander scale, with a much more sinister agenda, and without any semblance of bumper cushions or guardrails.”
Conflicting Ideas About China’s Goals
The Department of Homeland Security’s 2025 Homeland Threat Assessment, published Oct. 2, anticipated that foreign use of “subversive tactics in an effort to stoke discord and undermine confidence in U.S. domestic institutions” would increase.

Recently, officials from the Office of the Director of National Intelligence (ODNI) have delivered statements to the press asserting that Russian cyber actors are attempting to elect Trump and undermine Harris.
Mills said he believes that the CCP is “trying to influence the election” to ensure the election of a candidate who would be less effective at countering its quest for global hegemony.
“What is the Chinese agenda? I think, as opposed to the Russians, who just want to create hate and discontent writ large ... it is election interference,” he said.
‘No Such Thing as Guardrails’
Just as interpretations of China’s motives remain murky, so does official guidance: the various government agencies responsible for defending Americans from such operations have largely failed to explain how everyday Americans should identify and respond to such content.

“The elections process is the golden thread of American democracy, which is why our foreign adversaries deliberately target our elections infrastructure with their influence operations,” Conley said in a statement.
“CISA is committed to doing its part to ensure these officials—and the American public—don’t have to fight this battle alone.”
When asked what Americans can do to identify and counter foreign influence operations, CISA declined to comment and instead referred The Epoch Times to the ODNI.
The ODNI did not return multiple requests for comment on the matter.
When asked what actions the State Department was taking to address foreign influence in U.S. elections, a department spokesperson said that it was “focused on the information environment overseas.”
The Epoch Times has also requested comment from the Department of Homeland Security.
A notice issued on Oct. 18 encourages Americans to verify anything they see online and to seek out “reliable sources,” including state and local government sources.
Graphika, whose report does not offer any suggestions on identifying or countering the content examined, declined to comment.
That’s a real problem, given the increasing reach of foreign influence campaigns, which, according to the Recorded Future report, frequently aim to mislead the public and engage in electioneering.
Similarly, according to research cited in the same report, most people cannot detect deepfakes and would benefit from guidance on the topic.
Even if a person has identified a deepfake video for what it is, they may still be influenced by it, particularly if it promotes an extreme belief or action.
“Even when people are aware that certain information may not be true, it still impacts their beliefs and actions,” the report reads.
“In other words, even implausible disinformation can influence political beliefs, partially beyond the awareness of recipients.”
The prevalence of deepfakes in foreign influence operations could therefore engender long-term dislike of, or distrust toward, candidates among American voters—even after those voters learn the information was not real.
When asked what advice he would give Americans, Mills said: “You should be very, very suspect of anything you see online.
“There’s no such thing as guardrails with what China is up to.”