Foreign Meddling in US Elections Intensifies, Likely to Persist Through Inauguration Day, IC Warns

Foreign adversaries are escalating AI-powered influence campaigns to divide Americans and undermine confidence in election processes ahead of the Nov. 5 vote.
Voters make selections at their voting booths inside an early voting site in Hendersonville, N.C., on Oct. 17, 2024. Melissa Sue Gerrits/Getty Images
Tom Ozimek

Foreign adversaries are ramping up efforts to influence American voters, and are likely to keep trying to undermine confidence in the democratic process through Inauguration Day, according to a new intelligence community assessment released just two weeks before the presidential election.

The foreign influence campaigns, which include the use of artificial intelligence (AI) to generate divisive content, are expected to intensify as Election Day nears and to persist after polls close, through Inauguration Day in January, according to an Office of the Director of National Intelligence (ODNI) security update and a declassified National Intelligence Council memo, both announced on Oct. 22.

“Foreign actors—particularly Russia, Iran, and China—remain intent on fanning divisive narratives to divide Americans and undermine Americans’ confidence in the U.S. democratic system consistent with what they perceive to be in their interests, even as their tactics continue to evolve,” reads the security update.

Social media posts, some of them likely enhanced or entirely generated by AI, were identified as the most common vehicle for foreign adversaries' election-related influence operations.

As an example, the ODNI pointed to Russian influence actors manufacturing and amplifying inauthentic content claiming that Minnesota Gov. Tim Walz, the Democratic vice-presidential nominee, was engaged in illegal activity during his earlier career. While the report did not go into specifics, it could relate to claims circulating on social media that Walz sexually assaulted a student while he was a high school teacher.

“Breaking: Tim Walz’s former student, Matthew Metro, drops a shocking allegation-claims Walz s*xually assaulted him in 1997 while Walz was his teacher at Mankato West High School. Metro was a senior at the time. If this is true, it’s a political earthquake,” reads an Oct. 16 post on X, which shared a since-deleted video of a man making the sexual assault allegations. The real Matthew Metro told The Washington Post that the speaker in the video was not him and that no such interaction with Walz had taken place. Further, Matthew Metro’s brother, Micheal Metro, told AFP that the person in the circulating video was “definitely not him.”

Darren Linvill, co-director of Clemson University’s Media Forensics Hub, told WIRED that the video appeared to be a deepfake bearing the hallmarks of Storm-1516, a group that Microsoft described as a “Kremlin-aligned troll farm” that has put out various deepfakes, including one about Vice President Kamala Harris’s supposed involvement in a hit-and-run accident.

Microsoft’s threat assessment team issued an Oct. 23 report that dovetails with the ODNI update but provides more details about disinformation campaigns from China, Iran, and Russia, including an AI-enhanced deepfake video linked to Storm-1516 that accuses Harris of illegal poaching in Africa.

Despite the heightened influence efforts, the ODNI security update stressed that there is no evidence that foreign actors have attempted to interfere with vote tabulation or election administration processes.

“Even if they decided to try, foreign actors almost certainly would not be able to manipulate election processes at a scale that would materially impact the outcome of the Presidential election without detection,” states the security update.

This message is consistent with earlier remarks made by Jen Easterly, director of the Cybersecurity and Infrastructure Security Agency (CISA), who said at the beginning of October that U.S. election systems are so secure that foreign adversaries won’t be able to manipulate the outcome of the 2024 presidential election in a “material” way.

Further, the intelligence community assessed that foreign actors will at minimum conduct information operations after Election Day through Inauguration Day, according to both the ODNI security update and the National Intelligence Council declassified memo.

“They might also consider stoking unrest and conducting localized cyber operations to disrupt election infrastructure,” the memo states. “However, we judge that operations that could affect voting or official counts are less likely because they are more difficult and bring a greater risk of US retaliation.”

Foreign adversaries, which the memo says are “better prepared” than in previous election cycles to undertake influence operations after Election Day, will “almost certainly” conduct such operations after polls close.

Their overarching aim is to sow doubt about the integrity of the November election and, more broadly, to create confusion and friction around democratic processes in the United States. Other aims include acquiring voter registration data and nonpublic information on local election officials, which they could exploit in future cyber or influence operations.

“US adversaries’ longstanding interest in undermining American democracy suggests it will be difficult to dissuade them from engaging during the post-election period,” the memo reads.

The warnings contained in the ODNI security update and National Intelligence Council memo echo those issued by the FBI and CISA on Oct. 18, when the two agencies raised the alarm about AI-assisted influence operations targeting U.S. elections.
Tom Ozimek
Reporter
Tom Ozimek is a senior reporter for The Epoch Times. He has a broad background in journalism, deposit insurance, marketing and communications, and adult education.