A State Department team focused on misinformation and disinformation meets regularly with Big Tech companies, including Twitter and Facebook, an official said in a recent deposition.
Daniel Kimmage, the principal deputy coordinator of the department’s Global Engagement Center (GEC), said that the team would exchange information during the meetings, including providing social media firms with details on how foreign actors such as Russia and China promote “propaganda and disinformation.”
“The tools and techniques of our adversaries would probably be the number one topic. So what are the campaigns we see, foreign propaganda actors, like Russia, China, Iran or terrorist organizations, what campaigns are they conducting, what tools are they using, potentially which narratives they’re promoting,” Kimmage said during the deposition, which was conducted in November and made public this week. “And we would be in listening mode for anything the companies wanted to share.”
The GEC, established in the 2017 defense funding bill, has a mission of “counter[ing] foreign propaganda and disinformation.”
Focused on Foreign Disinformation
Kimmage said in his deposition that he did not recall authorizing GEC staffers to flag posts to the partnership. “We do not target American audiences,” Kimmage said. “The GEC’s concern is with the actions of foreign propaganda actors. The GEC’s concern stops there. It doesn’t extend to the speech of Americans.”
He also claimed that the center only provided Big Tech companies with information and did not pressure them to take action, including action against the spread of purported disinformation.
The center “equips people, it equips, potentially, technology companies to better understand it so that they can take whatever actions they would take to stop the spread,” Kimmage said.
The GEC makes a suite of tools available to social media companies, nonprofits, and other groups through a so-called disinfo cloud, previously released documents show. As of 2021, 1,200 entities had access to the cloud, which contained 70 tools.
Two Instances
Kimmage acknowledged coordinating with the FBI and the Department of Homeland Security, including the department’s Cybersecurity and Infrastructure Security Agency (CISA). He was also shown emails showing Alex Dempsey, a GEC official, forwarding concerns about a YouTube channel run by Americans about the origin of COVID-19.
One video featured a person who claimed COVID-19 was brought to China by an American. Dempsey flagged the video to Brian Scully, an official at CISA, who then sent the concerns to Twitter, Facebook, and Google, which owns YouTube. The incident was also made known to the FBI.
Kimmage said he was aware of the video but did not know about the emails and did not recall whether he authorized anyone under him to contact CISA.
“Are you aware of any other situations where anyone at the GEC flagged content on a YouTube channel run by Americans to CISA or a social media platform or the FBI as a disinformation campaign to be combatted?” John Sauer, Missouri’s solicitor general, who was questioning Kimmage, asked.
“No, I’m not,” Kimmage said.
“I don’t believe that chain teed up an action. I don’t believe it recommended an action,” he added later.
Kimmage also recalled how, in 2018, a colleague informed him that protesters in a Middle Eastern country were using a social media platform, which he did not identify, to communicate, and that the department was concerned for the safety of its employees.
“And this was a concern that was being tracked in real time, at the highest levels of the State Department. And that was the one time that I recall that I did communicate directly to a social media platform or representative that this was an ongoing concern,” Kimmage said. “I was very specific in my interactions, simply saying that this is a real-time situation where we believe that the safety of our personnel is at stake, and I would simply ask that you review the activity on these accounts to make a determination in line with your own terms of service.”
“I did not ask for anything to be removed,” he added, “but I did have a direct interaction about specific content motivated by security concerns about the safety of our people.”