- Establishment media claims about “Russian bots” pushing conservative causes are littered with factual errors
- Two groups at the heart of the Russian bot stories debunked media claims about their work
- Multiple media outlets falsely claimed hordes of Russian bots were rallying around Fox News host Laura Ingraham
Most of the reporting on the dashboard is “inherently inaccurate,” the alliance’s communications director, Bret Schafer, told The Daily Caller News Foundation over the weekend.
“Most notably, and this is the most common error, we don’t track bots, or, more specifically, bots are only a small portion of the network that we monitor,” Schafer said.
“We’ve tried to make this point clear in all our published reporting, yet most of the third-party reporting on the dashboard continues to appear with some variation of the headline ‘Russian bots are pushing X …’” he said. “This is inherently inaccurate.”
Examples abound of the kind of article Schafer described.
The articles leaned on two sources, both of which were misrepresented.
Moreover, an increase in percentage says little about the actual number of tweets using that hashtag. A hashtag not commonly used by the monitored accounts (e.g. #IStandWithLaura) can see a huge percentage increase when tweeted a relatively small number of times.
Friday evening, for example, “#FridayFeeling” was the top trending hashtag among accounts on the dashboard, increasing in use by 5,200 percent. But that didn’t translate to very many real tweets.
The hashtag didn’t crack the top 10 hashtags used, falling short of the hashtag “#us,” which was used just 64 times within the previous 48 hours. In other words, an impressive surge in frequency — and rising to the top of the “trending hashtags list” — only translated to a few dozen tweets.
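To illustrate the arithmetic, here is a minimal sketch using hypothetical tweet counts (these figures are illustrative only, not Hamilton 68 data, and the dashboard’s exact formula is not published in this article): a hashtag rising from a near-zero baseline registers an enormous percentage increase even though the absolute number of tweets stays tiny, while a heavily used hashtag barely moves.

```python
# Hypothetical tweet counts over two consecutive 48-hour windows.
# Illustrative numbers only -- not actual Hamilton 68 data.
previous_counts = {"#Syria": 480, "#us": 60, "#IStandWithLaura": 0}
current_counts = {"#Syria": 510, "#us": 64, "#IStandWithLaura": 26}

def percent_change(before: int, after: int) -> float:
    """Percent change between windows, treating a zero baseline as one
    tweet to avoid division by zero (an assumption for this sketch)."""
    baseline = max(before, 1)
    return (after - before) / baseline * 100

for tag in current_counts:
    change = percent_change(previous_counts[tag], current_counts[tag])
    print(f"{tag}: {current_counts[tag]} tweets, {change:+.0f}% change")

# Output:
# #Syria: 510 tweets, +6% change
# #us: 64 tweets, +7% change
# #IStandWithLaura: 26 tweets, +2600% change
```

The rarely used hashtag tops the percent-change ranking on a couple dozen tweets, while the hashtags actually tweeted most often show almost no “trend” at all.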
“Trending hashtags are inherently problematic because that section only measures the percent change with a hashtag over a 48-hour period, so it’s always going to favor new hashtag campaigns over, say, #Ukraine, #US, etc.,” Schafer told TheDCNF.
“Obviously, no one was using #istandwithlaura before the David Hogg controversy, so the spike there is likely evidence of only around 20-30 tweets.”
“For that reason, I only pay attention to the trending section if, for instance, all the top 10 trending hashtags are focused on the exact same subject. That’s an indicator that I should watch that topic as there’s clearly some early interest, but even then, it’s not usually something that I would flag until I see more evidence,” Schafer said.
“Also, it’s important to stress that results on the dashboard are meant to be viewed in a nuanced way; i.e., not every URL or hashtag that appears on the dashboard should be interpreted as evidence that pro-Kremlin accounts favor or oppose a certain social or political position,” Schafer said.
“A lot of the partisan topics that appear on the dash are merely used to gain a following to push more targeted, Kremlin-friendly geopolitical content. Case in point, #Syria is far and away the most-used hashtag by monitored accounts since we started the project, and #Skripal has been at or near the top of the dashboard every day for the past month,” he continued.
“This is rarely mentioned in reports, and without that as a baseline understanding, most of the articles that focus on a single trending topic are losing the forest for the trees.”
But Botcheck doesn’t have any data on where the bots it tracks are located or who runs them, Ash Bhat, Botcheck’s cofounder, told TheDCNF on Sunday.
“We’ve been pretty vocal on this point. There isn’t any data that actually points to any specific location or group,” Bhat said. “We (being RoBhat Labs) don’t have the evidence to point to any individual group. Twitter may have that data but it’s not made publicly available.”
BI updated its article with an editor’s note on Monday in response to an inquiry from TheDCNF. “This story has been updated to clarify what types of accounts Hamilton 68’s dashboard and botcheck.me track. We have clarified that Hamilton 68 tracks Russia-linked accounts, not all of which are bots, and that botcheck.me tracks propaganda bots, which aren’t necessarily Russia-linked,” the editor’s note reads.
However, the article still relies on misleading percentage increases, which Schafer said likely amount to only “20-30 tweets.” And researchers with Botcheck saw bots pushing both sides of the gun debate.
“We saw bots on both sides push #guncontrol and #guncontrolnow during Parkland,” Bhat told TheDCNF. “What worried us was that these bots were pushing a topic that clearly would be divisive instead of a hashtag like #mentalhealth, which would be a rallying point for the nation to come together around.”
After this article was published, WaPo corrected its article in response to TheDCNF’s inquiry. “This post incorrectly stated that botcheck.me tracks Russia-linked bots specifically. It has been corrected and updated,” the correction reads.
The article now states that “many Twitter bots and propaganda accounts, including some linked to Russia, have rallied around the conservative talk-show host.” But that’s still not accurate. The 600 monitored Hamilton accounts sent roughly 20-30 tweets supporting Ingraham, according to Schafer — not quite “rallying around” her.
“There are bots on both sides of the aisle and our hypothesis is that they serve the same purpose – to further divide us politically. Categorizing all bots as being a ‘left’ or ‘right’ issue is wrong and only further divides us,” Bhat said.
That presents a sharp contrast with articles on “Russian bots” that cited Botcheck’s work.
“Russia-linked bots are promoting pro-gun messages on Twitter in an attempt to sow discord in the aftermath of the Florida school shooting, monitoring groups say,” read the lede of one such article, published by CNN.
Like other articles on the topic, the CNN article cited the Hamilton 68 dashboard and Botcheck. And, like other articles on the topic, it was misleading.
First, the Hamilton 68 data doesn’t support the claim that the Russian bots were pushing specifically pro-gun arguments.
The 600 monitored accounts combined to tweet the two hashtags 111 times in the previous 48 hours — barely more than two tweets per hour. And, as Schafer emphasized, bots only account for a “small portion” of those 600 accounts monitored on the dashboard.
Contrary to the CNN article, the pro-Russia accounts monitored on the Hamilton 68 dashboard barely engaged in the gun debate and pushed both sides of the argument, and bots accounted for only a fraction of what was tweeted on the dashboard.
Second, as noted above, Botcheck’s data doesn’t support the claim that Russian bots were only pushing pro-gun messaging after the shooting.
BI, WaPo, and CNN are the most recent outlets to warn their audiences about Russian bots, but they’re far from the only ones.
“Parkland is an instance where I was and am comfortable saying that there was Russian-linked activity that was attempting to inflame divisions. We saw a concentrated focus on the shooting in the immediate aftermath (as we did with Vegas), and a sustained promotion of divisive content related to gun control over a several week period,” Schafer told TheDCNF on Monday.
But TheNYT still “again incorrectly labelled activity on the dashboard as being the work of ‘bots,'” Schafer said. BuzzFeed declared TheNYT article “total bullshit” in its piece criticizing the media’s sensationalized coverage of “Russian bots.”
“The dashboard is always going to show results from any major breaking news story, from Hurricane Harvey to the bridge collapse to the Austin bombings. But in those instances, the level of activity was not abnormal and died off after the normal 24-48 hour news cycle. We consider that level of activity to be largely irrelevant (other than in the big picture sense of using breaking news and trending topics to engage with users online),” he explained.
“If there are several thousand tweets on a subject or a continued focus on a topic over a several week period, that’s noteworthy. If there are a few hundred tweets (or less) on a subject over a day or two, that’s something that we dismiss as noise,” he added. “That’s the key point that is often lost. As researchers we look for trends over time or abnormal spikes in activity; therefore, it obviously is potentially problematic when someone looks at the dashboard for two minutes and cites one hashtag or URL as evidence of a campaign of influence.”
Schafer believes “the vast majority of the misinterpretations and misunderstandings of Hamilton 68 would be solved if those reporting on the dashboard would read our original methodology paper and follow-up paper.”
“They are not all in Russia,” Clint Watts, one of Hamilton 68’s co-founders, told BuzzFeed. “We don’t even think they’re all commanded in Russia — at all. We think some of them are legitimately passionate people that are just really into promoting Russia.”
Schafer couldn’t rule out that pro-Russia Americans could be among the accounts tracked on the Hamilton 68 dashboard. Researchers “weed out” any known American accounts from the list of monitored accounts, he told TheDCNF, and he said the list is believed to be 95-98 percent accurate.
Still, he said, “We can’t be certain whether some accounts are run by those impersonating average Americans or by average Americans who are heavily engaged with pro-Kremlin content.”
What Schafer is certain about: the Hamilton 68 dashboard doesn’t represent an army of Russian bots.