For six years, a mysterious group has used forged documents and a network of burner accounts to spread misinformation promoting Russian national interests, according to a new report from Graphika.
Dubbed “Secondary Infektion,” the campaign spanned a number of online platforms, beginning on the Russia-based LiveJournal in 2014 and moving to Twitter and YouTube later that year. In the years that followed, the campaign shifted to Reddit, Medium, and even the user-generated portion of BuzzFeed. All told, the report documents more than 2,500 pieces of content, posted in seven languages across more than 300 different forums, websites, and social networks.
“By April 21, 2020, Graphika had identified some 250 images that the operation had planted in its articles, almost all of them suspected forgeries,” the report states. “We expect that more remain to be found.”
According to the researchers, the content tended to promote themes that aligned with Russian national interests during the period, including portrayals of Ukraine as unreliable, hostility to NATO intervention, and personal attacks on critics of the Kremlin. The campaign took particular aim at the World Anti-Doping Agency (WADA), which instituted a four-year ban on Russian involvement in international sports after finding evidence of doctored lab results.
Despite the broad scope and long duration of the campaign, few of the posts reached a broad audience, suggesting that the groups coordinating the effort may not have succeeded in their aims. “If Secondary Infektion was aiming at viral impact,” the report concludes, “it failed.”
In many instances, users appear to have dismissed the troll posts as untrustworthy, tipped off by poor grammar or an idiosyncratic political agenda. “Repeatedly in the course of this research, Graphika came across comments below Secondary Infektion stories that questioned or ridiculed them, or called them out as ‘Russian trolls,’” the researchers say. “It is therefore especially important to maintain a sense of perspective when crafting responses to such online operations.”
The researchers found no altered audio content or deepfake video, but rather a huge volume of doctored screenshots, usually of articles or other pieces of writing. In one of the most alarming examples, the group forged a letter from Sen. Bob Corker (R-TN) to former Ukrainian Prime Minister Arseniy Yatsenyuk, supposedly accusing Yatsenyuk of provoking ethnic tensions. Corker never wrote such a letter, but the forgery was used as grounds for an Indymedia UK forum post, which accused Ukraine of “genocide against non-Ukrainians.”
As with much of the group’s writing, the editorial contained bizarre phrasing that suggests it was not written by a native English speaker. In one passage, the author says “the EU countries should bring Ukraine to senses like parents do it with a naughty boy.”
It’s still unclear who coordinated the campaign, although the researchers note some similarities to the Internet Research Agency (IRA), the Russian organization implicated in similar attempts to undermine the 2016 US presidential election. That group has remained active, with IRA-linked posts removed from Facebook as recently as 2018.