
How Russia's Army of Trolls Built its Disinformation Campaign

Martin Innes and Andrew Dawson

7 mins read

While much recent attention has focussed upon Russian attempts to influence the 2016 United States presidential election, far less work has addressed Russia's activities in Europe. Yet there is significant evidence of similar influence and interference strategies being operationalised in Europe by the St. Petersburg-based Internet Research Agency (IRA) and allied Kremlin units.

Detailed ‘digital forensic’ investigative methods help to build an understanding of how IRA operators communicate their disinformation narratives. Here we focus on accounts oriented towards Germany, in the hope that identifying their key tactics and techniques may allow similar disinformation campaigns to be detected in other contexts.

Work at the Internet Research Agency

Two main data sources underpin the evidence: a small number of published accounts and stories from former workers at the Internet Research Agency; and analysis of the ‘FiveThirtyEight Internet Research Agency Twitter dataset’ – an extensive non‐anonymised corpus of tweets posted by IRA accounts, compiled by researchers at Clemson University.
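
The published dataset can be interrogated directly. As a minimal sketch of how the German-language subset might be isolated (the file name and column names below follow the published CSV files, but should be verified against the repository's own documentation):

```python
import pandas as pd

# Load one of the published CSV files from the FiveThirtyEight/Clemson
# repository (github.com/fivethirtyeight/russian-troll-tweets). The file
# name and column names here are assumptions based on the published data.
tweets = pd.read_csv("IRAhandle_tweets_1.csv", parse_dates=["publish_date"])

# Isolate German-language activity, mirroring the Germany-oriented
# accounts discussed here.
german = tweets[tweets["language"] == "German"]

# Rank handles by tweet volume to surface the most prolific operators.
print(german["author"].value_counts().head(20))
```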

Based on the accounts of former workers, we know that staff turnover at the IRA was high and that the workforce included many young people and students. Departments focussed upon specific geographic regions or countries, while separate units specialised in producing memes, commenting on posts by other users, and other activities.

Staff were under considerable pressure to meet their performance metrics. Individual operators ran multiple fake accounts: trolls were expected to make around fifty comments on news articles every day, or to maintain six Facebook pages. On Twitter, operators were generally responsible for around ten accounts with up to 2,000 followers each, tweeting at least fifty times a day. Some operators worked in teams, staging ‘debates’ with one another online.

These social media accounts did not transmit disinformation all the time. Mostly they mimicked the kinds of interests and values consistent with the social identities they were ‘spoofing’, and only occasionally messaged avowedly political content. This makes the task of definitively attributing accounts to Kremlin direction and control challenging.

However, analogous to police detective work and psychological profiling, little ‘tells’ can be used as clues to detect accounts run by IRA staff to spread disinformation.

How troll accounts built audience and influence

Staff would shortcut the process of building audience and influence by buying followers from websites such as ‘buycheapfollowerslikes.org’, which offer to increase a client's Twitter following by 1,000 realistic-looking bot accounts for less than $20.

Around half of the most active IRA ‘German’ Twitter accounts also engaged in ‘follower fishing’: following other accounts in bulk with the aim of getting them to reciprocate by following the IRA account back. Finally, staff engaged in ‘narrative switching’: starting by talking about fairly mundane issues before becoming overtly political, frequently in alignment with established pro‐Russian interest narratives.

For example, an IRA-run Twitter account that had started by posting anti‐Alternative für Deutschland (AfD) statements began to tweet: ‘I #chooseAfD, because I want to live in the Federal Republic instead of Caliphate #Germany!’. A possible (unsubstantiated) explanation for this switch in position is that it was a direct response to Chancellor Merkel's public statement on 14 September that the EU would not consider lifting sanctions on Russia.
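
One way such switches can be surfaced at scale is to track how an account's share of political content changes over time. A minimal sketch, continuing from the ‘german’ data frame loaded above (the keyword list, the hypothetical handle, the weekly window and the jump threshold are all illustrative assumptions, not the method used in the underlying study):

```python
import pandas as pd

# Illustrative, hypothetical keyword list; a real analysis would use a
# curated, language-specific political lexicon.
POLITICAL_TERMS = ["afd", "merkel", "islam", "flüchtling", "sanktion"]
PATTERN = "|".join(POLITICAL_TERMS)

def political_share(texts: pd.Series) -> float:
    """Fraction of tweets in a window containing a political term."""
    return texts.str.lower().str.contains(PATTERN, regex=True).mean()

# Weekly share of political content for a single account. A sharp,
# sustained jump marks a candidate ‘narrative switch’.
account = german[german["author"] == "SOME_HANDLE"]  # hypothetical handle
weekly = (account.set_index("publish_date")["content"]
          .resample("W")
          .apply(political_share))

# Flag weeks where the share jumps well above the account's running
# average (the 0.4 threshold is an arbitrary illustration).
jumps = weekly[weekly > weekly.expanding().mean().shift() + 0.4]
print(jumps)
```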

Synchronicity analysis

We used ‘synchronicity analysis’ to infer that eleven prolific IRA accounts adopting a pro‐Merkel stance were probably being controlled by a single author, possibly using a scheduling tool such as TweetDeck.

In July and August 2016, these accounts sent hundreds of pro‐Merkel tweets, during a period when Angela Merkel was under intense political pressure to step down as Chancellor; Reuters noted that in January, 40 per cent of Germans thought she should resign over her refugee policy. Media coverage has typically portrayed Russia as supporting populist, anti‐immigrant parties, yet at a point where Merkel (and EU unity) was politically weakened, Russian-controlled accounts were messaging support for her domestically.

The Russians were amplifying both sides of the political argument simultaneously, seeking to widen the social fissures associated with it. This would be consistent with the Russian state's known modus operandi of leveraging political weakness to amplify social and political tensions.

Synchronicity analysis has two potential uses. First, temporal pulses of messaging activity from an identified Kremlin-controlled account might enable identification of other ‘accounts of interest’. Second, across a number of accounts, the presence of similar pulsing sequences can be interpreted as a potential indicator of a common controller.
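
A minimal sketch of what the second use might look like in code, again continuing from the ‘german’ data frame above (the hourly bin, the Pearson correlation measure and the 0.8 cut-off are illustrative assumptions rather than the method actually used):

```python
from itertools import combinations

import pandas as pd

def pulse_vector(account_tweets: pd.DataFrame) -> pd.Series:
    """Tweets per hour: the account's temporal activity ‘pulse’."""
    return (account_tweets.set_index("publish_date")["content"]
            .resample("h").count())

def pulse_similarity(a: pd.Series, b: pd.Series) -> float:
    """Pearson correlation between two pulse vectors aligned in time."""
    aligned = pd.concat([a, b], axis=1).fillna(0)
    return aligned.iloc[:, 0].corr(aligned.iloc[:, 1])

# Compare every pair of accounts; highly correlated pulses are a
# candidate indicator of a common controller (0.8 is an assumed cut-off).
pulses = {name: pulse_vector(grp) for name, grp in german.groupby("author")}
suspects = [(a, b) for a, b in combinations(pulses, 2)
            if pulse_similarity(pulses[a], pulses[b]) > 0.8]
print(suspects)
```

On this kind of measure, a cluster of accounts run from a single scheduling tool would show up as a tight group of highly correlated pairs.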

Building deeper understanding

Once discovered, troll accounts have usually been taken down or exposed. But this piecemeal approach has enabled the IRA and other Kremlin-backed units to learn how they are being detected and to adapt their methods accordingly.

Instead, we need a strategy that builds an understanding of a disinformation network over time and then implements interventions against multiple nodes simultaneously. This mirrors how police investigators have learned to disrupt criminal networks in offline spaces so as to maximise impact.

In his coruscating account of life in Russia and the state's normalised use of ‘soft facts’ to convey multiple and shifting ‘truths’ to its citizens, Peter Pomerantsev articulates how the aggregate effect is a profound suspension of belief. Unlike classic propaganda, the design is not intended to seduce people into investing in a particular ‘truth’, but rather to leave them in a state of profound and radical doubt about what to believe – a state of epistemic anarchy.

Yet, at this precise moment, it is difficult to know how worried we should collectively be about disinformation. There are indications that it is becoming endemic, yet there is remarkably little robust evidence that such communications have a discernible, measurable impact upon how the majority of people think, feel or act.

Perhaps, then, it is more appropriate to argue that disinformation has more impact in shaping the issues we collectively think about than in what we individually think. That is, its pernicious influence resides in setting our national agendas: framing which troubles come to be defined as key public policy problems.

Read the full article on Wiley
