By Pamela Williamson

Pamela Williamson looks into the world of Russian trolls and bots.

The use of automated bots to power political messaging is a relatively new phenomenon, although wartime propaganda has a long history. Harold Lasswell's seminal 1927 work described propaganda as "…the management of collective attitudes by the manipulation of significant symbols". During World Wars I and II propaganda was notoriously weaponised, sometimes deceiving even home audiences, via posters, animated films, comic books, leaflets, magazines, and newspapers.

Yet it remains puzzling how the mechanised bots that power Russian trolls (operators who prepare and load disinformation and misinformation, deployed in what are colloquially known as "troll farms") achieved such a viral effect on the targeted 2016 US Election, far beyond the personnel and financial resources required to operate them. Lessons were certainly learned from Islamic State, but Russian "active measures" campaigns have exploited the latest bot technology to create exponential effects. As many as 60 million bots may be infesting Facebook and manipulating its algorithms, and Russian efforts to influence the 2016 US Election are well documented in the Report of the US Senate Judiciary Committee.

Scholars have found that the prevalence and impact of Russian trolls during the election were significant, given that:

  1. “The average American encountered between one and three stories from known publishers of fake news during the month before the 2016 election”;
  2. 47% of Americans obtain their news mainly from Facebook and other social media.

Facebook has now acknowledged that Cambridge Analytica improperly gained access to personal information on 87 million Facebook users. Researchers have also found that the issues the now-indicted Internet Research Agency focused on included race, crime and policing, immigration, and guns, all issues that Trump focused on in his campaign.

Some scholars consider "Kremlin trolls" to be ideologically motivated, as distinct from other trolls, commonly understood as hostile actors on social media.

The media concepts of framing, agenda-setting and priming identified by Robert Entman (in 1993 and 2007), and the ways they create news slant and bias, may help in comprehending how media, including social media, define power today and who holds it. For Entman, meaning for an audience derives from the salience of key elements of a text and how they interact. These concepts help explain the effectiveness of Russian trolls' messaging.

Trolls

Originally the word troll signified a malevolent creature in folklore and fairy tales. Nowadays, the word describes malign online political disruptors, distractors and deflectors who disseminate fake news, lies and divisive content, super-powered by bots. They are usually (but not always) sponsored by a state. For "Putinist" Russia, the purpose is to undermine Western democracies through asymmetric hybrid warfare, using cheap internet strategies as part of "active measures" or Influence Operations. Fast, overwhelming information overload through 'weaponised' trolling powered by bots is a key tool of cyber, information and narrative warfare.

The vast number of automated fake bot accounts (an estimated 60 million on Facebook alone in 2018), each able to influence and persuade unsuspecting populations politically, exploits voters' vulnerability to such messaging and fake news. Key tactics include framing, speed, repetition, AI-driven computational propaganda, divisiveness, and the targeting of identity through familiar societal myths.

Bots

Academics describe social media bots as "automatic or semi-automatic computer programs that mimic humans… in online social networks" (in effect, computer-code stand-ins for humans). They are becoming increasingly sophisticated, mechanised and hard to identify. Some are innocuously useful, but millions more serve more sinister purposes. Robert Gorwa and Douglas Guilbeault have concluded that a better typology is required to lessen the confusion. Data brokers now trade vast amounts of information on consumers and voters to identify and profile the most effective messaging for individuals.
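To make the "hard to identify" point concrete, the sketch below shows what a crude bot-detection heuristic might look like, flagging accounts by posting rate and message duplication, two signals often discussed in the bot-detection literature. It is a minimal illustration under assumed thresholds, not any platform's actual method; the Post structure, cutoff values and function names are all hypothetical.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Dict, List, Set


@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # seconds since epoch


def flag_bot_like_accounts(posts: List[Post],
                           max_posts_per_hour: float = 30.0,
                           max_duplicate_ratio: float = 0.5) -> Set[str]:
    """Flag accounts whose posting rate or message duplication exceeds
    simple thresholds. Thresholds are illustrative, not empirical."""
    by_author: Dict[str, List[Post]] = {}
    for p in posts:
        by_author.setdefault(p.author, []).append(p)

    flagged: Set[str] = set()
    for author, items in by_author.items():
        if len(items) < 2:
            continue  # too little activity to judge
        times = [p.timestamp for p in items]
        span_hours = max((max(times) - min(times)) / 3600.0, 1e-9)
        rate = len(items) / span_hours  # posts per hour
        unique_texts = len(Counter(p.text for p in items))
        duplicate_ratio = 1.0 - unique_texts / len(items)  # 0 means all unique
        if rate > max_posts_per_hour or duplicate_ratio > max_duplicate_ratio:
            flagged.add(author)
    return flagged


if __name__ == "__main__":
    # Ten identical posts in nine seconds: flagged on both signals.
    sample = [Post("acct_a", "Share this now!", float(t)) for t in range(10)]
    print(flag_bot_like_accounts(sample))  # {'acct_a'}
```

Real detection is far harder than this sketch suggests: sophisticated bots randomise their timing and vary their wording precisely to evade such frequency and duplication signals, which is part of why Gorwa and Guilbeault call for a better typology.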

Until very recently, the failure to identify and manage bots allowed political trolls hostile to the West to infiltrate and influence the thinking of populations worldwide, and to sway democratic elections, to a degree previously thought impossible:

About 47% of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source. Social media are key conduits for fake news sites. Indeed, Russia successfully manipulated all of the major platforms during the 2016 U.S. election, according to recent congressional testimony.

The use of bots by Russian trolls on social media sites such as Twitter and Facebook has amplified their ability to confuse and divide voters by disseminating targeted, salient messages, both misinformation (false content spread without deliberate intent to deceive) and disinformation (deliberately misleading content). The trolls, incentivised as they were by the Russian "active measures" campaigns, had dramatic success in disrupting and influencing the 2016 US Election, as determined by the Senate Judiciary Committee and the US Council on Foreign Relations.

The major Western social media sites such as Facebook, Twitter and LinkedIn are all currently contending with public, media and political pressure to rein in bot-powered trolling from both Russia and the far right, which have been shown to be linked, especially in Europe and the US.

Conclusion

Countering these advanced tactics is extremely complex. Ajit Maan, who has developed and utilised weaponised narratives in warfare against jihadists, says:

The last thing we want to do is repeat the meaning the adversary has created in order to counter it with our facts. I am not a fan of counter narratives because they are defensive and reactionary when what we need is an offensive strategy that gets out ahead of an adversarial narrative and invites the target audience to understand events within a framework that is advantageous to us. But that framework, to be effective, has to come from the ground up, which means it has to come from the target audience themselves. It has to come from and speak to their identities, to the narratives they live by. We can assist them in connecting the dots in a way that is more meaningful than that which they have been provided by an adversarial narrative.

Maan builds on the seminal work of cognitive scientist George Lakoff and philosopher Mark Johnson on the neural processes of influence. She considers that modern warfare turns on influence: not just facts, but the meaning of those facts to an audience. In this she agrees with Entman.

The challenge in counteracting the pernicious influence of Russian trolls through counter-narrative Influence Operations and Information Warfare is to do so without compromising the values of Western society, e.g. human rights and democratic norms (an assumption of "goodness" that there is not space to defend here). Sara B. King considered that modern warfare is now a "battle for public opinion" rather than a contest on the traditional battlefield. Martin Libicki likened finding a good definition of Information Warfare to "the effort of the blind men to discover the nature of the elephant." He settled on Thomas Rona's definition:

the strategic, operational and tactical competitions across the spectrum of peace, crisis, crisis escalation, conflict, war, war termination, reconstitution/restoration, waged between competitors, adversaries or enemies using information means to achieve their objectives.

Counter-measures to Russia's "active measures" campaigns include the European Union's External Action Service East Strategic Communications Task Force, which produces a weekly Strategic Communications Russian Disinformation Digest. Jonathan White of the UK Institute for European Studies recommended a more coordinated and efficient approach, like that utilised by Russia and turned back on the Russians themselves, as essential if countering Russian disinformation is to be effective.

The views of Maan, who considers that countering facts misses the point and that "misidentification" is in fact adversaries' most effective strategy, may need to be incorporated into a complete narrative strategy if the West is to defend itself against modern warfare strategies that are succeeding without a shot being fired.


Pamela Williamson is a Masters student in Conflict and Terrorism Studies at the University of Auckland. 

Disclaimer: The ideas expressed in this article reflect the author’s views and not necessarily the views of The Big Q. 
