DRUMS: Distortions, Rumours, Untruths, Misinformation, and Smears


Table of contents:
CONTENTS
Introduction: The Seemingly Unrelenting Beat of DRUMS
The DRUMS Family
Cognitive Predispositions and DRUMS
The Employment of DRUMS
Countering DRUMS
Moving Forward
References
Part 1 Cognitive Predispositions and DRUMS
Chapter 1 The Psychology of Conspiracy Theories: The Role of Pattern Perception
Conspiracism and Human Psychology
Illusory Pattern Perception
Connections and Causes
Conclusion
References
Chapter 2 Believing Chicken Little: Evolutionary Perspectives on Credulity and Danger
Introduction
Evidence of Negatively-Biased Credulity and Informational Negativity Bias
Individual Differences in Negatively-Biased Credulity
Parallel Hazard Biases in Information Selection and Transmission
Hazard Biases and the Content of Culture
Conclusion
References
Part 2 The Employment of DRUMS
Chapter 3 Fake News: The Allure of the Digital Weapon
Recent Information/Disinformation Wars
Defending Against False Information and Fake News
Engaging Social Media Platforms
Legislation
Counter-messaging
Education and Critical Thinking
Conclusion
References
Chapter 4 Mapping Cyberspace: The Example of Russian Informational Actions in France
Identifying and Mapping Pro-Russian Networks on the French-Speaking Segment of Twitter
An Analysis of the Propagation of Fake or Biased Information: The Example of the Days Between the Two Rounds of the 2017 French Presidential Election
References
Chapter 5 Computational Propaganda in Europe, the U.S., and China
The Rise of Computational Propaganda
Computational Propaganda in the U.S. and Europe
Chinese Computational Propaganda
Automation, Computation, and Propaganda in China
Computational Propaganda in and about China
How Does Technology and Society Enable Computational Propaganda?
Conclusion
References
Chapter 6 Civilians in the Information Operations Battlefront: China’s Information Operations in the Taiwan Straits
Introduction
“Three Warfares” and the People’s War
Framework: ‘Three Warfares’
People’s War
The PRC’s Use of Civilian Force in its Operations Against Taiwan
Media Manipulation via Media Owners, Content Creators, and Journalists
Influence and Espionage by Civilians in Taiwan
Taiwan’s Capacity to Respond
Allegiance Warfare
Political Warfare Bureau
Other Institutions Responsible for the PRC Affairs
Cyber Army and Cyber Troops
Digital Platforms
Disinformation Specific Initiatives
Is There Room for Improvement?
Integration of Citizens
Delivering a Consistent and Timely Message
Lessons for Other States
Understanding the Threat
Citizen Participation in Policy Responses
Mobilizing Citizens against IO
Citizen Participation in Responding to IO
References
Part 3 Countering DRUMS
Chapter 7 Integrating Resilience in Defense Planning Against Information Warfare in the Post-Truth World
The Fourth Dimension of Modern War
Understanding the Audience
The Receptivity Variable
The Need for Resilience
References
Chapter 8 What Can We Learn from Russian Hostile Information Operations in Europe?
Why are States Using the Hostile Foreign Influence Toolkit?
What are the Features of Russia’s Hostile Foreign Influence Toolkit?
How are Hostile Actions Defined?
Recent Lessons Learnt from Europe: A Case Study of 2017 German and French Elections
Cyber-Security Precautions are the New Black
A Last-Minute Fight Against Disinformation is Useless
It is Too Late to Get Angry Once you Get Attacked
Russia Learnt its Lesson, Europe Should Too
References
Chapter 9 How Germany is Trying to Counter Online Disinformation
Overview
The Lisa Case
Methods and Motives
Initiatives
Hoaxmap
Fact-Checking Initiatives
Legal Situation
Conclusion
References
Chapter 10 Distinguishing Fact from Fiction in the Modern Age
Educating People for a Post-Truth World
The Post-Truth Information Age
Knowledge, Skills, and Character Qualities for the Post-Truth Information Age
Educating for the Post-Truth Information Age
About the Editors
About the Contributors



Published by World Scientific Publishing Co. Pte. Ltd.
5 Toh Tuck Link, Singapore 596224
USA office: 27 Warren Street, Suite 401-402, Hackensack, NJ 07601
UK office: 57 Shelton Street, Covent Garden, London WC2H 9HE

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

DRUMS: Distortions, Rumours, Untruths, Misinformation, and Smears
Copyright © 2019 The Authors
All rights reserved.

ISBN 978-981-3274-84-6
For any available supplementary material, please visit
https://www.worldscientific.com/worldscibooks/10.1142/11115#t=suppl

Desk Editor: Karimah Samsudin
Typeset by Stallion Press
Email: [email protected]
Printed in Singapore


CONTENTS

Introduction: The Seemingly Unrelenting Beat of DRUMS
Norman Vasu, Benjamin Ang, and Shashi Jayakumar   vii

Part 1  Cognitive Predispositions and DRUMS   1

Chapter 1  The Psychology of Conspiracy Theories: The Role of Pattern Perception
Robert Brotherton   3

Chapter 2  Believing Chicken Little: Evolutionary Perspectives on Credulity and Danger
Daniel M. T. Fessler   17

Part 2  The Employment of DRUMS   37

Chapter 3  Fake News: The Allure of the Digital Weapon
Nicolas Arpagian   39

Chapter 4  Mapping Cyberspace: The Example of Russian Informational Actions in France
Kevin Limonier and Louis Pétiniaud   49

Chapter 5  Computational Propaganda in Europe, the U.S., and China
Gillian Bolsover and Philip Howard   61

Chapter 6  Civilians in the Information Operations Battlefront: China’s Information Operations in the Taiwan Straits
Gulizar Haciyakupoglu and Benjamin Ang   83

Part 3  Countering DRUMS   115

Chapter 7  Integrating Resilience in Defense Planning Against Information Warfare in the Post-Truth World
Jānis Bērziņš   117

Chapter 8  What Can We Learn from Russian Hostile Information Operations in Europe?
Jakub Janda and Veronika Víchová   133

Chapter 9  How Germany is Trying to Counter Online Disinformation
Karolin Schwarz   147

Chapter 10  Distinguishing Fact from Fiction in the Modern Age
Andreas Schleicher   159

About the Editors   171

About the Contributors   175

INTRODUCTION: THE SEEMINGLY UNRELENTING BEAT OF DRUMS
NORMAN VASU, BENJAMIN ANG, AND SHASHI JAYAKUMAR

Fake news, while currently making headlines, is not new. Consider, as examples, the panic caused by the radio transmission of an adaptation of H.G. Wells’ The War of the Worlds in 1938 in the United States (U.S.), the role played by the rumor of tallow- and lard-greased cartridges in the Sepoy Mutiny of 1857 in India, and the use of the blood libel through history in order to persecute Jews.[1]

[1] As any volume on DRUMS should be as truthful as possible, it must be noted that the actual panic caused by the radio adaptation of H.G. Wells’ The War of the Worlds is disputed. For an excellent summary of the reasons why some hold that mass panic was greatly exaggerated, see Jefferson Pooley and Michael J. Socolow, (2013). “The Myth of the War of the Worlds Panic”, Slate, October 28. http://www.slate.com/articles/arts/history/2013/10/orson_welles_war_of_the_worlds_panic_myth_the_infamous_radio_broadcast_did.html.

Nevertheless, the issue of fake news poses a greater challenge now than ever before. It is possible to hold that the greater challenge posed by fake news today stems from the digitization of information and the Internet as a medium of its dissemination. More specifically, digitized information in the current historical epoch differs from the past owing to its velocity, intensity, and extensity. Information today moves through the Internet far more rapidly (velocity), comes in greater volume (intensity), and reaches more people than ever before (extensity). In addition, the bar for entry into the business of disseminating fake news has been continually lowered over time. Pre-Internet efforts at fake news dissemination required access to prohibitively expensive assets, such as printing presses or broadcasting stations, while modern disinformation can be spread via free Twitter or Facebook accounts. Moreover, while readers are overwhelmed by a flood of information that may be true or false, older markers of veracity such as respected publications have not kept up, nor has there been a commensurate growth in the ability to counter false or fake news. In many cases, staid publications of record such as newspapers are eclipsed by new, visually attractive, and sometimes false, sources of information. Confronted with this torrent of unrelenting information, individuals can inadvertently believe, and sometimes act on, incorrect information — a condition of modern society ripe for manipulation and exploitation by those seeking to destabilize society. Admittedly, fake news is a bloated term stuffed with possibility for those seeking to employ it.

For example, it may be understood literally as patently false news. Alternatively, it is a term that can be — as seen when deployed by President Trump or Filipino President Rodrigo Duterte — used to dismiss criticism from political opponents, or whatever information one finds inconvenient or disagrees with. To avoid the baggage and misconceptions surrounding the term fake news, this edited volume is interested precisely in content created and distributed online with the intent to destabilize society. To best capture this specific subset of fake news, we have adopted the acronym ‘DRUMS’ (Distortions, Rumors, Untruths, Misinformation, and Smears), which was coined by Singapore’s Defence Minister Ng Eng Hen. The contributions collected in this volume, bar one, have evolved from a conference on DRUMS organized by the Centre of Excellence for National Security (CENS), Singapore, in July 2017 — a conference that brought together a group of speakers from different disciplines and domains to discuss an international topic that requires a diversity of perspectives and approaches if it is to be successfully countered.[2]

[2] Motivated by the belief that international exchange and cooperation is necessary for DRUMS to be countered successfully, the editors and many of the contributors in this volume were invited to give written and oral testimony to Singapore’s Parliamentary Select Committee on Deliberate Online Falsehoods that convened in March 2018.

The DRUMS Family

How may the various forms of DRUMS be understood? Perhaps DRUMS is best understood as a grouping of information sharing a Familienähnlichkeit, or family resemblance, in the Wittgensteinian sense.

For Wittgenstein, a group of things thought of as connected by one essential feature may instead be connected by a series of overlapping similarities where no one feature is common to all within the group. As such, following Wittgenstein, DRUMS are similar to things such as games or art. While all forms of DRUMS appear connected by a common essential feature, they in fact share no single essence. Instead, the multiple forms are connected by a series of overlapping similarities, with no essence shared by all. With this concept of family resemblance in mind, it is possible to unpack the DRUMS inhabiting the online world into a range of phenomena by splitting them into six categories on a spectrum based on the degree of threat they pose to society.[3] Listed in descending order of the threat posed, they are:

1. Falsehoods knowingly distributed to undermine society;
2. Falsehoods distributed for financial gain;
3. Falsehoods from sloppy/poor journalism;
4. Beclouding of fact for political purpose;
5. Differing interpretations of facts based on ideological bias; and
6. Parody.

Category (1) is arguably the greatest threat to society, as such falsehoods are part of organized disinformation campaigns aimed at the subversion of societies as well as of processes integral to democracy, such as elections. In recent times, these campaigns have most notably been carried out by Russia as part of broader influence operations, in areas ranging from the Baltics to Central Europe to France to the U.S. (Vasu et al., 2018).

[3] There are other attempts to categorize fake news. See, for example, Damien Tambini. (2017). “Fake News: Public Policy Responses”. Media Policy Brief 20. London: Media Policy Project, LSE. http://www.bbc.com/future/story/20170301-lies-propaganda-and-fake-news-a-grand-challenge-of-our-age.

For Category (2), falsehoods are distributed to attain revenue from advertising. Examples include the Macedonian disinformation “boiler houses” that invented fake stories on the U.S. presidential election in 2016 (Subramanian, 2017). As noted by one commentator, these Macedonians “didn’t care if Trump won or lost the White House”, but they were intent on generating enough (false) content to attract viewers and thereby earn advertising revenue; this industry was described by Barack Obama towards the final week of the 2016 Presidential campaign as a “digital gold rush” (ibid.). While the destabilization of a state may not be the intended outcome, this form of fake news can have a destabilizing effect on society similar to that of the first category.

Category (3), misinformation from sloppy/poor journalism, is something societies have had to contend with since the birth of journalism. An example would be the British tabloid The Sun publishing falsehoods about the actions of Liverpool Football Club supporters during the Hillsborough tragedy (Conn, 2016). The Hillsborough disaster occurred at the Hillsborough football stadium in Sheffield, England, on April 15, 1989, during a football match between English clubs Liverpool and Nottingham Forest. Owing to an overcrowded stadium, a human crush resulted in 96 fatalities and 766 injuries. After the tragedy, The Sun — being fed falsehoods by a police force keen to shirk responsibility — ran articles claiming Liverpool fans urinated on police officers, attacked rescue workers, and pickpocketed the dead.

Category (4) emanates from groups or individuals seeking to becloud facts for political gain. This may come in the form of challenging facts by calling into question whether they are true, or by overstating elements of a truth. The most recent example of this would be Trump’s labelling of any news report that challenges his own perspective of events as “fake news”.

Separately, another example of this form of falsehood can be found in the lead-up to the Brexit referendum in the United Kingdom (U.K.). The “Leave” campaign resorted to tactics ranging from warnings that the country would be overrun by refugees and asylum seekers, to exaggerated claims that a sum of £350 million a week was being sent to Brussels by the U.K. government — money that, according to the claim, would be saved if the Leave vote won.[4]

[4] Political strategists from Leave.EU credit Facebook as a game changer for the Brexit campaign, because they were able to micro-target specific types of voters in specific locations and send them specific personalized messages. See Mikey Smith, ‘How Facebook, Fake News and Personalised Ads Could Swing the 2017 Election — And What You Can Do About It’, The Mirror, 8 May 2017, http://www.mirror.co.uk/news/politics/how-facebook-fake-news-personalised-10382585.

Category (5) stems from the different ways in which events and issues are framed and understood owing to ideological biases. More often than not, there is little malice in the framing and understanding of information that may appear clouded with political ideology. As political ideology acts both as a lens through which reality may be understood and as a decision-making tool, individuals with dissimilar ideologies will understand (and discuss) the world differently. For example, conservative news outlets may interpret new immigration laws making entry into a state more difficult in a positive light, while liberal outlets would not.

Category (6) is the creation of fake stories for entertainment. Examples include the U.K.’s Punch magazine and the online site The Onion. An unintended by-product of this form of fake news is that the gullible may believe the parody to be true. For example, China’s People’s Daily republished an article from The Onion claiming that North Korea’s Kim Jong Un was voted 2012’s sexiest man alive (Scott, 2012).

Arguably, with regard to the threat posed to a polity, the effects of Categories (1) and (2) require both attention and action.

The malicious spreading of disinformation can undermine democracies, sow discord, and lead to conflict. As for Categories (3) to (6), while they may be considered part of the family of DRUMS, action taken against these forms of disinformation would either be superfluous, as checks already exist, or possibly detrimental to society.

With regard to action being superfluous, consider Category (3), misinformation from sloppy or poor journalism. Professional journalism is an industry with a set of clear standards of sourcing that determine what can or cannot be published. In addition, the industry has internal checks, as different press houses watch and correct each other. Finally, legal action can be taken to secure retractions, attain apologies, and mete out punishment. For Categories (4) and (5), the beclouding of facts and the differing interpretation of facts stemming from different ideological lenses cannot and should not be attended to, as they are part of political hustings. Furthermore, the media often acts as a fact checker during political jousting by calling out fibs. As for Category (6), while it is often amusing when parody is mistaken for truth, the creators of parody have no motivation for their work to be taken as such. At its essence, parody is designed to imitate, make fun of, or comment on an issue in order to be humorous and/or act as commentary. To be successful, this genre of commentary and humor cannot be crafted to appear too genuine. Hence, parody polices itself, as the exercise has failed if it does indeed dupe its readers.

Cognitive Predispositions and DRUMS

Following from the establishment of the forms of DRUMS we are interested in here, Part One of this volume discusses the proclivity of humans to believe DRUMS.

Are DRUMS believed because we as a species are poorly evolved cognitively to process the volume of information we currently receive through the Internet, or does the issue lie with us regardless of the Internet?

Robert Brotherton’s contribution approaches the study of our receptivity to DRUMS through a discussion of the psychology of conspiracy theories. For Brotherton, conspiracy theories are claims that events are secretly orchestrated for nefarious ends. However, unlike the representation in popular culture of individuals who believe conspiracy theories as unhinged, Brotherton argues that belief in conspiracy theories stems from a basic element of human psychology. Humans have evolved to be a pattern-seeking species, as this permits us to anticipate phenomena such as the changing of the seasons or the habits of predators. However, in a state of heightened insecurity and threat, where one has a sense of diminished control, our pattern-seeking nature may search for and find patterns and causality where none exist. In his chapter, Brotherton unpacks the various psychological reasons for this illusory pattern perception, such as the conjunction fallacy and the psychological need to find causality. Interestingly, Brotherton concludes that the human predisposition towards believing DRUMS has not been amplified by the ubiquity of the Internet. The modern informational environment buttressed by the Internet has not led to an increase in talk of conspiracies. In fact, much like other revolutions in communication, such as the printing press and the telegraph, fears of information overload and societal harm may currently be overstated.

While also discussing DRUMS through psychology, Daniel Fessler’s discussion of evolutionary psychology comes to a different conclusion. For Fessler, the human mind has evolved to be credulous, in that it has an innate propensity to believe what others tell us of the world.

As the mind contains mechanisms to adjust its credulity to information received based on the expected costs and benefits of belief or disbelief, it is inclined to believe information about hazards, as the cost of wrongly dismissing a true warning would far outweigh the cost of wrongly believing a false one. Under this tendency, termed “negatively-biased credulity”, people exhibit a greater propensity to follow information about hazards, to consider those providing such information as competent, and to communicate such information themselves. As a result, over time, societies tend to accumulate false information about hazards, as a self-reinforcing feedback loop may be created — negatively-biased credulity leads to the accumulation of information about hazards, this fosters the perception of a dangerous world, and that perception in turn leads to greater negatively-biased credulity. Based on this argument, Fessler effectively disagrees with Brotherton. With information being disseminated at an unprecedented scale and speed, Fessler is of the view that the human mind is currently “unprepared for the cyber-environment of the 21st century”, resulting in warnings of hazards spreading more swiftly and being believed more widely than ever before. As a result, to paraphrase Jonathan Swift, falsehoods about hazards fly, and more measured, truthful assessments come limping after them.
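The cost asymmetry behind this argument can be made concrete with a small expected-cost sketch in the spirit of error-management reasoning; the symbols below are illustrative assumptions and are not drawn from Fessler's chapter.

% Let p be the probability that a hazard warning is true,
% C_miss the cost of ignoring a warning that turns out to be true, and
% C_fa the cost of heeding a warning that turns out to be false.
% Heeding the warning has the lower expected cost whenever
\[
  p\,C_{\mathrm{miss}} \;>\; (1-p)\,C_{\mathrm{fa}}
  \quad\Longleftrightarrow\quad
  p \;>\; \frac{C_{\mathrm{fa}}}{C_{\mathrm{fa}} + C_{\mathrm{miss}}} ,
\]
% and when C_miss is much larger than C_fa this threshold sits close to
% zero, so believing hazard information can minimise expected cost even
% when the warning is far more likely to be false than true.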

The Employment of DRUMS

From a discussion of our cognitive predisposition towards believing incorrect information, Part Two of this volume discusses the manner in which DRUMS have been deployed in order to achieve political goals. Acting as an ideal segue into this section, Nicolas Arpagian’s contribution offers a general overview of how DRUMS may be considered a powerful digital weapon able to non-violently achieve the ideal Clausewitzian terminus of war, where an enemy is compelled to fulfil an adversary’s will.

Discussing the employment of Twitter and Facebook to influence the 2016 U.S. Presidential election and the Brexit referendum, Arpagian warns that influence operations could target the U.S. Congressional midterm elections in 2018. To mitigate the threat, Arpagian calls for a full-spectrum response ranging from legislation to the instilling of critical thinking.

Building on Arpagian, Kevin Limonier and Louis Pétiniaud dive deep into the network of influence uncovered by an analysis of Russian influence operations in the 2017 French Presidential Elections. Acknowledging that Russia has identified cyberspace as key to its strategy to influence global politics, Limonier and Pétiniaud’s chapter fills a current gap in most analyses of Russian influence operations — there have been few attempts to quantify the phenomenon. By doing so, their contribution uncovers a diverse ‘galaxy’ of Russian influence. Their analysis concludes that within this galaxy there are identifiable central accounts operated by the Russian state apparatus that steer coherence in the galaxy, as well as create connections within it. In addition, all possible vectors are employed in Russia’s influence operations — beyond Twitter, platforms such as Facebook and VKontakte are harnessed. Finally, the large presence of seemingly third-party actors suggests that the galaxy can exist independently, without any action by Moscow.

Gillian Bolsover and Philip Howard examine the use of computational means (such as online advertising and targeted campaigns based on big data analytics) to spread propaganda (influencing human action by the manipulation of representations), which they call “computational propaganda” (propaganda created or disseminated using computational means, especially social media ‘bots’ which automate the spread of information online). They examine the influence of social media on political processes in authoritarian and democratic contexts and conclude that it is these contexts that determine the importance of this phenomenon for modern political processes.

For example, little evidence has been found of state-driven automation of propaganda in China, compared to Europe and the U.S. They posit that computational propaganda is less about technology and more about the power over that technology — the former can be fixed by technological means, but the latter requires tackling the underlying social conditions that increase the susceptibility of individuals to emotionally laden and manipulative messages.

Recognizing that the role of civilians in state-sponsored influence operations remains an underdeveloped topic, Gulizar Haciyakupoglu and Benjamin Ang’s contribution, while not presented at the conference, studies this under-analyzed issue. Their chapter ends this section with an analysis of the role of non-combatant citizens in the People’s Republic of China’s (PRC) information operations against Taiwan, with an emphasis on attacks aimed at swaying public opinion and psychology. Haciyakupoglu and Ang assert that the plausible deniability of civilian actions in unrestricted warfare may allow attacking states to avoid blame and render diplomatic responses difficult. They suggest that target states may have to accept that attribution is a lower priority and focus on prevention and response instead. They examine Taiwan’s response capacity, identify the areas that require improvement and, building on these, offer suggestions to other states. Moreover, they demonstrate that deliberate online falsehoods may serve as part of larger multi-pronged information operations, which integrate many diverse tactics both online and offline.

Countering DRUMS

Finally, Part Three deals with the manner in which societies may protect and defend themselves against DRUMS, through the study of regions and polities with experience of it.

While one must be sensitive to the manner in which differing cultures result in differing tolerances for suggested approaches to combatting DRUMS, comparative work here would still yield dividends. As the chapters in this section illustrate, most approaches are similar but may differ in accent.

Jānis Bērziņš begins this section by arguing that the idea of influence has opened up a new dimension in modern war beyond the three dimensions of air, sea, and land. Warfare in cyberspace is this new fourth dimension, and it is a battle for the control of perception and the human mind. Similar to Arpagian’s position in Chapter Three, Bērziņš is of the view that the successful deployment of DRUMS offers an adversary the ability to win wars without fighting (or at least, to reduce the amount of fighting) by “morally and psychologically depressing the enemy’s armed forces personnel and civil population” in order to diminish the opponent’s desire to fight. From his study of Russian influence operations in the Baltics, Bērziņš argues for the integration of the concept of resilience into defence planning. For Bērziņš, influence operations exploit the gap (whether real or perceived) between the government, civil servants and politicians, and the general population, and a reduction of this gap is a priority for developing resilience. He concludes with concrete measures for reducing this gap, such as the instilling of critical thinking, the encouragement of greater political participation by members of a polity, and a clear articulation by government to the public if and when their state experiences influence operations.

Following on from Bērziņš, Jakub Janda and Veronika Víchová look into the employment of DRUMS in Russia’s influence operations in the 2017 French Presidential and German Parliamentary elections.

Janda and Víchová maintain that the goal of Russian influence operations in Western Europe is to undermine Western pushback against its military actions and aggression vis-à-vis its neighbors in the post-Soviet space. If Russia can manipulate public opinion through the targeted dissemination of disinformation, as well as help the ascendance of pro-Russian political leaders, this outcome would reduce international pressure on Russia for its occupation of neighboring countries such as Ukraine and Georgia. In order to resist such operations, Janda and Víchová argue that states cannot simply be reactive to influence operations. Instead, the countering of influence operations has to be institutionalized to ensure that long-term, slow-burn campaigns can be successfully countered, and this effort has to be supported by greater international cooperation.

Unlike the previous two chapters, which largely focused on Russian operations, Karolin Schwarz’s chapter examines the state of false news and disinformation in Germany and assesses the initiatives developed in response. Schwarz’s study reveals that DRUMS in Germany tend to (re)produce racist stereotypes and anti-Muslim sentiments that polarize society and generate distrust in public institutions, lawmakers, and police. After detailing the specific forms that DRUMS may take in Germany — such as fake eyewitness accounts, fabricated quotes, and photos and videos taken out of context — Schwarz assesses the various responses put in place to mitigate DRUMS. The response by Germany to DRUMS has largely taken the form of legislation and fact-checking initiatives. However, Schwarz warns that these two responses are far from perfect solutions. On the one hand, legislation in the form of Germany’s Netzwerkdurchsetzungsgesetz (Network Enforcement Law) has been criticized for leaving too much power in the hands of social media platforms, rather than the German courts, to decide on proper content. On the other hand, fact-checking initiatives have to move far more rapidly and accurately in countering DRUMS.

In the final chapter of Part Three, perhaps the most long-term, and most demanding, solution to the challenge posed by DRUMS is discussed in detail. While admitting that addressing the supply side of DRUMS is necessary, Andreas Schleicher argues it is “equally important to strengthen the capacity of people to navigate the new world of information”. Education in the 21st century, for Schleicher, needs to focus on developing the capacity of people to access, manage, integrate, evaluate, and reflect on written information, rather than on the mere learning of facts. It is only by developing these capacities that we can have societies composed of individuals possessing the knowledge, skills, and character to navigate what Schleicher describes as the “post-truth information age”.

Moving Forward

Undoubtedly, academics, Non-Governmental Organizations (NGOs), and governments seeking to grapple with DRUMS will have to embrace three key issues. Firstly, while the view from psychology may appear pessimistic in suggesting that we are cognitively predisposed to believe DRUMS, the fortitude of human agency to resist this predisposition cannot be denied. Predisposition should not be understood as fate. Secondly, based on the discussions in Part Two of this book, it is difficult to deny that DRUMS have become weaponized and that the genie cannot be put back in the bottle. In fact, the threat posed by DRUMS can only get worse, as more actors beyond states and extremist organizations will almost certainly adopt this form of warfare as technological advancements drive down the cost of its deployment. Modern society has to find ways to mitigate the threat it poses rather than believe the threat will go away.

Thirdly, while there is unquestionably no silver bullet and no quick fix for the problems posed by DRUMS, Janda and Víchová’s call for a whole-of-society approach must certainly be the only prudent way forward. Combatting DRUMS requires cooperation not just between tech companies and governments, but also with civil society and individuals. As Bērziņš warns, since the deployment of DRUMS seeks to exploit the social, economic, and political cracks found in society, it is only by attending to such cracks that the fraying social compact in our individual polities can be mended and the effects of DRUMS countered.

References

Conn, David. (2016). “How the Sun’s ‘truth’ about Hillsborough unravelled”, The Irish Times, April 27. https://www.irishtimes.com/news/world/uk/how-the-sun-s-truth-about-hillsborough-unravelled-1.2625459.

Pooley, Jefferson and Michael J. Socolow. (2013). “The Myth of the War of the Worlds Panic”, Slate, October 28. http://www.slate.com/articles/arts/history/2013/10/orson_welles_war_of_the_worlds_panic_myth_the_infamous_radio_broadcast_did.html.

Scott, Simon. (2012). “Sexiest Man Alive Gets ‘The Onion’ Taken Seriously”, NPR Weekend Edition Saturday, December 1. https://www.npr.org/2012/12/01/166293306/the-onion-so-funny-it-makes-us-cry.

Smith, Mikey. (2017). “How Facebook, Fake News and Personalised Ads Could Swing the 2017 Election — And What You Can Do About It”, The Mirror, May 8. http://www.mirror.co.uk/news/politics/how-facebook-fake-news-personalised-10382585.

Subramanian, Samanth. (2017). “Inside the Macedonian Fake-News Complex”, WIRED, February 15. https://www.wired.com/2017/02/veles-macedonia-fake-news/.

Tambini, Damien. (2017). “Fake News: Public Policy Responses”. Media Policy Brief 20. London: Media Policy Project, LSE. http://www.bbc.com/future/story/20170301-lies-propaganda-and-fake-news-a-grand-challenge-of-our-age.

Vasu, Norman, Benjamin Ang, Terri-Anne Teo, Shashi Jayakumar, Muhammad Faizal Bin Abdul Rahman, and Juhi Ahuja. (2018). “Fake News: National Security in the Post-Truth Era”, RSIS Policy Report, January 18. https://www.rsis.edu.sg/wp-content/uploads/2018/01/PR180119_Fake-News-National-Security-in-Post-Truth-Era.pdf.


PART 1

COGNITIVE PREDISPOSITIONS AND DRUMS


CHAPTER 1
THE PSYCHOLOGY OF CONSPIRACY THEORIES: THE ROLE OF PATTERN PERCEPTION
ROBERT BROTHERTON

One of the most notable forms of distortion, rumor, untruth, misinformation, or smear in the pop-cultural zeitgeist is the conspiracy theory — a claim that events are secretly orchestrated toward nefarious ends. And one of the most iconic tropes associated with conspiracy theories — second only to the tinfoil hat — is the “conspiracy board”. An archetypal conspiracy board consists of a plethora of news clippings and scribbled notes arranged on a wall or corkboard, joined together in an elaborate web of pushpins and string. The conspiracy board is a visual representation of the logic of conspiracism.

Its purpose is to reveal connections between ostensibly unrelated observations, usually with the strong implication, or outright declaration, that the connected dots trace the outline of a conspiracy.

In America in late 2017, the conspiracy board briefly enjoyed mainstream attention (Bump, 2017). During a House Judiciary Committee hearing on November 14, Republican Representative Louie Gohmert accused several government officials of having conflicts of interest by way of a poster-sized diagram featuring labelled boxes of various colors and shapes connected by a complex network of lines, putatively linking the Justice Department with Russia and ISIS, among others, and somewhat tautologically linking former President Barack Obama to himself. Later that same evening, conservative Fox News anchor Sean Hannity began his nightly show with a similar gambit. Hannity stood in front of a large digital screen displaying more than two dozen labelled boxes, with former Secretary of State Hillary Clinton in the center. “As you can see,” Hannity began, “we are at the Hannity Big Board tonight for a very special reason. Tonight…we are going to untangle the giant web of Clinton scandals and corruption. It’s unlike anything we have ever done before”. On the other side of the political aisle, TheRohrabacherConspiracy.com, a website paid for by the Democratic Congressional Campaign Committee, used graphics representing the traditional corkboard, pushpins, and string to putatively link Republican Representative Dana Rohrabacher to a number of people with alleged ties to Russia.

Conspiracism and Human Psychology

Though often mocked, conspiracy boards, and conspiracism more generally, may reflect a basic element of human psychology that goes far beyond claims of conspiracy. Perceiving patterns is an essential function of the human cognitive system.

Since real and meaningful patterns do exist in the world, it is a valuable skill to possess. In our evolutionary history, early humans who could accurately recognize patterns in phenomena such as the changing of the seasons, food availability, and the habits of predators would have been at an advantage compared to individuals who did not possess such an ability. However, it is possible to perceive illusory patterns, or seemingly meaningful patterns in stimuli that are actually random or chaotic. Therefore, some calibration is necessary. Too high a threshold for pattern recognition would result in missing out on patterns which are real and potentially useful. Too low a threshold and illusory patterns would be indistinguishable from real patterns.

Recent research suggests that individual differences and situational influences on pattern sensitivity may account in part for the allure of conspiracy theories. In the first research to empirically link conspiracy belief and other forms of pattern perception, Whitson and Galinsky (2008) examined four varieties: (1) visual pattern perception (seeing images within random visual static); (2) superstitions (linking an ostensibly unrelated action and a subsequent event); (3) stock market trends (perceiving illusory correlations between stock movements and world events); and lastly, (4) conspiracy beliefs (assuming that a workplace setback is caused by collusion between colleagues and bosses). In each study, the authors manipulated participants’ sense of control before assessing pattern recognition. Significant differences were observed between the two experimental conditions for all four types of pattern, such that participants made to feel a diminished sense of control reported recognizing more patterns, on average, than participants with a higher sense of control. The authors suggest that when our internal sense of control is diminished, perceiving meaningful patterns can offer compensatory control.

Subsequent experimental studies produced additional evidence that conspiracism is increased under conditions of diminished control (Rothschild et al., 2012; Sullivan et al., 2010), as well as under conditions that evoke anxiety (Grzesiak-Feldman, 2013), social threats (Moulding et al., 2016), and uncertainty (van Prooijen and Jostmann, 2013). Correlational research examining related personality measures, such as stress (Swami et al., 2016) and external locus of control (Hamsher, Geller, and Rotter, 1968; Mirowsky and Ross, 1983), also suggests relationships with conspiracy beliefs. Using a more naturalistic measure of control threat, during the three months leading up to the turn of the millennium, van Prooijen and Acker (2015) asked participants how concerned they were about the Y2K computer bug, an issue with how computers store dates, which was anticipated to cause widespread malfunctions when the year ticked over from 1999 to 2000. Self-reported feelings of threat were positively related to the endorsement of a range of conspiracy theories unrelated to the Y2K bug, suggesting that compensatory control can be derived from patterns unrelated to the control threat itself. Likewise, using Google search data, DiGrazia (2017) found that social conditions associated with threat and insecurity, such as unemployment and political losses, are associated with increased conspiratorial ideation. Of course, conspiracy theories often flourish around significant, unexpected societal crises that potentially challenge citizens’ sense of control and understanding, such as terrorist attacks, mass shootings, natural disasters, and industrial disasters (van Prooijen and Douglas, 2017). Experimental studies using fictional vignettes find that the greater the consequences of such an event, the stronger the tendency for participants to attribute the event to a conspiracy (LeBoeuf and Norton, 2012; van Prooijen and van Dijk, 2014).

Some qualifications to these findings are necessary, however.

Some conspiracy theories concern events that, on the surface, do not appear to challenge citizens’ sense of control, such as the Apollo moon landing. On the other hand, diminished control does not invariably lead to conspiracy theorising. Indeed, compensatory control can sometimes take the form of defending, rather than distrusting, institutions, including the government (Kay et al., 2009). Concerning the experimental designs used to investigate the consequences of control, van Prooijen and Acker (2015) point out that simply finding a difference between conditions of diminished and reaffirmed control raises two possibilities — diminished control may enhance conspiracism, or reaffirmed control may decrease conspiracism. By including a neutral baseline condition in addition to manipulating control to be high or low, the authors produced some evidence for the latter possibility; conspiracism was weaker among those with reaffirmed control as compared to the neutral baseline condition, but there was no difference between the neutral condition and the diminished control condition. Feeling some lack of control when it comes to geopolitical events may be the default for most people.

Illusory Pattern Perception

Despite these caveats, the evidence that a person’s sense of control is associated with their likelihood of endorsing conspiracy theories is substantial. The explanation typically offered is that control threats increase our sense-making motivation, which lowers our threshold for perceiving patterns, of which conspiracy theories are just one variety. This is bolstered by Whitson and Galinsky’s (2008) demonstration of commonality between conspiracism and other forms of pattern perception. However, this is indirect evidence. Several studies have now examined this possibility more directly.

Perhaps the most basic form of pattern recognition is perceiving meaningful patterns in simple random processes, such as a series of coin tosses. Dieguez, Wagner-Egger, and Gauvrit (2015) used the context of coin tosses to investigate whether people more prone to conspiracism demonstrate a low prior for randomness (i.e., a low threshold for judging sequences to be deterministic rather than random). The authors presented participants with ostensibly random sequences of coin tosses, which were in fact selected to vary in ‘complexity’, that is, in how closely they conform to true randomness. Contrary to expectations, however, the research found no difference between conspiracy believers’ and sceptics’ perceptions of randomness. Even when the authors bolstered the conceptual link with conspiracism by having participants contemplate the possibility that a sequence of coin tosses might indicate malevolent deception — by asking participants whether they thought a sequence was fairly produced or generated by someone ‘cheating’ by just making up a random-looking sequence — conspiracy theorists’ and sceptics’ perceptions did not differ.

In contrast to this initially negative finding, van Prooijen, Douglas, and De Inocencio (2017) did find evidence of an association between conspiracism and perception of (non)randomness in coin tosses. Speculating about the divergence in their findings, the authors suggest that conspiracist ideation may not be associated with pattern recognition in general, but specifically with illusory pattern perception. By having sequences that varied in how truly random they were, Dieguez et al. (2015) conflated recognition of genuine patterns with perception of illusory patterns, whereas van Prooijen et al.’s (2017) stimuli were entirely random. The authors report another study which provides support for this speculation (van Prooijen et al., 2017, Study 3). In this study, visual stimuli were used in place of coin tosses.

Specifically, some participants were shown abstract paintings with a meaningfully ordered structure — works by Victor Vasarely composed with regular design and alignment of figures. Other participants were shown works by Jackson Pollock — works composed of relatively chaotic brush strokes and irregular figures. The authors argue that perceiving patterns in the former would be indicative of genuine pattern recognition, whereas perceiving patterns in the latter should be indicative of illusory pattern perception. Consistent with expectations, pattern perception in general (i.e., collapsing across the two groups of participants) was unrelated to conspiracism. Examining the groups separately, illusory pattern perception predicted endorsement of conspiracy theories, whereas genuine pattern perception did not.
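As a rough illustration of what it means to score how "random" a short coin-toss sequence looks, the sketch below compares a sequence's run count against simulated fair-coin sequences of the same length. This is only a toy proxy under my own assumptions; it is not the algorithmic-complexity measure used by Dieguez et al. (2015) or the stimuli used by van Prooijen et al. (2017), and the function names are hypothetical.

import itertools
import random

def runs_count(seq: str) -> int:
    # Number of maximal runs of identical symbols, e.g. "HHTHH" -> 3.
    return sum(1 for _ in itertools.groupby(seq))

def looks_random(seq: str, trials: int = 10_000) -> float:
    # Fraction of simulated fair-coin sequences of the same length whose
    # run count deviates from the simulated mean at least as much as the
    # observed sequence. Values near 0 flag sequences that look
    # non-random (too repetitive or too regularly alternating).
    observed = runs_count(seq)
    sims = [runs_count("".join(random.choice("HT") for _ in range(len(seq))))
            for _ in range(trials)]
    mean = sum(sims) / trials
    return sum(abs(s - mean) >= abs(observed - mean) for s in sims) / trials

if __name__ == "__main__":
    for s in ["HHHHHHHHHH", "HTHTHTHTHT", "HTTHHHTHTT"]:
        print(s, round(looks_random(s), 3))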

Connections and Causes

In sum, support for a relationship between conspiracism and the most basic forms of pattern perception is more limited than initially suspected, perhaps confined only to illusory pattern perception. However, other studies suggest that a more complex form of probabilistic bias may be associated with conspiracist ideation — the conjunction fallacy. The conjunction fallacy is an error of probabilistic reasoning wherein the co-occurrence of two events is deemed more probable than one of the constituent events in isolation. In the original demonstration of the effect, Tversky and Kahneman (1983) created a description of a fictional individual, Bill, which was designed to stereotypically suggest that Bill is likely to be an accountant and unlikely to play jazz for a hobby — he was described to participants as mathematically inclined and intelligent, but unimaginative. When participants were asked to rate the likelihood of various possibilities, a substantial number rated the statement “Bill is an accountant who plays jazz for a hobby” as more likely than “Bill plays jazz for a hobby”. Since the former is a more restrictive category than the latter, doing so is taken as commission of a conjunction error.
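The error can be stated as a one-line probability inequality; the numerical values below are purely illustrative and are not taken from Tversky and Kahneman (1983).

% Let A = "Bill is an accountant" and B = "Bill plays jazz for a hobby".
% For any two events,
\[
  P(A \cap B) \;=\; P(B)\,P(A \mid B) \;\le\; P(B).
\]
% So even with illustrative values such as P(B) = 0.10 and P(A | B) = 0.30,
\[
  P(A \cap B) \;=\; 0.10 \times 0.30 \;=\; 0.03 \;<\; 0.10 \;=\; P(B),
\]
% and rating the conjunction "accountant who plays jazz" as more likely
% than "plays jazz" therefore overstates its probability, however
% persuasive the description of Bill may be.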

Brotherton and French (2014) reasoned that conspiracy theories resemble, in some regards, typical conjunction scenarios. Like the jazz-playing accountant, conspiracy theories posit that relatively more elaborate conspiracist narratives are more probable than alternative explanations with fewer conjunctive elements. For example, some conspiracy theories surrounding the assassination of President John F. Kennedy implicate a man seen conspicuously opening an umbrella moments before the assassination (see Posner, 1994). To endorse a theory in which the “Umbrella Man” is part of a conspiracy to assassinate the president is to think it more likely that the umbrella and the assassination were co-occurring elements of a single plot than merely unrelated coincidences. Brotherton and French (2014) created several conspiracy-themed conjunction scenarios in which the conjunction implied a coordinated plot, similar to the Umbrella Man theory. As expected, people higher in conspiracist ideation tended to commit more of these conspiratorial conjunction errors. Of course, it may not be surprising that people high in conspiracist ideation are drawn to conjunction items that imply conspiracy. More compellingly, conspiracist ideation was also predicted by conjunction errors on non-conspiratorial, neutral conjunction items similar to the “Bill” example. This implies that conspiracy theorists’ susceptibility to the conjunction fallacy is a domain-general bias. That finding has been replicated (Dagnall et al., 2017; Moulding et al., 2016). Moreover, Dagnall et al. (2017) administered a measure of probabilistic reasoning, involving evaluation of sequences of coin tosses similar to those used by Dieguez et al. (2015) and van Prooijen et al. (2017), in addition to a test of susceptibility to the conjunction fallacy. Performance on the probabilistic reasoning task was associated with conspiracist ideation, such that people with a low prior for randomness were more inclined to endorse conspiracy theories.

Importantly, however, conjunction fallacy susceptibility was a substantially better predictor of conspiracism than probabilistic reasoning about coin tosses. One possible reason for the relative strength of conjunction fallacy susceptibility as a predictor of conspiracism concerns causal attribution. Some patterns not only reveal connections, but also identify causes. The types of pattern examined by Whitson and Galinsky (2008) — superstitions, stock market movements, conspiracies — all offer causal explanations for the phenomena in question in a way that merely seeing patterns in sequences of coin tosses may not. There are several putative explanations for the conjunction fallacy, but one involves the ability to construct a coherent story which connects the component events by way of a unifying causal mechanism (Ahn and Bailenson, 1996; Nestler, 2008). Explaining a car accident by saying the driver is near-sighted and was driving during a severe storm, for example, may appear more plausible than either cause alone, because the conjunction creates a story that explains how poor vision combined with poor driving conditions caused the accident. In the context of conspiracy, to take the Umbrella Man theory mentioned previously, perceiving the anomalous umbrella as entirely unrelated to the nearby assassination provides no clue as to the cause of either the assassination or the presence of the umbrella. Perceiving the umbrella as part of a conspiracy — either as a signal to the gunman, or perhaps even as a concealed weapon, as at least one conspiracy theory has postulated — not only connects the dots but also suggests a causal mechanism for the assassination.

Conclusion

These findings, then, may provide at least a partial account of the allure of the conspiracy board and of conspiracy theories more generally. Conspiracy theories may satisfy epistemic motives to feel that we possess understanding and control of our circumstances, by way of identifying subjectively meaningful patterns in the informational environment. This pattern-seeking tendency may be particularly pronounced when we feel a lack of control over our circumstances (or, put another way, this motive may be relatively less pronounced when our sense of control is bolstered). Individual differences in illusory pattern perception (if not pattern recognition in general) are predictive of conspiracist ideation, but an even better predictor appears to be susceptibility to the conjunction fallacy — a specific failure of probabilistic reasoning, whereby the likelihood of a set of circumstances is influenced by the extent to which it affords a causal explanation for an outcome. Not only do conspiracy theories connect dots but perhaps, more importantly, those connections suggest causes.

In this light, there is some cause for concern that conspiracy theories and other forms of DRUMS may be exacerbated by the modern informational environment. The Internet provides instant access to practically unlimited information. This can serve as raw material for pattern-seeking brains. The speed at which ideas flow through social media may encourage quick and intuitive thinking rather than slower, dispassionate analysis. Thus, the abundance of information may not leave us better informed. With so many dots at our fingertips, the number of ways they can be connected is limited only by our imaginations. Reassuringly, one longitudinal study examining letters published in American national newspapers between 1890 and 2010 found that talk of conspiracies has not increased during that period; the advent and mass adoption of the Internet did not appear to cause an increase in conspiracism (Uscinski and Parent, 2014).
advent of the printing press and the telegraph, have been accompanied by fears of information overload and societal harm that now appear quaint and unfounded (Jungwirth and Bruce, 2002). Yet, further study of the interplay between innate psychological biases and the changing informational landscape, particularly newer developments in how social media platforms can facilitate the spread of mis- and disinformation, seems prudent.

References

Ahn, W. K., and J. Bailenson. 1996. "Causal Attribution as a Search for Underlying Mechanisms: An Explanation of the Conjunction Fallacy and the Discounting Principle." Cognitive Psychology 31(1): 82–123. https://doi.org/10.1006/cogp.1996.0013.
Brotherton, R., and C. C. French. 2014. "Belief in Conspiracy Theories and Susceptibility to the Conjunction Fallacy." Applied Cognitive Psychology 28(2): 238–248. https://doi.org/10.1002/acp.2995.
Bump, P. 2017. "The Brave New World of Political Conspiracy-Theory Illustrations." Washington Post, November 15, 2017. https://www.washingtonpost.com/news/politics/wp/2017/11/15/the-brave-new-world-of-political-conspiracy-theory-illustrations/.
Dagnall, N., A. Denovan, K. Drinkwater, A. Parker, and P. Clough. 2017. "Statistical Bias and Endorsement of Conspiracy Theories." Applied Cognitive Psychology 31(4): 368–378. https://doi.org/10.1002/acp.3331.
Dieguez, S., P. Wagner-Egger, and N. Gauvrit. 2015. "Nothing Happens by Accident, or Does It? A Low Prior for Randomness Does Not Explain Belief in Conspiracy Theories." Psychological Science 26(11): 1,762–1,770. https://doi.org/10.1177/0956797615598740.
DiGrazia, J. 2017. "The Social Determinants of Conspiratorial Ideation." Socius 3: 1–9. https://doi.org/10.1177/2378023116689791.

Grzesiak-Feldman, M. 2013. "The Effect of High-Anxiety Situations on Conspiracy Thinking." Current Psychology 32(1): 100–118. https://doi.org/10.1007/s12144-013-9165-6.
Hamsher, J. H., J. D. Geller, and J. B. Rotter. 1968. "Interpersonal Trust, Internal-External Control, and the Warren Commission Report." Journal of Personality and Social Psychology 9(3): 210.
Jungwirth, B., and B. C. Bruce. 2002. "Information Overload: Threat or Opportunity?" Journal of Adolescent and Adult Literacy 45(5): 400–406.
Kay, A. C., J. A. Whitson, D. Gaucher, and A. D. Galinsky. 2009. "Compensatory Control: Achieving Order through the Mind, Our Institutions, and the Heavens." Current Directions in Psychological Science 18(5): 264–268. https://doi.org/10.1111/j.1467-8721.2009.01649.x.
LeBoeuf, R. A., and M. I. Norton. 2012. "Consequence-Cause Matching: Looking to the Consequences of Events to Infer Their Causes." Journal of Consumer Research 39(1): 128–141. https://doi.org/10.1086/662372.
Mirowsky, J., and C. E. Ross. 1983. "Paranoia and the Structure of Powerlessness." American Sociological Review 48(2): 228–239. https://doi.org/10.2307/2095107.
Moulding, R., S. Nix-Carnell, A. Schnabel, M. Nedeljkovic, E. E. Burnside, A. F. Lentini, and N. Mehzabin. 2016. "Better the Devil You Know than a World You Don't? Intolerance of Uncertainty and Worldview Explanations for Belief in Conspiracy Theories." Personality and Individual Differences 98(August): 345–354. https://doi.org/10.1016/j.paid.2016.04.060.
Nestler, S. C. 2008. "Hindsight Bias, Conjunctive Explanations and Causal Attribution." Social Cognition 26(4): 482–493. https://doi.org/10.1521/soco.2008.26.4.482.
Posner, G. L. 1994. Case Closed: Lee Harvey Oswald and the Assassination of J.F.K. London: Warner.
van Prooijen, J.-W., and M. Acker. 2015. "The Influence of Control on Belief in Conspiracy Theories: Conceptual and Applied Extensions." Applied Cognitive Psychology 29(5): 753–761. https://doi.org/10.1002/acp.3161.

van Prooijen, J.-W., and E. van Dijk. 2014. "When Consequence Size Predicts Belief in Conspiracy Theories: The Moderating Role of Perspective Taking." Journal of Experimental Social Psychology 55 (November): 63–73. https://doi.org/10.1016/j.jesp.2014.06.006.
van Prooijen, J.-W., and K. Douglas. 2017. "Conspiracy Theories as Part of History: The Role of Societal Crisis Situations." Memory Studies. https://kar.kent.ac.uk/60677/1/vanProoijen%20Douglas%20MemoryStudies.pdf.
van Prooijen, J.-W., and N. B. Jostmann. 2013. "Belief in Conspiracy Theories: The Influence of Uncertainty and Perceived Morality." European Journal of Social Psychology 43(1): 109–115. https://doi.org/10.1002/ejsp.1922.
Rothschild, Z. K., M. J. Landau, D. Sullivan, and L. A. Keefer. 2012. "A Dual-Motive Model of Scapegoating: Displacing Blame to Reduce Guilt or Increase Control." Journal of Personality and Social Psychology 102(6): 1,148–1,163. https://doi.org/10.1037/a0027413.
Sullivan, D., M. J. Landau, and Z. K. Rothschild. 2010. "An Existential Function of Enemyship: Evidence That People Attribute Influence to Personal and Political Enemies to Compensate for Threats to Control." Journal of Personality and Social Psychology 98(3): 434–449. https://doi.org/10.1037/a0017457.
Swami, V., A. Furnham, N. Smyth, L. Weis, A. Lay, and A. Clow. 2016. "Putting the Stress on Conspiracy Theories: Examining Associations between Psychological Stress, Anxiety, and Belief in Conspiracy Theories." Personality and Individual Differences 99: 72–76.
Tversky, A., and D. Kahneman. 1983. "Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgement." Psychological Review 90(4): 293–315. https://doi.org/10.1037/0033-295X.90.4.293.
Uscinski, J. E., and J. M. Parent. 2014. American Conspiracy Theories. Oxford: Oxford University Press.
Whitson, J. A., and A. D. Galinsky. 2008. "Lacking Control Increases Illusory Pattern Perception." Science 322(5898): 115–117. https://doi.org/10.1126/science.1159845.


Chapter 2

BELIEVING CHICKEN LITTLE: EVOLUTIONARY PERSPECTIVES ON CREDULITY AND DANGER

DANIEL M. T. FESSLER

The author benefited from the support of award #FA9550-15-1-0137 from the U.S. Air Force Office of Scientific Research.

Introduction

Believing false information provided by others places individuals at risk both of being manipulated by purveyors of false information, and of unwittingly contributing to the manipulation of others through the dissemination of that information. While there are many
factors that contribute to whether or not someone believes a given claim, in this chapter, I focus on the influence of danger in shaping assessments and transmission of information. An evolutionary psychological approach to the mind (Buss, 2015) provides a useful metatheory for organizing disparate existing findings and generating novel predictions. To briefly summarize this paradigm, first, evolutionary psychologists conceptualize the mind as composed of a large number of discrete mechanisms, each of which evolved in response to a specific class of challenges (e.g., obtaining food; avoiding predators, etc.) that confronted ancestral humans. Second, because evolution by natural selection generally occurs through successive slight modifications of existing traits across generations, these mental mechanisms evolved over periods from hundreds of thousands to millions of years. As a consequence, recent rapid technological and social changes have produced environments for which humans are poorly adapted, often resulting in a mismatch between the way our minds operate, and the information presented to us. Humans are not particularly fearsome or fleet creatures, yet we dominate the globe. One key to our species’ success is our unique reliance on culture, that is, on socially transmitted information. To address challenges in their environment, many social animals primarily rely on largely innate behavioral templates combined with trial-and-error learning. In contrast, every human society is dependent on an enormous repertoire of cultural knowledge. Hence, throughout life, acquiring information from other people is central to individuals’ ability to successfully navigate their physical and social environments. Reconstructing the pathway whereby our species became so different from other animals, it is plausible that, over hundreds of thousands (if not millions) of years, hominids’ growing store of cultural information co-evolved with psychological mechanisms that maximized individuals’ capacities to obtain, retain,
and employ cultural information: the more extensive the cultural information available at any one time, the greater its adaptive value, and thus the stronger the natural selection pressures for those psychological capacities; the more developed those capacities, the more that individuals who had mastered the existing cultural information could add to it, and improve on it. Importantly, the evolutionary psychological view of the mind as composed of many discrete mechanisms suggests that there is no single, monolithic "capacity for culture". Rather, because the task of acquiring cultural information is composed of a large number of component goals, each of which entails different demands, we can expect that accessing and using cultural information will be undergirded by many different psychological mechanisms (Fessler, 2006). Here, I focus on those processes that assess the plausibility of information. Being the product of countless individual contributions, cultural knowledge evolves. One consequence is that culture often solves problems without any of its bearers truly understanding the underlying causal processes (Boyd and Richerson, 2006); for example, traditional medicines can be effective despite inaccurate ethnomedical theories of disease (de Montellano, 1975). Indeed, culture sometimes solves problems without any of its bearers even recognizing the nature of the problems being solved; for example, in regions historically plagued by many diseases, despite the absence of germ theory, traditional cuisines employ spices that possess antimicrobial properties (Billing and Sherman, 1998). Another consequence is that cultural solutions to problems are often so complex that only experts understand the relationships between particular practices and particular outcomes; as a result, the rationale behind those practices is opaque to most learners. Taken together, the above considerations indicate that, in any society, to be successful, individuals must avidly learn from those around them, and, critically, they must be credulous, that is, they must
accept as true information for which the evidentiary basis, the logical rationale, or both are entirely unclear to the learner (Saler, 2004; Fessler, Pisor and Navarrete, 2014). Although credulity is vital to success in any society, nevertheless, it comes at a cost. At the least, individuals who are overly credulous will acquire many incorrect beliefs, and these can shape their behavior in a variety of unproductive ways (Saler, 2004; Boyd and Richerson, 2006). Worse still, excessive credulity invites exploitation by malicious actors who knowingly provide false information (Saler, 2004; Kurzban, 2007). Accordingly, for any given individual in any given environment, there will be an optimal level of credulity, below which the individual fails to take advantage of amassed cultural wisdom, and above which the individual suffers burdensome false beliefs and/or outright exploitation. Importantly, this trade-off point depends in part on the type of information at issue. For information concerning hazards, failing to believe true information will often result in costly encounters with danger, whereas erroneously believing false information will often result in the adoption of precautions that, while potentially entailing costs, will frequently harm the individual less than encounters with the (purported) danger would. No equivalent asymmetry characterizes information concerning benefits. As a consequence, we can expect natural selection to have shaped the human mind so as to make people more credulous of information concerning hazards than of information concerning benefits, i.e., to exhibit negatively-biased credulity (Fessler et al., 2014; Fessler, Pisor and Holbrook, 2017). Negatively-biased credulity builds on negativity bias, the overarching tendency, evident across a wide variety of species, for information concerning threats or losses to have greater attentional salience, evoke stronger emotional responses, be more memorable, and motivate action more strongly than information concerning
opportunities or gains. Negativity bias too is explicable in evolutionary-functionalist terms, as dangers will often be more imminent than opportunities; will often preclude opportunities; and will often have a greater effect on biological fitness than opportunities (Rozin and Royzman, 2001; Baumeister et al., 2001).
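This trade-off can be sketched in simple expected-cost terms (the notation is an illustrative gloss rather than the author's own formalism). Let $p$ denote the probability that a claim about a hazard is true, $C_h$ the expected cost of an unprotected encounter with the hazard, and $C_c$ the cost of the precautions the claim recommends. Acting on the claim pays whenever

\[ p\,C_h \;>\; C_c, \qquad \text{i.e.,} \qquad p \;>\; \frac{C_c}{C_h}. \]

Because an encounter with a genuine danger is typically far costlier than a precaution ($C_h \gg C_c$), the threshold probability at which belief and action become worthwhile is low for hazard claims; no comparable asymmetry lowers the threshold for claims about benefits, and this is the logic behind negatively-biased credulity.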

Evidence of Negatively-Biased Credulity and Informational Negativity Bias

Consonant with an overarching propensity for negativity bias, the public's perception of impending economic circumstances is influenced more by negative reports than by positive ones (Soroka, 2006; Nguyen and Claus, 2013; Garz, 2012), and, in turn, news that consumer sentiment is falling has a bigger effect on the stock market than news that consumer sentiment is rising does (Akhtar et al., 2011; Akhtar et al., 2012). Likewise, in keeping with the specific tendency for negatively-biased credulity, people believe claims that commercial products are dangerous more than they believe accounts indicating that those products are safe (Siegrist et al., 2008; Siegrist and Cvetkovich, 2001; Slovic, 1993; White et al., 2003a). The processes underlying the above patterns have been explored experimentally. First, addressing the relationship between overarching negativity bias and credulity, in a number of studies, using information regarding a variety of subjects, Hilbig (2009; 2012a; 2012b) demonstrated that Germans believe information more when it is framed in a negative rather than positive manner. Paralleling Hilbig's technique of exploring the determinants of believability by altering the framing of information without changing its substantive content, Fessler, Pisor, and Navarrete (2014) examined negatively-biased credulity by presenting American subjects with sets composed of one of two paired statements, phrased so as to
emphasize either the danger presented by a situation, or the benefit that the situation entails (e.g., "Although proponents consider German shepherds loyal and intelligent pets, a recent study in the U.S. notes that this breed is responsible for 11% of dog attacks," and "Despite their fierce appearance, German shepherds are considered loyal and intelligent pets. A recent study in the U.S. notes that other breeds of dog are responsible for 89% of dog attacks"). As per predictions, participants judged statements more likely to be true when they focused on hazards than when they focused on benefits. While effective, framing manipulations such as those employed by Hilbig and Fessler et al. suffer the problem that manipulating a negative statement so as to create a positive one can lead to descriptions of benefits that consist primarily of the avoidance of hazards, thereby failing to cleanly disambiguate the two types of information. In part to address this, Fessler, Pisor, and Holbrook (2017) created statements that were thematically paired in each of eight domains, with one statement describing a hazard and one describing an unrelated benefit (e.g., "Kale contains thallium, a toxic heavy metal, that the plant absorbs from soil," and "Eating carrots results in significantly improved vision"). In studies with American participants, they found additional evidence of negatively-biased credulity. Most recently, in multiple studies employing variants of the above techniques and somewhat differing content, Samore (2017) again documented negatively-biased credulity in American participants.

Individual Differences in Negatively-Biased Credulity

Although negatively-biased credulity is thought to be a species-typical trait of human cognition, nevertheless, individual variation in this trait is plainly evident. A collection of related features underlies such variation. First, the costs and benefits of negatively-biased
credulity importantly hinge on the probability that a previously unfamiliar hazard described in a given message does, in fact, exist. One factor shaping individuals’ estimations of this probability is the frequency of other hazards. This is because hazards often co-occur. For example, if an ecosystem harbors one species of, say, dangerous predator or lethal mushroom, it often harbors others as well (Andheria et al., 2007; Cai et al., 2016); likewise, a neighborhood blighted by petty crime will frequently suffer from a variety of more serious crimes as well (Perkins et al., 1993). Accordingly, individuals living in dangerous environments will often benefit from enhanced negatively-biased credulity, as the presence of multiple known hazards increases the probability that a message purportedly describing a previously unknown hazard is accurate. Second, independent of issues of danger, people evaluate the plausibility of new information against the backdrop of their existing knowledge such that messages that are consistent with prior understanding are viewed as more plausible than those that are inconsistent with previous knowledge (White et al., 2003b). This offers another pathway whereby beliefs regarding the frequency of hazards should influence assessments of statements purporting to describe previously unknown hazards — whether the individual’s environment is objectively dangerous or not, those who believe their environment to be dangerous should find new information about hazards more congruent with their prior knowledge, and thus more plausible. Third, due to differences in physical and social resources, people differ in their ability to weather encounters with hazards. As a result, the threat posed by a given source of danger will often vary across individuals, with corresponding consequences for the utility of negatively-biased credulity. Lastly, driven by variation in personality (Zuckerman and Kuhlman, 2000) that may in part be evolutionarily maintained through frequency-dependent selection (Dall et al., 2004)
and may in part result from differing adaptive developmental trajectories (Wang et al., 2009), people differ in their willingness to take risks with their safety; correspondingly, features of personality correlate with the extent to which the world is perceived as dangerous (Dallago et al., 2012), and, together, these features likely drive enhanced negatively-biased credulity. To summarize the above, people can be expected to vary in their perceptions of the frequency of hazards in their environment and their willingness to confront them, and this variation should influence the propensity for negatively-biased credulity. Paralleling this prediction, as both a trait and a state, anxiety is associated with the tendency to acquire and transmit rumors (Anthony, 1992; Bangerter and Heath, 2004; Pezzo and Beckstead, 2006; Rosnow, 1980; Rosnow et al., 1988; Walker and Beckerle, 1987). More specifically, concerns about threats enhance susceptibility to rumors about imminent hazards (Greenhill and Oppenheim, 2017). Against this backdrop, directly testing the aforementioned prediction, Fessler, Pisor, and Navarrete (2014) found that the degree to which participants evinced negatively-biased credulity correlated with their responses on a three-item survey assessing generalized belief in a dangerous world (e.g., "The world is a dangerous place," etc.). In keeping with the personality differences described above, people differ in the extent to which they evince negativity bias in general, and threat reactivity in particular. This variation correlates with differences in political orientation, as political conservatives exhibit more overarching negativity bias, and more attention and reactivity toward threats, than political liberals do (Hibbing et al., 2014; Lilienfeld and Latzman, 2014; Ahn et al., 2014; Mills et al., 2014; Mills et al., 2016; but see, Knoll et al., 2015). Correspondingly, and critically for the present purposes, conservatives tend to see the world as more dangerous than liberals do (Federico et al., 2009). These patterns can
be understood as follows: to the extent that conservatism (or, more precisely, social conservatism) focuses on the maintenance of existing cultural practices, social structures, and institutions, it constitutes a strategy of maintaining and reinforcing systems that have effectively organized social relations to date. Conversely, to the extent that (social) liberalism embraces cultural pluralism and innovation, the reshaping of social structures, and the revamping of institutions, it constitutes a strategy of experimentation rather than maintenance. Existing practices have, by definition, passed the test of time, including weathering any dangers that confronted society and its members in the past. Experimentation necessarily entails the risk of failure, and both the likelihood of failure and the costs of failure escalate as the level of danger confronting a group increases. Accordingly, conservatism will generally be the better strategy in a dangerous world, while liberalism will be more effective in a safe world. Given the functional associations between: (i) perceptions of the world as dangerous and the value of enhanced negatively-biased credulity, and (ii) social conservatism and perceptions of the world as dangerous, it follows that social conservatives should exhibit greater negatively-biased credulity than social liberals. Fessler, Pisor, and Holbrook (2017) tested this prediction by employing the paired-statements measure of negatively-biased credulity described earlier in conjunction with a variety of existing measures of political orientation. In two studies on Americans, the authors found that, as per predictions, social conservatism was positively correlated with negatively-biased credulity. Likewise, consonant with predictions, fiscal political orientation (which concerns competing philosophies regarding the relationship between government spending and economic growth) was unrelated to negatively-biased credulity. Military conservatism, the tendency to endorse the use of force to resolve international conflicts and maintain domestic order (practices
that are of greater utility in a more dangerous world), was associated with negatively-biased credulity, albeit less so than social conservatism. In a forthcoming work, Samore, Fessler, and Holbrook (forthcoming) replicated the relationship between social conservatism and negatively-biased credulity, studying Americans approximately six and 12 months after the 2016 U.S. presidential election that reversed the political fortunes of conservatives and liberals. Contrary to competing explanations proffered by some, the core relationship between political orientation and negatively-biased credulity was not altered by this change in the power structure, supporting the thesis that it derives not from exogenous political dynamics, but from elementary psychological differences underlying political orientation.

Parallel Hazard Biases in Information Selection and Transmission

The same functionalist logic that explains negatively-biased credulity also governs the selection and transmission of information. Specifically, when given a choice as to what information to pursue, people target information concerning hazards over other types of messages, a pattern consonant with the fact that hazards are often more imminent than opportunities; preclude opportunities; and thus have a greater effect on individual welfare than do opportunities (Blaine and Boyer, 2018; Eriksson and Coultas, 2014; Eng, 2008). Likewise, paralleling the perceived greater value of information concerning hazards, participants assess individuals who provide information about hazards as being more competent than those who provide other information (Boyer and Parren, 2015). Lastly, given that: (i) people are most likely to transmit to others information that they themselves believe (Pezzo and Beckstead, 2006); (ii) people are presumably most likely to transmit to others information that they themselves would
wish to obtain; (iii) people likely understand that transmitting information to others can be an avenue for enhancing one’s own prestige; and (iv) arousal is one factor shaping individuals’ willingness to transmit information (Berger, 2011), and negative events are usually more arousing than positive ones (Baumeister et al., 2001; Rozin and Royzman, 2001), it follows that a pattern paralleling negatively-biased credulity should exist in information transmission, i.e., people should be more likely to faithfully pass on to others messages concerning hazards than messages concerning benefits. This prediction is supported by a growing body of experimental evidence (Altshteyn, 2014; Bebbington et al., 2017; Blaine and Boyer, 2018; see also, Eriksson and Coultas, 2014; Heath et al., 2001; Peters et al., 2009; but see, Eriksson et al., 2016; Stubbersfield et al., 2015).

Hazard Biases and the Content of Culture

Since culture exists primarily as information acquired, stored, and transmitted by individuals, cultural patterns observable at a large scale can reflect widespread features of the mind (Boyer, 2000; Conway and Schaller, 2007; Norenzayan and Atran, 2004; Sperber, 1996; 2006). Biases to pursue information about hazards; believe information about hazards; elevate the stature of those who provide information about hazards; and transmit to others information about hazards should, aggregated over time and numerous information transmission events, create an imbalance wherein information about hazards is more common than information about benefits. This asymmetry should be especially evident in domains where accuracy is difficult or impossible to discern. Consonant with the above prediction, rumors describing negative events spread faster and wider than those reporting positive events, even when they are of equal importance (Walker and Blaine, 1991). Likewise,
news reports that elicit high-arousal emotions are more likely to spread rapidly on the Internet, and anxiety is a central determinant in this regard (Berger and Milkman, 2012). Rumors or false reports can solidify into urban legends, that is, untrue accounts of events that: (a) purportedly happened in the present or recent past, in settings familiar to the audience, (b) are intended to be both believable and believed, (c) circulate widely in a social environment, and (d) are believed to be true or likely to be true by a substantial number of people (Tangherlini, 1990; Brunvand, 2001; Fessler et al., 2014). Fessler et al. (2014) evaluated a large sample of urban legends circulating on the Internet, finding that, in keeping with the above prediction, information concerning hazards was approximately three times more common than information concerning benefits (see also, Heath et al., 2001). Although urban legends are believed and transmitted by many individuals, they likely achieve less complete population penetration than do supernatural beliefs, another domain in which the accuracy of information cannot be assessed by prospective adherents. Fessler et al. (2014) also assessed a large sample of supernatural beliefs, collected from a representative collection of accounts of the world’s cultures. As per predictions, hazard information was a component of such beliefs approximately 1.5 times as often as was benefit information.

Conclusion

In sum, the human mind co-evolved with, and is intimately dependent upon, cultural information. As the utility and functional logic of cultural information is often opaque to learners, humans have evolved to be credulous; that is, we have an innate propensity to believe what others tell us about the world. However, because excessive credulity is costly, the mind contains mechanisms that adjust credulity in light of expected costs and benefits. If information
concerns hazards, erroneous incredulity will often be more costly than erroneous credulity, hence, people can be expected to exhibit negatively-biased credulity, a greater propensity to believe information about hazards relative to information about benefits. A growing corpus of findings directly and indirectly supports this thesis. Although negatively-biased credulity is predicted to be a species-typical characteristic, individuals are expected to differ in the extremity of this bias as a function of individual differences in sensitivity to threats, and differing perceptions of the level of danger in the world. Evidence increasingly supports this contention as well, including the translational application of this idea to the political realm where, consonant with differences in threat reactivity and dangerous-world beliefs, social conservatives have been shown to exhibit greater negatively-biased credulity than social liberals. Paralleling negative bias in credulity, and following a similar functionalist logic, people also exhibit a greater propensity to pursue information about hazards; to view as competent those who provide such information; and to transmit such information themselves. Aggregated across multiple individuals, the result is that cultures tend to accumulate false information about hazards, as evident in assessments of rumors, urban legends, and supernatural beliefs. One implication of the above portrait is that it may be possible for positive feedback loops to arise wherein negatively-biased credulity leads to greater circulation of information about hazards, causing an increase — mediated by credulity — in perceptions of the dangerousness of the world, in turn leading to greater negatively-biased credulity, and so on. Moreover, modern information technology may substantially elevate the risk that such reality-distorting feedback loops will occur. This is because: (i) mass communication channels and social media allow for the dissemination of information on unprecedented scales and at an unprecedented speed; (ii) events are witnessed onscreen as though
they occurred in the immediate vicinity even if they are, in fact, distant; (iii) whether motivated by profit or politics, media organizations seek to leverage negativity bias to gain viewers, broadcasting threat information at high rates; (iv) social media provides conduits for information transmission from individuals who, by virtue of their familiarity, are likely to be trusted more; and (v) online communities allow for self-selected segregation of like-minded individuals to a degree that is often impossible in everyday life. These features are united by a common thread, namely that the human mind, which evolved over millions of years for face-to-face information transmission, is unprepared for the cyber-environment of the 21st century. Around the world, we are currently witnessing both the political polarization and the distortion of perceptions of reality that can result.

References

Ahn, W.Y., K.T. Kishida, X. Gu, T. Lohrenz, A. Harvey, J.R. Alford, K.B. Smith, G. Yaffe, J.R. Hibbing, and P. Dayan. "Nonpolitical Images Evoke Neural Predictors of Political Ideology." Current Biology 24, no. 22 (2014): 2,693–2,699.
Akhtar, S., R. Faff, B. Oliver, and A. Subrahmanyam. "The Power of Bad: The Negativity Bias in Australian Consumer Sentiment Announcements on Stock Returns." Journal of Banking and Finance 35, no. 5 (2011): 1,239–1,249.
Akhtar, S., R. Faff, B. Oliver, and A. Subrahmanyam. "Stock Salience and the Asymmetric Market Effect of Consumer Sentiment News." Journal of Banking and Finance 36 (2012): 3,289–3,301.
Altshteyn, I. "Evidence for a Warning Bias in Information Transmission in Social Networks," diss., University of California, Los Angeles, 2014.
Andheria, A.P., K.U. Karanth, and N.S. Kumar. "Diet and Prey Profiles of Three Sympatric Large Carnivores in Bandipur Tiger Reserve, India." Journal of Zoology 273, no. 2 (2007): 169–175.

Anthony, S. "The Influence of Personal Characteristics on Rumor Knowledge and Transmission Among the Deaf." American Annals of the Deaf 137, no. 1 (1992): 44–47.
Bangerter, A. and C. Heath. "The Mozart Effect: Tracking the Evolution of a Scientific Legend." British Journal of Social Psychology 43, no. 4 (2004): 605–623.
Baumeister, R.F., E. Bratslavsky, C. Finkenauer, and K.D. Vohs. "Bad is Stronger Than Good." Review of General Psychology 5, no. 4 (2001): 323–370.
Bebbington, K., C. MacLeod, T.M. Ellison, and N. Fay. "The Sky is Falling: Evidence of a Negativity Bias in the Social Transmission of Information." Evolution and Human Behavior 38, no. 1 (2017): 92–101.
Berger, J. "Arousal Increases Social Transmission of Information." Psychological Science 22, no. 7 (2011): 891–893.
Berger, J. and K.L. Milkman. "What Makes Online Content Viral?" Journal of Marketing Research 49, no. 2 (2012): 192–205.
Billing, J. and P.W. Sherman. "Antimicrobial Functions of Spices: Why Some Like it Hot." The Quarterly Review of Biology 73, no. 1 (1998): 3–49.
Blaine, T. and P. Boyer. "Origins of Sinister Rumors: A Preference for Threat-Related Material in the Supply and Demand of Information." Evolution and Human Behavior 39, no. 1 (2018): 67–75.
Boyd, R. and P.J. Richerson. "Culture and the Evolution of the Human Social Instincts," In Roots of Human Sociality, edited by S. Levinson and N. Enfield, pp. 453–477. Oxford: Berg, 2006.
Boyer, P. "Evolutionary Psychology and Cultural Transmission." American Behavioral Scientist 43, no. 6 (2000): 987–1,000.
Boyer, P. and N. Parren. "Threat-Related Information Suggests Competence: A Possible Factor in the Spread of Rumors." PloS ONE 10, no. 6 (2015): e0128421.
Brunvand, J. H. The Truth Never Stands in the Way of a Good Story. University of Illinois Press, 2001.
Buss, D. Evolutionary Psychology: The New Science of the Mind. Psychology Press, 2015.

Cai, Q., Y.-Y. Cui, and Z.L. Yang. "Lethal Amanita Species in China." Mycologia 108, no. 5 (2016): 993–1,009.
Conway, L.G. III and M. Schaller. "How Communication Shapes Culture," In Social Communication, edited by K. Fiedler, pp. 107–127. New York: Psychology Press, 2007.
Dall, S.R.X., A.I. Houston, and J.M. McNamara. "The Behavioural Ecology of Personality: Consistent Individual Differences From an Adaptive Perspective." Ecology Letters 7, no. 8 (2004): 734–739.
Dallago, F., A. Mirisola, and M. Roccato. "Predicting Right-Wing Authoritarianism Via Personality and Dangerous World Beliefs: Direct, Indirect, and Interactive Effects." The Journal of Social Psychology 152, no. 1 (2012): 112–127.
de Montellano, B.O. "Empirical Aztec Medicine." Science 188, no. 4185 (1975): 215–220.
Eng, S.J. "Experimental Exposure-Willingness (Eew): Testing a Behavioral Measure of Attraction Vs. Aversion to Disgust Stimuli," diss., University of California, Los Angeles, 2008.
Eriksson, K. and J.C. Coultas. "Corpses, Maggots, Poodles and Rats: Emotional Selection Operating in Three Phases of Cultural Transmission of Urban Legends." Journal of Cognition and Culture 14, no. 1-2 (2014): 1–26.
Eriksson, K., J.C. Coultas, and M. De Barra. "Cross-Cultural Differences in Emotional Selection on Transmission of Information." Journal of Cognition and Culture 16, no. 1-2 (2016): 122–143.
Federico, C.M., C.V. Hunt, and D. Ergun. "Political Expertise, Social Worldviews, and Ideology: Translating "Competitive Jungles" and "Dangerous Worlds" Into Ideological Reality." Social Justice Research 22, no. 2 (2009): 259–279.
Fessler, D.M.T., A.C. Pisor, and C.D. Navarrete. "Negatively-Biased Credulity and the Cultural Evolution of Beliefs." PLoS ONE 9, no. 4 (2014): e95167.
Fessler, D.M.T. "Steps Toward the Evolutionary Psychology of a Culture-Dependent Species," In The Innate Mind: Culture and Cognition Vol. II, edited by P. Carruthers, S. Laurence, and S. Stich, pp. 91–117. New York: Oxford University Press, 2006.
Fessler, D.M.T., A.C. Pisor, and C. Holbrook. "Political Orientation Predicts Credulity Regarding Putative Hazards." Psychological Science 28, no. 5 (2017): 651–660.
Garz, M. "Unemployment Expectations, Excessive Pessimism, and News Coverage." Journal of Economic Psychology 34 (2012): 156–168.
Greenhill, K.M. and B. Oppenheim. "Rumor Has it: The Adoption of Unverified Information in Conflict Zones." International Studies Quarterly 61, no. 3 (2017): 660–676.
Heath, C., C. Bell, and E. Sternberg. "Emotional Selection in Memes: The Case of Urban Legends." Journal of Personality and Social Psychology 81, no. 6 (2001): 1,028–1,041.
Hibbing, J.R., K.B. Smith, and J.R. Alford. "Differences in Negativity Bias Underlie Variations in Political Ideology." Behavioral and Brain Sciences 37, no. 03 (2014): 297–307.
Hilbig, B.E. "Sad, Thus True: Negativity Bias in Judgments of Truth." Journal of Experimental Social Psychology 45, no. 4 (2009): 983–986.
Hilbig, B.E. "Good Things Don't Come Easy (to Mind): Explaining Framing Effects in Judgments of Truth." Experimental Psychology 59, no. 1 (2012): 38–46.
Hilbig, B.E. "How Framing Statistical Statements Affects Subjective Veracity: Validation and Application of a Multinomial Model for Judgments of Truth." Cognition 125, no. 1 (2012): 37–48.
Knoll, B.R., T.J. O'Daniel, and B. Cusato. "Physiological Responses and Political Behavior: Three Reproductions Using a Novel Dataset." Research and Politics 2, no. 4 (2015): 2053168015621328.
Kurzban, R. "Representational Epidemiology: Skepticism and Gullibility," In The Evolution of Mind: Fundamental Questions and Controversies, edited by Steven W. Gangestad and J. A. Simpson, pp. 357–362. New York: The Guilford Press, 2007.
Lilienfeld, S.O. and R.D. Latzman. "Threat Bias, Not Negativity Bias, Underpins Differences in Political Ideology." Behavioral and Brain Sciences 37, no. 03 (2014): 318–319.

Mills, M., F.J. Gonzalez, K. Giuseffi, B. Sievert, K.B. Smith, J.R. Hibbing, and M.D. Dodd. "Political Conservatism Predicts Asymmetries in Emotional Scene Memory." Behavioural and Brain Research 306, no. 1 (2016): 84–90.
Mills, M., K.B. Smith, J.R. Hibbing, and M.D. Dodd. "The Politics of the Face-in-the-crowd." Journal of Experimental Psychology: General 143, no. 3 (2014): 1,199.
Nguyen, V.H. and E. Claus. "Good News, Bad News, Consumer Sentiment and Consumption Behavior." Journal of Economic Psychology 39 (2013): 426–438.
Norenzayan, A. and S. Atran. "Cognitive and Emotional Processes in the Cultural Transmission of Natural and Nonnatural Beliefs," In The Psychological Foundations of Culture, edited by M. Schaller and C.S. Crandall, pp. 149–169. Hillsdale, NJ: Lawrence Erlbaum Associates, 2004.
Perkins, D.D., A. Wandersman, R.C. Rich, and R.B. Taylor. "The Physical Environment of Street Crime: Defensible Space, Territoriality and Incivilities." Journal of Environmental Psychology 13, no. 1 (1993): 29–49.
Peters, K., Y. Kashima, and A. Clark. "Talking About Others: Emotionality and the Dissemination of Social Information." European Journal of Social Psychology 39, no. 2 (2009): 207–222.
Pezzo, M.V. and J.W. Beckstead. "A Multilevel Analysis of Rumor Transmission: Effects of Anxiety and Belief in Two Field Experiments." Basic and Applied Social Psychology 28, no. 1 (2006): 91–100.
Rosnow, R.L. "Psychology of Rumor Reconsidered." Psychological Bulletin 87, no. 3 (1980): 579–591.
Rosnow, R.L., J.L. Esposito, and L. Gibney. "Factors Influencing Rumor Spreading: Replication and Extension." Language and Communication 8, no. 1 (1988): 29–42.
Rozin, P. and E.B. Royzman. "Negativity Bias, Negativity Dominance, and Contagion." Personality and Social Psychology Review 5, no. 4 (2001): 296–320.

Saler, B. "On Credulity," In Religion as a Human Capacity: A Festschrift in Honor of E. Thomas Lawson, edited by Timothy Light and Brian C. Wilson, pp. 315–329. Leiden: E.J. Brill, 2004.
Samore, T.J. "The Effect of Induced Fear on Culturally Transmitted Credulity Assessments," diss., University of California, Los Angeles, 2017.
Samore, T.J., D.M.T. Fessler, and C.C. Holbrook. "Political Orientation, Negatively-Biased Credulity, and Political Supremacy: Conservatives Are More Credulous Even When in Power." (forthcoming).
Siegrist, M., M.E. Cousin, and M. Frei. "Biased Confidence in Risk Assessment Studies." Human and Ecological Risk Assessment 14, no. 6 (2008): 1,226–1,234.
Siegrist, M. and G. Cvetkovich. "Better Negative Than Positive? Evidence of a Bias for Negative Information About Possible Health Dangers." Risk Analysis 21, no. 1 (2001): 199–206.
Slovic, P. "Perceived Risk, Trust, and Democracy." Risk Analysis 13, no. 6 (1993): 675–682.
Soroka, S.N. "Good News and Bad News: Asymmetric Responses to Economic Information." Journal of Politics 68, no. 2 (2006): 372–385.
Sperber, D. "Why a Deep Understanding of Cultural Evolution is Incompatible With Shallow Psychology," In Roots of Human Sociality: Culture, Cognition, and Interaction, edited by N.J. Enfield and Stephen C. Levinson, pp. 431–449. London: Berg, 2006.
Sperber, D. Explaining Culture: A Naturalistic Approach. Cambridge, MA: Blackwell, 1996.
Stubbersfield, J.M., J.J. Tehrani, and E.G. Flynn. "Serial Killers, Spiders and Cybersex: Social and Survival Information Bias in the Transmission of Urban Legends." British Journal of Psychology 106, no. 2 (2015): 288–307.
Tangherlini, T.R. ""It Happened Not Too Far From Here": A Survey of Legend Theory and Characterization." Western Folklore 49, no. 4 (1990): 371–390.
Walker, C.J. and C.A. Beckerle. "The Effect of State Anxiety on Rumor Transmission." Journal of Social Behavior and Personality 2, no. 3 (1987): 353–360.

Walker, C.J. and B. Blaine. "The Virulence of Dread Rumors: A Field Experiment." Language & Communication 11, no. 4 (1991): 291–297.
Wang, X.T., D.J. Kruger, and A. Wilke. "Life History Variables and Risk-Taking Propensity." Evolution and Human Behavior 30, no. 2 (2009): 77–84.
White, M.P., S. Pahl, M. Buehner, and A. Haye. "Trust in Risky Messages: The Role of Prior Attitudes." Risk Analysis 23, no. 4 (2003): 717–726.
Zuckerman, M., and D.M. Kuhlman. "Personality and Risk-taking: Common Biosocial Factors." Journal of Personality 68, no. 6 (2000): 999–1,029.


PART 2

THE EMPLOYMENT OF DRUMS


Chapter 3

FAKE NEWS: THE ALLURE OF THE DIGITAL WEAPON

NICOLAS ARPAGIAN

“We may prefer a Roneo to a machine gun” David Galula, from Counterinsurgency Warfare: Theory and Practice (1963)

Two centuries ago, the strategist Clausewitz taught us that "war is an act of violence to compel our enemy to fulfill our will" (Clausewitz, 1968). Nowadays, compelling an enemy in conflicts and confrontations involves not only the application of violence but also the management of information. In the digital era, ensuring the submission of an enemy also entails imposing one's will on the information battlespace — a battle for public opinion through the broadcasting of information in order to influence specific people,
such as political decision makers, economic managers, opinion leaders, and the general public. The modern capacity to produce, duplicate, exchange, and archive information is without any equivalent in history. Modern society has, to a large degree, become a society nourished by information — a society fed by social networks and constructed with the indexation of search engines. In such a society, information that is not digitized loses its importance through obscurity. Information that cannot exist in a digitized format is destined to lose its value, as it will be difficult to find, or its access will be restricted to an elite. Abundance — or even the overabundance — of information was once regarded as wealth. However, abundance has led to fragility, as information has to compete for an audience like never before. Style and sensationalism, achieved through clickbait techniques such as outrageous titles and attractive layouts as well as text optimized to appear at the top of search engine results, very often beat substance when it comes to information dissemination. In such a battlespace, disinformation can thrive by attaining an audience if it is packaged well.

Recent Information/Disinformation Wars

With the abundance of information and the contest for an audience, the weapons used in an information space battle are messages, photographs, slogans, and short video clips, tailored to meet the expectations and impulses of Internet and mobile users who are connected almost continuously. The U.S. presidential election in November 2016 is illustrative here with regard to how, in the battle for an audience, disinformation can trump information. Facebook recognized that about 146 million of its users were in contact with fake news before and after the election (Erin, 2017). YouTube
acknowledged that 1,108 videos of suspicious origin related to Russia had been broadcast through its service, while Twitter, concerned by the number of fake accounts being used to influence votes, announced in January 2018 that approximately 1.4 million people had received a notification from Twitter to alert them about these fake accounts (Twitter Public Policy, 2018). These people were contacted because they were linked to these accounts, either by retweeting, quoting, replying to, mentioning, or liking those accounts or content created by those accounts. While it is obviously uncertain whether these fraudulent productions alone are responsible for Trump's victory or Clinton's defeat, it is striking to note that, from now on, all democratic elections will be analyzed for evidence of possible disinformation campaigns. During the Lord Mayor's Banquet in London in November 2017, the head of the British government, Theresa May, did not mince her words by pointing out the involvement of the Russians in the Brexit referendum campaign. She maintained, "We know what you're doing" (May, 2017). The Russian embassy then responded with a tweet: "We know what YOU are doing" (MFA Russia, 2017) — invectives quite far removed from treaty diplomacy and international law, as if an intimate conviction that interference occurred could replace physical evidence of the opponent's responsibility. In the digital world, material or technical proof is very rare. The difficulty of finding evidence, or the reluctance to reveal it because doing so may compromise capabilities, affects discussions on the allocation of blame for such cyber-attacks. The next battle over information is slated to be the U.S. Congressional midterm elections in 2018. Besides the manipulation of information, it is the very integrity of these upcoming elections in the U.S. that could be affected, as there are very strong doubts about the integrity of the voting machines used in some states (Osnos, 2017). Here, the danger comes from both the
timely combination of misinformation and hacking. Both actions amplify each other in a reciprocal way to pollute the debates, instill doubt as to the ballot's accuracy, and eventually draw the result desired by those seeking to undermine the process. This is a perspective that could seriously challenge even the most solid democratic system. Besides the employment of information and disinformation to compel an enemy to one's will, in the context of international terrorism, the dissemination of digital information is an essential component of terrorist groups' recruitment strategy, proselytism, claims, and teaching. Supporters of the Islamic State (ISIS) who want to publicize their messages, trumpet their successes, and reach out to new members have used YouTube channels, temporary websites, Facebook pages, and Twitter accounts. These groups have become very adept at 'gaming' the algorithms of social media platforms — an individual with a few keywords or posts of videos/images similar in content to those on extremist sites can discover a trove of other users who share the same interests. It is not necessary to know activists personally: they rapidly appear in the contact suggestions identified by the social networks' affinity logic. Account creation is free, so supporters can create accounts without limit — they often anticipate and create backup accounts in case the previous one is closed by the platform operators. Access to the Internet allows connection between geographically distant people with no other relationship besides a shared interest in extremist ideals.

Defending Against False Information and Fake News

Engaging Social Media Platforms

Besides the battle over information, it must be acknowledged that the platforms that have become key to dissemination — platforms
such as Facebook, Twitter and YouTube — have to be engaged with, in order to rein in disinformation. It has to be noted that there are debates over how responsible such platforms are when it comes to the dissemination of false information. While such platforms claim to be mere data hosting providers, the sophistication of their algorithms, which arrange and correlate information so as to make it available to as many people as possible, makes their role more closely related to that of publishers. This role would broaden the scope of the responsibilities of such platforms when it comes to the highlighting of fraudulent content.

Legislation

States have recently attempted to combat false information. In France, the government has taken steps to bring such platforms (as well as individuals using such platforms) to task for disseminating false information. In 2017, French president Emmanuel Macron's campaign team's email was hacked, and emails were leaked on the Internet a few hours before the opening of polling stations. This incident perhaps explains why President Macron is particularly aware of the stakes involved in this new form of destabilization. In January 2018, Macron announced a forthcoming law that would make it possible to sanction the publication of "false information on the Internet during elections". This system would complement existing regulations on media law and the dissemination of false news. In France, a law of July 29, 1881, updated over time, establishes that "all publication, dissemination or reproduction, by any means whatsoever, of false news, or of documents fabricated, falsified or deceptively attributed to third parties shall, when made in bad faith and when it has disturbed the public peace or is likely to disturb it, be punishable by a fine of €45,000. The same acts will be punished by a fine of
€135,000 when the dissemination or reproduction made in bad faith is liable to undermine the discipline or morale of the armed forces, or to impede the nation's war effort". This formulation from the 19th century shows how ancient these techniques of destabilization by information are. Except for the amount of the fine being converted from French francs to euros, what the law prohibits is still relevant in the first quarter of the 21st century. Germany took a step forward on January 1, 2018, with the enforcement of an ad hoc regulation known as the Netzwerkdurchsetzungsgesetz (NetzDG). It requires social networks to delete all "manifestly illegal" messages within 24 hours of reporting. In cases subject to interpretation, they have seven days to decide. If found in breach, they face fines of up to 50 million euros. The size of the fine reflects how seriously Germany treats such conduct. Traditional media outlets have taken advantage of the recent discrediting of the information society giants to propose an alternative based on analysis. A number of initiatives have been launched to put this expertise to use, with the opening of pages and websites dedicated to "fact-checking", such as Decodex by the daily newspaper Le Monde, or Desintox by Libération. This upsurge in disinformation operations has given the historical media an opportunity to highlight the work of their editorial staff, and to promote their editorial process, which involves cross-checking information prior to publishing. This expertise has not escaped platforms such as Facebook. Since February 2017, journalists from the daily newspaper Le Monde have been tracking fake news on Facebook. According to the French weekly newspaper Le Canard enchaîné (Nobili, 2018), over an eight-month period in 2017, journalists from Le Monde found on Facebook some 2,865 fake news items disseminated on 1,198 pages, of which 147 pages were closed by the site. These are modest figures compared to the 33 million French Facebook users. Deceiving public opinion is an old
weapon serving the interests of power. Sir Winston Churchill claimed truth should "be attended by a bodyguard of lies". Unlike camouflage, which makes what is true appear false, "deception brings belief to what is not true", explains historian Rémi Kauffer (Kauffer, 2017).

Counter-messaging

With regard to the use of social media platforms by terrorist organizations for their own ends, states such as France have created a website (http://www.stop-djihadisme.gouv.fr/) to respond, through videos, phone services, and concrete explanations, to the numerous invitations to join the jihadist fight that are widely distributed on social networks. Admittedly, it is always difficult for a democratic state to use the same means of communication as these combatants, who make extensive use of emotion and exaggeration. Democratic governments face a difficult task when contesting the messages of terrorist organizations with their own: governments must rely on carefully crafted, reasoned explanations and truthful data, while their opponents often borrow the aesthetics of movies and video games for their propaganda, an approach that can appear more attractive to an audience. Beyond attempting to debunk extremist messaging, it is important to recognize the role to be played by the platforms that host content. Attempts to remove litigious content often give rise to many different interpretations of how such content should be understood and dealt with, and these interpretations vary according to national cultures and contexts. Indeed, in this offensive logic, it is public opinion that is being targeted, so as to guide its leaders' decision-making. Moreover, in democratic countries subject to electoral timetables, leaders are big consumers of polls and surveys as they seek to respond to their electorate's demands. These topics, which mix religious, geostrategic,
political, and security matters, are very complex to understand. The general public often lacks the time and the tools to grasp all the issues, so it is easy for activists to use the simple, emotional messages broadcast over social networks to influence opinion. As Roger Trinquier (1961) put it, "The challenge of modern warfare is the conquest of the population. Our first goal will therefore be to ensure its protection by all means. This includes confronting the propaganda that seeks to manipulate and recruit".

Education and Critical Thinking

Democratic countries have enshrined the principle of freedom of expression, which allows people to say or publish whatever they want and then assume responsibility for what they say; there is no prior restraint. This ability to express differing points of view comes with a right, in principle, to criticism and comment. If the arguments used defame someone or mislead the public, legal mechanisms exist to hold the perpetrators of these acts accountable. At this stage of the process, and given the importance attached to freedom of expression, it would be unwise to automate the blocking of publications by algorithms. Their programming would inevitably be bypassed by those wishing to continue making their views known. On a more trivial register, one need only see how quickly Twitter users adopted emojis of vegetables or glasses of milk to represent male sex organs or to praise white supremacy. It is the value that network users place on these images that gives them their meaning, not their original representation. In such cases it becomes difficult to configure automated systems to block these messages. In an intensely digitalized world, it is therefore becoming increasingly necessary to rely on individual intelligence and critical thinking. This is not about infantilizing
technology consumers by hoping to keep offensive images, texts, or videos out of their sight, since the means to access them will always exist. It is through education and pedagogy that the impact of such propaganda or disinformation material will be neutralized. This investment in consumer and citizen knowledge will also pay off in other areas of the digitization of human activity, such as robotization and augmented reality.

Conclusion

Our predecessors never faced such a volume of information on which to base their decisions in matters of public policy, the economy, or military strategy. The capacity to produce, read, or broadcast information worldwide within minutes is without precedent. That capacity is naturally used for offensive purposes. One form rests on hacking and technical cyber-attacks, but we must now be equally vigilant about another, based on the massive use of disinformation to manipulate public opinion across a region or country. Our societies' digital transformation is not compartmentalized; it is global. The changes under way therefore need to be understood through a transversal approach. Technology is progressing and being deployed faster than the strategic analysis of its uses and their consequences for our lives. This gap should not be allowed to widen further, because it would do so at the expense of our fundamental freedoms.

References

Galula, D. (1964). Counterinsurgency Warfare: Theory and Practice. London: Pall Mall Press.
Kauffer, R. (2017). Histoire Mondiale des Services Secrets. Paris: Perrin — Tempus.
Kelly, E. (2017). "Senators threaten new rules if social media firms can't stop Russian manipulation themselves", USA Today, November 1. https://www.usatoday.com/story/news/politics/2017/11/01/senators-say-social-media-companies-must-do-more-stop-russias-election-meddling/820526001/.
May, T. (2017). PM speech to the Lord Mayor's Banquet 2017. Prime Minister's Office, November 13. https://www.gov.uk/government/speeches/pm-speech-to-the-lord-mayors-banquet-2017.
MFA Russia (2017). Tweet from the Ministry of Foreign Affairs of the Russian Federation, November 14. https://twitter.com/mfa_russia/status/930424654858244096.
Nobili, C. (2018). "Entre 'Le Monde' et Facebook, un beau conte de 'fake'", Le Canard enchaîné, January 3.
Osnos, E. (2017). "Why the 2018 Midterms Are So Vulnerable to Hackers", The New Yorker, December 28.
Trinquier, R. (1961). Modern Warfare — A French View of Counterinsurgency. London: Pall Mall Press.
Twitter Public Policy (2018). Update on Twitter's Review of the 2016 U.S. Election, official press release, January 31. https://blog.twitter.com/official/en_us/topics/company/2018/2016-election-update.html.
Von Clausewitz, C. (1968). On War. Harmondsworth: Penguin.

Chapter 4

MAPPING CYBERSPACE: THE EXAMPLE OF RUSSIAN INFORMATIONAL ACTIONS IN FRANCE

KEVIN LIMONIER AND LOUIS PÉTINIAUD
Translation by Maxime Chervaux.

Since the hacking (and the publication) of data from the American Democratic National Committee (DNC) by presumed Russian hackers, Russia has become an object of major interest, not only in terms of cybersecurity, but also in terms of what is now called hybrid warfare. In other words, the American presidential campaign has shed light on the potential of a Russian strategy which has been developing since at least 2012, and which aims at the political
destabilization of another state through the use of digital technology (broadly speaking, be it hacking data, or spreading fake or biased news). Immediately after the U.S., France was in turn confronted by foreign influences during a particularly tense time. Russian media broadcasting in French explicitly sided with some candidates in the French presidential election campaign (François Fillon, Marine le Pen) and the French media largely discussed the influence of fake news coming from Moscow (Limonier, 2017). Despite this, there has been no work trying to quantify this phenomenon yet. True, it remains hard to grasp: is it possible to identify, classify, and trace back to Moscow the innumerable rumors and polemics that the far right made its own during the presidential campaign? Also, bearing that in mind, can we discern a Russian strategy of destabilization, if one exists? We found it necessary to adopt a global methodology, based on the analysis of data (the so-called “Big Data”) extracted from social networks, in order to comprehend the logic and strategies behind the propagation of the contents produced in Moscow. In this chapter, we want to offer a new method to map the relays of Russian informational power. Divided into different segments, that galaxy was particularly active during the 2017 presidential election in France, and it offers a clear illustration of how this network has been put to good use by Russia.

Identifying and Mapping Pro-Russian Networks on the French-Speaking Segment of Twitter

During the campaign, and on top of the Kremlin's public support for Mrs. le Pen, Russia seems to have been involved mostly through a series of informational actions organized through social networks. At first glance, these actions tried to influence the public debate in
France, through the production and spreading of content, mainly produced by the public holding company Rossija Segodnja (with its well-known platform Sputnik News) and by the autonomous non-profit organization TV Novosti (which administers Russia Today) (Audinet, 2017). These contents, however, are elements of a propagation strategy that rests largely on social networks. To understand the mechanisms behind this propagation, we decided to conduct a series of quantitative and qualitative measurements on the social network Twitter.¹ First, we identified more than 1,000 French-language accounts that appeared to be systematic "relays" of Russian media content, either actively (replicating content taken directly from Russian platforms) or passively (propagating a discourse using arguments produced by Russian platforms). For each of these identified accounts, we analyzed all the data offered by Twitter (number of followers, accounts followed, tweets, favorites, profile descriptions, date of creation, location if provided, etc.) and the links that show interactions between them (follows, retweets, mentions, and replies). The algorithmic mapping that we obtained² from these data allows us to understand certain things about the architecture and composition of this "pro-Russian Twittersphere".

¹ The decision to opt for Twitter can be explained by the information that is accessible through Twitter's API (Application Program Interface), an interface provided by Twitter that allows another application (in our case, NodeXL) to communicate with it and to harvest data from it.
² The algorithm used here, Force Atlas, positions the hubs of the network according to the number of links that unite them; in other words, the closer two hubs are to each other, the more relations they share. A detailed methodological description of the algorithm, which is often used in the social sciences, is available in Mathieu Bastian, Sebastien Heymann and Mathieu Jacomy, "Gephi: an open source software for exploring and manipulating networks", International Conference on Web and Social Media, 2009, aaai.org.
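
To make the workflow just described more concrete, the following is a minimal sketch, in Python, of the graph-building step, using the networkx library rather than the NodeXL and Gephi toolchain used by the authors. The input file, its column names, and the account handles are hypothetical placeholders, and networkx's spring layout is used only as a generic force-directed stand-in for the Force Atlas algorithm.

# Minimal sketch: build an interaction graph from already-harvested Twitter data
# and compute a force-directed layout. Assumes interactions were collected
# (e.g., through the Twitter API) into a hypothetical CSV with columns
# source, target, kind (kind = retweet / mention / reply / follow).
import csv
from collections import Counter

import networkx as nx


def load_interaction_graph(path: str) -> nx.DiGraph:
    """Read hypothetical interaction records into a weighted directed graph."""
    graph = nx.DiGraph()
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            src, dst = row["source"], row["target"]
            if graph.has_edge(src, dst):
                graph[src][dst]["weight"] += 1
            else:
                graph.add_edge(src, dst, weight=1, kind=row["kind"])
    return graph


def main() -> None:
    graph = load_interaction_graph("pro_russian_interactions.csv")  # hypothetical file

    # The "hubs" of the network are simply the most connected accounts.
    hubs = Counter(dict(graph.degree())).most_common(10)
    print("Top hubs:", hubs)

    # Force-directed layout: accounts that share many links end up close together
    # (spring_layout is a generic stand-in for the Force Atlas algorithm in Gephi).
    positions = nx.spring_layout(graph, weight="weight", seed=42)
    print("Example coordinates:", list(positions.items())[:3])


if __name__ == "__main__":
    main()

The layout coordinates can then be passed to any plotting tool to reproduce the kind of map shown in the figures that follow.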


The first graph (Figure 5.1) shows that the 1,030 accounts identified form a network. The accounts named on the graph are the major hubs of the network, as they gather around them a significant number of "small" accounts (represented here by smaller circles). The colors (blue, green, and red) indicate the probable political leaning of those accounts.³ Dark blue stands for the far-right, green indicates proximity to the traditional right, and red to the left. It is important to bear in mind that the accounts represented here have not necessarily been created to propagate Russian informational content, nor have they necessarily been conscious actors in the phenomenon. Indeed, the identification of the accounts relied, among other things, on the ratio of links taken from pro-Russian platforms over a given period of time. In other words, some actors in our study use pro-Russian content not because they have any particular link to the country, but because this content offers information favorable to their own agenda or particular political interests. The accounts' positions on the graph have been automatically generated by the tool Gephi according to the number of links that exist between all the hubs in the network. Thus, thanks to the algorithmic mapping, we can detect three distinct groups.

³ To achieve this, we studied the behavior of the "big" accounts and gave each of them one of the three political colors we selected. Then, using the tool Gephi, color variations appeared automatically according to the number of links uniting the accounts of the graph. For example, the more an account "tends" towards blue, the greater the chance that it is linked to the far-right community. The "big" accounts in grey are those not using French as their primary language, but which were still worth keeping on the graph, especially for the case study which follows.
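
The coloring procedure described in the footnote can be approximated in a few lines: seed the "big" accounts with a political color and let every other account take on a blend of its neighbors' colors, weighted by the number of links between them. The sketch below is an illustrative approximation rather than the exact computation performed by Gephi, and the seed handles and color vectors are invented.

# Sketch: propagate political "colors" from seed accounts to the rest of the
# network, weighting by link counts. Seed handles and colors are illustrative.
import networkx as nx

# RGB-style color vectors: far-right (blue), traditional right (green), left (red).
SEEDS = {
    "big_far_right_account": (0.0, 0.0, 1.0),   # hypothetical handles
    "big_trad_right_account": (0.0, 1.0, 0.0),
    "big_left_account": (1.0, 0.0, 0.0),
}


def propagate_colors(graph: nx.Graph, seeds: dict, rounds: int = 10) -> dict:
    """Iteratively blend each node's color from its neighbors' colors."""
    colors = {node: seeds.get(node, (0.0, 0.0, 0.0)) for node in graph}
    for _ in range(rounds):
        updated = {}
        for node in graph:
            if node in seeds:                      # seed colors stay fixed
                updated[node] = seeds[node]
                continue
            total, mix = 0.0, [0.0, 0.0, 0.0]
            for neighbor in graph.neighbors(node):
                weight = graph[node][neighbor].get("weight", 1)
                total += weight
                for i in range(3):
                    mix[i] += weight * colors[neighbor][i]
            updated[node] = tuple(c / total for c in mix) if total else colors[node]
        colors = updated
    return colors

Under this reading, the more an account's color tends towards blue, the more of its links point towards the far-right seed account.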


Figure 5.1:    Map of the “Pro-Russian Twittersphere”

The first community, the most distinct of all, is depicted at the bottom-right of the graph. It represents the Identitarian far-right, mainly organized around three accounts, all linked to the National Front (FN), each with thousands, or tens of thousands, of followers. The second identifiable community is the one depicted at the center-right of the graph. Also close to the far-right, it is nevertheless distinguished from the first community by its porosity, as it is linked to accounts that do not all share the same political ideas. The profiles of its users are usually closer to that of the FN as an institution, and are a priori more educated; some are in fact senior officials of the
party. Finally, the third community is more politically diverse, and mainly revolves around the account of a Belarusian engineer. Based in Gomel, he relays, on a daily basis, information produced by pro-Russian platforms on various topics linked to Russian foreign policy. He is followed by several academics, bloggers, and political personalities of various political leanings, here identified as hubs in the network.

Figure 5.2:    The Political Communities in the Pro-Russian Twittersphere

Hence, contrary to a number of preconceived ideas, the "Russosphere" is not homogeneous, either in the profiles of the individuals who compose it or in their political leanings. On the contrary, it is a very diverse galaxy which could largely exist without any kind of action from Russia. We can see, however, that the "central" accounts, whether politicians or Russian media, are important in creating connections and coherence within the galaxy. In the case of the French presidential election, we will study the propagation of a piece of information with a high potential for destabilization, in order to shed light on the mobilization of the Russosphere to spread content produced by, or emerging from, third-party networks.

An Analysis of the Propagation of Fake or Biased Information: The Example of the Days Between the Two Rounds of the 2017 French Presidential Election

On the night of May 3, 2017, a few dozen minutes before the end of the presidential debate between the two candidates qualified for the second round, Mrs. le Pen suggested that Emmanuel Macron held a hidden bank account in a tax haven. This allegation was followed by the circulation on social networks of a document presented as proof of the existence of such an account (it was soon shown to be fake). Two days later, a couple of hours before the end of the official campaign, more than 150,000 documents appeared on the forum 4chan. Dubbed the "MacronLeaks", these data, stolen from Mr. Macron's team after one or more phishing campaigns, spread quickly and massively on social networks (Le Monde, 2017). In both cases, these contents showed a level of virality which cannot be explained solely by their nature. The following analysis allows us to understand when and how the Russosphere on
Twitter intervenes in such operations initiated by third parties (in our case, American pro-Trump networks). We thus conducted several quantitative analyses using the arborescence method already discussed, that is, the precise extraction through the API of a given social network (here, Twitter) of a set of entities (accounts) and of their interactions (tweets, retweets, mentions, and replies) in order to visualize them. Additionally, we cross-checked the database we obtained against the continuously updated database which lists the accounts that can be regarded as active relays of information produced by pro-Russian platforms (see above). For the offshore account rumor, we followed the URL of the American pro-Trump website Disobedient Media, which first made it public. Our sample lists all the mentions of the URL between May 3 at 6:37pm UT (8:37pm in France, when the first occurrence appeared on social networks) and May 4 at 11:01am UT (1:01pm Paris time), a total of 426 interactions from 249 unique accounts. If we cross-reference these data with the accounts already identified as "pro-Russian", it appears that only 20 of them were clearly identified as being part of the "Russosphere". Yet, despite this small number, we realized that the sphere itself was closely related to the arborescence of propagation of the URL.

Figure 5.3:   Algorithmic Map of Pro-Russian Accounts in the Propagation of the Link

In Figure 5.3, the orange links represent the tweets, mentions, and retweets containing the Disobedient Media URL, while the grey links represent the relations previously established in our database between accounts identified as pro-Russian. We can see that there is a strong connection between the two groups. First, the propagation seems to "take root" in the cluster of grey (pro-Russian) connections before moving away from it. Second, the number of accounts relaying the Disobedient Media URL without any link to the cluster is very limited: a couple of dozen at most, depicted at the bottom-right of the graph. Hence, the traditional relays of Russian
informational activities are definitely present in our case. While it remains impossible to prove that there was a coordinated action between the pro-Trump actors behind the rumor and the pro-Russian relays that we monitor daily, there is without a doubt collusion. For the MacronLeaks, media coverage was much more extensive. Our analysis of the propagation relies on a database collected between May 5 at 00:59am UT and May 7 at 11:13am UT, which lists the approximately 80,000 interactions concerning the keyword "MacronLeaks". In sum, the 23,036 accounts involved in these interactions are indexed in our database. The latter is characterized by a strong bilingualism: 32% of the interactions are in English and 61% in
French. This can be explained by the fact that the leak first appeared on American networks, and by the strong ability of pro-Trump networks to mobilize quickly on social media. As in the previous case, the accounts relaying, diffusing, and propagating the information are not necessarily linked to Russia or to "pro-Russian" accounts. Those links do appear, however, when we cross-check the database built from the propagation of the keyword "MacronLeaks" with the database of "pro-Russian" accounts that we monitor daily. When mapped, we see how far this inter-penetration stretches. The graph in Figure 5.4 shows all the accounts linked to the "MacronLeaks" campaign. We can see that it is structured around the WikiLeaks account (mostly due to mentions of that account by third-party users), but more importantly, that it is organized into three very distinct groups. First, at the bottom-right, are the accounts in English, mostly from pro-Trump networks, which used the keyword "MacronLeaks". Then, at the top-right, are the French-language accounts hostile to the candidacy of Emmanuel Macron, which gave credit to the hacked documents. Finally, at the top-left, we can see the cluster of accounts hostile to the far-right, often supporters of En Marche or journalists.⁴ In red, we represented the accounts which are listed in our database as entities believed to be informational relays of Russian platforms. We can see first, without much surprise, that the accounts from the "Russosphere" are mostly concentrated at the top-right of the graph, that is, among the French-speaking far-right. More interesting, however, is the hyperactivity of the accounts identified as pro-Russian.

⁴ Note the positions of the accounts of Marine le Pen and Emmanuel Macron, each set in the cluster hostile to them (Macron to the right, le Pen to the left). This can be explained by the fact that both accounts were massively quoted, especially by their respective adversaries.
Figure 5.4:  Relations Between the Arborescence "MacronLeaks" and the "Pro-Russian" Twitter Accounts

Out of the 23,036 unique accounts listed in our "MacronLeaks" database, around 350 had already been identified as pro-Russian informational relays. In other words, only 1.5% of all the accounts listed in the "MacronLeaks" database can be considered potential active supporters of Russian informational activities. Yet this small number does not account for the prominence of the red color in our graph, which represents the pro-Russian accounts and the interactions emanating from them. Simply put, while these accounts are not numerous, they made "a lot of noise", and
took an important part in the virality of the keyword. One way to confirm this hyperactivity objectively is to calculate the average number of tweets these accounts produce in an hour. The group of 350 singled-out accounts produced an average of 2.4 tweets per hour, whereas the average for the 23,036 accounts in the "MacronLeaks" database as a whole was 1.12 tweets per hour. In other words, the pro-Russian accounts proved to be approximately twice as active as the others, with some of them producing up to 28 tweets per hour, which leaves no doubt about their total or partial automation. The actions attributed to Russia in cyberspace are fundamentally protean, and often unpredictable. However, analyzing informational strategies in the context of a tense and uncertain political period in France allows us to take note of two fundamental elements. First, we can identify only the visible part of the strategies of the Russian state apparatus, that is, the production of content and the construction of certain narratives through national press agencies. Second, Russian actions in cyberspace seem to take advantage of all the available vectors: beyond Twitter, platforms such as Facebook and VKontakte are also harnessed. Finally, the efficiency of these strategies very often depends on cybernetic actions by seemingly third-party actors.
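
The two measurements used here, cross-checking accounts against the list of known pro-Russian relays and comparing their posting rates, can be reproduced with a short script. The sketch below is illustrative only: it assumes the tweets are available as (account, timestamp) pairs and the known relays as a plain set of account names, which are placeholders rather than the authors' actual data structures.

# Sketch: compare the average tweets-per-hour of accounts already flagged as
# pro-Russian relays against all other accounts in a keyword dataset such as
# "MacronLeaks". Input formats are assumptions: tweets as (account, datetime)
# pairs, known relays as a set of account names.
from collections import defaultdict


def tweets_per_hour(tweets):
    """Average posting rate per account over the whole collection window."""
    per_account = defaultdict(int)
    timestamps = []
    for account, when in tweets:
        per_account[account] += 1
        timestamps.append(when)
    hours = max((max(timestamps) - min(timestamps)).total_seconds() / 3600.0, 1.0)
    return {account: count / hours for account, count in per_account.items()}


def compare_groups(tweets, pro_russian_accounts):
    """Return (average rate of flagged accounts, average rate of the others)."""
    def mean(values):
        return sum(values) / len(values) if values else 0.0

    rates = tweets_per_hour(list(tweets))
    flagged = [rate for account, rate in rates.items() if account in pro_russian_accounts]
    others = [rate for account, rate in rates.items() if account not in pro_russian_accounts]
    return mean(flagged), mean(others)

# Hypothetical usage (in the chapter's data this gave roughly 2.4 vs 1.12 tweets per hour):
# flagged_rate, other_rate = compare_groups(macronleaks_tweets, pro_russian_set)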

References

Audinet, M. (2017). "La voix de Moscou trouble le concert de l'information international", Le Monde Diplomatique, April 2017.
Le Monde (2017). "En Marche! dénonce un piratage « massif et coordonné » de la campagne de Macron", Le Monde, May 6, 2017. https://abonnes.lemonde.fr/election-presidentielle-2017/article/2017/05/06/l-equipe-d-en-marche-fait-etat-d-une-action-de-piratage-massive-et-coordonnee_5123310_4854003.html
Limonier, K. and Gérard, C. (2017). "Guerre hybride russe dans le cyberespace", Hérodote, 166-167(3), 145–163. doi:10.3917/her.166.0145.

Chapter 5

COMPUTATIONAL PROPAGANDA IN EUROPE, THE U.S., AND CHINA

GILLIAN BOLSOVER AND PHILIP HOWARD

The authors gratefully acknowledge the support of the European Research Council, "Computational Propaganda: Investigating the Impact of Algorithms and Bots on Political Discourse in Europe," Proposal 648311, 2015–2020, Philip N. Howard, Principal Investigator. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the European Research Council.

The Rise of Computational Propaganda

Propaganda has a long and tumultuous history. Modern usage of the word originates in the 1600s with the founding of the Sacra Congregatio de Propaganda Fide (The Sacred Congregation for the Propagation of the Faith), the arm of the Catholic Church tasked
with spreading Christianity. Evolving from these religious origins, traditional forms of state-produced propaganda reached their zenith in the early 1900s. Between the First World War and the end of the Cold War, all of the major world powers used propaganda to solidify domestic allegiances and vilify enemies. However, in the public imagination, the idea of propaganda is most associated with Nazi Germany and the Communist propaganda states of Soviet Russia and Maoist China. Academically, propaganda is understood as the technique of influencing human action by the manipulation of representations (Lasswell, 1995; Pratkanis and Aronson, 1992). It works by appealing to and manipulating emotions, bypassing rational thought to achieve the predetermined ends of its creators (Institute for Propaganda Analysis, 1995). Although this distinction is difficult to determine in practice, propaganda is often defined in opposition to persuasion, with propaganda focused only on the needs and aims of the communicator, and persuasion attempting to satisfy the needs of both the persuader and the audience (Garth and O’Donnell, 1999). Thus, while the word remains associated with propaganda states such as Nazi Germany and Maoist China, this definition highlights that propaganda can be found everywhere in modern society, from advertisements to political campaigns. The movement of politics onto the Internet has led to the rise of a new form of propaganda. This computational propaganda is propaganda created or disseminated using computational means. Where once propaganda was the province of states and other large institutions, the ease of content creation and dissemination in online spaces has led to the creation of propaganda by a wide variety of individuals and groups. The anonymity of the Internet has obscured the producers of information (and thus made the identification of the intent of propaganda producers more opaque). This has allowed

state-produced propaganda to be presented as if it were not produced by state actors. It has also facilitated international propaganda campaigns and interference in domestic politics by foreign states. The affordances of online spaces have also profoundly changed the mechanisms through which propaganda achieves its aims. Social media sites facilitate the rapid dissemination of this emotive content through social network connections, lending credence to these spurious messages through user endorsement. The online echo chambers created through homophily provide an environment that nurtures one-sided information and hinders access to contradictory perspectives. Exposure to information online is often governed by measures of popularity that can be manipulated by those seeking to spread propaganda messages. Additionally, the prevalence of online advertising and targeted campaigns based on big data analytics mean that propaganda messages can be more accurately tailored to the existing biases of users, hence, vastly increasing the persuasiveness of these messages. Although computational propaganda was found as early as January 2010 in the U.S. state of Massachusetts’ Senate Election (Mustafaraj and Metaxas, 2010), the issue exploded into the public consciousness in 2016. Evidence continues to emerge of misinformation and the use of bots to push online political content into public awareness in the 2016 U.S. Presidential Election. Investigations are underway into potential interference in the election process by interests associated with the Russian state. However, the unique combination of social and technical phenomenon in computational propaganda, combined with the interests of its creators to hide its provenance, means that data, analysis, and explanations of this phenomenon have been slow to emerge (Bolsover and Howard, 2017). This chapter will provide an overview of some of the most recent research in the field of computational propaganda across the globe.

The influence of social media on political processes has previously been shown to be very different in authoritarian and democratic contexts (Bolsover, 2017a) and, thus, this chapter will examine the use of propaganda in and by both democratic and authoritarian states. It will show that there is a great deal of variation in computational propaganda between different countries in which the same or similar Internet platforms are used. What is different between these countries is the context: the political system, the media system, the strength and nature of civil society, and the extent to which commercial Internet platforms have power over what it means to be a citizen in a technologically enabled society. Based on this overview of the current literature, this chapter concludes that, although computational propaganda is both a social and technical phenomenon, the importance of this phenomenon over modern political processes is less about technology and more about power over that technology, which derives from underlying configurations of power in society. Thus, quick-fix, technological, and policy solutions will not address the problem of computational propaganda. This must be done by addressing the underlying social conditions that increase the susceptibility of individuals to emotion-laden, manipulative messages.

Computational Propaganda in the U.S. and Europe

One of the key technologies used to distribute computational propaganda is the 'bot'. Bots are pieces of code designed to mimic human activity online, posting, forwarding, liking, or commenting according to their program design. They are often used to create an artificial grassroots community online, producing the illusion that numerous individuals support a particular perspective, brand, or candidate.
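
Studies of this kind typically flag automation through simple activity heuristics applied to collected tweets rather than by inspecting the code behind an account. The sketch below illustrates one such heuristic in Python; the 50-posts-per-day threshold and the input format are illustrative assumptions, not necessarily the exact criteria used in the studies discussed here.

# Sketch of a frequency-based automation heuristic: accounts posting more than a
# threshold number of times per day within a hashtag collection are flagged as
# showing evidence of automation. The threshold is an illustrative assumption.
from collections import Counter


def flag_high_frequency_accounts(tweets, threshold_per_day: int = 50):
    """tweets: iterable of (account, datetime) pairs from a hashtag collection."""
    daily_counts = Counter((account, when.date()) for account, when in tweets)
    return {account for (account, _day), count in daily_counts.items()
            if count > threshold_per_day}

# Hypothetical usage: estimate what share of hashtag traffic the flagged accounts produced.
# flagged = flag_high_frequency_accounts(election_tweets)
# share = sum(account in flagged for account, _ in election_tweets) / len(election_tweets)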

Bots appear to have become extremely common on social media sites, particularly in the U.S., and are used to exploit the algorithms that govern how these sites process and display information. These bots can be used to disseminate particular messages to, for instance, artificially push a particular message into trending topics by retweeting or posting using particular hashtags or keywords. Bots can also be used to dominate the information that is returned for a particular search query (through frequent posting) or to silence particular discussions (through automatically reposting or replying to  particular messages or users). They can also be used to send messages publicly or privately to selected users, such as those who are posting with a particular keyword or are located in a particular area. Bot activity was found to be extremely prominent on Twitter in the 2016 U.S. Presidential Election. Collecting 19.4 million tweets that were made in the week leading up to the vote, the Computational Propaganda team at the Oxford Internet Institute found that between 20% and 25% of tweets during waking-hours that used election related hashtags were generated by accounts that showed evidence of automation (Kollanyi et al., 2016). However, this bot activity was not spread equally between the different candidates; the amount of automation in pro-Trump hashtags outnumbered the amount of automation in pro-Clinton hashtags by five to one (Ibid, 2016). The Computational Propaganda team also looked at the content of information being shared by Twitter users in the run-up to the U.S. election. This was one of the first academic attempts to speak to this debate about what is being called “fake news”: the prevalence of online misinformation and the potential influence that it might have had on the election outcome. Based on this same dataset of 19.4 million tweets, the team chose to focus on the key battleground state of Michigan, which was won by Trump by a margin of only 10,704

votes, hypothesizing that computational propaganda may have been more intense in battleground states. The team conducted a content analysis of the approximately 26,000 links shared in Michigan in the week leading up to the election (Howard et al., 2017). It was found that what the team called "junk news" (misinformation, and hyperpartisan and extreme content) was shared just as much as news produced by professional outlets (including online start-ups and citizen journalists) (see Figure 6.1). The decision to focus on the swing state of Michigan was backed up by later research by the team that applied the same coding scheme to a set of 1.2 million geolocated tweets from across the U.S.

Figure 6.1:    Sources of Information Shared in Political Discussions on Twitter in the Run-Up to the U.S. Presidential Election Vote in Michigan (n = 25,339)
Source: Howard, P.N., Bolsover, G., Kollanyi, B., Bradshaw, S., and Neudert, L.-M. (2017). "Junk News and Bots during the U.S. Election: What Were Michigan Voters Sharing Over Twitter?" COMPROP Data Memo 2017.1.

The researchers found that Twitter users in swing states shared junk news at a rate greater than the national average, even after weighting for the number of users in each state (Howard et al., 2017). The team also repeated this methodology for major political events in Europe in 2017 (see Table 6.1). Based on this research, junk news appears to be much less prominent in the U.K., Germany, and France. In the German Federal Presidency Election in February 2017 and the German Parliamentary Election in September 2017, misinformation made up less than 10% of links shared in the lead-up to the votes, and professional news sources outnumbered junk news sources by about 4:1 (Neudert et al., 2017a; 2017b). Similar proportions of junk and professional news were found in the U.K. General Election in June (Kaminska et al., 2017). Misinformation was even less prominent in the 2017 French Presidential Election, making up 6% of links in the first round and 4% in the runoff. In the French Presidential runoff, the ratio of professional to junk news was 11:1, compared with the ratio of 1:1 found in the 2016 U.S. Presidential Election (Bradshaw et al., 2017; Desigaud et al., 2017).

Table 6.1:  A Comparison of the Prevalence of Junk News in Recent Political Events in the U.S. and Europe

Country   Event                                      Junk news links shared on Twitter   Professional news : junk news
U.S.      2016 U.S. Presidential Election            16%                                 1:1
U.K.      2017 U.K. 'Snap' General Election          10%                                 4:1
Germany   2017 German Federal Presidency Election    10%                                 4:1
Germany   2017 German Parliamentary Election         9%                                  4:1
France    2017 French Election Round One             6%                                  7:1
France    2017 French Election Runoff                4%                                  11:1

Source: Howard et al., 2017; Kaminska et al., 2017; Neudert et al., 2017a; 2017b; Bradshaw et al., 2017; Desigaud et al., 2017.

There is, thus, a big difference in how prominent misinformation has been in recent political events on exactly the same social media platform (Twitter) and using exactly the same methodology to analyze the data. This demonstrates that context is important: misinformation was highest in U.S. swing states, and higher in the U.S. than in the U.K., Germany, or France. In all of these countries, however, computational propaganda is seen as a threat to the democratic political process, which relies on accurate information to support informed citizen choices. These same technologies that support computational propaganda are also available to authoritarian states, which can utilize them to support their political agendas both domestically and internationally. A great deal of attention has been paid to the influence of interests associated with the Russian state in recent political events in the U.S. and Europe (e.g., The Japan Times, 2017; LoBianco, 2017). The abovementioned research, however, found that Russian-produced content was shared relatively infrequently on Twitter, making up between 1% and 3% of content in the German and French elections, and less than 1% of content in the U.S. and U.K. elections. Comparing the content shared in the run-up to the 2016 Brexit referendum in the U.K. against public lists of Russian troll and bot accounts, researchers found that these accounts contributed little to the overall conversation (Narayanan et al., 2017). While the majority of the attention in relation to computational propaganda in authoritarian countries has been paid to Russia,

much less attention has been devoted to China, despite the fact that it has a much longer history of advancing state objectives through the control of Internet discourse. It is, thus, to China that the next section turns.

Chinese Computational Propaganda

Automation, Computation, and Propaganda in China

It is important to remember that the idea of propaganda and its place in society is very different in the Chinese context. Thus, in order to understand computational propaganda in China, it is useful to distinguish between the ideas of automation, computation, and propaganda, and how these apply in the Chinese context. In terms of automation, most of the computational propaganda discussed thus far has been automation: social media 'bots' that have had a large effect in recent political events in the U.S. and Europe. However, there is little evidence of domestic automation in China. A team at Harvard looked specifically for evidence of automation as part of the state propaganda strategy, drawing from a leak from an Internet Propaganda Office in Jiangxi Province; however, they found no evidence of automation as part of the propaganda effort (King et al., 2016). Similarly, no evidence was found for automation in comments on political information shared on Weibo during the 2017 Spring Festival period (a time of particular sensitivity when more control of online discourse would be expected) (Bolsover, 2017b). There is, of course, a great deal of automation used in China in relation to censorship: content blocking, website blocking, deletion, and automatic prevention of posts containing certain words (Fu et al., 2013; King et al., 2012; Ng, 2015; Wright, 2014). This automation is backed up by human activity to decide and refine what is automatically blocked or deleted. This kind of automation
in China, although widespread, is, in some sense, a much less insidious form of information control than propaganda; if you delete or prevent something from being seen and people see an absence of that information, then they know that something is being held from them (Bolsover, 2017b). Propaganda is, thus, a much more sophisticated form of information control, especially in China where positive propaganda floods out potentially undesirable content (King et al., 2016). Computation, the second key term, encompasses automation. The Chinese state has embraced the use of computation as part of its propaganda strategy. A 2015 article in the Party political theory journal Red Flag Manuscripts illustrates the state’s approach to automation: “State governance is like a large computer… This requires us to use smart Internet technologies and modern information systems, which according to uniform objectives and concrete norms, operate in coordination, deal with problems, and dissolve contradictions, providing basic platform guarantees for innovating governance structures and mechanisms” (Song, 2015). This quote encapsulates how the Chinese state has embraced computation as part of its overall political strategy. In relation to the third key term, propaganda, it is important to remember that this word has very different connotations in China. Propaganda did not start to be understood in the West as negative until after World War Two, when it came to be associated with Nazi Germany. Until the 1950s, a large variety of Western organizations openly referred to the persuasive, emotional information they were producing as “propaganda”; similarly, in China, the idea of propaganda is not necessarily a negative one. This means that researchers of computational propaganda must be cognizant of its different place as part of society and politics in China.

Computational Propaganda in and about China

Research into computational propaganda in China has found no evidence of the kind of automation on domestic social media that has been found in the U.S. and Europe. This means that new understandings of what computational propaganda is are necessary to approach this issue in mainland China. However, a major propaganda battle is being fought over the control of discourse about China in the Chinese language on Twitter. Tracking 27 hashtags associated with China and Chinese politics over a six-week period, it was found that almost 30% of the 1.1 million tweets in the dataset were from users posting more than 100 statuses per day. All of these top 100 accounts used automation (third-party software packages such as "If This Then That" or "twittbot.net", or custom scripts) to disseminate their messages (Bolsover, 2017b). Almost half of these 100 accounts belonged to one of two coordinated groups, both of which were posting anti-Chinese-state content in simplified Mandarin (see Table 6.2). The more active of these, the 22 accounts in the 1989 group, posted almost 10% of the 1.1 million tweets in the sample. These bots flooded hashtags such as China, Hong Kong, and human rights with posts in simplified Chinese intended to keep alive the memory of the 1989 democracy movement that ended with the Tiananmen Square incident. The accounts in this group also frequently posted quotes from, and links to, the Universal Declaration of Human Rights in simplified Mandarin. The second group, the Pan-Asia group, was more complex and also had 22 accounts in the top 100. However, they posted less frequently, making up 4% of tweets in the dataset. Most of the accounts were designed to look as if they were Chinese news

Table 6.2:  Top 100 Highest-Posting Accounts Within China-Related Hashtags

                                                    Number of   Number of   Percentage of posts in        Average
                                                    accounts    posts       dataset (out of 1.1 million)  BotOrNot score
Anti-Chinese-state bots
  1989 group                                        22          117,578     9.98%                         60
  Pan-Asia group                                    22          44,678      3.79%                         48
  Independent anti-Chinese-state bots               5           7,969       0.68%                         65
  Both anti-Chinese-state and commercial content    1           1,090       0.09%                         50
Other political bots
  Professional news bots                            10          39,239      3.33%                         48
  "Fake news" bots                                  4           10,213      0.87%                         71
Other non-political bots
  Commercial bots                                   8           34,860      2.96%                         58
  Job bots                                          6           8,592       0.73%                         55
  Other bot (non-political)                         4           6,620       0.56%                         39
Account suspended                                   18          64,170      5.45%
Total                                               100         335,009     28.44%

Source: Bolsover, Gillian. 2017. "Computational Propaganda in China: An Alternative Model of a Widespread Practice." COMPROP Working Paper 2017.4.

organizations or universities, based on their display names and their profile and header photos. This group existed to disseminate information but was not necessarily automated. However, all of the accounts posted and retweeted each other's content in simplified Mandarin. This group aimed to keep alive the memory of, to call for reparations over, and to allege corruption in, the case of a Ponzi scheme in
Yunnan province in Southern China, in which about 200,000 people lost money several years ago. This finding is very surprising in that previous evidence of computational propaganda on Twitter was of pro-Chinese state propaganda about Tibet disseminated in the English language (Free Tibet, 2014). There are some who might say that this anti-Chinese state computational propaganda on Twitter is a positive counterpart to Chinese state propaganda. However, this computational propaganda is likely aimed at diasporic Chinese (such as Chinese students studying abroad) and those who ‘jump the wall’ from mainland China to access blocked platforms. These individuals come to Twitter, making efforts to get out of the constrained Chinese Internet sphere, but what they find is not the diverse, public sphere that Twitter markets itself as but a place that, at least to do with China and Chinese politics, is also dominated by propaganda messages; it is just propaganda from the other side.

How Does Technology and Society Enable Computational Propaganda?

The research studies introduced here have shown that computational propaganda is becoming a major phenomenon on Twitter in relation to a variety of political events and topics. Research providing data about the extent of online computational propaganda has become a hot topic, but what is much more important than more data is putting this phenomenon in context. Misinformation and opinion manipulation are not new. It is important to maintain a sense of historical perspective and humility about current affairs. When Donald Trump is calling critical articles in the New York Times, Washington Post, and CNN "fake news" (Jamieson, 2017) and Xinhua is calling reports of torture of political
prisoners “fake news” (Phillips, 2017), then it is clear that these terms are being used as a political tool to justify the actions of power holders in controlling information that they dislike. Similarly, some of those who are talking about the problem of fake accounts on social media are doing so to justify actions to require real name registration and other forms of profitable control. In many cases, these issues are framed and can be thought of as a balance between protection and freedom. If there is a manipulation of the information about the risk of computational propaganda that overstates the need for protection, then there is likely to be a movement too far away from freedom. Thus, having accurate information about the level of risk is necessary. There is currently a great lack of information about the level of risk, and often a significant overreaction about this risk and the need to protect from it. Much of the reaction to computational propaganda has focused on technical and policy solutions.  The German state has introduced legislation to control computational propaganda despite the low influence of misinformation in their domestic political process (Neudert, 2017). Similarly, social media sites such as Facebook have announced their own plans to address computational propaganda on their sites (Solon and Wong, 2016). However, these responses are reactive; in thinking about computational propaganda, it is important to address this issue on a societal level, rather than simply responding to incidences of the phenomenon. Firstly, and most importantly, propaganda only works when it appeals to an individual’s existing beliefs. Whether you believe a piece of misinformation depends on whether your existing ideas mean that that information seems like it might be true or if you would like it to be true to support what you believe. Thus, it is not misinformation that is the problem; it is people’s tendency to believe this misinformation, and their desire to and ability to verify the veracity of information they receive.

This means that at the basic level, computational propaganda should not be fought with regulation of technology companies or deletion of misinformation, but by building critical thinking skills, providing access to truthful and impartial information, and challenging the tendency of individuals to seek information that confirms pre-existing beliefs. Commercial social media platforms have exacerbated individuals’ tendencies to expose themselves to people and information that matches existing preferences. There was a great deal of hope that the Internet would help people access more diverse information, but the opposite has happened. As there is so much information online, people are drawn to providers that they already know and trust. Thus, the information people consume online is less diverse than the information consumed through traditional media. People do not like to be exposed to information that contradicts their beliefs. Thus, it is in the interests of social media sites to provide information to users that makes them feel comfortable rather than challenged. The rise of the problem of misinformation online is predicated on the preferences of individual users and a collective addiction to information gratification. It does not matter if one believes that attention-grabbing, emotive content is frivolous or stupid. It does not matter if one believes that high-quality, impartial journalism is important and should be supported. If a user clicks on clickbait headlines while procrastinating, working or sleeping, they are contributing to the furtherance of this phenomenon by driving advertising revenue and sending the message to social media sites that the user wants to see more of that kind of content. When people’s information and online actions become the commodity through which the spaces in which they inhabit function, then people need to be much more careful about what they do within these spaces. Coffee shops (the quintessential space of the Habermasian

public sphere) made money whatever kind of conversation individuals had within them and whatever kind of newspaper they read at the table. This is not true of social media sites. All user actions within these sites are quantified and commodified. This fact is at the heart of the current climate of misinformation and opinion manipulation. A second way in which these technologies enable computational propaganda is through promoting the need to stay constantly informed, with a focus on trending topics, what is “hot” now, and popular threads, users, and keywords. This is part of a discourse of urgency that keeps people coming back to these sites, and to encourage continual consumption and exposure to advertising. This is also linked to a discourse of popularity that promotes increased production and consumption of information. These discourses harken to the idea of a marketplace of ideas, in which if there is a free market for information, then correct and useful ideas can rise to the top. However, trending topics do not actually represent correct and useful ideas rising to the top. They are skewed toward newness to keep users coming back, often include sponsored and advertising content, and are tailored to individual users based on their location and preferences. They are also, as the studies on computational propaganda show, subject to outside manipulation. The influence of this manipulation is increased because users do not understand how these sites work and who has influence over them. These sites enable the external manipulation of the information contained in them (such as computational propaganda) because it is in their business interest to manipulate this information internally for advertising, data collection, and providing a comfortable experience that keeps users coming back frequently. Thus, the current status of computational propaganda is intimately linked with the commercial nature of the social media sites on which this propaganda is disseminated.


Conclusion

This chapter has provided an overview of recent research into the prevalence and influence of computational propaganda in the U.S., Europe, and China. These studies have shown that there is a large variation in the prevalence of misinformational content and the computational techniques used to disseminate this content in different political events. The 2016 U.S. Presidential Election, and particularly the campaign of Donald Trump, seems to show much higher rates of computational propaganda than the elections and referenda held in 2016 and 2017 in the U.K., France, and Germany. Research has found little evidence, thus far, of Russian influence in these elections. However, the anonymity of the Internet and the sophistication of the technical phenomenon of computational propaganda mean that more research should be done to attempt to understand the extent of cross-national computational propaganda. In authoritarian China, little evidence has been found of state-driven automation, despite researchers looking specifically for it. The differences in the Chinese context, such as cheaper labor costs, a large state workforce who can execute state propaganda initiatives manually, and a populace who can mobilize to advance the state agenda, mean the kind of computational propaganda found in the U.S. and Europe does not seem prevalent either on the Chinese mainland or in cross-Strait propaganda (see Monaco, 2017, for a discussion of computational propaganda in Taiwan). However, recent research has uncovered automated, computational propaganda on Twitter promoting anti-Chinese state content in simplified Mandarin.


The differences in the prevalence and form of computational propaganda in different contexts point toward the need for a greater emphasis on the social and political, rather than the technological, in understanding this phenomenon. Current research providing data on the prevalence of computational propaganda has gone a long way to advancing knowledge in this area. However, future studies should broaden and deepen their remit to understand why the political economy of online platforms and the nature of information consumption in the Internet age seem to promote the conditions in which users and political processes are particularly susceptible to propaganda messages.

References Bolsover, Gillian. 2017a. “Technology and Political Speech: Commercialisation, Authoritarianism and the Supposed Death of the Internet’s Democratic Potential.” Oxford, U.K.: University of Oxford. ———. 2017b. “Computational Propaganda in China: An Alternative Model of a Widespread Practice.” http://comprop.oii.ox.ac.uk/wp-content/ uploads/sites/89/2017/06/Comprop-China.pdf. Bolsover, Gillian, and Philip N. Howard. 2017. “Computational Propaganda and Political Big Data: Moving toward a More Critical Research Agenda.” Big Data, December. Bradshaw, Samantha, Bence Kollanyi, Clementine Desigaud, and Gillian Bolsover. 2017. “Junk News and Bots during the French Presidential Election: What Are French Voters Sharing Over Twitter?” 2017.3. COMPROP Data Memo 2017.3. Desigaud, Clementine, Philip N. Howard, Samantha Bradshaw, Bence Kollanyi, and Gillian Bolsover. 2017. “Junk News and Bots during the French Presidential Election: What Are French Voters Sharing Over Twitter In Round Two?” 2017.4. COMPROP Data Memo 2017.4. Free Tibet. 2014. “Free Tibet Exposes #ChinaSpam on Twitter | Free Tibet.” July 17, 2014. https://freetibet.org/news-media/na/free-tibet-exposeschinaspam-twitter.


Fu, King-wa, Chung-hong Chan, and Marie Chau. 2013. “Assessing Censorship on Microblogs in China: Discriminatory Keyword Analysis and the RealName Registration Policy.” Internet Computing, IEEE 17(3): 42–50. Garth, Jowett, and Victoria O’Donnell. 1999. Propaganda and Persuasion. London: Sage. Howard, Philip N., Gillian Bolsover, Bence Kollanyi, Samantha Bradshaw, and Lisa-Maria Neudert. 2017. “Junk News and Bots during the U.S. Election: What Were Michigan Voters Sharing Over Twitter?” 2017.1. COMPROP Data Memo 2017.1. Howard, Philip N., Bence Kollanyi, Samantha Bradshaw, and Lisa-Maria Neudert. 2017. “Social Media, News and Political Information during the U.S. Election: Was Polarizing Content Concentrated in Swing States?” 2017.8. COMPROP Data Memo 2017.8. Institute for Propaganda Analysis. 1995. “How to Detect Propaganda.” In Propaganda, edited by Robert Jackall, pp. 217–224. Basingstoke and London: MacMillan. Jamieson, Amber. 2017. “‘You Are Fake News’: Trump Attacks CNN and BuzzFeed at Press Conference.” The Guardian, January 11, 2017, sec. U.S. news. https://www.theguardian.com/us-news/2017/jan/11/trumpattacks-cnn-buzzfeed-at-press-conference. Kaminska, Monica, Bence Kollanyi, and Philip N. Howard. 2017. “Junk News and Bots during the 2017 U.K. General Election: What Are U.K. Voters Sharing Over Twitter?” 2017.5. COMPROP Data Memo 2017.5. King, Gary, Jennifer Pan, and Margaret E. Roberts. 2016. “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument,” June. http://gking.harvard.edu/ files/gking/files/50c.pdf?m=1464086643. King, Gary, Jennifer Pan, and Molly Roberts. 2012. “How Censorship in China Allows Government Criticism but Silences Collective Expression.” In APSA 2012 Annual Meeting Paper. http://papers.ssrn.com/sol3/ papers.cfm?abstract_id=2104894. Kollanyi, Bence, Philip N. Howard, and Samuel C. Woolley. 2016. “Bots and Automation over Twitter during the U.S. Election.” Data Memo


2016.4. Oxford, U.K.: Project on Computational Propaganda. http:// comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2016/11/DataMemo-US-Election.pdf. Lasswell, Harold D. 1995. “Propaganda.” In Propaganda, edited by Robert Jackall, pp. 13–25. Basingstoke and London: MacMillan. LoBianco, Tom. 2017. “The 4 Russia Investigations in Congress, Explained.” CNN. April 25, 2017. http://www.cnn.com/2017/04/25/politics/congressrussia-investigations/index.html. Monaco, Nick. 2017. “Computational Propaganda in Taiwan: Where Digital Democracy Meets Automated Autocracy.” http://comprop.oii. ox.ac.uk/wp-content/uploads/sites/89/2017/06/Comprop-Taiwan-2.pdf. Mustafaraj, Eni, and P. Takis Metaxas. 2010. “From Obscurity to Prominence in Minutes: Political Speech and Real-Time Search.” Proceedings of the WebSci10: Extending the Frontiers of Society On-Line, April 26-27th, 2010, Raleigh, NC: US. http://repository.wellesley.edu/computerscience faculty/9/. Narayanan, Vidya, Philip N. Howard, Bence Kollanyi, and Mona Elswah. 2017. “Russian Involvement and Junk News during Brexit.” 2017.10. COMPROP Data Memo 2017.10. Neudert, Lisa-Maria. 2017. “Computational Propaganda in Germany: A Cautionary Tale.” http://comprop.oii.ox.ac.uk/wp-content/uploads/ sites/89/2017/06/Comprop-Germany.pdf. Neudert, Lisa-Maria, Bence Kollanyi, and Philip N. Howard. 2017a. “Junk News and Bots during the German Federal Presidency Election: What Were German Voters Sharing Over Twitter?” 2017.2. COMPROP Data Memo 2017.2. ——— . 2017b. “Junk News and Bots during the German Parliamentary Election: What Are German Voters Sharing over Twitter?” 2017.7. COMPROP Data Memo 2017.7. Ng, Jason. 2015. “Tracking Censorship on WeChat’s Public Accounts Platform.” The Citizen Lab (blog). July 20, 2015. https://citizenlab. org/2015/07/tracking-censorship-on-wechat-public-accounts-platform/.


Phillips, Tom. 2017. “China Accuses Western Media of ‘fake News’ about Human Rights.” The Guardian, March 2, 2017, sec. World news. https:// www.theguardian.com/world/2017/mar/02/china-accuses-westernmedia-fake-news-human-rights. Pratkanis, Anthony, and Elliot Aronson. 1992. Age of Propaganda: The Everyday Use and Abuse of Persuasion. Santa Cruz: University of California. Solon, Olivia, and Julia Carrie Wong. 2016. “Facebook’s Plan to Tackle Fake News Raises Questions over Limitations.” The Guardian, December 16, 2016, sec. Technology. https://www.theguardian.com/technology/2016/ dec/16/facebook-fake-news-system-problems-fact-checking. Song, Fangmin. 2015. “State Governance in the Internet Era.” Translated by Rogier Creemers. Red Flag Manustripts, May 22, 2015. https:// chinacopyrightandmedia.wordpress.com/2015/06/01/state-governancein-the-internet-era/. The Japan Times. 2017. “Russia-Linked ‘Computational Propaganda’ Campaigns Seen Distorting Public Opinion Worldwide.” The Japan Times Online, June 21, 2017. https://www.japantimes.co.jp/news/2017/06/21/ business/russia-linked-wave-computational-propaganda-misinformationseen-manipulating-public-opinion-worldwide/. Wright, Joss. 2014. “Regional Variation in Chinese Internet Filtering.” Information, Communication & Society 17(1): 121–141.


Chapter 6

CIVILIANS IN THE INFORMATION OPERATIONS BATTLEFRONT: CHINA'S INFORMATION OPERATIONS IN THE TAIWAN STRAITS

GULIZAR HACIYAKUPOGLU AND BENJAMIN ANG

Introduction

In the murky world of state-on-state information operations (IO),1 deniability and difficulty of attribution are powerful weapons, especially when states claim that operations have been carried out by ordinary citizens or private companies, and not by state forces. These claims need to be examined in the context of countries where the line between the state and its citizens is not clear, such as the People's Republic of China (PRC), whose concept of "People's War" suggests the need to mobilize the masses as auxiliary forces in addition to the traditional army (Mao, 1972: 88–90). The People's War operates in congruence with the PRC's "Three Warfares" approach (san zhong zhanfa) and "unrestricted" approach to warfare, which embraces winning opponents' minds as a goal, and lifts the boundaries between military and non-military operations, and between war and peacetime (Lee, 2014: 201). Within these approaches, the PRC's experience in non-military coercion, the execution of the "People's War" within the context of IO, and its heightened interest in cyberspace signal the potential intensification of IO conducted by non-military people to deceive and manipulate the populations of adversaries, including Taiwan.

Taiwan acknowledges the threat of the PRC's IO targeted at the minds of the Taiwanese. Accordingly, it has been taking initiatives to equip its citizens against IO-related undertakings,2 and to integrate them in defence endeavors ("All-Out Defense Education").

1 Information Operations is the "integrated employment of the core capabilities of electronic warfare, computer network operations, psychological operations, military deception and operations security, in concert with specified supporting and related capabilities, to influence, disrupt, corrupt or usurp adversarial human and automated decision making while protecting our own." See "What are information operations," Air University, http://www.au.af.mil/info-ops/what.htm.
2 See, for example, Nicola Smith, "School kids in Taiwan Will Now be Taught How to Identify Fake News," Time, April 7, 2017, http://time.com/4730440/taiwan-fake-newseducation/; and Sophia Yang, "Taiwan declares war on fake news from China," Taiwan News, January 3, 2017, https://www.taiwannews.com.tw/en/news/3062594.


Taiwan can further leverage its democratic system, global social media platforms, national solidarity, and citizen–military relations to further incorporate citizens as proactive forces in the campaign against (foreign) state-sponsored IO. Other states in the region that are concerned about foreign state-sponsored IO can also draw lessons from these experiences and responses.

This chapter analyzes the role of non-combatant citizens in the PRC's IO against Taiwan, with an emphasis on the attacks aimed at manipulating public opinion and psychology. The role of civilians in state-sponsored IO remains an underdeveloped topic, especially within the framework of the Taiwan Straits conflict. This research contributes to this nascent area, which requires attention due to the importance of the Taiwan Straits for the political balance in the region,3 Taiwan's alleged emergence as a test-bed for the cyber operations of the PRC (Gold, 2013, July 19), and the need to understand the critical issue of civilian involvement in IO. This chapter also suggests possible lessons that can be learned by other states that are concerned about state-sponsored IO.

At the time of writing, much attention has been given to the role of "bots"4 in conducting IO in Europe and the U.S. However, the evidence on the PRC's employment of bots in its IO remains limited (Bradshaw and Howard, 2017: 13; Monaco, 2017: 21), possibly because the PRC has the capacity to mobilize a large state workforce and civilian populace instead (Monaco, 2017: 21).

3 See, for example, the concerns raised in "Taiwan to Boost Defense Spending, U.S. Concerned over Possible Military Imbalance" Reuters, 30 Oct. 2017, www.reuters.com/article/us-taiwan-usa/taiwan-to-boost-defense-spending-u-s-concerned-overpossible-military-imbalance-official-media-idUSKBN1CZ07H.
4 Bots are computer codes designed to mimic human activity online. These automated accounts comment on, post, forward, and like online information or comments.


This unique approach requires "greater emphasis on the social and political rather than the technological" in researching means of identifying and countering IO from the PRC (Bolsover and Howard, 2018).5

5 As stated by Gillian Bolsover and Philip Howard in the preceding chapter of this book, "Computational Propaganda in Europe, the U.S., and China".

"Three Warfares" and the People's War

The PRC acknowledges the limitations of kinetic warfare in responding to today's hybrid challenges (Green, 2016: 1), the need to advance in cyber operations (Chase and Chan, 2016), and the use of information as a weapon of war (Fravel, 2015: 1, 4). The PRC's broadening and constantly developing perspective on warfare rests on the concept of "Unrestricted Warfare". Unrestricted Warfare builds on the cooperation of domains that were once thought to be separate (e.g., politics, economics, diplomacy, etc.) and suggests the concurrent operation of the measures of these different domains, such as "cultural infiltration", "media propaganda", and "formulating and applying international rules", as "military means" (Liang and Xiangsui, 1999: 192). It espouses the employment of all the necessary means, including "military and non-military", "armed force or non-armed force", and "lethal and nonlethal" opportunities, to "compel" the opponent to submit to one's demands (Liang and Xiangsui, 1999: 7). The PRC does not maintain a strict separation between military and non-military operations, or between peace and wartime, and places great importance on winning adversaries' minds (Lee, 2014: 201–202). With this, it also blurs the distinction between IO and Information Warfare (IW), which is a subset of IO carried out in times of conflict or war (Barrett JR, 2005: 684; Hutchinson, 2006: 215–216). The "Three Warfares" and the "People's War" concepts fit into and build on this unrestricted approach to warfare.


Framework: 'Three Warfares'

Born in the era of Unrestricted Warfare, "Three Warfares" addresses the challenge of winning minds and lays the guiding framework for IO (Raska, 2015, December 18) through three "mutually-reinforcing" approaches: (1) "psychological operations", (2) "public opinion" and "media manipulation", and (3) "legal warfare" (Raska, 2015, December 18; Raska, 2015, December 2; Cheng, 2012, May 21). Psychological warfare is waged to interrupt decision-making processes and instil insecurity and "antiwar sentiments" (Cheng, 2012, May 21). Public opinion or media warfare involves "media manipulation" targeted at swaying public opinion (Raska, 2015, December 18; Raska, 2015, December 2; Cheng, 2012, May 21). Legal warfare aims to influence "strategies, defense policies, and perceptions of target audiences abroad" (Raska, 2015, December 18). The discussions of this chapter fall within the spectrum of public opinion (media) and psychological warfare.

Operationally, "Three Warfares" is supervised by the General Political Department's (GPD) Liaison Department, which operates in coordination with the People's Liberation Army (PLA) and has a group that concentrates on Taiwan operations (Raska, 2015, December 18). "Three Warfares" is also supported by the Chinese Communist Party's (CCP) (also called Communist Party of China (CPC))6 Central Foreign Propaganda Office (CFPO), which, among other things, studies public sentiment; releases foreign propaganda guidelines, plans, and methods; and promotes China's views, international public relations, movies, and book production (Lee, 2014: 211).

6 Chinese Communist Party (CCP) and Communist Party of China (CPC) are used interchangeably by different sources.


People's War

The "Three Warfares" benefits from the People's War, which provides the grounds for the use of civilians (non-military) against opponents to compel them to accept one's interests. The "People's War" concept dates back to Mao Zedong (Chase and Chan, 2016: 31), and it is inspired by the vision of a prolonged, territorial war that encircles the opponent in Chinese lands and gradually "destroy[s] it through attrition" (Anderson and Engstrom, 2009: 3). "People's War" propagates the need to complement conventional forces with a non-military affiliate "people's militia" (Zedong, 1972: 88–90; Bullard, 1997: 6), and leverages the organizational weapon, which seeks to organize the non-military and non-political populace as masses impacting the outcome of conflicts (Bullard, 1997: 6, 8). "Mass mobilization" by propaganda; sabotage and espionage; and kindling domestic turmoil by exploiting internal opposition are some of the strategies employed within this spectrum (Bullard, 1997: 9–11).

Civilian contributions to the military fall under two categories: "absorption" and "integration" (Green, 2016: 2). Absorption refers to the recruitment of civilians for the operations of the People's Liberation Army (PLA), the Ministry of State Security (MSS), the Office of Propaganda, and other related bodies (Green, 2016: 2). Integration refers to civilians who are employed by civilian industry, but who may be requested to assist the government in IW (or IO) when desired (Green, 2016: 2; emphasis added). Integration can be further divided into a "formal procurement relationship", "formal outsourcing" (further divided into "commercial transactions" and "government payroll"), "transactional and coerced outsourcing", and "operational insourcing" (Sheldon and McReynolds, 2015: 191).


While this categorization excludes "episodic" civil–military collaborations (Sheldon and McReynolds, 2015: 191), such temporary engagements may still play a role in operations. Civilian groups assisting IW efforts are often hired from and arranged into "'cells' within government, telecommunications and academic institutions" (Green, 2016: 3). They are regularly called on by the PRC to circulate propaganda, shape domestic and international opinions, cement the regime's credibility, and to dispel the risk of "collective action" against the government (Green, 2016: 4–5). For instance, the "50c Party" is a group of hired commentators who manipulate the information space online by dumping large volumes of information supportive of the party policies or deflecting attention from sensitive issues (King et al., 2017).

The involvement of civilians is not limited to hired citizens or industry professionals. There are volunteer groups such as "volunteer 50c members" ("bring your own grainers"), "little red flowers", and the "American Cent Party", who distribute party propaganda or anti-Western publicity without any monetary attachment and with no organized structure (King et al., 2017: 1–2). These groups cannot be neatly subsumed under the categories of "absorption" and "integration" because, aside from military–civilian engagement, some act individually or participate because of personal motivations, and it is hard to identify the nature of their ties to the official sources and their reasons for joining such operations.

Additionally, the reserve forces provide support to IW/IO efforts. For instance, Echeng district in Hubei province was cited as an example of a reserve force "training base for IW" and of "district level" IW groups conducting "electronic warfare" and "intelligence and psychological warfare" (Anand, 2006, October). The future may bring a swell in the size of auxiliary non-combatant forces, especially with the further implementation of the Social Credit system in China.


China's Social Credit system scores the social performance of citizens, and grants or restricts their access to important services, such as means of transportation, based on their score (Botsman, 2017, October 21). This could potentially create an incentive for citizens to participate in pro-China IO for the purposes of improving their Social Credit score.

The PRC's Use of Civilian Force in its Operations Against Taiwan

Taiwan's ongoing defence strategy continues to prepare for a future PRC attack against the island, and considers the PRC's capabilities to wage "multiple forms of warfare" a significant threat (Grossman and Chase, 2017, June 14). The prominent areas where the PRC's "People's War" on the minds of the Taiwanese unfolds include media manipulation via control over media owners, content creators (including the 50c army) and journalists, and the activities of the United Front (including espionage).

Media Manipulation via Media Owners, Content Creators, and Journalists

Media is an integral apparatus for manipulating public opinion, agenda setting, constructing opinion on politics and political actors, and framing issues in a particular way (Hutchinson, 2006: 216). While the PRC's tightly-knit media control makes it difficult for foreign agents to penetrate and spread influence, Taiwan's open media sphere and access to PRC information sources render its citizens more vulnerable to IO, especially the public opinion and psychological warfare of the PRC. Deception and manipulation by way of media personnel occur mainly through the channels of media owners, content creators, and journalists.


Media owners

Various Taiwanese sources have voiced concerns about alleged PRC propaganda being transmitted by local media channels of conglomerates with businesses or stakes in Mainland China (Kenichi, 2014: 25; Wang, 2008, November 5). For instance, the Want Want Group, whose media holdings include three Taiwan newspapers, a television station, various magazines, and a cable network (Flannery, 2014, August 6; Wang, 2008, November 5), derives a significant portion of its revenues from its operations in the PRC (Wang, 2008, November 5). Its chairman has been accused of trying to shape opinion through media that echo the views of Beijing, of having the PRC's financial backing, and of allowing "very biased" reporting in favour of positive news about China (Higgins, 2012, January 21).

Content creators, hackers, and internet commentators

Taiwan's national security apparatus claimed in 2017 that the disinformation campaign on President Tsai Ing-wen's pension reform was orchestrated by "Chinese elements" who "mobili[zed] protesters" and triggered "disinformation" circulation in the electronic media, including the social media sites LINE and WeChat, and who potentially leveraged "content farms",7 flocks of hired content creators, in this operation (Cole, 2017, July 18).

7 "Content farms" refers to the large numbers of paid content creators who create and disseminate large amounts of content in a way that manipulates online search results.


The Taiwanese Ministry of National Defence also reported in 2017 that Chinese hackers working for the Chinese government had attempted to inundate "online forums with negative and misleading information about Han Kuang [military] exercises", echoing the previous year when they had shared trumped-up photographs depicting Chinese jets scouting over (Tien-pin and Heterington, 2017, May 30). Various sources attribute these attacks, although this is hard to prove, to "Internet commentators" or "50c party" members: the PRC's large team of hired content creators who deflect unwanted conversations or spread celebratory comments about the CCP in cyberspace in the guise of ordinary citizens (King et al., 2017: 1–2). The 50c Party's endeavours are complemented by the operations of the "Internet water army", which has "astroturfers"8 offering paid services to "companies and other actors" for promoting their interests, and by volunteer cyber forces such as the "volunteer 50c members" ("bring your own grainers"), "little red flowers", and "American Cent Party" (King et al., 2017: 1–2). While some of these groups conduct international operations, others focus on the domestic sphere. According to Gary King and colleagues (2017), the PRC "fabricates and posts about 448 million social media comments a year nationwide" (King et al., 2017: 26). Some of this fabricated content is available to the Taiwanese, who share the same language with the PRC.

One prominent attack took place in 2016, when an online forum in the PRC galvanized millions of pro-China PRC netizens to show their disapproval of the election of (pro-independence) Taiwanese President Tsai by attacking her Facebook page as well as the comments sections of some news sites (e.g., Apple Daily, SETN.com) (Tang, 2016, January 21).

8 Astroturfing is a covert and orchestrated effort to instil a sense of "widespread" "grassroots" support for an intended occasion, policy, persona, or brand. See "Astroturfing," Merriam Webster, https://www.merriam-webster.com/dictionary/astroturfing. See also Adam Bienkov, "Astroturfing: What is it and why does it matter?," The Guardian, February 8, 2012, https://www.theguardian.com/commentisfree/2012/feb/08/what-is-astroturfing.


These PRC web users appear to be volunteers who were either spontaneously expressing the views of China's youth, or who had been successfully shaped by years of thought and political courses in school (Tang, 2016, January 21). While domestic IO takes place within the PRC's Great Firewall, international attacks like this involve volunteers using social media platforms that are outlawed in the PRC. This is a conundrum because, in the PRC, access to Facebook and some of the other sites used for disinformation campaigns is prohibited (Tang, 2016, January 21; Monaco, 2017), and the use of virtual private networks (VPNs) to reach Facebook may be a criminal offence (Tang, 2016, January 21). The question remains open whether the PRC authorities, in fact, sanction such breaches of their Internet control, and whether the results serve the broader objectives of the People's War.

Journalists

The PRC also controls what news is reported in and on the PRC, and constructs its side of the story at the expense of other views, by enforcing a stringent visa policy for journalists (Sciutto, 1996: 136; Johnson, 2016, September 22), influencing journalists through perks, "rewards", and blackmail (Sciutto, 1996: 135), and by pressuring journalists and media companies (Johnson, 2016, September 22). Foreign journalists are also closely monitored during their stay in the PRC (Johnson, 2016, September 22). These measures defend the PRC against foreign penetration (including Taiwan's intervention) into its media ecology, and amplify its power over journalists and media content and companies.

Influence and Espionage by Civilians in Taiwan

United Front Work Department (UFWD)


Another prominent non-military force in the "People's War" is the United Front Work Department (UFWD), which is attached to the CCP's Central Committee (Gao, 2017, October 24). Its mission is to accumulate information and intelligence and to "co-opt" and "control" non-CCP-affiliated "elites", and it has a bureau specifically overseeing Hong Kong, Macao, and Taiwan (Gao, 2017, October 24). Its overseas liaison work is also tasked with propagating the "one country, two systems" ideology, "recruiting pro-China people from overseas", and gathering information (Gao, 2017, October 24). Some of the alleged operations of the UFWD in Taiwan include lobbying by Taiwanese businesspeople with business interests in the PRC to strengthen trade ties between the two countries (Lee and Hung, 2014, November 27). The PRC is also actively luring Taiwan's top graduates with better paid employment in the PRC, contributing to a brain drain on the Taiwanese economy, and raising fears of "attempted social engineering" and influence (Smith, 2017, August 21).

Espionage

Espionage or intelligence-gathering efforts, including the UFWD's information collection efforts, are essential to support IO. In recent years, Taiwan authorities have detained a PRC student studying in Taiwan on suspicion of espionage and breaching national security laws ("Taiwan detains Chinese student", 2017, March 10). The bodyguard of the former Taiwanese Vice-President Annette Lu was allegedly hired by the PRC to "recruit an intelligence officer to gather information for Beijing" (Bristow, 2017, March 18). Additionally, four members of the "pro-unification New Party" (a Taiwanese political party) were arrested for allegedly "violating the National Security Act" (Connor and Smith, 2017, December 20).

Taiwan's Capacity to Respond


Taiwan is faced with an external military and non-combatant danger posed by the PRC, and an internal threat stemming from the risk of espionage, sabotage, and the PRC's exploitation of domestic dissent (Bullard, 1997). Accordingly, Taiwan not only has to defend itself against military operations, but also against "social, political and economic forces" (e.g., spies, defectors, political organizers, etc.) (Bullard, 1997: 5). The challenge lies in confronting these threats within the confines of democratic constitutional values (Bullard, 1997: 11). The key priorities in this conflict are solidarity and nationalism, citizen education, and military–citizen engagement, as well as protecting the Taiwanese from influence while extending Taiwanese influence within and beyond its borders. Taiwan aims to instil these values through the work of the Political Warfare Bureau, other institutions responsible for PRC affairs, the cyber army, and digital platforms, and in doing so, it builds on the early concept of Allegiance Warfare.

Allegiance Warfare

Allegiance warfare9 was a "political socialization"10 and a "non-violent warfare" strategy "aimed at human mind" within the framework of active "civil war" (Bullard, 1997: 13–14). It was proposed as Taiwan's response to the organizational weapon of the PRC (Bullard, 1997: 16) in the times of nation-state building and, arguably, it continues to inform Taiwan's military–citizen engagement and national integration efforts.

9 While Bullard dates the concept back to the establishment of Taiwan and asserts its introduction as a mechanism to counter the 'Leninist organizational weapon threat,' it aided 'national integration' and 'civic development' (see Bullard, "The Soldier," p. 173). Arguably, the persistence of the strategies proposed under allegiance warfare is overt in the efforts of Taiwan's Political Warfare Bureau.
10 Political socialization entails informing citizens on the 'political culture' via official 'education' as well as through unofficial means such as 'family' and 'media.' See Bullard, "The Soldier," pp. 13–14.


It sought to nurture good relations between the military and civilians; its target audience included youth, civilians, and military personnel; and it carried out activities such as education, indoctrination, and propaganda (Bullard, 1997: 68, 77). The Political Warfare Bureau and other contemporary government and military initiatives continue to enhance military–civilian relations, and strengthen national integration and civilian involvement in national defence.

Political Warfare Bureau

The Political Warfare Bureau has been responsible for bridging the military and society, guiding political warfare (Bullard, 1997: 12), conducting the military's political warfare planning and supervision, psychological warfare training, and carrying out political education of civilians ("Development History: Introduction of the General Political Warfare Bureau"). It works with ministries, schools, and other parties in "promo[ting] All-Out Defense Education" ("All-Out Defense Education"), through various means including public education,11 campus rallies, improving in-service and staff training, and instituting effective communication and a review system ("All-Out Defense Education"). Building national solidarity is central to its role, and some of its efforts are aimed at political socialization. The reserve force also helps in this respect: among other things, it seeks to forge better relations between civilians and the military, prepares people for mobilization, ensures the conservation of military skills, and helps advance national solidarity (Easton et al., 2017: 15–17, 20).

11 Holding conferences, shooting training arranged in cooperation with the Ministry of Education, training of educational staff, incorporating information on military and national defense in the instructional content offered to grades 1–9, and promoting 'All-Out' defense are some of the highlighted measures (Political Warfare Bureau: All-Out Defense Education, 2017).


Other Institutions Responsible for the PRC Affairs

There are multiple bodies in Taiwan responsible for responding to the PRC's messages and supervising affairs with the PRC, including the Ministry of Foreign Affairs (MOFA), Ministry of Culture, and Mainland Affairs Committee (MAC). Among them, MAC conducts "research" on, and "plan[s]", "review[s]", and "coordinat[es]" "Mainland policies" and assists the execution of "inter-ministerial programs" ("Organization and Function"). These bodies, with different visions, audiences, and responsibilities,12 need to have consistency and coordination among their operations to be able to deliver a consistent and credible message.

Cyber Army and Cyber Troops

In 2016, Taiwan declared its interest in building a cyber army as a "fourth branch of the armed forces" (Pan, 2016, May 27). A year later, the new Information, Communications and Electronic Force Command was formed under the Ministry of National Defense to oversee cybersecurity "readiness" and explore "electromagnetic technologies" ("Ministry of National Defense launches," 2017, July 03).

12 Rawnsley (2016) argues that the "dissolution" of the Government Information Office (GIO) and the allocation of its work into two different bodies, the Ministry of Foreign Affairs (MOFA) and a new "Ministry of Culture," hamper the central coordination and delivery of a "consistent message." Moreover, GIO and MOFA diverge in their "world-views, audiences, time horizons and priorities," which impedes the construction of a unitary and credible public diplomacy outlook. Furthermore, the existence of other bodies (such as MAC) dealing with PRC affairs further complicates the issue and challenges the delivery of a consistent message on intersecting concerns.


Differently from the official cyber forces, some political parties in Taiwan have their own "cyber troops" that carry out IO (Bradshaw and Howard, 2017: 13). For instance, cyber troops were used in a mayoral race to promote one of the candidates and defame the other (Ko and Chen, 2015: 394–395). While the official cyber army aims to secure Taiwan against cyber threats, these political cyber troops, at times, directly or indirectly aid the PRC's IO from within. On the other hand, there is no evidence so far of the cyber troops being used in operations against the PRC (Monaco, 2017: 15). Hence, the internal operations of these political cyber troops may create confusion in the domestic sphere and may take a toll on trust in government. This, coupled with some of the efforts of cyber troops which champion unification messages and promoters over independence messages and supporters, may aid the PRC by serving its IO goals.

Digital Platforms

Taiwan recently introduced two digital platforms, vTaiwan and Pol.is, that may support its efforts against IO. vTaiwan is a platform for citizens to interact, connect with, and pose questions to government officials, and to receive information on various topics (e.g., to counter fake news) (O'Flaherty, 2017, March 15). The vTaiwan platform has also been used to deliberate legislation-related matters. Pol.is is an artificial intelligence (AI)-facilitated tool that consolidates vast amounts of online communication into "maps of public opinion" and facilitates citizens' inclusion in the policymaking process ("Crowdsourcing Legislation"; Simonite, 2017, June 2). Pol.is is used to push surveys to interest groups, it is credited with helping the ROC "break a six-year deadlock over how to regulate online alcohol sales", and the platform supposedly allowed opposition groups to discern their common concerns (Simonite, 2017). In addition to vTaiwan and Pol.is, the non-governmental initiative G0V has been releasing projects to enhance information "transparency" and "citizen participation" ("About G0V").
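Pol.is's internal pipeline is not documented here, but the general idea behind such "maps of public opinion" can be illustrated with a minimal, hypothetical sketch: participants vote agree or disagree on short statements, and clustering the resulting vote vectors surfaces broad opinion groups and the statements on which they converge. The toy data, the two-group assumption, and the simple k-means routine below are illustrative assumptions, not the platform's actual algorithm.

```python
# Minimal sketch (hypothetical, not pol.is's actual algorithm): group participants
# by their agree/disagree votes on short statements, so that broad "opinion groups"
# and the statements that divide or unite them become visible.
import numpy as np

# Rows = participants, columns = statements; +1 = agree, -1 = disagree, 0 = no vote.
votes = np.array([
    [+1, +1, -1, -1],
    [+1, +1, -1,  0],
    [-1, -1, +1, +1],
    [-1,  0, +1, +1],
    [+1, +1,  0, -1],
    [-1, -1, +1,  0],
], dtype=float)

def kmeans(data, k, iters=50, seed=0):
    """Very small k-means: returns a cluster label for each row of `data`."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every participant to every cluster center.
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean vote vector of its members.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels

labels = kmeans(votes, k=2)
for group in sorted(set(labels)):
    members = np.where(labels == group)[0].tolist()
    consensus = votes[labels == group].mean(axis=0).round(2)
    print(f"Opinion group {group}: participants {members}, mean votes {consensus}")
```

Run on the toy votes above, the sketch separates the two like-minded blocs and shows, per group, which statements attract broad agreement, which is the kind of information an opinion map makes visible to policymakers and participants.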


Disinformation Specific Initiatives

There are proposals to legislate against disinformation in Taiwan (Lin, 2017). As the discussions continue, government entities, industry, and third parties (Hsiao, 2017) are putting non-legislative measures into practice. Taiwan plans to "commission businesses and third parties to set up fact-checking mechanisms" (Hsiao, 2017), and to collaborate13 with Facebook and other social media companies in this endeavor (Liu, 2017, April 10). G0V, a non-governmental body aspiring to enhance "transparency" and "citizen participation" ("About G0V"), has been fighting disinformation with its "browser extension" solution called News Helper, which identifies fake news through crowd-sourcing, notifies the reader of already fact-checked stories, and shares links to the real story or contradictory evidence (McKenzie, 2017, January 9). In addition, G0V funded a bot working on LINE to help users identify fake news (Monaco, 2017: 14–15). Taiwan is also preparing to introduce media literacy into the school curriculum (Smith, 2017, April 7).

13 The Executive Yuan and the National Communications Commission (NCC) made the statement. See Chad Liu, "How to fight fake news in Taiwan," Taipei Times, April 10, 2017, http://www.taipeitimes.com/News/editorials/archives/2017/04/10/2003668418.
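The News Helper code itself is not reproduced here; the following is only a hypothetical sketch of the core idea behind such crowd-sourced fact-checking tools: a story is matched against a community-maintained table of fact-checks and, if a match is found, the verdict and a corrective link are surfaced to the reader. All URLs, names, and the tiny in-memory "database" below are invented for illustration, not G0V's implementation.

```python
# Hypothetical sketch of a crowd-sourced fact-check helper (not G0V's actual
# News Helper code): look up a story in a community-maintained table of
# fact-checks and, if found, surface the verdict and a corrective link.
from dataclasses import dataclass
from typing import Optional
from urllib.parse import urlparse

@dataclass
class FactCheck:
    url: str           # URL of the questionable story
    verdict: str       # e.g. "misleading", "fabricated", "accurate"
    evidence_url: str  # link to the real story or contradicting evidence

# In a real system this table would be filled and reviewed by volunteers.
COMMUNITY_DATABASE = {
    "example-news.invalid/jets-over-taiwan": FactCheck(
        url="http://example-news.invalid/jets-over-taiwan",
        verdict="fabricated",
        evidence_url="http://example-factcheck.invalid/jets-over-taiwan-debunk",
    ),
}

def normalize(url: str) -> str:
    """Reduce a URL to host + path so trivial variations still match."""
    parsed = urlparse(url)
    return (parsed.netloc + parsed.path).strip("/").lower()

def check_story(url: str) -> Optional[FactCheck]:
    """Return the community fact-check for a story, if one exists."""
    return COMMUNITY_DATABASE.get(normalize(url))

if __name__ == "__main__":
    hit = check_story("https://example-news.invalid/jets-over-taiwan/")
    if hit:
        print(f"Flagged as {hit.verdict}; see {hit.evidence_url}")
    else:
        print("No community fact-check found for this story.")
```

The design point such tools rely on is that the expensive work (verifying claims) is done once by volunteers and then reused automatically for every reader who later encounters the same story.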

Is There Room for Improvement?

The mechanisms discussed above play a significant role in raising citizen awareness and building national solidarity. In the information operations sphere, they can provide a foundation for building the citizens' role in combating IO. Building on this foundation, Taiwan needs to enlist new strategies to confront IO in the 21st century, with active participation from citizens.


Integration of Citizens

Citizen participation in policy making

The vTaiwan and pol.is platforms, both of which seek to involve citizens in decision-making, can further enable citizen participation in policymaking to counter the PRC's IO efforts by hosting relevant topics on their platforms. For example, when a contentious policy question needs to be determined (one which may be a prime target for IO), or even the question of how to combat IO itself, such questions can be addressed through these digital platforms, akin to the platforms' existing operations:

(1) First, pol.is can be used to push out surveys on the question and consolidate discussions around the contentious policy question;
(2) The findings of pol.is may be opened to deliberation, and a digital public meeting may be held with the participation of scholars and officials. The scholars and officials may respond to the issues raised and answer the questions of citizens;
(3) Civil society and the government may co-facilitate an in-person stakeholder meeting, and broadcast it to remote participants; and
(4) Finally, the government may bind its action to points that reached consensus or provide a point-by-point explanation of why those consensus points are not (yet) feasible.14

14 vTaiwan draws information from the polls and maps of pol.is, broadcasts expert discussions on the mapped issue, transfers the stakeholder meeting on the subject, and allows the government to make an informed decision on the issue and provide solid reasons for the decision taken. See Liz Barry, "VTaiwan: Public Participation Methods on the Cyberpunk Frontier of Democracy," Civic Hall, August 11, 2016, https://civichall.org/civicist/vtaiwan-democracy-frontier.


Crowd-sourced platforms like G0V can expand their solutions into different formats (besides browsers and LINE bots). They can further integrate citizens into developing solutions aimed at combating public opinion and psychology related operations.

A number of caveats have to be addressed when using these platforms:

(1) Platforms have to be protected against attempts to subvert them as they scale up. At present, pol.is visualizations reveal astroturfing by exposing multiple identical views, but offenders may try to cheat the system (Simonite, 2017, June 2) (a minimal sketch of this kind of duplicate detection follows this list);
(2) pol.is and vTaiwan meetings may not be attended by some citizens, and not all citizens will choose G0V's solutions. It is crucial to expose more citizens to these platforms for greater citizen engagement in decision-making, information verification, and conscious information consumption and circulation. Engagement in pol.is and vTaiwan and use of G0V's solutions may be promoted in classes and other public platforms (e.g., bus stops, housing estates, television, radio, etc.); and
(3) These platforms have to be protected from possible cyberattacks.
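How pol.is actually surfaces identical views is not specified in the sources cited above; the sketch below is only a hypothetical illustration of the simplest version of that signal: normalize submitted comments and flag any wording that many distinct accounts post verbatim. The account names, comments, and threshold are invented for the example.

```python
# Minimal sketch of one astroturfing signal mentioned above (hypothetical, not
# pol.is's actual safeguard): flag comments whose normalized text is submitted
# verbatim by many different accounts.
from collections import defaultdict

def normalize(text: str) -> str:
    """Lower-case and collapse whitespace so trivial edits still match."""
    return " ".join(text.lower().split())

def flag_duplicates(submissions, threshold=3):
    """Return normalized comments posted by at least `threshold` distinct accounts."""
    accounts_by_text = defaultdict(set)
    for account, text in submissions:
        accounts_by_text[normalize(text)].add(account)
    return {text: sorted(accs) for text, accs in accounts_by_text.items()
            if len(accs) >= threshold}

submissions = [
    ("user_a", "Pension reform will ruin the country!"),
    ("user_b", "Pension reform will ruin   the country!"),
    ("user_c", "pension reform will ruin the country!"),
    ("user_d", "I support a gradual, negotiated reform."),
]
print(flag_duplicates(submissions))
```

A signal this simple is easy to evade with paraphrasing, which is exactly the "offenders may try to cheat the system" concern raised in caveat (1); more robust defences would need to combine several such signals.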


Citizen participation in combating IO

Taiwan recently established an Information, Communications and Electronic Force Command under the Ministry of National Defense. This division could seek to further raise the awareness of civilians of IW and IO threats. It could also provide civilians with ammunition in the form of facts debunking current disinformation, and updates on current IW and IO trends. Civilians can play an active role in diverse aspects of IO, including fact-checking, debunking disinformation, and spreading counter-messages.

The civilian role should not extend to unethical tactics like flooding online forums with comments, generating large amounts of repeated content to influence online search results, or fabricating social media comments, as the 50c party is accused of doing. Such actions would damage Taiwan's credibility as an open and democratic state. At the same time, Taiwan should also investigate the party-affiliated cyber troops operating in its territories. It should also examine the existing anti-Chinese state propaganda waged on Twitter to appeal to the Chinese diaspora (Bolsover and Howard, 2018).15

15 As described by Gillian Bolsover and Philip Howard, "Computational Propaganda in Europe, the U.S. and China," in the preceding chapter of this book.

Delivering a Consistent and Timely Message

For its part, the government must deliver a consistent message in a timely manner to preserve citizen support. To achieve this, Taiwan needs to overcome bureaucratic obstacles to communication and establish coordination between the bodies dealing with PRC affairs and responding to the messages coming from the PRC, as failure to do so would allow the PRC to create narratives before the official one is put forth (Rawnsley, 2016, September 2).

Lessons for Other States

Understanding the Threat

This chapter has discussed the information operations in the Taiwan Straits. However, IO, which stretch beyond the Western conception of cyberspace and override the separation between wartime and peacetime, are not restricted to the Taiwan Straits.


For instance, Australia has raised concerns about a "concerted campaign by China to 'infiltrate' Australian politics to promote Chinese interests", and has cited political donations by Chinese businessmen as a channel of influence (Westbrook, 2017, December 5). Other alleged tactics include "monitoring" Chinese students in Australia, "creat[ing] joint ventures" between Chinese state media and "Australian Chinese-language news outlets", and "establishing centres for Chinese language and cultural studies" (Kurlantzick, 2017, December 13).

IO is also not under the monopoly of Russia and China; other states are equipping themselves on this front as well. According to the study by Samantha Bradshaw and Philip N. Howard (2017), 28 countries have cyber troops. In its 2017 report, Freedom House revealed that 30 out of 65 countries included in its study had "pro-government commentators" (Freedom on the Net, 2017). Domestic content manipulation efforts cannot be evaluated in isolation from international affairs, as in today's 'connected' world order domestic politics can impact the international relations and politics of different countries, and as domestic populations may be deceived on issues concerning other countries. More importantly, the recent Cambridge Analytica scandal demonstrated the complex network of agents involved in IO and the variety of measures taken to sway public opinion and psychology.

The examples discussed in this chapter illustrate a key lesson for other states: deliberate online falsehoods may be part of larger multi-pronged information operations, which integrate many diverse tactics both online and offline. The plausible deniability of civilian actions in Unrestricted Warfare, especially in cyber operations and IO, allows attacking states to keep their hands clean, and makes diplomatic response difficult, if not impossible. Target states may have to accept that attribution is a lower priority, and focus on prevention and response instead.


Target states that subscribe to the Western view of cyber operations as involving networks of information technology infrastructures and computers are at a disadvantage against other states, such as the PRC, that take a broader view of the information domain as comprising "subdomains" including "the electromagnetic", "computer network", "psychological", and "intelligence" domains (Sheldon and McReynolds, 2015: 197). Target states need to take account of this broader view to avoid being blindsided by threats that cross several subdomains.

Citizen Participation in Policy Responses

For many of the target states, combating IO by clamping down on information flows would variously compromise their core democratic principles and human rights stance, and adversely affect their international trade and business interests. Moreover, experts like Joseph Nye posit that "openness is a key source of democracies' ability to attract and persuade"; closing access or ending openness would waste this crucial asset (Nye, 2018). For policy responses to have greater legitimacy, they must therefore draw input from multiple stakeholders in the public sector, private sector, and civil society. This can take the form of public consultations, town hall meetings, or methods like the vTaiwan process described earlier.

Mobilizing Citizens against IO

Taking a leaf from the "Three Warfares" emphasis on winning minds, states with a history of building national solidarity will be more resilient against IO. Strategies like Taiwan's "All Out Defence" or Singapore's Total Defence ("The Five Pillars of Total Defence") must be kept fresh and updated to respond to modern threats.


Some themes are still deeply relevant, such as Singapore's Social Defence pillar, which stands against "extremist ideologies and racial prejudice and discrimination [that] endanger social cohesion and harmony" and stands for a "gracious and compassionate society when we help the less fortunate and underprivileged among us" ("The Five Pillars of Total Defence"). States that can harness modern information and communication technologies to strengthen such narratives will be in a good position not only to respond to IO threats, but also to proactively mobilize their populations against them.

Citizen Participation in Responding to IO

As states see the opportunity to mobilize their civilian populations against foreign IO, more study is needed to determine how best to implement this: whether by integrating civilians who are hired as needed, by mobilizing volunteers, or by some other means. In doing so, they should abide by human rights guidelines, and ensure the physical and psychological security and well-being of citizens. Nye suggests that democratic states should not mimic or retaliate against the attacks of authoritarian forces by way of surreptitiously orchestrated IO (Nye, 2018, January 24). This is because covert actions would eventually be exposed and consequently damage the democratic states' soft power.16 Instead, true soft power comes from civil societies more than from official public diplomacy efforts.

16 Soft power, according to Nye, is "the ability to affect others by attraction and persuasion rather than through the hard power of coercion and payment." See Nye, "How Sharp."


Sources of soft power include civil societies, entertainment media, educational opportunities, and arts and culture, all of which can benefit from state support, but are best left to be operated and managed by civilians in a liberal media environment. This soft power can then counteract information operations at home, or even deter citizens of the attacker from participating in information operations against the target (Nye, 2018, January 24). This may be the democratic states' version of the "People's War", where civilians participate in building soft power, exposing IO, and strengthening national solidarity and resilience. Authoritarian countries, conversely, have trouble generating their own soft power because of their unwillingness to free the talents of their civil societies (Nye, 2018, January 24). A Chinese author has suggested that "an IW (information warfare) victory will very likely be determined by which country can mobilise the most computer experts and part-time fans… That will be a real People's War" (as cited in Anand, 2006, October). Amid the surreptitious integration of civilians into IO against public opinion and psychology in domestic and international arenas, major preventive steps have to be taken on the international front to avoid further state-on-state fights in cyberspace.

References “About G0V.” G0V. https://g0v.tw/en-US/about.html. “All-Out Defense Education of the Republic of China (Taiwan).” Political Warfare Bureau, M.N.D., January 20, 2017. Accessed on December 20, 2017, http://gpwd.mnd.gov.tw/english/Publish.aspx?cnid=267&p=4421. Anand, Vinod. 2006. “Chinese Concepts and Capabilities of Information Warfare.” Institute for Defence Studies and Analyses, October 2006. https:// idsa.in/strategicanalysis/ChineseConceptsandCapabilitiesofInformation Warfare_vanand_1006. Anderson, Eric C. and Jeffrey G. Engstrom. 2009. “Capabilities of the Chinese People’s Liberation Army to Carry Out Military Action in the Event of a Regional Military Conflict.” Paper for the U.S.-China Economic and


Security Review Commission, (2009): 1–61. https://www.uscc.gov/sites/default/ files/Research/CapabilitiesoftheChinesePeople%27sLiberationArmy toCarryOutMilitaryActionintheEventofaRegionalConflict.pdf. “Astroturfing.” Merriam Webster. https://www.merriam-webster.com/ dictionary/astroturfing. Barrett JR, Barrington M., “Information Warfare: China’s Response to U.S. Technological Advantages.” International Journal of Intelligence and CounterIntelligence 18, (2005): 682–706. http://dx.doi.org/10.1080/ 08850600500177135. Barry, Liz. 2016. “VTaiwan: Public Participation Methods on the Cyberpunk Frontier of Democracy.” Civic Hall, August 11, 2016, https://civichall. org/civicist/vtaiwan-democracy-frontier. Bienkov, Adam. 202. “Astroturfing: what is it and why does it matter?.” The Guardian, February 8, 2012. https://www.theguardian.com/ commentisfree/2012/feb/08/what-is-astroturfing. Bolsover, Gillian, and Philip Howard. 2018. “Computational Propaganda in Europe, the U.S. and China.” In DRUMS, edited by Norman Vasu, Benjamin Ang, Shashi Jayakumar, World Scientific. Botsman, Rachel. 2017. “Big data meets Big Brother as China moves to rate its citizens.” Wired, October 21, 2017. http://www.wired.co.uk/article/ chinese-government-social-credit-score-privacy-invasion. Bradshaw, Samantha and Philip N. Howard. 2017. “Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation.” Computational Propaganda Research Project no 2017.2, (2017): 1–36. http://comprop.oii.ox.ac.uk/wp-content/uploads/ sites/89/2017/07/Troops-Trolls-and-Troublemakers.pdf. Bristow, Michael. 2017. “China accused by Taiwan of stepping up spy operations.” BBC, March 18, 2017. http://www.bbc.com/news/worldasia-39307866. Bullard, Monte R.. 1997. The Soldier and the Citizen: Role of the Military in Taiwan’s Development. New York: M.E. Sharpe, Inc. Chase, Michael S. and Arthur Chan. 2016. China’s Evolving Approach to Integrated Strategic Deterrence. California: Rand Corporation.


Cheng, Dean. 2012. “Winning Without Fighting: Chinese Legal Warfare.” The Heritage Foundation, May 21, 2012. http://www.heritage.org/asia/ report/winning-without-fighting-chinese-legal-warfare. Cole, Michael J. 2017. “Taiwan Confirms China’s ‘Black Hand’ Behind Anti-Reform Protests.” Taiwan Democracy Bulletin, July 18, 2017. https://bulletin.tfd.org.tw/tdb-vol-1-no-10-china-black-hand-protests/. Connor, Neil and Nicola Smith. 2017. “China outcry as Taiwan arrests proBeijing party members accused of spying.” The Telegraph, December 20, 2017. http://www.telegraph.co.uk/news/2017/12/20/china-outcrytaiwan-arrests-pro-beijing-party-members-accused/. “Crowdsourcing Legislation.” Pol.is. accessed January 9, 2018, https://pol. is/home. “Development History: Introduction of the General Political Warfare Bureau, M.N.D.” Political Warfare Bureau, M.N.D.. May 7, 2012, accessed November 10, 2017, http://gpwd.mnd.gov.tw/english/Publish. aspx?cnid=97. Easton, Ian, Mark Stokes, Cortez A. Cooper, and Arthur Chan. 2017. “Transformation of Taiwan’s Reserve Force.” RAND Corporation. https:// www.rand.org/content/dam/rand/pubs/research_reports/RR1700/RR1757/ RAND_RR1757.pdf.20. Flannery, Russell. 2014. “Billionaire’s Media Push Tests The Toughness Of A Taiwan “Strawberry.” Forbes, August 6, 2014. https://www.forbes. com/sites/russellflannery/2014/08/06/billionaires-media-push-tests-thetoughness-of-a-taiwan-strawberry/. Fravel,Taylor M. 2015. “China’s New Military Strategy: ‘Winning Informationized Local Wars’.” China Brief 15, no. 13 (July 2, 2015): 3–7. https://ssrn.com/abstract=2626925. “Freedom on the Net 2017: Manipulating Social Media to Undermine Democracy,” Freedom House, https://freedomhouse.org/report/freedomnet/freedom-net-2017. Gao, Charlotte. 2017. “The 19th Party Congress: A Rare Glimpse of the United Front Work Department.” The Diplomat, October 24, 2017. https://thediplomat.com/2017/10/the-19th-party-congress-a-rareglimpse-of-the-united-front-work-department/.


Gillian Bolsover and Philip Howard. 2018. “Computational Propaganda in Europe, the U.S. and China,” in the preceding chapter of this book. Gold, Michael. 2013. “Taiwan a ‘testing ground’ for Chinese cyber army.” Reuters, July 19, 2013. https://www.reuters.com/article/net-us-taiwan-cyber/taiwan-atesting-ground-for-chinese-cyber-army-idUSBRE96H1C120130719. Green, Kieran Richard. 2016. “People’s War in Cyberspace: Using China’s Civilian Economy in the Information Domain.” Military Cyber Affairs 2, issue 1 (2016): 1–11. doi: 10.5038/2378-0789.2.1.1022. Grossman, Derek, and Michael S. Chase. 2017. “Taiwan’s 2017 Quadrennial Defense Review in Context.” The RAND Blog, June 14, 2017. https:// www.rand.org/blog/2017/06/taiwans-2017-quadrennial-defensereview-in-context.html. Higgins, Andrew. 2012. “Tycoon prods Taiwan closer to China.” The Washington Post, January 21, 2012. https://www.washingtonpost.com/ world/asia_pacific/tycoon-prods-taiwan-closer-to-china/2012/01/20/ gIQAhswmFQ_story.html?utm_term=.bfa9e86cafd6. Hsiao, Russell. 2017. “Taiwan’s ‘Cyber Army’ to Become Operational Before End of 2017.” Global Taiwan Brief 2, issue 16 (April 19, 2017). http://globaltaiwan.org/2017/04/19-gtb-2-16/. Hsiao, Alison. 2017. “Online system to combat ‘fake news’.” Taipei Times, March 17, 2017, http://www.taipeitimes.com/News/taiwan/archives/ 2017/03/17/2003666932. Hutchinson, William. 2006. “Information Warfare and Deception,” Informing Science 9, (2006): 213–223. http://www.inform.nu/Articles/ Vol9/v9p213-223Hutchinson64.pdf. “Information, communication and electronic warfare command formed,” The China Post, June 29, 2017, https://chinapost.nownews.com/ 20170629-2048. Johnson, Ian. 2016. “Foreign Reporters in China Face More Restrictions Now, Report Says.” The New York Times, September 22, 2016. https:// www.nytimes.com/2016/09/23/business/china-foreign-media-penamerica.html. Ken-ichi, Yamada. 2014. “Taiwanese Media’s ‘Going along with Beijing’ is Becoming More Evident,” NHK Broadcasting Culture Research Institute


Media Research and Studies, (February 2014): 1–30. https://www.nhk. or.jp/bunken/english/reports/pdf/report_14020101.pdf. King, Gary, Jennifer Pan, and Margaret E. Roberts. 2017. “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, not Engaged Argument.” (April 2017): 1–42. https://gking. harvard.edu/files/gking/files/50c.pdf. Ko, Man-Chun and Hsin-His Chen. 2015. “Analysis of Cyber Army’s Behaviours on Web Forum for Elect Campaign.” In Information Retrieval Technology, edited by Zuccon G., Geva S., Joho H., Scholer F., Sun A., Zhang P. Lecture Notes in Computer Science 9460, (Springer, Cham, 2015): 394–399. https://doi.org/10.1007/978-3-319-28940-3_32. Kurlantzick, Joshua. 2017. “Australia, New Zealand Face China’s Influence.” Council on Foreign Relations, December 13, 2017. https://www.cfr.org/ expert-brief/australia-new-zealand-face-chinas-influence. Lee, Yimou and Faith Hung. 2014. “Special Report: How China’s shadowy agency is working to absorb Taiwan.” Reuters, November 27, 2014. http://www.reuters.com/article/us-taiwan-china-special-report/specialreport-how-chinas-shadowy-agency-is-working-to-absorb-taiwanidUSKCN0JB01T20141127. Lee, Sangkuk. 2014. “China’s ‘‘Three Warfares’’: Origins, Applications, and Organizations.” Journal of Strategic Studies 37, issue 2 (2014): 198–221. Doi: 10.1080/01402390.2013.870071. Liang, Qiao and Wang Xiangsui. 1999. “Unrestricted Warfare.” PLA Literature and Arts Publishing House (February 1999): 1–228. http:// www.c4i.org/unrestricted.pdf. Lin, Lihyun. 2017. “Taiwan.” Digital News Report. http://www. digitalnewsreport.org/survey/2017/taiwan-2017/. Liu, Chad. 2017. “How to fight fake news in Taiwan.” Taipei Times, April 10, 2017, http://www.taipeitimes.com/News/editorials/archives/2017/ 04/10/2003668418. “LKY School’s Professor Huang Jing identified as ‘agent of influence of a foreign country’: MHA.” Channel NewsAsia, August 4, 2017. https:// www.channelnewsasia.com/news/singapore/lky-school-s-professorhuang-jing-identified-as-agent-of-9093316.


McKenzie, Jessica. 2017. “How Civic Activists Counter Fake News in Taiwan.” Civic Hall, January 9, 2017. https://civichall.org/civicist/civicactivists-counter-fake-news-taiwan/. “Ministry of National Defense launches new cybersecurity command.” Taiwan Today, July 3, 2017. https://taiwantoday.tw/news.php?unit= 2&post=117794. Monaco, Nicholas J. 2017. “Computational Propaganda in Taiwan: Where Digital Democracy Meets Automated Autocracy.” Computational Propaganda Research Project No 2017.2, (2017): 1–34. http://blogs.oii. ox.ac.uk/politicalbots/wp-content/uploads/sites/89/2017/06/CompropTaiwan-2.pdf. Nye, Joseph S.. 2018. “How Sharp Power Threatens Soft Power,” Foreign Affairs, January 24, 2018. https://www.foreignaffairs.com/articles/ china/2018-01-24/how-sharp-power-threatens-soft-power. O’Flaherty, Kate 2017. “Fighting fake news: societies using technology to search for truth.” The Guardian, March 15, 2017. https://www. theguardian.com/public-leaders-network/2017/mar/15/fighting-fakenews-societies-technology-search-truth. “Organization and Function.” Mainland Affairs Council Republic of China. https://www.mac.gov.tw/en/News_Content.aspx?n=803F3469131DAF 19&sms=B82C8C4331A350DC&s=6D9EA46A1BCA7586. Pan, Jason. 2016. “Taiwan to go ahead with cyberarmy plan: ministry.” Taipei Times, May 27, 2016. http://www.taipeitimes.com/News/taiwan/ archives/2016/05/27/2003647240. Raska, Michael. 2015. “China and the ‘‘Three Warfares.” The Diplomat, December 18, 2015. https://thediplomat.com/2015/12/hybrid-warfarewith-chinese-characteristics-2/. Raska, Michael. 2015. “Hybrid Warfare with Chinese Characteristics.” RSIS Commentaries, December 2, 2015. https://www.rsis.edu.sg/rsispublication/rsis/co15262-hybrid-warfare-with-chinese-characteristics/#. WgT4exRMGf4. Rawnsley, Gary. 2016. “Taiwan’s trouble talking to the world.” China Policy Institute: Analysis, September 2, 2016. https://cpianalysis. org/2016/09/02/taiwans-trouble-talking-to-the-world/.


Sciutto, James E. 1996. “China’s Muffling of the Hong Kong Media.” American Academy of Political and Social Science, 547, (September 1996): 131–143. Sheldon, Robert and Joe McReynolds. 2015. “Civil-Military Integration and Cybersecurity.” In China and Cybersecurity: Espionage, Strategy, and Politics in the Digital Domain, edited by Jon R. Lindsay, Tai Ming Cheung, and Derek S. Reveron, pp. 188–225. New York: Oxford University Press. Simonite, Tom. 2017. “The Internet Doesn’t Have to Be Bad for Democracy.” MIT Technology Review, June 2, 2017. https://www.technologyreview. com/s/607990/the-internet-doesnt-have-to-be-bad-for-democracy/. Smith, Nicola. 2017. “Taiwan Is Suffering From a Massive Brain Drain and the Main Beneficiary is China.” Time, August 21, 2017. http://time. com/4906162/taiwan-brain-drain-youth-china-jobs-economy/. Smith, Nicola. 2017. “School kids in Taiwan Will Now be Taught How to Identify Fake News.” Time, April 7, 2017. http://time.com/4730440/ taiwan-fake-news-education/. “Taiwan detains Chinese student in unusual suspected spying case.” Reuters, March 10, 2017. https://www.reuters.com/article/us-taiwanchina-students/taiwan-detains-chinese-student-in-unusual-suspectedspying-case-idUSKBN16H13T. “Taiwan to boost defense spending, U.S. concerned over possible military imbalance: official media.” Reuters, October 30, 2017. https://www.reuters. com/article/us-taiwan-usa/taiwan-to-boost-defense-spending-u-s-concernedover-possible-military-imbalance-official-media-idUSKBN1CZ07H. Tang, Didi. 2016. “Chinese people are flooding the Internet with a campaign against Taiwan.” Business Insider Military & Defense, January 21, 2016. http://www.businessinsider.com/chinese-people-are-flooding-theinternet-with-a-campaign-against-taiwan-2016-1/?IR=T. “The 5 Pillars of Total Defence.” The Ministry of Defence Singapore. https://www.mindef.gov.sg/oms/imindef/mindef_websites/topics/ totaldefence/about_us/5_Pillars.html.


Tien-pin, Lo and William Hetherington. 2017. “MND halts China ‘fake news’ hack.” Taipei Times, May 30, 2017. http://www.taipeitimes. com/News/taiwan/archives/2017/05/30/2003671564. Wang, Lisa. 2008. “China Times Group is sold to Want Want.” Taipei Times, November 5, 2008. http://www.taipeitimes.com/News/biz/ archives/2008/11/05/2003427822. “What are information operations.” Air University Cyberspace and Information Operations Study Center. N.d. http://www.au.af.mil/infoops/what.htm. Westbrook, Tom. 2017. “Australia, citing concerns over China, cracks down on foreign political influence.” Reuters, December 5, 2017. https://www.reuters. com/article/us-australia-politics-foreign/australia-citing-concerns-over-chinacracks-down-on-foreign-political-influence-idUSKBN1DZ0CN. Yang, Sophia. 2017. “Taiwan declares war on fake news from China.” Taiwan News, January 3, 2017. https://www.taiwannews.com.tw/en/ news/3062594. Zedong, Mao. 1972. Quotations from Chairman Mao Tsetung. Pekin: Foreign Languages Press.


PART 3

COUNTERING DRUMS


CHAPTER 7

INTEGRATING RESILIENCE IN DEFENSE PLANNING AGAINST INFORMATION WARFARE IN THE POST-TRUTH WORLD

JĀNIS BĒRZIŅŠ

War is about politics. This is an obvious and quite simplistic statement, which is based on Clausewitz. As he puts it, war “[…] is not merely a political act but a real political instrument, a continuation of political intercourse, a carrying out of the same by other means. [...] The political design is the object, while war is the means, and the means can never be thought of apart from the object” (Clausewitz, 2000, 280). As a result, since the objective of war is to achieve


political victory, nowadays it is simpler and less costly to engage in indirect warfare through influence campaigns or, as Sun Tzu puts it, “warfare is the art (Tao) of deceit”. The use of influence therefore operationalizes a new form of warfare, one not characterized as a military campaign in the classical sense but focused instead on using military instruments to achieve non-military objectives and non-military instruments to achieve military objectives at the same time.

The Fourth Dimension of Modern War

Nowadays, the idea of influence is at the very center of operational planning. Planners use levers of influence to achieve operational objectives: skillful internal communications; deception operations; psychological operations; and well-constructed external communications. The main objective is to influence an enemy audience to legitimize specific strategic objectives. These methods focus on an opponent’s inner sociocultural decay ensuing from a culture war exploited by specially prepared forces and commercial irregular groupings. The battleground extends beyond the traditional three dimensions of air, sea, and land to incorporate contactless information and psychological warfare, becoming a war of perceptions fought in cyberspace and in the human mind. A combination of political, economic, information, technological, and ecological influence campaigns creates permanent asymmetric warfare as the natural state of life in an identified operational environment. The main differences between traditional and the current form of warfare can be summarized in 10 points:

1. From direct destruction to direct influence
2. From direct annihilation of the opponent to its inner decay
3. From a war with weapons and technology to a culture war


4. From a war with conventional forces to a war with specially prepared forces and commercial irregular groupings
5. From the traditional (3D) battleground to information/psychological warfare and a war of perceptions
6. From direct clash to contactless war
7. From a superficial and compartmented war to a total war, including the enemy’s internal social, economic, cultural, and political structures
8. From war in the physical environment to a war in the human consciousness and in cyberspace
9. From symmetric to asymmetric warfare by a combination of political, economic, information, technological, and ecological campaigns
10. From war in a defined period of time to a state of permanent war as the natural condition of national life.

This makes it possible to establish, as a military strategy, a narrative that serves as an alternative reality, in which support by society for the strategic objectives of war, in other words the legitimization of war, is fundamental to achieving victory. The success of military campaigns in the form of armed conflicts and local wars depends more on the relationship between military and non-military factors — the political, psychological, ideological, and informational elements of the campaign — than on military power as an isolated variable. The security of information systems at a technological level and the willingness of citizens to defend their country at a cognitive level are fundamental elements in guaranteeing the security of any country. These tactics involve a notion of permanent war that assumes an equally permanent enemy. Therefore, one of the fundamental aspects of modern warfare is the idea that the main battlespace is in the mind. As a result,


new-generation wars will be dominated by information and psychological warfare. This dimension of warfare aims to morally and psychologically depress both the enemy’s military personnel and civilian population in order to reduce the deployment of hard military power to the minimum necessary. It makes the opponent’s military and civil population support the attacker to the detriment of their own country, values, culture, political system, and ideology. It has the purpose of constructing a specific worldview within the population, expressed ideologically through political support for the interests of the opponent. The main objective is to create an alternative reality as strategy. The idea is that support by society in a country at war for the strategic objectives of that war, in other words the legitimization of war, is fundamental for achieving victory. Put differently, the success of any military campaign in the form of armed conflict depends on the relationship between military and non-military factors — the political, psychological, ideological, and informational elements of the campaign — rather than on pure military might (Chekinov and Bogdanov, 2010, 13–22). This is facilitated by the almost absolute freedom of information, combined with the public’s appetite for conspiracy theories and postmodern relativism. With social media becoming one of the most important sources of information, the result is a situation in which everything is true and yet nothing is true at the same time. An invisible occupation, such as one of the informational space, is not by definition considered an occupation in the way a military occupation is. In implementing influence operations, it is recognized that the human mind is warfare’s main battlespace. Information and psychological warfare consequently dominate post-modern wars, achieving superiority by morally and psychologically depressing an enemy’s armed forces and population.


Understanding the Audience

Military psychological and information operations are specifically based upon identifying the key audiences in a population, profiling those audiences’ behaviors — both current and latent — and producing influence pathways to either encourage or mitigate that behavior. The field has developed very advanced profiling techniques — called Target Audience Analysis (TAA) — in which the end state of the program is clearly articulated. Thus, a typical campaign could begin with the question: “Under what circumstances will villagers stop making roadside bombs?” TAA then uses advanced social science research to determine who the key group(s) are to stop that behavior. In many instances, it may not actually be the people exhibiting the behavior, i.e., those laying the roadside bombs. Instead, it is often other groups, perhaps overlooked ones, who are able to exert influence over the target. As illustrated in Crimea, for example, Russia’s main objective was to minimize the need to deploy hard military power by compelling an opponent’s military and civil population to support the attacker against their own government and country. In this example, Western influence — its civilization, its values, culture, political system, and ideology — is the clear enemy. Russia would like to undermine NATO’s Article 5 and weaken the West’s geopolitical influence. Thus, Russia aims to leverage its influence to erode support for NATO and the EU. Russian strategy has focused on using any means to create schisms that disrupt common security interests, including employing single-issue lobbies with divisive messages, well-funded fringe parties, Russia Today, think tanks, and business lobbies, while at the same time using military power as an informational tool and not as hard power to achieve traditional military tactical objectives. This strategy can be explained in five main points (Vorobyov and Kiselyov, 2014, 45–57):


1. Every available means is used to invigorate the unlawful activity of various extremist nationalist, religious, or separatist movements, organizations, and structures aimed at violating the unity and territorial integrity of the country under attack and destabilizing its internal political situation.
2. Purposeful actions provide support for subversive forces, both directly and indirectly.
3. Distributed attacks are used to destroy the country’s social and ideological system.
4. The mechanism of self-destruction and self-annihilation penetrates into the state’s internal structure, into its governance system, as a virus does, surreptitiously and spontaneously.
5. The chief target of the informational impact is the self-awareness of the population, the nation’s mindset.

The ideological dimension in war, winning people’s hearts and minds, is fundamental for victory. The side with a clear system of ideas and a narrative based on a prior understanding of the population to be attacked has a clear operational advantage. Since the ideological dimension of war is fundamental for victory, winning the hearts and minds of the population is decisive. For example, during the Sino-Japanese War, Mao Zedong had a significant advantage over Japan, since his ideology had great appeal and motivated people to fight for his ideals, while the Japanese had no ideology to offer (Yamaguchi, 2012). A significant part of influence operations takes place at a mental level. It is easier for the adversary to achieve its objectives if the society of the state being attacked believes that their country is a failed state that does not care for the interests and needs of its own population, and that the loss of current statehood will bring better living conditions. Thus, public discontent with the social and economic development of the state can result in a significant security vulnerability if war is to be conducted by non-military means. The


following nine points summarize the main objectives to be achieved with influence operations (Nagorny and Shurygin, 2013):

1. Stimulate and support armed separatist actions with the objective of promoting chaos and territorial disintegration.
2. Polarize the elite and society, resulting in a crisis of values followed by a reorientation towards values aligned with Russian interests.
3. Demoralize the armed forces and the military elite.
4. Conduct strategically controlled degradation of the socioeconomic situation.
5. Stimulate a sociopolitical crisis.
6. Intensify simultaneous forms and models of psychological warfare.
7. Incite mass panic, with loss of confidence in key government institutions.
8. Defame political leaders not aligned with Russian interests.
9. Annihilate possibilities to form coalitions with foreign allies.

In the Baltics, the instruments of Russian influence operations are mainly NGOs, informal groups, journalists, academics, artists, opinion leaders, and government officials who may or may not be aware that they are being used as such. The objective is to create discontent about specific issues within specific social groups, to stimulate some sort of closer connection with Russia, or to increase the rejection of Western values. This has been done by inserting disinformation articles in the Russian-language media, using social media (in the form of trolls) to spread fake news or share opinions in favor of Russian interests, promoting events such as May 9,1 and

1. May 9, also known as Victory Day, is a holiday that commemorates the defeat of Nazi Germany in 1945.


commemorating New Year on Moscow time, and establishing a hockey team “owned” by a former KGB colonel to play in the intercontinental “Putin’s” hockey league, just to cite a few examples. The governments of the Baltic States have been closely following the developments in Russian information and influence operations. Since these countries are established democracies, there is no direct repression against the agents unless they are spreading radical and hateful messages or calling for violent action. This includes not prohibiting the broadcast of Russian television and radio, for example. The main policy is to present the population with facts and critical information, and to directly inform the population about such operations, clearly stating who the attacker is (if known), what the objectives are, what the narrative is, and why it is not true. The main narratives used by Russia are:

1. Russian-speaking minorities are marginalized and treated unfairly by the government;
2. The Baltic States are failed states and corruption is widespread;
3. EU membership resulted in economic and social underdevelopment. Latvia should follow its own path without foreign interference;
4. EU membership is equivalent to being in the USSR;
5. NATO membership decreases the overall level of security because of possible Russian countermeasures;
6. Western values are corrupted. Tolerance towards homosexuals and other minorities is presented as the moral degradation of traditional family values;
7. There is no real democracy in the West. Politicians are puppets controlled by the financial system and work against the real interests of the population;
8. Fascism is glorified in the Baltics.


The Receptivity Variable

The success of these methods is directly dependent on the level of “exit” found within a country. This can be understood by using Albert O. Hirschman’s theoretical framework of exit, voice, and loyalty. People can express discontent in two ways: firstly, by directly communicating their dissatisfaction by voice; secondly, by exiting, usually the result of a citizen becoming convinced that voicing has no results. The conclusion is that more voice equates to more loyalty, but more exit means less loyalty. To exit makes sense in economics, as it is the way the market mechanism works. However, at the political level, it is associated with negative trends, as voice is the basis for the political participation that the democratic system needs in order to work. In this sense, the most radical form of exit is emigration. Hirschman went as far as analyzing the effects of emigration in small states, but did not touch on the political or security implications of another kind of exit, the one that can be called “internal exit”. In this way, a logical theoretical development of Hirschman’s framework is the case when, instead of emigrating, people become, voluntarily or otherwise, isolated from the political, economic, cultural, and social systems of the country where they live. Most of the time, this results from a combination of multiple factors, although the most important seem to be political and economic alienation. In this case, the level of loyalty to the country’s macrostructures is negatively correlated with the population’s level of internal exit. Thus, it is also correlated with the level of influence of foreign narratives championing the interests of other countries. In the West, the latest financial crisis and the failure of neoliberalism and austerity to deliver economic growth and development resulted in more than just increasing support for


far-right parties; it also created four main narratives, which span the political spectrum from far-right to far-left (Starbird, 2017):

1. The alternative right: anti-mainstream media, pro-Christian, anti-LGBT, anti-feminist, anti-globalist, climate change denying, nationalist.
2. The alternative left: anti-mainstream media, anti-corporatist, critical of the police, anti-globalist or anti-New World Order/Cabal, conspiracy-focused, nationalist.
3. The white nationalist and/or anti-Semitic: primarily white-nationalist or anti-Semitic positions.
4. Russian multipolarity: supports Russian interests, anti-globalist, anti-U.S., anti-European Union.

The common denominator among them is anti-globalism and opposition to Western democratic political and social models. They are based on deep suspicion of free trade, multinational businesses, and global institutions. They are anti-mainstream media, anti-immigration, anti-science, anti-U.S. government, and anti-European Union. It is the combination of incompetent economic policies, unable to deliver economic growth and development, with the lack of alternatives that resulted in the growing popularity of populist political movements. The lack of understanding of or, worse, empathy about the effects of political choices on society, especially regarding economic policies, is a serious issue affecting the legitimization of democratic Western values and the sustainability of the Western political model. It is the reflection of the feeling that the government and politicians broke the social contract, are corrupted by the economic elite and financial institutions, and only care about their private interests. The level of exit in Western society nowadays has no precedent. According to the 2017 Edelman Trust Barometer, 71% of


the survey respondents said government officials and regulators are not at all or only somewhat credible; 63% answered the same about CEOs, while only 52% trust businesses to do what is right. More than three-quarters of respondents among both the informed and the general population agree that the system is biased against regular people and favors the rich and powerful, and nearly half of the adults (aged 25 to 64) with a college education and consuming large amounts of media have lost faith in the system. For the first time in 17 years, there was a combined decline of trust in business, media, government, and NGOs. The process of globalization and technological development has disrupted the traditional forms of social reproduction. Firstly, by relocating production and eliminating traditional forms of labor. Since the establishment of industrial capitalism in the 19th century, the social norm has been that people work to get a wage, form a family, and raise children. The children would have a similar job to their parents’ or would study to have a better life. This worked for generations. Although this may still be true in certain parts of the world, in the West the process of productive reorganization of the last 40 years has resulted in many professions becoming irrelevant from a purely economic perspective. At the same time, getting a Bachelor’s or even a Master’s degree is no longer a guarantee of finding a good job. The youth are very much affected by structural underemployment or unemployment. The perception that immigration is a threat to locals’ ability to find work exists, although it is debatable whether it has any real foundation. Secondly, the process of post-modernization of Western society has resulted in an increased feeling of disconnection and nostalgia for an idealized past where norms and values were clear. In other words, the future is not what it used to be. Uncertainty becomes the norm, and nostalgia for an imagined golden past its expression.


Recent examples are Brexit, the increasing popularity of right-wing anti-democratic politicians in Europe, and the popularity of anti-Western narratives within the West. The result is a very fertile field for influence operations.

The Need for Resilience

To integrate resilience into defense planning against information warfare, it is necessary to monitor the information environment and social resilience. It starts with a general strategic assessment to determine the possible attackers’ main objectives and strategic goals. After this initial analysis, the concept of “resilience to information warfare” must be operationalized by constantly performing Target Audience Analysis (TAA) to understand influence triggers within the society. This includes the analysis of the cognitive processes which determine the level of exit within the society or within a specific population, including but not limited to the willingness to defend the country, trust in state institutions, trust in the political system and the judiciary, and perceptions of corruption, just to cite a few. This gives the opportunity to understand how the attacker might try to weaken or destroy the independence of a country’s regime or political system. At the same time, it provides the basis for choosing the right way to communicate with society: establishing an appealing narrative that increases the awareness of the population and creates an attitude of resistance, while delegitimizing the attacker’s objectives. It enables the government to strengthen the weak points and to reassure the strongest parts of society in order to minimize foreign influence. It is also important to develop a system to monitor social media, while at the same time holding social media providers responsible for removing fake news as quickly as possible, as is done in Germany. The main task is to


decrease the gap between governments and societies and to counter the notion that the political class has broken the social contract, which is the main vulnerability that can be used as leverage by adversaries. Therefore, the main points to neutralize foreign influence should be focused on reducing the gap between the government, civil servants and politicians, and the population. Some concrete measures include:

1. Constantly performing Target Audience Analysis (TAA) to monitor the population’s level of openness to influence, and the issues and weak points that might be exploited (see the illustrative sketch following these measures). It is therefore necessary to develop this capability or to outsource it to competent companies;
2. Enhancing the critical thinking of the population, government officials, and politicians. In the current era, when everything and nothing are true at the same time, the use of public relations by the government and politicians to try to neutralize the effects of bad political choices, policy failures, or plain incompetence has turned out to be a plague. When people have no real-life contact with the issue in question, this sort of public relations strategy might work, although with limitations, because of the reaction of the press and the word-of-mouth spread of information. Nevertheless, the part of the population directly affected in real life will become convinced that the government or the politicians are lying. Therefore, such public relations is to be avoided, and accepting criticism and openly discussing failures and problems with the public is fundamental, even if counterintuitive;
3. In case of suspicion of foreign influence operations, the government must seize the initiative with a pre-emptive attack. This means informing the population that such an operation may be ongoing, and explaining the attacker’s interests, motivations, and objectives;


4. Creating direct channels of communication between the government and the population. It is fundamental that the government explain the logic and motivations determining public policy. This can be done by using social media platforms such as YouTube and other means of direct communication, including promoting discussions at the local level, such as conferences and seminars;
5. Increasing the direct participation of society in formulating public policy. This has to be done on two levels. First, at the micro level, i.e., the policies directly affecting individuals, a community, a neighborhood, or a minority. Second, at the macro level, i.e., the policies affecting the country as a whole, such as educational policy, health policy, etc.

It is not rare, at least in the West, that politicians and government officials will use populism to justify policy choices against the public will and/or contrary to what they promised during electoral campaigns. The result is people losing faith in the political and economic elites, in the government, and in public institutions. This significantly increases the margin of success of influence operations.
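To make the first measure more concrete, the sketch below shows one hypothetical way a monitoring team might aggregate survey indicators of the kind mentioned in this chapter (willingness to defend the country, trust in state institutions, perceived corruption) into a simple per-segment “internal exit” index and flag the audience segments most open to foreign influence. This is a minimal illustration under assumed inputs: the segment names, indicator labels, weights, threshold, and scores are invented for demonstration and are not part of the TAA methodology described here.

# Hypothetical sketch: aggregating survey indicators into a per-segment
# "internal exit" index. All names, weights, scores, and the threshold are
# illustrative assumptions, not an established TAA model.

# Survey scores per audience segment, normalised to 0-1
# (1 = high trust / high willingness, 0 = low).
SEGMENTS = {
    "urban_youth":      {"willing_to_defend": 0.55, "trust_institutions": 0.40, "perceived_low_corruption": 0.35},
    "rural_pensioners": {"willing_to_defend": 0.70, "trust_institutions": 0.60, "perceived_low_corruption": 0.50},
    "minority_speakers": {"willing_to_defend": 0.45, "trust_institutions": 0.30, "perceived_low_corruption": 0.30},
}

# Assumed weights: how strongly each indicator is taken to signal internal exit.
WEIGHTS = {"willing_to_defend": 0.4, "trust_institutions": 0.4, "perceived_low_corruption": 0.2}

EXIT_THRESHOLD = 0.55  # assumed cut-off above which a segment is flagged for attention


def exit_index(indicators: dict) -> float:
    """Weighted internal-exit score: 0 = fully engaged, 1 = fully exited."""
    return sum(WEIGHTS[name] * (1.0 - value) for name, value in indicators.items())


def flag_segments(segments: dict) -> list:
    """Return segments whose exit index exceeds the threshold, most receptive first."""
    scored = [(name, exit_index(ind)) for name, ind in segments.items()]
    return sorted([s for s in scored if s[1] > EXIT_THRESHOLD], key=lambda s: -s[1])


if __name__ == "__main__":
    for name, score in flag_segments(SEGMENTS):
        print(f"{name}: exit index {score:.2f} exceeds threshold {EXIT_THRESHOLD}")

In practice, any such index would have to be grounded in the TAA research process the author describes, with indicators and weights derived from actual social science fieldwork rather than fixed by hand as in this sketch.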

References

Chekinov, S. G. and S. G. Bogdanov. 2010. “Asimmetrichnyye deystviya po obespecheniyu voyennoy bezopasnosti Rossii” [Asymmetrical Actions to Ensure Russia’s Military Security]. Voennaia Mysl. 13–22.
Nagorny, Alexander and Shurygin, Vadim. 2013. “Defense Reform as an Integral Part of a Security Conception for the Russian Federation: A Systemic and Dynamic Evaluation.” Moscow: Izborsky Club. http://www.dynacon.ru/content/articles/1085/
Starbird, Kate. 2017. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on


Twitter.” Proceedings of the 11th International AAAI Conference on Web and Social Media (ICWSM 2017). https://www.aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15603/14812
Vorobyov, I. N. and Kiselyov, V. A. 2014. “Strategii sokrusheniia i izmora v novom oblike” [Strategies of Destruction and Attrition in a New Version]. Voennaia Mysl (3). 45–57.
Yamaguchi, Noboru. 2012. “An Unexpected Encounter with Hybrid Warfare: The Japanese Experience in North China, 1937–1945” in Murray, W. and P. R. Mansoor (eds.). Hybrid Warfare: Fighting Complex Opponents from the Ancient World to the Present. New York: Cambridge University Press.


CHAPTER 8

WHAT CAN WE LEARN FROM RUSSIAN HOSTILE INFORMATION OPERATIONS IN EUROPE?

JAKUB JANDA AND VERONIKA VÍCHOVÁ

Subversive influence operations by foreign countries have been used for centuries as a means of soft power. Differences between countries and continents chiefly reside in the scale of operations and the variety of tactics and tools used. Today, Russian influence operations practiced in Europe — at least since 2014, following Russia’s invasion of Ukraine — constitute the testing ground of a new era. This era will be defined by states deploying subversive influence operations primarily via new information and cyber technologies in order to achieve their foreign policy objectives. The precise


combination of online and offline tools and levels of secrecy and aggressiveness will always be specifically tailored to match, and undermine, the resilience of target countries.

Why are States Using the Hostile Foreign Influence Toolkit?

First, let us address the question of motivation: why are states choosing to deploy this toolkit? Typically, the objective of a war is to alter the political behavior of your adversary, to paraphrase military theorist Carl von Clausewitz. Specific aims can include military surrender, territorial handover, or the prevention of specific international activities, from trade to coalition-building. However, open war waged by conventional military means is expensive financially as well as politically — invading a country without legitimate grounds that are internationally recognized undermines any claim to the instigator’s moral high ground. For this reason, some countries resort to the development and use of a clandestine foreign influence toolkit. Such methods of hostile influence are generally much cheaper both in financial and political terms. Moreover, they are often deniable, preventing the loss of international credibility. As Sun Tzu argued, it is the supreme art of war to subdue the enemy without fighting. In practical terms, the Russian Federation deploys its hostile foreign influence toolkit in Europe (and in the West more broadly, including the U.S.) in order to effectuate ‘soft regime change’. Since the Russian government considers the prospect of Ukraine’s and Georgia’s modernization by way of EU association or membership a threat to its domestic legitimacy (i.e., where Ukrainians and Georgians would have a significantly higher quality of life than their culturally proximate Russians), Russia uses military occupation of sovereign territory in


Georgia (since 2008) and Ukraine (since 2014) to maintain ‘frozen conflicts’ and fuel internal struggles in those countries, thus effectively nullifying their chances for future NATO and EU membership. The Russian Federation increasingly prioritizes such aggressive and internationally illegitimate moves in its foreign policy strategy, and consequently seeks to mitigate the international pressure it faces from the West in retaliation (i.e., harsh economic and political sanctions, as well as international isolation). This is the core reason for Russia’s hostile influence efforts in Europe: to undermine Western pushback against its military actions and aggression vis-à-vis its neighbors in the post-Soviet space (Janda, 2017b). If Russia can manipulate public opinion to some extent through the targeted dissemination of disinformation, as well as (in)directly support individual pro-Russian political representatives (e.g., Marine Le Pen, whose National Front party received Russian funding) (Gatehouse, 2017), Moscow can increase sympathy for its own causes within certain segments of European and other key Western countries. The ascendance of pro-Russian political leaders in European states would lead to reduced international pressure on Russia for its occupation of neighboring countries such as Ukraine and Georgia. This is the chief rationale for Russia’s deployment of subversive influence efforts in Europe.

What are the Features of Russia’s Hostile Foreign Influence Toolkit?

The toolkit — like any other policy or military operation — is always tailored to its target environment, and is based upon exploiting local vulnerabilities, rifts, and predilections. In France, for example, Russian disinformation and influence narratives refer to the historical cultural proximity of French and Russian elites. By contrast, such a strategy would never work in Poland, which has been at odds with


Moscow for centuries. In the Baltic states, notably Latvia, Moscow exploits the domestic grievances of the large Russian-speaking minority to undermine its loyalty to the state (Berzina et al., 2016) and utilizes them as a springboard for intelligence operations. In Germany, Russia buys geopolitical influence through energy projects such as Nord Stream 2, an unworkable strategy in Eastern European countries that are familiar with Moscow’s habitual energy bullying to exert geopolitical influence (Public Appeal of security experts from E.U. member states, 2018). With respect to Russia’s hostile influence toolkit, we can identify seven specific measures:

1. Intelligence operations: ranging from the extraction of sensitive information to “active measures” — i.e., proactive influence and/or manipulation efforts through physical engagement on the ground or the use of proxies in cyber activities.
2. Disinformation operations — in military terms, what are called information operations, involving gray propaganda (disseminating information without disclosing a source) or black propaganda (posing as a source that is not genuine).
3. Exploitation of political allies — seeking friendly or sympathetic politicians in target countries. Most of these relations are legal and legitimate; however, their deepening into financial or intelligence connections with a non-allied or indeed hostile state can render a politician a “Trojan horse” (Polyakova et al., 2016) who serves as a power base for an adversarial foreign power.
4. Exploitation of radical and extremist groups — seeking to cultivate, support, and exploit the disruptive energy of groups that subscribe to extremist ideologies. In Europe, for example, such groups often form pro-Russian paramilitary extremist entities that Russian intelligence agencies partly employ (Janda et al., 2017) in their operations to terrorize certain Ukrainian territories.


5. Economic operations with political goals — the use of proxy or direct operations, posing as regular business entities, to blackmail or gain influence over sensitive sectors of a target state, or to use these operations as a springboard for other influence activities.
6. Exploitation of linguistic, religious, or ethnic minorities — select individuals or groups within vulnerable minorities can be used for intelligence purposes or as a simple foreign policy tool for the justification of hostile measures (e.g., military moves rationalized on grounds of “protecting” these minorities). For example, Moscow has been relying on artificial “dangers” to Russian minorities in Europe to justify its aggressive behavior at least since World War Two.
7. Use of non-governmental organizations (NGOs) and government-organized non-governmental organizations (GONGOs) — while NGOs can often project influence into the political and media establishment of a target country, they are often used to build up an influence infrastructure (Vojtiskova et al., 2016). GONGOs are NGOs orchestrated by a government.

How are Hostile Actions Defined?

For a government to be able to properly implement any policy, precise definitions are essential. Therefore, beyond awareness of what an adversarial foreign power is doing on your soil, it is crucial to distinguish which particular actions are hostile. Defining “hostile” activities is essential not only for government agencies, so that they can filter out of all foreign influence measures (which are not all problematic per se) only those activities that should be countered. A clear definition of “hostile” activities also publicly legitimizes government countermeasures, and moreover indicates red lines to any foreign powers that may potentially engage in such activities. The first step is to define and publicly codify national


security interests — if any foreign actions contradict these interests, they may be deemed “hostile”. Vital national interests typically include territorial integrity, sovereignty (i.e., political independence and the right to national self-determination), and democratic constitutional order. Beyond these foundational interests, countries typically have additional unique national interests, such as energy independence, regional stability, or active participation in regional defense or security initiatives. The more specifically articulated these interests are, the easier it is for government agencies to protect them and qualify certain foreign activities as hostile. Without such a comprehensive and detailed process of government evaluation, decisive and exact defenses of national interests are unlikely to be successful. Based on the knowledge of how foreign powers operate in a given country and which of these activities the government considers hostile because they threaten national security interests, we can identify four main response areas within the whole-of-society strategy for countering hostile foreign influence:

1. Government reaction: the government acknowledges the threat of hostile foreign influence, conducts a detailed review of the toolkit that is being used, projects how the threat could evolve over the next five years (Policy shift overview: How the Czech Republic became one of the European leaders in countering Russian disinformation, 2017), reviews the specifications of national security interests (checking whether they still apply after auditing the expected incoming threat), and devises security and foreign policy countermeasures.1

1. See the detailed set of 50 known countermeasures used in Europe (Full-Scale Democratic Response to Hostile Disinformation Operations, 2016), and the set of 35 countermeasures for mitigating electoral interference (Janda, 2017a).


2. Confronting the content of disinformation operations: the government must empower and support the cultivation of civil society and relevant media, which will then organically function as self-defense mechanisms against hostile narratives and disinformation operations.
3. Exposing the background and networks of hostile foreign influence: the best weapon against foreign subversion is to publicly expose it and explain why it endangers the country’s key national security interests. Specialists across different sectors of civil society can be directly or indirectly supported by government institutions so that there is both a hard defense (government) and a softer defense infrastructure that actively seeks to expose the agents of hostile foreign influence.
4. Building resilience: the long-term goal is to limit vulnerabilities and grievances that may be exploited by a hostile foreign power, for example through efforts ranging from social policy programs to media literacy classes.

Recent Lessons Learnt from Europe: A Case Study of 2017 German and French Elections

While the Russian Federation has sought to destabilize Western democracies for several years, thus far, the transatlantic response has been anemic. Only a few states (e.g., the Baltics) have taken effective institutional steps to counter these attacks. Their warnings about Russia’s hostile intentions were long underappreciated by other Western European states. Today, Europe’s strongest democracies remain mostly ill-prepared to defend themselves against Moscow’s subversion efforts. In this context, the recent French presidential (April–May 2017) and German parliamentary (September 2017) elections provide a


particularly relevant example. Most post-mortem assessments have praised the resilience of the French elections against Russian disinformation and hacking. Specifically, there is a consensus that France demonstrated high levels of readiness and creativity in defending its electoral process, and thus successfully prevented the far-right (and pro-Kremlin) National Front from gaining power. In the German case, contrary to expectations, there appeared not to have been any targeted disinformation campaign, and no major hack or leak operations. However, the electoral success of the AfD — the first far-right party to reach the Bundestag in 70 years — should be borne in mind, not least due to the fact that it received a significant number of votes from Germany’s Russian-speaking minority (Meyer, 2017).2 Some assessments suggest that the Germans failed to act where the French succeeded. However, a closer analysis reveals a slightly different reality.

2. The AfD received only 12.6% of the German vote in the September 2017 elections. But when taken together with the votes cast for Die Linke, another party often promoted by pro-Kremlin media, the proportion rises to 21.8%.

Cyber-Security Precautions are the New Black

Both France and Germany had every reason to expect Russian meddling — the evidence of previous attempts all over Europe has been overwhelming since 2014. The intelligence services also expected the worst. As early as December 2016, the French cyber security agency, the Agence Nationale de la Sécurité des Systèmes d’Information (ANSSI), warned of cyberattacks on the presidential campaign and the presence of “fake news” in the media (France warns Russia against meddling in presidential election, 2017).


Likewise, in Germany, the head of the domestic security agency, the Bundesamt für Verfassungsschutz (BfV), publicly stated that the agency had detected recurrent email phishing attacks and expected more in the future (Dearden, 2017). These warnings were noteworthy, as state intelligence services rarely issue such public proclamations. Thus, given such unprecedented warnings, it would have been highly irresponsible for any political candidate not to take appropriate precautions. Emmanuel Macron wisely hired a team of IT experts to prepare his campaign for a potential attack. Meanwhile, Germany endeavored to address the cyber-security threat at the governmental level. By 2011, Germany had established new structures to improve and coordinate state security and the response to cyber threats. The government also began cooperating with the private sector and, in early 2017, brought in several thousand soldiers and civilians to protect the country’s infrastructure (Werkhauser, 2017).

A Last-Minute Fight Against Disinformation is Useless

Major tech companies also accepted responsibility, and sought to assist France and Germany in their fight against Russian disinformation efforts. Facebook and Google joined forces in France with their CrossCheck initiative. Furthermore, following political and public pressure, Facebook suspended around 30,000 French fake accounts connected to extremist propaganda or other illegal content (O’Reilly, 2017). In Germany, the tech companies were somewhat less willing to cooperate with the authorities. In April 2017, a new bill was introduced to fine social media platforms for failing to remove “fake news” and posts containing hate speech and other criminal content. However, the legislation only came into effect at the beginning of October 2017 (Lomas, 2017). Crucially,


Crucially, although these initiatives arguably sufficed to prevent a large-scale disinformation attack prior to the elections, they were established too late to counteract long-term Russian and domestic extremist efforts to undermine public trust in democratic institutions, traditional political parties, and the mainstream media. Unfortunately, neither the French nor the German government took steps to establish a permanent center or unit responsible for countering disinformation or engaging in strategic communications more broadly. Without an institutionalized governmental approach, based on identification of the key national security interests mentioned above, it is nearly impossible to mount a viable national defense against disinformation attacks and other forms of hostile influence. Other European countries are now learning this lesson — the Czech Republic and Finland, for example, are developing major institutional initiatives to this end.

It is Too Late to Get Angry Once You Get Attacked

The French administration shifted its stance after the new French President, Emmanuel Macron, began to name Russian disinformation publicly as a threat, and his administration is currently preparing policy and legislative moves on this issue. In some cases, however, French officials still refer to Russia as a partner, for example in the fight against terrorism (Raffarin, 2016). In contrast, the 2015 Annual BfV report published by the German Interior Ministry describes the Russian Federation as a major player responsible for espionage activities and cyberattacks against Germany (2015 Annual Report on the Protection of the Constitution, 2016). Of course, President Macron is not responsible for prior administrations' positions on Russia, but an examination of his pre-election program reveals that he also had little to say about Russia.


Before the cyber-attacks on his campaign, Macron was largely silent about the Kremlin threat. It was only after the attacks took place that he became significantly more vocal in his condemnation of the Kremlin's disinformation and influence efforts. Further, while some European leaders have called out Vladimir Putin on the issue of influence and meddling, concrete steps in response have been few and far between. France, for example, has not taken such steps (despite threatening to do so) since the beginning of the Macron presidency. If European politicians truly want to show the Kremlin that red lines have been crossed, they must use their power to lead a collective response. For example, a French doctrine for identifying and countering Russia's subversive activities, or a German initiative to stop the problematic Nord Stream 2 project, is urgently needed.

Russia Learnt its Lesson, Europe Should Too

Both French and German citizens have long been targets of pro-Kremlin disinformation and hostile influence efforts intended to undermine faith in the democratic process and the Euro-Atlantic alliance. These attacks did not start a few weeks prior to the elections; they have been underway at least since the annexation of Crimea in 2014. Every European country should now know that it is not exempt from this threat, and that it will not disappear anytime soon. Likewise, other Western democratic states must recognize their vulnerability and preventively address the hazard of hostile foreign influence. France was relatively fortunate in the outcome of its election. In the French case, the Kremlin used a similar modus operandi as in the U.S. 2016 presidential election: hacking and releasing emails and altered documents shortly before the election date.


Judging by the level of preparation, it is likely that if Russia had applied the same method in Germany, Germany would have been able to defend itself just as well. However, information warfare is flexible, creative, and offers many opportunities for evolution. Russia did not repeat the same mistake twice; likewise, Europe cannot afford to do so. It is essential for Western countries to wake up, move beyond mere discussions of the threat, and forge a formidable defense of their values. Information and experience sharing is essential; in particular, the Eastern European countries that have been resisting the Kremlin's aggression for far longer are key sources of expertise. All European states, not only France and Germany, must now commit to building relevant and sustainable institutional structures, cooperating with civil society and the private sector, and improving their resilience against this new breed of threats. Due to the interconnected nature of European infrastructure and security, international cooperation is essential to success. France and Germany now both have direct experience with disinformation and influence operations. They should be the leaders in identifying the lessons learnt, sharing them with other countries, and fighting to ensure that Europe succeeds in countering Russian hostile influence. Indeed, without decisive action on the part of the Western alliance to thwart the Kremlin's destructive agenda vis-à-vis the West, the situation will continue to deteriorate, with potentially disastrous consequences.

References

Berzina, Ieva, Berzins, Janis & Hirss, Martins. 2016. "The Possibility of Societal Destabilization in Latvia: Potential National Security Threats." National Defence Academy of Latvia Center for Security and Strategic Research. http://www.naa.mil.lv/~/media/NAA/AZPC/Publikacijas/WP%2004-2016-eng.ashx

Dearden, Lizzie. 2017. "German spy chief warns Russia cyber attacks aiming to influence elections." Independent, May 4, 2017. https://www.independent.co.uk/news/world/europe/germany-spy-chief-russian-cyber-attacks-russia-elections-influence-angela-merkel-putin-hans-georg-a7718006.html


Gatehouse, Gabriel. 2017. "Marine Le Pen: Who's funding France's far right?" BBC, April 3, 2017. https://www.bbc.com/news/world-europe-39478066

Janda, Jakub. 2017a. "A framework guide to tools for countering hostile foreign electoral interference." Kremlin Watch Report, May 11, 2017. http://www.europeanvalues.net/wp-content/uploads/2017/05/35-measures-in-15-steps-for-enhancing-the-resilience-of-the-democratic-electoral-process-1-1.pdf

Janda, Jakub. 2017b. "What Putin Really Wants — And Why We Cannot Give It to Him." Observer, January 25, 2017. http://observer.com/2017/01/russia-vladimir-putin-interests-opposite-united-states-donald-trump/

Janda, Jakub, Vejvodova, Petra & Vichova, Veronika. 2017. "The Russian connections of far-right and paramilitary organizations in the Czech Republic." In The activity of pro-Russian extremist groups in Central-Eastern Europe, April 28, 2017. http://www.politicalcapital.hu/news.php?article_read=1&article_id=933

Lomas, Natasha. 2017. "Germany's social media hate speech law is now in effect." TechCrunch, October 3, 2017. https://techcrunch.com/2017/10/02/germanys-social-media-hate-speech-law-is-now-in-effect/

Meyer, Henry. 2017. "Putin Has a Really Big Trojan Horse in Germany." Bloomberg, May 2, 2017. https://www.bloomberg.com/news/articles/2017-05-02/putin-s-trojan-horse-for-merkel-is-packed-with-russian-tv-fans

O'Reilly, Andrew. 2017. "France concerned over Russian interference in elections amid reports of hacking, fake news." Fox News, April 20, 2017. http://www.foxnews.com/world/2017/04/20/france-concerned-over-russian-interference-in-elections-amid-reports-hacking-fake-news.html

Polyakova, Alina, Laruelle, Marlene, Meister, Stefan & Barnett, Neil. 2016. "The Kremlin's Trojan Horses." Atlantic Council. http://www.atlanticcouncil.org/publications/reports/kremlin-trojan-horses

Raffarin, Jean-Pierre. 2017. "Rapport relatif à l'activité de la délégation parlementaire au renseignement pour l'année 2015." February 25, 2017. http://www.assemblee-nationale.fr/14/rap-off/i3524.asp


Vojtiskova, Vladka, Schmid-Schmidsfelden, Hubertus, Novotny, Vit & Potapova, Kristina. 2016. "The Bear in Sheep's Clothing: Russia's Government-Funded Organisations in the EU." Wilfried Martens Centre for European Studies, July 2016. https://www.martenscentre.eu/publications/bear-sheeps-clothing-russias-government-funded-organisations-eu

Werkhauser, Nina. 2017. "German army launches new cyber command." DW, April 1, 2017. https://www.dw.com/en/german-army-launches-new-cyber-command/a-38246517

"France warns Russia against meddling in presidential election." DW, May 15, 2017. https://www.dw.com/en/france-warns-russia-against-meddling-in-presidential-election/a-37572358

"Full-Scale Democratic Response to Hostile Disinformation Operations." European Values Think-Tank, June 20, 2016. http://www.europeanvalues.net/vyzkum/full-scale-democratic-response-hostile-disinformation-operations/

"Policy shift overview: How the Czech Republic became one of the European leaders in countering Russian disinformation." Kremlin Watch Report, May 10, 2017. http://www.europeanvalues.net/wp-content/uploads/2017/05/Policy-shift-overview-How-the-Czech-Republic-became-one-of-the-European-leaders-in-countering-Russian-disinformation-1.pdf

"Public Appeal of security experts from EU member states: 6 reasons Nord Stream 2 will be Germany's strategic mistake for decades to come." European Values Think-Tank, 2018. http://www.europeanvalues.net/nordstream/

"2015 Annual Report on the Protection of the Constitution." Federal Ministry of the Interior, June 28, 2016. https://www.verfassungsschutz.de/embed/annual-report-2015-summary.pdf


Chapter 9

HOW GERMANY IS TRYING TO COUNTER ONLINE DISINFORMATION

KAROLIN SCHWARZ

The alleged role of disinformation and false news in the Trump election victory in 2016 has stoked fears that something similar might happen in Germany. After all, the German general elections took place just about a year after Trump was elected. This article examines the state of false news and disinformation in Germany, and sheds light on the initiatives that have been rolled out in response.

Overview


Even though the Trump election victory sparked debates surrounding disinformation in Germany, hoaxes and fake news targeting German-speaking social media users had in fact been disseminated long before Donald Trump even announced his 2016 presidential campaign. For several years, false claims about Muslims have been making the rounds on the Internet. For instance, rumors about German "Christmas markets" being renamed "winter markets" to spare the feelings of the Muslim population in Germany have been circulating since at least 2014. Even though these allegations were debunked,1 the stories still find their way onto social media platforms every December. False claims about Eastern Europeans trying to abduct children using white vans have been around longer than most social media platforms, but are now also being spread online.

1. While it is true that some Christmas markets have been renamed "winter markets", the reason behind this is that they are often kept open until after Christmas.

The Lisa Case

In January 2016, a 13-year-old Russian-German girl claimed to have been abducted and raped by three men whose German she described as "not very good". She told the police that the men were of "southern" origin, a term used to describe non-white people from North African, Arab, and even southern European countries. Social media users were quick to blame refugees, since Germany had seen several hundred thousand of them arrive in the previous months. Even though the police released a statement just a few days after the incident became public, clarifying that her allegations had been made up, the report about Lisa's abduction led to demonstrations by Russian-Germans in several German cities.


Several far-right, anti-Muslim groups and actors joined these demonstrations, and used the case for online campaigns against refugees and Germany's asylum policies. Russian foreign minister Sergei Lavrov accused Germany of covering up criminal acts, and dismissed the police statement revealing that her accusations had been made up. This incident has been referenced widely as one of the biggest fake news cases Germany has seen.

Methods and Motives

Compared to the U.S., Germany has seen fewer fake news sites dedicated to spreading false information online. While the spectrum of "alternative" or hyper-partisan media in Germany is quite broad, news that is completely made up accounts for only a part of their content. Many sites typically take real events and fabricate some of the details. In March 2018, one notorious website claimed that the Tafel (a volunteer-based organization in Germany distributing food to people in need) in Essen was no longer taking in new foreigners because fewer Germans had been coming there. The website also claimed that Angela Merkel had reacted to the news saying: "We invited these people, so we should serve them first" (Laut Merkel ist, 2018). While the Tafel was indeed no longer accepting applications from foreigners, the Merkel quote was fabricated. Nonetheless, this fabrication made it the most widely shared article about the Essen Tafel in Germany, dwarfing even pieces published by established news outlets. As this example suggests, asylum policies, (perceived) foreigners, and Islam are some of the most, if not the most, important topics for people spreading disinformation online.


Aside from hyper-partisan news websites and blogs, most disinformation is posted directly on social media platforms such as Facebook, Twitter, and WhatsApp. Many fabricated stories about refugees or non-white people committing crimes are posted on Facebook or Twitter as alleged eyewitness accounts from people who claim to have witnessed a crime or to have heard about it from a trusted friend. These stories are often accompanied by claims that the police or other actors are not allowed to speak publicly about what happened. Some of these stories are shared thousands of times before being refuted by the media or the police. Taking photos or videos out of context is another very popular method of illustrating false claims. Most of these media are not digitally manipulated but simply put in a new context. A selfie that a young Syrian refugee named Anas Modamani had taken with Angela Merkel was used to suggest he was a terrorist connected to the 2016 airport attack in Brussels as well as to the attack on the Christmas market in Berlin later that year. Modamani even sued Facebook — unsuccessfully — to make the company establish a filter preventing his photo from being uploaded to the platform. Fabricated quotes, like the one about the Essen Tafel attributed to Angela Merkel, are a popular method too. Most fake quotes are shared as images accompanied by a photo of the person, mostly politicians, alleged to have said something outrageous. Several high-ranking German politicians have filed complaints over false quotes, among them former Minister of Food and Agriculture Renate Künast (The Greens) and Martin Schulz (SPD), the Social Democrats' candidate for chancellorship in the 2017 elections. Both complaints target the same blogger, who published articles suggesting that Künast had defended a refugee who murdered a 19-year-old female student and that Schulz wanted to build re-education camps for supporters of Germany's right-wing populist party, the Alternative for Germany (AfD).


Sometimes, photos of forged documents are spread on social media platforms. Examples range from letters from institutions requesting tenants to make room for refugees in their homes to even more bizarre documents, such as fake coupons for brothels allegedly being handed out to refugees. A few of the culprits may be driven by financial motives. Some hyper-partisan websites have web shops, selling everything from stickers and t-shirts to weapons imported illegally from Hungary (Biermann and Polke-Majewski, 2017). Most actors engaging in online disinformation, however, seem to be politically motivated. They campaign against migration and asylum policies, and hail politicians and state leaders who implement strict migration policies, like Hungary's Viktor Orbán. While they might not aim to swing elections, they do cater to racist and anti-Islam sentiments, and foster distrust of public institutions, politicians, and the police. Trolling is another motive. In breaking news situations, such as terror attacks or shootings, trolls spread fake photos of alleged perpetrators or missing persons simply to harm people or spread fear.

Initiatives

As already established, the public debate on fake news and its potential to influence elections came right after the U.S. elections. Afterwards, several fact-checking initiatives were launched in Germany. Two initiatives dedicated to countering online hoaxes and disinformation had already been established before: Mimikama, an Austrian website debunking bogus virus warnings, fake lotteries, and politically motivated disinformation; and Hoaxmap, a website mapping false claims about refugees and non-white people in German-speaking countries, initiated by the author.


Hoaxmap

Hoaxmap.org was launched in February 2016, initially mapping 176 false stories about asylum seekers in Germany. Each false story is placed on a map, accompanied by the date, a link to an article refuting it, and a category describing the content of the story. Most of these fabricated stories are related to crimes, mostly theft and robbery, sexualized violence, or excessive social welfare. Claims about asylum seekers receiving free iPhones when moving into shelters are not uncommon at all. Neither are false claims about supermarkets closing down as a result of shoplifting refugees. Hoaxmap aims to create a database documenting the fake news phenomenon, and is a resource for people confronted with these claims on social media platforms and offline. It also benefits journalists, as well as educators who use Hoaxmap as a teaching resource. Like Mimikama, Hoaxmap is a volunteer project. Both projects rely heavily on crowdsourced information. People who are confronted with questionable content on social media usually approach the initiatives to ask about suspected fakes they have seen online or — in the case of Hoaxmap — to share information about fake stories that have been refuted by other, mostly local, media. Since February 2016, Hoaxmap has gathered 491 false stories from Germany, Austria, and Switzerland. These countries were added to the map at the request of Hoaxmap users. A group of journalists has also created Huhumylly, a Finnish version of Hoaxmap.

Fact-Checking Initiatives

In 2017, Germany saw the formation of a handful of fact-checking initiatives. Correctiv.org's Echtjetzt was launched in April 2017.


The non-profit media team, which the author was part of, is the only fact-checking initiative partnering with Facebook in Germany. The team, which was initially made up of four fact-checkers, dedicates its work to checking viral stories on social media platforms as well as claims made by politicians. Since Echtjetzt is still Facebook's only German fact-checking partner, and Facebook used to require two fact-checks from two independent organizations, disinformation was not flagged for German Facebook users until Facebook changed its method of warning users about fake news. At first, Facebook displayed warning signs and required users to confirm twice before posting a link to a fake news article. Users could still access those links, but a warning sign was displayed below them. Now, however, fact-checks are displayed as related articles below posts. In September 2017, Echtjetzt partnered with Firstdraft, a project dedicated to fighting misinformation online, and others, in order to intensify their work in the weeks before the general elections, publishing a newsletter each day. After the elections, an Echtjetzt analysis found that most disinformation with political content did not have a significant impact on the outcome, and that social bots were seldom used to amplify certain stories. However, there were a few stories that went viral, as well as unaccounted-for fake news shared within small audiences (Kramm, 2017). Several studies have confirmed this finding, suggesting that disinformation contributes to shaping narratives about asylum policies and shifts public debate away from other topics (Sangerlaub, 2017). Researchers also argued that while only a few stories were spread in the weeks before the elections, all the infrastructure needed to influence public opinion is in place, and could be used, for example, during catastrophes or to further existing diplomatic tensions between two or more countries (Applebaum, 2017). Facebook has not yet provided much data showing the effects of its cooperation with fact-checkers (Levin, 2017).


The effectiveness of this cooperation is thus hard to measure. Most of Echtjetzt's articles investigate claims about migration policies and asylum seekers circulated in far-right circles. Articles about disinformation spread by the left are rare. Since asylum policies have been at the center of political debates since 2015, and right-wing populist politicians tend to focus on this topic, it comes as no surprise that refugees and migration policies would be targeted by disinformation as well. German public-service broadcasters also launched fact-checking initiatives, the biggest being ARD's Faktenfinder and ZDF's #ZDFcheck17. While Faktenfinder offers a mix of articles refuting online fake news, checks of politicians' statements, and educational content, #ZDFcheck17 focused on claims made by politicians. While ZDF's initiative was discontinued after the elections, Faktenfinder is still up and running, although, like Correctiv's Echtjetzt, it is now working with a smaller team. Another initiative is Stimmtdas.org, also dedicated to checking politicians' statements. Several smaller sites are dedicated to individual hyper-partisan outlets or political parties. While Germany has seen several attempts at establishing fact-checking, one cannot yet consider these initiatives sustainable. Without reliable funding, larger teams, and useful tools, these teams can — like most of their counterparts in other countries — only check a very small share of the fake news being distributed on social media.

Legal Situation

In 2017, Germany introduced the Netzwerkdurchsetzungsgesetz (network enforcement law). As of January 2018, social media platforms with more than two million German users may be fined up to EUR 5,000,000 if they fail to remove patently illegal content within 24 hours, or within seven days in cases in which the illegality of a comment or post is not immediately obvious.


While the law is considered an 'anti-fake news law' in Germany and internationally, only parts of it refer to criminal offences connected to fabricated news. Most of the existing criminal offences covered by the law are connected to online hate speech, including ethnic hatred, xenophobia, anti-Semitism, and incitement to violence, among others. The law was met with broad criticism from experts and non-governmental organizations (NGOs) from all over the political spectrum. While NGOs and civil society have certainly welcomed initiatives against the spread of hatred online, they were concerned that legal assessments would now be handed over to the likes of Facebook and Twitter, rather than being left to German courts. Many expressed a fear of over-blocking, that is, the wrongful removal of controversial but legal content by platforms seeking to avoid the risk of being fined. As of March 2018, no fines have been imposed on any social media network. Meanwhile, right-wing activists and nationalists who strongly opposed the new law have been trying to silence others by reporting the accounts of political opponents, mostly those of feminists (Mutzel, 2018). So far, only a few people have been convicted of spreading fake news online, most of them on charges of mass instigation, as they fabricated stories about refugees committing crimes and posted about them on social media. Furthermore, as of 2018, a lawsuit brought by Renate Künast and Martin Schulz against the propagators of the made-up quotes is still pending.

Conclusion


While false information surrounding health topics, miracle drugs, and computer viruses certainly circulates among German Internet users, most politically charged disinformation in Germany reproduces racist stereotypes and anti-Muslim sentiments. While disinformation might not necessarily cause a political party or candidate to win or lose, it certainly contributes to polarizing societies and generates distrust in public institutions, lawmakers, and the police. Platforms should accept that it is their responsibility to delete illegal content. In the longer run, however, the phenomenon of online disinformation cannot be countered by blocking or removing content alone. As we have seen in the past, manipulators might simply move on to other social networks, create new accounts, or find new methods to spread false claims. Since disinformation in Germany is different from the fake news being spread in the U.S., possible solutions might not be the same for these or any other two countries. However, some narratives are quite similar, even if they are presented differently on social media. For example, disinformation about refugees in Germany and Europe is spread not only there but worldwide, in nationalist circles in America and Russia alike. Many of these stories cross national borders and language barriers. Countering disinformation is a concern for most countries worldwide. While there are local characteristics to disinformation campaigns, actors engaging in online disinformation certainly learn from their counterparts in other countries. This is why certain findings from long-running fact-checking organizations such as Snopes, PolitiFact, or Africa Check can and should be used in countering fake news online, even if some adjustments need to be made to fit local contexts. This also applies to best practices in media literacy education. Institutions should also be aware of their role in countering misinformation.


While it is unusual for them to take up the role of fact-checkers — a term mostly reserved for journalists — they should realize that answering viral rumors and fake news as quickly and accurately as possible helps stop their circulation. Countering disinformation is a challenge facing teachers and educators just as much as journalists. Disinformation does not stop at national borders, and neither should the initiatives fighting it.

References

Applebaum, Anne, Pomerantsev, Peter, Smith, Melanie & Colliver, Chloe. 2017. "Make Germany Great Again: Kremlin, Alt-Right and International Influences in the 2017 German Elections." London: Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2017/12/Make-Germany-Great-Again-ENG-061217.pdf

Biermann, Kai and Polke-Majewski, Karsten. 2017. "Rechter Waffenshop ist offline" [Right-wing weapons shop is offline]. Zeit, February 2, 2017. https://www.zeit.de/gesellschaft/zeitgeschehen/2017-02/migrantenschreck-illegale-waffen-website-offline

Huhumylly. https://huhumylly.info/

Kramm, Jutta. 2017. "Den Fake News keine Chance" [No chance for the fake news]. Correctiv, September 25, 2017. https://correctiv.org/echtjetzt/artikel/2017/09/25/wahlcheck17-zieht-bilanz-den-fake-news-keine-chance/

Levin, Sam. 2017. "'Way too little, way too late': Facebook's fact-checkers say effort is failing." The Guardian, November 13, 2017. https://www.theguardian.com/technology/2017/nov/13/way-too-little-way-too-late-facebooks-fact-checkers-say-effort-is-failing

Mutzel, Daniel. 2018. "Trolle und rechte Aktivisten verbünden sich auf pr0gramm, Discord und Twitter, um das Gesetz gegen Hassrede in sein Gegenteil zu verkehren: ein Denunziations-Tool gegen Linke, Frauen und Migranten" [Trolls and right-wing activists are joining forces on pr0gramm, Discord and Twitter to turn the law against hate speech into its opposite: a denunciation tool against leftists, women and migrants]. Motherboard, January 12, 2018. https://motherboard.vice.com/de/article/kznxz3/vom-netzdg-zum-hetzdg-wie-trolle-das-maas-gesetz-nutzen-um-politische-gegner-mundtot-zu-machen


Sangerlaub, Alexander. 2017. "Verzerrte Realitäten: 'Fake News' im Schatten der USA und der Bundestagswahl" [Distorted realities: "fake news" in the shadow of the U.S. and the Bundestag election]. Stiftung Neue Verantwortung. https://www.stiftung-nv.de/de/publikation/verzerrte-realitaeten-fake-news-im-schatten-der-usa-und-der-bundestagswahl

"Laut Merkel ist Flüchtlingen bei den Tafeln unbedingter Vorrang zu geben — Wir luden sie ein. Die Tafel Essen" [According to Merkel, refugees are to be given absolute priority at the food banks — we invited them. The Essen Tafel]. Allgemein, February 27, 2018. https://web.archive.org/web/20180726005120/https://blog.halle-leaks.de/2018/02/laut-merkel-ist-fluechtlingen-bei-den-tafeln-unbedingter-vorrang-zu-geben-wir-luden-sie-ein/


Chapter 10

DISTINGUISHING FACT FROM FICTION IN THE MODERN AGE

ANDREAS SCHLEICHER

Educating People for a Post-Truth World

This article examines the kinds of knowledge, skills, and character qualities that people need to thrive in the post-truth information age, and how school systems can prepare for it.

The Post-Truth Information Age

Post-truth is a word on everyone's lips — it was named word of the year for 2016 by Oxford Dictionaries. It is not exactly a new phenomenon, but the speed, volume, and reach of information flows in the current digital ecosystem have hugely increased its magnitude.


This confluence of factors has created the perfect conditions for fake news to thrive, affecting public opinion and political choices. In the current environment, virality seems to be privileged over quality in the distribution of news — the consequences of which are now being manifested politically. Combined with low levels of public trust in "experts", we are now facing the uncomfortable reality that truth and fact are losing currency in decision-making and in democratic choices. Assertions which "feel right" but have no basis in fact seem to be accepted as valid, on the grounds that they challenge the elite and vested interests. Reality has become fungible. Post-truth may have more to do with consumers of lies than with their producers. Algorithms that sort us into groups of like-minded individuals create social media echo chambers, contributing to siloization. They create virtual bubbles that homogenize opinions, insulating people from divergent views that would challenge their own truths and beliefs. The biggest asset in life now is getting people's attention. There is a scarcity of attention, but an abundance of information. And those algorithms are not a design flaw; they are at the heart of how social media works. Digitalization is connecting people, cities, countries, and continents in ways that vastly increase our individual and collective potential. Any one of us can now play a part in changing the world, for better or worse. Together, we can address the world's biggest challenges. However, digitalization has also made the world more complex, more volatile, and more uncertain. Digitalization can be incredibly democratizing — we can connect and collaborate with anyone — but it can also concentrate incredible power; consider Amazon and Google. Digitalization can be particularizing — the smallest voice can be heard everywhere — but it can also be homogenizing, squashing individuality and cultural uniqueness.


Digitalization can be empowering; consider the role it plays in the birth of new entrepreneurs and start-ups. However, it can also be disempowering when we consider that we may not be able to follow the reasoning of the machines and algorithms playing ever more important roles in our lives. This is the age of accelerations, a speeding-up of human experience through the compound impact of disruptive forces on every aspect of our lives. It is also a time of political contestation. The priority of the wider international community is to reconcile the needs and interests of individuals, communities, and nations within an equitable framework based on open borders, free markets, and a sustainable future. However, where disruption has brought a sense of dislocation, political forces emerge that offer closed borders, the protection of traditional jobs, and a promise to put the interests of today's generation over those of the future. The fake news phenomenon can significantly amplify those forces. The question, then, is how we can live successfully in this new world of information. Should we approach the issue from a consumer protection angle, working on the supply side, or from a skills angle, strengthening the capacity of people to better navigate information? From the "consumer protection" point of view, we do not treat knowledge in the same way that we address consumer protection issues with physical products. People sue McDonald's on account of their health issues or obesity, or Starbucks when they burn themselves with hot coffee. However, it seems very hard to do anything against those who produce and deliver fake news, because tinkering with free speech tends to be felt as an immediate threat to democratic principles. The result is that the market for information remains totally unregulated. Can and should we place certain constraints on the behavior and pronouncements of the influential and powerful?


Can and should we introduce more robust standards for our gatekeepers, the journalists, who play such an important role in holding power to account? Has the time come to extend consumer protection to voters? And, if we do so, where would this restrict freedom of speech and creativity in knowledge creation? Transparency in political advertising in the social media space also merits closer attention, given its increasingly prevalent use. The degree and sophistication of the targeting techniques being deployed are astounding, and they are poorly understood by the majority of social media users. Business, too, has a role to play in placing more stringent conditions on where it advertises, and in prioritizing quality and credibility over clicks. It may well be, too, that the role of the social media platforms will need to evolve in recognition of the fact that they have, in effect, become publishers of news and information. They have a key role to play in stemming the flow of fake news, both by being more transparent about the algorithms being used and by adapting them in order to de-prioritize fake news content. There is clearly a need for trusted brands and trusted journalists, and markets will always gravitate towards these trusted brands.

Knowledge, Skills, and Character Qualities for the Post-Truth Information Age

The question is how far pushing action on the "supply side" alone will address the fake news phenomenon. It seems at least equally important to strengthen the capacity of people to navigate the new world of information. This is the educational dimension of the phenomenon.


Any treatment of the issue should begin with the most basic aspect of the educational side: literacy, the capacity of people to access, manage, integrate, evaluate, and reflect on written information. In the information age, the construct of literacy has evolved rapidly. The literacy of the 21st century is no longer about decoding prefabricated content in a linear way, but about constructing knowledge and questioning the established wisdom of our times. In the past, teachers could tell students to look information up in an encyclopedia, and to rely on that information being accurate and true. Nowadays, Google presents us with millions of answers to any question, and the task of readers is to triangulate, evaluate, and construct knowledge. This involves managing non-linear information structures and building our own mental representation of information as we find our way through hypertext on the Internet, dealing with ambiguity, and interpreting and resolving conflicting information. Between 2012 and 2014, the OECD Survey of Adult Skills tested the digital literacy skills of adults in the industrialized world. It is the first time that we have measured the skills of people not just in terms of their degrees or qualifications, but by testing directly to what extent they can navigate complex digital information. Those kinds of skills are not a safeguard against fake news, but they can be considered a necessary condition for recognizing it. The Survey of Adult Skills revealed large gaps in the most elementary information management skills. There are about 200 million workers in the industrialized world who do not read as well as one would expect of a 10-year-old child in terms of basic vocabulary, fluency, and comprehension of simple texts (Level 1 on the Survey of Adult Skills). Broadly speaking, these are individuals who would be considered more vulnerable to fake news. The results of the test also showed that just about one in 10 adults in the age group 55–64 years had moderate or advanced digital literacy skills (see Figure 10.1). As one would expect, digital literacy skills are significantly higher among younger adults.


Figure 10.1:  Skills to Manage Complex Digital Information (share of adults at Level 2 and Level 3, by country)
Source: Survey of Adult Skills (PIAAC) (2012, 2015), Table A3.8(L).

However, even among 16- to 24-year-olds, the share of people with at least moderate digital literacy skills remains below 50% on average across countries. Cross-country differences are important: the U.S. (the first country to invest in universal education) leads the world in the skill levels of older workers, but among young adults it trails far behind the cross-country average. This is not because skill levels declined (the U.S. has more young people than older people with at least moderate digital literacy skills), but because so many other countries advanced so much faster. Singapore provides a good example of a country where young people are much better skilled than their older counterparts — and better skilled, in fact, than their young counterparts in other countries. However, even in Singapore, one-third of young adults are not prepared for the flat world, possessing only Level 1 capability (see Figure 10.2).


Figure 10.2:  Share of Age Group Lacking Basic Skills (percentages for age groups 16–24, 25–54, and 55–65, by country)
Source: Survey of Adult Skills (PIAAC) (2012, 2015)

The post-truth information age demands far more from people than simply navigating written information. In these times, we can no longer teach people for a lifetime — education needs to provide people with a reliable compass and the navigational tools to find their own way through an increasingly complex and volatile world. As future jobs will pair computer intelligence with human knowledge, skills, character qualities, and values, it will be our capacity for innovation, our awareness, our ethical judgement, and our sense of responsibility that will equip us to harness machines to shape the world for the better, and to deal with the complex information sources that people face. This is the main conclusion drawn by OECD countries working on a new framework for curriculum design, referred to as 'Education 2030'.


Not surprisingly, then, schools increasingly recognize the need to foster ethics, character, and citizenship, aiming also to develop a range of social and emotional skills, such as empathy, compassion, mindfulness, purposefulness, responsibility, collaboration, and self-regulation. In our Education 2030 framework for curriculum design, OECD countries have put creating new value, dealing with tensions and dilemmas, and developing responsibility at the center. Creating new value, as a transformative competency, connotes processes of creating, making, bringing into being, and formulating, and outcomes that are innovative, fresh, and original, contributing something of intrinsic positive worth. It suggests entrepreneurialism in the broader sense of being ready to venture and to try without anxiety about failure. The constructs that underpin this competency are imagination, inquisitiveness, persistence, collaboration, and self-discipline. Young people's agency to shape the future will partly hinge on their capacity to create new value. In a structurally imbalanced world, the imperative of reconciling diverse perspectives and interests, in local settings with sometimes global implications, will require young people to become adept in handling tensions, dilemmas, and trade-offs. Striking the balance, in specific circumstances, between competing demands — of equity and freedom, autonomy and community, innovation and continuity, and efficiency and democratic process — will rarely lead to an either/or choice or even a single solution. Individuals will need to think in a more integrated way that avoids premature conclusions and attends to interconnections. The constructs that underpin this competency include empathy, adaptability, and trust. The third transformative competency is a prerequisite of the other two. Dealing with novelty, change, diversity, and ambiguity assumes that individuals can 'think for themselves' with a robust moral compass. Again, this seems vital for dealing with information in a post-truth world.


Equally, creativity and problem-solving require the capacity to consider the future consequences of one's actions, to evaluate risk and reward, and to accept accountability for the products of one's work. This suggests a sense of responsibility, and the moral and intellectual maturity with which a person can reflect upon and evaluate their actions in the light of their experiences, their personal and societal goals, what they have been taught and told, and what is right or wrong. The perception and assessment of what is right or wrong, good or bad, in a specific situation is about ethics. It implies asking questions related to norms, values, meanings, and limits. Central to this competency is the concept of self-regulation in the spheres of personal, interpersonal, and social responsibility, drawing on constructs of self-control, self-efficacy, responsibility, problem-solving, and adaptability.

Educating for the Post-Truth Information Age

So, what does all of this imply for the people and institutions charged with developing the knowledge, skills, and character qualities for the post-truth information age? In the past, it was sufficient for education to sort students, because our economies and societies could rely on a few highly educated individuals. In the post-truth world of complex information, everyone needs fairly high skill levels, not just for economic but also for social participation. In traditional bureaucratic school systems, teachers are left alone in classrooms with prescriptions on what to teach. Today, teachers and schools need to look outwards to collaborate with the next teacher and the next school. The past was about delivered wisdom; the future is about user-generated wisdom. The past was divided: we could afford for teachers and content to be divided by subjects and student destinations.


The past was also isolated: schools were designed to keep students inside, and the rest of the world outside. The future needs to be integrated. This means emphasizing the integration of subjects, the integration of students, and the integration of learning contexts. The future also needs to be connected with real-world contexts, and permeable to the rich resources in the community. Instruction in the past was subject-based; instruction in the future will be project-based. In the past, different students were taught in similar ways. Now, we need to embrace diversity with differentiated pedagogical practices. The past was curriculum-centered; the future is learner-centered. The goals of the past were standardization and compliance; that is, students were educated in batches by age, following the same standard curriculum, all assessed at the same time. The future is about personalizing educational experiences: building instruction from student passions and capacities, and helping students personalize their learning and assessment in ways that foster engagement and talent. In the past, schools were technological islands: technology was deployed mostly to support existing practices for efficiency gains. Future schools will be empowered, and will use the potential of technologies to liberate learning from past conventions and connect learners in new and powerful ways. The past was interactive; the future is participative. The future is also about more innovative partnerships. Isolation in a world of complex learning systems will seriously limit potential. Powerful learning environments are constantly creating synergies and finding new ways to enhance professional, social, and cultural capital with others. They do that with families and communities, with higher education, with other schools and learning environments, and with businesses. We still have far too few innovators and game-changers in education. We need to find better ways to recognize, reward, and give exposure to their successes. All this has profound implications for teaching and teachers.


While the past was about prescription, the future requires an informed profession, and the replacement of the industrial work organization in education (with its administrative control and accountability) with much more professional and collaborative working norms. Only when teachers feel a sense of ownership over their classrooms, and students feel a sense of ownership over their learning, can learning for the post-truth information age take place. The past was hierarchical: students were recipients and teachers the dominant resource. However, the future will be co-created. That means we need to recognize both students and adults as resources for the co-creation of communities, for the design of learning, and for the success of students. The future also needs to be collaborative, and that means changing working norms. Expressed differently, we are seeing a shift from a world where knowledge stacked up somewhere depreciates rapidly in value to a world in which the enriching power of collaboration is rising. The sheer pace of change is the central reason why teachers' ownership of the profession is a must-have rather than an optional extra. Even the most effective attempts to push a government-established curriculum into classroom practice will drag out over a decade, simply because it takes so much time to communicate the goals and methods through the different layers of the system. In this age of accelerations, such a slow process inevitably leads to a widening gap between what students need to learn and what teachers teach. When fast gets really fast, being slow to adapt makes us simply too slow. Last but not least, in order to address the root causes of the post-truth phenomenon, institutions and political elites will need to address the underlying trust deficit. What has become clear in recent political debates is that citizens want and expect a voice.


Citizens are demanding a systemic shift: a more agile form of democracy and government that is more participative and more responsive to their needs and views. Harnessing the power of digital transformation, civic tech can help governments be more open and transparent, and more innovative and inclusive in how they go about policymaking, budgeting, or service provision. The democratization of media brought about through new digital technologies and platforms has blurred the lines between content creators, mediators, standard-setters, and consumers of information. The bottom line is that we can no longer rely on the producers of information to distinguish "information" from "meaning", and "news" from "opinion". End-users themselves need to become better at this. In this networked age, the critical challenge will be how we co-construct and crowdsource the truth.


ABOUT THE EDITORS

Norman Vasu is Senior Fellow and Deputy Head of the Centre of Excellence for National Security (CENS) at the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore. He is the author of How Diasporic Peoples Maintain their Identity in Multicultural Societies: Chinese, Africans, and Jews (Edwin Mellen Press, 2008), editor of Social Resilience in Singapore: Reflections from the London Bombings (Select Publishing, 2007), and co-editor of Nations, National Narratives and Communities in the Asia Pacific (Routledge, 2014) as well as Immigration in Singapore (Amsterdam University Press, 2015). His research on multiculturalism, ethnic relations, narratives of governance, citizenship, immigration, and national security has been published in journals such as Asian Survey, Asian Ethnicity, Journal of Comparative Asian Development, and The Copenhagen Journal of Asian Studies, and in a number of edited volumes.


He was a Fulbright Fellow with the Center for Strategic Communication, Hugh Downs School of Human Communication, Arizona State University, in 2012, a Visiting Senior Fellow at the Takshashila Institution, Bangalore, India, in 2016, and a Visiting Scholar at the Daniel K. Inouye Asia-Pacific Center for Security Studies in Honolulu, Hawaii, in 2018.

Benjamin Ang is Senior Fellow in the Centre of Excellence for National Security (CENS) at the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore. He leads the Cyber and Homeland Defence Programme of CENS, which explores policy issues around the cyber domain, international cyber norms, cyber threats and conflict, strategic communications and disinformation, law enforcement technology and cybercrime, smart city cyber issues, and national security issues in disruptive technology. Prior to this, he had a multi-faceted career that included time as a litigation lawyer arguing commercial cases, IT Director and General Manager of a major Singapore law firm, corporate lawyer specializing in technology law and intellectual property issues, in-house legal counsel in an international software company, Director-Asia in a regional technology consulting firm, in-house legal counsel in a transmedia company, and Senior Law Lecturer at a local Polytechnic, specializing in data privacy, digital forensics, and computer misuse and cybersecurity. Benjamin graduated from Law School at the National University of Singapore (NUS), and has an MBA and MS-MIS (Master of Science in Management Information Systems) from Boston University. He is qualified as an Advocate and Solicitor of the Supreme Court of Singapore, and was a Certified Novell Network Administrator back in the day. He also serves on the Executive Committee of the Internet Society Singapore Chapter.

b3411_About the Editor and Contributor.indd 172

20-Nov-18 12:38:49 PM

“8x6”  b3411   DRUMS: Distortions, Rumours, Untruths, Misinformation and Smears

About the Editors  173

appointment of Executive Coordinator, Future Issues and Technology on 1 August 2017. Shashi was educated at Oxford University, where he studied History (BA, 1997, DPhil, 2001). He has published in various peer-reviewed journals and edited volumes on topics relating to medieval history (the focus of his doctorate). He was a member of the Singapore Administrative Service from 2002–2017. During this time, he was posted to various Ministries, including the Ministries of Defence, Manpower, Information and the Arts, and Community Development, Youth and Sports. From August 2011 to July 2014, he was Senior Visiting Research Fellow at the Lee Kuan Yew School of Public Policy (LKYSPP). His research interests include extremism, social resilience, cyber, and homeland defence. He is currently completing a major book project relating to domestic (Singapore) politics (forthcoming, National University of Singapore Press, 2019).

ABOUT THE CONTRIBUTORS (in order of appearance)

Rob Brotherton is an academic psychologist and author of Suspicious Minds: Why We Believe Conspiracy Theories. He completed an ESRC-funded PhD at Goldsmiths, University of London, and is now an Adjunct Assistant Professor at Barnard College, Columbia University, in New York City, where he teaches classes on conspiracy theories, social psychology, and science communication. His research addresses the cognitive origins of conspiracy theories, as well as personality correlates and measurement issues.

Daniel M. T. Fessler is Professor of Anthropology and Director of the Center for Behavior, Evolution, and Culture at the University of California, Los Angeles. Adopting an integrative perspective in which humans are viewed as both the products of complex biological evolutionary processes and the possessors of acquired culturally evolved idea systems, he combines experiments, ethnography, and published data in exploring the determinants of behavior, experience, and health in domains such as emotions, aggression, cooperation, disease avoidance, morality, food and eating, sex and reproduction, and risk taking.

Nicolas Arpagian is Director of Strategy and Public Affairs at Orange Cyberdefense (Orange). He is also Scientific Director of the CyberSecurity Program at the French National Institute for Advanced Studies in Security and Justice (INHESJ, French Prime Minister's Office), and Senior Lecturer at France's National Police College (ENSP). He is the author of a dozen books, among them La Cybersécurité (Presses Universitaires de France), L'Etat, la Peur et le Citoyen (Vuibert), and La Cyberguerre, la guerre numérique a commencé (Vuibert).

Kevin Limonier is Associate Professor at the French Institute of Geopolitics (University of Paris 8), and Scientific Director of the Russian-Speaking Infosphere Observatory in the French Ministry of Defense. He is also a member of the Castex Chair of Cyber Strategy at the Institute of Advanced Studies in National Defence, Ecole Militaire, Paris. Limonier's fields of research include the cartography and geopolitics of cyberspace, big data analysis, and the history of the Internet and computers in the post-Soviet space.

Louis Pétiniaud is a PhD candidate at the French Institute of Geopolitics (University of Paris 8). Maxime Chervaux translated this contribution.

Gillian Bolsover is a Post-Doctoral Researcher on the Computational Propaganda project at the Oxford Internet Institute. She completed her PhD at the Oxford Internet Institute in 2017, and holds a dual MSc/MA in Global Media and Communications from the London School of Economics and Political Science and Fudan University in Shanghai, China, as well as a BA in photojournalism and political science from the University of North Carolina at Chapel Hill.

Philip Howard is Director of the Oxford Internet Institute. He is Professor of Sociology, Information, and International Affairs, and the author, most recently, of Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up.

Gulizar Haciyakupoglu is Research Fellow and Benjamin Ang is Senior Fellow at the Centre of Excellence for National Security (CENS), a constituent unit of the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore.

Jānis Bērziņš is Director of the Latvia-based Center for Security and Strategic Research at the National Defense Academy of Latvia, and Senior Fellow at the Potomac Foundation. He has authored over 60 publications on strategic and security issues, and has lectured as a guest all over the world, including at institutions such as the Swedish Defense Research Agency, the Norwegian Military Academy, and NATO's Special Operations Command Europe.

Jakub Janda is Deputy Director of the Prague-based European Values Think-Tank, and Head of its Kremlin Watch Program. He has co-authored Czech national policy on countering hostile foreign influence, and he advises democratic governments on this issue.

Veronika Víchová serves as Analyst at the Kremlin Watch Program of the European Values Think-Tank.

Karolin Schwarz was formerly with correctiv.org, an investigative non-profit organization. In 2016, Schwarz founded Hoaxmap.org, a platform dedicated to refuting false news about refugees and migrants in Germany. She also works as a freelance journalist covering (inter alia) disinformation and platform policies.

Andreas Schleicher is Director for Education and Skills, and Special Advisor on Education Policy to the Secretary-General at the Paris-based Organization for Economic Co-operation and Development (OECD). Schleicher is the author of the book World Class: How to Build a 21st-Century School System.
