Thursday, May 11, 2017

Can Public Diplomacy Survive the Internet? Bots, Echo Chambers, and Disinformation


Edited by Shawn Powers and Markos Kounalakis, May 2017


Transmittal Letter

Foreword: Public Diplomacy in a Post-Truth Society 1

Francis Fukuyama, Olivier Nomellini Senior Fellow at the Freeman Spogli Institute for International Studies (FSI), and the Mosbacher Director of FSI’s Center on Democracy, Development, and the Rule of Law

Executive Summary

Shawn Powers, Executive Director, U.S. Advisory Commission on Public Diplomacy 2

Remarks on “Public Diplomacy in a Post-Truth Society” 7

Bruce Wharton, Acting Under Secretary for Public Diplomacy and Public Affairs; Hoover Institution, Stanford University, Stanford, California, March 20, 2017


Computational Propaganda and Political Bots: An Overview 13
Samuel C. Woolley, Director of Research, Oxford Internet Institute’s Computational Propaganda Project

Understanding the Psychology Behind Computational Propaganda 19
Matt Chessen, Foreign Service Science, Technology and Foreign Policy Fellow at The George Washington University

Rethinking Countermeasures in the Age of Computational Propaganda 27
Tim Hwang, Executive Director, Pacific Social

Public Diplomacy’s (Misunderstood) Digital Platform Problem 33
Sam Ford, Research affiliate and consultant with Massachusetts Institute of Technology’s Program in Comparative Media Studies/Writing

Understanding the Challenges of Artificial Intelligence and Computational Propaganda to Public
Diplomacy 39
Matt Chessen, Foreign Service Science, Technology and Foreign Policy Fellow at The George Washington University


Psychological Principles for Public Diplomacy in an Evolving Information Ecosystem 49
Jeffrey T. Hancock, Professor of Communication, Stanford University

Facts Matter, and People Care: An Empirical Perspective 55
Ethan Porter, Assistant Professor, George Washington University School of Media and Public Affairs

VOA: A Weapon of Truth in the War of Words 61
Amanda Bennett, Director, Voice of America

U.S. 2016 Elections: A Case Study in “Inoculating” Public Opinion Against Disinformation 65
Jonathan Henick, Principal Deputy Coordinator for International Information Programs and Ryan Walsh, Senior Advisor for Digital Product, Bureau of International Information Programs

In Defense of Truth, and the Threat of Disinformation 71
Jason Stanley, Jacob Urowsky Professor of Philosophy, Yale University


Public Diplomacy and Strategic Narratives 77
Laura J. Roselle, Professor of Political Science and International Studies, Elon University

Crafting Resilient State Narratives in Post-Truth Environments: Ukraine and Georgia 83
Vivian S. Walker, Professor of National Security Strategy, National War College

America’s Strategic Narrative and a Path for Public Diplomacy 91
Markos Kounalakis, Visiting Fellow, Hoover Institution, Stanford University


By Shawn Powers, Executive Director of the U.S. Advisory Commission on Public Diplomacy

Scientific progress continues to accelerate, and while we’ve witnessed a revolution in communication technologies in the past ten years, what comes in the next ten may be far more transformative. It may also be extremely disruptive, challenging long-held conventions behind public diplomacy (PD) programs and strategies. To think carefully about PD in this rapidly changing communications space, the Advisory Commission on Public Diplomacy (ACPD) convened a group of private sector, government, and academic experts at Stanford University’s Hoover Institution to discuss the latest trends in research on strategic communication in digital spaces. The results of that workshop, refined by a number of follow-on interviews and discussions, are included in this report. I encourage you to read each of the fourteen essays that follow, which are divided into three thematic sections: Digital’s Dark Side, Disinformation, and Narratives.

Digital’s Dark Side focuses on the emergence of social bots, artificial intelligence, and computational propaganda. Essays in this section aim to raise awareness of how technology is transforming the nature of digital communication, offer ideas for competing in this space, and raise a number of important policy and research questions needing immediate attention. The Disinformation section confronts the Oxford English Dictionary’s 2016 word of the year – “post-truth” – with a series of compelling essays from practitioners, a social scientist, and a philosopher on the essential roles that truth and facts play in a democratic society. Here, theory, research, and practice neatly align, suggesting it is both crucial and effective to double down on fact-checking and evidence-based news and information programming in order to combat disinformation campaigns from our adversaries. The Narratives section concludes the report by focusing on how technology and facts are ultimately part of, and dependent on, strategic narratives. Better understanding how these narratives form, and what predicts their likely success, is necessary to think through precisely how PD can, indeed, survive the Internet. Below are some key takeaways from the report.


• We are not living in a “post-truth” society. Every generation tends to think it is less honest than the one before. This is an old human concern, and should be seen today as a strategic narrative (see Hancock, p. 49; Roselle, p. 77). Defending the value of, and search for, truth is crucial. As Jason Stanley notes (p. 71), “without truth, there is just power.”

• Humans are remarkably bad at detecting deception. Studies show that people tend to trust what others say, an effect called the truth bias. This bias is actually quite rational—most of the messages that a person encounters in a day are honest, so being biased toward the truth is almost always the correct response (see Hancock, p. 49).

• At the same time, people are continuously evaluating the validity of their understanding of the world. This process, called “epistemic vigilance,” is an ongoing check that the information a person believes about the world is accurate. While we have a difficult time detecting deception from interpersonal cues, people can detect lies when they have the time, resources, and motivation. Lies are often discovered through contradicting information from a third source, or through evidence that challenges a deceptive account (see Hancock, p. 49).

• Fact checking can be effective, even in hyper-partisan settings (see Porter, p. 55), and is crucial for sustained democratic dialogue (Bennett, p. 61; Stanley, p. 71). Moreover, it is possible, using digital tools, to detect and effectively combat disinformation campaigns in real time (Henick and Walsh, p. 65).


• Computational propaganda refers to the coordinated use of social media platforms, autonomous agents and big data directed towards the manipulation of public opinion.

• Social media bots (or “web robots”) are the primary tools used to disseminate computational propaganda. In their most basic form, bots provide basic answers to simple questions, publish content on a schedule, or disseminate stories in response to triggers (e.g., breaking news). Bots can have a disproportionate impact because it is easy to create a lot of them and they can post high volumes of content at high frequency (see Woolley, p. 13).

• Political bots aim to automate political engagement in an attempt to manipulate public opinion. They allow for massive amplification of political views and can empower a small group of people to set conversation agendas online. Political bots are used over social media to manufacture trends, game hashtags, megaphone particular content, spam the opposition, and attack journalists. The noise, spam, and manipulation inherent in many bot deployment techniques threaten to disrupt civic conversation and organization worldwide (see Chessen, p. 19).

• Advances in artificial intelligence (AI) – an evolving constellation of technologies enabling computers to simulate cognitive processes – will soon enable highly persuasive machine-generated communications. Imagine an automated system that uses the mass of online data to infer your personality, political preferences, religious affiliation, demographic data and interests. It knows which news websites and social media platforms you frequent and it controls multiple user accounts on those platforms. The system dynamically creates content specifically designed to plug into your particular psychological frame and achieve a particular outcome (see Chessen, p. 39).

• Digital tools have tremendous advantages over humans. Once an organization creates and configures a sophisticated AI bot, the marginal cost of running it on thousands or millions of user accounts is relatively low. Such bots can operate 24/7/365, respond to events almost immediately, and be programmed to react to certain events and create content at machine speed, shaping the narrative almost instantly. This is critical in an information environment where the first story to circulate may be the only one that people recall, even if it is untrue (see Chessen, p. 39).

• PD practitioners need to consider how they can create and sustain meaningful conversations and engagement with audiences if the mediums they typically rely upon are becoming less trusted, compromised, and dominated by intelligent machines.

• Challenging computational propaganda should include efforts to ensure the robustness and integrity of the marketplace of information online. Defensively, this strategy would focus on producing patterns of information exchange among groups that would make them difficult to sway using techniques of computational propaganda. Offensively, the strategy would seek to distribute the costs of counter-messaging broadly, shaping the social ecosystem to enable alternative voices to effectively challenge campaigns of misinformation (see Hwang, p. 27). In the persuasive landscape formed by social media and computational propaganda, it may at times be more effective to build tools than to construct a specific message.

• Practitioners are not alone in their concern about the escalating use of social bots by adversarial state actors. Social media platforms see this trend as a potentially existential threat to their business models, especially if the rise of bots and computational propaganda weakens users’ trust in the integrity of the platforms themselves. Coordination with the private sector is key, as its policies governing autonomous bots will adapt and, thus, shape what is and isn’t feasible online.


• Folk theories, or how people think a particular process works, are driving far too many digital strategies. One example of a folk theory is in the prevalence of echo chambers online, or the idea that people are increasingly digitally walled off from one another, engaging only with content that fits cognitive predispositions and preferences.

• Research suggests that the more users rely on digital platforms (e.g. Twitter and Facebook) for their news and information, the more exposure they have to a multitude of sources and stories. This remains true even among partisans (though to a lesser extent than non-partisans). It turns out we haven’t digitally walled ourselves off after all (see Henick and Walsh, p. 65).

• Despite increased exposure to a pluralistic media ecosystem, we are becoming more ideological and partisan, and more walled off at the interpersonal and physical layers. For example, marriages today are twice as likely to be between two people with similar political views as they were in 1960.

• Understanding this gap between a robustly diverse news environment and an increasingly “siloed” physical environment is crucial to more effectively engaging with target audiences around the world. Interpersonal and in-person engagement, including exchange programs, remains crucial for effective PD moving forward (see Wharton, p. 7).

• Despite this growing ideological divide, people are increasingly willing to trust one another, even complete strangers, when their goals are aligned (see the sharing economy, for example). This creates interesting opportunities for PD practitioners. Targeting strategies based on political attitudes or profiles may overshadow the possibility of aligned goals on important policy and social issues (see Hancock, p. 49).


• Virality – the crown jewel in the social media realm – is often overemphasized at the expense of more important metrics like context and longevity. Many of the metrics used to measure the effectiveness of social media campaigns are vulnerable to manipulation and, more importantly, don’t measure engagement in any meaningful way. These metrics were built for an industry reliant on advertising for revenue generation and, as a result, may not be well suited to the context of PD (see Ford, p. 33; Woolley, p. 13).

• Overemphasizing certain metrics, such as reach or impressions, fails to account for the risks created by relying on the same portals as other, less truthful and more nefarious actors. We need to be aware of the various ways in which the digital media business is shaping PD content, be mindful of the risks, and think carefully about safeguarding the credibility of U.S. Department of State PD programs operating in this space (see Wharton, p. 7; Ford, p. 33).


• Strategic narratives are “a means for political actors to construct a shared meaning of the past, present and future of politics in order to shape the behavior of other actors.” They provide the ideological backdrop against which audiences assess the meaning and significance of current events and breaking news. Put another way, they help people make sense of what would otherwise be a dizzying daily onslaught of news (see Roselle, p. 77; Kounalakis, p. 91).

• Crafting effective narratives requires a genuine consensus – even if limited or temporary – on our policy priorities and their underlying values, as well as a detailed understanding and appreciation of local grievances and concerns about the related policy issue (see Wharton, p. 7; Roselle, p. 77). As such, effective strategic narratives must be mutually constructed.

• Rather than focusing on trending news topics and stories alone, we need to develop greater capacity to understand competing public narratives in foreign contexts and track how they adapt over time. Understanding distinctions between system (or governance), value, and identity narratives would allow PD practitioners to construct policy narratives that speak to, or at least acknowledge, the underlying pillars of belief in a given community (see Walker, p. 83; Roselle, p. 77).

• Every new administration creates new opportunities for foreign engagement. A shift toward a more transactional approach to PD, focused less on values and more on shared policy priorities, could allow for improved relations and cooperation with a number of countries previously hostile to American PD efforts and programs (see Kounalakis, p. 91).
