Saturday, May 27, 2017

Minutes and Transcript from the Quarterly Public Meeting on Can Public Diplomacy Survive the Internet?


state.gov/
MINUTES AND TRANSCRIPT FROM THE QUARTERLY PUBLIC MEETING ON CAN PUBLIC DIPLOMACY SURVIVE THE INTERNET? BOTS, ECHO CHAMBERS, AND DISINFORMATION
Tuesday, May 9, 2017 | 10:30 a.m.-12:05 p.m.
George Washington University Elliott School of International Affairs
COMMISSION MEMBERS PRESENT:
Mr. Sim Farar, Chair
Mr. Bill Hybl, Vice Chair
Ambassador Penne Korth Peacock
Ms. Anne Terman Wedner
Ms. Georgette Mosbacher
COMMISSION STAFF MEMBERS PRESENT:
Dr. Shawn Powers, Executive Director
Mr. Chris Hensman, Senior Advisor
Ms. Michelle Bowen, Program Support Assistant
MINUTES:
The U.S. Advisory Commission on Public Diplomacy met in an open session from 10:30 a.m. – 12:05 p.m. on Tuesday, May 9, 2017 to launch its latest report, Can Public Diplomacy Survive the Internet? After welcoming remarks from Dr. Janet Steele (Director of the Institute for Public Diplomacy and Global Communication) and a keynote address from retired Congressman Mike Rogers, a panel of experts discussed the report's findings, focusing specifically on how public diplomacy practitioners need to adjust strategies and tactics for the modern information ecosystem, including understanding echo chambers, automated disinformation, algorithmic bias, and the proliferation and diversity of foreign propaganda efforts. Panelists included: Matt Chessen (Foreign Service Science, Technology and Foreign Policy Fellow at George Washington University); Tom Cochran (former White House and State Department technology lead, and Acquia's Vice President and Chief Digital Strategist); Markos Kounalakis (member of the J. William Fulbright Foreign Scholarship Board and visiting fellow at Stanford's Hoover Institution); Ethan Porter (Assistant Professor, George Washington's School of Media and Public Affairs); and Ory Rinat (State Department Transition Team Digital Lead). Commission Member Georgette Mosbacher closed the meeting by briefly discussing the Commission's ongoing work and the importance of continued focus on the question of computational propaganda.
---TRANSCRIPT---
Janet Steele: Good morning, and welcome. I'm Janet Steele, the director of the Institute for Public Diplomacy and Global Communication. I'm so pleased that we're able to host this quarterly meeting of the U.S. Advisory Commission on Public Diplomacy and the launch of this incredible report. IPDGC is a joint venture between GWU's School of Media and Public Affairs, which is my home department, and the Elliott School of International Affairs. We have done a series of programs this year on Russian disinformation, partly because of the work of our wonderful public diplomacy fellow, Tom Miller, and so we're just thrilled to have an opportunity to host this event that Shawn Powers, the executive director of the Advisory Commission on Public Diplomacy, has put together.
We have a really exciting program today. We're going to start with a keynote address from Congressman Mike Rogers, followed by a panel discussion from some of the authors of this report. We will then open it up to questions and answers of the panel and of the Commission. We will end with a final address by Georgette Mosbacher. It's going to be an exciting morning, and we're so grateful you're here. This is on the record. We have a number of media outlets here, and if any of you are interested in tweeting, and we hope you are, the hashtag is (#)echochamber.
Thank you very much. I'd now like to introduce Mr. Sim Farar, the chairman of the Advisory Commission on Public Diplomacy.
Sim Farar: Thank you very much. Hello, and welcome to our public meeting of the United States Advisory Commission on Public Diplomacy. I am Sim Farar, the chairman of the Commission. Thank you, Dr. Steele, for opening up this beautiful room for us, for the warm welcome, and for partnering with us in organizing today's session. We're very excited to have Congressman Rogers here today. It's a very, very, very important day for all of us, and very timely, I believe.
GWU's Institute for Public Diplomacy and Global Communication is a thriving hub for all things public diplomacy. First, let me tell you a little bit about our Commission. Since 1948, ACPD has been charged with appraising U.S. government activities intended to understand, inform, and influence foreign publics. It also works to increase the understanding of and support for these same activities. The Commission conducts research and symposiums that provide assessments and informed discussions of public diplomacy efforts across government, such as this meeting, which is our second for 2017.
The work of this bipartisan group remains crucial to supporting American public diplomacy efforts, a task that is especially important today. As many of you know, public diplomacy efforts are crucial to America's national security and offer substantial return on investment insofar as they help to prevent the use of military force, promote American business interests abroad, and support foreign publics' interest in traveling to and studying in the United States.
Before welcoming this morning's keynote speaker, I'd like to introduce our colleagues on our Commission. Bill Hybl's our vice chairman from Colorado Springs, Colorado. Ambassador Penne Peacock from Austin, Texas. Anne Wedner from Chicago, Illinois. Georgette Mosbacher from New York City. Ambassador Lyndon Olson from Waco, Texas. Unfortunately, he's unable to attend today.
We are thrilled and honored to have Congressman Mike Rogers with us this morning. Congressman Rogers represented Michigan's 8th Congressional District from 2001 to 2015, and he is very, very missed, believe me. He chaired the House Permanent Select Committee on Intelligence, and was a member of the Energy and Commerce Panel. In Congress, he built a legacy as a tireless and effective leader on counterterrorism and national security policy as well as being active on healthcare, telecommunications, and cyber security.
Prior to his time in Congress, Rogers was an officer in the United States Army and an FBI special agent. Recently, he's been quite vocal on the challenges we face from foreign propaganda. In an article published in the Wall Street Journal, Congressman Rogers called for greater attention to and research into, his quote, "Russia's propaganda operations and information warfare." He said that, "We need to figure out how to reduce the influence of foreign trolls and teach Americans about Moscow's capabilities."
We-the Commission-could not agree more, and are today releasing our latest report, Can Public Diplomacy Survive The Internet, on precisely this topic. Without further ado, please offer a nice warm welcome for Congressman Rogers. Thank you.
Mike Rogers: Thank you, Mr. Chairman. Thank you, committee members and Georgette, a friend from New York City. Thank you all for the opportunity to be here. There's so many topics or so many places you could go when you talk about propaganda from foreign nations and intelligence services engaging in something we call active measures, which would be covert actions and ways that were designed to influence the outcome of elections or politics or business or fill in the blank, to whatever the purpose of that nation-state is.
I just thought, given today's topic, I would pick one of those countries and talk about it for a minute. Let's just say, how about Russia? Good pick. Media literacy is going to be incredibly important as we go forward. This last display, in the last election, I think, changed the way many Americans, and certainly our allies overseas, look at what an active measure by an intelligence service is and what its designed outcome is intended to be.
I'm going to back up for just one second. If you go back to 1992, a disheveled, older-looking gentleman walks into the British embassy in Latvia. He says, "I think I have some things that you all might be interested in." He had seven large cartons, eventually, that he wanted to bring into the embassy. For those folks in the intelligence business, your first reaction is, "Let's figure out what we're dealing with," right?
As they went through some of the beginning parts of this material, they realized that somebody who had spent 30 years as an archivist for the KGB First Directorate had meticulously taken notes on all of their operations over the course of those 30 years. He brought a little of that information, a little bit of his own notes, and laid it on the table for the British intelligence service. In the FBI, we would say, "That's a good day."
If you think about what that material represents, it was really quite striking. It laid out all of these active measures that a country, the Soviet Union at that time, Russia today, would engage in to try to influence countries for its own political purposes. It was the laundry list of things that you've seen in movies from the 1950s until today. They used honeypot operations to get people into blackmail positions around the world. They recruited politicians. They recruited people in individual parties. They paid bribes to journalists to write stories. They actively sought to sow discord and political disruption, both in Europe and in places like Latin America.
There was a great book on some of this KGB material; The World Was Going Our Way, I think, was the name of the book. It was based on the notion that, through all of these activities, they really believed they were winning over these governments, that the world was changing to a more pro-Moscow attitude.
Imagine, if you fast forward that a little bit, and you add a whole other toolkit to what we knew as the former KGB, now the FSB and GRU. As a matter of fact, many of the folks who study this for a living would tell you it's really the GRU that has ramped up its game, not necessarily the FSB. Let's just say they're all pretty good at active measures.
They're going through this process, and they think, "Okay, where can we apply this new tool that we have, this ability to influence outcomes through social media platforms that we've probably never really understood, let alone knew how to participate in?"
If you look at places like Moldova, it's a great example. They were fostering an attitude. This was earlier on in their ability to use these social media platforms to influence an outcome. They were trying to make sure that the separatists in Transnistria had the opportunity to have their voices heard, maybe even louder than they should have been. They were planting fake stories about Russian-language genocide: that if you went into pharmacies and were speaking Russian, you couldn't get served unless you switched to Romanian, the language of Moldova. That was creating lots of discord, as you can possibly imagine. It was bolstering their position in Transnistria, and it was certainly adding to the confusion and chaos in Moldova.
It worked exceptionally well. When you look at what was happening early on in their effort to ... Take Crimea. That was, I think, more of an opportunity than it was a plan, but at least a chance to disrupt. Moscow wanted to make sure that Ukraine was not leaning west economically and was not a member of NATO. As a matter of fact, Putin said early on, in his first presidency, that this was one of his primary goals. He also mentioned the country of Georgia, and if you look at both of them, in his mind, he's two and oh. Georgia is not going to be a NATO partner any time soon.
When you look at these activities, what happened in the first part of Ukraine is that all of these stories started showing up saying, "Hey, guess what? The government supported by the West is going to start claiming Polish cities as its own." There was a list of three or four Polish cities with some historical relevance, where people in places like Ukraine could theoretically make the case. They weren't doing that. That certainly wasn't their intent, and it certainly wasn't supported by any other Western allies.
Then they ran an ad campaign, covertly, that talked about joining your brother Slavs in the fight against the West, which was trying to take away Polish cities. That discord takes hold in Poland. Now you have questions about Poland's engagement and whose side it ought to be on. You have certainly raised questions within Ukraine itself about its own governance model. The Russians, at that time, were enjoying pretty high marks for their ability to sow discontent in places where they have interests.
Then you look at the American election. There were trolls that were hired, in theory. This was a BBC discovery. It's in a RAND report, a pretty interesting report. They hired these centers. Think about this. This is pretty clever. They decided, "You know what? Let's go into the American election." I think this is where Americans need to stop saying, "My candidate won because of it," or, "My candidate lost because of it." We need to take that out of our lexicon. We need to get to where we can understand what the Russians were trying to accomplish with what they were doing, because it worked so well here that they went on to other places.
Think about what they decided to do. By the way, they saw the same polls that most Americans did. They probably believed that one candidate was going to win based on those polls. Either way, whichever their intentions or motives, this is something we should be concerned about. They decided, one presidential candidate, we ought to muddy that person up. How much can we make that person not a credible leader to the American electorate if that person wins? That's pretty dangerous stuff, if you start to think about it.
They used this new process that they had. This gets back to those trolls. They hired these centers. Many believe they were scattered across eastern Europe. I've heard reports of as many as 190 of them. I'm not sure it's that high, but let's say they're half right. That's still a lot. Each center, and each person in each center, had the obligation to post at least 135 times a day with messages that were already pre-approved, but in real time, attaching to real stories in the United States and other places where there would be agents of influence across Europe. Each post had to be at least 200 characters before you could go home.
If you think about that, and you multiply it by even those 80 centers, and then you add the AI botnets they had producing and rapidly placing stories, you've got a real problem on your hands. Think of the stories they used in Moldova. No evidence could be found that anyone was turned away from a pharmacy for speaking Russian. There is no evidence that anyone in the new Ukrainian government was trying to take over Polish cities.
There is a really interesting trend that happened here. Think about where the Russians are today. They are still getting liquored up on good vodka over this operation. They not only influenced our ability to not like each other in this campaign. They have actually damaged an elected president's ability to get his agenda done, because we're still at each other's throats about what happened or what didn't happen with the Russians involved in our election. If you're an intelligence officer sitting at the First Directorate at the center in Moscow, man, you are thinking about that new retirement home that you're going to get for this operation. It was hugely and outrageously successful.
You say, "Maybe." It was so successful that late last year, after the election, the defense minister, Shoygu, announced that they were going to create a new directorate. They were going to combine all of their cyber and propaganda and information warfare troops, as he called them, in one place. Why? They wanted better command and control for a more focused outcome.
Think about that. They decided, "This thing worked so well, why have it dispersed around the government? Let's put it in one place and focus our effort on disrupting elections, on doing all the things we have been doing for 70 years, except this one was far more successful." Because they're using social media platforms that most Americans probably don't understand, how they work together, how they're layered together, in this day of affirmation news, where you want to tune in to the news you agree with, this is absolutely ripe for effectiveness.
That's what we saw in this election. We saw the Russians take that next big, important step of putting it all together, and then being very vocal about it. We've watched what they're doing in France. We've seen what they're doing in Germany. They're not going to stop. They're going to use all those old techniques. They're still going to use blackmail. Yep, it still happens in 2017. There are still cases of people being blackmailed by intelligence services for doing things they probably shouldn't do. If you can't tell your mother about it, this is an old intelligence rule, then don't do it, right?
If you think about the recruiting and the payments and the bribes, all of those things are still effective in 2017. Then you lay over top of it this new way of having operations and operators in Moscow or in eastern Europe or fill in the blank conduct business that impacts your ability to run free and fair elections. I don't care if you're in Colombia. I don't care if you're in Egypt or Jordan. There may be some debate there about free and fair elections. That's the only joke I have today, so please, would you guys go with me on this? Thank you.
That's why I'm excited about the Commission's report, and the opportunity to start arming ourselves with information. The first thing that we can do as Americans is just get smart on it: every time something flies into your inbox, it may or may not be an accurate portrayal of that story. If you look at the media propaganda that they were using, it was based on disinformation. It was taking a real story, distorting the facts, and putting it together just enough to make it believable. It was actually very well done, very well done, because it wasn't so outrageous that you clicked on it and went, "That doesn't even make sense." It did make sense.
Think about where the EU is. I find this interesting. They got so jazzed up about these influence operations that they recently declared the entities RT TV, Sputnik News, and Russkiy Mir propaganda tools of the Russian intelligence services. Think about what that means. These are news agencies that have been functioning both here in the United States and across Europe for some time now. Europe was so frustrated, they just threw it on the table and called it what it was.
That's what we have to worry about now. Imagine that. I can have a news agency over which I have significant influence write the story that I want, and now I have the ability, through this new system we're going to learn about from this panel today, to put that story in lots of places where you are likely to encounter it in your daily consumption of news. Man, is that a powerful tool. It's only powerful if we let it happen, and right now I think it's important for us to understand the tools, understand what they do, and then have this dialogue on two levels. One, we're going to have to have a policy discussion in the U.S. government about what we can do, in our own way, to counteract it, either covertly or overtly. We also have to fight the fight in the public domain through a good understanding of what their tools are and how they use them, so we can make better decisions as we go about our business across America.
My congratulations to the Commission. Thanks. I look forward to a lot more great work. Ambassador, I hope there's another ambassador at the table before too long. Am I allowed to say that? No. Okay, good. What I meant to say was thank you and congratulations to the Commission. Thanks for having me today.
Shawn Powers: I'd like to invite our panelists to the front of the room. Thank you so much, Congressman Rogers, for a wonderful keynote address. It was really insightful. It's important to be reminded of the historical dimensions of the current information conflict. I think some of your examples also remind us that it's a lot easier to spread fake news than it is to combat it. These are some difficult issues to comprehend.
The panelists are joining me at the front of the room. I'd like to introduce this wonderful panel, several of whom contributed to the Commission's report, and others are just experts on the questions that we're discussing today. First, we're going to go in alphabetical order, because that's the most fair. Matt Chessen, who's a foreign service, science, technology, and foreign policy fellow at George Washington's Elliott School.
We're also happy to welcome Tom Cochran, the former director of new media technologies at the White House, and a former managing director of the State Department's Bureau of International Information Programs. Tom is currently Acquia's vice president and chief digital strategist.
I'd like to welcome Dr. Markos Kounalakis, a J. William Fulbright Foreign Scholarship Board member and visiting fellow at Stanford University's Hoover Institution. Dr. Ethan Porter, who's joining us from George Washington's School of Media and Public Affairs, and finally Ory Rinat, the State Department's transition team digital lead and previously head of digital strategy for the Heritage Foundation and its multimedia news organization, the Daily Signal. Let's give them a warm welcome.
I'm going to try to keep my questions to a minimum so that we've got plenty of time for you to get involved in the conversation, but I do have two questions for each of the panelists. First, I'd like to start with Ory. In government, you've worked closely with both the State Department and the White House on improving how we engage with foreign publics; before joining government, you led digital strategy for the incredibly successful Daily Signal and a number of other organizations and political campaigns. What does your non-government and campaign experience bring to the public diplomacy family? Put another way, where are we behind, and what needs to be done so we can be more effective in this digital war of ideas?
Ory Rinat: I guess I get to start off. I've been in government for a total of about 120 or so days. I've been struck by how we talk about all of the right things. We talk about measurement, we talk about impact, but we're not really forcing ourselves to be up to speed with doing all those things as well as we could. If you think about digital communications broadly, you have content, you have the platforms where you put your content, and you have the channels you use to distribute that content. Across all three, the private sector's so far ahead of where we are in government.
One of the most acute ways you see this, I think, is when you talk about measurement. We talk so much about how we need to measure the impact of our work, how we need ways to analyze the effectiveness of our programs, but so often the numbers that we put out there are things like reach or potential reach: measures of scale rather than of engagement. We were just hearing from Congressman Mike Rogers about bots and artificial intelligence, and some of the things that are artificially driving so much of the conversation. That necessitates moving away from measures of reach and impressions, and toward measures of actual engagement.
That's the kind of measure that's harder to fake and more important for us as we look at the impact of our work. It's something the private sector has had to do, because if you look at any private corporation, any publisher, reach isn't getting them anywhere. If you're in the business of selling advertising against your traffic, against your impressions, you can't sell reach. You can sell engagement. You can sell time. You can sell views, but you can't sell those broad, easy-to-inflate metrics. Broadly speaking, that's one area where I'm struck that we have a lot of catching up to do.
Shawn Powers: Great; thanks. Tom, in the Obama administration, you were a digital leader in both the White House and the State Department, aiming to transform government. My question for you is actually similar to the one I just asked Ory. As we are operating in a constantly evolving space, what advice can you give not just to compete in the global marketplace for ideas but to actually get ahead of our adversaries like Russia?
Tom Cochran: Sure. First, it's good to be here. I feel like this is a bit of a family reunion for me. I see so many familiar faces. It's really nice to be back. In terms of getting ahead in the global marketplace, Ory is 100% correct that it's about analytics and measurement and engagement, but even more important than that is the foundation of technology. Without the right tools, we can't expect people at the State Department to do their jobs. It's funny that I keep saying "we" even though I'm not part of the team anymore, but I really feel that I am, and have been, a strong part of this team, pushing the importance of technology as an enabler of good diplomatic strategy.
The State Department spends $2.2 billion every year on technology. That's about $31,000 per employee per year, the cost of a car every single year. Imagine if I gave you a check for $31,000 every January and said, "Buy whatever technology you need to do your job." A top-of-the-line MacBook is $3,000. You'd have about $28,000 left over. The average Fortune 500 company, which by private-sector standards is pretty bureaucratic and slow, like a Walmart or an Exxon, spends $10,000 per employee per year and has better technology.
Why is that? How could we possibly expect to go to battle in an information landscape where our adversaries have access to whatever tools they need whenever they want them, don't play by any rules, and don't carry the burden of truth? We're essentially sending our soldiers into battle without weapons. Without the right tools and the right technology, how could we implement any digital or communications strategy?
Really, for the three years that I was at the State Department, this was the constant thing I banged my head against the wall over: give people the tools to do their jobs. Let me end with one small anecdote. On my first trip, I went to Argentina and met the head of social media there. A locally employed staff member, really energetic, who really loved his job, and who had a really hard time doing it because he didn't have a mobile device. How can the person in charge of social media for an embassy overseas be expected to be the social media expert without a mobile device? Fortunately, after two years of asking, he got one. It was a seven-year-old BlackBerry with the little rolly ball. If you turned it over, on the back was a camera, but the camera was covered with an asset tag.
If that doesn't speak to the failures of technology at the State Department, I don't know what does. It should serve as a warning for the entire department: not only should you get this technology, you should demand it, because it's your job. It's simply unacceptable to expect this team, our diplomatic corps, to go out and fight in the information landscape we are in today without it.
Shawn Powers: No discussion of technology at the State Department is complete without a BlackBerry reference, that is for sure. It's important to think about these issues from a tactical perspective as well. Ethan, there's a growing concern that efforts to correct misinformation or to fact-check actually backfire, often encouraging people to dig in on opinions that may be factually incorrect. You've conducted extensive research in the U.S. and the UK on exactly these backfire effects and on how fact-checking can actually be effective. Can you tell us a bit about your research, how it differs from previous studies on fact-checking, and what insights it may hold for public diplomacy?
Ethan Porter: Sure. Unusually for an academic, I bring a bit of good news, which is to say: there has been skepticism for decades in academia, and certainly recently in the public consciousness, about people's ability to respond to factual information. The backfire effect has been mentioned repeatedly on Meet the Press. You've probably heard some variation of it around your Thanksgiving table, your dinner table. The backfire effect supposedly means that if you are presented with factual information that contradicts some political commitment of yours, you will reject the factual information and instead double down on the prior misconception you held. You can imagine how worrisome this is if it's true, because then what's the point of empirical argument or investigation?
I've looked for the backfire effect in the U.S. I looked for it repeatedly during the 2016 election, and I never found it. Working with teams of researchers at universities across the country, I never found any evidence that people would backfire. The key thing to keep in mind is that people actually do heed the implications of factual information they are told, even if that information cuts against a statement made by a politician they support. In the last election, we told supporters of the eventual victor in this country, "Listen, he's made several statements at odds with the facts. Are you willing to accept, essentially, the factual information that contravenes his position?" Repeatedly, supporters were willing to accept that factual information.
This was true on the night of the presidential debate. This was true when it came to the national party convention. Ultimately, the backfire effect is overstated, and people do accept factual information. In the UK, we found they can even accept factual information when it's really complicated. I'm sure everyone in the State Department knows the public wildly overestimates the amount of money that goes to foreign aid. We all know this. This is true in the UK and the U.S. It turns out that you can teach people the real number with just a small intervention, and they'll retain that information.
Again, I think I have some better news, which is to say that there is a place for facts still, and that people are still willing to accept factual information, even when it's complicated, as in the case of the foreign aid budget, or when it cuts against their political commitments during the heat of a presidential election.
Shawn Powers: One of the things we talk about in the report is the prevalence of "folk theories," the ways we think media operate, and how that shapes our understanding of social media platforms. It sounds like the backfire effect may be exactly one of those folk theories that may not actually be true as a matter of fact.
Ethan Porter: Yes.
Shawn Powers: Can we actually focus on the question of public diplomacy? How can folks at the State Department, folks throughout government, use these studies to think about how to influence and inform foreign audiences in a more effective manner?
Ethan Porter: Sure. I think the first point is to become unafraid of making a factual intervention. We shouldn't be hesitant to say, "This is wrong. This is incorrect. This statement is at odds with the facts," because people will be willing to say, "Okay, thank you for letting me know. I will now move toward the factually accurate position." Public diplomacy should be especially unafraid to make interventions, because it turns out that being told that your preferred candidate has made a statement at odds with the facts doesn't affect your level of support for that candidate. If I tell you, "Look, your candidate has made a statement at odds with the facts," you're going to move in the direction of the factual information, and that's not going to affect your level of support for the candidate.
I actually think that's great news, because that means those of us who are interested in fact-checking and correcting misinformation can make fact checks without putting our thumb on the scale. If I fact-check one candidate, one party, or one country, that does not amount to favoring the other person or the other candidate or party in question, which I think is actually exciting.
The final point I would make is that we found again and again that neutral government sources are really valuable when you're making a fact-based claim. It's not enough to say, "I've done research as a GWU professor." It's not enough to say this other candidate thinks something different. Point to the data sources. The data from this neutral government source conflicts with that claim.
Shawn Powers: Which is also really interesting because trust in government is not exactly at an all-time high, but trust in neutral government sources actually is effective.
Ethan Porter: It does seem that way, yes.
Shawn Powers: Thank you so much. Matt, you're the scariest guy on the panel. Matt has spent the last year here at GWU really digging deep on artificial intelligence, political and social bots, and thinking about how these technologies may transform the digital platforms we've all come to depend on on a daily basis. You've uncovered some pretty scary stuff, so scary that we actually gave you two chapters in the report that we're issuing today. Tell us about what we're looking at, what we're up against, and why it matters for public diplomacy.
Matt Chessen: Hi, Shawn. Thank you. As Shawn said, I've been researching the international implications of artificial intelligence in my time at the Elliott School. One of the things I've come across is this idea of computational propaganda. As Congressman Rogers mentioned, there are teams of trolls out there that are spreading disinformation. They're increasingly using tools known as bots, which can amplify the information by spreading it out autonomously. These bots don't require a high level of technical sophistication, and one person can conceivably run thousands of bots with very little technical knowledge.
Basically, the challenge here is that, in addition to the existing tools for computational propaganda, which are social media technologies, these autonomous agents or bots, and big data, you've now got these emerging artificial intelligence technologies. You've got AI chat bots that are very soon going to be able to emulate human conversations online. You've got dynamic content creation, where artificial intelligence tools can dynamically create everything from news articles to videos. You've got affective computing, where artificial intelligence tools can now both understand human emotions very well from text, video, or speech, and also portray human emotions very realistically. You've got video manipulation technologies, where you can take a video of a world leader and basically map any sort of expressions or conversation onto their face, and combine that with speech synthesis technologies, where you can take voice samples and replicate someone's voice completely. You can actually create completely pliable reality now.
Combine that with big data and some of the really sophisticated psychometric profiling tools now. You can actually figure out exactly what someone's personality is and what their political frame is, what their preferences are, what their likes, their dislikes are. You can create a personalized persuasion tool that targets you exactly with personalization and knows you better than you know yourself.
These tools can run 365 days a year all day long. They scale at digital scale, so once you create and configure one, you can create thousands, tens of thousands, millions of these things. What we're looking at over the next several years is the emergence of these propaganda bots that are very persuasive, they know you very well, and they're going to be doing everything from trying to sell you things to, more insidiously, trying to convince you that this information is true.
The real insidious part of this is, these propaganda bots are going to be running information campaigns on all these user accounts online, but they're not going to be able to distinguish between the machines and the people. The machines are going to be running information operations on other machine accounts. We're facing a future where we may actually have the Internet completely overwhelmed by machines running disinformation on other machines and people.
Where is the space for human speech there? Where is the space for human beings to connect with each other and have democratic discourse? How do you conduct public diplomacy in that sort of environment? It's a tough question.
Shawn Powers: Yeah, so even if we can fact check, and fact-checking is effective, if no one's actually going to hear the facts of the matter, then it doesn't actually help us. If you're not sufficiently terrified, I strongly recommend you pick up a copy of the report and take a look at the computational propaganda section, which provides some really terrific detail on these trends. Matt, can you talk a bit more about how these tools impact the work of public diplomacy practitioners at the State Department? Is there a role for bots in the work that we do, and if so, how do we embrace these technologies while still holding true to the values that undergird public diplomacy in the first place, honesty, attribution, and a genuine commitment to engaging foreign audiences?
Matt Chessen: I think your last point's actually a really important one, because we need to make sure that we hold true to our values, especially within the Department of State. I do think there is a role for bots, and I think there's a role for these technologies. Where I think we do not want to go, and we can't go, is into the realm of fighting fire with fire. We can't go to the disinformation realm. The State Department needs to do things in a very aboveboard manner. We need to make sure that all of our accounts are attributable to the State Department.
Unfortunately, that hamstrings you from having the same types of effects that the propagandists have, because their accounts are unattributed. They create user profiles that blend into the population. However, there's still room for these technologies to be used. The State Department is developing detection and awareness tools. One of the really important things, when these campaigns come out, is to be able to identify these propaganda campaigns, attribute them, and then inoculate the population that they're targeting before they have a chance to go viral. These detection and attribution tools are very important.
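[Editor's illustration: the report doesn't specify how such detection tools work, but a minimal sketch of the kind of heuristic researchers commonly describe is below. Everything in it, including the function name and the 50/50 weighting, is invented for the example; real systems combine many more signals. Two frequently cited bot "tells" are metronomic posting intervals and verbatim repetition of content.]

```python
from statistics import mean, pstdev

def bot_likelihood(post_times, post_texts):
    """Rough heuristic score in [0, 1]: higher means more bot-like.

    post_times: posting timestamps in seconds, sorted ascending.
    post_texts: the text of each post, same length as post_times.
    """
    if len(post_times) < 3:
        return 0.0  # too little data to judge
    # Bots often post at machine-regular intervals; humans are bursty.
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    regularity = 1.0 - min(1.0, pstdev(gaps) / (mean(gaps) or 1.0))
    # Bots often repeat identical content verbatim.
    duplicate_ratio = 1.0 - len(set(post_texts)) / len(post_texts)
    return round(0.5 * regularity + 0.5 * duplicate_ratio, 3)

# A metronomic account repeating one message scores high...
print(bot_likelihood([0, 60, 120, 180], ["buy now"] * 4))       # → 0.875
# ...while an irregular account with varied posts scores low.
print(bot_likelihood([0, 35, 400, 410], ["hi", "news", "ok", "q?"]))  # → 0.0
```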
Bots can also be used positively. They can be used to connect people. You can use them to autonomously share information with populations that might not otherwise be connected to each other. The department can use AI chat bots to engage people about foreign policy and teach them about foreign policy in a more technologically savvy way that might appeal to younger people, but also in a way that's much more interactive and doesn't require a human being. People can actually interact with a chat bot and learn about why the United States has a particular policy.
I think there's room for chat bots to be engaged in customer service as well. The visa application process can be very frustrating, and is a frustrating experience with the United States for some people; chat bots could smooth that process out and make it a little more helpful for people.
I also think there's room for these big data psychometric targeting tools. I think the State Department does use artificial intelligence. When we use Facebook to put out ads, we're using their artificial intelligence platform. We do that with a lot of the tools that we have.
I think also one of the things we have to think about is that these tools and these campaigns run at machine speed. They go viral very quickly. Do we need some sort of rapid reaction force that's on call all the time, that can go and respond to these types of things, and do this inoculation, do this real-time fact-checking?
I think there's also room for partnerships with other entities. I think the department can help both work with the private sector to build tools, work with NGOs to build tools, and then also build this independent culture of journalism out in other countries.
Finally, I think there's a conceptual leap that has to happen. I think people need to get really comfortable with the future being these partnerships between humans and machines. Human and machine teaming is the future of the economy, and it needs to be the future of public diplomacy, as well. My colleague at the FCC, David Bray, likes to call these crews, and that's these teams of humans and machines that will be working towards objectives together. That conceptual leap I think is probably the hardest thing for folks within the department and really within the public diplomacy community to grasp.
Shawn Powers: Thank you so much. I'd like to turn to Markos now. I also want to mention Markos co-edited the report that we issued today, and we're really grateful for your support in editing. Markos helped us organize a workshop at Stanford in March that was crucial, really, to getting this together. Markos, in your essay, you explore the public diplomacy consequences of a return to a "big, bad America." Specifically, you write about the significance of shifting from a values-based approach to a transactional approach to engaging with foreign publics. Tell us a bit more about the risks and opportunities with this shift and maybe what history can help us understand about the possibilities with the transactional approach to public diplomacy.
Markos K.: Sure. Thanks, Shawn. It's good to be here from the Silicon Valley. I'm still trying to figure out how I'm going to spend my $31,000 a year with that check. I talk about big, bad America, and the main point is that, in fact, yes, America is moving away from a values-based approach and towards a transactional one in defense, development, and in diplomacy. It has consequences, but it also uncovers some unpracticed opportunities. I want to make clear that the analysis here is not a normative analysis. It's just my overall analysis of what I see is happening in the state of contemporary public diplomacy.
The definition of a big, bad America is a globally assertive, zero-sum focused America aiming aggressively to right perceived historic imbalances in the exploitation of our immigration, economic, alliance, military, and trade policies up to now. This may seem like a contemporary aberration, but there's actually a historic precedent. Those who recall the Reagan administration will probably remember the Kirkpatrick doctrine and how it was applied. That was essentially where America took a moderate approach towards its friends, even those friends whose regimes were authoritarian in nature, and a more confrontational approach towards adversarial authoritarians and totalitarians. There's some historic precedent to this approach that we'll be taking.
As America practices this updated Kirkpatrick doctrine, there's a continual move away from democracy protection and nation-building. That began in earnest in the last administration, but seems to be continuing and gaining momentum. It's an environment in which authoritarian regimes are courted and not castigated. A big, bad America is subject to an external negative narrative that is more credible, unfortunately, in that it focuses on and emphasizes the three Cs of corruption, conspiracy, and cynicism. While there have been attempts to paint the United States with these three Cs in the past, because of some of the technologies, some of the narratives that are being drawn, and some of the changes that are being seen within our own press and our own society, they're finding more fertile ground.
Adversaries are aggressively promoting this narrative using these tools and technologies. Foreign audiences are much more receptive to this narrative, again reinforced by the technologies that are out there. We find ourselves suddenly in this new transaction reality as a result. I'm just going to list some of the features and then maybe some of the opportunities.
The features of transactionalism include the reduction in short-term political risk and foreign regime volatility when you're actually able to make deals. It privileges national sovereignty, so we're not challenging, as America, the national sovereignty of nations around the world. It operates at the elite level of society, not at the civil society level, which is a bit of a challenge to some of the contemporary PD practice and to some of the practitioners in terms of the way they've been looking at PD in the past. It promotes policy-making efficiency, and it enables summit or leader-level grand bargains, which again could have very positive results.
The result is essentially a privileging of authoritarian states with which a big, bad America shares national security and economic interests, with less public diplomacy emphasis on civil society, human rights, and pluralism. It also is an opening for more aggressive public diplomacy measures targeting our adversaries because of the more confrontational approach that we'll be taking with them. Ultimately, the move towards transactional foreign policy-making means a move away from the core of post-World War Two type public diplomacy, with the exception of this period of the Kirkpatrick doctrine, and ultimately the conclusion really is that running towards a new American transactionalism means straying from a traditional American exceptionalism.
Shawn Powers: Great. Staying with Markos, I'd like to turn to something that has nothing to do with digital technology, which you know well as a member of the J. William Fulbright Foreign Scholarship Board. What is the role of exchange programs, which are person to person, people to people, physical connections, in a space that increasingly seems dominated by digital technologies?
Markos K.: Senator Fulbright really put it best. I'm just going to quote him, because it sums it all up. "Educational exchange is not merely one of those nice but marginal activities in which we engage in international affairs, but rather from the standpoint of future world peace and order, probably the most important and potentially rewarding of our foreign policy activities."
You can look at the stats. We've had 370,000 Fulbrighters from 160 countries over seven decades. If again we're returning to this transactional framework, then you have to look at the question of ROI, return on investment. There are numbers like 57 Nobel laureates, 82 Pulitzer Prize winners, 37 current or former foreign heads of state, but the way you make the sale, I think, in this environment is that you look at the numbers. International students today deliver $36 billion annually. There's an increased demand for our cultural products. Even today's nominee for ambassador to China was someone who was built through an international exchange. Terry Branstad had gotten to know Xi Jinping on Xi's own 1985 visit to the United States.
On the security question, though, it's not just ROI, it's also security. International relations scholars look at understanding intent of a nation, and the security dilemma, which is one of the things we often talk about, is in great part due to the inability to read a foreign leader or a foreign nation's intentions. Knowing your allies and adversaries gives an insight into their intentions, limitations, their fears and approaches, and knowing your allies and adversaries helps prevent miscalculation and mistakes, providing a greater opportunity for security.
Shawn Powers: Great. Thank you so much. Turning back to Ory, I'm curious about your reactions to some of what we've discussed so far, in particular the focus on the digital dark side, as we've described it in the report. Is there a risk of focusing on digital outreach in spaces that are vulnerable and may not be trusted as a means of reaching foreign publics? How can we better synchronize our people to people programs with our digital outreach and strategy?
Ory Rinat: Yeah. That's a good question. I've been thinking about this since we first spoke about this panel. There's a risk in being too invested in any one approach, and too invested in any one digital platform, but there's a larger risk in not learning how to address these problems and play a credible role on these platforms and in these spaces. When you look at it, a lot of this goes to better digital governance. We're limited in what we're going to be able to do to counter the dark side by regulation, by legislation, and by our own values, which are just a good thing and things we want to stick by.
In that world, we need to be better and more efficient at the core of what we do. When you look at it now, Tom was talking about how $22 billion a year is spent on IT. There's so much wasteful spending that goes along with that when it comes to actual digital communications: the sheer number of websites and social platforms we maintain that don't talk to each other, that don't coordinate with each other, that don't amplify each other. What we end up with is this almost ecosystem of properties, of platforms, of sites that don't speak in a cohesive way, and in doing so, limit the effectiveness of each other.
We're stepping on each other's messaging. We're confusing that broader effort that we should be able to be putting out there. The digital dark side is there. It means that we have to be better about the core of our work. To Tom's point earlier, when you talk about some of the things holding us back and not having the tools we need as PD practitioners, there's a lot that we should be able to do even along the lines of what Matt was talking about with bots, with artificial intelligence, with personalization, in a very white hat kind of way. Things that we're doing in the private sector now that we can't, simply because we are governed by a set of rules and practices that weren't built for this era.
An example I use a lot is personalization. Personalization is a really powerful tool to deliver audiences the content they care about and need. You can do that both in a passive way or in an active way. An active way is saying to somebody, "Hey, what issues do you care about? We'll give you content about those issues." A passive way is saying, "Okay, you've come to my site four times in the past week, and you keep clicking on energy content. I guess you must care about energy. Let me give you more content about that."
Both of those are ways we serve the audience better, but we're limited right now in how we can do those things. We need to have a conversation around modernizing the frameworks we're working in, whether that's the Privacy Act, the Paperwork Reduction Act, some of those rules that I think a lot of us as practitioners run up against every day and are worth a second look.
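[Editor's illustration: the passive personalization Ory describes can be sketched in a few lines. This is a minimal illustration under that description, not any system the department actually runs; the function name and topic tags are invented for the example.]

```python
from collections import Counter

def recommend_topics(click_log, top_n=2):
    """Passive personalization: infer interests from observed clicks only.

    click_log: list of topic tags, one per article the visitor opened.
    Returns the top_n most-clicked topics, most frequent first.
    """
    counts = Counter(click_log)
    return [topic for topic, _ in counts.most_common(top_n)]

# Repeated clicks on energy content across visits: serve more energy stories.
visits = ["energy", "trade", "energy", "energy", "culture", "trade"]
print(recommend_topics(visits))  # → ['energy', 'trade']
```

The active approach would replace the inferred `click_log` with preferences the visitor states directly; the recommendation step stays the same.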
Shawn Powers: Great. Thank you. Before we go to the question and answer session, I've got one last question for Tom, who's been very patient over the course of this panel. You wrote just a fabulous piece in the Harvard Business Review on the challenges of pushing institutional change in a bureaucracy like the State Department. As you and I'm sure everyone here knows, Secretary Tillerson has announced an effort to reorganize the State Department from top to bottom. What advice would you offer on the reorganization process, not just in terms of what public diplomacy should look like after reorganization, but building coalitions of folks at the State Department that will support that new structure and support that kind of change?
Tom Cochran: Sure. It's amazing how many questions and how much mileage I've received out of that one piece, which in my mind talks about how I failed at pushing change. It is a good recipe for how you can effect change positively in a large organization, the first ingredient being the most important, which is building relationships within the organization by listening to people. You cannot come into an organization as an outsider, as an interloper, and just immediately tell people what to do, because you actually don't know what to do. You have absolutely no idea how the place runs. You may have some really good ideas from the outside, but you need to partner with the people that are on the inside of the building to navigate the building. I didn't even know what the building meant until I got into the building.
The second, of course, being risk, understanding that risk is a necessary ingredient for change. We keep talking about innovation and moving things forward, and we need digital innovation, but we don't accept the risk component of that. Innovation inherently requires risk, and with risk, you will fail sometimes. Failure is okay. What's not okay is repeating failure, and what's not okay is not learning from failure.
The last thing is the thing that I had a long monologue about 10 minutes earlier, which is technology, enabling people with the right technology. If we don't have the right tools to do our jobs, we can't do our jobs. The lesson here that I'm driving at is, change is possible in a large, storied, very respected organization like the State Department, but there are certain things and ingredients that have to happen. Of course, we talked about listening to people. We talked about risk. We talked about enabling people with technology, but what I also want to emphasize is things that shouldn't change and things that should be remembered.
Remember that the power of diplomacy stems from relationships, humans, human to human relationships. We can't get caught up in the fact that all these technologies are out there and then immediately rush to it like five-year-olds playing soccer. We have to recognize that at the heart of public diplomacy lies people. There's nothing more powerful than connecting on a human emotional level, one to one, with an individual to share ideas and maybe even disagree, because that's really what is so powerful and so fundamental about this country. It's supposed to be a country with free speech. It's supposed to be a country where we all can have discourse and disagree in a civil manner, and not take up arms and kill each other. That's what represents America, and that's how we represent ourselves overseas, whether it is through technology or whether it is through human to human interaction.
My biggest recommendation is yes, absolutely, change is needed. More technology is needed. Better technology is needed, but don't forget or lose sight of the most important part of the State Department, which is people. We have to go out there and build relationships on a one to one basis with individuals. If we didn't, everyone could just stay here in the building and tweet at everybody. That's not how it works. That's why you go overseas. You actually make friends with regular people in different countries, and then they build that trust with you, and therefore they are less likely and less susceptible to believe any fake news or any trolls out there, because they have a single source of truth, and that's you.
Shawn Powers: Great. Thank you so much. Let's give the panelists a warm round of applause here. I'd like to invite Congressman Rogers to join the panel for the Q and A session, if you'd like. I'd like to invite the Commission members to offer some initial questions.
Anne Wedner: Thank you guys so much for your presentation today. I wish there were a woman up there, but there's not. I'll ask a question. In terms of the whole backfire thesis and disproving that, I feel like Tom maybe closed on what I think is missing from the discussion. I wish that George Lakoff, even though he's another man, had been a part of this group. What you're really talking about when you talk about facts, and trying to rebut facts with facts or disinformation with facts, is that people can't really understand unless you make a values framework that allows them to accept information. Even though they can accept the new fact, it may not change their perception.
Let's use domestic politics as an example. With Trump, he made a bunch of promises that he's not going to fulfill. His voters are aware of that, but they still really support him. I think the same, because it's an emotional connection, which is what Tom was getting to at the end. I think it's the same thing that's operative in public diplomacy. How do we create an emotional connection, which we've spent time in previous Commission endeavors talking about the narrative, and how people accept information?
I would just counsel us to go back to that. I have one other point on that, which is that all of you guys are Enlightenment-educated. You're philosophers. You've studied history. Your rational base is in how you think about the world. It turns out that neuroscience shows that we actually aren't rational beings. Our skills and understanding and argument and fact presenting really aren't relevant in how ideas influence us. I think it's super important to introduce neuroscience concepts into this whole discussion. I would encourage maybe finding a woman to talk about that next time.
Shawn Powers: Thank you. Does anyone want to respond on the panel?
Penne Peacock: I'll ask Tom a question, if I could.
Shawn Powers: I think actually Dr. Porter wants to respond.
Ethan Porter: Yeah, sure. I think I'm of two minds, which is to say that there's lots of good evidence in lots of work that narrative matters, and certainly I subscribe to that, but I wouldn't put too much emphasis on it. It's not like you need a narrative to accept facts; again and again, simply the succinct presentation of factual information does actually compel people to move towards facts.
Anne Wedner: A values framework isn't exactly a narrative. You should read Lakoff.
Ethan Porter: I have read Lakoff. I would just say that again it's not a necessary ingredient. This is what I think we've shown multiple times.
I would also say, in terms of neuroscience and rationality, I really don't think that neuroscience is that anti-rational. I think that there's lots of neuroscience work which shows the opposite. I think there's lots of good survey research which shows the opposite. I'm generally of the point of view that people are surprisingly rational.
Anne Wedner: I don't know if I agree with that. I think there are all sorts of changes. It's not the place, but maybe we can add it next time. For example, lying and how that affects the human brain, and why, when you lie once, you're much more likely to lie again and not understand what you're doing. There's so much in neuroscience that shows the irrationality and how patterns are created. I think it's worth it to add it to our discussion. I think it's something that public diplomacy officers should understand.
Shawn Powers: I should note that we do have an essay on neuroscience in the wonderful report that you may have heard about today.
Penne Peacock: Yes, this is for Tom, and really for Matt. When you're talking about that they don't have the tools, and then we're hearing that they want to cut the budget at State, is there a point in time where State will come out of this age of innocence about how fast technology is growing and what a great tool it is, and have someone maybe give them the budget to do it?
Tom Cochran: The last thing we need is more money to buy technology.
Penne Peacock: You need people who decide to use it.
Tom Cochran: Right. We have people who decide to use it. It's an entire room, an entire building of people that want it. What we have is a leadership structure in the organization that, frankly, does not understand the technology needs of their customers. It's unfair for me to say the entire leadership structure that runs technology is not good at their jobs, but what I am saying is that there are a lot of people that make decisions about technology that don't know what it's like to be that social media person in Buenos Aires not having the right tools to do their job. That's the most important thing that needs to be fixed at the State Department.
Matt Chessen: If I can just add to that, I'm optimistic that they might take a good look at technology when they reorganize the department. I came from a technology consulting background, and when you're implementing a technology platform, you never, ever just replicate your existing business process. You always change your business process to make optimal use of that technology. My hope is that when the department is actually reorganized, leadership takes a good look at when you're revising that business process, how do you then use technology to make the department more productive and more efficient.
Georgette Mosbacher: To that point, exactly, I'd like to get back to the basics a little bit. That is, we've talked about echo chambers and we've talked about bots, etc. You can have technology. We can give everyone in the building the most up to date technology. The question is, what do we do with it? What is our aim? What is it that we're trying to do? If we're talking about propaganda, and regardless of how you want to define that word, our values, we want to propagate our values, then how do we use this technology?
We could go back to The Manchurian Candidate. They used drugs and isolation. Now we have all of this sexy technology, and yet we're still trying to do the same thing: brainwash. You may not like the word, but let's be frank about this. We want other people to think the way we want them to think. We want to influence the outcome of our objectives. We want democracy. We want our values. How do we use this technology to best do that? We can give everyone in the department that technology, but what's the framework for how that technology will be used? What are the rules?
Tom Cochran: I can answer that, since I'm the person that doesn't have to go through a clearance process to have talking points. Let me be clear that technology is not the solution. Technology in and of itself is completely useless. It's a means to an end, but it is absolutely clear that the tools that we have here are terrible. That said, the people that we have here are generally great. I went to embassies around the world, and in spite of terrible technology, they were doing great work. What really needs to be resolved is this question of leadership within Washington, of what the direction of the department is.
Georgette Mosbacher: What is that work? You said they do great work, but who defines what that work is?
Tom Cochran: There's nobody defining it right now, because there are not enough leaders in the building. Once they're in the building-
Georgette Mosbacher: We're back to basics.
Tom Cochran: We're back to basics, but what you need is you need assistant secretaries, you need undersecretaries, you need a deputy secretary, you need a team of people telling the crew where's this ship going. Because these are great soldiers and sailors, or whatever metaphor you want to use, just looking for direction.
Georgette Mosbacher: I understand.
Tom Cochran: Right.
Matt Chessen: I would actually push back a little bit on your premise that really we're in the business of propaganda. I actually don't think that's what we're in the business of. I think that you can look at it from two different perspectives. One is that there is no objectivity in the world, and I think that's what the opposition wants. I think the opposition wants us to believe that everything's a narrative, everything's just what you can convince people of, and there is no truth, there are no facts.
I come from a different perspective, where I think the fundamental human liberty is the ability to think, choose what you think, and think what you want. I think that's what we need to strive for as a value here. I don't think we want to play the propaganda game and get into how we can persuade and manipulate and influence people the best to get people to think what we want. I think we want to create an environment and give people the tools so that they can choose to think whatever they want, because I am confident that our values in that sort of marketplace of ideas are going to be the choice that people make.
The danger is, there's no free market right now. The market is being corrupted. We need to fix the corruption in the free market of ideas.
Shawn Powers: Great. We've got time for one more commission question, then we've got to open it up.
Sim Farar: Back to you.
Shawn Powers: Back to me. Great. Wonderful. Thank you so much, Commission members. We've got about 20 minutes for Q and A from the public. If you wouldn't mind raising your hand high. Mr. Hybl, could you pass the microphone behind you, perhaps? Once you have the microphone, if you could introduce yourself and your organizational affiliation, that would be great.
Matthew Wallin: Thank you. Matthew Wallin, American Security Project. I've got a two-part question; the parts are unrelated to each other. The first question is for Ethan. With regard to people believing the facts when you explain them, but then not changing their support for a particular candidate or politician, why do the facts matter in the first place, if the point of the facts is to influence people, especially in public diplomacy, to make political decisions, whether that's the way they vote or whether they support a particular policy?
The second question's a little broader, in terms of the technology. We've talked about it a lot as a broadcasting technology and an influence tool. What about the listening side of it? How do we use technology to better understand what foreign publics are thinking and then feed that into our foreign policy planning?
Tom Cochran: Great, thank you. Let me answer that second question first. Again, since I don't have a clearance process, I can be free to answer this with what I think is the truth, and be objective. In the private sector, Coca-Cola or Walmart, for example, can use any tools available in order to listen to their customers online, on Twitter, Facebook, whatever. They can understand their customers and understand everything about each customer: how much Coke they buy, when they buy it, do they like Coke, do they drink Pepsi, are their friends drinking Coke, and on and on, to build a rich picture of the individual and how that person relates to their product. They use this information to make sure that the person drinks more Coke and has a better experience with Coca-Cola. That person benefits because they like Coke. Coke benefits because they have another customer. Aggregate that over millions and millions of people, and Coke does a really good job.
The United States government is encumbered by laws and regulations around privacy, like the Privacy Act of 1974: laws and regulations which were put in place well before the Internet, well before the fax machine, even. The problem is that information today travels at the speed of the Internet. We have data that enables unparalleled levels of institutional knowledge and memory, and we have networks, technology and social networks, that amplify the power of relationships, except we can't take full advantage of any of it because of these laws.
We cannot listen to conversations on Twitter on an individual basis. We can do it on an aggregate basis. On an aggregate basis, we can understand what the conversation is, but we can't really take action on that information, because of these laws. This is actually not a State Department problem. This is a Congressional problem: understanding that these laws, which are good laws intended to protect the privacy of individuals and citizens, encumber and handcuff us so that we cannot use the best tools created right here in our own country. That's a big challenge we have to overcome.
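As an illustration of the aggregate-only listening Cochran describes, here is a minimal sketch in Python (hypothetical data and function names, not any real State Department system) that tallies topic-level mention counts across a stream of posts without retaining any per-user information:

```python
from collections import Counter

def aggregate_topics(posts, topics):
    """Count topic mentions across a stream of post texts.

    Only aggregate tallies are kept; no author identifiers,
    handles, or per-user histories are ever stored.
    """
    counts = Counter()
    for text in posts:
        lowered = text.lower()
        for topic in topics:
            if topic in lowered:
                counts[topic] += 1
    return counts

# Hypothetical sample stream (in practice, a platform API feed)
sample = [
    "Visa policy update was helpful",
    "Exchange programs changed my life",
    "Visa wait times are too long",
]
print(aggregate_topics(sample, ["visa", "exchange"]))
# Counter({'visa': 2, 'exchange': 1})
```

The design point is simply that the aggregation happens at ingest: individual posts are read once and discarded, so only the topic-level picture survives.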
Shawn Powers: Great.
Ethan Porter: Can I respond?
Shawn Powers: Yeah, please. Go ahead.
Ethan Porter: I do think that we run the risk sometimes of expecting too much from facts. As an educator, my primary goal is just cultivating a more informed citizenry. Dissemination of facts and people's willingness to accept facts, that's exciting, but whether or not facts can persuade and convince people, especially to abandon their most cherished political commitments, that's a totally different set of concerns. I would say that facts are good in and of themselves, that education is good in and of itself, regardless of whether it causes people to support or oppose one party or one country or another. Persuasion relies on a different set of tools, probably orthogonal to facts, not synonymous with them.
Shawn Powers: Great, okay.
Kaitlin Turck: Hi. Kaitlin Turck, State Department. I have more of a managing up question, because you talked a lot about a lot of serious issues. I've been in meetings in the State Department where senior people are like, "We've got to get on FaceTime and Snapfish. Let's get on Snapfish." It's okay, but you would never hear someone be like, "We invaded Iran. Was it Iraq? Was it Iran? Who knows?" It's very easy to say that with social media and to pass it on, and so what you actually find is we're passing it off like, "Oh, you look young. Can you just do it? You're the youngest person. This is not important for my time as a senior-level person." We're not actually investing.
Okay, we talked a lot about tools, but what about the training? If you're a press officer, you get training on how to have good judgment and talk to the public, but that does not exist for social media. How can we make this a priority for senior-level officials, like, "Hey, these are other platforms with all of these very serious things happening on them, and it's how billions of people communicate. Maybe you're not on these, but these are important tools"? How can we convince them so that there can be a focus on actually training and allowing people to do things? The excellent person you met in Argentina, that's a perfect example: a local staff member. What did the public affairs officer in charge of the section know about communicating to the public? Are they trained on these tools, and not just delegating to the person who probably isn't cleared to go and have a conversation with the public without permission?
Ory Rinat: I can touch on a little bit of that. I don't think we're talking about a problem unique to the State Department. I don't think I've ever worked in any organization where we didn't have this problem. You hear about board members of nonprofits saying, "Hey, we need to be on the Tweeters." It's systemic. It's endemic. I think the question isn't just training. We do have a unique situation in the State Department, and I've been so impressed by foreign service officers and their ability to move between different offices, in particular office directors, and manage a team that has a skill set or is learning a skill set that might not be theirs. There's training that addresses a lot of that, and that to me is the more solvable of the problems.
The bigger issue is, how do we get to a point where instead of saying, "You want us to be on the FaceTime, whatever you called it?" Instead of the answer being, "Yes, now let's go train everybody, and then let's do it on 20 other platforms," how do we get the answer to be, "No, hold on. Why are we doing this platform? Is it worth our investment?"
To Tom's point, Tom was saying it's not an issue of money and resources. It's an issue of being more targeted about what we're doing. We've gotten to a point where we have so many platforms, hundreds and hundreds of people managing those platforms, the governance issue is so much more powerful than the training issue. Unless we get to a point where we're goals-oriented and mission-oriented in creating these things, it's not going to work. When you look at every social account, it needs to have a purpose. It needs to have an audience.
Five, six years ago, we were in this era of digital where everybody was talking about user personas. They're like these caricatures or archetypes of your audience, and they're fun, because you get to say, "Joe, the college student, who wants a career in public health," and you build a profile. You figure out that they're on LinkedIn and Twitter, and then you create content for those platforms. That's fine, but what you end up with there is the continued muddling and muddying of your audiences. When we talk about social now, what I urge people to do is think about one person in your audience, just one, a real person. Write for that person. You'll capture other people, too, but if you can't say about a social account, "Here's who this account is for," take a step back and worry about whether you even need that account.
Tom Cochran: Let me add one more thing about that. I also want to emphasize, let's not make it an age thing. It's definitely not an age thing. My dad, for example, is super old, but he's on Facebook and LinkedIn. Also a former foreign service officer for 33 years, by the way. He gets it. What I'm saying here is, it's a question of intellectual curiosity. It's a question of the flexibility to acknowledge and recognize the things that you know are important but don't yourself understand, to understand enough of them, and to work with other people to make sure that whatever the strategy is, it gets taken care of by your team.
Ory Rinat: I would just really quickly add to that that you made a really good point. Social is just one channel. Email is an even more powerful channel. Search, in many places, is a more powerful channel. If you look at the channel side of it, go back to the platforms. Where are we keeping our content? Where are we publishing our content? Then go back to the content itself. Why are we creating a blog post? Why are we creating a video? Why are we creating a graphic? Who is it for, and what are we trying to say? So often, we start with the channel because it's where the easy number to brag about is. Then you get to go and say, "Oh my God, I reached 10 million people today." Okay, with what?
Shawn Powers: Great. I saw a few hands in the back. Maybe we can take, I think, Dan?
Daniel Munz: I think Matt raised the specter of us getting to a place where you have bots arguing with each other, with a diminishing role for authentic human preferences in shaping the marketplace of ideas. That's interesting, because it sounds a lot to me like a pretty good description of high-frequency trading and the environments we see there. I'm wondering if there's anything we can learn in this space about how that problem and that set of issues is regulated or not regulated. I think this weekend we learned a lot about French press laws, for example. In addition to thinking about how we operate in this space, should we be thinking about pursuing regulatory or legal regimes that guard the contours of this whole problem?
Shawn Powers: That's an easy question. Who wants it?
Matt Chessen: That's actually a really interesting question. I have thought about this in the sense of high-frequency trading, and one of the things that markets have put in place because of high-frequency trading is circuit breakers. If you don't have the circuit breaker, the machines can start going in a direction and take the market completely in that direction without human control, and nobody can stop it. I think that's something to consider in this context, but this is a big unknown. My paper basically talks about the scenario of MADCOM, where you have these machine-driven communication tools going wild, and there's no room for human speech.
There are a couple of interesting things there. We all program each other, in a sense, by communicating with each other and creating cultural norms and things like that. Very soon, we're going to get to a point where the machines are actually programming us, creating cultural norms, and creating memes. Where is that going to go for human society? I think that's a big philosophical question.
I think there's another big question: is it really going to get that bad? There are countermeasures you can take. There are artificial intelligence tools that can detect these machine-driven accounts. The platforms have a responsibility to take some of these accounts down and fight this disinformation. There are some ideas out there about tools that could be built into browsers to warn people when they go to disinformation websites, or to tell them, "You are probably interacting with a bot right now." These tools are out there. They just haven't really been integrated into the platforms.
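Detection tools like the ones Chessen mentions often start from simple behavioral heuristics. The Python sketch below (all features, weights, and thresholds are hypothetical; production research tools such as Botometer combine many more signals) scores an account's bot-likelihood from posting rate, account age, and content repetition:

```python
def bot_score(posts_per_day, account_age_days, duplicate_ratio):
    """Crude bot-likelihood heuristic returning a value in [0, 1].

    Features and weights are illustrative only, chosen to show
    the shape of the approach rather than any real detector.
    """
    score = 0.0
    if posts_per_day > 100:        # inhumanly high posting rate
        score += 0.4
    if account_age_days < 30:      # very new account
        score += 0.3
    if duplicate_ratio > 0.5:      # mostly repeated content
        score += 0.3
    return min(score, 1.0)

# A brand-new account posting 250 near-identical messages a day
print(bot_score(posts_per_day=250, account_age_days=5,
                duplicate_ratio=0.8))
# 1.0
```

Such a score would typically feed a review queue or a browser warning of the "you are probably interacting with a bot" kind, rather than trigger automatic takedowns.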
It's a really big unknown for me whether we're going to end up in the worst-case scenario, or whether we're going to have these mitigation tools. I think it's going to turn out much more like the cyber security challenge, where you have this cycle of one-upmanship: new tools come out to manipulate people, and then new countermeasures against them. Where regulation fits in that is a really interesting question. As a society, we've decided that false advertising is a social ill, and so we've created liability for it. We may similarly want to create liability for disinformation. The really tricky part there is, who determines what's true and what's not? You can determine that in a court of law, but I don't know that that's necessarily something we want to get into as a society.
Shawn Powers: I think Congressman Rogers may be able to speak to the question of regulations.
Mike Rogers: Just on the regulatory side of this: if you look at the technology that's coming over the next 10 years, by 2025 there are going to be somewhere on the order of 700 billion new IoT devices connected to the Internet. They did a survey of companies putting these devices out: everything from your refrigerator to your printer to your fill-in-the-blank. I don't know about the rest of you, but I think my refrigerator works against me already, let alone having it on the Internet working against me.
Think about what that means for security, and then add AI. Currently, AI represents about 65% of all the traffic flowing around the Internet. Your Coke example is the perfect example of that: business-driven AI, computers talking to computers, farming your information and putting it into a system, business-to-business AI, botnetted, if you will. When you look at the challenges of regulating this, it's coming faster than your government will be able to handle, candidly.
You think about even the social impact of AI that's coming. The big push for investment in research is driverless trucks and driverless cars. The single largest employment for middle-aged men in America is driving. Some notion that we are going to suck them up into the economy to do something different is a little bit foolish.
We haven't looked at the social impacts of what AI is going to do, and we certainly haven't looked at the security side of it. When you look at these IoT devices, I'll just tell you one quick story, because I think it's relevant. Take the Dyn attack of last year, where they shut down Google, they shut down Facebook, what do you call it, Snapfish? Snapchat. When they were able to shut those down, it was one individual who went onto the dark web and leased a botnet; they think he paid less than $1,000. About 100,000 different IoT devices attacked the east coast node of Dyn.
Think of Dyn as the traffic cop of your Internet service; that's what it does. The attack was able to shut it down for about two hours. Think of all the economic revenue lost in those two hours for the major companies that depend on e-commerce. This is a company that is used to DDoS attacks, and it was still completely overwhelmed. It freaked everybody out in the cyber security space, because we were thinking, "Uh-oh, one person can lease a botnet and do that." The worst part is, when they got the east coast node back up and running, the attackers went to the west coast node and shut it down. Dyn was a little quicker to turn that around, but it tells you the gap between where we think we are in understanding AI and how we defend ourselves, and where we really are.
I'll guarantee you, if you walk down to the Capitol, I'm a recovering member of Congress, to be clear. There is no way they're going to be able to keep up with what regulatory concerns are coming at us. This is going to have to be a private-led event that's going to help government get to where it needs to go. It's just too big, and it's too fast. It's amazing to me to watch. I'm on three separate small startup Internet security companies, and just the technology and trying to keep up with security protocols makes your head spin.
Again, look at this big notion: do you have a killswitch for AI? That's the new debate. Does the President of the United States eventually carry a suitcase, like the nuclear suitcase, with a killswitch for AI? What happens when you introduce it into weapons systems? Two computers that were supposed to teach each other a language decided it was too complicated, or too slow, too inefficient, and wrote their own language, which the designers of the program can't decipher. Tell me we don't have challenges on the regulatory side of AI that we haven't even thought through.
I just throw that out because I get a cut at the bar down the street if you want to go in. I get 10% of the take today, so go and have a drink and think about that. Go ahead.
Shawn Powers: What a scary note to end the panel on. I'd like to invite Ms. Mosbacher to come up and close out the session.
Georgette Mosbacher: First, I'd like to thank all of you for coming today, and particularly I'd like to thank my friend, Congressman Rogers, for that very sober note on which we're going to end this session. The Commission holds its next public meeting in September, when we'll present the findings from our 2017 comprehensive annual report on public diplomacy and international broadcasting. It's a project that catalogs the myriad ways in which the State Department and the Broadcasting Board of Governors inform and engage with foreign audiences. You'll just have to stay tuned for the exact date and location.
Just on a quick personal note, I found this conversation today dazzling, breathtaking quite frankly. Yes, I'm afraid age does count. I'm just too old to figure out some of this stuff. It is dazzling. With respect to public diplomacy, I do think that we have all these tools, but it's like anything in life: you can have all the tools and the knowledge, but the question is, what do you do with them? I think it's incumbent on our Commission to define that: for those of you who are really good with these tools, what is it that we want to project, and how can we use these tools and platforms in the realm of public diplomacy to serve what we feel that objective is?
Having said that, thank you so much for this enlightenment. I'm going to go back home now and turn on my computer, and I'm just going to stick with going and typing. I think I've covered everything that we needed to cover for the next meeting, so thank you again very much, all of you.