Hybrid War of Information
On October 15, 2017, the research team led by Nikolay Marinov and fellow co-PIs filed a pre-proposal to the Volkswagen Foundation under the call "Challenges for Europe."
Team (co-PIs): Nikolay Marinov and Thomas Bräuninger, Professors, University of Mannheim (roles in research design and modelling); Anita Gohdes, Assistant Professor, Department of Political Science, University of Zurich (specialties in conflict studies and computational social science; see Gohdes (2018)); Dr. Zoltan Fazekas, Postdoctoral Fellow, Department of Political Science, University of Oslo (specialties in survey research and methodology; see Fazekas and Larsen (2016)); Dr. Dimitar Vatsov, Human and Social Studies Foundation, Sofia (heads a research institution with extensive experience studying disinformation in Bulgaria; see Vatsov et al. (2017)).
The participants will jointly develop ideas, design tests, and implement surveys, while supervising and directing postdoctoral researchers.
Misinformation spread by foreign actors to sway public opinion and undercut political unity is hardly a new phenomenon. Going digital, however, has given it new momentum. The aim of the project is to answer a number of questions relevant to the current war of information in comparative perspective: (1) How do events concerning issues of interest to foreign powers affect the mix of information and disinformation reaching domestic publics in Europe? (2) Do propaganda strategies change in response to random or unanticipated events? (3) How do approaching elections in European democracies change the mix of topics circulating online and offline? We will also explore what individual-level and country-level strategies could work to restore trust and debunk disinformation.

We define a hybrid war of information as a deliberate attempt by a foreign government to change public opinion in a target state in order to advance its own foreign and domestic policy goals. We assume that the sender regime's objectives are survival and expansion. Beyond these immediate objectives, however, the non-democratic sender may pursue a broader policy agenda against its perceived opponents.

We will use state-of-the-art advances in political psychology, game theory, and computational social science to develop and test a theory of information wars. The theoretical account we develop as part of this project will yield testable propositions about the composition and type of information aimed at domestic publics in target states. We expect our predictions to be a function of the status quo the sender cares about, the availability of high-quality signals to senders, domestic political divisions, and the proximity of elections in the target. In our theoretical model, we will draw on advances in political psychology to model how individuals process information. The first step in this modeling process is to study what is intended when a specific type of message is generated.
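The information-processing step just described can be illustrated with a minimal Bayesian-updating sketch. All numbers, the binary-state setup, and the function name are illustrative assumptions, not the project's actual model: a receiver holds a prior over whether a claim is true and updates after a message whose informativeness depends on the sender's strategy.

```python
# Minimal illustrative sketch (not the proposal's model): Bayesian belief
# updating by a receiver who observes one possibly-strategic message.

def posterior(prior, p_msg_given_true, p_msg_given_false):
    """Bayes' rule for a binary state after observing one message."""
    num = p_msg_given_true * prior
    denom = num + p_msg_given_false * (1.0 - prior)
    return num / denom

# Receiver's prior that a claim pushed by the sender is true (assumed).
prior = 0.5

# An accurate outlet reports the claim mostly when it is true...
honest = posterior(prior, p_msg_given_true=0.9, p_msg_given_false=0.1)

# ...while a purely strategic sender pushes the same message regardless of
# the state, so the message carries no information and belief is unmoved.
strategic = posterior(prior, p_msg_given_true=0.9, p_msg_given_false=0.9)

print(round(honest, 2))     # 0.9  (belief moves toward the claim)
print(round(strategic, 2))  # 0.5  (belief stays at the prior)
```

The contrast shows why a strategic model matters: the same message shifts beliefs or not depending on what the receiver knows about how the sender generates messages.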
The public may be targeted directly, and the message may tip discussions on a policy of interest. Alternatively, local publics must decide whether to keep a politician in office or replace them. We seek to understand when a foreign power would focus on changing opinion on an issue, when it would target politicians, and when it would target the legitimacy of the democratic system as such. Our precise theoretical expectations will emerge as predictions from a strategic model of information dissemination. Unlike previous research, we will not take the informational environment as a given, something simply to be quantified. Rather, our goal is to understand how the informational environment is changed from the outside. We will model the decision by an interested actor to invest resources in sending a message to a local electorate, thereby changing the informational setting.

Our empirical strategy includes surveys and topic modelling of online and offline media. In addition, we will follow the social media reception of stories. We will select news stories in outlets such as Russia Today (Deutsch) to analyze the diffusion of individual news stories, as well as the users who post and amplify them (for example, through retweets on Twitter). We will make use of recently developed techniques for identifying `bots' (non-human social media accounts) to understand the dynamics of diffusion. Our chosen approach will yield different metrics on the war of information and its impact. The timing and granularity of social media data can help us isolate the transmission of propaganda and, possibly, its effects.

Our theory will open up new questions. We hope to shed light on the underlying motivation of foreign actors for attacking democratic processes abroad. We aim to understand who responds favorably to foreign propaganda narratives. We also want to know how the channel through which a message is sent (e.g. local or foreign media, online or offline) affects its efficacy, and whether pointing out the source counteracts the effectiveness of a message. The result should be an advance in the study of political communication, persuasion, and theories of accountability and representation. Cooperation with other researchers, during and after the completion of this project, can focus on extending some of the insights gained to the related but distinct domain of social media networking and digital content sharing. Unchecked, disinformation risks damaging the long-term vitality of the democratic project in Europe. The analysis we propose can help set the tone in the debate on the future of information, and of freedoms, in Europe and beyond.
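One ingredient of the diffusion analysis can be sketched in a few lines. The data, field names, and the timing-regularity threshold below are all assumptions for illustration, not the actual bot-identification techniques the project would apply: the heuristic flags accounts whose retweets of a story arrive at near-constant intervals, one signal commonly associated with automated posting.

```python
# Illustrative sketch only (assumed toy data, not a production classifier):
# flag accounts whose retweet timing is suspiciously regular.
from statistics import pstdev

# Toy retweet log: (account, seconds elapsed since the story was posted).
retweets = [
    ("acct_a", 10), ("acct_a", 20), ("acct_a", 30), ("acct_a", 40),
    ("acct_b", 5),  ("acct_b", 47), ("acct_b", 300),
]

def inter_event_times(times):
    """Gaps between consecutive retweet timestamps."""
    times = sorted(times)
    return [b - a for a, b in zip(times, times[1:])]

def looks_automated(times, max_spread=2.0):
    """Heuristic: near-constant gaps between retweets suggest scheduling."""
    gaps = inter_event_times(times)
    return len(gaps) >= 2 and pstdev(gaps) <= max_spread

by_account = {}
for acct, t in retweets:
    by_account.setdefault(acct, []).append(t)

flagged = sorted(a for a, ts in by_account.items() if looks_automated(ts))
print(flagged)  # ['acct_a']
```

In practice, timing regularity would be only one feature among many (account age, follower ratios, content similarity), as in published bot-detection work; the sketch simply shows how timestamped diffusion data makes such signals measurable.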