Securitization of Disinformation in NATO’s Lexicon: A Computational Text Analysis

Following Russian meddling in the 2016 US elections, disinformation and fake news became popular terms used to raise domestic awareness of foreign information operations around the world. Today, a large number of politicians, diplomats, and civil society leaders identify disinformation and fake news as primary problems in both domestic and foreign policy contexts. But how do security institutions define disinformation and fake news in their foreign and security policies, and how do their securitization strategies change over time? Using computational methods, this article explores 238,452 tweets from official NATO and affiliated accounts, as well as more than 2,000 NATO texts, news statements, and publications issued since January 2014, presenting an unsupervised structural topic model (stm) analysis to investigate the main thematic and discursive contexts of these texts. The study finds that NATO’s threat discourse and securitization strategies are heavily influenced by the United States’ political lexicon, and that the organization’s word choices change based on their likelihood of mobilizing alliance resources and cohesion. The study also suggests that the recent disinformation agenda is, in fact, a continuation of NATO’s long-standing Russia-focused securitization strategy and of its attempt to mobilize the Baltic states and Poland in support of NATO’s mission.
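
To illustrate the kind of pipeline such an analysis typically involves, the sketch below fits a minimal structural topic model with the stm package in R, as cited in the references. The data frame `nato_docs`, its columns `text`, `source`, and `year`, the number of topics (K = 20), and the prevalence covariates are illustrative assumptions, not the specification used in the article.

```r
# Minimal stm sketch: fit topics to a corpus of NATO texts/tweets and
# relate topic prevalence to document metadata.
# Assumes a hypothetical data frame `nato_docs` with columns:
#   text   - raw document text (tweet or press release)
#   source - e.g., "twitter" or "press_release"
#   year   - publication year
library(stm)

# Tokenize, lowercase, remove stopwords and punctuation, and stem
processed <- textProcessor(nato_docs$text, metadata = nato_docs)

# Drop rare terms and align documents, vocabulary, and metadata
out <- prepDocuments(processed$documents, processed$vocab, processed$meta)

# Fit an unsupervised structural topic model; K = 20 is illustrative.
# `prevalence` lets topic proportions vary by source and (smoothed) year.
fit <- stm(documents = out$documents, vocab = out$vocab, K = 20,
           prevalence = ~ source + s(year, df = 3), data = out$meta,
           init.type = "Spectral", seed = 2014)

# Inspect the highest-probability and FREX words for each topic
labelTopics(fit, n = 10)

# Estimate how topic prevalence changes across sources and over time
effects <- estimateEffect(1:20 ~ source + s(year, df = 3), fit,
                          metadata = out$meta, uncertainty = "Global")
summary(effects)
```

The prevalence covariates are what distinguish stm from plain LDA: they allow topic proportions to be modeled as a function of document metadata such as account type or publication year, which is how shifts in a lexicon over time can be traced.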

___

  • Andrejevic, Mark. Infoglut: How Too Much Information Is Changing the Way We Think and Know. New York: Routledge, 2013.
  • Balzacq, Thierry, Sarah Léonard, and Jan Ruzicka. “‘Securitization’ Revisited: Theory and Cases.” International Relations 30, no. 4 (2016): 494–531.
  • Baum, Matthew A., and Philip B. K. Potter. “Media, Public Opinion, and Foreign Policy in the Age of Social Media.” The Journal of Politics 81, no. 2 (2019): 747–56.
  • Baumann, Mario. “‘Propaganda Fights’ and ‘Disinformation Campaigns’: The Discourse on Information Warfare in Russia-West Relations.” Contemporary Politics 26, no. 3 (2020): 288–307.
  • Bennett, W. Lance, and Steven Livingston. “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions.” European Journal of Communication 33, no. 2 (2018): 122–39.
  • Bouvier, Gwen, and David Machin. “Critical Discourse Analysis and the Challenges and Opportunities of Social Media.” Review of Communication 18, no. 3 (2018): 178–92.
  • Bradshaw, Samantha, and Philip N. Howard. “The Global Organization of Social Media Disinformation Campaigns.” Journal of International Affairs 71, no. 1.5 (2018): 23–32.
  • Buzan, Barry, and Ole Wæver. “Macrosecuritisation and Security Constellations: Reconsidering Scale in Securitisation Theory.” Review of International Studies 35, no. 2 (2009): 253–76.
  • Buzan, Barry, Ole Wæver, and Jaap De Wilde. Security: A New Framework for Analysis. UK ed. Boulder, CO: Lynne Rienner Publishers, 1998.
  • Cour, Christina la. “Theorising Digital Disinformation in International Relations.” International Politics 57, no. 4 (2020): 704–23.
  • Damashek, Marc. “Gauging Similarity with N-Grams: Language-Independent Categorization of Text.” Science 267, no. 5199 (1995): 843–48.
  • DiMaggio, Paul. “Adapting Computational Text Analysis to Social Science (and Vice Versa).” Big Data & Society 2, no. 2 (2015). doi: https://doi.org/10.1177/2053951715602908.
  • Van Duyn, Emily, and Jessica Collier. “Priming and Fake News: The Effects of Elite Discourse on Evaluations of News Media.” Mass Communication and Society 22, no. 1 (2019): 29–48.
  • Farkas, Johan, and Jannick Schou. “Fake News as a Floating Signifier: Hegemony, Antagonism and the Politics of Falsehood.” Javnost - The Public 25, no. 3 (2018): 298–314.
  • Freelon, Deen, and Chris Wells. “Disinformation as Political Communication.” Political Communication 37, no. 2 (2020): 145–56.
  • Galeotti, Mark. “The Mythical ‘Gerasimov Doctrine’ and the Language of Threat.” Critical Studies on Security 7, no. 2 (2019): 157–61.
  • Giles, Jim. “Computational Social Science: Making the Links.” Nature News 488, no. 7412 (2012): 448.
  • Grinberg, Nir, Kenneth Joseph, Lisa Friedland, Briony Swire-Thompson, and David Lazer. “Fake News on Twitter during the 2016 U.S. Presidential Election.” Science 363, no. 6425 (2019): 374–78.
  • Guess, Andrew M., and Benjamin A. Lyons. “Misinformation, Disinformation, and Online Propaganda.” In Social Media and Democracy: The State of the Field, Prospects for Reform, edited by Joshua A. Tucker and Nathaniel Persily, 10–33. SSRC Anxieties of Democracy. Cambridge: Cambridge University Press, 2020.
  • Hong, Liangjie, and Brian D. Davison. “Empirical Study of Topic Modeling in Twitter.” In Proceedings of the First Workshop on Social Media Analytics, 80–88. SOMA ’10. New York, NY, USA: Association for Computing Machinery, 2010.
  • Jack, Caroline. “Lexicon of Lies: Terms for Problematic Information.” Data & Society Research Institute, 2017.
  • Tandoc, Edson C., Jr., Zheng Wei Lim, and Richard Ling. “Defining ‘Fake News.’” Digital Journalism 6, no. 2 (2018): 137–53.
  • Khaldarova, Irina, and Mervi Pantti. “Fake News.” Journalism Practice 10, no. 7 (2016): 891–901.
  • Knudsen, Olav F. “Post-Copenhagen Security Studies: Desecuritizing Securitization.” Security Dialogue 32, no. 3 (2001): 355–68.
  • Krippendorff, Klaus. “Measuring the Reliability of Qualitative Text Analysis Data.” Quality and Quantity 38, no. 6 (2004): 787–800.
  • Kurowska, Xymena, and Anatoly Reshetnikov. “Neutrollization: Industrialized Trolling as a Pro-Kremlin Strategy of Desecuritization.” Security Dialogue 49, no. 5 (2018): 345–63.
  • Lanoszka, Alexander. “Disinformation in International Politics.” European Journal of International Security 4, no. 2 (2019): 227–48.
  • Liang, Hai, and King-wa Fu. “Testing Propositions Derived from Twitter Studies: Generalization and Replication in Computational Social Science.” PLOS ONE 10, no. 8 (2015). doi: https://doi.org/10.1371/journal.pone.0134270.
  • Lipizzi, Carlo, Dante Gama Dessavre, Luca Iandoli, and Jose Emmanuel Ramirez Marquez. “Towards Computational Discourse Analysis: A Methodology for Mining Twitter Backchanneling Conversations.” Computers in Human Behavior 64 (2016): 782–92.
  • Lysenko, Volodymyr, and Catherine Brooks. “Russian Information Troops, Disinformation, and Democracy.” First Monday 23, no. 5 (2018). doi: https://doi.org/10.5210/fm.v22i5.8176.
  • Mälksoo, Maria. “Countering Hybrid Warfare as Ontological Security Management: The Emerging Practices of the EU and NATO.” European Security 27, no. 3 (2018): 374–92.
  • Marchi, Anna, and Charlotte Taylor, eds. Corpus Approaches to Discourse: A Critical Review. 1st edition. New York: Routledge, 2018.
  • Mas-Manchón, Lluís, Frederic Guerrero-Solé, Xavier Ramon, and Laura Grande. “Patriotic Journalism in Fake News Warfare: El País’ Coverage of the Catalan Process.” The Political Economy of Communication 8, no. 2 (2021). http://www.polecom.org/index.php/polecom/article/view/123.
  • Maweu, Jacinta Mwende. “‘Fake Elections’? Cyber Propaganda, Disinformation and the 2017 General Elections in Kenya.” African Journalism Studies 40, no. 4 (2019): 62–76.
  • Mejias, Ulises A, and Nikolai E Vokuev. “Disinformation and the Media: The Case of Russia and Ukraine.” Media, Culture & Society 39, no. 7 (2017): 1027–42.
  • Monsees, Linda. “‘A War against Truth’ - Understanding the Fake News Controversy.” Critical Studies on Security 8, no. 2 (May 3, 2020): 116–29.
  • Morgan, Susan. “Fake News, Disinformation, Manipulation and Online Tactics to Undermine Democracy.” Journal of Cyber Policy 3, no. 1 (2018): 39–43.
  • Neo, Rick. “When Would a State Crack Down on Fake News? Explaining Variation in the Governance of Fake News in Asia-Pacific.” Political Studies Review (2021). doi: https://doi.org/10.1177/14789299211013984.
  • Polletta, Francesca, and Jessica Callahan. “Deep Stories, Nostalgia Narratives, and Fake News: Storytelling in the Trump Era.” In Politics of Meaning/Meaning of Politics: Cultural Sociology of the 2016 U.S. Presidential Election, edited by Jason L. Mast and Jeffrey C. Alexander, 55–73. Cultural Sociology. Cham: Springer International Publishing, 2019.
  • Renz, Bettina. “Russian Military Capabilities after 20 Years of Reform.” Survival 56, no. 3 (2014): 61–84.
  • Roberts, Margaret E., Brandon M. Stewart, Dustin Tingley, and Edoardo M. Airoldi. “The Structural Topic Model and Applied Social Science.” Paper presented at the NIPS 2013 Workshop on Topic Models: Computation, Application, and Evaluation, 2013. https://scholar.princeton.edu/files/bstewart/files/stmnips2013.pdf.
  • Roberts, Margaret E., Brandon M. Stewart, Dustin Tingley, Christopher Lucas, Jetson Leder-Luis, Shana Kushner Gadarian, Bethany Albertson, and David G. Rand. “Structural Topic Models for Open-Ended Survey Responses.” American Journal of Political Science 58, no. 4 (2014): 1064–82.
  • Roberts, Margaret E., Brandon M. Stewart, and Dustin Tingley. “stm: An R Package for Structural Topic Models.” Journal of Statistical Software 91, no. 1 (2019): 1–40.
  • Ross, Andrew S., and Damian J. Rivers. “Discursive Deflection: Accusation of ‘Fake News’ and the Spread of Mis- and Disinformation in the Tweets of President Trump.” Social Media + Society 4, no. 2 (2018). doi: https://doi.org/10.1177/2056305118776010.
  • Saurwein, Florian, and Charlotte Spencer-Smith. “Combating Disinformation on Social Media: Multilevel Governance and Distributed Accountability in Europe.” Digital Journalism 8, no. 6 (July 2, 2020): 820–41.
  • Sinovets, Polina, and Bettina Renz. “Russia’s 2014 Military Doctrine and beyond: Threat Perceptions, Capabilities and Ambitions.” NATO Research Papers. Rome: NATO Defense College, July 2015. https://www.ndc.nato.int/news/news.php?icode=830.
  • Smith, Christopher A. “Weaponized Iconoclasm in Internet Memes Featuring the Expression ‘Fake News.’” Discourse & Communication 13, no. 3 (2019): 303–19.
  • Stritzel, Holger, and Sean C. Chang. “Securitization and Counter-Securitization in Afghanistan.” Security Dialogue 46, no. 6 (2015): 548–67.
  • Tan, Netina. “Electoral Management of Digital Campaigns and Disinformation in East and Southeast Asia.” Election Law Journal: Rules, Politics, and Policy 19, no. 2 (2020): 214–39.
  • Tong, Chau, Hyungjin Gill, Jianing Li, Sebastián Valenzuela, and Hernando Rojas. “‘Fake News Is Anything They Say!’ — Conceptualization and Weaponization of Fake News among the American Public.” Mass Communication and Society 23, no. 5 (2020): 755–78.
  • Tucker, Joshua, Andrew Guess, Pablo Barberá, Cristian Vaccari, Alexandra Siegel, Sergey Sanovich, Denis Stukal, and Brendan Nyhan. “Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature.” Hewlett Foundation (blog), March 19, 2018. https://hewlett.org/library/social-media-political-polarization-political-disinformation-review-scientific-literature/.
  • Wallach, Hanna M. “Topic Modeling: Beyond Bag-of-Words.” In Proceedings of the 23rd International Conference on Machine Learning, 977–84. ICML ’06. New York, NY, USA: Association for Computing Machinery, 2006. https://doi.org/10.1145/1143844.1143967.
  • Wang, Hongning, Duo Zhang, and ChengXiang Zhai. “Structural Topic Model for Latent Topical Structure Analysis.” In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Portland, Oregon, June 19-24, 2011.
  • Williams, Michael C. “Words, Images, Enemies: Securitization and International Politics.” International Studies Quarterly 47, no. 4 (2003): 511–31.
All Azimuth: A Journal of Foreign Policy and Peace
  • ISSN: 2146-7757
  • Publication frequency: 2 issues per year
  • Founded: 2012
  • Publisher: Center for Foreign Policy and Peace Research, İhsan Doğramacı Peace Foundation