NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest


Photo By Petty Officer 1st Class Tom Tonthat | Computer Science doctoral student U.S. Navy Cmdr. Edgar Jatho, left, and his advisor Assistant Professor Joshua A. Kroll are the winners of the U.S. Naval Institute 2021 Information Warfare Essay Contest with their award-winning piece, “Artificial Intelligence: Too Fragile to Fight?” (U.S. Navy photo by MC2 Tom Tonthat)

MONTEREY, CA, UNITED STATES

Story by Petty Officer 2nd Class Tom Tonthat, Naval Postgraduate School




Public domain.

This work, NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest, by PO1 Tom Tonthat, identified by DVIDS, must comply with the restrictions shown on https://www.dvidshub.net/about/copyright.


MC2 Tom Tonthat | February 15, 2022


U.S. Navy Cmdr. Edgar Jatho, a doctoral student in the Naval Postgraduate School (NPS) Department of Computer Science, and his advisor Assistant Professor Joshua A. Kroll have been named the winners of the U.S. Naval Institute (USNI) 2021 Information Warfare Essay Contest for their piece, “Artificial Intelligence: Too Fragile to Fight?”

Jatho and Kroll will be honored this week at an awards ceremony during WEST 2022, a large naval conference and exposition in San Diego, Feb. 16-18. In addition, their award-winning essay has been published in the February issue of USNI’s Proceedings.

The essay cautions against overreliance on AI and raises awareness of potential issues and exploits that can undermine its effectiveness in the field. Jatho was inspired to write the essay for the annual USNI contest while taking Kroll’s Trustworthy and Responsible AI course in 2020.

“The course involved reading leading thinkers across disciplines about AI, automation and algorithms,” said Jatho. “It brought up the challenges and difficulties in implementing safe and ethical systems that leverage the technology … Now that we have this big impetus by the DOD to adopt artificial intelligence and machine learning into technology on all kinds of different levels and solutions, it’s easy for us to forget some hard-won lessons.”

In addition to his NPS coursework, Jatho was inspired by one of the presenters in NPS’ longstanding Secretary of the Navy Guest Lecture (SGL) series. During his October 2021 lecture on the future of warfare, retired U.S. Navy Adm. James Stavridis cautioned that the military’s overreliance on advanced technology can leave it vulnerable to massive disruption.

“Dependence on a new technology like cyberspace, artificial intelligence or nanotechnology will enable you to move forward,” said Stavridis during the SGL. “But does it create an Achilles’ heel? Often it does.”

“Based on all of the research that I’ve been reading of what’s possible, it really got me to think,” continued Jatho. “[AI] can be a very complex and difficult problem because you can find support that says it’s doing a great job. Then suddenly when it gets to the battlefield, you find that it’s extraordinarily brittle and easy to break.”
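The brittleness Jatho describes has a concrete, well-documented form in adversarial examples: tiny, targeted input perturbations that flip a model’s output. The sketch below is a hypothetical illustration, not code from the essay; the classifier, weights, and feature values are all invented for demonstration, and the attack is a fast-gradient-sign-style perturbation applied to a toy linear model.

```python
# Hypothetical illustration (not from the essay): a toy linear classifier
# that looks reliable on clean input yet flips its decision under a small
# adversarial nudge -- the "brittle and easy to break" failure mode.
# All weights and feature values below are invented for demonstration.

def predict(weights, x):
    """Linear score: positive -> classify as 'friendly', negative -> 'hostile'."""
    return sum(w * xi for w, xi in zip(weights, x))

def perturb(weights, x, eps):
    """Fast-gradient-sign-style attack on a linear model: shift every
    feature by eps in whichever direction lowers the score."""
    return [xi - eps * (1.0 if w > 0 else -1.0) for w, xi in zip(weights, x)]

weights = [0.4, -0.2, 0.1, 0.3]
x_clean = [0.5, 0.1, 0.2, 0.4]           # confidently scored as 'friendly'

score_clean = predict(weights, x_clean)  # about 0.32, comfortably positive
x_adv = perturb(weights, x_clean, eps=0.4)
score_adv = predict(weights, x_adv)      # score drops by eps * sum(|w|) = 0.4

print(score_clean > 0, score_adv > 0)    # the decision flips
```

For a linear model the attack is exact: the score falls by eps times the L1 norm of the weights, so even a modest perturbation can cross the decision boundary. This is the pattern the essay warns about, in which a system that evaluates well on curated test data fails abruptly against an adversary shaping its inputs.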

Jatho credits the many resources at NPS supporting his education and research, and the opportunities to apply his studies, for his winning piece. Leading among them is Kroll, Jatho’s advisor and co-author of the essay.

“I’m quite proud that we in NPS can be in this position of thought leadership for the Navy and we can do the work to really think strategically,” said Kroll. “That’s something that I don’t think comes as naturally from other institutions that aren’t as focused on defense-oriented problems.”

By publishing the winning essay, USNI presents Jatho and Kroll’s cautions about AI as part of its mission to advance the professional, literary and scientific understanding of sea power and other issues critical to global security.

It is critical for NPS faculty and students to contribute their work to these kinds of leading venues, Kroll says, that collect and distribute these ideas to the naval and defense community.

“They’re a trusted voice in this DOD community,” he says, “and I hope we can have an impact with how people can make our naval capabilities support the execution of the mission in a more robust and trustworthy way.”

Jatho continues to pursue his Ph.D. in Computer Science at NPS, and is slated to teach at the Naval Academy following graduation as part of the Permanent Military Professor program.

MEDIA CONTACT  

Office of University Communications
1 University Circle, Monterey, CA 93943
(831) 656-1068
https://nps.edu/office-of-university-communications
[email protected]


In This Article: Information Warfare

  • Introduction
  • General Overviews
  • World War II (1939–1945)
  • Cold War to Present (1946–2013)
  • Psychological Warfare
  • Cyber Warfare
  • Electronic Warfare


Information Warfare
by Robert R. Mackey
LAST REVIEWED: 27 March 2014
LAST MODIFIED: 27 March 2014
DOI: 10.1093/obo/9780199791279-0024

Information warfare is a generally Western, late-20th-century military term that encompasses a wide range of non-kinetic forms of human conflict. Before the emergence of modern communications technology in the early 20th century, information warfare included only fields such as misinformation, propaganda, and deception. The invention of radio added the new field of electronic warfare, as adversaries attempted to jam, fool, and monitor each other’s military efforts in this new domain. The invention of the microchip, which in turn led to the practical use of computers on the battlefield, ultimately added cyber warfare to the list. These seemingly disparate topics were then joined together under information warfare.

Information warfare, while a relatively new doctrinal term in the military lexicon, is as old as warfare itself. The Trojan horse of Homer’s The Iliad is one of the best-known examples of classical information warfare in literature, but military history is filled with non-fictional examples. As recorded in Sun Tzu 1971, the ancient Chinese military theorist and philosopher held that “all warfare is deception,” in essence stating that warfare itself is based on the use or misuse of information, as well as military prowess. In the 20th and 21st centuries, the nature of information warfare further evolved, especially in areas involving mass communications, radio and electronic communications technology, and the application of marketing techniques to influence specific and general audiences. David and McKeldlin 2009 notes how the spread of global news and reporting has created an environment in which both battlefield commanders and political leaders are held accountable for actions that only a few generations before would have been ignored or suppressed. The melding of fields that had been distinctly separate in World War II, such as propaganda, deception, and electronic warfare (such as jamming enemy radio signals), has become more unified in both doctrine and practice, at least as seen in Macdonald 2007. Luckily for scholars and students in the field, several fine works exist, such as Armistead 2007 and Paul 2008, that greatly aid in the understanding of the nature of 21st-century information warfare.

Armistead, Edwin Leigh. Information Warfare: Separating Hype from Reality. Washington, DC: Potomac, 2007.

Armistead’s collection of essays on the role of information operations and warfare as a distinct and critical part of national power is a solid basis for understanding the nature of information age conflict and conduct. Written primarily for practitioners, the work attempts to remove many of the myths that have evolved around information warfare.

David, G. J., and T. R. McKeldlin, eds. Ideas as Weapons: Influence and Perception in Modern Warfare. Washington, DC: Potomac, 2009.

An anthology of essays from practitioners of modern information warfare, Ideas as Weapons is an excellent introduction to the subject. From the grand strategic view, to tactical applications of information, the work encompasses a variety of topics. The work itself is a product of its era; most essays focus on US/Western operations in Iraq and Afghanistan in the mid- to late-2000s, with little focus outside of that experience.

Macdonald, Scot. Propaganda and Information Warfare in the Twenty-First Century. New York: Routledge, 2007.

Macdonald’s short book bridges the gap between Cold War-focused studies of the 1980s and the challenges of information post-2000. His focus is primarily on the modification and presentation of altered images in modern warfare, given the 24-hour news cycle, analyst information overload, and continued proliferation of the media culture in Western nations.

Paul, Christopher. Information Operations: Doctrine and Practice: A Reference Handbook. New York: Praeger, 2008.

The best single-volume introduction to modern Western information operations (IO), Paul’s work examines all the elements (deception, psychological operations, cyber and electronic warfare) of modern IO. This work is often used by Western militaries as a basic textbook on the subject; it is clearly written and based on solid post-9/11 battlefield experience.

Sun Tzu. The Art of War. Translated by Samuel B. Griffith. New York: Oxford University Press, 1971.

Nearly all modern concepts of information warfare trace their roots back to The Art of War . Sun Tzu’s famous quote, “All warfare is deception,” has been repeated by war colleges across the globe as a basic truism of military doctrine. However, a modern reader should be aware that much of the work addresses the times in which it was written and has only limited application to an information-saturated modern world. Numerous editions, translations, and versions.



Information Warfare as Future Weapon of Mass-disruption, Africa 2030s Scenarios

Journal of Futures Studies, September 2018, 23(1): 77–94

DOI:10.6531/JFS.201809_23(1).0006


Rianne van Vuuren, Institute for Futures Research South Africa

Information warfare is an emerging threat which is developing into a significant future global security challenge, especially as the relationship between information and power is strengthened. Futures studies generate foresight about the manifestation of information warfare in the 2030s as an upcoming national security threat in Africa. The four scenarios developed provide plausible futures which offer early warning insights on the manifestation of information warfare as a national security threat confronting Africa during the 2030s. Polarisation poses a significant future risk in terms of leveraging information warfare as a future weapon of mass-disruption.

Keywords: Information warfare, Cyber warfare, Netwar, Psychological operations, Futures studies, Environmental scanning, Foresight, Scenario study, 2030.

Introduction

In an age where information is increasingly positioned at the centre of society, a future in which information also becomes a security liability is a reality for Africa and the world at large. Information in all its manifestations – from data to wisdom – has been central to institutional power since the dawn of humankind. Technological efficacies brought about by the digital revolution and enhanced global networking activities have boosted the role of information in national power. Furthermore, technological development has catapulted information to the centre of most human endeavours, although high levels of inequality persist, as manifested in the digital divide.

Information warfare is a subject intimately linked to the future because it is closely related to technology futures as well as the changing manifestation of warfare and conflict. Futures thinking is based on three interrelated inquiries into the future, with the objective of creating broad awareness about the future. The inquiries are measuring the future to obtain knowledge about the future; imagining the non-existing future; and purposefully designing or making the future. Measuring, imagining and making sustainable alternative futures should be the preferable outcome of holistic futures thinking, and this requires active interventions to realise (Spies, 2015). In this article, the focus is on the measuring and imagining dimensions, but it also provides insight regarding the design of countering information warfare futures in Africa by the 2030s. Thus, a futurist perspective does not entail prediction of the future. Instead, the futurist strives to provide insight and foresight with the aim of promoting knowledgeability which could assist in creating a preferable future.

The following key questions are focused on:

  • What is information warfare, and what is its link to power?
  • What are the main driving forces which will influence the shaping of the future of information warfare as a national security threat for Africa?
  • What are the plausible scenarios in which information warfare would manifest as a national security threat for Africa?
  • What propositions can be identified applying to the plausible information warfare threats against African governments and societies in the 2030s?

Information and Power: the Growing Synergy and Implications for National Security

Before the scenarios which the African population and governments could plausibly be expected to confront can be presented, it is important to focus on what exactly constitutes information warfare. Generally, the term information warfare is still associated with high-technology weapons and broadcast images of drones destroying military targets with apparently assured accuracy, and of computer hackers taking down a country’s power grid by gaining control of the power supplier’s mainframe computers. Unfortunately, this armchair view of the sometimes confusing capabilities made possible by high technology and information technology has created a simplistic and sanitised vision of information warfare in which, to paraphrase Toffler (1990), the mindless fist is replaced by the congealed mind.

The media’s initial focus on guided missiles and intelligent warfare systems, the tangible elements of the so-called digital battlespace, masked the potentially deeper societal implications of virtual warfare strategies and global power projection (Cronin & Crawford, 1999, p.257). Increasingly, cyber security and cyber warfare are becoming a major threat narrative with a wide range of applications in the fields of crime, business and politics (Carr, 2012, pp.1-5). Of specific interest for further study is the additional intangible role that information and communication play in terms of success in this new, unfolding conflict environment. It is suspected that this aspect might become a major determinant of future political and economic supremacy.

Information is increasingly linked to power, and how a government uses that power increasingly determines how effectively a country can influence world politics and national security. In the past, the elements of power included mainly military, economic and diplomatic factors. However, in the 21st century, information is rapidly assuming a key position in foreign and security policy. It can potentially fulfil many roles, such as being a force multiplier, a tool for influencing decision-making and/or an instrument for manipulation. Information has evolved into a significant power projection instrument for the state. However, as much as it presents an offensive power projection capability to the state, it also poses a potentially momentous threat to the state and society in general (Armistead, 2004, p.231).

The constantly changing national security environment is linked to the shifting basis of state power. As the foundation of global civilisation evolved from agriculture to industry and then to the information sector, the power structures within states also changed. At its core, the transformation to an information-based society represents a shift from manufacturing to knowledge, where the creation, application and dissemination of knowledge, rather than the production of manufactured goods or agricultural products, is becoming the central defining activity of modern society and governance (Mazarr, 1997, p.25). This shift has a direct impact on how national security is being viewed by some governments. At the same time, there has been an increase in the number of countries studying innovative ways to endeavour to gain an advantage by changing the way in which conflict is managed and power is projected. The information society brings new and revolutionary technologies and means, which demand change in the way state security is managed (Lin, 2000). This has a broad impact on modern society, changing risk and threat analysis in most human endeavours.

Regardless of shifting global power structures, a coherent national security strategy is an important instrument for any state. All states, even those with limited resources, have a broad range of tools at their disposal to advance their interests. These tools, whether diplomatic, economic, informational or military, provide the means by which they seek to achieve their security objectives. A national security strategy provides a rational framework for specifying interests in a comprehensive and methodological way (Africa Centre for Strategic Studies, 2005, p.1).

While many governments have developed strategic security frameworks as national security strategies, this remains largely limited to developed countries. Even where countries address information-related threats, the focus remains largely limited to cyber security. However, the threat from the information environment is broader than the cyber dimension. The pervasiveness of information in modern society makes it a key factor in the construction of a global information society (Chadwick, 2006, p.209). The African Union's Agenda 2063 foresees an integrated, prosperous and peaceful Africa, driven by its own citizens and representing a dynamic force in the international arena (African Union Commission, 2015, pp.1-2). Already in the 1990s information warfare was identified as a significant future national security threat (Waltz, 1998, p.13). A significant question is whether the information dimensions, especially in terms of their comprehensive implications for national security, are adequately addressed by an emerging continental power such as Africa.

Finding a practical definition for information warfare has been a challenge, but it is a necessity for the development of information warfare futures in Africa by the 2030s.

Defining Information Warfare

While a significant amount of work has been done to define information warfare and related concepts, this has taken place mainly in the developed world. In general, information warfare has not been regarded comprehensively as a significant part of the national security threat perception in the developing world, especially in Africa. Larger developing countries such as China (Cheng, 2017, p.1) and India (Sekhar, 2015) are exceptions.

In this article the term information warfare is used in its contemporary and futurist contexts, although it is acknowledged that the phenomenon has strong historical links. While information warfare is a recent concept that has only been in use since the early 1990s, diverse and sometimes even contradictory definitions have complicated the study of this phenomenon. Existing definitions of information warfare have limitations: they are either too expansive, focus purely on US military-centric concerns, or remain limited largely to attacks on the ICT infrastructure and capacities of countries and/or entities (Arquilla & Ronfeldt, 2001; Denning, 1999; Ventre, 2009 & 2011).

While some aspects related to information warfare are as old as humankind, many aspects of how it is applied in our contemporary information-driven world are new (Jones, Kovacich & Luzwick, 2002, p.5). In an effort to address the limitations of current definitions, the following definition of information warfare is proposed: Information warfare is defined as actions focused on destabilising or manipulating the core information networks of a state or entity in society with the aim to influence the ability and will to project power, as well as efforts to counter similar attacks by an opposing entity and/or state (van Vuuren, 2016, p.77).

The above definition encompasses three manifestations of information warfare, namely netwar, psychological operations and cyber warfare.

  • Netwar is described by Arquilla and Ronfeldt (1997) as referring “… to an emerging mode of conflict at societal levels, involving measures short of traditional war, in which the protagonists use network forms of organization and related doctrines, strategies, and technologies attuned to the Information Age. These protagonists are likely to consist of dispersed small groups who communicate, coordinate, and conduct their campaigns in an internetted manner, without a precise central command.”
  • Cyber warfare (cyberwar) describes a power-related conflict that takes place in cyberspace (the virtual world and the internet) instead of in the physical world.

  • Psychological operations take place in an intangible sphere: the power-related conflict arena is, in essence, people's minds, and the criteria for winning or losing are heavily culture-dependent (Eriksson, 1999, pp.57-64).


The Fundamentals of Information Warfare

Taking cognisance of information warfare as an overriding term, while also noting the history of information in conflict as well as criticism of existing definitions, the following fundamentals endeavour to identify common elements and significant characteristics of information warfare:

  • Information, especially in a networking capacity, is central to the information warfare concept, with the attainment of information superiority as a tactical and strategic aim.
  • Information warfare refers to the cognitive and technological disruption linked to conflict and war but not to the kinetic aspects associated with war and terrorist activities.
  • Information warfare is linked to using information as instrument for manipulation, power projection, leveraging and creating an advantage.
  • The information revolution and the expansion of global communications strengthen network-orientated organisations and interactions, while consequently weakening hierarchically orientated organisations and interactions (Arquilla & Ronfeldt, 2001, p.1).
  • The global network ecology is transforming itself from a purely communications medium to a social environment of growing political and security significance (Vlahos, 1998, p.77).
  • The exponential growth of technology, globalisation and increasing significance of networking are enhancing the future significance of information warfare.
  • Information warfare is in essence a transdisciplinary concept covering a wide array of interests, including the political, governance, technological, psychological, social, media, economic and military fields.
  • Both offensive and defensive roles are envisaged for information warfare.
  • Information warfare is not bound by geographic limitations.
  • The cost of conducting information warfare would in most cases be much lower compared to other forms of power projection.
  • It is widespread and available to any country, and, in most cases, to any individual or group that wants it (McLendon, 1994). Technological skills barriers exist in the case of cyber warfare.
  • The increasing dual-use nature of especially information technology results in many technologies having both military and civilian applications (Schneier, 2008).
  • Information warfare invokes asymmetric action. Asymmetry is about the qualitative difference in the means, values and style of opposing powers. Once a state or entity insists on superiority in power projection, its disadvantaged opponents resort to unconventional asymmetrical means to oppose it, avoiding its strengths and concentrating on its vulnerabilities (Bishara, 2001).

The Future of Information Warfare

Taking into account the current manifestation of information warfare related issues (from ransomware such as WannaCry affecting hundreds of thousands of users to indications of international digital interference in national elections), governments worldwide are actively investigating and building information warfare capabilities. Most developed states and some developing states have conducted information warfare related exercises and established national monitoring entities (Breene, 2016). At the same time, non-governmental entities are also getting involved in information warfare related activities (Cronin & Crawford, 1999, p.259). Information warfare is a global phenomenon, which makes it difficult if not impossible to evaluate in a purely domestic context, and this will become even more so in the future.

Despite these capacities being created by governments and interest groups, the dynamics of emergent crises and the unintended consequences of information warfare make the control of information-driven dynamics highly problematic. Controlling the potential consequences of information war in terms of its netwar and psychological dimensions will be difficult, if not impossible, in the future. The emergence of information warfare actors other than government-controlled entities remains a high probability.

Protest movements and/or governments are massing force in the infosphere as well as in cyberspace for viral propaganda and debilitating attacks on IT infrastructure. As a result, there is a renewed focus on mass cyber-mobilisation and concentration as part of information warfare. Networked actions, cyber-mobilisation and rhizome organisation underpin popular forms of conflict, not only lower-impact cyber-related activities such as the disruption of websites. These activities require public participation for success and viral, social-media-like organisation (Elkus, 2009).

In the future this form of mobilisation can be expected to become even more effective. The rise of artificial intelligence will probably change the nature of cyberwar in particular. Instead of relying on human hackers to carry out their attacks, antagonists will increasingly automate information warfare, relying on artificial intelligence systems to probe opposing defences, carry out attacks and defend against opponents' artificial intelligence. It is probable that this competition could eventually outstrip human control or even monitoring (Pazvakavambwa, 2018).

In the modern struggle between sovereign countries and non-state actors it also becomes necessary to refer to the information warfare taking place in the new and traditional media as well as on other technological platforms, from the internet to virtual reality and computer games. Such groups, including terrorist organisations, continue to invest in information warfare tools, which enable them to bridge the physical gap between themselves and conventional law enforcement and security forces. Some of these entities' irregular capacities will probably outstrip the competencies of states in this regard (Gilat, 2009).

These largely asymmetric warfare capabilities, which include information warfare, are empowering even the individual to conduct war. While the concept of asymmetric warfare dates back to ancient times, modern conflicts have redefined the nature of such struggles. As the manifestation of information warfare indicates, warfare is being transformed from a closed, state-sponsored affair to one where the means and know-how to do battle are readily found on the internet and social networks. This open and global access to increasingly powerful technological tools is in effect allowing small groups to declare war on states. Insurgent groups can be expected to increasingly form loose and non-hierarchical networks to pursue a common vision. United by that vision, they exchange information and work collaboratively on tasks of mutual interest (Charette, 2007).

Outcome of an Environmental Scan

In measuring and imagining the future of information warfare for Africa, an environmental scan was used to evaluate the current milieu within which information warfare manifests, while also providing some insight into the driving forces that will influence how it might manifest in the future. The environmental scan focused on events, developments and manifestations related to information warfare and national security within the Technological, War/Conflict, Economical, Political and Social (TWEPS) macro-environmental hexagon, which is used instead of the frequently used STEEP (Social, Technological, Economic, Environmental and Political) sectors (Kurian & Molitor, 1996, p.814). This multi-disciplinary environmental scan focused on literature with the aim of identifying current manifestations and recognising the possible trends and driving forces flowing from them (van Vuuren, 2016, p.77). See Figure 2 (adapted from Spies, 2005) for a graphical representation of the TWEPS macro-environmental hexagon.

[Figure 2: The TWEPS macro-environmental hexagon]

The outcome of the environmental scan focuses on three overriding trends present in all environments. Transformation, networking and the impact of technological innovation were highlighted in all the environments investigated as central to the manifestation of information warfare, both currently and in the future. These trends influence not only the entities involved in power relations in society, but also enhance the potential influence and power of small and marginalised entities. New forms of network-related action, such as the rhizome phenomenon (small, highly interconnected networks) using social network phenomena for cyber-mobilisation, have been identified as of particular use for information warfare in the future. For information warfare activities in line with the identified trends to accomplish any outcome, they should be focused on specific targets in the information society.

Information warfare targets are multi-dimensional, encompassing both tactical and strategic targets. The environmental scan highlighted the significance of growing interdependence and globalisation for economic prosperity, exposing commercial networks and the global service sector as highly vulnerable to all constituent elements of information warfare, namely netwar, cyberwar and psychological operations. Information warfare is primarily targeted at the power structures of a state. These structures are part of the complex and inter-related processes and services underlying the information structures in society. In this regard, see Figure 3 for an illustration of the pillars of an information society. The key vulnerable access points include the factors underlying the networking, coordination, integration, compatibility and connectivity of the information and knowledge society.

[Figure 3: The pillars of an information society]

Based on an evaluation of the past manifestation of information warfare, it can be concluded that practically all recent conflict situations have had an information dimension (van Vuuren, 2016, pp.116-124). While information warfare enhances military power, especially in developing countries, it also creates new vulnerabilities. It can be assumed that this trend will continue and that nearly all future conflict situations will have an information warfare dimension.

Social media and the internet/mobile device platforms that carry such media empower most net-enabled individuals with an interest in participating in practically any global issue (van Vuuren, 2016, p.262). This has created platforms for involvement and participation in social and political issues on a level never seen before in the history of humankind. Increasingly, it matters less what the majority's view on issues is; it matters more what the majority of “empowered individuals” are doing.

Identification of the Scenario Drivers

The scenario drivers are identified by applying qualitative text analysis to the outcome of the environmental scan, deploying coding techniques assisted by software. Coding can be done using structured or unstructured data as source material; the best results are obtained by qualitative text analysis of documentary source material (Kuckartz, 2015).

Kuckartz (2014, p.33) states that qualitative text analysis is a form of analysis in which the understanding and interpretation of the text play a far greater role than in classical content analysis, which is more limited to the so-called manifest content. In qualitative text analysis, codes are typically words or devices used to identify themes. As the research focuses on theme-related issues associated with the current and future manifestation of information warfare, thematic qualitative text analysis is applied to the environmental scan's text in its entirety, identifying the commonly occurring codes.

After studying the text, summative, essence-capturing codes are identified that are applicable to all text in the environmental scan narrative. Three such overarching codes have been identified. The three coding concepts overlap to some extent, but for each paragraph the essence-capturing categories and then subcategories are conceptualised. The source of these thematically coded categories, identified as Innovation, Networks and Transformation, is the three cross-cutting trends identified from the environmental scan.

Firstly, the rapid spread of technology and innovation has a major impact on states, organisations and individuals, while also contributing to significant inequality. Secondly, the world community has reached a new level of integration, accompanied by the rise of networks, especially social networks. Thirdly, societal change has accelerated to new levels, making transformation a constant reality affecting nearly all social entities. The outcome of the environmental scan was coded using the Coding Analysis Toolkit (CAT) software developed by the University of Pittsburgh's Qualitative Data Analysis Program (QDAP) (University of Pittsburgh, 2015).
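The thematic coding step described above can be illustrated with a minimal sketch. The actual codebook used with the CAT software is not published here, so the keyword sets below are purely hypothetical; only the mechanism (counting keyword hits per thematic code in a paragraph of scan text) is shown:

```python
from collections import Counter

# Hypothetical keyword sets for the three thematic codes identified in the
# article (Innovation, Networks, Transformation); invented for illustration.
CODEBOOK = {
    "Innovation": {"technology", "innovation", "artificial", "automation"},
    "Networks": {"network", "networks", "networked", "social", "cyber"},
    "Transformation": {"transformation", "change", "shift", "revolution"},
}

def code_paragraph(paragraph: str) -> Counter:
    """Count how often each thematic code's keywords occur in a paragraph."""
    words = [w.strip(".,;:()").lower() for w in paragraph.split()]
    counts = Counter()
    for code, keywords in CODEBOOK.items():
        counts[code] = sum(1 for w in words if w in keywords)
    return counts

scan_text = ("The information revolution and global networks accelerate "
             "transformation and technological innovation in society.")
print(code_paragraph(scan_text))
```

In a real thematic analysis the coder assigns categories by interpretation rather than keyword matching; the sketch only shows how coded frequencies can be tallied once a codebook exists.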

The value of the identified driving forces is increased when they are scrutinised, integrated, prioritised and validated by panels of global and local experts knowledgeable about information within the TWEPS environments. For this purpose two Delphi studies were conducted, as the most appropriate method for arriving at expert validation of, and consensus on, the key driving forces impacting the manifestation of information warfare as a national security threat by the 2030s. The first Delphi study used South African security experts, while the second used both domestic and international experts knowledgeable in one or more TWEPS environments. Input from this multi-disciplinary pool of experts enhanced the value of the analysis already done during the environmental scan and provided additional input and insight for the development of scenarios.

The two Delphi studies refined and validated the ten most significant drivers influencing the manifestation of information warfare as a national security threat in the 2030s:

  • The centre of power is shifting from the traditional developed countries to the developing countries.
  • Security in a networked environment will increase in complexity as its physical and non-physical elements become more tightly interwoven.
  • An increase in integration and polarisation will contribute to systemic stresses.
  • Information warfare will become a growing option for power projection.
  • Symbolic, information-related phenomena are increasingly impacting behaviour.
  • The speed of change is increasing exponentially with global communication becoming instantaneous.
  • Non-state actors are increasing their influence related to global security.
  • Global and intra-regional inequalities are stimulating conflict potential.
  • Information communication technology (ICT) is embedding itself as a crucial part of society.
  • Social media is a significant part of communication and this is expected to grow in the future.

The two most significant driving forces for scenario building were obtained by combining the prioritisations of the two Delphi studies, creating the two key drivers, namely:

  • ICT is embedding itself as a crucial part of society.
  • An increase in integration and polarisation will contribute to systemic stresses.
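The combination of the two Delphi prioritisations can be sketched as a simple rank-aggregation step. The numeric ranks below are invented for illustration (the article does not publish the panels' prioritisations); only the method of averaging the two panels' ranks and selecting the drivers with the lowest averages is shown:

```python
# Hypothetical ranks (1 = most significant) from the two Delphi panels,
# shown for three of the ten drivers only; values are assumptions.
panel_a = {"ICT embeddedness": 1, "Integration/polarisation": 3,
           "Power shift to developing countries": 2}
panel_b = {"ICT embeddedness": 2, "Integration/polarisation": 1,
           "Power shift to developing countries": 4}

def key_drivers(a: dict, b: dict, n: int = 2) -> list:
    """Average the ranks from both panels and return the n best drivers."""
    combined = {driver: (a[driver] + b[driver]) / 2 for driver in a}
    return sorted(combined, key=combined.get)[:n]

print(key_drivers(panel_a, panel_b))
```

With these assumed ranks the two lowest combined averages are ICT embeddedness and integration/polarisation, matching the two key drivers used to build the scenario matrix.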

Africa Information Warfare Scenarios 2030s

The vertical axis of the scenario matrix represents the spectrum measuring the extent to which ICT embeds itself as a crucial part of society. The upper end of this spectrum represents an environment in which ICT is highly embedded in society, resulting in high connectivity; the lower end represents an environment in which ICT is poorly embedded, resulting in lower connectivity. The horizontal axis represents the spectrum measuring the extent to which integration and polarisation contribute to societal systemic stresses. The right-hand side of this axis represents an environment in which society experiences high levels of integration; the left-hand side represents an environment experiencing high levels of polarisation. Four information warfare scenarios for Africa in the 2030s are thereby created (see Figure 4).

[Figure 4: Africa information warfare scenarios for the 2030s]

Rationale for the scenario names

Metaphors from African traditional religion and myth are adopted for the identified scenarios with the aim of advancing creative thinking about the content of the scenarios and setting them within an African milieu.

Shango is from Yoruba tradition in Nigeria. As an earth deity he was once a mortal man, the king of Oyo, who transformed himself into an immortal. According to tradition, during his life he breathed tongues of fire. He then ascended into the sky by climbing a golden chain and became the god of thunder and lightning. He is also god of justice, punishing thieves and liars (Jordan, 2004, p.282). Shango Rejuvenated illustrates a scenario in which a high level of ICT embeddedness and high levels of integration boost assimilation, cooperation and technological advancement but also provide the ideal staging area for the use of information warfare as power projection method.

Gaunab is the malevolent god of darkness from the Khoi culture in Namibia. This deity is the chief adversary of the creator god Tsunigoab. He was engaged in a primordial struggle for supremacy during which Tsunigoab was wounded but eventually triumphed, consigning Gaunab to the so-called “dark heaven” (Jordan, 2004, pp.102-103). Gaunab Rising does not necessarily reflect the rise of authoritarian societies, but rather highlights the possible consequences of international, regional and even national polarisation in an environment in which high levels of ICT embeddedness will enhance the capabilities of all the role-players in society.

Inkanyamba is a Zulu storm god from South Africa. The deity is specifically responsible for tornados and is perceived as a huge snake coiling down from heaven to earth (Jordan, 2004, p.139). Inkanyamba Reduced reflects a polarised environment in which the embeddedness of ICT remains at a low level. Although the technological element of information warfare, such as cyberwar, could be constrained, high levels of potential conflict are experienced.

Tsunigoab is the creator god in the Khoi tradition in Namibia. Tsunigoab walks with a limp, because of an injury sustained in a primordial battle with his arch rival Gaunab, the god of darkness, who was eventually driven away to live in the dark heaven. Tsunigoab is invoked at dawn each day (Jordan, 2004, p.323). Tsunigoab Revived reflects a scenario in which a high level of social, political and economic integration is achieved but this is not supported by a high level of ICT embeddedness in society overall, resulting in general stability but difficulty in maximising technological opportunities in society.
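The quadrant logic of the scenario matrix can be expressed as a small mapping from the two key drivers to the four named scenarios; this is a sketch of the 2x2 matrix described in this article, with the axis thresholds abstracted to booleans:

```python
def scenario(high_ict: bool, integrated: bool) -> str:
    """Map the two key drivers to one of the four Africa 2030s scenarios."""
    if high_ict and integrated:
        return "Shango Rejuvenated"   # high ICT embeddedness, high integration
    if high_ict and not integrated:
        return "Gaunab Rising"        # high ICT embeddedness, polarisation
    if not high_ict and not integrated:
        return "Inkanyamba Reduced"   # low ICT embeddedness, polarisation
    return "Tsunigoab Revived"        # low ICT embeddedness, high integration

print(scenario(True, True))   # Shango Rejuvenated
```

In practice each driver is a spectrum rather than a boolean, but the mapping captures which combination of the two drivers defines each scenario quadrant.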

Scenario 1: “Shango Rejuvenated”

The first scenario is one in which technological progress takes place in an increasingly cooperative and networked global society. This scenario is formed in the quadrant where both of the two main driving forces – ICT embeddedness and societal integration – are high. This provides for a highly technology-driven African society in which consumer and business demand for technological solutions is elevated. Information warfare is seen as a common but also practical and useful instrument for the projection of power by many entities within Africa. Governments and other prominent local and international role-players force commercial values on the continent, creating a largely homogeneous landscape, with mainstream diversity decreasing and some sub-cultures forced to the margins. Information warfare is regarded by many domestic and foreign role-players as a legitimate method for dissent and conflict. Governments, and even larger corporate entities, invest significant resources in both defensive and offensive capabilities.

Shango Rejuvenated: Information Warfare and National Security Manifestation

The high levels of social cohesion play a role in limiting both international as well as domestic conflict and war. However, competition, especially competition leading to conflict, remains common. Because of the high level of ICT embeddedness in African society, information warfare remains a useful as well as viable instrument of influence and power projection. Many African states continue to survive challenges against their legitimacy. However, powerful non-state entities such as corporations and ideological or religious groups continue to challenge the sovereignty of many African states.

As interconnected high technology, and especially the digital economy, are central to stability and wealth creation in Africa, societies are highly vulnerable to information warfare. It occurs on all levels and in all of its manifestations: cyberwar, netwar and psychological operations. Information warfare is regarded as a major national security issue in most African states. It is also being developed by some governments and other entities as an offensive power projection tool. At the same time, counter-measures against threats are highly specialised and evolving fast. Networked security is seen in Africa as of major significance and substantial resources are invested in it.

The relevance of information warfare is strengthened by the systemic nature of conflict resulting from the high levels of ICT embeddedness. Conflict is highly complex, with growing interaction between technological change, system development and operational innovation. Within Africa the deployment of autonomous weapons and security systems is common. Asymmetric action forms part of conflict in Africa, but is somewhat restrained because of high levels of global integration and the focus on multi-level defensive capabilities.

Africa’s multi-lateral security cooperation is strengthened and assists in the overall maintenance of order in the international system. Africa is exposed to less inter-state conflict, while the occurrence of intra-state conflict is increasing, with information warfare increasingly becoming the method of choice for dissent. The added value of the high level of anonymity provided by technology, especially in cyberwar, is exploited by non-government institutions and governments alike. Globally, the borders between military and civilian conflict weaken, partly because of the proliferation of power projection opportunities brought about by information warfare options. This leads to the expansion of a technological arms race between governments and non-government groups. Despite these information warfare related challenges, a balance of power exists within Africa, as well as between African countries and foreign competitors, as counter-measures are largely in place or are developed quickly.

Terrorism presents a threat in Africa, increasingly also in its non-physical (information warfare) dimension. Crime and terrorism are increasingly combated by way of algorithms and big data. In Africa, crime and terrorism groups use advanced technology that is largely matched by the advanced technology of the government entities opposing them. The high levels of integration in Africa also expand the reach and innovativeness of disruptive and terrorist means.

Scenario 2: “Gaunab Rising”

The second scenario is one in which technological progress takes place in an increasingly polarised society. This scenario is formed in the quadrant where the two main driving forces – ICT embeddedness and polarisation – are high. While technological progress and technological interconnectedness are high, their potential cannot be realised because of divisions in society. This scenario represents a divided continent with increasing competition between societies mobilised on the grounds of status, nationality, religion and class, providing opportunities for authoritarian elites to expand control. Information warfare remains rife and its use is expanding on all societal levels. Inequality of capacities could lead to rapidly changing power configurations.

Gaunab Rising: Information Warfare and National Security Manifestation

The nation state is paramount, but non-state entities organised on religious, ideological and national grounds present a growing threat to national security. Full-blown information conflicts and wars are common, with a mix of kinetic and non-kinetic elements. In this regard, virtual cyber-based groups such as Anonymous (or its future successors) pose a major national security threat. Non-state actors regard information warfare as a primary tool to promote and advance their political agendas.

In Africa information warfare thus poses a significant national security risk, and growing ICT embeddedness ensures that states are highly vulnerable, especially to cyberwar. The level of potential hostility in Africa ensures that all methods for promoting interests in a fairly hostile environment are used. Both terrorism and cybercrime are serious local and global threats. Information warfare is regarded as a legitimate instrument of power projection by both the elite and the dissatisfied in Africa. The ruling elites in African countries manipulate and force the rest of the population into submission with whatever means are at their disposal. However, the marginalised are also turning to information warfare as a practical tool to pursue their interests. Conflict becomes highly complex and common, with growing interaction between technological change, system development and operational innovation. Asymmetric war and conflict options are an ingrained part of conflict and war globally.

The broader continental and global environment experiences an increase in inter-state and intra-state conflict. The threat of nuclear war is significant as the non-proliferation system is increasingly eroded. The start of a Second Cold War is a potential risk, especially if authoritarian states expand their assertive stances and military capabilities. This leads to an escalating arms race involving high-technology weapons and even weapons of mass destruction (WMD). Private armies (mercenaries) are significant role-players in these conflicts. Drone warfare is rife and used by many governments and even non-government role-players.

Scenario 3: “Inkanyamba Reduced”

The third scenario is one in which polarisation is high while technological participation is low. This scenario is formed in the quadrant where ICT embeddedness is low but polarisation is high, magnifying dissent and resulting in high levels of conflict and competition for resources. The technological part of information warfare, in the form of cyberwar, is limited, but the cognitive aspects, in the form of netwar and psychological operations, remain high. Elites control resources and inequality is widespread.

Inkanyamba Reduced: Information Warfare and National Security Manifestation

The potential for general anarchy is a significant national security risk, making security a major but also expensive reality in African countries, as in many others. In most cases, only the wealthy in Africa can afford security. The use of information warfare (especially netwar and psychological operations) is widespread. In general, the mix of kinetic and non-kinetic elements in war and conflict is common. Asymmetric war and conflict options form an ingrained part of conflict and war globally, although they are not as technology driven as they could be. Inequality leads to conflict and unrest in many African countries. Conflict over resources is common and hybrid warfare is widespread worldwide. The global non-proliferation system collapses, substantially increasing the possible use of WMD. An arms race between states, and in some cases even non-state entities, is a reality.

Nation states, including African countries, are under pressure as non-state entities organised on religious and ideological foundations assert alternative and hybrid configurations. Ultra-regulated digital and physical fortresses are maintained in Africa, while outside of these a general “survival of the fittest” mind-set reigns. Private armies (mercenaries) are significant role-players in security as well as in the conduct of conflict.

Scenario 4: “Tsunigoab Revived”

High levels of social integration with low levels of technological participation ensure relative stability but also limit the potential value that technology could add to society. This scenario is formed in the quadrant where societal integration is high and ICT embeddedness is low. Levels of information warfare are lower, but information warfare continues to be an instrument of power enhancement in society. Inequality continues to be a challenge.

Tsunigoab Revived: Information Warfare and National Security Manifestation

African governments remain paramount, while non-state entities organised on religious, ideological and ethnic grounds present some level of threat to national security. Information warfare is part of the power projection instruments available to state and non-state role-players. Cyber warfare’s significance, however, is diminished by the lack of ICT embeddedness in society. Asymmetric threats and operations form part of conflict but are restrained by high levels of global integration; they are disruptive when they do occur. The threats associated with global conflict types, such as nuclear war, are less significant because the multi-lateral system is more robust as a result of the high levels of international cooperation that are maintained. Global and national conflict continues, but the risk of serious escalation is lower. The start of a new global Cold War is unlikely. Terrorism remains a threat, but countermeasures are better coordinated in this more cooperative global environment.

Propositions

The propositions are formulated by seeking commonalities between the different information warfare scenarios. Although the propositions focus on African situations similar to the scenarios, the perspective is applicable globally. Information warfare as a national security threat is so ingrained in global society that it is difficult to restrict these propositions to Africa alone.

  • Information warfare will be a national security threat of note by the 2030s.
  • Multi-lateral measures will be ineffective in controlling information warfare.
  • Polarisation poses a significant risk of boosting information warfare as a national security threat.
  • Innovative forms of network-related actions will transform information warfare into ever-changing manifestations in the national security threat environment.
  • While information warfare is a national security threat, it also potentially implies an information warfare threat posed by the state to the freedom of the population.
  • Information warfare will become ingrained in society as the virtual and real worlds increasingly merge.
  • The four identified information warfare scenarios for the 2030s, as well as the information warfare future model, can serve as frameworks or mental models for wider application in the TWEPS environments and for further research.

Globalisation and a high level of interconnectedness are changing the world, creating new national security challenges, processes and actors. Despite optimism that multi-lateral efforts would solve global security problems, it is clear that significant work still needs to be done in this regard. In terms of containing information warfare, it is seriously questioned whether any multi-lateral agreement to contain the phenomenon would even be possible, as verification would be practically impossible. It can be expected that national security will remain a national government responsibility, albeit a much more complex one in which individuals, non-state actors and alliances of individuals and other entities will be highly relevant actors. Technological development will bring many innovations and improvements in the quality of life. At the same time, the negative side of these technologies will also be present and will mutate to hamper the development of solutions.

The two key driving forces, the level of integration versus polarisation and the level of ICT embedding in society, will be crucial in the manifestation of information warfare by the 2030s. On a strategic level, the management of these two driving forces and the countering of polarisation will be crucial in negating the threats posed by information warfare. However, irrespective of which scenario manifests, information warfare will become a national security threat of note by the 2030s. As economic, political and social activity becomes ever more intertwined with everyday life, so the vulnerability of humankind grows.

Furthermore, the plausible scenarios highlight how countries can manage discontent through a collaborative ethic of common purpose: the ability of countries to build and sustain partnerships to combat discontent will increasingly depend on bolstering a country’s credibility with the broader global population and forging an ethic of common purpose. Political credibility and international esteem will probably grow in political significance in the future. The Western model of political development and values was dominant up to the 20th century but is increasingly being challenged by the rise of Asia and Africa. The democratic political models as conceived and developed by the West will not necessarily represent the models for the political environment of the future.

The threat of some form of global anarchy is an underlying theme at the nexus of the identified main driving forces, namely polarisation and ICT embeddedness. Strengthening national and social will to enhance collaboration and a shared common purpose might therefore be what differentiates places of human progress from places of increasing inequality and chaos in the future.

The viability of futures research is related to both the quality and the diversity of the research methods used, specifically in the developing world. As long as futures research is seen as the sole domain of the developed North, it will struggle to maintain its global position as an instrument of change and sustainable growth.

Correspondence

Rianne van Vuuren, Research Associate

Institute for Futures Research, South Africa

E-mail: [email protected]

  • This article is based on research conducted for a University of Stellenbosch (US) Futures Study PhD thesis, “Information warfare as future South African national security threat”, finalised in 2016.
  • In this article, the 2030s are set as the timeframe for the development of scenarios for various reasons. As this study was conducted from 2008 to 2015, the 2030s provide a minimum time horizon of 15 years, short enough to fit comfortably within the lifespan of individuals involved and interested in the question of how information warfare might influence society, but long enough to feel confident that significant changes in this regard could occur over this period. At the same time, South Africa’s National Development Plan (NDP) 2030, drafted in August 2012 by the National Planning Commission (2011), contains a series of proposals to eliminate poverty and reduce inequality by 2030.
  • Arquilla and Ronfeldt (1997, p.41) described cyberspace as follows: “… is a bioelectronic environment that is literally universal, it exists everywhere where there are telephone wires, coaxial cables, fibre-optic lines or electro-magnetic waves. This environment is inhabited by knowledge, existing in electronic form.” Cyberspace consists of two measurable elements: connectivity and content. Connectivity encompasses the physical hardware, software and connecting electromagnetic or cable media that permit the generation, transfer, storage and sharing of data. The second element of cyberspace is content, which influences behaviour and decision-making (Campen, 2008).
  • Deleuze and Guattari (1980, p.29) used rhizome as a metaphor for a non-hierarchical form of organisation. Vail (2007) has extended this metaphor, referring to rhizome as an alternative mode of human organisation consisting of a network of minimally self-sufficient nodes that leverage non-hierarchical coordination of economic activity. The two key concepts of rhizome are self-sufficiency, which eliminates the dependencies that characterise hierarchy, and loose but dynamic networking that uses the “small worlds” theory of network information processing to allow rhizome networks to overcome the information processing burdens typical of hierarchies. Rhizome therefore refers to an organisational pattern characterised by interconnected but independent networks of entities.
  • As the focus of the research is on information warfare, the applicability of the STEEP sectors needs to be critically evaluated. Haberman (2013) identifies the Environmental (Ecology) scanning sector as encompassing the natural world around us: understanding how nature affects humanity and how humanity affects nature. Issues of concern include, inter alia, global warming, clean water, air quality, agriculture and the increasing severity of storms. While these phenomena have a significant influence on the world now and in the future, they do not significantly manifest in the information warfare domain. Additionally, the absence of phenomena such as war and conflict, which go to the core role of information warfare in society, as a fully-fledged sector can be regarded as a significant gap. Therefore, the environmental sector is replaced with a War/Conflict sector, creating the Technological, War/Conflict, Economic, Political and Social (TWEPS) macro-environmental hexagon.

Africa Centre for Strategic Studies. (2005). Background paper on the senior leader seminar. Gaborone, Botswana, 19 June to 1 July.

African Union Commission. (2015, April). Agenda 2063: The Africa we want. Final Edition.

Armistead, Edwin. L. (2004). Information operations: Warfare and the hard reality of soft power. Washington DC: Brassey’s.

Arquilla, John., & Ronfeldt, David. (1997). Information, power and grand strategy. In Arquilla, John. & Ronfeldt, David. (Eds.), Athena’s camp: Preparing for conflict in the information age. Santa Monica: RAND Corporation, MR-880. Retrieved April 15, 2005, from http://www.rand.org/publications/MR/MR880/indext.html

Arquilla, John., & Ronfeldt, David. (2001). The advent of netwar (Revisited). In Arquilla, John. & Ronfeldt, David. (Eds.), Networks and netwars: the future of terror, crime, and militancy. Santa Monica: National Defence Research Institute, RAND.

Bishara, Marwan. (2001, October 3). An enemy with no forwarding address. Le Monde Diplomatique. Retrieved January 12, 2004, from http://mondediplo.com/2001/10/03asymmetry

Breene, Keith. (2016, May 4). Who are the cyberwar superpowers? World Economic Forum. Retrieved April 8, 2018, from https://www.weforum.org/agenda/2016/05/who-are-the-cyberwar-superpowers/

Campen, Alan. D. (2008, January). Cyberwar, anyone? SIGNAL Magazine. Retrieved January 8, 2008, from http://www.afcea.org/signal/articles/templates/Signal_Article_Template.asp?articleid=1452&zoneid=223

Carr, Jeffrey. (2012). Inside cyber warfare. Sebastopol: O’Reilly Media.

Chadwick, Andrew. (2006). Internet politics: States, citizens, and new communication technologies . Oxford: Oxford University Press.

Charette, Robert. N. (2007, November). Open-source warfare. IEEE Spectrum . Retrieved December 15, 2007, from http://blogs.spectrum.ieee.org/riskfactor

Cheng, Dean. (2017). Cyber dragon: Inside China’s information warfare and cyber operations . Santa Barbara: Praeger.

Cronin, Blaise., & Crawford, Holly. (1999). Information warfare: Its application in military and civilian contexts. The Information Society, 15(4), 257-263.

Deleuze, Gilles., & Guattari, Félix. (1980). A thousand plateaus: Capitalism and schizophrenia . London and New York: Continuum.

Denning, Dorothy. E. (1999). Information warfare and security. Reading MA: Addison-Wesley.

Elkus, Adam. (2009). The rise of cyber-mobilization. Groupintel.com. Retrieved February 16, 2009, from http://www.groupintel.com/2009/02/13/the-rise-of-cyber-mobilization

Eriksson, E. Anders. (1999). Viewpoint: Information Warfare: Hype or Reality? The Nonproliferation Review, Spring-Summer, 57-64.

Gilat, Amir. (2009). Information warfare in the 21st century: Ideas are sometimes stronger than bombs. Eurekalert.org. Retrieved March 23, 2009 from http://www.eurekalert.org/pub_releases/2009-03/uoh-iwi031809.php

Haberman, Michael. (2013, April). Four ways to do environmental scanning. Omega Solutions Blog. Retrieved July 25, 2016 from http://omegahrsolutions.com/2013/04/four-ways-to-do-environmental-scanning.html

Jones, Andrew., Kovacich, Gerald. L., & Luzwick, Perry. G. (2002). Global information warfare: How businesses, governments, and others achieve and attain competitive advantages . Boca Raton: Auerback Publications.

Jordan, Michael. (2004). Dictionary of Gods and Goddesses (2nd edition). New York: Facts On File, Inc.

Kuckartz, Udo. (2014). Qualitative text analysis: A guide to methods, practice and using software . London: Sage.

Kuckartz, Udo. (2015, April 3). Personal e-mail communication responding to question on the use of documentary material in qualitative text analysis.

Kurian, George. T. & Molitor, Graham. T. T. (1996). Encyclopaedia of the future , Volume 2. New York: Simon and Schuster Macmillan.

Lin, Abe. C. (2000). Comparison of the Information Warfare Capabilities of the ROC and PRC. Cryptome. Retrieved October 27, 2005 from http://cryptome.org/cn2-infowar.htm

Mazarr, Michael. J. (1997). Global trends 2005: The challenge of the new millennium. Cambridge, MA: Center for International Studies.

McLendon, James. W. (1994, April). Information Warfare: Impacts and Concerns. Air War College, Maxwell Air Force Base. Retrieved March 2, 2008, from http://warandgame.wordpress.com/2008/02/24/information-warfare-impacts-and-concerns/

National Planning Commission. (2011). National development plan: Vision for 2030 . Pretoria: South African Government.

Pazvakavambwa, Regina. (2018, March 1). AI now on cyber criminals’ agenda. ITWeb. Retrieved April 8, 2018 from https://www.itweb.co.za/content/G98YdMLxaapMX2PD

Schneier, Bruce. (2008, May 1). America’s Dilemma: Close Security Holes, or Exploit Them Ourselves. Wired News. Retrieved May 4, 2008, from http://www.wired.com/politics/security/commentary/securitymatters/2008/05/blog_securitymatters_0501

Sekhar, Raja. (2015). Digital India in the age of information warfare. GreatGameIndia Magazine, Oct-Dec 2015 issue. Retrieved March 6, 2018 from http://greatgameindia.com/digital-india-in-the-age-of-information-warfare/

Spies, Phillip. (2005). Measuring and making the future. Volume 4: The views of futurists. In Slaughter, Richard. A. (Ed.), Knowledge base of future studies. Foresight International, CD-ROM.

Spies, Phillip. (2015). Futures Studies’ ‘Holy Trinity’ within the context of a trained futures mind. Stellenbosch: Institute for Futures Research, Stellenbosch University. Learning Hub Lecture Notes, Principles of Futures Studies .

Toffler, Alvin. (1990). Powershift, knowledge, wealth and violence at the edge of the 21st century.  London: Bantam.

University of Pittsburgh. (2015). Coding Analysis Toolkit (CAT). Qualitative Data Analysis Program (QDAP). Retrieved and used for coding May 18 to 20, 2015 from http://cat.ucsur.pitt.edu/app/main.aspx

Vail, Jeff. (2007). What is rhizome? Retrieved December, 5, 2008 from http://www.jeffvail. net/2007/01/what-is-rhizome.html

Van Vuuren, Rianne. (2016). Information warfare as future South African national security threat. PhD thesis. Stellenbosch: Stellenbosch University.

Ventre, Daniel. (Ed.). (2011). Cyberwar and information warfare. London: ISTE Ltd & John Wiley and Son Inc.

Ventre, Daniel. (2009). Information warfare. London: ISTE Ltd & John Wiley and Son Inc.

Vlahos, Michael. (1998). The emergence of the infosphere and its impact on military operations. In Campen, Alan. D. & Dearth, Douglas. H. (Eds.), Cyberwar 2.0: Myths, Mysteries and Reality. Fairfax: AFCEA International Press.

Waltz, Edward. (1998). Information warfare: Principles and operations. Boston: Artech House.


Forms and Examples of Information Warfare Report (Assessment)

Information warfare is the manipulation of information to fulfill a military or political agenda. Gathering information and using it to weaken the opponent is the main reason behind the prevailing use of different types of information warfare tactics.

Disinformation and propaganda are the two main forms of information warfare. Both gained considerable currency during the World War era, and they have remained fundamental to date in the military, government and business management domains (Fogleman, 1995).

A notable example of propaganda as information warfare is evident from a close review of the manner in which politicians managed World War One. Propaganda helped citizens to maintain high levels of confidence in the political leadership.

According to Trueman (2000), British newspapers printed headlines carrying messages of propaganda. He mentions one that read “Belgium child’s hands cut off by Germany.”

Additionally, they printed headlines in other newspapers reporting that German prisoners had taken out the eyes of British citizens. In equal measure, Trueman’s report indicates that the German government also used propaganda tactics to earn the support of its citizens.

He notes that the German government printed headlines such as “French doctors infect German wells with Plague germs.” At another moment, it published papers with information that the enemy had blinded German prisoners (Trueman, 2000).

These propaganda messages blackened the enemy’s name and led to a rise in emotions in favour of the government. This form of warfare is therefore indispensable in increasing civic support for a government’s interests.

Propaganda, a fundamental form of information warfare, helped to sustain the World War for a longer period than would have been possible had the players not used the technique.

During the Second World War, disinformation helped the most organised armies to overcome their enemies. According to experts, information warfare helped the US, for instance, to defeat its enemy during World War Two.

According to Fogleman (1995), who was part of the army that participated in the war, there was a time during their mission when the German 7th Army attempted to drive them from a tenuous beachhead.

The attempts failed because the US army had employed Ultra to monitor the operations of the German army. The devices helped the army to read the Germans’ mail and messages without their knowledge.

When the German army began to gather forces to attack the US camp, the Americans gave no sign that they were privy to the plans of their opponents. They did not reveal that they had information on the strengths and weaknesses of the opponent (Fogleman, 1995).

This unique use of technology during the war helped the soldiers; the tactic they used was disinformation. Ultra helped the army to gather relevant information that aided them in keeping away from targeted grounds and gave them opportunities to launch attacks strategically.

By gathering sensitive information from the enemy and being careful to operate almost normally, they successfully sent the right signals to the opponent, which confused the enemy.

Because of the effective use of information warfare, the enemy failed to capture Fogleman’s team until the end of the war.

Information warfare is still relevant in the current world. There are illustrations of the use of information warfare in fighting current wars.

Communications between military personnel in Somalia, Africa, and the Al Shabaab militia portray that the technique still enjoys acceptance across the globe.

Major E. Chirchir regularly engages Al Shabaab in a war of words on the internet. The communication falls into at least one of the categories of information warfare.

Recently, the officer posted on Twitter that Al Shabaab is working in exchange for money. He said that the militia are selfish and have no agenda for the public. Chirchir also informed the group that they would earn cash at Harbole and Jana Abd (Chirchir, 2012).

In saying this, Chirchir portrays the group as selfish. In addition, he sends fear into the enemy’s camp by saying his team will hit their strategic hideouts; for that matter, they should prepare to incur fatal losses. As the officer says, the militia will make more cash in Jana Abd.

This suggests that many deaths will arise at those locations and lead to immense compensation for lives lost.

This is propaganda information warfare meant to keep the opposing forces awaiting attack at any moment, without factual time plans.

Alternatively, the idea could be disinformation, particularly if the army chooses to keep the opponent held up guarding its territory, only to attack a different location unexpectedly.

At the same time, as spokesperson, Chirchir develops and maintains a positive public perception of his military group and the entire mission by suggesting in the statements that he is a defender of the public interest.

He also attempts to put an end to the public support the group enjoys by branding its members as selfish and in it for the money. The tool is helpful in maintaining public support for the peacekeeping group.

In Al Shabaab’s response to Chirchir, they claim that the allied army’s actions will lead to difficult consequences in future (Chirchir, 2012). The idea is to use propaganda to cause fear and possibly weaken the warriors’ ability to fight due to fear of unknown consequences.

Finally, the recent attack on the US embassy in Libya provides useful material on information warfare. The US government has reacted to the attack intelligently, saying that all is well.

President Barack Obama diplomatically reported that Libya and the US would join hands to arrest the suspect (US ambassador killed in consulate attack in Libya – Yahoo! News, 2012). Nevertheless, there are reports that the government has an interest in finding out the true causes of this attack.

This report also indicates that investigation experts are already at the location to establish the cause of and motive for the killings (US ambassador killed in consulate attack in Libya – Yahoo! News, 2012).

Though it is difficult to access all government secrets, it is easy to think that the presidential statement aims at easing investigations. The US Government is therefore employing disinformation warfare schemes to gather credible evidence.

As the investigation into the matter is in progress, it is not justifiable to eliminate any party from the list of suspects until the investigation proves otherwise.

However, the government’s diplomatic reaction is called for, as it creates room for peaceful co-existence among different governments and people of varied origins.

Above all, the government uses disinformation warfare to accomplish the country’s political agenda of maintaining and sustaining security of the citizens.

Chirchir, M. E. (2012, September 12). Major E. Chirchir. Kenya Military Spokesman – The Official Account. Web.

Fogleman, G. R. (1995, May 16). Information Warfare and Deterrence. Web.

Trueman, C. (n.d.). Propaganda and World War One. History Learning Site . Web.

US ambassador killed in consulate attack in Libya – Yahoo! News. (2012, September 12). Yahoo! News – Latest News & Headlines . Web.



Journal of Advanced Military Studies
Marine Corps University Press
Volume 12, Number 1, 2021

Political Warfare and Propaganda: An Introduction

James J. F. Forest, PhD

The digital age has greatly expanded the terrain and opportunities for a range of foreign influence efforts. A growing number of countries have invested significantly in their capabilities to disseminate online propaganda and disinformation worldwide, while simultaneously establishing information dominance at home. This introductory essay provides a brief examination of terms, concepts, and examples of these efforts and concludes by reviewing how the articles of this issue of the Journal of Advanced Military Studies contribute to our understanding of political warfare and propaganda.

Keywords: information operations, digital influence, political warfare, psychological warfare

In 1970, Canadian media theorist Marshall McLuhan predicted that World War III would involve "a guerrilla information war with no division between military and civilian participation." 1 More than 30 years later, in their groundbreaking 2001 book Networks and Netwars: The Future of Terror, Crime, and Militancy , John Arquilla and David Ronfeldt described how

the conduct and outcome of conflicts increasingly depend on information and communications. More than ever before, conflicts revolve around "knowledge" and the use of "soft power." Adversaries are learning to emphasize "information operations" and "perception management"—that is, media-oriented measures that aim to attract or disorient rather than coerce, and that affect how secure a society, a military, or other actor feels about its knowledge of itself and of its adversaries. Psychological disruption may become as important a goal as physical destruction. 2

How prescient these observations seem today, particularly given how malicious actors—both foreign and domestic—are now weaponizing information for the purpose of influencing political, economic, social, and other kinds of behavior.

This issue of the Journal of Advanced Military Studies addresses the intersection of political warfare and the digital ecosystem. To frame the contributions that follow, this introduction to the issue reviews the broad landscape of terms and concepts that refer to the weaponization of information, and then provides a small handful of historical and modern examples that reflect the goals and objectives pursued through influence efforts. The discussion then turns to describe how the articles in this issue contribute to our understanding of political warfare and propaganda in the digital age, before concluding with some thoughts about the need for research-based strategies and policies that can improve our ability to defend against foreign influence efforts and mitigate their consequences.

A Diverse Landscape of Terms and Concepts

The past several centuries have largely been defined by physical security threats, requiring a nation's military to physically respond with whatever means they have available. But as explained by Isaiah Wilson III—president of Joint Special Operations University—today we face "compound security threats," which include physical security threats as well as "communication and information operations that scale with the speed of a social media post that goes viral, as well as cyber warfare, hacking and theft by our adversaries, both state and non-state actors." 3 These compound security threats can exploit cybersecurity vulnerabilities as well as psychological and emotional vulnerabilities of targets, using modern internet platforms to reach targets worldwide.

Terms like information operations or information warfare have been frequently used in military doctrine to describe computer network attacks (often by highly trained military units) like hacking into databases to observe or steal information, disrupting and degrading a target's technological capabilities, weakening military readiness, extorting financial ransoms, and much more. These terms have also referred to operations intended to protect our own data from these attacks by adversaries. Computer network attacks like these can also be used to send a message (e.g., about a target's vulnerabilities and the attacker's capabilities), and in that way could be a means of influencing others. Cyberattacks are seen as compound security threats because they can have implications for multiple dimensions of a nation's well-being, including politics, economics, technology, information security, relations with other countries, and much more.

Today's digital influence attacks also have implications for these same multiple dimensions and are likewise seen as compound security threats. The goals of digital influence attacks can include disrupting and degrading a target's societal cohesion, undermining confidence in political systems and institutions (i.e., democratic elections), fracturing international alliances, and much more. Tactics used in such attacks include various forms of deception and provocation, from deepfake videos and fake social media accounts to gaslighting, doxing, trolling, and many others. Through social media and other internet technologies, attackers can incentivize and manipulate interactions directly with citizens of a foreign population, bypassing government efforts to insulate their citizens from an onslaught of disinformation. 4 These types of attacks exploit human vulnerabilities more than technological attacks and capitalize on psychological and emotional dimensions like fear, uncertainty, cognitive biases, and others.

A variety of terms are used to describe these attacks, sometimes leading to confusion rather than clarity. The term political warfare was used by the legendary diplomat George Kennan in 1948 to describe "the employment of all the means at a nation's command, short of war, to achieve its national objectives. Such operations are both overt and covert and can include various kinds of propaganda as well as covert operations that provide clandestine support to underground resistance in hostile states." 5 Paul A. Smith describes political warfare as "the use of political means to compel an opponent to do one's will" and "its chief aspect is the use of words, images, and ideas, commonly known, according to context, as propaganda and psychological warfare." 6 Carnes Lord notes a "tendency to use the terms psychological warfare and political warfare interchangeably" along with "a variety of similar terms—ideological warfare, the war of ideas, political communication and more." 7 And the U.S. Department of Defense has used the term military information support operations to describe efforts to "convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals in a manner favorable to the originator's objectives." 8

In a 2019 research report published by Princeton University, Diego A. Martin and Jacob N. Shapiro illustrate how "foreign actors have used social media to influence politics in a range of countries by promoting propaganda, advocating controversial viewpoints, and spreading disinformation." 9 The researchers define foreign-influence efforts as: 1) coordinated campaigns by one state to impact one or more specific aspects of politics in another state, 2) through media channels, including social media, and by 3) producing content designed to appear indigenous to the target state. 10 The objectives of such campaigns can be quite broad and to date have included influencing political decisions by shaping election outcomes at various levels, shifting the political agenda on topics ranging from health to security, and encouraging political polarization. 11 Similarly, research by Philip N. Howard describes "countries with dedicated teams meddling in the affairs of their neighbors through social media misinformation." 12 And social media platforms—most notably Facebook—are now using the term information operations when referring to deliberate and systematic attempts to steer public opinion using inauthentic accounts and inaccurate information. 13

A recent book by Carl Miller describes how "digital warfare has broken out between states struggling for control over what people see and believe." 14 Other terms used in the literature include "new generation warfare," "ambiguous warfare," "full-spectrum warfare," and "non-linear war." 15 Scholars have also described these security challenges as forms of hybrid warfare, encompassing a combination of political warfare, psychological operations, and information operations (including propaganda). Similar terms in this broad landscape include public diplomacy and strategic communications. Further, some states are portrayed as pursuing "information dominance" over the populations of other states through a combination of computer network operations, deception, public affairs, public diplomacy, perception management, psychological operations, electronic countermeasures, jamming, and defense suppression. 16

Whatever we want to call it, there are clear examples of aggression, attackers, targets, defenders, tactics, strategies, goals, winners, losers, and innocent victims. And this is not something that only states do to other states: nonstate actors are increasingly engaged in these kinds of activities as well. 17 The author's own work has used the term influence warfare to describe the kinds of activities in which the focus is not on the information itself but on the purposes of that information. 18 This conceptual approach views the implicit goal of spreading propaganda, misinformation, disinformation, and so forth as shaping perceptions and influencing the behavior of a specific target (or set of targets). Further, influence warfare strategies and tactics—particularly as we have seen online—also involve more than just the manipulation of information; they can include behavior signaling (e.g., swarming or bandwagoning), trolling, gaslighting, and other means by which the target is provoked into an emotional response that typically overpowers any rational thought or behavior. 19 Clickbait, memes, and ragebait (for example) are not really forms of information operations as traditionally conceived, but they are certainly ways of influencing others via the internet. This leads us to the term digital influence warfare, which will be used throughout this introduction as a catchall phrase representing the broadly diverse terrain of political and psychological warfare in the digital age. 20

Strategic Goals and Tactics of Influence Warfare

The "weaponization of information" in order to obtain power and influence is of course not new. The principles of influence warfare are based on an ancient and much-repeated maxim, attributed to the Chinese general and military theorist Sun Tzu, paraphrased as "to win one hundred victories in one hundred battles is not the highest skill. To subdue the enemy without fighting is the highest skill." 21 When the thirteenth-century Mongols were rolling across Eurasia, they deliberately spread news of the atrocities they perpetrated on cities that did not surrender, the obvious goal being what Sun Tzu argued was the ultimate victory: to defeat the enemy before a single shot has been fired. As Marc Galeotti explains, fear is a powerful emotion, and in this instance it was used to coerce the behavior of cities the Mongols had in their sights, preferring that they surrender instead of having to spend valuable resources conquering them through force. 22 Mongol hordes would also drag branches behind their horses to raise dust clouds suggesting their armies were far larger than reality—an early and effective form of deception and disinformation.

The previous century saw a wide variety of efforts involving the weaponization of information for strategic purposes. During the Chinese Civil War (1945–49), both the Communist and Nationalist (Kuomintang, or KMT) armies spread false information to sow discord in enemy-controlled areas, spreading rumors about defections, falsifying enemy attack plans, and stirring up unrest in an effort to misdirect enemy planning. After the Nationalist government relocated to Taiwan in 1949, the influence efforts continued as the two sides flooded propaganda and disinformation into enemy-controlled territories to affect public opinion and troop morale. 23 Various forms of influence warfare also played a major role in both World Wars. For example, the Committee on Public Information was created during World War I by U.S. president Woodrow Wilson to facilitate communications and serve as a worldwide propaganda organization on behalf of the United States. 24

Influence warfare was increasingly prominent throughout World War II, especially the massive amounts of propaganda disseminated by Joseph Goebbels and the Nazi regime. In response, U.S. president Franklin D. Roosevelt established the Office of War Information in 1942, responsible for (among other things) undermining the enemy's morale—often through various psychological and information operations—as well as for providing moral support and strengthening the resolve of resistance movements in enemy territories. The Voice of America (VOA) was also established in 1942 as the foreign radio and television broadcasting service of the U.S. government, broadcasting in English, French, and Italian. Years later, the United States Information Agency (USIA) was created in 1953 as a primary conduit for enhancing our nation's strategic influence during the Cold War. 25 The director of USIA reported to the president through the National Security Council and coordinated closely with the secretary of state on foreign policy matters.

Meanwhile, when Radio Moscow began broadcasting in 1922, it was initially available only in Moscow and its surrounding areas, but by 1929, the Soviets were able to broadcast into Europe, North and South America, Japan, and the Middle East using a variety of languages. 26 By 1941, the Union of Soviet Socialist Republics (USSR) was able to broadcast in 21 languages and, 10 years later, had a program schedule of 2,094 hours. 27 But radio and television broadcasting were just the visible tip of the iceberg for what became a multidimensional influence effort during the Cold War involving an array of covert influence tactics, particularly through the spread of disinformation. As Thomas Rid notes, "Entire bureaucracies were created in the Eastern bloc during the 1960s for the purpose of bending the facts." 28 The Soviets used disinformation "to exacerbate tensions and contradictions within the adversary's body politic, by leveraging facts, fakes, and ideally a disorienting mix of both." 29

In the first academic study of the Soviet-era active measures program, Richard H. Shultz and Roy Godson explain how the Soviets cultivated several different types of so-called "agents of influence … including the unwitting but manipulated individual, the 'trusted contact,' and the controlled covert agent." 30 As they explain,

The agent of influence may be a journalist, a government official, a labor leader, an academic, an opinion leader, an artist, or involved in a number of other professions. The main objective of an influence operation is the use of the agent's position—be it in government, politics, labor, journalism or some other field—to support and promote political conditions desired by the sponsoring foreign power. 31

Forged documents—including faked photographs—have also been a part of influence warfare for more than a century. For example, during the 1920s the Soviet Cheka (secret police) used elaborate forgeries to lure anti-Bolsheviks out of hiding, and many were captured and killed as a result. 32 During the Cold War, as Shultz and Godson note, many "authentic-looking but false U.S. government documents and communiqués" could be categorized mainly as either "altered or distorted versions of actual US documents that the Soviets obtained (usually through espionage)" or "documents that [were] entirely fabricated." 33 Examples include falsified U.S. State Department documents ordering diplomatic missions to sabotage peace negotiations or other endeavors, fake documents outlining U.S. plans to manipulate the leaders of Third World countries, or even forged cables from an American embassy outlining a proposed plan to overthrow a country's leader. 34

In one case, an authentic, unclassified U.S. government map was misrepresented as showing nuclear missiles targeting Austrian cities. A fabricated letter ostensibly written by the U.S. defense attaché in Rome contained language denying "rumors suggesting the death of children in Naples could be due to chemical or biological substances stored at American bases near Naples," when in fact no such substances were stored at those bases. 35 Even a fake U.S. Army Field Manual was distributed, purportedly encouraging Army intelligence personnel to interfere in the affairs of host countries and subvert foreign government officials and military officers. 36 Through these and other types of information operations, the Soviets tried to influence a range of audiences, and the lessons to be learned from this history—both successes and failures—can inform the influence warfare efforts of many countries today.

Influence Opportunities in the Digital Age

While the primary strategies and goals of influence warfare have remained fairly constant, the operational environment in which these efforts take place has changed significantly during the past two decades. The rise of the internet and social media companies, whose profit model is based on an attention economy, has been a game changer. Within the attention economy, the most valued content is that which is most likely to attract attention and provoke engagement, regardless of whether it is beneficial or harmful, true or untrue. New tools have emerged for creating and spreading information (and disinformation) on a global scale. Connectivity in the digital realm is now much easier, and yet the emergence of hyperpartisan echo chambers has sequestered many online users into separate communities that reject the credibility and merits of each other's ideas, beliefs, and narratives.

Unlike conventional cyberattacks, the goal of a digital influence warfare campaign is not to degrade the functional integrity of a computer system. Rather, it is to use those computer systems against the target in whatever ways might advance the attacker's objectives. Often, those objectives include a basic divide-and-conquer strategy—a society that is disunited will fight internally over many issues instead of coming together in the face of a threat that only some of its members believe exists. Many influence activities are meant to shape the perceptions, choices, and behaviors of a society—and in some cases, the goal may in fact be to render the target dysfunctional as a society. This is not simply propaganda, fake news, or perception manipulation. It is a battle over what people believe is reality and the decisions that each individual makes based on those beliefs. The victors in this battle are the attackers who have convinced scores of victims to make decisions that directly benefit the attackers.

Digital influence warfare involves the use of persuasion tactics, information and disinformation, provocation, identity deception, computer network hacking, altered videos and images, cyberbullying, and many other types of activity explored in this issue of the Journal of Advanced Military Studies. The attacker (or "influencer") seeks to weaponize information against a target in order to gain the power needed to achieve the goals articulated in their strategic influence plan. Some goals may involve changing the target's beliefs and behaviors, prompting the targets to question their beliefs in the hopes that once those beliefs have been undermined, the targets may change their minds. Other goals may include manufacturing uncertainty to convince the target that nothing may be true and anything may be possible. 37 In other instances, the goals of an influence strategy could include strengthening the target's certainty, even their commitment to believing in things that are actually untrue.

The central goal of influence attacks is—according to a recent report by Rand—"to cause the target to behave in a manner favorable to the influencer." 38 The influencer may seek to disrupt the target's information environment—for example, interrupting the flow of information between sources and intended recipients of an organization, or on a broader level, between the target's government and its citizens. Similarly, the influencer may also seek to degrade the quality, efficiency, and effectiveness of the target's communication capabilities, which may involve flooding channels of communication with misinformation and disinformation. The overall goal here involves undermining the perceived credibility and reliability of information shared among the adversary's organizational members (government or corporate) or between the target's government and its citizens. 39 Attackers in the digital influence domain can organize swarms of automated social media accounts ("bots") alongside real accounts, coordinated to amplify a particular narrative or attack a specific target. Government (or corporate) leaders can hire technically skilled mercenaries and contractors (from large so-called social media influence corporations to lone hackers) to do the dirty work for them. 40

Based on whatever goals the attacker wants to achieve, they will need to identify the targets they want to influence. When conducting research on their targets, the attackers will seek to answer specific questions like: What do they already believe about their world and/or their place within it? What do they think they know, and what are they uncertain about? What assumptions, suspicions, prejudices, and biases might they have? What challenges and grievances (economic, sociopolitical, security, identity, etc.) seem to provoke the most emotional reactions among them? Throughout the history of influence warfare, this information has been relatively easy to identify in the open liberal democracies of the West. In more closed or oppressed societies, an additional step may be needed to determine how the target audience's perceptions compare to the discourse in the public domain—for example, what the news media (often owned and controlled by the government) identify as important topics and acceptable views within that society may not fully reflect reality.

Influence efforts should always be guided by data on potential targets. An attacker should never waste resources on target audiences that are already well armed to repel the influence efforts; better instead to identify vulnerable targets to exploit. For example, if the goal is to sow division and increase political polarization within a society, the United States offers a prime target for achieving that goal. Research by the Oxford Internet Institute in 2019 found that people in the United States share more junk news (i.e., completely fabricated information disguised to look like authentic news) than people in other advanced democracies such as France, Germany, and the United Kingdom. 41 A study by the Pew Research Center in 2017 found that 67 percent of U.S. adults received news through social media sites like Twitter and Facebook. 42 Further, analysis of Russian influence efforts by the Atlantic Council's Digital Forensic Research Lab in 2018 found that Americans were vulnerable to a distinct type of troll account that used "carefully crafted personalities" to infiltrate activist communities and post hyperpartisan messages in order to "make their audiences ever more radical." 43

These research studies reflect another important dimension of influence efforts: after gathering enough quality information about the target, the attacker will then seek to establish a foothold in the information environment preferred by that target. They must establish a credible presence among an audience of like-minded social media users before attempting to influence or polarize that audience. A common approach involves initially posting some messages that the target audience is likely to agree with. The convention of "like" or "share" facilitated by social media platforms can draw the target toward recognition of an acceptable persona (the "like-minded, fellow traveler"). 44 Once established within the target's digital ecosystem, the persona can then begin to shape perceptions and behavior in ways that will benefit their influence strategy.

Perhaps the best-known example of this in the public arena today is disinformation, or fake news. Essentially, these are forms of information deception, and there are several variations to consider. According to researcher Claire Wardle, some of the most "problematic content within our information ecosystem" includes:

• False connection: when headlines, visuals, or captions do not support the substance or content of the story itself;

• Misleading content: misleading use of information to frame an issue or individual;

• False context: when genuine content is shared with false contextual information;

• Imposter content: when genuine sources are impersonated;

• Manipulated content: when genuine information or imagery is manipulated to deceive (altered videos and images, including deepfakes, are the most prevalent examples of this); and

• Fabricated content: when new content is 100 percent false and designed to deceive and do harm. 45

Each of these forms of "problematic content" has a role to play in achieving an influence warfare strategy. Further, in many cases the most effective means of using these types of information (or disinformation) involves a careful integration between fake details and accurate details that the target already accepts as true. In the field of education, teachers often refer to the concept of scaffolding as a strategy to foster learning by introducing material that builds on what the student already understands or believes. For the purposes of an influence strategy, as Thomas Rid explains, for disinformation to be successful it must "at least partially respond to reality, or at least accepted views." 46

Additional examples of deceptive digital influence tactics include identity deception (e.g., using fake or hijacked social media accounts) and information source deception (e.g., rerouting internet traffic to different sources of information that seem legitimate but relays false information to the viewers). As with the other forms of deception, a primary intent of these tactics is for the influencer to make the target believe what is not true. Similarly, the influencer may also spread disinformation through the target's trusted communication channels to degrade the integrity of their decision making and even their perception of reality.

Of course, deception is only one of several digital influence strategies. Another, which we have seen in frequent use in recent years, is to encourage engagement—especially by provoking emotional responses—using information that may in fact be wholly or partially accurate. Unlike disinformation and deception, the primary focus here is less on the message than on provoking people to propagate the message. Effective targets for this approach are those who have higher uncertainty about what is true but are willing to share and retransmit information without verifying it (and often because they want it to be true). And it is widely understood that fear is an exceptionally powerful emotion that can lead people to make a wide variety of (often unwise) decisions.

There are many kinds of influence goals that can be achieved by intentionally provoking emotional responses, usually in reference to something that the target already favors or opposes. The tactic of provoking outrage can be particularly effective against a target audience—as Sun Tzu wrote, "Use anger to throw them into disarray." 47 With the right sort of targeting, message format, and content, the influencer can use provocation tactics to produce whatever kinds of behavior they want from the target (e.g., angrily lashing out at members of an opposing political party or questioning the scientific evidence behind an inconvenient truth). An additional type of influence warfare involves attacking the target directly—threatening or bullying them, calling them derogatory names, spreading embarrassing photos and videos of them, and so forth.

One of the earliest well-known examples of digital influence warfare was North Korea's attack against Sony. In the summer of 2014, Sony Pictures had planned to release a comedy, The Interview, featuring a plot in which two bumbling, incompetent journalists score an interview with Kim Jong-un, but before they leave they are recruited by the Central Intelligence Agency (CIA) to blow him up. 48 An angered North Korea responded by hacking into Sony's computer networks, destroying some key systems and stealing troves of confidential emails that were later released publicly in small, increasingly embarrassing batches. Details about contracts with Hollywood stars, medical records, salaries, and Social Security numbers were also released. But unlike other well-reported cyberattacks of that era, this was—in the words of David E. Sanger—"intended as a weapon of political coercion." 49 As with many other examples of this hack-and-release tactic, the strategic goals are fairly straightforward: for example, to weaken an adversary by undermining its perceived credibility. The same script was followed by Russia during the 2016 U.S. presidential election, when Russian hackers broke into John Podesta's email account and released (via WikiLeaks) a stream of embarrassing messages (as detailed in the investigation report by former Federal Bureau of Investigation [FBI] director Robert S. Mueller III). 50

Today, states are engaged in these kinds of digital influence activities with increasing regularity and sophistication. As a July 2020 report by the Stanford Internet Observatory explains:

Well-resourced countries have demonstrated sophisticated abilities to carry out influence operations in both traditional and social media ecosystems simultaneously. Russia, China, Iran, and a variety of other nation-states control media properties with significant audiences, often with reach far beyond their borders. They have also been implicated in social media company takedowns of accounts and pages that are manipulative either by virtue of the fake accounts and suspicious domains involved, or by way of coordinated distribution tactics to drive attention to certain content or to create the perception that a particular narrative is extremely popular. 51

China in particular has significantly ramped up its digital foreign-influence efforts, to include disrupting Twitter conversations about the conflict in Tibet and meddling in Taiwanese politics. 52 In fact, public opinion warfare and psychological warfare are closely intertwined in Chinese military doctrine. According to a recent Pentagon report, China's approach to psychological warfare "seeks to influence and/or disrupt an opponent's decision-making capability, to create doubts, foment anti-leadership sentiments, to deceive opponents and to attempt to diminish the will to fight among opponents." 53 A primary objective, as Laura Jackson explains, is "to demoralize both military personnel and civilian populations, and thus, over time, to diminish their will to act … to undermine international institutions, change borders, and subvert global media, all without firing a shot." 54

China's "Three Warfares" doctrine is focused on: (1) public opinion (media) warfare ( yulun zhan ); (2) psychological warfare ( xinli zhan ); and (3) legal warfare ( falu zhan ). 55 In their conception of public opinion warfare, the goal is to influence both domestic and international public opinion in ways that build support for China's own military operations, while undermining any justification for an adversary who is taking actions counter to China's interests. 56 But this effort goes well beyond what Steven Collins refers to in a 2003 NATO Review article as "perception management," in which a nation or organization provides (or withholds) certain kinds of information to influence foreign public opinion, leaders, intelligence agencies, and the policies and behaviors that result from their interpretation of this information. 57 According to the Pentagon report, China "leverages all instruments that inform and influence public opinion … and is directed against domestic populations in target countries." 58 As Laura Jackson explains, "China's extensive global media network, most notably the Xinhua News Agency and China Central Television (CCTV), also plays a key role, broadcasting in foreign languages and providing programming to stations throughout Africa, Central Asia, Europe, and Latin America." 59 In turn, Western media outlets then repeat and amplify the spread of messages to a broader international audience, lending a perception of legitimacy to what is in fact Chinese state-directed propaganda. 60

Similarly, Russia has also engaged in a broad, multifaceted influence warfare campaign involving all of the former tools and tactics of its active measures program along with a flurry of new technological approaches. Media outlets like Sputnik and RT (formerly Russia Today) view themselves—according to Margarita Simonyan, chief editor of RT—as equal in importance to the Defense Ministry, using "information as a weapon." 61 And like many other authoritarian regimes, Russia has invested heavily in online troll farms, armies of automated bot accounts, cyber hacking units, and other means by which they can pursue their foreign influence goals using the most modern tools available to them. 62 While the "agent of influence" of the Cold War may have been a journalist, a government official, a labor leader, or an academic (among many other examples), today the agent is more likely to be a social media user with enough followers to be considered a potential "influencer." 63

According to a report by the Stanford Internet Observatory, both China and Russia have "full-spectrum propaganda capabilities," including prominent Facebook pages and YouTube channels targeting regionalized audiences. 64 Both have military units dedicated to influencing foreign targets and also encourage and incentivize citizen involvement in those efforts. 65 They gather extensive information about their targets and manage an array of fake Facebook pages and Twitter personas that are used for eroding the international perception and domestic social cohesion of their rivals. 66 And as detailed in many reports by congressional committees, think tanks, and academics, Russia has been particularly aggressive during this past decade in its online efforts to influence democratic elections in the United States, Europe, Africa, and elsewhere, as well as to sow confusion and encourage widespread societal polarization and animosity. 67

Meanwhile, other countries are also increasingly engaging in their own forms of digital influence warfare. In October 2019, Facebook announced the deletion of 93 Facebook accounts, 17 Facebook pages, and 4 Instagram accounts "for violating our policy against coordinated inauthentic behavior. This activity originated in Iran and focused primarily on the US, and some on French-speaking audiences in North Africa." 68 According to the announcement, "the individuals behind this activity used compromised and fake accounts—some of which had already been disabled by our automated systems—to masquerade as locals, manage their Pages, join Groups and drive people to off-platform domains connected to our previous investigation into the Iran-linked 'Liberty Front Press' and its removal in August 2018." 69 Facebook also removed 38 Facebook accounts, 6 pages, 4 groups, and 10 Instagram accounts that originated in Iran and focused on countries in Latin America, including Venezuela, Brazil, Argentina, Bolivia, Peru, Ecuador, and Mexico. The page administrators and account owners typically represented themselves as locals, used fake accounts to post in groups and manage pages posing as news organizations, as well as directed traffic to other websites. 70 And that same month, Microsoft announced that hackers linked to the Iranian government targeted an undisclosed U.S. presidential campaign, as well as government officials, media outlets, and prominent expatriate Iranians. 71

In short, older strategies, tactics, and tools of influence warfare have evolved to encompass a new and very powerful digital dimension. By using massive amounts of internet user data, including profiles and patterns of online behavior, microtargeting strategies have become a very effective means of influencing people from many backgrounds. The strategies, tactics, and tools of digital influence warfare will increasingly be used by foreign and domestic actors to manipulate our perceptions in ways that will negatively affect us. According to a 2018 United Nations Educational, Scientific and Cultural Organization (UNESCO) report, the danger we face in the future is "the development of an 'arms race' of national and international disinformation spread through partisan 'news' organizations and social media channels, polluting the information environment for all sides." 72

Tomorrow's disinformation and perception manipulation will be much worse than what we are dealing with now, in part because the tactics and tools are becoming more innovative and sophisticated. As a 2019 report by Rand notes, "Increasingly, hostile social manipulation will be able to target the information foundations of digitized societies: the databases, algorithms, networked devices, and artificial intelligence programs that will dominate the day-to-day operation of the society." 73 The future evolution of digital influence tools—including augmented reality, virtual reality, and artificial intelligence (AI)—promises to bring further confusion and challenges to an already chaotic situation, offering a new frontier for disinformation and perception manipulation. 74 For example, in the not-too-distant future we will see a flood of fake audio, images, messages, and video created through AI that will appear so real that it will be increasingly difficult to convince people they are fakes. 75 Technology already exists that can manipulate an audio recording to delete words from a speech and stitch the rest together seamlessly, or add new words using software that replicates the voice of the speaker with uncanny accuracy. 76 Imagine the harm that can be done when, in the future, digital influencers have the ability to clone any voice, use it to say anything the influencer wants, and then use that audio recording to persuade others. 77

Creating deepfake images and video is also becoming easier, and the results are increasingly realistic and convincing. One particularly sophisticated AI-related approach involves a tool known as generative adversarial networks (GANs). These involve integrating a competitive function into software, with one network seeking to generate an item, such as an image or video, while the other network judges the item to determine whether it looks real. As the first network continues to adapt to fool the adversarial network, the software learns how to create more realistic images or videos. 78 Over time, according to Michael Mazarr and his colleagues at Rand, "As technology improves the quality of this production, it will likely become more difficult to discern real events from doctored or artificial ones, particularly if combined with the advancements in audio software." 79 If the target of such deepfake disinformation holds true to the old adage of "hearing and seeing is believing," the long-term harmful effects of this technology are quite obvious. Technological advances will make it increasingly difficult to distinguish real people from computer-generated ones, and even more difficult to convince people that they are being deceived by someone they believe is real.
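To make the adversarial mechanic concrete, here is a deliberately tiny sketch in Python with NumPy. It is not how production deepfake systems are built (real GANs train deep networks on images), but the two-player game is the same: a one-parameter "generator" tries to make its samples look like the real data, while a logistic "discriminator" tries to tell real from fake. All names, hyperparameters, and the toy one-dimensional data are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # Stand-in for genuine data (e.g. real images), centered at 4.0.
    return rng.normal(4.0, 0.5, n)

mu = 0.0         # generator's only parameter: where its fakes are centered
w, b = 0.1, 0.0  # discriminator: D(x) = sigmoid(w * x + b)
lr = 0.05
history = []

for step in range(2000):
    z = rng.normal(0.0, 1.0, 32)
    fake = mu + 0.5 * z          # generator samples
    real = real_batch(32)

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    w += lr * (np.mean((1.0 - d_real) * real) - np.mean(d_fake * fake))
    b += lr * (np.mean(1.0 - d_real) - np.mean(d_fake))
    w *= 0.99  # small weight decay damps the two-player oscillation

    # Generator step: ascend log D(fake), i.e. make fakes score as "real".
    d_fake = sigmoid(w * fake + b)
    mu += lr * np.mean((1.0 - d_fake) * w)
    history.append(mu)

# The generator's center drifts from 0 toward the real data's center.
mu_avg = float(np.mean(history[-500:]))
```

At equilibrium the discriminator can do no better than chance, which is precisely the property that makes mature deepfakes hard to flag.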

And, of course, we can fully expect that digital influence warfare attacks against democratic elections will continue and will likely involve new and innovative tactics. For example, there are concerns that in the future malicious hackers could use ransomware to snatch and hold hostage databases of local voter registrations or cause power disruptions at polling centers on election day. Further, as one expert noted, "with Americans so mistrustful of one another, and of the political process, the fear of hacking could be as dangerous as an actual cyberattack—especially if the election is close." 80 As Laura Rosenberger observes, "You don't actually have to breach an election system in order to create the public impression that you have." 81 The future will likely bring darker influence silos that no light of truth can penetrate, resulting in heightened uncertainty and distrust, deeper animosity, more extremism and violence, and widespread belief in things that simply are not true. This is the future that the enemies of America's peace and prosperity want to engineer. The United States must find ways to prevent them from succeeding. The research and analysis provided in this issue contributes to that important goal.

The Issue of JAMS on Political Warfare and Propaganda

Each of the contributions to this issue addresses the central theme of influencing perceptions and behavior. First, Daniel de Wit draws lessons from a historical analysis of the Office of Strategic Services (OSS), America's intelligence and special operations organization in World War II. In addition to its efforts to collect intelligence on the Axis powers and to arm and train resistance groups behind enemy lines, the OSS also served as America's primary psychological warfare agency, using a variety of "black propaganda" methods to sow dissension and confusion in enemy ranks. 82 As noted earlier, psychological warfare plays a significant role in the conduct of today's military operations, so de Wit's research offers important historical lessons for contemporary campaign planners.

Next, Kyleanne Hunter and Emma Jouenne examine the uniquely troubling effects of spreading misogynistic views online. Their analysis of three diverse case studies—the U.S. military, the incel movement, and ISIS—reveals how unchecked online misogyny can result in physical behavior that can threaten human and national security. Glen Segell then explores how perceptions about cybersecurity operations can have positive or negative impacts on civil-military relations, drawing on a case study of the Israeli experience. Lev Topor and Alexander Tabachnik follow with a study of how Russia uses the strategies and tactics of digital influence warfare against other countries, while continually seeking to strengthen its information dominance over Russian citizens. And Donald M. Bishop reveals how other countries do this as well, including China, North Korea, Iran, Cuba, and Venezuela. Each is engaged in these same kinds of efforts to control the information that circulates within their respective societies, while using various forms of propaganda against other countries to strengthen their influence and national power.

Phil Zeman's contribution to this issue looks at how China and Russia are trying to fracture American and Western societies through information, disinformation, economic coercion, and the creation of economic dependencies—in many cases capitalizing on specific attributes and vulnerabilities of a target nation to achieve their strategic objectives. Through these efforts, he concludes, China and Russia hope to prevent the will or ability of American or Western states to respond to an aggressive act. Next, Michael Cserkits explains how a society's perceptions about armed forces can be influenced by cinematic productions and anime, drawing on a case study comparison of Japan and the United States. And finally, Anthony Patrick examines how social media penetration and internet connectivity could impact the likelihood that parties within a conventional intrastate conflict will enter negotiations.

As a collection, these articles make a significant contribution to the scholarly research literature on political warfare and propaganda. The authors shed light on the need for research-based strategies and policies that can improve our ability to identify, defend against, and mitigate the consequences of influence efforts. However, when reflecting on the compound security threats described at the beginning of this introduction—involving both cyberattacks and influence attacks—a startling contrast is revealed: we have committed serious resources toward cybersecurity but not toward addressing the influence issues examined in this issue. We routinely install firewalls and other security measures around our computer network systems, track potential intrusion attempts, test and report network vulnerabilities, hold training seminars for new employees, and take many other measures to try to mitigate cybersecurity threats. In contrast, there are no firewalls or intrusion detection efforts defending us against digital influence attacks of either foreign or domestic origin. Government sanctions and social media deplatforming efforts respond to influence attackers once they have been identified as such, but these efforts take place after attacks have already occurred, sometimes over the course of several years.

The articles of this issue reflect an array of efforts to influence the perceptions, emotions, and behavior of human beings at both individual and societal levels. In the absence of comprehensive strategies to more effectively defend against these efforts, the United States risks losing much more than military advantage; we are placing at risk the perceived legitimacy of our systems and institutions of governance, as well as our economic security, our ability to resolve social disagreements peacefully, and much more. 83 Further, many other nations are also facing the challenges of defending against foreign influence efforts. As such, the transnational nature of influence opportunities and capabilities in the digital age may require a multinational, coordinated response. In the years ahead, further research will be needed to uncover strategies for responding to the threat of digital influence warfare with greater sophistication and success.

James J. F. Forest is a professor at the School of Criminology & Justice Studies, University of Massachusetts Lowell and a visiting professor at the Fletcher School, Tufts University. He has published more than 20 books in the field of international security studies, most recently Digital Influence Warfare in the Age of Social Media (2021) and Digital Influence Mercenaries (2021).

1. Marshall McLuhan, Culture Is Our Business (Eugene, OR: Wipf and Stock Publishers, 1970), 66.

2. John Arquilla and David Ronfeldt, "The Advent of Netwar (Revisited)," in Networks and Netwars: The Future of Terror, Crime, and Militancy, ed. John Arquilla and David Ronfeldt (Santa Monica, CA: Rand, 2001), 1, https://doi.org/10.7249/MR1382.

3. Isaiah Wilson III, "What Is Compound Security?: With Dr. Isaiah 'Ike' Wilson III (Part 2 of 4)," YouTube, 26 February 2021, 16:48; and Isaiah Wilson III and Scott A. Smitson, "The Compound Security Dilemma: Threats at the Nexus of War and Peace," Parameters 50, no. 2 (Summer 2020): 1–17.

4. Wilson, "What Is Compound Security?"; and Wilson and Smitson, "The Compound Security Dilemma."

5. Max Boot and Michael Doran, "Political Warfare," Council on Foreign Relations, 28 June 2013.

6. Paul A. Smith, On Political War (Washington, DC: National Defense University Press, 1989), 3.

7. Carnes Lord, "The Psychological Dimension in National Strategy," in Political Warfare and Psychological Operations: Rethinking the US Approach, ed. Carnes Lord and Frank R. Barnett (Washington, DC: National Defense University Press, 1989), 16.

8. Military Information Support Operations, Joint Publication 3-13.2 (Washington, DC: Joint Chiefs of Staff, 2014).

9. Diego A. Martin and Jacob N. Shapiro, Trends in Online Foreign Influence Efforts (Princeton, NJ: Woodrow Wilson School of Public and International Affairs, Princeton University, 2019), 3.

10. Martin and Shapiro, Trends in Online Foreign Influence Efforts.

11. Martin and Shapiro, Trends in Online Foreign Influence Efforts.

12. Philip N. Howard, Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations and Political Operatives (New Haven, CT: Yale University Press, 2020), 75.

13. Caroline Jack, Lexicon of Lies: Terms for Problematic Information (New York: Data & Society Research Institute, 2017), 6.

14. Carl Miller, The Death of the Gods: The New Global Power Grab (London: Windmill Books, 2018), xvi.

15. Mark Galeotti, Russian Political War: Moving Beyond the Hybrid (Abingdon, UK: Routledge, 2019), 11.

16. Michael V. Hayden, The Assault on Intelligence: American National Security in an Age of Lies (New York: Penguin Press, 2018), 191.

17. In addition to terrorists and insurgents using these tools of digital influence for political purposes, we also see various kinds of individuals and marketing firms engaged in profit-seeking activities as described in James J. F. Forest, Digital Influence Mercenaries: Profit and Power Through Information Warfare (Annapolis, MD: Naval Institute Press, 2021).

18. James. J. F. Forest, ed., Influence Warfare: How Terrorists and Governments Fight to Shape Perceptions in a War of Ideas (Westport, CT: Praeger Security International, 2009).

19. While Arquilla and Ronfeldt initially defined swarming as a "deliberately structured, coordinated, strategic way to strike from all directions," in this context the term is used to describe a collection of social media accounts that converges on a single target like a swarm of bees. See John Arquilla and David Ronfeldt, Swarming and the Future of Conflict (Santa Monica, CA: Rand, 2000); and Ali Fisher, "Swarmcast: How Jihadist Networks Maintain a Persistent Online Presence," Perspectives on Terrorism 9, no. 3 (June 2015): 3–20. Bandwagoning is a term from social psychology that describes a type of cognitive bias and collective identity signaling that leads people to adopt the behaviors or attitudes of others. This can be observed in political campaigns, support for a winning sports team, fashion trends, adoption of new consumer electronics, and many other arenas of daily life.

20. James J. F. Forest, Digital Influence Warfare in the Age of Social Media (Santa Barbara, CA: ABC-CLIO/Praeger Security International, 2021).

21. Specifically, chapter 3, "Attack by Stratagem," reads: "Supreme excellence consists in breaking the enemy's resistance without fighting." Sun Tzu, The Art of War (New York: Fall River Press, 2015), 54.

22. Galeotti, Russian Political War, 10.

23. Russell Hsiao, "CCP Propaganda against Taiwan Enters the Social Age," China Brief 18, no. 7 (April 2018).

24. W. Phillips Davison, "Some Trends in International Propaganda," Annals of the American Academy of Political and Social Science 398, no. 1 (November 1971): 1–13, https://doi.org/10.1177/000271627139800102.

25. Daniel Baracskay, "U.S. Strategic Communication Efforts during the Cold War," in Influence Warfare, 253–74.

26. James Woods, History of International Broadcasting, vol. 2 (London: IET, 1992), 110.

27. Woods, History of International Broadcasting, 110–11.

28. Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (New York: Farrar, Straus and Giroux, 2020), 4.

29. Rid, Active Measures, 7.

30. Richard H. Shultz and Roy Godson, Dezinformatsia: Active Measures in Soviet Strategy (New York: Pergamon Brassey's, 1984), 133.

31. Shultz and Godson, Dezinformatsia, 133.

32. Shultz and Godson, Dezinformatsia, 149.

33. Shultz and Godson, Dezinformatsia, 150–51.

34. Shultz and Godson, Dezinformatsia, 152–53.

35. Shultz and Godson, Dezinformatsia, 155.

36. Shultz and Godson, Dezinformatsia, 157.

37. This is a cornerstone of Russia's digital influence warfare program and the title of an important book. See Peter Pomerantsev, Nothing Is True and Everything Is Possible: The Surreal Heart of the New Russia (New York: Public Affairs, 2014).

38. This section of the discussion significantly amplifies and paraphrases a report by Eric V. Larson et al., Understanding Commanders' Information Needs for Influence Operations (Santa Monica, CA: Rand, 2009), Appendix B: Task List Analysis, 71–73, which cites several Department of the Army documents and 1st Information Operations Command (Land), Field Support Division, "Terminology for IO Effects," in Tactics, Techniques and Procedures for Operational and Tactical Information Operations Planning (Washington, DC: Department of the Army, 2004), 23.

39. Larson et al., Understanding Commanders' Information Needs for Influence Operations, 71–73.

40. For details, see Forest, Digital Influence Mercenaries.

41. Howard, Lie Machines, 99–100. Junk news was defined by the Oxford Internet Institute as being articles from outlets that publish "deliberately misleading, deceptive or incorrect information." See Ryan Browne, "'Junk News' Gets Massive Engagement on Facebook Ahead of EU Elections, Study Finds," CNBC, 21 May 2019.

42. Elisa Shearer and Jeffrey Gottfried, "News Use Across Social Media Platforms 2017," Pew Research Center, 7 September 2017.

43. Ben Nimmo, Graham Brookie, and Kanishk Karan, "#TrollTracker: Twitter Troll Farm Archives, Part One—Seven Key Take Aways from a Comprehensive Archive of Known Russian and Iranian Troll Operations," Atlantic Council's Digital Forensic Research Lab, 17 October 2018.

44. For the purpose of this discussion, a "like-minded fellow traveler" is described as someone who sees the world in much the same way you do and is moving intellectually and emotionally in a direction that you approve of.

45. Claire Wardle, "Fake News. It's Complicated," First Draft, 16 February 2017.

46. Rid, Active Measures, 5, with a direct quote from famous Soviet defector Ladislav Bittman, author of the 1972 book The Deception Game (Syracuse, NY: Syracuse University Research Corp, 1972).

47. Various interpretations of this classic work use different phrasing. For example, "If your opponent is of choleric temper, seek to irritate him." Sun Tzu, The Art of War, 49 (passage 1.22); and "When their military leadership is obstreperous, you should irritate them to make them angry—then they will become impetuous and ignore their original strategy." Sun Tzu, The Art of War, trans. by Thomas Cleary (Boston, MA: Shambhala Pocket Classics, 1991), 15 (passage 1.12).

48. For a detailed examination of this event, see David E. Sanger, The Perfect Weapon: Sabotage and Fear in the Cyber Age (New York: Crown Publishing, 2018), 124–43.

49. Sanger, The Perfect Weapon, 143.

50. Robert S. Mueller III, Report on the Investigation into Russian Interference in the 2016 Presidential Election, vol. 1 (Washington, DC: Department of Justice, 2019).

51. Renee DiResta et al., Telling China's Story: The Chinese Communist Party's Campaign to Shape Global Narratives (Stanford, CA: Stanford Internet Observatory and Hoover Institution, Stanford University, 2020), 3.

52. Howard, Lie Machines, 77; Jonathan Kaiman, "Free Tibet Exposes Fake Twitter Accounts by China Propagandists," Guardian, 22 July 2014; and Nicholas J. Monaco, "Taiwan: Digital Democracy Meets Automated Autocracy," in Computational Propaganda: Political Parties, Politicians and Political Manipulation on Social Media, ed. Samuel C. Woolley and Philip N. Howard (New York: Oxford University Press, 2018), 104–27, https://doi.org/10.1093/oso/9780190931407.003.0006.

53. Stefan Halper, China: The Three Warfares (Washington, DC: Office of the Secretary of Defense, 2013), 12.

54. Halper, China.

55. Larry M. Wortzel, The Chinese People's Liberation Army and Information Warfare (Carlisle Barracks, PA: United States Army War College Press, 2014), 29–30. Note: according to Wortzel, a direct translation of yulun is "public opinion"; thus, in many English translations, the term "public opinion warfare" is used. In some People's Liberation Army translations of book titles and articles, however, it is called "media warfare."

56. Wortzel, The Chinese People's Liberation Army and Information Warfare.

57. Steven Collins, "Mind Games," NATO Review (Summer 2003).

58. Halper, China, 12–13.

59. Laura Jackson, "Revisions of Reality: The Three Warfares—China's New Way of War," in Information at War: From China's Three Warfares to NATO's Narratives (London: Legatum Institute, 2015), 5–6.

60. Jackson, "Revisions of Reality."

61. Ben Nimmo, "Question That: RT's Military Mission," Atlantic Council's Digital Forensic Research Lab, 8 January 2018.

62. Statement Prepared for the U.S. Senate Select Committee on Intelligence Hearing, 115th Cong. (30 March 2017) (statement of Clint Watts on "Disinformation: A Primer in Russian Active Measures and Influence Campaigns"), hereafter Watts statement.

63. Watts statement.

64. Watts statement.

65. For details on the efforts of both China and Russia, see Ross Babbage, Winning Without Fighting: Chinese and Russian Political Warfare Campaigns and How the West Can Prevail, vol. 1 (Washington, DC: Center for Strategic and Budgetary Assessments, 2019); Esther Chan and Rachel Blundy, "'Bulletproof' China-backed Site Attacks HK Democracy Activists," Yahoo News, 1 November 2019; John Costello and Joe McReynolds, China's Strategic Support Force: A Force for a New Era, China Strategic Perspectives 13 (Washington, DC: National Defense University Press, 2018); Joanne Patti Munisteri, "Controlling Cognitive Domains," Small Wars Journal, 24 August 2019; Austin Doehler, "How China Challenges the EU in the Western Balkans," Diplomat, 25 September 2019; Keoni Everington, "China's 'Troll Factory' Targeting Taiwan with Disinformation Prior to Election," Taiwan News, 5 November 2018; "Hong Kong Protests: YouTube Shuts Accounts over Disinformation," BBC News, 22 August 2019; Paul Mozur and Alexandra Stevenson, "Chinese Cyberattack Hits Telegram, App Used by Hong Kong Protesters," New York Times, 13 June 2019; and Tom Uren, Elise Thomas, and Jacob Wallis, Tweeting through the Great Firewall: Preliminary Analysis of PRC-linked Information Operations on the Hong Kong Protests (Canberra: Australian Strategic Policy Institute, 2019).

66. DiResta et al., Telling China's Story.

67. Background to "Assessing Russian Activities and Intentions in Recent U.S. Elections": The Analytic Process and Cyber Incident Attribution (Washington, DC: Office of the Director of National Intelligence, 2017); Ellen Nakashima, "Senate Committee Unanimously Endorses Spy Agencies' Finding that Russia Interfered in 2016 Presidential Race in Bid to Help Trump," Washington Post, 21 April 2020; Jane Mayer, "How Russia Helped Swing the Election for Trump," New Yorker, 24 September 2018; Philip N. Howard et al., The IRA, Social Media and Political Polarization in the United States, 2012–2018 (Oxford, UK: Programme on Democracy & Technology, 2018); and Nike Aleksejeva et al., Operation Secondary Infektion: A Suspected Russian Intelligence Operation Targeting Europe and the United States (Washington, DC: Atlantic Council Digital Forensic Research Lab, 2019).

68. Nathaniel Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia," Facebook Newsroom, 21 October 2019.

69. Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia."

70. Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia."

71. "Hacking Group Linked to Iran Targeted a U.S. Presidential Campaign, Microsoft Says," Los Angeles (CA) Times, 4 October 2019.

72. Cherilyn Ireton and Julie Posetti, Journalism, "Fake News" and Disinformation (Paris: UNESCO, 2018), 18.

73. Michael J. Mazarr et al., The Emerging Risk of Virtual Societal Warfare: Social Manipulation in a Changing Information Environment (Santa Monica, CA: Rand, 2019), 65–66, https://doi.org/10.7249/RR2714.

74. For instance, see Rob Price, "AI and CGI Will Transform Information Warfare, Boost Hoaxes, and Escalate Revenge Porn," Business Insider, 12 August 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 87.

75. Will Knight, "Fake America Great Again: Inside the Race to Catch the Worryingly Real Fakes that Can Be Made Using Artificial Intelligence," MIT Technology Review, 17 August 2018; for some examples of realistic Instagram memes created by powerful computer graphics equipment combined with AI, see "the_fakening," Instagram, accessed 6 April 2021.

76. Avi Selk, "This Audio Clip of a Robot as Trump May Prelude a Future of Fake Human Voices," Washington Post, 3 May 2017; Bahar Gholipour, "New AI Tech Can Mimic Any Voice," Scientific American, 2 May 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 85–86.

77. "Imitating People's Speech Patterns Precisely Could Bring Trouble," Economist, 20 April 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 86.

78. "Fake News: You Ain't Seen Nothing Yet," Economist, 1 July 2017; Faizan Shaikh, "Introductory Guide to Generative Adversarial Networks (GANs) and Their Promise!," Analytics Vidhya, 15 June 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 88.

79. Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 91.

80. Matthew Rosenberg, Nicole Perlroth, and David E. Sanger, "'Chaos Is the Point': Russian Hackers and Trolls Grow Stealthier in 2020," New York Times, 10 January 2020.

81. Rosenberg, Perlroth, and Sanger, "'Chaos Is the Point'."

82. Howard Becker, "The Nature and Consequences of Black Propaganda," American Sociological Review 14, no. 2 (April 1949): 221, https://doi.org/10.2307/2086855. "'Black' propaganda is that variety which is presented by the propagandizer as coming from a source inside the propagandized."

83. For a discussion of strategies to counter foreign influence threats from Chinese and Russian malign influence efforts, see Thomas G. Mahnken, Ross Babbage, and Toshi Yoshihara, Countering Comprehensive Coercion: Competitive Strategies Against Authoritarian Political Warfare (Washington, DC: Center for Strategic and Budgetary Assessments, 2018).

This work is licensed under a Creative Commons Attribution 4.0 International License.


Misinformation Is Warfare

Rather than flip on the TV when major newsworthy events happen, like Hamas' attack on Israel on Oct. 7 and the subsequent retaliation by Israeli forces in Gaza, we open up social media to get up-to-the-minute information. However, while television is still bound to regulations that require a modicum of truthful content, social media is a battleground of facts, lies, and deception, where governments, journalists, law enforcement, and activists are on an uneven playing field.

It is a massive understatement to use the term "fog of war" to describe what is happening in discussions of Hamas and Israel on social media. It's a torrent of true horror, violent pronouncements, sadness, and disinformation. Some have capitalized on this moment to inflame tensions or gain clout by posting video game clips or older images of war recontextualized. Many governments, including the U.S., were shocked that Israeli intelligence failed to see the land, sea, and air attack. Israel is known for its controversial cyber defense and spyware used to tap into journalists' and adversaries' networks. How could this have happened?

It may come as a surprise to some that we are involved in an information war playing out across all social media platforms every day. But it’s one thing to see disinformation, and it’s another to be an active (or unwitting) participant in battle.

Unlike individuals, states conduct warfare operations using the DIME model: "diplomacy, information, military, and economics." Most states do everything they can to inflict pain and confusion on their enemies before deploying the military. In fact, attacks on vectors of information are a well-worn tactic of war and are usually the first targets when the charge begins. It's common for telecom data and communications networks to be routinely monitored by governments, which is why the open data policies of the web are so concerning to many advocates of privacy and human rights.

With the worldwide adoption of social media, more governments are getting involved in low-grade information warfare through the use of cyber troops. According to a 2020 study by the Oxford Internet Institute, cyber troops are "government or political party actors tasked with manipulating public opinion online." The Oxford research group was able to identify 81 countries with active cyber troop operations utilizing many different strategies to spread false information, including spending millions on online advertising. Importantly, this situation is vastly different from utilizing hacking or other forms of cyber warfare to directly attack opponents or infrastructure. Cyber troops typically utilize social media and the internet as they are designed, while employing social engineering techniques like impersonation, bots, and growth hacking.

Data on cyber troops is still limited because researchers rely heavily on takedown reports by social media companies. But the Oxford researchers were able to identify that, in 2020, Palestine was a target of information operations from Iran on Facebook and Israel was a target of Iran on Twitter, which indicates that disinformation campaigns know no borders. Researchers also noted that Israel developed high-capacity cyber troop operations internally, using tactics like botnets and human-run accounts to spread pro-government and anti-opposition narratives and to suppress anti-Israel ones. The content Israeli cyber troops produced or engaged with included disinformation campaigns, trolling, amplification of favored narratives, and data-driven strategies to manipulate public opinion on social media.

Of course, there is no match for the cyber troops deployed by the U.S. government and ancillary corporations hired to smear political opponents, foreign governments, and anyone who gets in the way. Even companies like Facebook have employed PR firms to use social media to trash the reputation of competing companies. It's open warfare—and you've likely participated.

As for who runs influence operations online, researchers found evidence of a blurry boundary between government operatives and private firms contracted to conduct media manipulation campaigns. This situation suggests that contemporary cyber operations are best characterized as fourth generation warfare, which blurs the lines between civilians and combatants.

It has also called into question the validity of the checks that platforms have built to separate fact from fiction. For instance, a graphic video of the war posted by Donald Trump Jr., which he claimed came from a "source within Israel," was flagged as fake through X's Community Notes fact-checking feature. The problem, though, was that the video was real. This would not be the first time we have seen fact-checkers spread disinformation; pro-Russian accounts did something similar in 2022.

Time and time again, we have seen social media used to shape public opinion, defame opponents, and leak government documents, using tactics of deception such as fake engagement, search engine optimization, cloaked and imposter accounts, and cultural interventions through meme wars. Now more than ever we need politicians to verify what they are saying and arm themselves with facts. Even President Biden was fact-checked on his claim to have seen images of beheaded babies, when he had only read news reports.

Today, as we witness more and more attacks across Israel and Palestine, influential people—politicians, business people, athletes, celebrities, journalists, and folks just like me and you—are engaged in fourth generation warfare using networks of information as a weapon. The networks are key here: engagement is what distributes bytes of information—viral videos, hashtags, or memes—across vast distances.

If we have all been drafted into this war, here are some questions that information scientist and professor Amelia Acker and I developed to gauge whether an online post might be disinformation. Ask yourself: Is it a promoted post or ad? Promotion is a shortcut to massive audiences and can be a very cheap way to go viral. Is there authentic engagement on the post, or do all of the replies seem strange or unrelated? If you suspect the account is an imposter, conduct a reverse image search of profile pictures and account banners, and check whether the Wayback Machine has snapshots of the account from prior months or years. Lastly, to spot spam, view attached media (pictures, videos, links), look for duplicates, and check whether the account engages in spam posting, for example, replying to many posts with innocuous comments.
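The questions above can be sketched as a simple triage heuristic. The sketch below is purely illustrative: the field names and thresholds are invented for the example, a real tool would need platform data and human judgment, and each red flag simply mirrors one question from the checklist.

```python
# Illustrative triage score for a social media post, following the checklist
# in the text. All field names and thresholds are hypothetical.

def disinfo_warning_score(post: dict) -> int:
    """Count red flags; higher means more reasons to double-check the post."""
    flags = 0
    if post.get("is_promoted"):                  # paid reach is cheap virality
        flags += 1
    replies = post.get("replies", [])
    if replies and len(set(replies)) < len(replies) / 2:
        flags += 1                               # mostly duplicate replies
    if post.get("profile_image_reused"):         # e.g. via reverse image search
        flags += 1
    if post.get("account_age_days", 9999) < 30:  # no history in web archives
        flags += 1
    if post.get("duplicate_media_posts", 0) > 5:
        flags += 1                               # same media spammed repeatedly
    return flags

suspicious = {
    "is_promoted": True,
    "replies": ["great!", "great!", "great!", "great!"],
    "profile_image_reused": True,
    "account_age_days": 3,
    "duplicate_media_posts": 12,
}
benign = {"replies": ["interesting take", "disagree, because...", "source?"]}

print(disinfo_warning_score(suspicious), disinfo_warning_score(benign))  # 5 0
```

A score is not a verdict; the point, as the checklist itself suggests, is to decide which posts deserve a closer look before sharing.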

While my hope is for peace, we all must bear witness to these atrocities. In times of war, truth needs an advocate.


Propaganda and Information Warfare in Contemporary World: Definition Problems, Instruments and Historical Context

Elena Kotelenets

2019, Proceedings of the International Conference on Man-Power-Law-Governance: Interdisciplinary Approaches (MPLG-IA 2019)

Related Papers

Global Regional Review

Beenish Zaheen

Propaganda has always been a very important tool for influencing others. Many researchers have worked on and synthesized the concept of propaganda, but much remains to be done, as contemporary propaganda industries have given the phenomenon a very subtle structure. This study analyzes the propaganda definitions set forth by various researchers in order to develop a comprehensive definition of the concept. Definitions provided by thirty different propaganda scholars were analyzed to identify the elements of propaganda, and the fourteen most frequently occurring elements were extracted. On the basis of these elements, a comprehensive definition of propaganda was constructed to enrich the propaganda literature of Pakistan, which has been the target of international propaganda for the last couple of decades. This paper opens new horizons for Pakistani researchers in the field of propaganda.


Abdullah Alnajjar

We can surely say that millions of messages are sent around the world every minute by the media through its different outlets, and received by audiences with different orientations, beliefs, levels of awareness, and economic standing. These messages are delivered to people in various forms and with various content; this can be easily recognized, but it is not

SOJ Psychology

Dr. Jay Seitz

ABSTRACT: From a psychological perspective, I theorize that propaganda in wartime works insidiously by tapping into people’s prejudices and stereotypes and galvanizes belief in an immense conspiratorial network in which the “other” is given an ominous character. Individuals see the psychological characteristics of the other (“enemy”) as personal, pervasive, and permanent. That is, the other side (“enemy”) is collectively demonized by way of stereotypes (i.e., generalizations about categories of people and their beliefs) and simplifications (i.e., reducing events and their causes to one or two variables), while one’s own side is seen as wholly good. I describe how the mass media heighten the impact of propaganda by fostering a strong feeling of community and using cults of experts to structure bias, among other things. PARTIAL SUMMARY: Propaganda in wartime is ubiquitous in all societies but most likely originated in the writings of military tacticians in China in the fifth century B.C.E. or earlier. Power, as Napoleon once opined, is based on opinion, and governments in wartime “manufacture consent” among the various publics by justifying and then carrying out their own intended actions. Propaganda has been most effective in democratic regimes, in which tolerance of, and respect for, different opinions and ways of life is considered the foundation of democratic behavior. Propaganda eviscerates democracy because its often unstated goal is to provoke active or passive participation without democratic deliberation, the political core of free societies. Propaganda accomplishes its goals through several means.
It may create a political climate favorable for attitude change (pre-persuasion or pre-propaganda), modify opinions directly through the print and non-print media (political propaganda), operate openly so that the democratic public is aware it is being propagandized (overt or white propaganda), or disguise its true goals to further a more covert intention (covert or black propaganda). Historically, propaganda in wartime has typically descended from the upper echelons of government (vertical propaganda), but it may just as easily arise from below in the internal dynamics of an organized group or institution (political education; horizontal propaganda).

Grigoriy Nikitin

The article analyzes the phenomenon of "information war" in the modern world. Information war is an indirect mechanism for manipulating public consciousness, and modern information war involves psychological pressure on society. In the modern world, information war is characterized by a qualitative change in traditional cultural and spiritual life, the rupture of national ideals and values, and the dismantling of historical memory.

N. 7, year IV, May 2015 - Pesquisas Doutorais

Marco Marsili

The article is a short study carried out in the International Relations seminars of the Ph.D. program in History, Security and Defense Studies. The aim of the work is to highlight the role of propaganda during conflicts in the contemporary age, particularly during the two World Wars. Propaganda, which developed during the two major conflicts of the twentieth century as a true ‘weapon’ and an instrument of government policy in international relations, has perfected the techniques of ‘news management’ and is today a real and relished ‘art’ applied to steering public opinion in favor of government decisions.

Jonathan Auerbach

In an effort to reorient the field of propaganda studies, this essay offers thirteen interrelated propositions about propaganda. The concept is defined as a mode of mass persuasion with a distinct historical genesis predating its modern use strictly as a term of disrepute. These propositions address the moral and affective dimensions of propaganda, as well as the relation between propaganda and other kinds of public information and institutions such as advertising, teaching, and religion. Offering a functionalist and contextual approach to studying propaganda, the propositions shift attention from content analysis to emphasize how information flows through various media networks. The essay regards the targets of mass persuasion not as passive dupes, as customarily assumed, but rather active consumers who play a part in shaping the meanings and effects of propaganda—past, present, and future—both in totalitarian societies and more democratic ones.

In Y. R. Kamalipour (Ed.) Transnational Media and Global Communication, 2nd Edition.

Richard Vincent

Miroslav Mitrovic

We bear witness to the impact that information has on decision-making processes, political practice, international relations, and the widest public opinion. The manipulation of information and the systematic intrusion of social-engineering drivers have wide, large-scale repercussions in different parts of the world, changing the face and dynamics of international relations, governmental structures, and political, ethnic, economic, and ownership relations. Based on analysis of the previous academic and scholarly literature, these effects can be read as consequences of the implementation of the Hybrid Warfare concept through psychological operations and communication strategies. Content analysis and induction-deduction are used to provide a baseline for a frame analysis of the use of mass communication to achieve the aims of the Hybrid Warfare concept.

Revista San Gregorio

Olena Zinenko

The research focuses on the concepts of “infodemic” and “information war” and their modern transformations. To establish the theoretical basis of the work, inductive and deductive methods of analysis were used; an interdisciplinary research method was employed to identify the genesis, structuring principles, and specific characteristics of the infodemic as a form of manipulative informational influence. The author’s BNA analysis made it possible to determine the nature of information influences and the effectiveness of socially valuable performance as both a countermeasure to information influences and a training method. It was found that the interaction of different media and social communications—PR, advertising, and journalism—makes it possible to increase the impact of information waves on society; the infodemic is an example of such an intense impact. The problem of determining the algorithm, nature, and means of combating information intrusio...

Joan Pedro-Carañana

2011b. This two-part article explores Herman and Chomsky’s propaganda model from diverse angles, with the aim of deepening its current dynamism and validity for explaining mass media production and content in advanced capitalist democracies. Part I of the contribution studies the contemporary relevance of the five components or “filters” that comprise the model, relates them to ongoing sociohistorical developments, and focuses on the different interactions affecting the media in the context of power relations. It then analyzes the situations in which the spectrum of media opinion is more open. Part II focuses on the validity of the model for explaining news content both in countries other than the United States and on the Internet, as well as for explaining media products other than news. This is followed by an examination of the possibility of expanding and modifying the model by incorporating other factors, which may be considered secondary filters.

VISUAL INVESTIGATIONS

Satellite Images Reveal Where Russian Nukes Could Be Stored in Belarus

A New York Times analysis shows security upgrades unique to Russian nuclear storage facilities at a Cold War-era munitions depot.

By Christoph Koettl

The New York Times, Source: Maxar Technologies

A newly added air defense system.

A distinctive security checkpoint.

And a triple fence around a bunker.

These new security features and other upgrades at a munitions depot in central Belarus reveal that Russia is building facilities there that could house nuclear warheads. If Russia does move weapons to this location, it would mark the first time it has stored them outside the country since the fall of the Soviet Union in 1991.

Russia already has nuclear warheads on its own soil that are close to Ukraine and NATO countries, but by basing some in Belarus, the Kremlin appears to be trying to accentuate its nuclear threat and bolster its nuclear deterrent.

Russia’s president, Vladimir V. Putin, made reference to such a site early last year, saying Russia would soon be completing the construction of “special storage for tactical nuclear weapons” in Belarus.

The New York Times analyzed satellite imagery and photos, and spoke with nuclear weapons and arms control experts, to track the new construction, which started in March 2023.

The site is 120 miles north of the Ukrainian border at a military depot next to the town of Asipovichy. Some of the recently built structures there have features that are unique to nuclear storage facilities at bases inside Russia. For example, a new, highly secure area is surrounded by three layers of fencing, in addition to the existing security perimeter of the entire base. Another telltale sign is a covered loading area connected to what appears to be a concealed Soviet-era underground bunker.


Hans Kristensen of the Federation of American Scientists, who has analyzed the site , said that the nuclear developments in Belarus “appear designed to unnerve NATO’s easternmost member states, but will not give Russia a significant new military advantage in the region.”

There is no consensus definition of a tactical nuclear weapon, as opposed to longer-range strategic arms. But Russia defines tactical arms as those with a range of up to 300 kilometers, about 186 miles. Because nuclear programs are so secretive, it’s possible there are other locations in Belarus where Russia is storing warheads — and the Kremlin may have even moved some to the Asipovichy location, though all indications suggest otherwise. Neither the Russian nor the Belarusian ministry of defense responded to requests for comment.

Nuclear warheads are typically stored close to military bases with the capability to deliver the weapons. The suspected nuclear storage site is in the same town as Belarus’s Iskander missiles, which can be used to launch nuclear or conventional warheads. Russia delivered the Iskanders to Belarus in 2022.


Over the past week, both Russia and Belarus have made statements about nuclear weapons drills. On Monday, the Kremlin said it would hold military exercises with troops based near Ukraine to train for the possible use of tactical nuclear weapons. On Tuesday, the Belarusian defense minister told state media that an inspection had begun of the Iskander forces and other nuclear weapons delivery systems.

Russia’s comments immediately provoked condemnation by the U.S. and NATO for “irresponsible rhetoric.”

“We are reviving Cold War practices, hence we are reviving Cold War risks,” said Jeffrey Lewis , an arms control expert at the Middlebury Institute of International Studies at Monterey in California.

In 2023, as new fences went up to create a higher-security area at the Asipovichy base, a covered area was revamped, including a truck loading dock that now has a new roof, shielding any activities from surveillance above. These renovations are consistent with structures seen at other former Soviet nuclear storage sites. Seen below, a matching dock in Hungary contains an internal entrance to an old, tree-covered underground bunker.

William Moon , an independent consultant and former official with the Pentagon’s Defense Threat Reduction Agency , told The Times that the design of the Asipovichy upgrades, with triple fencing, one main entry and an emergency exit, resembles the Russian nuclear warhead storage sites he has seen in person. Mr. Moon, who worked on nuclear warhead security with Russia, said, “When we were working with their standards, they would require that third layer fencing.”

He said that in addition to added security, he would also expect separate housing for the Russian military unit that remains in control of the nuclear warheads. Three new buildings, which appear to be either for administrative use or barracks, have been set up in the depot entrance area, and an additional area is currently being bulldozed.

At the entrance to the triple-fenced zone, a security checkpoint — a covered inspection area next to a guardhouse — was added in 2023. These types of structures have become fixtures over the last two decades at nuclear sites inside Russia, according to Michael Duitsman , a colleague of Mr. Lewis’s at the Middlebury Institute. They are a “ unique feature not seen at other Russian bases ,” he said.

Security checkpoints fit a distinct pattern

Asipovichy, Belarus

Novgorod Oblast, Russia

Khabarovsk Krai, Russia

Belgorod Oblast, Russia

Sources: Analysis by Michael Duitsman, Middlebury Institute of International Studies at Monterey; Maxar Technologies; Google Earth

By The New York Times

In recent weeks, construction began on what may be new buildings. “The details are still uncertain, but construction has clearly entered a new phase,” said Mr. Kristensen.

An aerial view shows two large concrete pads, possibly for new buildings, and several existing structures on reddish dirt.

An air defense system has also been brought in to protect the site. It was initially spotted camouflaged in mid-2023, including through radar satellite imagery provided by the space company Umbra. Since September, one of the air defense vehicles has been deployed in a field about a mile from the bunker.

Asipovichy is part of nuclear history. The same site that Russia is building out today was likely used to store nuclear weapons during the Cold War. The Soviet Union began basing nuclear missile brigades in and around the town in the 1960s, according to William Alberque, who has been a director at the think tank International Institute for Strategic Studies (IISS) and a Pentagon and NATO official. It also stationed a military unit that managed nuclear weapons at an artillery munitions storage site, he said. After the Soviet Union disintegrated in 1991, all nuclear weapons were removed from Belarus.

Declassified U.S. intelligence satellite photos of the Asipovichy site taken during the Cold War appear to show these two functions. The southern section was thought to be for conventional weapons, with clearings and many storage buildings. In a separate, tree-covered northern section, four bunkers are visible, with a walled compound farther north — the exact spot where the current construction can be seen.

Asipovichy Military Depot | 1984

Source: United States Geological Survey

While the 1970 Nuclear Non-Proliferation Treaty prohibits the transfer of nuclear weapons to non-nuclear states, it doesn’t ban housing nuclear weapons abroad if control is maintained by the country that owns them. Under NATO’s nuclear sharing arrangement , the U.S. currently has nuclear weapons in some member countries.

A U.S. State Department spokesperson would not say if the United States was monitoring any particular site in Belarus, but said the department is keeping a close eye on the situation in order “to ensure Russia maintains control of its weapons in the event of any deployment to Belarus and upholds its obligations under the Nuclear Non-Proliferation Treaty.” An April 2024 State Department report said that the U.S. would not change its nuclear posture in response to the developments in Belarus.

Julian E. Barnes and Dmitriy Khavin contributed reporting. Phil Robibero , Blacki Migliozzi , David Botti and Alexander Cardia contributed visual production.

Christoph Koettl is a Times reporter on the Visual Investigations team . More about Christoph Koettl


COMMENTS

  1. Information Warfare Essay Contest

    2022 Information Warfare Essay Contest. Sponsored by Booz Allen Hamilton. Deadline. 30 November 2022. The Nation's adversaries and competitors are proving to be formidable in the digital battlespace. Essay Contest.

  2. Beyond Bullets and Bombs: The Rising Tide of Information War in ...

    The Information Space and Cognitive Warfare. Cognitive warfare is a new term applied to an old concept. In writings from ancient strategists like Kautilya and Sun Tzu to modern practitioners like George Kennan, victory is achieved by successfully shaping how a population perceives a set of events through a mix of overt and covert messaging.

  3. NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest

    NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest 15 February 2022. From Petty Officer 2nd Class Huy Tonthat, Naval Postgraduate School . MONTEREY, Calif. - U.S. Navy Cmdr ...

  4. PDF INFORMATION WARFARE ESSAY CONTEST

    INFORMATION WARFARE ESSAY CONTEST SPONSORED BY THE REWARD First Prize: $5,000 Second Prize: $2,500 Third Prize: $1,500 All winners will receive a 1-year membership in the Naval Institute. SUBMISSION GUIDELINES Open to all contributors— active-duty military, reservists, veterans, and civilians. • Word count: 2,500 words max (excludes ...

  5. NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest

    This work, NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest, by PO1 Tom Tonthat, identified by DVIDS, must comply with the restrictions shown on https://www.dvidshub.net ...

  6. NPS Student, Professor Win 2021 USNI Information Warfare Essay Contest

    U.S. Navy Cmdr. Edgar Jatho, a doctoral student in the Naval Postgraduate School (NPS) Department of Computer Science, and his advisor Assistant Professor Joshua A. Kroll have been named the winners of the U.S. Naval Institute (USNI) 2021 Information Warfare Essay Contest for their piece, "Artificial Intelligence: Too Fragile to Fight?" Jatho and Kroll will be honored this week at an awards ...

  7. Information Warfare

    An anthology of essays from practitioners of modern information warfare, Ideas as Weapons is an excellent introduction to the subject. From the grand strategic view, to tactical applications of information, the work encompasses a variety of topics. The work itself is a product of its era; most essays focus on US/Western operations in Iraq and ...

  8. Information warfare

    Information warfare (IW) is the battlespace use and management of information and communication technology (ICT) in pursuit of a competitive advantage over an opponent. ... An essay on Information Operations by Zachary P. Hubbard [permanent dead link] News articles

  9. Information warfare: methods to counter disinformation

    Leading Issues in Information Warfare & Security Research 1/1 (2011), 80. 103 Robert Murphy and Richard Cathers. 'Transparency in Force Modernisation Decisions'. Information and Security 23/2 (2009), 215. 104 Led primarily by the Lockheed Martin study of intrusion kill chains, see Hutchins, Intelligence-driven.

  10. Information Warfare as Future Weapon of Mass-disruption, Africa 2030s

    Information warfare will become ingrained in society as the virtual and real worlds increas- ingly merge. The identified four information warfare scenarios for the 2030s as well as the information warfare future model can serve as frameworks or mental models for wider application in the TWEPS environments and further research. Conclusion

  11. Forms and Examples of Information Warfare

    Disinformation and propaganda are the two main forms of information warfare. Most of these forms gained considerable credit during the world war age, and they have remained fundamental to date in military, government, and business management domains (Fogleman, 1995). A notable example of propaganda information warfare is evident after a close ...

  12. [PDF] What Is Information Warfare

    This essay examines that line of thinking and indicates several fundamental flaws while arguing the following points: information warfare, as a separate technique of waging war, does not exist; rather, there are several distinct forms of information warfare, each laying claim to the larger concept.

  13. Information Warfare

    INFORMATION WARFARE Prof George J. Stein, AWC We need to state up front that much of what is discussed in this essay on information warfare is unofficial speculation. There is no official, open-source US government definition of information warfare. The Department of Defense calls its current thinking and approach to information warfare

  14. U.S. Naval Institute Information Warfare Essay Contest

    The U.S. Naval Institute Information Warfare Essay Contest contest is open to all contributors -- active-duty military, reservists, veterans, and civilians. Essays must be no more than 2,500 words, excluding end notes and sources. Include word count on title page of the essay. Essays are judged in the blind.

  15. Project MUSE

    Paul A. Smith describes political warfare as "the use of political means to compel an opponent to do one's will" and "its chief aspect is the use of words, images, and ideas, commonly known, according to context, as propaganda and psychological warfare." Carnes Lord notes a "tendency to use the terms psychological warfare and political ...

  16. Misinformation Is Warfare

    Different from individuals, states conduct warfare operations using the DIME model—"diplomacy, information, military, and economics." Most states do everything they can to inflict pain and ...

  17. (PDF) Propaganda and Information Warfare in Contemporary World

    The essay regards the targets of mass persuasion not as passive dupes, as customarily assumed, but rather active consumers who play a part in shaping the meanings and effects of propaganda—past, present, and future—both in totalitarian societies and more democratic ones. ... Information warfare represents the most concentrated propaganda ...

  18. Artificial Intelligence: Too Fragile to Fight?

    Information Warfare Essay Contest—First Prize. Sponsored by Booz Allen Hamilton. Automation—including AI—has persistent, critical vulnerabilities that must be thoroughly understood and adequately addressed if defense applications are to remain resilient and effective.

  19. China's Chilling Cognitive Warfare Plans

    China is focusing on developing generative AI and BMI technologies, and will continue to work toward its ultimate goal of "winning without fighting" by improving its ability to control human ...

  20. Manila seizes the moral high ground in information wars with China

    SEOUL, South Korea — Manila is turning Beijing 's gray-zone tactics on their head as it shows off its own information warfare muscles in the intensifying territorial battles in the South China ...

  21. Satellite Images Reveal Where Russian Nuclear Weapons Could Be Stored

    A New York Times analysis shows security upgrades unique to Russian nuclear storage facilities at a Cold War-era munitions depot.