[Editor’s Note: As with everyone else on the planet, the U.S. Army Mad Scientist Initiative’s 2020 “battle rhythm” was disrupted by the COVID-19 Global Pandemic. Consequently, instead of hosting live Mad Scientist conferences this year, we facilitated a series of Weaponized Information Virtual Events (including three podcasts, six webinars, a wargame, and a capstone conference) with our co-sponsors, Georgetown University’s Center for Security Studies and the Center for Advanced Red Teaming, University at Albany, SUNY. Concurrently, we also crowdsourced a complementary Information Warfare Vignettes Writing Contest to envision how weaponized information could be operationalized. Today’s blog post captures the key insights that we’ve gleaned from this past Spring and Summer’s events, organized into a dozen themes. Following each, we’ve provided links to assist you in drilling down further to explore and learn more about each of these insights.
Please accept our hearty “Thanks!” for participating online during these events — your insightful chat room commentaries and penetrating questions for our presenters and panelists enhanced our collective understanding of this critical topic. The Army’s ability to achieve decision dominance in multi-domain operations by gaining and maintaining information advantage has been directly enhanced through your participation — Read on!]
1. Disinformation will be a central component of competition and conflict at the strategic, operational, and tactical levels and can be a decisive factor in an adversary’s ability to meet their political and military aims. Adversaries will track the movement of U.S. and allied troops all the way from their home station installations (i.e., the Strategic Support Area in the U.S.) to the Close Area fight. This will afford our adversaries the opportunity to target misinformation at Soldiers and their families. The resulting panic at home may cause confusion within units, disrupt decision-making, and adversely affect unit cohesion and readiness. The use of social networks with in-app geolocation features also presents a vulnerability, as these signals can be hacked and used to monitor and track troop movements. Military families present a vulnerability for exploitation: financially and psychologically at home, and kinetically against our Soldiers in Theater. Whole-of-force resiliency efforts are necessary to “harden” and protect our Soldiers and their families from adversaries’ disinformation campaigns.
Explore this topic further in these related posts: What We are Learning about the Operational Environment; Blurring Lines Between Competition and Conflict; Nowhere to Hide: Information Exploitation and Sanitization; Weaponized Information: One Possible Vignette; and Located, Isolated, and Distracted – An Infantry Platoon Leader’s Experience by COL Scott Shaw. Also check out the video presentation AI and Manufacturing Reality [access using a non-DoD network] with Dr. Marek Posard and Dr. Christopher Paul, RAND Corporation.
2. The core purpose of information warfare is to corrupt trust, erode national legitimacy, and sow doubt about what is true. Command and control is built on trust. People tend to believe individuals who belong to organizations they find credible. Each realm of social media in the information environment (written, graphic, video, etc.) is an abstraction layer of society and reality. Transparency is essential for developing trust in the competition phase; the military, and society as a whole, need to proactively defend against weaponized information, especially with respect to building trust and credibility. The ultimate goal of malicious actors is to undermine Americans’ and partner nations’ faith in the integrity of their democratic processes and sow pervasive mistrust in their systems. Perception warfare doesn’t just mean weaponized information; it means the weaponized perception of information. Our adversaries will use perception hacking to exploit cognitive shortcuts and biases, tap into collective fears, and foster distrust of our experts and institutions, all with the goal of dividing and conquering.
Explore this topic further in these related posts: Shaping Perceptions with Information Operations: Lessons for the Future by Taylor Galanides; In the Cognitive War – The Weapon is You! by Dr. Zac Rogers; Damnatio Memoriae through AI by Marie Murphy; and The Death of Authenticity: New Era Information Warfare. Also check out the following video presentations [access using a non-DoD network]: A Superiority-Engine for the Information Environment with Lewis Shepherd, Senior Director, Technology Strategy, VMware; The Storm After the Flood Virtual Wargame Panel led by Dr. Gary Ackerman, Director, and Doug Clifford, Program Manager, The Center for Advanced Red Teaming, University at Albany, SUNY; Assault on Authenticity with Kara Frederick, Fellow, Technology & National Security, CNAS; and Perception Warfare and Whole of Society Solutions with Olga Belogolova, Adjunct Assistant Professor, Georgetown University Center for Security Studies.
3. In order to fight and win in the future information environment, accurate and appropriate data will be critical to decision-making. Bad training data results in bad outcomes. Data interoperability is critical, and AI can help parse large amounts of data from a variety of sources to produce more “wholesome” results. There are millions of media types, and the number of tools available to manipulate media far exceeds our capability to detect such manipulation. The growth of visual media content on the internet has been exponential: billions of pieces of content are uploaded online daily.
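As a hedged illustration of the “bad training data results in bad outcomes” point, the minimal Python sketch below trains the same model on clean labels and on partially flipped labels, then compares test accuracy. The synthetic dataset, classifier choice, and 30% corruption rate are all assumptions chosen for demonstration, not a description of any Army system.

```python
# Minimal sketch: the same model trained on clean vs. corrupted labels.
# Assumes scikit-learn is installed; the data are synthetic stand-ins for
# any labeled corpus (e.g., "authentic" vs. "manipulated" content).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Bad training data": flip 30% of the training labels at random.
rng = np.random.default_rng(0)
y_bad = y_train.copy()
flipped = rng.choice(len(y_bad), size=int(0.3 * len(y_bad)), replace=False)
y_bad[flipped] = 1 - y_bad[flipped]

for name, labels in [("clean labels", y_train), ("30% corrupted labels", y_bad)]:
    model = LogisticRegression(max_iter=1000).fit(X_train, labels)
    print(f"{name}: test accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")
```

Running the sketch shows the corrupted-label model losing measurable accuracy on the very same test set, which is the whole point: no amount of compute rescues a model fed unreliable data.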
Explore this topic further in these related posts: AI Enhancing EI in War by MAJ Vincent Dueñas; The Guy Behind the Guy: AI as the Indispensable Marshal by Mr. Brady Moore and Mr. Chris Sauceda; and Integrating Artificial Intelligence into Military Operations by Dr. James Mancillas. Also check out the following video presentation [access using a non-DoD network]: AI and Manufacturing Reality with Dr. Marek Posard and Dr. Christopher Paul, RAND Corporation.
4. The low barrier to entry, low cost, and difficulty in attributing sources of disinformation mean that non-state actors and super-empowered individuals, in addition to state adversaries, will be able to carry out sophisticated operations to manipulate the information environment. The internet reduces the transaction costs of exploiting others, allowing for large-scale scams, targeting at very low cost, and gaslighting at scale. While many methods and tools are low-skill and low-cost, the more advanced the fake needs to be, the greater the skill and resources required to produce it.
Explore this topic further in these related posts: Influence at Machine Speed: The Coming of AI-Powered Propaganda by MAJ Chris Telley; LikeWar – The Weaponization of Social Media; and Extremism on the Horizon: The Challenges of VEO Innovation by Colonel Montgomery Erfourth and Dr. Aaron Bazin. Also check out the following video presentations [access using a non-DoD network]: The Content Blitz: Quantifying the Islamic State’s Impact on the Saudi Twittersphere, with Henry Mihm, Inès Oulamine, and Fiona Singer, Georgetown University; AI Speeding up Disinformation with LTG John D. Bansemer (USAF-Ret.), Dr. Margarita Konaev, Katerina Sedova, and Tim Hwang, Georgetown University’s Center for Security and Emerging Technology; and Using AI to Detect Visual Media Manipulation at Scale with Dr. Neil Johnson, Technical SETA, Information Innovation Office, DARPA.
5. Information warfare works effectively through cultural understanding and analysis, with subsequent adapted targeting that fits within convincing societal narratives. Targeted ads and information, micro-targeting, and micro-influencing of specific audiences lead to distancing and distrust. The most persuasive source for communication is “someone like you.” Trends in political tribalization exacerbate this issue, and our adversaries exploit it: they are focused on exacerbating cultural fractures and seams within Western societies.
While we have an obligation to stand up and refute untruths, doing so rarely makes a difference. A better strategy is to shift our focus to anticipating and preparing for future attacks, rather than trying to counter previous ones. Refuting disinformation doesn’t work because the rebuttal is rarely packaged as compellingly as the original narrative, and due to anchoring bias and cognitive dissonance, trying to retract disinformation only entrenches it further in our minds.
Explore this topic further in these related posts: The Convergence: Bias, Behavior, and Baseball with Keith Law and the associated podcast; and The Convergence: Political Tribalism and Cultural Disinformation with Samantha North and the associated podcast. Also check out the following video presentations [access using a non-DoD network]: Weaponization of Social Media and Fictional Intelligence (FICINT) with Peter W. Singer and August Cole; Disinformation and Narrative Warfare with Dr. Ajit Maan and Paul Cobaugh, Georgetown University’s Center for Security Studies; and Decision Making with Keith Law, Senior Baseball Writer, The Athletic.
6. Disinformation tends to follow home-grown and pre-existing social divisions. It does not create these problems; it works to exacerbate them. Threat actors have a good understanding of culture and are adept at identifying social fissures in the target population. People from different cultural narratives will process information differently, and the narratives that we grow up with shape how we view ourselves and see our place in the world. Weaponized narratives are so powerful because what is being weaponized against us is something we’re not conscious of on a daily basis. Why can’t we counter disinformation with the truth? Because the disinformation is told better and resonates more meaningfully with the target audience than the facts do.
Explore this topic further in these related posts: Lincoln Zaleski’s paper on Disinformation Ink Spots: A Framework to Combat Authoritarian Disinformation in GEN Z and the OE: 2020 Final Findings and the associated podcast (Part 2); Virtual War – A Revolution in Human Affairs (Parts I and II) by COL(R) Stefan J. Banach; and The Convergence: The Next Iteration of Warfare with Lisa Kaplan and the associated podcast. Also check out the following video presentation [access using a non-DoD network]: Disinformation and Narrative Warfare with Dr. Ajit Maan and Paul Cobaugh, Georgetown University’s Center for Security Studies.
7. Counter-narratives will not work unless they are part of a larger narrative strategy. They should not refer to the adversarial or false narrative, because doing so only entrenches that narrative further in people’s thinking. Cultural comprehension is key to identifying what resonates most with the target audience and helps us understand how best to communicate with them.
Explore this topic further in these related posts: The Convergence: The Psychology of Terrorism and Disinformation with Dr. Aleks Nesic and the associated podcast; and Extremism on the Horizon: The Challenges of VEO Innovation by Colonel Montgomery Erfourth and Dr. Aaron Bazin. Also check out the following video presentation [access using a non-DoD network]: Disinformation and Narrative Warfare with Dr. Ajit Maan and Paul Cobaugh, Georgetown University’s Center for Security Studies.
8. Adversaries seek to overwhelm target populations through a firehose of disinformation operations, using partially or wholly false and mis-contextualized information, effectively burying the truth in a sea of falsehoods. Virality trumps veracity: if something trends, it makes an impact, affecting viewers’ opinions regardless of whether it is true. ISIS can produce an outsized effect on Twitter because the platform’s algorithm relies on volume, meaning that trending hashtags last longer. High volumes of repetitive, half-true narratives are pushed by human operators and trolls/bots onto both sides of a conflict to exploit natural fissures in society. Conspiracy theorists, particularly on social media, tend to repeat the same narrative, taking advantage of the illusory truth effect: merely hearing or reading something repeatedly makes a person more likely to believe it, regardless of its veracity. The firehose of falsehoods and misinformation from adversaries is used to soften targets and dull awareness, followed by targeted deep fakes and similar AI-enabled disinformation tools that target individuals at a more granular level. This provides our adversaries with a first mover advantage, granting them operational and tactical advantages. The U.S. Army tends to regard information operations the way it does indirect fires: as a precision weapon to be wielded for specific, targeted effects. Unfortunately, information operation “counter-fires” focused on refuting deceits and seeking to re-establish truth will be overwhelmed and lost in the sheer mass of our adversaries’ disinformation campaigns.
Explore this topic further in The Convergence: True Lies – The Fight Against Disinformation with Cindy Otis and the associated podcast. Also check out the following video presentations [access using a non-DoD network]: Weaponization of Social Media and Fictional Intelligence (FICINT) with Peter W. Singer and August Cole; The Content Blitz: Quantifying the Islamic State’s Impact on the Saudi Twittersphere, with Henry Mihm, Inès Oulamine, and Fiona Singer, Georgetown University; Decision Making with Keith Law, Senior Baseball Writer, The Athletic; and AI Speeding up Disinformation with LTG John D. Bansemer (USAF-Ret.), Dr. Margarita Konaev, Katerina Sedova, and Tim Hwang, Georgetown University’s Center for Security and Emerging Technology.
9. Artificial intelligence allows for the swift creation and dissemination of precision disinformation at scale. Bad actors are using data and AI to identify how best to alter a potential target’s perception of reality. AI can help tailor messages to adversaries, local populations, and allies, making it harder to differentiate between authentic and inauthentic information. In some cases, machines can measure quantifiable variables far better than humans can, but there are some intangible, qualitative variables and patterns that humans can spot and machines cannot. Micro-targeting through AI algorithms is used to track populations. Pattern detection and data monitoring are other functions of AI, and this data can be used by authoritarian governments to track movement or identify opposition leaders and actors. The Army will be challenged to manage this vast quantity of data at scale, and AI can help analysts triage it in order to get answers faster.
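As a hedged sketch of what AI-assisted triage might look like in practice, the toy example below scores incoming posts with a lightweight text classifier so human analysts can review the likeliest disinformation first. The sample posts, labels, and model choice are illustrative assumptions; an operational system would train on a far larger labeled corpus.

```python
# Toy triage sketch: score a stream of posts and surface the riskiest first.
# Assumes scikit-learn is installed; the training data here are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in for a labeled corpus (1 = previously flagged as disinformation).
train_posts = [
    "breaking: secret troop convoy spotted, share before it's deleted",
    "officials confirm routine training exercise scheduled this week",
    "they don't want you to know the voting machines were rigged",
    "city council meeting minutes published on the official website",
]
train_labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_posts, train_labels)

# Score new, unlabeled posts; analysts work the queue from the top down.
incoming = [
    "local library extends weekend hours starting next month",
    "leaked memo proves the blackout was a cover-up, spread this now",
]
scores = model.predict_proba(incoming)[:, 1]  # probability of "disinformation"
for post, score in sorted(zip(incoming, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {post}")
```

The design point is the division of labor: the model does not decide what is true, it only ranks the queue so scarce human attention goes to the highest-risk content first.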
Explore this topic further in these related posts: Influence at Machine Speed: The Coming of AI-Powered Propaganda by MAJ Chris Telley; LikeWar – The Weaponization of Social Media; Extremism on the Horizon: The Challenges of VEO Innovation by Colonel Montgomery Erfourth and Dr. Aaron Bazin; and Clara Waterman’s paper In Defense of Data: How the DoD Can Strengthen AI Development in GEN Z and the OE: 2020 Final Findings and the associated podcast (Part 1). Also check out the following video presentations [access using a non-DoD network]: The Information Disruption Industry and the Operational Environment of the Future with Vincent H. O’Neil; AI and Manufacturing Reality with Dr. Marek Posard and Dr. Christopher Paul, RAND Corporation; AI Speeding up Disinformation with LTG John D. Bansemer (USAF-Ret.), Dr. Margarita Konaev, Katerina Sedova, and Tim Hwang, Georgetown University’s Center for Security and Emerging Technology; and Assault on Authenticity with Kara Frederick, Fellow, Technology & National Security, CNAS.
10. As the information environment rapidly evolves, it will become necessary to introduce technologies or methods that have not been fully legislated in order to keep pace with near-peer adversaries. Trends in the information environment are outpacing laws, regulations, and policies, and there is a tension between truth and attribution. Currently, some Machine Learning (ML)-based decisions are not transparent to humans and lack an explainable logic. Decisions that must pass through multiple levels of bureaucracy waste valuable time; the initiative to conduct information operations should therefore be delegated to the lowest level possible.
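As a minimal illustration of the transparency point, the hypothetical sketch below contrasts a linear model, whose decision logic can be read directly from its per-feature weights, with a 300-tree ensemble, whose logic is spread across the forest and offers no comparably compact explanation for any single decision. The dataset and model sizes are assumptions for demonstration only.

```python
# Sketch of the explainability gap (assumes scikit-learn is installed).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=5, random_state=0)

# Transparent: each weight says how a feature pushes the decision.
linear = LogisticRegression(max_iter=1000).fit(X, y)
print("Per-feature weights:", linear.coef_[0].round(2))

# Opaque: the "logic" is distributed across hundreds of trees, with no
# single human-readable rule behind any individual prediction.
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
print("Trees an auditor would have to inspect:", len(forest.estimators_))
```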
Explore this topic further in these related posts: Bias and Machine Learning; Megan Hogan’s paper on Replicating Reality: Advantages and Limitations of Weaponized Deepfake Technology in GEN Z and the OE: 2020 Final Findings and the associated podcast (Part 2); and Davis Ellison’s winning vignette “Catch a Fish” in Three Best Information Warfare Vignettes and video presentation [access using a non-DoD network].
11. Information warfare will directly affect the average American citizen, requiring a whole-of-society approach to combat it. The United States needs to structure its responses and capabilities in a way that lives up to its democratic values, building a robust toolkit and templates to deal with disinformation and propaganda synchronously. Insights on what this whole-of-society approach entails include:
- Strategic foresight is a way of thinking that helps develop an effective strategy appropriate for the moment.
- In terms of a national response, public/private collaboration is essential. Big tech companies often lack “geopolitical cognition,” as they are focused on the distribution of their products, not on their social or political implications. The only way the U.S. Government will be able to perform optimally is to leverage the talent and regional expertise that private sector companies recruit and employ.
- When considering how people are influenced, remember that personal, social, societal, media, authority-figure, economic, structural, environmental, and psychological factors all affect the information that we consume and what we think.
- Technology solutions to consider: gamified education, ad tech, dark web monitoring tools, blockchain-based content validation (see the sketch following this list), social listening, and crowd-sourced content assessments and web annotation.
- Verification training: provide literacy basics, focus on open-source research, and explore and leverage the abundance of information already out there.
- In order to combat disinformation, collaboration across the board is essential: industry and platforms, government agencies, and other sectors.
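To make the “blockchain-based content validation” idea above concrete, here is a minimal, hypothetical Python sketch: a hash-chained ledger records a fingerprint of each piece of content at publication time, so that a copy received later can be checked against the registered original. The ContentLedger class, its methods, and the sample data are all illustrative assumptions, not a description of any fielded system; a real deployment would add distributed consensus and cryptographic signing.

```python
# Illustrative sketch (not a fielded system): a hash-chained ledger that
# records content fingerprints at publication time, so consumers can later
# verify that a copy matches what was originally registered.
import hashlib
import json
import time

class ContentLedger:
    def __init__(self):
        self.chain = []  # each block links to the previous block's hash

    def register(self, content: bytes, source: str) -> dict:
        """Record a fingerprint of `content` in a new tamper-evident block."""
        block = {
            "content_hash": hashlib.sha256(content).hexdigest(),
            "source": source,
            "timestamp": time.time(),
            "prev_hash": self.chain[-1]["block_hash"] if self.chain else "0" * 64,
        }
        # Hash the block itself so any later edit to the ledger is detectable.
        block["block_hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        self.chain.append(block)
        return block

    def verify(self, content: bytes) -> bool:
        """True if this exact content was registered and is thus unaltered."""
        digest = hashlib.sha256(content).hexdigest()
        return any(block["content_hash"] == digest for block in self.chain)

ledger = ContentLedger()
ledger.register(b"official press release text", source="public_affairs")
print(ledger.verify(b"official press release text"))  # True
print(ledger.verify(b"doctored press release text"))  # False
```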
Explore this topic further in these related posts: Katherine Armstrong’s paper on Transnational Repression: The Long Arm of Authoritarianism in GEN Z and the OE: 2020 Final Findings and the associated podcast (Part 1); The Convergence: The Next Iteration of Warfare with Lisa Kaplan and the associated podcast; and The Convergence: The Psychology of Terrorism and Disinformation with Dr. Aleks Nesic and the associated podcast. Also check out the following video presentations [access using a non-DoD network]: Using AI to Detect Visual Media Manipulation at Scale with Dr. Neil Johnson, Technical SETA, Information Innovation Office, DARPA; and Technology Engagement Team & Disinfo Cloud with Alexis Frisbie and Christina Nemr, Global Engagement Center, U.S. Department of State.
12. Technology can alter the balance between free and open societies and repressive and closed regimes. Disinformation can be deployed at scale with low cost and limited human capital. This is very useful for authoritarian regimes that rely on information to control their populations. What used to take armies of secret police and a vast human surveillance network can be accomplished more effectively with intrusive electronic collection, big data, and ML.
Explore this topic further in these related posts: Russia: Our Current Pacing Threat; China: Our Emergent Pacing Threat; The Hermit Kingdom in the Digital Era: Implications of the North Korean Problem for the SOF Community and The Iranian Pursuit of Military Advantage: A Forecast for the Next Seven Years by Colonel Montgomery Erfourth and Dr. Aaron Bazin; Katherine Armstrong’s paper on Transnational Repression: The Long Arm of Authoritarianism and Michaela Flemming’s paper on The Tech Trojan Horse: China’s Strategic Export of the Surveillance State, in GEN Z and the OE: 2020 Final Findings and the associated podcasts (Part 1 and 2).
Conclusion: Weaponized information’s low barrier to entry and low cost, in conjunction with today’s hyper-connected world, have democratized its accessibility, extended its global reach, and amplified its effects. State and non-state actors and super-empowered individuals alike can now execute sophisticated information operations, manipulating and exploiting our cognitive shortcuts and biases, tapping into collective fears, exacerbating political tribalization and cultural seams, and fostering distrust of our leaders, experts, and institutions, with the goal of dividing and conquering us. This firehose of falsehoods and misinformation will soften and dull our awareness over time, making us susceptible to subsequent targeted deep fakes and similar AI-enabled disinformation tools that target individuals at a more granular level.
This onslaught of weaponized information will directly affect American citizens and Soldiers, requiring a whole-of-society approach to combat it. Public and private partnerships will enable our Federal and State governments to harness private sector expertise in building effective toolkits and templates to counter our adversaries. Harnessing the processing power of AI will be critical in detecting and countering this deluge of disinformation. Indeed, it may become necessary to introduce technologies or methods that have not been fully legislated or understood in order to keep pace with our near-peer adversaries. Whole-of-force resiliency efforts are necessary to “harden” and protect our Soldiers and their families from our adversaries’ disinformation campaigns. Above all, the U.S. must remain proactive: rather than countering our adversaries’ false narratives and thereby lending them traction and credence, we should promote our own narrative, espousing what we as a Nation stand for to bind our people together and expose our adversaries’ deceits.
If you enjoyed this post:
Check out The Information Environment: Competition and Conflict – A Mad Scientist Laboratory Anthology — the primer for our Weaponized Information Virtual Events…
… and watch all of the presentations from our series of Weaponized Information Virtual Events here [via a non-DoD network] and explore all of the associated content (presenter biographies, slide decks, scenarios, and notes from each of the presentations) here.