[Editor’s Note: The Information Environment is a unique space that demands our understanding, as the Internet of Things (IoT) and hyper-connectivity have democratized accessibility, extended global reach, and amplified the effects of its weaponization by our adversaries. We launched the first of our Mad Scientist Weaponized Information Virtual Events on 20 May 2020 with Georgetown University’s Center for Security Studies to better understand the ramifications of this rapidly evolving environment for our Soldiers and the Operational Environment. Today’s post captures the preliminary insights we’ve gleaned about competition and conflict in the Information Environment from the first four webinars (and a podcast) in this continuing series. Read on to learn our interim findings!]
U.S. Soldiers will face inherent disadvantages when competing in, exploiting, and winning in the Information Environment. The exponential growth of interconnectivity via networked sensors and smart devices will create increasingly complex and advanced battlespaces. To deny this “cyber high ground” to our adversaries, we must become adept at competing in and exploiting this Internet of Things (IoT) battlespace. Yet the very infrastructure and components comprising this battlespace will, in many cases, have already been penetrated, compromised, or optimized for our adversaries’ use via their national military-civil fusion and collaboration with their telecommunications conglomerates. The U.S. Army should prepare not only to operate in, dominate, and win in an Information Environment optimized for our adversaries’ exploitation, but also to discern when adversaries are manipulating it to deceive us.
Across our initial four Weaponized Information webinars, co-hosted with Georgetown University’s Center for Security Studies, Mad Scientist identified nine preliminary insights about the present and future of the Information Environment. Presenters whose work contributed to these findings include Mr. Vincent O’Neil, risk expert with Fidelity; Peter Singer and August Cole, authors and prominent think tank members; Dr. Marek Posard and Dr. Christopher Paul from the RAND Corporation; Cindy Otis, author and disinformation / cybersecurity expert; and Henry Mihm, Ines Oulamine, Fiona Singer, Maddox Angerhofer, Kristina Cary, Bilva Chandra, and Ido Wulkan from Georgetown University.
1. The Information Environment is rapidly changing. The evolution of Information Operations will give disinformation wider impact on a greater scale. AI will enable intelligent bots that are more difficult to detect and that employ more precise targeting and micro-targeting strategies, increasing the effectiveness of (dis)information operations. Meanwhile, the low barriers to entry for conducting information operations allow more actors to adopt these tactics. As the information space becomes more democratized, the homeland will become more targetable. Adversaries can use non-lethal but highly disruptive methods to attack non-combatants, including Service members’ families. This capability is available to all 2+3 adversaries and beyond, including super-empowered individuals.
2. Bad actors can take advantage of new industry tactics that disrupt information in two major ways. When infiltrating an information system, operators can employ either “skill and daring” or “brute force and ignorance.” Skill and daring tactics involve precise, undetectable disruption to protect, alter, or remove data or to prevent its collection or distribution. Brute force and ignorance tactics are less targeted, crashing an entire system, potentially to cover bad actors’ tracks. The Information Disruption Industry (IDI) is coalescing as an emergent market, with clients willing to pay for the removal or destruction of their data by whatever means necessary. As the IDI becomes more legitimate and commercialized as a privacy maintenance tool, it may become more difficult for the Army to exploit and protect the Information Environment, as other sophisticated entities will seek to access systems and alter data for paying customers. Adapting to disruption, rather than trying to abolish it, will better prepare us for inevitable cyber-incursions.
3. Trust in information itself is being corrupted. Disinformation, or weaponized information, is an extremely cheap and effective tool for reaching a broad audience simultaneously. Bad actors have an increasingly diverse set of methods for manipulating the information their targets receive. If an individual or subgroup is targeted, bad actors in their social media or online spaces can “gaslight” them by altering the information they are exposed to, causing them to subconsciously change their beliefs about a topic. Deepfakes are also improving in quality and rapidly proliferating, making it harder to verify information presented to a group or population. A lack of trust in information can disrupt command and control, as decision-makers need accurate and unbiased information for their decision analysis. This is particularly critical for the data inputs to AI-assisted decision making: if the input data is inaccurate or corrupted, the resulting output will be misleading, creating distrust of AI among leaders and decision-makers, as the sketch below illustrates.
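To make the data-corruption risk concrete, here is a minimal Python sketch of an AI-assisted decision aid that aggregates per-sensor threat scores. Everything in it (the scores, the threshold, the function names) is hypothetical; the point is the pattern, not the implementation. A few poisoned reports flip the recommendation, while a more robust aggregate blunts the attack:

```python
# Minimal sketch (hypothetical data and thresholds): corrupted input
# data flips the output of a simple decision aid; a robust aggregate
# (median) blunts the effect of a few poisoned reports.

from statistics import mean, median

def recommend(threat_scores, threshold=0.5, aggregate=mean):
    """Aggregate per-sensor threat scores (0.0-1.0) into a go/no-go call."""
    score = aggregate(threat_scores)
    return ("ENGAGE" if score >= threshold else "HOLD"), round(score, 2)

clean    = [0.2, 0.3, 0.25, 0.20, 0.30]   # five honest reports: low threat
poisoned = [0.2, 0.3, 0.25, 0.95, 0.99]   # two reports corrupted by an attacker

print(recommend(clean))                       # ('HOLD', 0.25)
print(recommend(poisoned))                    # ('ENGAGE', 0.54)  <- misled
print(recommend(poisoned, aggregate=median))  # ('HOLD', 0.3)     <- degrades gracefully
```

A decision aid that assumes honest inputs fails silently; one designed for corrupted inputs degrades gracefully.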
4. Trust in our information platforms is being corrupted. In addition to the information on online platforms being subject to manipulation, bad actors are taking advantage of the platforms themselves to weaponize information. ISIS uses innocent, pre-existing hashtags to push its content onto unsuspecting users’ Twitter feeds (a simple heuristic for flagging such hijacking is sketched below). Nefarious groups or actors with large social media followings exploit a platform’s tools and policies to publish disinformation that is widely accessible. As platforms lose legitimacy as carriers of reliable information, people may search elsewhere for information, naturally seeking out sources that confirm their own biases. This will create an increasingly polarized informational world in which people operate on different sets of facts and assumptions about the same topic, sowing confusion about what is “the truth” as people become increasingly entrenched in their own beliefs.
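As a minimal illustration of how a platform might flag the hashtag hijacking described above, consider this naive Python heuristic. The posts, vocabulary, and threshold are hypothetical, and production systems use far richer signals (account history, network structure, timing), but the core idea of measuring topical mismatch is the same:

```python
# Minimal sketch (naive heuristic, hypothetical data): flag possible
# hashtag hijacking by measuring how little a post's text overlaps
# with the vocabulary normally seen under that hashtag.

def tokens(text):
    return {w.strip("#.,!?").lower() for w in text.split()}

def topic_overlap(post, topic_vocabulary):
    """Share of the post's words that appear in the hashtag's usual vocabulary."""
    words = tokens(post)
    return len(words & topic_vocabulary) / max(len(words), 1)

# Vocabulary typically observed under a benign trending hashtag.
vocab = tokens("world cup final goal match team football fans stadium")

posts = [
    "What a goal in the final! #WorldCup",             # on-topic
    "Join our cause, follow this link now #WorldCup",  # likely hijack
]

for post in posts:
    flagged = topic_overlap(post, vocab) < 0.2   # hypothetical threshold
    print(f"flagged={flagged} :: {post}")
```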
5. Military technology could be vulnerable to the effects of information disruption and corruption. Some commercial developers of military technology sell modified versions of their products on the civilian market. Bad actors or the IDI could study these commercially available models to find vulnerabilities in the corresponding military systems, turning them into attack vectors. The cycle of purchasing commercial technology, including the time and expense of hardening it for military use, may become increasingly unsustainable. As the technological sophistication of bad actors increases, attack vectors in such equipment will become harder to trace and patch. Infiltrated or corrupted military systems or data could result in the denial and spoofing of GPS, inaccurate data supporting decision-making and mission command, erroneous ordnance targeting and guidance, and disrupted supply and personnel movements. Suggestions for countering weaponized information range from creating an advisory authority to verify whether a system has been infiltrated, to building recovery systems and redundancies into data-based systems (one such redundancy pattern is sketched below) and training operators on manual options in the event of a corrupted program or a full system crash.
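One concrete way to build in the redundancy suggested above is to cross-check each GPS fix against an independent estimate, such as inertial dead reckoning, and fall back when the two diverge. The sketch below is hypothetical throughout: the threshold, the coordinates, and the flat-earth distance approximation are illustrative only:

```python
# Minimal sketch (hypothetical thresholds and data): cross-check GPS
# fixes against an independent dead-reckoning estimate and fall back
# to dead reckoning when they diverge (possible spoofing).

import math

def distance_m(a, b):
    """Rough flat-earth distance in meters between (lat, lon) pairs."""
    dx = (a[0] - b[0]) * 111_000   # ~meters per degree of latitude
    dy = (a[1] - b[1]) * 85_000    # rough meters per degree of longitude here
    return math.hypot(dx, dy)

def fused_position(gps_fix, inertial_estimate, max_divergence_m=50):
    """Trust GPS unless it diverges from dead reckoning beyond the limit."""
    if distance_m(gps_fix, inertial_estimate) > max_divergence_m:
        return inertial_estimate, "GPS-REJECTED (possible spoofing)"
    return gps_fix, "GPS-OK"

inertial    = (38.8895, -77.0353)   # position from dead reckoning
honest_fix  = (38.8896, -77.0354)   # ~14 m away: plausible
spoofed_fix = (38.9100, -77.0353)   # ~2.3 km away: reject

print(fused_position(honest_fix, inertial))   # GPS-OK
print(fused_position(spoofed_fix, inertial))  # GPS-REJECTED
```

Real systems would filter over time (e.g., with a Kalman filter) rather than threshold single fixes, but the principle of independent cross-checks is the same.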
6. Civilians and military families could be at increased risk of weaponized information attacks. The IDI and bad actors can use their disruptive tactics against any system that stores data, including civilian organizations. Interrupting services like banking would cause widespread disruption and mayhem. Such an attack could be micro-targeted against the families of Service members. Military families are already vulnerable targets for coercion and manipulation by false or misleading information, due to the young average age of military spouses and families, heightened emotional stress, desperation for information, and the absence of a strong support system that some families experience. Such disruptions could add stress and distraction for Soldiers in theater who are worried about their finances or their families at home, degrading the readiness and operational focus of our troops.
7. There are a variety of tools and strategies that individuals, both military and civilian, can apply to mitigate the effects of weaponized information. Critical thinking and information analysis, online videos and training modules, and understanding the models and methods bad actors use to influence people online all make individuals less vulnerable to disinformation. If the families of Service members are given resources for detecting weaponized information, they may become less vulnerable to coercion or suggestion. The Army could develop training modules to better prepare family members and make them aware that they could be identified and targeted. In addition to warning Army civilians and families about the risks of disinformation and data breaches, a defined set of instructions detailing the steps a family member should take if they suspect their online accounts or other personal data have been compromised could also reduce the impact of weaponized information.
8. There need to be multiple ways to communicate the impacts of weaponized information. FICINT, or Fiction Intelligence, is an effective tool for communicating the potential future impacts of weaponized information by presenting them in a plausible story with memorable and relatable characters. This holistic method aids visualization when considering the future Information Environment. Even Congress has adopted this approach: our presenters Peter Singer and August Cole opened the March 2020 Cyberspace Solarium Commission Report with a compelling vignette. The impacts of weaponized information also need to be incorporated into Army exercises, wargames, and training events. To truly train the way we fight, we should emulate weaponized Information Environment conditions at every level: tactical, operational, and strategic.
9. To combat weaponized information, military and corporate cooperation will be essential. The Army needs to work with industry professionals to combat bad actors in the IDI and protect users’ data and targeted systems. Cooperation with social media platforms to “deplatform” accounts that distribute disinformation, are linked to terrorist activity, or belong to persons or groups of interest to the Army is essential. This fight is not one the military can win on its own; combating weaponized information requires a Joint, interagency, and commercial response. The Army could develop relationships in which commercial companies are willing and able to respond immediately to evolving threats.
This post’s insights are by no means comprehensive; indeed, Mad Scientist’s series of Weaponized Information Virtual Events will continue beyond our day-long virtual conference on 21 July 2020 (register via the link below!). We can conclude, however, that the revolution in connected devices and virtual power projection will continue to increase the potential for our adversaries to target us across the Information Environment. Propaganda and misinformation campaigns are not novel; indeed, they are age-old tactics of military competition. However, the speed of delivery and technological sophistication of fake information make weaponized information in cyberspace available to any audience at any time. Hyper-connectivity increases the attack surface for cyber-attacks and expands access to publicly available information on our Soldiers and their families, making personalized warfare and the use of psychological attacks and deepfakes likely. A force deploying to a combat zone will be subjected to these attacks throughout its entire deployment, from the Strategic Support Area (including individual Soldiers’ personal residences, home station installations, and ports of embarkation) all the way forward to the Close Area fight. These adversarial conditions should be presented in all training and simulations. If the Information Environment is not properly represented, a critical gap in Leader development could emerge that would eventually have to be redressed, costing critical time and energy. Our resilience to these attacks and our ability to shape and dominate the narrative in the Information Environment are essential if we are to successfully compete with and, when necessary, engage and defeat our adversaries in armed conflict.
If you enjoyed this post, check out:
The Information Environment: Competition and Conflict – A Mad Scientist Laboratory Anthology — the primer for our Weaponized Information Virtual Events.
The Information Disruption Industry and the Operational Environment of the Future, by proclaimed Mad Scientist Vincent H. O’Neil
The Convergence: True Lies – The Fight Against Disinformation with Cindy Otis
>>> REMINDER 1: We will facilitate our capstone Weaponized Information Virtual Conference, co-sponsored by Georgetown University and the Center for Advanced Red Teaming, University at Albany, SUNY, next Tuesday, 21 July 2020. The final draft agenda for this event can be viewed here. In order to participate in this virtual conference, you must first register here [via a non-DoD network].
>>> REMINDER 2: If you missed participating in any of the previous webinars in our Mad Scientist Weaponized Information Virtual Events series — no worries! You can watch them again here [via a non-DoD network] and explore all of the associated content (presenter biographies, slide decks, scenarios, and notes from each of the presentations) here.
Disclaimer: The views expressed in this blog post do not necessarily reflect those of the Department of Defense, Department of the Army, Army Futures Command, or the Training and Doctrine Command.