[Editor’s Note: In the movie World War Z (I know… the book was way better!), an Israeli security operative describes how Israel prepared for the coming zombie plague. Their strategy was that if nine men agreed on an analysis or a course of action, the tenth man had to take an alternative view.
This Devil’s Advocate or contrarian approach serves as a form of alternative analysis and is a check against groupthink and mirror imaging. The Mad Scientist Laboratory will begin a series of posts entitled “The Tenth Man” to offer a platform for the contrarians in our network (I know you’re out there!) to share their alternative perspectives and analyses regarding the Future Operational Environment.]
Our foundational assumption about the Future Operational Environment is that the Character of Warfare is changing due to an exponential convergence of emerging technologies. Artificial Intelligence, Robotics, Autonomy, Quantum Sciences, Nano Materials, and Neuro advances will mean more lethal warfare at machine speed, integrated seamlessly across all five domains – air, land, sea, cyber, and space.
We have consistently seen four main themes used to counter this idea of a changing character of war, driven by technology:
1. Cost of Robotic Warfare: All armies must plan for the need to reconstitute forces. This is particularly ingrained in the U.S. Army’s culture, where we have often lost the first battles in any given conflict (e.g., Kasserine Pass in World War II and Task Force Smith in Korea). We cannot afford to have a “one loss” Army where our national wealth and industrial base cannot support the reconstitution of a significant part of our Army. A high-cost, roboticized Army might also limit our political leaders’ options for the use of military force due to the risk of loss and associated cost.
2. Technology Hype: Technologists are well aware of the idea of a hype cycle when forecasting emerging technologies. Machine learning was all the rage in the 1970s, but the technology needed to drive these tools did not exist. Improved computing has finally helped us realize this vision, forty years later. The U.S. Army’s experience with the Future Combat System hits a nerve when assumptions of the future require the integration of emerging technologies.
3. Robotic Warfare: A roboticized Army is over-optimized to fight against a peer competitor, which is the least likely mission the Army will face. We build an Army and develop Leaders first and foremost to protect our Nation’s sovereignty. This means having an Army capable of deterring, and failing that, defeating peer competitors. At the same time, this Army must be versatile enough to execute a myriad of additional missions across the full spectrum of conflict. A hyper-connected Army enabled by robots with fewer Soldiers will be challenged in executing missions requiring significant human interactions such as humanitarian relief, building partner capacity, and counter-insurgency operations.
4. Coalition Warfare: A technology-enabled force will exacerbate interoperability challenges with both our traditional and new allies. Our Army will not fight unilaterally on future battlefields. We have had difficulties with the interoperability of communications and have had gaps between capabilities that increased mission risks. These risks were offset by the skills our allies brought to the battlefield. We cannot build an Army that does not account for a coalition battlefield, and our allies may not be able to afford the tech-enabled force envisioned in the Future Operational Environment.
All four of these assumptions are valid and should be further studied as we build the Army of 2028 and the Army of 2050. There are many other contrarian views about the Future Operational Environment, and so we are calling upon our network to put on their red hats and be our “Tenth Man.”
[Editor’s Note: Now that another month has flown by, Mad Scientist Laboratory is pleased to present our June edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]
I know — I cheated and gave you two articles to read. These “dueling” articles demonstrate the early state of our understanding of the role of humans in decision-making. The Harvard Business Review article describes findings in which human–Artificial Intelligence (AI) partnerships combine the leadership, teamwork, creativity, and social skills of humans with the speed, scalability, and quantitative capabilities of AI. This is basically the idea of “centaur” chess, which has been prevalent in discussions of human and AI collaboration. Conversely, the MIT Technology Review article describes the ongoing work to build AI algorithms that are incentivized to collaborate with other AI teammates. Could it be that collaboration is not a uniquely human attribute? The ongoing work on integration of AI into the workforce and in support of CEO decision-making could inform the Army’s investment strategy for AI. Julianne Gallina, one of our proclaimed Mad Scientists, described a future where everyone would have an entourage and Commanders would have access to a “Patton in the Pocket.” How the human operates on or in the loop, and how Commanders make decisions at machine speed, will be informed by this research. In August, the Mad Scientist team will conduct a conference focused on Learning in 2050 to further explore the ideas of human and AI teaming with intelligent tutors and mentors.
2. Origin: A Novel, by Dan Brown, Doubleday, October 3, 2017, reviewed by Ms. Marie Murphy.
Dan Brown’s famous symbologist Robert Langdon returns to avenge the murder of his friend, tech developer and futurist Edmund Kirsch, who is killed in the middle of presenting what he advertised as a life-changing discovery. Langdon teams up with Kirsch’s most faithful companion, his AI assistant Winston, in order to release Edmund’s presentation to the public. Winston is able to access Kirsch’s entire network, give real-time directions, and make decisions based on ambiguous commands — all via Kirsch’s smartphone. However, this AI system doesn’t appear to know Kirsch’s personal password, and can only assist Langdon in his mission to find it. An omnipresent and portable assistant like Winston could greatly aid future warfighters and commanders. Having this scope of knowledge on command is beneficial, but future AI will be able to not only regurgitate data, but also present the Soldier with courses of action analyses and decision options based on the data. Winston was also able to mimic emotion via machine learning, which can reduce Soldier stress levels and present information in a humanistic manner. Once an AI has been attached to a Soldier for a period of time, it can learn the particular preferences and habits of that Soldier, and make basic or routine decisions and assumptions for that individual, anticipating their needs, as Winston does for Kirsch and Langdon.
Mad Scientist Laboratory readers are already familiar with the expression, “warfare at machine speed.” As our adversaries close the technology gap and potentially overtake us in select areas, there is clearly a “need for speed.”
“… speed matters — in two distinct dimensions. First, autonomy can increase decision speed, enabling the U.S. to act inside an adversary’s operations cycle. Secondly, ongoing rapid transition of autonomy into warfighting capabilities is vital if the U.S. is to sustain military advantage.” — Defense Science Board (DSB) Report on Autonomy, June 2016 (p. 3).
In his monograph, however, author and former Clinton Administration Secretary of the Navy Richard Danzig contends that “superiority is not synonymous with security,” citing the technological proliferation that almost inevitably follows technological innovations and the associated risks of unintended consequences resulting from the loss of control of military technologies. Contending that speed is a form of technological roulette, former Secretary Danzig proposes a control methodology of five initiatives to help mitigate the associated risks posed by disruptive technologies, and calls for increased multilateral planning with both our allies and opponents. Unfortunately, as with the doomsday scenario played out in Nevil Shute’s novel On the Beach, it is “… the little ones, the Irresponsibles…” that have propagated much of the world’s misery in the decades following the end of the Cold War. It is the specter of these Irresponsible nations, along with non-state actors and Super-Empowered Individuals, experimenting with and potentially unleashing disruptive technologies, who will not be contained by any non-proliferation protocols or controls. Indeed, neither will our near-peer adversaries, if these technologies promise to offer a revolutionary, albeit fleeting, Offset capability.
This article illustrates how the Pentagon’s faith in its own technology led the Department of Defense to trust that it would maintain dominance over the electromagnetic spectrum for years to come. That decision left the United States vulnerable to new leaps in technology made by our near-peers. GEN Paul Selva, Vice Chairman of the Joint Chiefs of Staff, has concluded that the Pentagon must now keep up with near-peer nations and reestablish our dominance of electronic warfare and networking (spoiler alert – we are not!). This is an example of a pink flamingo (a known known), as we know our near-peers have surpassed us in technological dominance in some cases. In looking at technological forecasts for the next decade, we must ensure that the U.S. is making the right investments in Science and Technology to keep up with our near-peers. This article demonstrates that timely and decisive policy-making will be paramount in keeping up with our adversaries in the fast-changing and agile Operational Environment.
Researchers at MIT have discovered a way to “see” people through walls by tracking WiFi signals that bounce off of their bodies. Previously, the technology limited fidelity to “blobs” behind a wall, essentially telling you that someone was present but giving no indication of their behavior. The breakthrough is using a trained neural network to identify the bouncing signals and compare those with the shape of the human skeleton. This is significant because it could give an added degree of specificity to first responders or fire teams clearing rooms. The ability to determine whether an individual on the other side of the wall is potentially hostile and holding a weapon, or a non-combatant holding a cellphone, could be the difference between life and death. This also brings up questions about countermeasures. WiFi signals are seemingly everywhere and, with this technology, could prove to be a large signature emitter. Will future forces need to incorporate uniforms or materials that absorb these waves or scatter them in a way that distorts them?
A study performed by the University of Maryland determined that people recall information better when seeing it first in a 3D virtual environment, as opposed to a 2D desktop or mobile screen. The Virtual Reality (VR) system takes advantage of what’s called “spatial mnemonic encoding,” which allows the brain not only to remember something visually, but to assign it a place in three-dimensional space, aiding retention and recall. This technique could accelerate learning and enhance retention when we train our Soldiers and Leaders. As VR hardware becomes smaller, lighter, and more affordable, custom mission sets, or the skills necessary to accomplish them, could be learned on the fly, in theater, on a compressed timeline. This also allows for education to be distributed and networked globally without the need for a traditional classroom.
This book is fascinating for two reasons: 1) It utilizes one of the greatest science fiction series (almost a genre unto itself) in order to brilliantly illustrate some military strategy concepts and 2) It is chock full of Mad Scientists as contributors. One of the editors, John Amble, is a permanent Mad Scientist team member, while another, Max Brooks, author of World War Z, and contributor, August Cole, are officially proclaimed Mad Scientists.
The book takes a number of scenes and key battles in Star Wars and uses historical analogies to help present complex issues like civil-military command structure, counterinsurgency pitfalls, force structuring, and battlefield movement and maneuver.
One of the more interesting portions of the book is the concept of ‘droid armies vs. clone soldiers and the juxtaposition of that with the future testing of manned-unmanned teaming (MUM-T) concepts. There are parallels in how we think about what machines can and can’t do and how they think and learn.
If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: email@example.com — we may select it for inclusion in our next edition of “The Queue”!