65. “The Queue”

[Editor’s Note:  Now that another month has flown by, Mad Scientist Laboratory is pleased to present our June edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]


1. Collaborative Intelligence: Humans and AI are Joining Forces, by H. James Wilson and Paul R. Daugherty, Harvard Business Review, July – August 2018.


Source: OpenAI

A Team of AI Algorithms just crushed Expert Humans in a Complex Computer Game, by Will Knight, MIT Technology Review, June 25, 2018.

I know — I cheated and gave you two articles to read. These “dueling” articles demonstrate the early state of our understanding of the role of humans in decision-making. The Harvard Business Review article describes findings where human–Artificial Intelligence (AI) partnerships combine the leadership, teamwork, creativity, and social skills of humans with the speed, scalability, and quantitative capabilities of AI. This is basically the idea behind “centaur” chess, which has been prevalent in discussions of human and AI collaboration. Conversely, the MIT Technology Review article describes ongoing work to build AI algorithms that are incentivized to collaborate with other AI teammates. Could it be that collaboration is not a uniquely human attribute? The ongoing work on integrating AI into the workforce and in support of CEO decision-making could inform the Army’s investment strategy for AI. Julianne Gallina, one of our proclaimed Mad Scientists, described a future where everyone would have an entourage and Commanders would have access to a “Patton in the Pocket.” How the human operates on or in the loop and how Commanders make decisions at machine speed will be informed by this research. In August, the Mad Scientist team will conduct a conference focused on Learning in 2050 to further explore the ideas of human and AI teaming with intelligent tutors and mentors.
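The collaboration incentive the MIT Technology Review article describes can be sketched in a few lines. This toy example (not OpenAI’s actual code; the function name and the “team spirit” coefficient tau are our illustrative assumptions) shows one common way multi-agent systems are nudged toward teamwork: blending each agent’s individual reward with the team average, so an agent gains by helping its teammates succeed.

```python
# Illustrative sketch: incentivizing AI teammates to collaborate by
# blending individual rewards with the team's average reward.
# tau = 0 -> purely selfish agents; tau = 1 -> purely team-oriented.

def blended_rewards(individual_rewards, tau):
    """Return each agent's reward blended with the team average."""
    team_avg = sum(individual_rewards) / len(individual_rewards)
    return [(1 - tau) * r + tau * team_avg for r in individual_rewards]

# With tau = 0.5, an agent that scored 0 still shares in the team's success,
# so sacrificing individual gain for the team can pay off.
print(blended_rewards([4.0, 0.0], tau=0.5))  # [3.0, 1.0]
```

Tuning tau is the design choice: too low and agents ignore each other, too high and an individual agent’s learning signal gets washed out by its teammates’ actions.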

Source: Doubleday

2. Origin: A Novel, by Dan Brown, Doubleday, October 3, 2017, reviewed by Ms. Marie Murphy.

Dan Brown’s famous symbologist Robert Langdon returns to avenge the murder of his friend, tech developer and futurist Edmund Kirsch. After Kirsch is killed in the middle of presenting what he advertised as a life-changing discovery, Langdon teams up with Kirsch’s most faithful companion, his AI assistant Winston, in order to release Edmund’s presentation to the public. Winston is able to access Kirsch’s entire network, give real-time directions, and make decisions based on ambiguous commands — all via Kirsch’s smartphone. However, this AI system doesn’t appear to know Kirsch’s personal password, and can only assist Langdon in his mission to find it. An omnipresent and portable assistant like Winston could greatly aid future warfighters and commanders. Having this scope of knowledge on demand is beneficial, but future AI will be able to not only regurgitate data but also present the Soldier with courses of action analyses and decision options based on that data. Winston was also able to mimic emotion via machine learning, which can reduce Soldier stress levels and present information in a humanistic manner. Once an AI has been attached to a Soldier for a period of time, it can learn the particular preferences and habits of that Soldier, and make basic or routine decisions and assumptions for that individual, anticipating their needs, as Winston does for Kirsch and Langdon.

Source: Getty Images adapted by CNAS

3. Technology Roulette: Managing Loss of Control as Many Militaries Pursue Technological Superiority, by Richard Danzig, Center for a New American Security, 30 May 2018.

Mad Scientist Laboratory readers are already familiar with the expression, “warfare at machine speed.” As our adversaries close the technology gap and potentially overtake us in select areas, there is clearly a “need for speed.”

“… speed matters — in two distinct dimensions. First, autonomy can increase decision speed, enabling the U.S. to act inside an adversary’s operations cycle. Secondly, ongoing rapid transition of autonomy into warfighting capabilities is vital if the U.S. is to sustain military advantage.” — Defense Science Board (DSB) Report on Autonomy, June 2016 (p. 3).

In his monograph, however, author and former Clinton Administration Secretary of the Navy Richard Danzig contends that “superiority is not synonymous with security,” citing the technological proliferation that almost inevitably follows technological innovation and the associated risks of unintended consequences resulting from the loss of control of military technologies. Contending that the race for speed is a form of technological roulette, former Secretary Danzig proposes a control methodology of five initiatives to help mitigate the risks posed by disruptive technologies, and calls for increased multilateral planning with both our allies and our opponents. Unfortunately, as with the doomsday scenario played out in Nevil Shute’s novel On the Beach, it is “… the little ones, the Irresponsibles…” that have propagated much of the world’s misery in the decades following the end of the Cold War. It is the specter of these Irresponsible nations, along with non-state actors and Super-Empowered Individuals, experimenting with and potentially unleashing disruptive technologies, that will not be contained by any non-proliferation protocols or controls. Indeed, neither will our near-peer adversaries, if these technologies promise to offer a revolutionary, albeit fleeting, Offset capability.

U.S. Vice Chairman of the Joint Chiefs of Staff Air Force Gen. Paul Selva, Source: Alex Wong/Getty Images

4. The US made the wrong bet on radiofrequency, and now it could pay the price, by Aaron Mehta, C4ISRNET, 21 Jun 2018.

This article illustrates how the Pentagon’s faith in its own technology led the Department of Defense to trust that it would maintain dominance over the electromagnetic spectrum for years to come. That decision left the United States vulnerable to new leaps in technology made by our near-peers. GEN Paul Selva, Vice Chairman of the Joint Chiefs of Staff, has concluded that the Pentagon must now keep pace with near-peer nations and reestablish our dominance in electronic warfare and networking (spoiler alert – we currently are not dominant!). This is an example of a pink flamingo (a known known), as we know our near-peers have surpassed us in technological dominance in some cases. In looking at technological forecasts for the next decade, we must ensure that the U.S. is making the right investments in Science and Technology to keep pace with our near-peers. This article demonstrates that timely and decisive policy-making will be paramount in keeping up with our adversaries in a fast-changing and agile Operational Environment.


5. MIT Device Uses WiFi to ‘See’ Through Walls and Track Your Movements, by Kaleigh Rogers, MOTHERBOARD, 13 June 2018.

Researchers at MIT have discovered a way to “see” people through walls by tracking WiFi signals that bounce off of their bodies. Previously, the technology’s fidelity was limited to rendering “blobs” behind a wall, essentially telling you that someone was present but giving no indication of their behavior. The breakthrough is using a trained neural network to identify the bouncing signals and match them against the shape of the human skeleton. This is significant because it could give an added degree of specificity to first responders or fire teams clearing rooms. The ability to determine whether an individual on the other side of the wall is potentially hostile and holding a weapon, or a non-combatant holding a cellphone, could be the difference between life and death. This also raises questions about countermeasures. WiFi signals are seemingly everywhere and, with this technology, could prove to be a large signature emitter. Will future forces need to incorporate uniforms or materials that absorb these waves or scatter them in a way that distorts them?
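A hedged sketch of the cross-modal training idea behind work like MIT’s: while the subject is visible, a camera-based pose estimator supplies skeleton-keypoint labels, and a second model learns to predict those same keypoints from the RF signal alone, so at deployment the camera (and line of sight) is no longer needed. Here a linear least-squares fit stands in for the neural network, and all array names and sizes are illustrative, not MIT’s actual pipeline.

```python
import numpy as np

# Toy cross-modal supervision: learn an RF -> skeleton-keypoint mapping
# using labels produced by a (hypothetical) camera-based pose estimator.
rng = np.random.default_rng(0)
rf_features = rng.normal(size=(200, 8))   # stand-in for RF reflection features
true_map = rng.normal(size=(8, 4))        # unknown RF-to-keypoint relation
keypoints = rf_features @ true_map        # "labels" from the camera teacher

# "Train" the RF student: solve for W minimizing ||rf_features @ W - keypoints||.
W, *_ = np.linalg.lstsq(rf_features, keypoints, rcond=None)

# At test time the camera is unavailable (e.g., the subject is behind a wall);
# the skeleton keypoints are predicted from the RF features alone.
pred = rf_features @ W
print(np.allclose(pred, keypoints, atol=1e-6))  # True
```

The real system replaces the linear map with a deep network and the toy features with radio heatmaps, but the training logic is the same: the vision modality teaches the RF modality, then drops away.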

Source: John T. Consoli / University of Maryland

6. People recall information better through virtual reality, says new UMD study, University of Maryland, EurekaAlert, 13 June 2018.

A study performed by the University of Maryland determined that people recall information better when seeing it first in a 3D virtual environment, as opposed to on a 2D desktop or mobile screen. The Virtual Reality (VR) system takes advantage of what’s called “spatial mnemonic encoding,” which allows the brain not only to remember something visually, but also to assign it a place in three-dimensional space, aiding retention and recall. This technique could accelerate learning and enhance retention when we train our Soldiers and Leaders. As VR hardware becomes smaller, lighter, and more affordable, custom mission sets, or the skills necessary to accomplish them, could be learned on the fly, in theater, on a compressed timeline. This also allows education to be distributed and networked globally without the need for a traditional classroom.

Source: Potomac Books

7. Strategy Strikes Back: How Star Wars Explains Modern Military Conflict, edited by Max Brooks, John Amble, ML Cavanaugh, and Jaym Gates; Foreword by GEN Stanley McChrystal, Potomac Books, May 1, 2018.

This book is fascinating for two reasons:  1) It utilizes one of the greatest science fiction series (almost a genre unto itself) in order to brilliantly illustrate some military strategy concepts and 2) It is chock full of Mad Scientists as contributors. One of the editors, John Amble, is a permanent Mad Scientist team member, while another, Max Brooks, author of World War Z, and contributor, August Cole, are officially proclaimed Mad Scientists.

The book takes a number of scenes and key battles in Star Wars and uses historical analogies to help present complex issues like civil-military command structure, counterinsurgency pitfalls, force structuring, and battlefield movement and maneuver.

One of the more interesting portions of the book is the concept of ‘droid armies vs. clone soldiers and the juxtaposition of that with the future testing of manned-unmanned teaming (MUM-T) concepts. There are parallels in how we think about what machines can and can’t do and how they think and learn.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

49. “The Queue”

(Editor’s Note: Beginning today, the Mad Scientist Laboratory will publish a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the previous month. In this anthology, we will address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!)

1. Army of None: Autonomous Weapons and the Future of War, by Paul Scharre, Senior Fellow and Director of the Technology and National Security Program, Center for a New American Security.

One of our favorite Mad Scientists, Paul Scharre, has authored a must-read for all military Leaders. This book will help Leaders understand the definitions of robotic and autonomous weapons, how they are proliferating across states, non-states, and super-empowered individuals (his chapter on Garage Bots makes it clear that this proliferation is not analogous to state-based proliferation), and lastly the ethical considerations that come up at every Mad Scientist Conference. During these Conferences, we have discussed the idea of algorithm-versus-algorithm warfare and what role human judgement plays in this version of future combat. Paul’s chapters on flash war really challenge our ideas of how a human operates in the loop, and his analogies using the financial markets are helpful for developing the questions needed to explore future possibilities and develop policies for dealing with warfare at machine speed.

Source: Rosoboronexport via YouTube
2. “Convergence on retaining human control of weapons systems,” in Campaign to Stop Killer Robots, 13 April 2018.

April 2018 marked the fifth anniversary of the Campaign to Stop Killer Robots. Earlier this month, 82 countries and numerous NGOs also convened at the Convention on Certain Conventional Weapons (CCW) in Geneva, Switzerland, where many stressed the need to retain human control over weapons systems and the use of force. While the majority in attendance proposed moving forward this November to start negotiations towards a legally binding protocol addressing fully autonomous weapons, five key states rejected moving forward in negotiating new international law – France, Israel, Russia, the United Kingdom, and the United States. Mad Scientist notes that the convergence of a number of emerging technologies (synthetic prototyping, additive manufacturing, advanced modeling and simulations, software-defined everything, advanced materials) is advancing both the feasibility and the democratization of prototype warfare, enabling and improving the engineering of autonomous weapons by non-state actors and super-empowered individuals alike. The genie is out of the bottle – with the advent of the Hyperactive Battlefield, advanced engagements will collapse the decision-action cycle to mere milliseconds, granting a decisive edge to the side with more autonomous decision-action.

Source: The Stack
3. “China’s Strategic Ambiguity and Shifting Approach to Lethal Autonomous Weapons Systems,” by Elsa Kania, Adjunct Fellow with the Technology and National Security Program, Center for a New American Security, in Lawfare, 17 Apr 18.

Mad Scientist Elsa Kania addresses the apparent contradiction between the People’s Republic of China’s diplomatic commitment to limit the use of fully autonomous lethal weapons systems and the PLA’s active pursuit of AI dominance on the battlefield. The PRC’s decision on lethal autonomy, and how it defines the role of human judgement in lethal operations, will have tactical, operational, and strategic implications. In TRADOC’s Changing Character of Warfare assessment, we addressed the idea of an asymmetry in ethics, where the differing ethical choices non-state and state adversaries make on the integration of emerging technologies could have real battlefield overmatch implications. This is a clear pink flamingo where we know the risks but struggle with addressing the threat. It is also an area where technological surprise is likely, as systems could have the ability to move from human-in-the-loop mode to fully autonomous with the flip of a switch.

Source: HBO.com
4. “Maeve’s Dilemma in Westworld: What Does It Mean to be Free?,” by Marco Antonio Azevedo and Ana Azevedo, in Institute of Art and Ideas, 12 Apr 18. [Note: Best viewed on your personal device as access to this site may be limited by Government networks]

While this article focuses primarily on a higher-level philosophical interpretation of human vs. machine (or artificial intelligence, being, etc.), the core arguments and discussion remain relevant to an Army that is looking to increase its reliance on artificial intelligence and robotics. Technological advancements in these areas continue to trend toward modeling humans (both in form and the brain). However, the closer we get to making this a reality, the closer we get to confronting questions about consciousness and artificial humanity. Are we prepared to face these questions earnestly? Do we want an artificial entity that is, essentially, human? What do we do when that breakthrough occurs? Does biological vs. synthetic matter if the being “achieves” personhood? For additional insights on this topic, watch Linda MacDonald Glenn’s Ethics and Law around the Co-Evolution of Humans and AI presentation from the Mad Scientist Visualizing Multi Domain Battle in 2030-2050 Conference at Georgetown University, 25-26 Jul 17.

5. Do You Trust This Computer?, directed by Chris Paine, Papercut Films, 2018.

The Army, and society as a whole, is continuing to offload certain tasks to, and receive pieces of information from, artificial intelligence sources. Future Army Leaders will be heavily influenced by AI processing and distributing the information used for decision making. But how much trust should we put in the information we get? Is it safe to be so reliant? What is the correct ratio of human-to-machine contribution in decision-making? Army Leaders need to be prepared to make AI one tool of many, understand its value, and know how to interpret its information, when to question its output, and how to apply appropriate context. Elon Musk has shown his support for this documentary and tweeted about its importance.

6. Ready Player One, directed by Steven Spielberg, Amblin Entertainment, 2018.

Adapted from the novel of the same name, this film visualizes a future world where most of society is consumed by a massive online virtual reality “game” known as the OASIS. As society transitions from the physical to the virtual (texting, email, Skype, MMORPGs, Amazon, etc.), large groups of people will become less reliant on the physical world’s governmental and economic systems that have been established for centuries. As virtual money begins to have real value, physical money will begin to lose value. If people can get many of their goods and services through a virtual world, they will become less reliant on the physical world. Correspondingly, physical world social constructs will have less control over the people who still inhabit it but spend increasing amounts of time interacting in the virtual world. This has huge implications for the future geo-political landscape, as many varied and geographically diverse groups of people will begin congregating and forming virtual allegiances across all of the pre-established, but increasingly irrelevant, physical world geographic borders. This will dilute the effectiveness, necessity, and control of the nation-state and transfer that power to the company(ies) facilitating the virtual environment.

Source: XO, “SoftEcologies,” suckerPUNCH
7. “US Army could enlist robots inspired by invertebrates,” by Bonnie Burton, in c/net, 22 Apr 18.

As if Boston Dynamics’ SpotMini isn’t creepy enough, the U.S. Army Research Laboratory (ARL) and the University of Minnesota are developing a flexible, soft robot inspired by squid and other invertebrates that Soldiers can create on-demand using 3-D printers on the battlefield. Too often, media visualizations have conditioned us to think of robots in anthropomorphic terms (with corresponding limitations). This and other breakthroughs in “soft,” polymorphic, printable robotics may provide Soldiers in the Future Operational Environment with hitherto unimagined on-demand, tailorable autonomous systems that will assist operations in the tight confines of complex, congested, and non-permissive environments (e.g., dense urban and subterranean). Soft robotics may also prove more resilient in arduous conditions. This development changes the paradigm for how robots are imagined in both design and application.


For additional insights into the Mad Scientist Initiative and how we continually explore the future through collaborative partnerships and continuous dialogue with academia, industry, and government, check out this Spy Museum’s SPYCAST podcast.