179. A New Age of Terror: New Mass Casualty Terrorism Threats

[Editor’s Note: Mad Scientist Laboratory is pleased to publish today’s post by returning guest blogger Zachary Kallenborn, continuing his New Age of Terror series. The democratization of unmanned air, ground, sea, and subsea systems and the proliferation of cyber-physical systems (e.g., automated plants) provide lesser states, non-state actors, and super-empowered individuals with new capabilities to conduct long-range precision fires and generate global non-kinetic effects resulting in mass casualty events. The potential weaponization of these otherwise benign capabilities poses new vulnerabilities to those who fail to remain vigilant and imagine the unthinkable — beware!]

A loud buzz pierced the quiet night air. A group of drones descended on a chemical plant near New York City. The drones dispersed throughout the installation in search of storage tanks. A few minutes later, the buzz of the drone propellers was drowned out by loud explosions. A surge of fire leapt into the sky. A plume of gas followed, drifting toward the nearby city. The gas killed thousands; thousands more were hospitalized with severe injuries.

The rapid proliferation of unmanned systems and cyber-physical systems offers terrorists new, easier means of carrying out mass casualty attacks. Drones allow terrorists to reduce their operational risk and acquire relatively low-cost platforms. Cyber attacks require few resources and could cause significant harm, though a lack of expertise limits terrorists’ ability to inflict it. Terrorists may prefer these methods to difficult-to-acquire and risky chemical, biological, radiological, and nuclear (CBRN) weapons.

Drones

Drones offer terrorists low-cost methods of delivering harm with lower risk to attacker lives. Drone attacks can be launched from afar, from a hidden position, or close to an escape route. Simple unmanned systems can be acquired easily: Amazon.com offers hundreds of drones for as little as $25. Of course, low-cost drones also mean lower payloads that limit the harm caused, often significantly. Improvements to drone autonomy will allow terrorists to deploy more drones at once, including in true drone swarms.1 Terrorists can mount drone attacks across air, land, and sea.

Aerial drones allow attackers to evade ground-based defenses and could be highly effective in striking airports, chemical facilities, and other critical infrastructure. Houthi rebels in Yemen have repeatedly launched drone strikes on Saudi oil pipelines and refineries.2 Recent drone attacks knocked out half of Saudi oil production capacity.3 Attacks on chemical facilities are likely to be particularly effective. A chemical release would not require large amounts of explosives and could cause massive harm, as in the Bhopal gas accident that killed thousands. Current Department of Homeland Security Chemical Facility Anti-Terrorism Standards do not require any meaningful defenses against aerial attack.4 Alternatively, even small drones can cause major damage to airplane wings or engines, potentially bringing a plane down.5 In December 2018, that risk alone was enough to ground hundreds of flights at Gatwick airport south of London when drones were spotted close to the runway.

Self-driving cars also provide a means of mass casualty attack. Waymo, Uber, and several other companies seek to launch self-driving taxi services open to the public. Terrorists could request multiple taxis, load them with explosives or remotely operated weapons, and send them out to multiple targets. Alternatively, terrorists could launch multi-stage attacks on the same target: a first strike causes first responders to mass, and subsequent attacks hit the responders. In fact, ISIS has reportedly considered this option.6

For a few hundred dollars, anyone can rent a semi-autonomous surface vessel that can carry up to 35 lbs.7 No license or registration is necessary.8 Although a surface attack limits terrorists to maritime targets, the potential for significant harm remains. Terrorists could strike popular tourist sites like the Statue of Liberty or San Francisco’s Fisherman’s Wharf. U.S. military vessels are ideal targets too, as the October 2000 bombing of the USS Cole demonstrated.9 But drones are not the only new method of attack.

Cyber-physical systems

Like drones, cyber attacks are low cost and reduce operational risks. Cyber attacks can be launched from secure locations, even on the other side of the world. Terrorists also gain a high level of anonymity that inhibits law enforcement responses.10 Although cyberterrorism requires significant technical know-how, terrorists require few resources other than a computer to carry out an attack.

Cyber attacks could target chemical facilities, airplanes, and other critical infrastructure targets. In 2000, Vitek Boden infiltrated computers controlling the sewage system of Maroochy Shire, Australia, and released hundreds of thousands of gallons of raw sewage into the surrounding area.11  Boden could have caused even more harm if he wished.12  Although Boden’s attack primarily harmed the environment, other attacks could threaten human life. Cyber attacks could disable safety systems at chemical facilities, risking an accidental toxic gas release or explosions. A cyber assault on a Saudi petrochemical facility in August 2017 reportedly had that exact goal.13

However, cyber expertise and specific target knowledge are likely to be significant inhibitors. Attacks on critical infrastructure may require specialist knowledge of the control system and administrative operations, though protective measures are not always implemented, leaving targets vulnerable.14 Boden was successful in large part because he had worked closely with the sewage system’s control systems. Although terrorists have defaced websites and conducted denial-of-service attacks, known terrorist organizations do not currently possess the capabilities to mount a major destructive cyber attack.15 The availability of the necessary human capital is a strong factor in whether terrorists pursue cyber attacks.16 Nonetheless, the risk is likely to grow as terrorists develop greater cyber capabilities, increased connectivity creates new opportunities for attack, and the black market for cybercrime tools grows.17

The Future Operational Environment

Hot-zone team members from Hawaii’s Chemical, Biological, Radiological, Nuclear, and High-Yield Explosive Enhanced Response Force Package (CERFP) team process simulated casualties through a decontamination zone during an exercise this spring. / Source: U.S. Air National Guard photo by Senior Airman John Linzmeier

If terrorists have new avenues of mass casualty attack, U.S. forces must devote more resources to force protection and emergency response. U.S. forces may be called upon to aid local, state, and federal emergency responders in the event of a mass casualty attack. Likewise, U.S. troops may face risks themselves: cyber and drone attacks could certainly target U.S. military installations. Even attacks that do not kill can cause significant harm: disrupting airport operations, as in the 2018 Gatwick drone incident, may delay troop resupply, troop deployment, or close air support to Soldiers in the field. The U.S. military and the broader national security community must rethink their approach to mass casualty terrorism to respond to these threats. Terrorist groups have typically required CBRN weapons to cause mass harm. But if terrorists can kill thousands in a drone attack, why bother with risky, difficult-to-acquire CBRN weapons?

For more information on this threat trend, see Non-State Actors and Their Uses of Emerging Technology, presented by Dr. Gary Ackerman, National Consortium for the Study of Terrorism and Responses to Terrorism, University of Maryland, at the Mad Scientist Robotics, Artificial Intelligence & Autonomy Conference at the Georgia Tech Research Institute, Atlanta, Georgia, 7-8 March 2017…

… as well as the following related Mad Scientist Laboratory posts:

– Zachary Kallenborn‘s previous post, A New Age of Terror: The Future of CBRN Terrorism.

– Marie Murphy‘s post, Trouble in Paradise: The Technological Upheaval of Modern Political and Economic Systems

– The Democratization of Dual Use Technology

– Autonomy Threat Trends

– The Future of the Cyber Domain

– Emergent Threat Posed by Super-Empowered Individuals

… and crank up Love and Terror by The Cinematics!

Zachary Kallenborn is a freelance researcher and analyst, specializing in Chemical, Biological, Radiological, and Nuclear (CBRN) weapons, CBRN terrorism, drone swarms, and emerging technologies writ large. His research has appeared in the Nonproliferation Review, Studies in Conflict and Terrorism, Defense One, the Modern War Institute at West Point, and other outlets. His most recent study, Swarming Destruction: Drone Swarms and CBRN Weapons, examines the threats and opportunities of drone swarms for the full scope of CBRN weapons.

Disclaimer: The views expressed in this blog post do not necessarily reflect those of the Department of Defense, Department of the Army, Army Futures Command (AFC), or Training and Doctrine Command (TRADOC).


1 Amy Hocraffer and Chang S. Nam, “A Meta-analysis of Human–System Interfaces in Unmanned Aerial Vehicle (UAV) Swarm Management,” Applied Ergonomics, Vol. 58 (2017), pp. 66–80, http://www.researchgate.net/profile/Chang_Nam5/publication/303782432_A_meta-analysis_of_human-system_interfaces_in_unmanned_aerial_vehicle_UAV_swarm_management/links/5767f71f08ae1658e2f8b435.pdf

2 Natasha Turak, “Oil Prices Jump as Saudi Energy Minister Reports Drone ‘Terrorism’ Against Pipeline Infrastructure,” CNBC, May 14, 2019, https://www.cnbc.com/2019/05/14/oil-jumps-as-saudi-energy-minister-reports-drone-terrorism-against-pipeline.html

3 John Defterios and Victoria Cavaliere, “Coordinated Strikes Knock Out Half of Saudi Oil Capacity, More Than 5 Million Barrels a Day,” CNN, September 15, 2019, https://www.cnn.com/2019/09/14/business/saudi-oil-output-impacted-drone-attack/index.html

4 Department of Homeland Security, “Risk-Based Performance Standards Guidance: Chemical Facility Anti-Terrorism Standards,” May 2009, 15, 85.

5 Peter Dockrill, “Here’s What it Looks Like When a Drone Smashes into a Plane Wing at 238 MPH,” ScienceAlert, October 22, 2018, https://www.sciencealert.com/this-is-what-it-looks-like-drone-smashes-into-plane-s-wing-238-mph-mid-air-collision-aircraft-impact

6 Lia Eustachewich, “Terrorist Wannabes Plotted Self-Driving Car Bomb Attack: Authorities,” New York Post, September 4, 2018, https://nypost.com/2018/09/04/terrorist-wannabes-plotted-self-driving-car-bomb-attack-authorities/

7 AllTerra, “AllTerra Rental Rates,” May 3, 2019, https://allterracentral.com/pub/media/wysiwyg/AllTerra_Rental_Rates-5.3.19.pdf

8 Phone conversation with USV retailer.

9 CNN Library, “USS Cole Bombing Fast Facts,” CNN, March 27, 2019, https://www.cnn.com/2013/09/18/world/meast/uss-cole-bombing-fast-facts/index.html

10 Steve S. Sin, Laura A. Blackerby, Elvis Asiamah, and Rhyner Washburn, “Determining Extremist Organisations’ Likelihood of Conducting Cyber Attacks,” 2016 8th International Conference on Cyber Conflict, May 31 to June 3, 2016, http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=7529428&tag=1

11 Marshall Abrams and Joe Weiss, “Malicious Control System Cyber Security Attack Case Study – Maroochy Water Services, Australia,” MITRE, July 23, 2008, https://www.mitre.org/sites/default/files/pdf/08_1145.pdf

12 Nabil Sayfayn and Stuart Madnick, “Cybersafety Analysis of the Maroochy Shire Sewage Spill (Preliminary Draft),” Cybersecurity Interdisciplinary Systems Laboratory, May 2017, http://web.mit.edu/smadnick/www/wp/2017-09.pdf

13 Nicole Perlroth and Clifford Krauss, “A Cyberattack in Saudi Arabia had a Deadly Goal. Experts Fear Another Try,” New York Times, March 15, 2018, https://www.nytimes.com/2018/03/15/technology/saudi-arabia-hacks-cyberattacks.html

14 Noguchi Mutsuo and Ueda Hirofumi, “An Analysis of the Actual Status of Recent Cyberattacks on Critical Infrastructure,” NEC Technical Journal, Vol. 12, No. 2, January 2018, https://www.nec.com/en/global/techrep/journal/g17/n02/pdf/170204.pdf

15 Tamara Evan, Eireann Leverett, Simon Ruffle, Andrew Coburn, James Bourdeau, Rohan Gunaratna, and Daniel Ralph, “Cyber Terrorism: Assessment of the Threat to Insurance,” Cambridge Centre for Risk Studies – Cyber Terrorism Insurance Futures 2017, November 2017, https://www.jbs.cam.ac.uk/fileadmin/user_upload/research/centres/risk/downloads/pool-re-cyber-terrorism.pdf

16 Steve S. Sin, et al, “Determining Extremist Organisations’ Likelihood of Conducting Cyber Attacks.”

17 Lillian Ablon, Martin C. Libicki, and Andrea A. Golay, “Markets for Cybercrime Tools and Stolen Data: Hacker’s Bazaar,” RAND, 2014, https://www.rand.org/content/dam/rand/pubs/research_reports/RR600/RR610/RAND_RR610.pdf

138. “The Monolith”

The Monolith set from the dawn of man sequence, 2001: A Space Odyssey, Metro-Goldwyn-Mayer (1968) / Source: Wikimedia Commons

[Editor’s Note: Mad Scientist Laboratory is pleased to introduce a new, quarterly feature, entitled “The Monolith.” Arthur C. Clarke and Stanley Kubrick fans alike will recognize and appreciate our allusion to the alien artifact responsible for “uplifting” mankind from primitive, defenseless hominids into tool-using killers — destined for the stars — from their respective short story, “The Sentinel,” and movie, “2001: A Space Odyssey.” We hope that you will similarly benefit from this post (although perhaps in not quite so evolutionary a manner!), reflecting the Mad Scientist Teams’ collective book and movie recommendations — Enjoy!]

Originally published by PublicAffairs on 5 October 2017

The Future of War by Sir Lawrence Freedman. The evolution of warfare has taken some turns that were quite unexpected and were heavily influenced by disruptive technologies of the day. Sir Lawrence examines the changing character of warfare over the last several centuries, how it has been influenced by society and technology, the ways in which science fiction got it wrong and right, and how it might take shape in the future. This overarching look at warfare causes one to pause and consider whether we may be asking the right questions about future warfare.

 

Royal Scots Guardsmen engaging the enemy with a Lewis Machine Gun / Source:  Flickr

They Shall Not Grow Old directed by Sir Peter Jackson. This lauded 2018 documentary utilizes original film footage from World War I (much of it unseen for the past century) that has been digitized, colorized, upscaled, and overlaid with audio recordings from British servicemen who fought in the war. The divide between civilians untouched by the war and service members, the destructive impact of new disruptive technologies, and the change they wrought on the character of war resonate to this day and provide an excellent historical analogy from which to explore future warfare.

Gene Simmons plays a nefarious super empowered individual in Runaway

Runaway directed by Michael Crichton. This film, released in 1984, is set in the near future, where a police officer (Tom Selleck) and his partner (Cynthia Rhodes) specialize in neutralizing malfunctioning robots. A rogue killer robot – programmed to kill by the bad guy (Gene Simmons) – goes on a homicidal rampage. Alas, the savvy officers begin to uncover a wider, nefarious plan to proliferate killer robots. This offbeat Sci-Fi thriller illustrates how dual-use technologies in the hands of super-empowered individuals could be employed innovatively in the Future Operational Environment. Personalized warfare is also featured, as a software developer’s family is targeted by the ‘bad guy,’ using a corrupted version of the very software he helped create. This movie illustrates the potential for everyday commercial products to be adapted maliciously by adversaries, who, unconstrained ethically, can out-innovate us with convergent, game-changing technologies (robotics, CRISPR, etc.).

Originally published by Macmillan on 1 May 2018

The Military Science of Star Wars by George Beahm. Storytelling is a powerful tool used to visualize the future, and Science Fiction often offers the best trove of ideas. The Military Science of Star Wars by George Beahm dissects and analyzes the entirety of the Star Wars Universe to mine for information that reflects the real world and the future of armed conflict. Beahm tackles the personnel, weapons, technology, tactics, strategy, resources, and lessons learned from key battles and authoritatively links them to past, current, and future Army challenges. Beahm proves that storytelling, and even fantasy (Star Wars is more a fantasy story than a Science Fiction story), can teach us about the real world and help evolve our thinking to confront problems in new and novel ways. He connects the story to the past, present, and future Army and asks important questions, like “What makes Han Solo a great military Leader?”, “How can a military use robots (Droids) effectively?”, and most importantly, “What, in the universe, qualified Jar Jar Binks to be promoted to Bombad General?”.

Ex Machina, Universal Pictures (2014) / Source: Vimeo

Ex Machina directed by Alex Garland. This film, released in 2014, moves beyond the traditional questions surrounding the feasibility of Artificial Intelligence (AI) and the Turing test to explore the darker side of synthetic beings, knowing that it is achievable and that the test can be passed. The film is a cautionary tale of what might be possible at the extreme edge of AI computing and innovation where control may be fleeting or even an illusion. The Army may never face the same consequences that the characters in the film face, but it can learn from their lessons. AI is a hotly debated topic with some saying it will bring about the end of days, and others saying generalized AI will never exist. With a future this muddy, one must be cautious of exploring new and undefined technology spaces that carry so much risk. As more robotic entities are operationalized, and AI further permeates the battlefield, future Soldiers and Leaders would do well to stay abreast of the potential for volatility in an already chaotic environment. If Military AI progresses substantially, what will happen when we try to turn it off?

Astronaut and Lunar Module pilot Buzz Aldrin is pictured during the Apollo 11 extravehicular activity on the moon / Source: NASA

Apollo 11 directed by Todd Douglas Miller. As the United States prepares to celebrate the fiftieth anniversary of the first manned mission to the lunar surface later this summer, this inspiring documentary reminds audiences of just how audacious an achievement this was. Using restored archival audio recordings and video footage (complemented by simple line animations illustrating each spacecraft’s maneuver sequences), Todd Miller skillfully re-captures the momentousness of this historic event, successfully weaving together a comprehensive point-of-view of the mission. Watching NASA and its legion of aerospace contractors realize the dream envisioned by President Kennedy eight years before serves to remind contemporary America that we once dared and dreamed big, and that we can do so again, harnessing the energy of insightful and focused leadership with the innovation of private enterprise. This uniquely American attribute may well tip the balance in our favor, given current competition and potential future conflicts with our near-peer adversaries in the Future Operational Environment.

Originally published by Penguin Random House on 3 July 2018

Artemis by Andy Weir. In his latest novel, following on the heels of his wildly successful The Martian, Andy Weir envisions an established lunar city in 2080 through the eyes of Jasmine “Jazz” Bashara, one of its citizen-hustlers, who becomes enmeshed in a conspiracy to control the tremendous wealth generated from the space and lunar mineral resources refined in the Moon’s low-G environment. His suspenseful plot, replete with descriptions of the science and technologies necessary to survive (and thrive!) in the hostile lunar environment, posits a late 21st century rush to exploit space commodities. The resultant economic boom has empowered non-state actors as new competitors on the global — er, extraterrestrial stage — from the Kenya Space Corporation (blessed by its equatorial location and reduced earth to orbit launch costs) to the Sanchez Aluminum mining and refining conglomerate, controlled by a Brazilian crime syndicate scheming to take control of the lunar city. Readers are reminded that the economic hegemony currently enjoyed by the U.S., China, and the E.U. may well be eclipsed by visionary non-state actors who dare and dream big enough to exploit the wealth that lies beyond the Earth’s gravity well.

94. The Wide Range of Competition

[Editor’s Note: Mad Scientist tracks convergence trends that are changing the character of future warfare. The democratization of technologies and the global proliferation of information is one of these trends that has expanded the arena of high-end threat capabilities beyond nation-states to now include non-state actors and super-empowered individuals. Today’s post illustrates how the democratization of one such capability,  biotechnology, affects the Future Operational Environment.]

As discussed during the Mad Scientist Bio Convergence and Soldier 2050 Conference, co-hosted with SRI International at Menlo Park, California last Spring, the broad advancement of biotechnologies will provide wide access to dangerous and powerful bioweapons and human enhancement. The low cost and low expertise entry point into gene editing, human performance enhancement, and bioweapon production has spurred a string of new explorations into this arena by countries with large defense budgets (e.g., China), non-state criminal and terrorist organizations (e.g., ISIS), and even super-empowered individuals willing to subject their bodies to experimental and risky treatments or augmentations.

China has invested billions of dollars into biotechnology – including in several U.S. biotechnology firms – and plans on focusing on their own bio revolution. Gene editing is one of the areas where China has sought to leapfrog the United States through ambitious Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) projects, editing the genes of 86 individuals, while the United States is just now approaching human trials. Additionally, Elsa Kania, an expert on Chinese emerging technology from the Center for the New American Security (CNAS), noted that China is now seeking to build its own innovation base rather than focusing on intellectual property theft and technology transfers.

Listen to Ms. Kania’s discussion addressing technological priorities and how they overlay on the Chinese government’s strategic objectives in the China’s Quest for Enhanced Military Technology podcast, hosted by our colleagues at Modern War Institute.

Non-state actors – mainly terrorist organizations – have focused more on weaponizing biotechnology. A personal laptop belonging to ISIS that was captured in Syria was found to contain lessons on making bubonic plague bombs and employing various weapons of mass destruction (WMDs). The possession of this dangerous information by the most notorious terrorist organization in the world is a testament to the worldwide proliferation of information. This challenge of weaponized biotechnology is exacerbated by the relative ease of obtaining material to carry out such attacks.

Watch Dr. Gary Ackerman‘s presentation on Non-State Actors and their Uses of Technology from the Mad Scientist Artificial Intelligence, Robotics, and Autonomy: Visioning Multi-Domain Battle in 2030-2050 Conference at Georgetown University, 7-8 March 2017.

There is a growing community of individual biohackers and “do it yourselfers” (DIYers) – super-empowered individuals pushing the boundaries of DNA editing, implants, embedded technologies (embeds), and unapproved chemical and biological injections. One of the most prominent biohackers, Josiah Zayner, a former NASA employee with a biophysics PhD, livestreamed his self-injection of CRISPR and has even started a company selling DIY CRISPR kits ranging from several hundred to over 1,000 dollars, effectively enabling biohackers to cheaply change their physiology, alter their appearance, and go beyond human biological norms. None of these treatments and augmentations are approved by regulatory agencies, and DIYers run the serious risk of harming themselves or unleashing destructive and disruptive biological agents upon an unwitting population.

Read our Mad Scientist Laboratory blog post on the Emergent Threat Posed by Super-Empowered Individuals.

Biotechnology is just one example of how potentially game-changing capabilities that were once only within the purview of our strategic competitors will be democratized via the global proliferation of information. In the Future Operational Environment, we can also expect to see artificial intelligence, multi-domain swarming, and space capabilities in the hands of non-state actors and super-empowered individuals.

85. Benefits, Vulnerabilities, and the Ethics of Soldier Enhancement

[Editor’s Note: The United States Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International at their Menlo Park, CA, campus on 8-9 March 2018, where participants discussed the advent of new biotechnologies and the associated benefits, vulnerabilities, and ethics associated with Soldier enhancement for the Army of the Future.  The following post is an excerpt from this conference’s final report.]

Source:  Max Pixel

Advances in synthetic biology likely will enhance future Soldier performance – speed, strength, endurance, and resilience – but will bring with them vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing the enhancement.

 

Emerging synthetic biology tools – e.g., CRISPR, TALEN, and ZFN – present an opportunity to engineer Soldiers’ DNA and enhance their abilities. Bioengineering is becoming easier and cheaper as a bevy of developments reduce biotechnology transaction costs in gene reading, writing, and editing. [1] Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Cognitive enhancement could make Soldiers more lethal, more decisive, and perhaps more resilient. Using neurofeedback, a process that allows a user to see their brain activity in real time, one can identify ideal brain states and use them to enhance an individual’s mental performance. Through the mapping and presentation of identified expert brains, novices can rapidly improve their acuity after just a few training sessions. [2] Further, studies are being conducted that explore the possibility of directly emulating those expert brain states with non-invasive EEG caps, which could improve performance almost immediately. [3] Dr. Amy Kruse, the Chief Scientific Officer at the Platypus Institute, referred to this phenomenon as “sitting on a gold mine of brains.”

There is also the potential to change and improve Soldiers’ physical attributes. Scientists can develop drugs and specific dietary plans, and potentially use genetic editing, to improve speed, strength, agility, and endurance.

Source: Andrew Herr, CEO Helicase

In order to fully leverage the capability of human performance enhancement, Andrew Herr, CEO of Helicase and an Adjunct Fellow at CNAS, suggested that human performance R&D be moved out of the medical field and become its own research area due to its differing objectives and the convergence between varying technologies.

Soldiers, Airmen, Marines, and Sailors are already trying to enhance themselves with commercial products – often containing unknown or unsafe ingredients – so it is incumbent on the U.S. military to, at the very least, help those who want to improve.

However, a host of new vulnerabilities at the genetic level accompany this revolutionary leap in human evolution. If adversaries can map the human genome and more thoroughly scan and understand the brain, they can target genomes and brains in the same ways. Soldiers could become incredibly vulnerable at the genomic level, forcing the Army not only to protect Soldiers using body armor and armored vehicles, but also to protect their identities, genomes, and physiologies.

Adversaries will exploit all biological enhancements to gain competitive advantage over U.S. forces. Targeted genome editing technology such as CRISPR will enable adversarial threats to employ super-empowered Soldiers on the battlefield and target specific populations with bioweapons. U.S. adversaries may use technologies recklessly to achieve short term gains with no consideration of long range effects. [4] [5]

There are numerous ethical questions that come with the enhancement of Soldiers, such as the moral acceptability of the Army making permanent enhancements to Soldiers, the responsibility for returning transitioning Soldiers to a “baseline human,” and how a “baseline human” is legally defined.

Transhumanism H+ symbol by Antonu / Source:  https://commons.wikimedia.org/wiki/File:Transhumanism_h%2B.svg

By altering, enhancing, and augmenting the biology of the human Soldier, the United States Army will potentially enter uncharted ethical territory. Instead of issuing items to Soldiers to complement their physical and cognitive assets, by 2050 the U.S. Army may have the will and the means to issue them increased biological abilities in those areas. The future implications and the limits or thresholds for enhancement have not yet been considered. The military is already willing to correct the vision of certain members – laser eye surgery, for example – a practice that could accurately be called human enhancement, so clearly defining where the threshold lies will be important. It is already known that other countries, and possible adversaries, are willing to cross lines that we are not. Russia, most recently, was banned from competition in the 2018 Winter Olympics for widespread performance-enhancing drug violations believed to be supported by the Russian government. [6] Those drugs violate the spirit of competition in the Olympics, but no such spirit exists in warfare.

Another consideration is whether or not Soldier enhancements are permanent. By enhancing Soldiers’ faculties, the Army is, in fact, enhancing their lethality and their ability to defeat the enemy. What happens with these enhancements – whether the Army can or should remove them – when a Soldier leaves the Army is an open question. As stated previously, the Army is willing and able to improve eyesight, but does not revert that eyesight back to its original state after the individual has separated. Some possible moral questions surrounding Soldier enhancement include:

• If the Army were to increase a Soldier’s stamina, visual acuity, resistance to disease, and pain tolerance, making them a more lethal warfighter, is it incumbent upon the Army to remove those enhancements?

• If the Soldier later used those enhancements in civilian life for nefarious purposes, would the Army be responsible?

Answers to these legal questions are beyond the scope of this paper, but they should be considered now, before these new technologies become widespread.

Image by Leonardo da Vinci / Source: Flickr

If the Army decides to reverse certain Soldier enhancements, it likely will need to determine the definition of a “baseline human.” This would establish norms for features, traits, and abilities that can be permanently enhanced and which must be removed before leaving service. This would undoubtedly involve both legal and moral challenges.

 

The complete Mad Scientist Bio Convergence and Soldier 2050 Final Report can be read here.

To learn more about the ramifications of Soldier enhancement, please go to:

– Dr. Amy Kruse’s Human 2.0 podcast, hosted by our colleagues at Modern War Institute.

– The Ethics and the Future of War panel discussion, facilitated by LTG Jim Dubik (USA-Ret.) from Day 2 (26 July 2017) of the Mad Scientist Visualizing Multi Domain Battle in 2030-2050 Conference at Georgetown University.


[1] Ahmad, Zarah and Stephanie Larson, “The DNA Utility in Military Environments,” slide 5, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[2] Kruse, Amy, “Human 2.0 Upgrading Human Performance,” Slide 12, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[3] https://www.frontiersin.org/articles/10.3389/fnhum.2016.00034/full
[4] https://www.technologyreview.com/the-download/610034/china-is-already-gene-editing-a-lot-of-humans/
[5] https://www.c4isrnet.com/unmanned/2018/05/07/russia-confirms-its-armed-robot-tank-was-in-syria/
[6] https://www.washingtonpost.com/sports/russia-banned-from-2018-olympics-following-doping-allegations/2017/12/05/9ab49790-d9d4-11e7-b859-fb0995360725_story.html?noredirect=on&utm_term=.d12db68f42d1

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from the controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial — “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game changing technologies. Unlike nuclear, biological, and chemical weapons, no internationally agreed and implemented control protocols can be applied to AI: there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code – a digital will-o’-the-wisp that can permeate the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operations (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC and its Military-Civil Fusion Centers, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions as soon as possible. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML), held in Stockholm, Sweden, in July 2018, explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and practitioners develop fair and privacy-preserving data analyses. As with ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commanders’ decisions) – especially lethal ones. There is a common misperception that machines make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, write the very code that enables machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
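The kind of subgroup verification called for above can be partially automated. As a minimal, hypothetical sketch (the function name, data, and metric choice are ours for illustration, not drawn from the ICML papers), the simplest audit compares a model’s positive-decision rates across subgroups – a “demographic parity” check:

```python
def demographic_parity_gap(decisions, groups):
    """Return the largest difference in positive-decision rates across groups.

    decisions: list of 0/1 model outputs (1 = favorable decision)
    groups:    list of subgroup labels, parallel to `decisions`
    """
    # Tally (total, positives) per subgroup.
    tallies = {}
    for decision, group in zip(decisions, groups):
        total, positives = tallies.get(group, (0, 0))
        tallies[group] = (total + 1, positives + decision)
    # Positive-decision rate per subgroup, then the spread between extremes.
    rates = {g: p / t for g, (t, p) in tallies.items()}
    return max(rates.values()) - min(rates.values())

# A model that favors 75% of group A but only 25% of group B:
decisions = [1, 1, 1, 0, 1, 0, 0, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap = demographic_parity_gap(decisions, groups)
print(f"parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50, large enough to flag
```

Real fairness auditing involves several competing metrics (equalized odds, calibration, and others) that cannot all be satisfied at once; this sketch only illustrates the simplest check that a verification pipeline might run.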

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected human decision-making. Participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers told the participants that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down. Unexpectedly, more participants resisted shutting down the functional robots after their plea than the sociable ones. This is significant: beyond the unexpected result, it shows that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities that mimic and emulate human behavior, despite knowing they are robotic. If the Army believes its Soldiers will be heavily accompanied and augmented by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought computing and communication deep into global households, the smartphone revolution brought about constant personal interconnectivity. Today and into the future, not only are humans connected to the global commons via their smart devices, but a multitude of devices, vehicles, and accessories are being integrated into the Internet of Things (IoT). We have previously addressed the IoT as a game changing technology: composed of trillions of internet-linked items, it creates both opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (e.g., appliances, home devices such as Google Home and Alexa, and various sensors) to approximately $520 billion. This is a massive investment in what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it will likely expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data; allowing service members to use personal devices; and maintaining operations and network security and integrity (see the recent banning of personal fitness trackers). A segment of IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” growing increasingly likely, intelligence officials warn of the rising danger of cyber attacks. The effects of these attacks have already been felt around the world. They have the power to break the trust people place in institutions, companies, and governments, as they act in the undefined gray zone between peace and all-out war. The military implications are clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence communications and materiel and personnel networks. Beyond the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others are reluctant to invest in cyber protection measures. In this way, civilians are affected by acts of cyber warfare, and attacks on a country may be directed not at the opposing military but at the civilian population of a state, as in the case of the power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to combat cyber threats most effectively.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

78. The Classified Mind – The Cyber Pearl Harbor of 2034

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following post by guest blogger Dr. Jan Kallberg, faculty member, United States Military Academy at West Point, and Research Scientist with the Army Cyber Institute at West Point. His post serves as a cautionary tale regarding our finite intellectual resources and the associated existential threat in failing to protect them!]

Preface: Based on my experience in cybersecurity, migrating to the broader cyber field, there have always been those exceptional individuals with an irreplicable ability to see the challenge early on, create a technical solution, and know how to play it in the right order for maximum impact. They are out there – the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of Artificial Intelligence increases our reliance on these highly capable individuals, because someone must set the rules and the boundaries and point out the trajectory for Artificial Intelligence at its initiation.

Source: https://thebulletin.org/2017/10/neuroscience-and-the-new-weapons-of-the-mind/

As an industrial society, we tend to see technology, and the information that feeds it, as the weapons – and to ignore the few humans who have a large-scale direct impact. Even if identified as a weapon, how do you make a human mind classified? Can we protect these high-ability individuals who, in the digital world, are weapons – not tools but compilers of capability – or are we still focused on the tools? Why do we see only weapons of steel and electronics, and not the weaponized mind? I firmly believe that we underestimate the importance of Applicable Intelligence – the ability to play the cyber engagement in the optimal order. Adversaries are often good observers because they are scouting for our weak spots. I set the stage for the following post in 2034: close enough to be realistic, and far enough out for things to happen, when our adversaries are betting that we rely more on a few minds than we are willing to accept.

Post: In a not too distant future, on the 20th of August 2034, a peer adversary’s first strategic moves are the targeted killings of fewer than twenty individuals as they go about their daily lives: watching a 3-D printer make a protein sandwich at a breakfast restaurant; stepping off the downtown Chicago monorail; or taking a taste of a poison-filled retro Jolt Cola. In the gray zone, when the geopolitical temperature increases but we are not yet at war, our adversary acts quickly and expedites a limited number of targeted killings within the United States of persons who are unknown to mass media and the general public and have only one thing in common – Applicable Intelligence (AI).

The ability to apply is a far greater asset than the technology itself. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are publicly available – anyone can download them from the Internet and use them – but the weaponization of the tools occurs when they are used by someone who understands how to play them in an optimal order. These minds are different because they see an opportunity to exploit in a digital fog of war where others don’t or can’t see it. They address problems unburdened by traditional thinking, in new and innovative ways, maximizing the dual-purpose nature of digital tools, and can create tangible cyber effects.

It is Applicable Intelligence (AI) that creates the procedures, applies the tools, and combines simple digital software into digitally lethal weapons. This AI is the intelligence to mix, match, tweak, and arrange dual-purpose software. In 2034, it is as if you had the supernatural ability to create a thermonuclear bomb from what you can find at Kroger or Albertsons.

Sadly we missed it; we didn’t see it. We never left the 20th century. Our adversary saw it clearly and at the dawn of conflict killed off the weaponized minds, without discretion, and with no concern for international law or morality.

These intellects are weapons of growing strategic magnitude. In 2034, the United States missed the importance of these few intellects. This error left them unprotected.

All of our efforts instead focused on what they delivered – the applications and the technology – which was hidden in secret vaults and only discussed in sensitive compartmented information facilities. We classify to the highest level to ensure the confidentiality and integrity of our cyber capabilities. Meanwhile, on the most critical component, the militarized intellect, we place no value, because it is human. In a society marinated in an engineering mindset, humans are like desk space, electricity, and broadband: a commodity that is an input to the production of the technical machinery. The marveled-at technical machinery is the only thing we care about today, in 2018, and, as it turned out, in 2034 as well.

We are stuck in how we think, and we are unable to see it coming, but our adversaries see it. At a systematic level, we are unable to see humans as the weapon itself, maybe because we like to see weapons as something tangible, painted black, tan, or green, that can be stored and brought to action when needed. As the armory of the war of 1812, as the stockpile of 1943, and as the launch pad of 2034. Arms are made of steel, or fancier metals, with electronics – we failed in 2034 to see weapons made of corn, steak, and an added combative intellect.

General Nakasone stated in 2017, “Our best ones [coders] are 50 or 100 times better than their peers,” and continued “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.” In reality, the success of cyber and cyber operations is highly dependent not on the tools or toolsets but instead upon the super-empowered individual that General Nakasone calls “the 50-x coder.”

Manhattan Project K-25 Gaseous Diffusion Process Building, Oak Ridge, TN / Source: atomicarchive.com

There were clear signals that we could have noticed before General Nakasone pointed it out in 2017. The United States’ Manhattan Project during World War II had 125,000 workers on the payroll at its peak, but the intellects that drove the project to success and completion were few. The difference between the Manhattan Project and the future of cyber is that we were unable to see the human as a weapon, locked in by our path dependency as an engineering society in which we hail the technology and forget the importance of the humans behind it.

J. Robert Oppenheimer – the militarized intellect behind the  Manhattan Project / Source: Life Magazine

America’s endless love of technical innovations and advanced machinery is reflected in a nation that has celebrated mechanical wonders and engineered solutions since its creation. For America, technical wonders are a sign of prosperity, ability, self-determination, and advancement – a story that started in the early days of the colonies and ran through the transcontinental railroad, the Panama Canal, the manufacturing era, and the moon landing, all the way to today’s autonomous systems, drones, and robots. In this default mindset, there is always a tool, an automated process, a piece of software, or a set of technical steps that can solve a problem.

The same mindset sees humans merely as an input to technology, so humans are interchangeable and can be replaced. But in 2034 – the era of digital conflicts and wars between algorithms, with engagements occurring at machine speed and no time for leadership or human interaction – it is the intellects that design the systems and understand how to play them. We didn’t see it.

In 2034, with fewer than twenty bodies piled up after targeted killings, resides the Cyber Pearl Harbor. It was not imploding critical infrastructure, a tsunami of cyber attacks, nor hackers flooding our financial systems, but instead traditional lead and gunpowder. The super-empowered individuals are gone, and we are stuck in a digital war at speeds we don’t understand, unable to play it in the right order, and with limited intellectual torque to see through the fog of war provided by an exploding kaleidoscope of nodes and digital engagements.

Source: Shutterstock

If you enjoyed this post, read our Personalized Warfare post.

Dr. Jan Kallberg is currently an Assistant Professor of Political Science with the Department of Social Sciences, United States Military Academy at West Point, and a Research Scientist with the Army Cyber Institute at West Point. He was earlier a researcher with the Cyber Security Research and Education Institute, The University of Texas at Dallas, and is a part-time faculty member at George Washington University. Dr. Kallberg earned his Ph.D. and MA from the University of Texas at Dallas and earned a JD/LL.M. from Juridicum Law School, Stockholm University. Dr. Kallberg is a certified CISSP, ISACA CISM, and serves as the Managing Editor for the Cyber Defense Review. He has authored papers in the Strategic Studies Quarterly, Joint Forces Quarterly, IEEE IT Professional, IEEE Access, IEEE Security and Privacy, and IEEE Technology and Society.

68. Bio Convergence and Soldier 2050 Conference Final Report

[Editor’s Note: The U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International on 8–9 March 2018 at their Menlo Park campus in California. This conference explored bio convergence, what the Army’s Soldier of 2050 will look like, and how they will interact and integrate with their equipment. The following post is an excerpt from this conference’s final report.]

Source: U.S. Army photo by SPC Joshua P. Morris

While the technology and concepts defining warfare have continuously and rapidly transformed, the primary actor in warfare – the human – has remained largely unchanged. Soldiers today may be physically larger, more thoroughly trained, and better equipped than their historical counterparts, but their capabilities and performance remain very similar.

These limitations in human performance, however, may change over the next 30 years, as advances in biotechnology and human performance likely will expand the boundaries of what is possible for humans to achieve. We may see Soldiers – not just their equipment – with superior vision, enhanced cognitive abilities, disease/virus resistance, and increased strength, speed, agility, and endurance. As a result, these advances could provide the Soldier with an edge to survive and thrive on the hyperactive, constantly changing, and increasingly lethal Multi-Domain Battlespace.

Source: The Guardian and Lynsey Irvine/Getty

In addition to potentially changing the individual physiology and abilities of the future Soldier, there are many technological innovations on the horizon that will impact human performance. The convergence of these technologies – artificial intelligence (AI), robotics, augmented reality, brain-machine interfaces, nanotechnologies, and biological and medical improvements to the human – is referred to as bio convergence. Soldiers of the future will have enhanced capabilities due to technologies that will be installed, instilled, and augmented. This convergence will also force the Army to come to terms with what kinds of bio-converged technologies it will accept in new recruits.

The conference generated the following key findings:

Source: RodMartin.org

• The broad advancement of biotechnologies will provide wide access to dangerous and powerful bioweapons and human enhancements. The low cost and low expertise entry point into gene editing, human performance enhancement, and bioweapon production has spurred a string of new explorations into this arena by countries with large defense budgets (e.g., China), non-state criminal and terrorist organizations (e.g., ISIS), and even super-empowered individuals willing to subject their bodies to experimental and risky treatments.

Source: Shutterstock

• Emerging synthetic biology tools (e.g., CRISPR, TALEN, and ZFN) present an opportunity to engineer Soldiers’ DNA and enhance their performance, providing greater speed, strength, endurance, and resilience. These tools, however, will also create new vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing enhancement. Bioengineering is becoming easier and cheaper as a bevy of developments reduce biotechnology transaction costs in gene reading, writing, and editing. Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Source: Getty Images

• Ensuring that our land forces are ready to meet future challenges requires optimizing biotechnology and neuroscience advancements. Designer viruses and diseases will be highly volatile, mutative, and extremely personalized, potentially challenging an already stressed Army medical response system and its countermeasures. Synthetic biology provides numerous applications that will bridge capability gaps and enable future forces to fight effectively; its defense applications range from sensing capabilities to rapidly developed vaccines and therapeutics.

Source: Rockwell Collins / Aviation Week

• Private industry and academia have become the driving force behind innovation. While there are some benefits to this – such as shorter development times – there are also risks. For example, investments in industry are mainly driven by market demand which can lead to a lack of investment in areas that are vital to National Defense but have low to no consumer demand. In academia, a majority of graduate students in STEM fields are foreign nationals, comprising over 80% of electrical and petroleum engineering programs. The U.S. will need to find a way to maintain its technological superiority even when most of the expertise eventually leaves the country.

Source: World Health Organization

• The advent of new biotechnologies will give rise to moral, regulatory, and legal challenges for the Army of the Future – its business practices, recruiting requirements, Soldier standards, and structure. The rate of technology development in the synthetic biology field is increasing rapidly. Private individuals or small start-ups with minimal capital can create a new organism for which there is no current countermeasure, and developing one will likely take years. This possibility creates the dilemma of swiftly crafting effective policy and regulation that addresses these concerns without stifling creativity and productivity for those conducting legitimate research. Current regulation may not be sufficient, and bureaucratic inflexibility prevents quick reactive and proactive change. Our adversaries may not move as readily to adopt harsher regulations in the biotechnology arena. Rather than focusing on short-term solutions, it may be beneficial to take a holistic approach centered on a world where biotechnology interacts with everyday life. The U.S. may have to work from a relative “disadvantage,” using safe and legal methods of enhancement, while our adversaries may choose to operate below our defined legal threshold.

Bio Convergence is incredibly important to the Army of the Future because the future Soldier is the Bio. The Warrior of tomorrow’s Army will be given more responsibility, will be asked to do more, will be required to be more capable, and will face more challenges and complexities than ever before. These Soldiers must be able to quickly adapt, change, connect to and disconnect from a multitude of networks – digital and otherwise – all while carrying out multiple mission-sets in an increasingly disrupted, degraded, and arduous environment marred with distorted reality, information warfare, and attacks of a personalized nature.

For additional information regarding this conference:

• Review the Lessons Learned from the Bio Convergence and Soldier 2050 Conference preliminary assessment.

• Read the entire Mad Scientist Bio Convergence and Soldier 2050 Conference Final Report.

• Watch the conference’s video presentations.

• See the associated presentations’ briefing slides.

• Check out the associated “Call for Ideas” writing contest finalist submissions, hosted by our colleagues at Small Wars Journal.

 

65. “The Queue”

[Editor’s Note:  Now that another month has flown by, Mad Scientist Laboratory is pleased to present our June edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Source: KUO CHENG LIAO

1. Collaborative Intelligence: Humans and AI are Joining Forces, by H. James Wilson and Paul R. Daugherty, Harvard Business Review, July – August 2018.

 

Source: OpenAI

A Team of AI Algorithms just crushed Expert Humans in a Complex Computer Game, by Will Knight, MIT Technology Review, June 25, 2018.

I know — I cheated and gave you two articles to read. These “dueling” articles demonstrate the early state of our understanding of the role of humans in decision-making. The Harvard Business Review article describes findings in which human-Artificial Intelligence (AI) partnerships combine the leadership, teamwork, creativity, and social skills of humans with the speed, scalability, and quantitative capabilities of AI. This is basically the idea of “centaur” chess, which has been prevalent in discussions of human and AI collaboration. Conversely, the MIT Technology Review article describes ongoing work to build AI algorithms that are incentivized to collaborate with other AI teammates. Could it be that collaboration is not a uniquely human attribute? The ongoing work on integrating AI into the workforce and into CEO decision-making could inform the Army’s investment strategy for AI. Julianne Gallina, one of our proclaimed Mad Scientists, described a future where everyone would have an entourage and Commanders would have access to a “Patton in the Pocket.” How the human operates on or in the loop, and how Commanders make decisions at machine speed, will be informed by this research. In August, the Mad Scientist team will conduct a conference focused on Learning in 2050 to further explore the ideas of human and AI teaming with intelligent tutors and mentors.

Source: Doubleday

2. Origin: A Novel, by Dan Brown, Doubleday, October 3, 2017, reviewed by Ms. Marie Murphy.

Dan Brown’s famous symbologist Robert Langdon returns to avenge the murder of his friend, tech developer and futurist Edmund Kirsch, who is killed in the middle of presenting what he advertised as a life-changing discovery. Langdon teams up with Kirsch’s most faithful companion, his AI assistant Winston, in order to release Edmund’s presentation to the public. Winston is able to access Kirsch’s entire network, give real-time directions, and make decisions based on ambiguous commands — all via Kirsch’s smartphone. However, this AI system doesn’t appear to know Kirsch’s personal password, and can only assist Langdon in his mission to find it. An omnipresent and portable assistant like Winston could greatly aid future warfighters and commanders. Having this scope of knowledge on demand is beneficial, but future AI will be able not only to regurgitate data, but also to present the Soldier with courses of action analyses and decision options based on that data. Winston was also able to mimic emotion via machine learning, which could reduce Soldier stress levels and present information in a humanistic manner. Once an AI has been attached to a Soldier for a period of time, it can learn that Soldier’s particular preferences and habits, make basic or routine decisions and assumptions for that individual, and anticipate their needs, as Winston does for Kirsch and Langdon.

Source: Getty Images adapted by CNAS

3. Technology Roulette: Managing Loss of Control as Many Militaries Pursue Technological Superiority, by Richard Danzig, Center for a New American Security, 30 May 2018.

Mad Scientist Laboratory readers are already familiar with the expression, “warfare at machine speed.” As our adversaries close the technology gap and potentially overtake us in select areas, there is clearly a “need for speed.”

“… speed matters — in two distinct dimensions. First, autonomy can increase decision speed, enabling the U.S. to act inside an adversary’s operations cycle. Secondly, ongoing rapid transition of autonomy into warfighting capabilities is vital if the U.S. is to sustain military advantage.” — Defense Science Board (DSB) Report on Autonomy, June 2016 (p. 3).

In his monograph, however, author and former Clinton Administration Secretary of the Navy Richard Danzig contends that “superiority is not synonymous with security,” citing the technological proliferation that almost inevitably follows technological innovations and the associated risks of unintended consequences resulting from the loss of control of military technologies. Contending that the pursuit of speed is a form of technological roulette, former Secretary Danzig proposes a control methodology of five initiatives to help mitigate the risks posed by disruptive technologies, and calls for increased multilateral planning with both our allies and opponents. Unfortunately, as with the doomsday scenario played out in Nevil Shute’s novel On the Beach, it is “… the little ones, the Irresponsibles…” that have propagated much of the world’s misery in the decades following the end of the Cold War. These Irresponsible nations, along with non-state actors and Super-Empowered Individuals experimenting with and potentially unleashing disruptive technologies, will not be contained by any non-proliferation protocols or controls. Indeed, neither will our near-peer adversaries, if these technologies promise to offer a revolutionary, albeit fleeting, Offset capability.

U.S. Vice Chairman of the Joint Chiefs of Staff Air Force Gen. Paul Selva, Source: Alex Wong/Getty Images

4. The US made the wrong bet on radiofrequency, and now it could pay the price, by Aaron Mehta, C4ISRNET, 21 Jun 2018.

This article illustrates how the Pentagon’s faith in its own technology led the Department of Defense to trust that it would maintain dominance over the electromagnetic spectrum for years to come. That decision left the United States vulnerable to new leaps in technology made by our near-peers. GEN Paul Selva, Vice Chairman of the Joint Chiefs of Staff, has concluded that the Pentagon must now catch up with near-peer nations and reestablish our dominance in electronic warfare and networking (spoiler alert: we are not there yet!). This is an example of a pink flamingo (a “known known”), as we know our near-peers have surpassed us technologically in some areas. In looking at technological forecasts for the next decade, we must ensure that the U.S. is making the right investments in Science and Technology to keep pace with our near-peers. This article demonstrates that timely and decisive policy-making will be paramount in keeping up with our adversaries in the fast-changing and agile Operational Environment.

Source: MIT CSAIL

5. MIT Device Uses WiFi to ‘See’ Through Walls and Track Your Movements, by Kaleigh Rogers, MOTHERBOARD, 13 June 2018.

Researchers at MIT have discovered a way to “see” people through walls by tracking WiFi signals that bounce off of their bodies. Previously, the technology’s fidelity was limited to “blobs” behind a wall, essentially telling you that someone was present but giving no indication of their behavior. The breakthrough is the use of a trained neural network to interpret the bouncing signals and match them to the shape of the human skeleton. This is significant because it could give an added degree of specificity to first responders or fire teams clearing rooms. The ability to determine whether an individual on the other side of a wall is a potential hostile holding a weapon or a non-combatant holding a cellphone could be the difference between life and death. This also raises questions about countermeasures. WiFi signals are seemingly everywhere and, with this technology, could prove to be a large signature emitter. Will future forces need to incorporate uniforms or materials that absorb these waves, or scatter them in a way that distorts them?
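The skeleton-matching approach can be pictured as a network emitting one confidence map per body keypoint, with each keypoint’s location recovered from its map’s peak. The toy sketch below fabricates two such maps and decodes them; every name and shape here is an illustrative assumption, not MIT’s actual pipeline.

```python
import numpy as np

# Toy sketch of the decoding step in a pose-from-RF-style pipeline.
# A real system trains a neural network on wireless reflections to emit
# one confidence map per body keypoint; here we fabricate two such maps
# and show how keypoint coordinates are recovered from them.
# (Illustrative assumptions only -- this is not MIT's actual code.)

def gaussian_map(shape, center, sigma=2.0):
    """Fabricate an H x W confidence map peaked at `center`
    (a stand-in for one channel of a network's output)."""
    rows, cols = np.indices(shape)
    return np.exp(-((rows - center[0]) ** 2 + (cols - center[1]) ** 2)
                  / (2 * sigma ** 2))

def decode_keypoints(confidence_maps):
    """Recover the (row, col) location of the peak in each confidence map."""
    return [tuple(int(i) for i in np.unravel_index(np.argmax(m), m.shape))
            for m in confidence_maps]

# Two fake per-keypoint maps: "head" at (5, 10), "left hand" at (20, 4).
maps = [gaussian_map((32, 32), (5, 10)), gaussian_map((32, 32), (20, 4))]
print(decode_keypoints(maps))  # -> [(5, 10), (20, 4)]
```

The hard part in the real system is, of course, the trained network that turns raw radio reflections into those confidence maps; the decoding shown here is only the final, simple step.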

Source: John T. Consoli / University of Maryland

6. People recall information better through virtual reality, says new UMD study, University of Maryland, EurekaAlert, 13 June 2018.

A study performed by the University of Maryland determined that people recall information better when they first see it in a 3D virtual environment, as opposed to on a 2D desktop or mobile screen. The Virtual Reality (VR) system takes advantage of what’s called “spatial mnemonic encoding,” which allows the brain not only to remember something visually, but also to assign it a place in three-dimensional space, which helps with retention and recall. This technique could accelerate learning and enhance retention when we train our Soldiers and Leaders. As VR hardware becomes smaller, lighter, and more affordable, custom mission sets, or the skills necessary to accomplish them, could be learned on the fly, in theater, on a compressed timeline. This also allows education to be distributed and networked globally without the need for a traditional classroom.

Source: Potomac Books

7. Strategy Strikes Back: How Star Wars Explains Modern Military Conflict, edited by Max Brooks, John Amble, ML Cavanaugh, and Jaym Gates; Foreword by GEN Stanley McChrystal, Potomac Books, May 1, 2018.

This book is fascinating for two reasons: 1) It utilizes one of the greatest science fiction series (almost a genre unto itself) to brilliantly illustrate some military strategy concepts, and 2) It is chock full of Mad Scientists as contributors. One of the editors, John Amble, is a permanent Mad Scientist team member, while another editor, Max Brooks, author of World War Z, and contributor August Cole are officially proclaimed Mad Scientists.

The book takes a number of scenes and key battles in Star Wars and uses historical analogies to help present complex issues like civil-military command structure, counterinsurgency pitfalls, force structuring, and battlefield movement and maneuver.

One of the more interesting portions of the book is the concept of ‘droid armies vs. clone soldiers and the juxtaposition of that with the future testing of manned-unmanned teaming (MUM-T) concepts. There are parallels in how we think about what machines can and can’t do and how they think and learn.

 
If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

59. Fundamental Questions Affecting Army Modernization

[Editor’s Note:  The Operational Environment (OE) is the start point for Army Readiness – now and in the Future. The OE answers the question, “What is the Army ready for?”  Without the OE in training and Leader development, Soldiers and Leaders are “practicing” in a benign condition, without the requisite rigor to forge those things essential for winning in a complex, multi-domain battlefield.  Building the Army’s future capabilities, a critical component of future readiness, requires this same start point.  The assumptions the Army makes about the Future OE are the sine qua non start point for developing battlefield systems — these assumptions must be at the forefront of decision-making for all future investments.]

There are no facts about the future. Leaders interested in building future ready organizations must develop assumptions about possible futures and these assumptions require constant scrutiny. Leaders must also make decisions based on these assumptions to posture organizations to take advantage of opportunities and to mitigate risks. Making these decisions is fundamental to building future readiness.

Source: Evan Jensen, ARL

The TRADOC G-2 has made the following foundational assumptions about the future that can serve as launch points for important questions about capability requirements and capabilities under development. These assumptions are further described in An Advanced Engagement Battlespace: Tactical, Operational and Strategic Implications for the Future Operational Environment, published by our colleagues at Small Wars Journal.

1. Contested in all domains (air, land, sea, space, and cyber). Increased lethality, by virtue of ubiquitous sensors, proliferated precision, high kinetic energy weapons, and advanced area munitions, further enabled by autonomy, robotics, and Artificial Intelligence (AI), with an increasing potential for overmatch. Adversaries will restrict us to temporary windows of advantage with periods of physical and electronic isolation.

Source: Army Technology

2. Concealment is difficult on the future battlefield. Hiding from advanced sensors — where practicable — will require dramatic reduction of heat, electromagnetic, and optical signatures. Traditional hider techniques such as camouflage, deception, and concealment will have to extend to “cross-domain obscuration” in the cyber domain and the electromagnetic spectrum. Canny competitors will monitor their own emissions in real-time to understand and mitigate their vulnerabilities in the “battle of signatures.” Alternatively, “hiding in the open” within complex terrain clutter and near-constant relocation might be feasible, provided such relocation could outpace future recon / strike targeting cycles. Adversaries will operate among populations in complex terrain, including dense urban areas.

3. Trans-regional, gray zone, and hybrid strategies with both regular and irregular forces, criminal elements, and terrorists attacking our weaknesses and mitigating our advantages. The ensuing spectrum of competition will range from peaceful, legal activities through violent, mass upheavals and civil wars to traditional state-on-state, unlimited warfare.

Source: Science Photo Library / Van Parys Media

4. Adversaries include states, non-state actors, and super-empowered individuals, with non-state actors and super-empowered individuals now having access to Weapons of Mass Effect (WME), cyber, space, and Nuclear/Biological/Chemical (NBC) capabilities. Their operational reach will range from tactical to global, and the application of their impact from one domain into another will be routine. These advanced engagements will also be interactive across the multiple dimensions of conflict: not only across every domain in the physical dimension, but also the cognitive dimension of information operations, and even the moral dimension of beliefs and values.

Source: Northrop Grumman

5. Increased speed of human interaction, events, and action, with democratized and rapidly proliferating capabilities, means constant co-evolution between competitors. Recon / Strike effectiveness is a function of sensors, shooters, the connections between them, and the targeting process driving decisions. Therefore, in a contest between peer competitors with comparable capabilities, advantage will fall to the one that is better integrated and makes better and faster decisions.

These assumptions become useful when they translate into potential decision criteria that Leaders can rely on when evaluating systems being developed for the future battlefield. Each of the following questions is fundamental to ensuring the Army is prepared to operate in the future.

Source: Lockheed Martin

1. How will this system operate when disconnected from a network? Units will be disconnected from their networks on future battlefields. Capabilities that require constant timing and precision geo-locational data will be prioritized for disruption by adversaries with capable EW systems.

2. What signature does this system present to an adversary? It is difficult to hide on the future battlefield and temporary windows of advantage will require formations to reduce their battlefield signatures. Capabilities that require constant multi-directional broadcast and units with large mission command centers will quickly be targeted and neutralized.

Image credit: Alexander Kott

3. How does this system operate in dense urban areas? The physical terrain in dense urban areas and megacities creates concrete canyons isolating units electronically and physically. Automated capabilities operating in dense population areas might also increase the rate of false signatures, confusing, rather than improving, Commander decision-making. New capabilities must be able to operate disconnected in this terrain. Weapons systems must be able to slew and elevate rapidly to engage vertical targets. Automated systems and sensors will require significant training sets to reduce the rate of false signatures.

Source: Military Embedded Systems

4. How does this system take advantage of open and modular architectures? The rapid rate of technological innovation will offer great opportunities to militaries capable of rapidly integrating prototypes into formations. Capabilities developed with open and modular architectures can be upgraded with autonomous and AI enablers as those mature. Early investment in closed-system capabilities will freeze Armies in place during a period of rapid co-evolution and expose them to overmatch.

5. How does this capability help win in competition short of conflict with a near-peer competitor? Near-peer competitors will seek to achieve limited objectives short of direct conflict with the U.S. Army. Capabilities will need to be effective at operating in the gray zone as well as serving as a deterrent. They will need to be capable of strategic employment from CONUS-based installations.

If you enjoyed this post, check out the following items of interest:

    • Join SciTech Futures‘ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and on into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live 11 June 2018 — click here to learn more!