94. The Wide Range of Competition

[Editor’s Note: Mad Scientist tracks convergence trends that are changing the character of future warfare. The democratization of technologies and the global proliferation of information are among these trends, expanding the arena of high-end threat capabilities beyond nation-states to include non-state actors and super-empowered individuals. Today’s post illustrates how the democratization of one such capability, biotechnology, affects the Future Operational Environment.]

As discussed during the Mad Scientist Bio Convergence and Soldier 2050 Conference, co-hosted with SRI International at Menlo Park, California, last spring, the broad advancement of biotechnologies will provide wide access to dangerous and powerful bioweapons and human enhancements. The low cost and low expertise entry point into gene editing, human performance enhancement, and bioweapon production has spurred a string of new explorations into this arena by countries with large defense budgets (e.g., China), non-state criminal and terrorist organizations (e.g., ISIS), and even super-empowered individuals willing to subject their bodies to experimental and risky treatments or augmentations.

China has invested billions of dollars in biotechnology – including in several U.S. biotechnology firms – and plans to drive its own bio revolution. Gene editing is one of the areas where China has sought to leapfrog the United States through ambitious Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) projects, editing the genes of 86 individuals while the United States is just now approaching human trials. Additionally, Elsa Kania, an expert on Chinese emerging technology at the Center for a New American Security (CNAS), noted that China is now seeking to build its own innovation base rather than focusing on intellectual property theft and technology transfers.

Listen to Ms. Kania’s discussion addressing technological priorities and how they map onto the Chinese government’s strategic objectives in the China’s Quest for Enhanced Military Technology podcast, hosted by our colleagues at Modern War Institute.

Non-state actors – mainly terrorist organizations – have focused more on weaponizing biotechnology. A laptop captured from ISIS in Syria was found to contain lessons on making bubonic plague bombs and on employing various weapons of mass destruction (WMD). The possession of this dangerous information by one of the world’s most notorious terrorist organizations is a testament to the worldwide proliferation of information. This challenge of weaponized biotechnology is exacerbated by the relative ease of obtaining the material needed to carry out such attacks.

Watch Dr. Gary Ackerman’s presentation on Non-State Actors and their Uses of Technology from the Mad Scientist Artificial Intelligence, Robotics, and Autonomy: Visioning Multi-Domain Battle in 2030-2050 Conference at Georgetown University, 7-8 March 2017.

There is a growing community of individual biohackers and “do it yourselfers” (DIYers) – super-empowered individuals pushing the boundaries of DNA editing, implants, embedded technologies (embeds), and unapproved chemical and biological injections. One of the most prominent biohackers is Josiah Zayner, a former NASA employee with a biophysics PhD, who livestreamed his self-injection of CRISPR and has started a company selling DIY CRISPR kits ranging from several hundred to over 1,000 dollars, effectively enabling biohackers to cheaply change their physiology, alter their appearance, and push beyond human biological norms. None of these treatments and augmentations are approved by regulatory agencies, and DIYers run the serious risk of harming themselves or unleashing destructive and disruptive biological agents upon an unwitting population.

Read our Mad Scientist Laboratory blog post on the Emergent Threat Posed by Super-Empowered Individuals.

Biotechnology is just one example of how potentially game changing capabilities that were once solely within the purview of our strategic competitors will be democratized via the global proliferation of information. In the Future Operational Environment, we can also expect to see artificial intelligence, multi-domain swarming, and space capabilities in the hands of non-state actors and super-empowered individuals.

85. Benefits, Vulnerabilities, and the Ethics of Soldier Enhancement

[Editor’s Note: The United States Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International at their Menlo Park, CA, campus on 8-9 March 2018, where participants discussed the advent of new biotechnologies and the benefits, vulnerabilities, and ethics associated with Soldier enhancement for the Army of the Future. The following post is an excerpt from this conference’s final report.]

Source:  Max Pixel

Advances in synthetic biology will likely enhance future Soldier performance – speed, strength, endurance, and resilience – but will bring with them vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing the enhancement.

 

Emerging synthetic biology tools – e.g., CRISPR, TALEN, and ZFN – present an opportunity to engineer Soldiers’ DNA and enhance their abilities. Bioengineering is becoming easier and cheaper as a bevy of developments reduce biotechnology transaction costs in gene reading, writing, and editing. [1] Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Cognitive enhancement could make Soldiers more lethal, more decisive, and perhaps more resilient. Using neurofeedback, a process that allows users to see their brain activity in real time, one can identify ideal brain states and use them to enhance an individual’s mental performance. Through the mapping and presentation of identified expert brains, novices can rapidly improve their acuity after just a few training sessions. [2] Further, studies are being conducted that explore the possibility of directly emulating those expert brain states with non-invasive EEG caps, which could improve performance almost immediately. [3] Dr. Amy Kruse, the Chief Scientific Officer at the Platypus Institute, referred to this phenomenon as “sitting on a gold mine of brains.”

There is also the potential to change and improve Soldiers’ physical attributes. Scientists can develop drugs, design specific dietary plans, and potentially use genetic editing to improve speed, strength, agility, and endurance.

Source: Andrew Herr, CEO Helicase

In order to fully leverage the capability of human performance enhancement, Andrew Herr, CEO of Helicase and an Adjunct Fellow at CNAS, suggested that human performance R&D be moved out of the medical field and become its own research area due to its differing objectives and the convergence between varying technologies.

Soldiers, Airmen, Marines, and Sailors are already trying to enhance themselves with commercial products – often containing unknown or unsafe ingredients – so it is incumbent on the U.S. military to, at the very least, help those who want to improve.

However, a host of new vulnerabilities at the genetic level accompany this revolutionary leap in human evolution. If we can map the human genome and more thoroughly scan and understand the brain, adversaries can target genomes and brains in the same ways. Soldiers could become incredibly vulnerable at the genomic level, forcing the Army to not only protect Soldiers with body armor and armored vehicles, but also protect their identities, genomes, and physiologies.

Adversaries will exploit all biological enhancements to gain competitive advantage over U.S. forces. Targeted genome editing technology such as CRISPR will enable adversarial threats to employ super-empowered Soldiers on the battlefield and target specific populations with bioweapons. U.S. adversaries may use these technologies recklessly to achieve short-term gains with no consideration of long-range effects. [4] [5]

Numerous ethical questions accompany the enhancement of Soldiers, such as the moral acceptability of the Army making permanent enhancements to Soldiers, the responsibility for returning transitioning Soldiers to a “baseline human,” and how a “baseline human” should be legally defined in the first place.

Transhumanism H+ symbol by Antonu / Source:  https://commons.wikimedia.org/wiki/File:Transhumanism_h%2B.svg

By altering, enhancing, and augmenting the biology of the human Soldier, the United States Army will potentially enter uncharted ethical territory. Instead of issuing items to Soldiers to complement their physical and cognitive assets, by 2050 the U.S. Army may have the will and the means to issue them increased biological abilities in those areas. The future implications and the limits or thresholds for enhancement have not yet been considered. The military is already willing to correct the vision of certain members – laser eye surgery, for example – a practice that could accurately be referred to as human enhancement, so clearly defining where the threshold lies will be important. It is already known that other countries, and possible adversaries, are willing to cross lines that we are not. Russia, most recently, was banned from competition in the 2018 Winter Olympics for widespread performance-enhancing drug violations that were believed to be supported by the Russian Government. [6] Those drugs violate the spirit of competition in the Olympics, but no such spirit exists in warfare.

Another consideration is whether or not Soldier enhancements are permanent. By enhancing Soldiers’ faculties, the Army is, in fact, enhancing their lethality or their ability to defeat the enemy. What happens with these enhancements – whether the Army can or should remove them – when a Soldier leaves the Army is an open question. As stated previously, the Army is willing and able to improve eyesight, but does not revert that eyesight back to its original state after the individual has separated. Some possible moral questions surrounding Soldier enhancement include:

• If the Army were to increase a Soldier’s stamina, visual acuity, resistance to disease, and pain tolerance, making them a more lethal warfighter, is it incumbent upon the Army to remove those enhancements?

• If the Soldier later used those enhancements in civilian life for nefarious purposes, would the Army be responsible?

Answers to these legal questions are beyond the scope of this paper, but they can be considered now, before these new technologies become widespread.

Image by Leonardo da Vinci / Source: Flickr

If the Army decides to reverse certain Soldier enhancements, it likely will need to define a “baseline human.” This definition would establish norms for which features, traits, and abilities can be permanently enhanced and which must be removed before leaving service. It would undoubtedly involve both legal and moral challenges.

 

The complete Mad Scientist Bio Convergence and Soldier 2050 Final Report can be read here.

To learn more about the ramifications of Soldier enhancement, please go to:

– Dr. Amy Kruse’s Human 2.0 podcast, hosted by our colleagues at Modern War Institute.

– The Ethics and the Future of War panel discussion, facilitated by LTG Jim Dubik (USA-Ret.) from Day 2 (26 July 2017) of the Mad Scientist Visualizing Multi Domain Battle in 2030-2050 Conference at Georgetown University.


[1] Ahmad, Zarah and Stephanie Larson, “The DNA Utility in Military Environments,” slide 5, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[2] Kruse, Amy, “Human 2.0 Upgrading Human Performance,” slide 12, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[3] https://www.frontiersin.org/articles/10.3389/fnhum.2016.00034/full
[4] https://www.technologyreview.com/the-download/610034/china-is-already-gene-editing-a-lot-of-humans/
[5] https://www.c4isrnet.com/unmanned/2018/05/07/russia-confirms-its-armed-robot-tank-was-in-syria/
[6] https://www.washingtonpost.com/sports/russia-banned-from-2018-olympics-following-doping-allegations/2017/12/05/9ab49790-d9d4-11e7-b859-fb0995360725_story.html?noredirect=on&utm_term=.d12db68f42d1

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from the controlling – “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic – “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial – “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game changing technologies. Unlike Nuclear, Biological, and Chemical weapons, no internationally agreed and implemented control protocols can be applied to AI, as there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code; a digital will-o’-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operations (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC with its Military-Civil Fusion Centers, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. This conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and practitioners develop fair and privacy-preserving data analyses. Like the ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commanders’ decisions) – especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, write the very programs that enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
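To make the idea of verifying machine learning outputs for subgroup bias a bit more concrete, here is a minimal, hypothetical sketch – not drawn from the ICML papers themselves – that computes a simple demographic-parity gap between two notional subgroups. The data, group labels, and audit threshold are illustrative assumptions only.

```python
# Illustrative sketch only: one simple fairness probe (demographic parity),
# checking whether a model's positive-decision rate differs across subgroups.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decisions from some model: 1 = favorable outcome, 0 = not.
decisions = rng.integers(0, 2, size=1000)
# Hypothetical subgroup label for each decision (e.g., region A vs. region B).
groups = rng.choice(["A", "B"], size=1000)

rates = {g: decisions[groups == g].mean() for g in ("A", "B")}
gap = abs(rates["A"] - rates["B"])

print(f"Positive-decision rates: {rates}")
print(f"Demographic parity gap:  {gap:.3f}")

# A large gap flags the model for review; it does not by itself prove
# unfairness, and other criteria (equalized odds, calibration) may be
# more appropriate depending on the application.
if gap > 0.05:  # illustrative threshold, an assumption
    print("Gap exceeds threshold -- audit the model and its training data.")
```

In practice the decisions would come from a real model and the subgroup labels from real demographic or operational data; the point of the sketch is that “verifying outputs” can start with simple, auditable checks like this one.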

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down. The participants’ actions were then recorded. Unexpectedly, a large number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant. It shows, beyond the unexpected result, that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if those entities mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and augmented heavily by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought home computing and communication ever deeper into global households, the smartphone revolution brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans being connected to the global commons via their smart devices, but a multitude of devices, vehicles, and accessories are being integrated into the Internet of Things (IoT). We have previously addressed the IoT as a game changing technology. The IoT is composed of trillions of internet-linked items, creating both opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (e.g., appliances, home devices [such as Google Home, Alexa, etc.], and various sensors) to approximately $520 billion. This is a massive investment in what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it will likely expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data, allowing service members to use personal devices, and maintaining operations and network security and integrity (see the recent ban on personal fitness trackers). A segment of IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue forces’ ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” becoming increasingly likely, intelligence officials warn of the rising danger of cyber attacks. The effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments, as they act in the undefined gray zone between peace and all-out war. The military implications are quite clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence, communications, and materiel and personnel networks. Beyond the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others remain reluctant to invest in cyber protection measures. In this way, civilians become affected by acts of cyber warfare, and attacks on a country may be directed not at the opposing military but at the civilian population of a state, as in the case of the power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to most effectively combat cyber threats.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

78. The Classified Mind – The Cyber Pearl Harbor of 2034

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following post by guest blogger Dr. Jan Kallberg, faculty member, United States Military Academy at West Point, and Research Scientist with the Army Cyber Institute at West Point. His post serves as a cautionary tale regarding our finite intellectual resources and the associated existential threat in failing to protect them!]

Preface: Based on my experience in cybersecurity, and in migrating to the broader cyber field, there have always been those exceptional individuals who have a rare ability to see the challenge early on, create a technical solution, and know how to play it in the right order for maximum impact. They are out there – the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of Artificial Intelligence increases our reliance on these highly capable individuals – because someone must set the rules and the boundaries, and point out the trajectory for Artificial Intelligence at initiation.

Source: https://thebulletin.org/2017/10/neuroscience-and-the-new-weapons-of-the-mind/

As an industrial society, we tend to see technology, and the information that feeds it, as the weapons – and ignore the few humans who have a large-scale direct impact. Even if identified as a weapon, how do you make a human mind classified? Can we protect these high-ability individuals who, in the digital world, are weapons – not tools, but compilers of capability – or are we still focused on the tools? Why do we see only weapons made of steel and electronics, and not the weaponized mind? I believe firmly that we underestimate the importance of Applicable Intelligence – the ability to play the cyber engagement in the optimal order. Adversaries are often good observers because they are scouting for our weak spots. I set the stage for the following post in 2034, close enough to be realistic and far enough out for things to happen, when our adversaries are betting that we rely more on a few minds than we are willing to accept.

Post: In a not too distant future, on the 20th of August 2034, a peer adversary’s first strategic moves are the targeted killings of fewer than twenty individuals as they go about their daily lives: watching a 3-D printer making a protein sandwich at a breakfast restaurant; stepping out from the downtown Chicago monorail; or taking a taste of a poison-filled retro Jolt Cola. In the gray zone, when the geopolitical temperature increases but we are not yet at war, our adversary acts quickly and expedites a limited number of targeted killings within the United States of persons who are unknown to mass media and the general public, and who have only one thing in common – Applicable Intelligence (AI).

The ability to apply is a far greater asset than the technology itself. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are publicly available; anyone can download them from the Internet and use them, but the weaponization of the tools occurs when they are used by someone who understands how to play them in an optimal order. These minds are different because they see an opportunity to exploit in a digital fog of war where others don’t or can’t see it. They address problems unburdened by traditional thinking, in new innovative ways, maximizing the dual-purpose nature of digital tools, and can create tangible cyber effects.

It is Applicable Intelligence (AI) that creates the procedures, applies the tools, and turns simple digital software – in sets or combinations – into digitally lethal weapons. This AI is the intelligence to mix, match, tweak, and arrange dual-purpose software. In 2034, it is as if you had the supernatural ability to create a thermonuclear bomb from what you can find at Kroger or Albertsons.

Sadly we missed it; we didn’t see it. We never left the 20th century. Our adversary saw it clearly and at the dawn of conflict killed off the weaponized minds, without discretion, and with no concern for international law or morality.

These intellects are weapons of growing strategic magnitude. In 2034, the United States missed the importance of these few intellects. This error left them unprotected.

All of our efforts were instead focused on what they delivered – the applications and the technology – which was hidden in secret vaults and only discussed in sensitive compartmented information facilities. Therefore, we classify to the highest level to ensure the confidentiality and integrity of our cyber capabilities. Meanwhile, we put no value on the most critical component, the militarized intellect, because it is human. In a society marinated in an engineering mindset, humans are like desk space, electricity, and broadband: a commodity that is an input to the production of the technical machinery. The marveled-at technical machinery is the only thing we care about today, in 2018, and, as it turned out, in 2034 as well.

We are stuck in how we think, and we are unable to see it coming, but our adversaries see it. At a systemic level, we are unable to see humans as the weapon itself, maybe because we like to see weapons as something tangible, painted black, tan, or green, that can be stored and brought to action when needed – as the armory of the War of 1812, the stockpile of 1943, and the launch pad of 2034. Arms are made of steel, or fancier metals, with electronics – and so we failed in 2034 to see weapons made of corn, steak, and an added combative intellect.

General Nakasone stated in 2017, “Our best ones [coders] are 50 or 100 times better than their peers,” and continued “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.” In reality, the success of cyber and cyber operations is highly dependent not on the tools or toolsets but instead upon the super-empowered individual that General Nakasone calls “the 50-x coder.”

Manhattan Project K-25 Gaseous Diffusion Process Building, Oak Ridge, TN / Source: atomicarchive.com

There were clear signals that we could have noticed before General Nakasone pointed it out so clearly in 2017. The United States’ Manhattan Project during World War II had at its peak 125,000 workers on the payroll, but the intellects that drove the project to success and completion were few. The difference between the Manhattan Project and the future of cyber is that we were unable to see the human as a weapon, being locked in by our path dependency as an engineering society, where we hail the technology and forget the importance of the humans behind it.

J. Robert Oppenheimer – the militarized intellect behind the  Manhattan Project / Source: Life Magazine

America’s endless love of technical innovations and advanced machinery is reflected in a nation that has celebrated mechanical wonders and engineered solutions since its creation. For America, technical wonders are a sign of prosperity, ability, self-determination, and advancement – a story that started in the early days of the colonies and ran through the transcontinental railroad, the Panama Canal, the manufacturing era, the moon landing, and all the way to today’s autonomous systems, drones, and robots. In the default mindset, there is always a tool, an automated process, a piece of software, or a set of technical steps that can solve a problem or act.

The same mindset sees humans merely as an input to technology, so humans are interchangeable and can be replaced. But in 2034 – the era of digital conflicts and wars between algorithms, with engagements occurring at machine speed and no time for leadership or human interaction – it is the intellects that design these contests and understand how to play them that matter. We didn’t see it.

In 2034, with fewer than twenty bodies piled up after targeted killings, resides the Cyber Pearl Harbor. It was not imploding critical infrastructure, a tsunami of cyber attacks, nor hackers flooding our financial systems, but instead traditional lead and gunpowder. The super-empowered individuals are gone, and we are stuck in a digital war at speeds we don’t understand, unable to play it in the right order, and with limited intellectual torque to see through the fog of war provided by an exploding kaleidoscope of nodes and digital engagements.

Source: Shutterstock

If you enjoyed this post, read our Personalized Warfare post.

Dr. Jan Kallberg is currently an Assistant Professor of Political Science with the Department of Social Sciences, United States Military Academy at West Point, and a Research Scientist with the Army Cyber Institute at West Point. He was earlier a researcher with the Cyber Security Research and Education Institute, The University of Texas at Dallas, and is a part-time faculty member at George Washington University. Dr. Kallberg earned his Ph.D. and MA from the University of Texas at Dallas and earned a JD/LL.M. from Juridicum Law School, Stockholm University. Dr. Kallberg is a certified CISSP, ISACA CISM, and serves as the Managing Editor for the Cyber Defense Review. He has authored papers in the Strategic Studies Quarterly, Joint Forces Quarterly, IEEE IT Professional, IEEE Access, IEEE Security and Privacy, and IEEE Technology and Society.

68. Bio Convergence and Soldier 2050 Conference Final Report

[Editor’s Note: The U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International on 8–9 March 2018 at their Menlo Park campus in California. This conference explored bio convergence, what the Army’s Soldier of 2050 will look like, and how they will interact and integrate with their equipment. The following post is an excerpt from this conference’s final report.]

Source: U.S. Army photo by SPC Joshua P. Morris

While the technology and concepts defining warfare have continuously and rapidly transformed, the primary actor in warfare – the human – has remained largely unchanged. Soldiers today may be physically larger, more thoroughly trained, and better equipped than their historical counterparts, but their underlying capabilities and performance limits remain very similar.

These limitations in human performance, however, may change over the next 30 years, as advances in biotechnology and human performance likely will expand the boundaries of what is possible for humans to achieve. We may see Soldiers – not just their equipment – with superior vision, enhanced cognitive abilities, disease/virus resistance, and increased strength, speed, agility, and endurance. As a result, these advances could provide the Soldier with an edge to survive and thrive on the hyperactive, constantly changing, and increasingly lethal Multi-Domain Battlespace.

Source: The Guardian and Lynsey Irvine/Getty

In addition to potentially changing the individual physiology and abilities of the future Soldier, there are many technological innovations on the horizon that will impact human performance. The convergence of these technologies – artificial intelligence (AI), robotics, augmented reality, brain-machine interfaces, nanotechnologies, and biological and medical improvements to the human – is referred to as bio convergence. Soldiers of the future will have enhanced capabilities due to technologies that will be installed, instilled, and augmented. This convergence will also force the Army to come to terms with what kinds of bio-converged technologies will be acceptable in new recruits.

The conference generated the following key findings:

Source: RodMartin.org

• The broad advancement of biotechnologies will provide wide access to dangerous and powerful bioweapons and human enhancements. The low cost and low expertise entry point into gene editing, human performance enhancement, and bioweapon production has spurred a string of new explorations into this arena by countries with large defense budgets (e.g.,  China), non-state criminal and terrorist organizations (e.g., ISIS), and even super-empowered individuals willing to subject their bodies to experimental and risky treatments.

Source: Shutterstock

• Emerging synthetic biology tools (e.g., CRISPR, TALEN, and ZFN) present an opportunity to engineer Soldiers’ DNA and enhance their performance, providing greater speed, strength, endurance, and resilience. These tools, however, will also create new vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing enhancement. Bioengineering is becoming easier and cheaper as a bevy of developments reduce biotechnology transaction costs in gene reading, writing, and editing. Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Source: Getty Images

• Ensuring that our land forces are ready to meet future challenges requires optimizing biotechnology and neuroscience advancements.  Designer viruses and diseases will be highly volatile, mutative, and extremely personalized, potentially challenging an already stressed Army medical response system and its countermeasures.  Synthetic biology provides numerous applications that will bridge capability gaps and enable future forces to fight effectively. Future synthetic biology defense applications are numerous and range from sensing capabilities to rapidly developed vaccines and therapeutics.

Source: Rockwell Collins / Aviation Week

• Private industry and academia have become the driving force behind innovation. While there are some benefits to this – such as shorter development times – there are also risks. For example, investments in industry are mainly driven by market demand, which can lead to a lack of investment in areas that are vital to National Defense but have little to no consumer demand. In academia, a majority of graduate students in STEM fields are foreign nationals, comprising over 80% of electrical and petroleum engineering programs. The U.S. will need to find a way to maintain its technological superiority even if most of that expertise eventually leaves the country.

Source: World Health Organization

• The advent of new biotechnologies will give rise to moral, regulatory, and legal challenges for the Army of the Future, its business practices, recruiting requirements, Soldier standards, and structure. The rate of technology development in the synthetic biology field is increasing rapidly. Private individuals or small start-ups with minimal capital can create a new organism for which there is no current countermeasure, and developing one will likely take years. This potentiality leads to the dilemma of swiftly creating effective policy and regulation that addresses these concerns while not stifling creativity and productivity for those conducting legitimate research. Current regulation may not be sufficient, and bureaucratic inflexibility prevents quick reactive and proactive change. Our adversaries may not be as ready to adopt strict regulations in the biotechnology arena. Rather than focusing on short-term solutions, it may be beneficial to take a holistic approach centered on a world where biotechnology interacts with everyday life. The U.S. may have to work from a relative “disadvantage,” using safe and legal methods of enhancement, while our adversaries may choose to operate below our defined legal threshold.

Bio Convergence is incredibly important to the Army of the Future because the future Soldier is the bio. The Warrior of tomorrow’s Army will be given more responsibility, will be asked to do more, will be required to be more capable, and will face more challenges and complexities than ever before. These Soldiers must be able to quickly adapt, change, and connect to and disconnect from a multitude of networks – digital and otherwise – all while carrying out multiple mission sets in an increasingly disrupted, degraded, and arduous environment marred by distorted reality, information warfare, and attacks of a personalized nature.

For additional information regarding this conference:

• Review the Lessons Learned from the Bio Convergence and Soldier 2050 Conference preliminary assessment.

• Read the entire Mad Scientist Bio Convergence and Soldier 2050 Conference Final Report.

• Watch the conference’s video presentations.

• See the associated presentations’ briefing slides.

• Check out the associated “Call for Ideas” writing contest finalist submissions, hosted by our colleagues at Small Wars Journal.

 

65. “The Queue”

[Editor’s Note:  Now that another month has flown by, Mad Scientist Laboratory is pleased to present our June edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Source: KUO CHENG LIAO

1. Collaborative Intelligence: Humans and AI are Joining Forces, by H. James Wilson and Paul R. Daugherty, Harvard Business Review, July – August 2018.

 

Source: OpenAI

A Team of AI Algorithms just crushed Expert Humans in a Complex Computer Game, by Will Knight, MIT Technology Review, June 25, 2018.

I know — I cheated and gave you two articles to read. These “dueling” articles demonstrate the early state of our understanding of the role of humans in decision-making. The Harvard Business Review article describes findings in which human-Artificial Intelligence (AI) partnerships combine the leadership, teamwork, creativity, and social skills of humans with the speed, scalability, and quantitative capabilities of AI. This is basically the idea of “centaur” chess, which has been prevalent in discussions of human and AI collaboration. Conversely, the MIT Technology Review article describes the ongoing work to build AI algorithms that are incentivized to collaborate with other AI teammates. Could it be that collaboration is not a uniquely human attribute? The ongoing work on integrating AI into the workforce and in support of CEO decision-making could inform the Army’s investment strategy for AI. Julianne Gallina, one of our proclaimed Mad Scientists, described a future where everyone would have an entourage and Commanders would have access to a “Patton in the Pocket.” How the human operates on or in the loop and how Commanders make decisions at machine speed will be informed by this research. In August, the Mad Scientist team will conduct a conference focused on Learning in 2050 to further explore the ideas of human and AI teaming with intelligent tutors and mentors.

Source: Doubleday

2. Origin: A Novel, by Dan Brown, Doubleday, October 3, 2017, reviewed by Ms. Marie Murphy.

Dan Brown’s famous symbologist Robert Langdon returns to avenge the murder of his friend, tech developer and futurist Edmund Kirsch. Kirsch is killed in the middle of presenting what he advertised as a life-changing discovery, and Langdon teams up with Kirsch’s most faithful companion, his AI assistant Winston, in order to release Edmund’s presentation to the public. Winston is able to access Kirsch’s entire network, give real-time directions, and make decisions based on ambiguous commands – all via Kirsch’s smartphone. However, this AI system doesn’t appear to know Kirsch’s personal password, and can only enable Langdon in his mission to find it. An omnipresent and portable assistant like Winston could greatly aid future warfighters and commanders. Having this scope of knowledge on command is beneficial, but future AI will be able to not only regurgitate data, but also present the Soldier with course of action analyses and decision options based on those data. Winston was also able to mimic emotion via machine learning, which can reduce Soldier stress levels and present information in a humanistic manner. Once an AI has been attached to a Soldier for a period of time, it can learn the particular preferences and habits of that Soldier, and make basic or routine decisions and assumptions for that individual, anticipating their needs, as Winston does for Kirsch and Langdon.

Source: Getty Images adapted by CNAS

3. Technology Roulette: Managing Loss of Control as Many Militaries Pursue Technological Superiority, by Richard Danzig, Center for a New American Security, 30 May 2018.

Mad Scientist Laboratory readers are already familiar with the expression, “warfare at machine speed.” As our adversaries close the technology gap and potentially overtake us in select areas, there is clearly a “need for speed.”

“… speed matters — in two distinct dimensions. First, autonomy can increase decision speed, enabling the U.S. to act inside an adversary’s operations cycle. Secondly, ongoing rapid transition of autonomy into warfighting capabilities is vital if the U.S. is to sustain military advantage.” — Defense Science Board (DSB) Report on Autonomy, June 2016 (p. 3).

In his monograph, however, author and former Clinton Administration Secretary of the Navy Richard Danzig contends that “superiority is not synonymous with security,” citing the technological proliferation that almost inevitably follows technological innovations and the associated risks of unintended consequences resulting from the loss of control of military technologies. Contending that the pursuit of speed is a form of technological roulette, former Secretary Danzig proposes a control methodology of five initiatives to help mitigate the risks posed by disruptive technologies, and calls for increased multilateral planning with both our allies and opponents. Unfortunately, as with the doomsday scenario played out in Nevil Shute’s novel On the Beach, it is “… the little ones, the Irresponsibles…” that have propagated much of the world’s misery in the decades following the end of the Cold War. It is the specter of these Irresponsible nations, along with non-state actors and Super-Empowered Individuals, experimenting with and potentially unleashing disruptive technologies, who will not be contained by any non-proliferation protocols or controls. Indeed, neither will our near-peer adversaries, if these technologies promise to offer a revolutionary, albeit fleeting, Offset capability.

U.S. Vice Chairman of the Joint Chiefs of Staff Air Force Gen. Paul Selva, Source: Alex Wong/Getty Images

4. The US made the wrong bet on radiofrequency, and now it could pay the price, by Aaron Mehta, C4ISRNET, 21 Jun 2018.

This article illustrates how the Pentagon’s faith in its own technology led the Department of Defense to trust that it would maintain dominance over the electromagnetic spectrum for years to come. That decision left the United States vulnerable to new leaps in technology made by our near-peers. GEN Paul Selva, Vice Chairman of the Joint Chiefs of Staff, has concluded that the Pentagon must now catch up with near-peer nations and reestablish our dominance of electronic warfare and networking (spoiler alert – we currently are not dominant!). This is an example of a pink flamingo (a known known), as we know our near-peers have surpassed us in technological dominance in some cases. In looking at technological forecasts for the next decade, we must ensure that the U.S. is making the right investments in Science and Technology to keep up with our near-peers. This article demonstrates that timely and decisive policy-making will be paramount in keeping up with our adversaries in the fast changing and agile Operational Environment.

Source: MIT CSAIL

5. MIT Device Uses WiFi to ‘See’ Through Walls and Track Your Movements, by Kaleigh Rogers, MOTHERBOARD, 13 June 2018.

Researchers at MIT have discovered a way to “see” people through walls by tracking WiFi signals that bounce off of their bodies. Previously, the technology’s fidelity was limited to “blobs” behind a wall, essentially telling you that someone was present but giving no indication of their behavior. The breakthrough is using a trained neural network to identify the bouncing signals and compare them with the shape of the human skeleton. This is significant because it could give an added degree of specificity to first responders or fire teams clearing rooms. The ability to determine whether an individual on the other side of the wall is a potentially hostile person holding a weapon or a non-combatant holding a cellphone could be the difference between life and death. This also brings up questions about countermeasures. WiFi signals are seemingly everywhere and, with this technology, could prove to be a large signature emitter. Will future forces need to incorporate uniforms or materials that absorb these waves or scatter them in a way that distorts them?
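For readers curious what the “RF reflections in, skeleton out” idea looks like in code, here is a deliberately simplified sketch – an illustrative assumption, not the MIT team’s actual model or training pipeline – of a small convolutional network that maps a WiFi reflection “heatmap” to per-joint confidence maps. The channel count, grid size, and number of joints are all notional.

```python
# Illustrative sketch only (not the researchers' architecture): a tiny
# convolutional network mapping RF reflection frames to per-joint confidence
# maps, mirroring the "signals in, skeleton out" idea described above.
import torch
import torch.nn as nn

class RFToKeypoints(nn.Module):
    def __init__(self, rf_channels: int = 2, num_joints: int = 14):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(rf_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, num_joints, kernel_size=1)  # one map per joint

    def forward(self, rf_frames: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(rf_frames))

model = RFToKeypoints()
fake_rf = torch.randn(1, 2, 64, 64)    # notional batch of RF "heatmaps"
keypoint_maps = model(fake_rf)         # shape (1, 14, 64, 64) confidence maps
print(keypoint_maps.shape)
```

The real system would need labeled training data (the published work reportedly leaned on camera-based pose estimates for supervision) and far more capacity; the sketch is only meant to show the shape of the mapping the article describes.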

Source: John T. Consoli / University of Maryland

6. People recall information better through virtual reality, says new UMD study, University of Maryland, EurekaAlert, 13 June 2018.

A study performed by the University of Maryland determined that people recall information better when seeing it first in a 3D virtual environment, as opposed to on a 2D desktop or mobile screen. The Virtual Reality (VR) system takes advantage of what’s called “spatial mnemonic encoding,” which allows the brain to not only remember something visually, but also assign it a place in three-dimensional space, aiding retention and recall. This technique could accelerate learning and enhance retention when we train our Soldiers and Leaders. As VR hardware becomes smaller, lighter, and more affordable, custom mission sets, or the skills necessary to accomplish them, could be learned on the fly, in theater, on a compressed timeline. This also allows education to be distributed and networked globally without the need for a traditional classroom.

Source: Potomac Books

7. Strategy Strikes Back: How Star Wars Explains Modern Military Conflict, edited by Max Brooks, John Amble, ML Cavanaugh, and Jaym Gates; Foreword by GEN Stanley McChrystal, Potomac Books, May 1, 2018.

This book is fascinating for two reasons: 1) it utilizes one of the greatest science fiction series (almost a genre unto itself) to brilliantly illustrate some military strategy concepts, and 2) it is chock full of Mad Scientists as contributors. One of the editors, John Amble, is a permanent Mad Scientist team member, while another editor, Max Brooks, author of World War Z, and contributor August Cole are officially proclaimed Mad Scientists.

The book takes a number of scenes and key battles in Star Wars and uses historical analogies to help present complex issues like civil-military command structure, counterinsurgency pitfalls, force structuring, and battlefield movement and maneuver.

One of the more interesting portions of the book is the concept of ‘droid armies vs. clone soldiers and the juxtaposition of that with the future testing of manned-unmanned teaming (MUM-T) concepts. There are parallels in how we think about what machines can and can’t do and how they think and learn.

 
If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

59. Fundamental Questions Affecting Army Modernization

[Editor’s Note:  The Operational Environment (OE) is the start point for Army Readiness – now and in the Future. The OE answers the question, “What is the Army ready for?”  Without the OE in training and Leader development, Soldiers and Leaders are “practicing” in a benign condition, without the requisite rigor to forge those things essential for winning in a complex, multi-domain battlefield.  Building the Army’s future capabilities, a critical component of future readiness, requires this same start point.  The assumptions the Army makes about the Future OE are the sine qua non start point for developing battlefield systems — these assumptions must be at the forefront of decision-making for all future investments.]

There are no facts about the future. Leaders interested in building future ready organizations must develop assumptions about possible futures and these assumptions require constant scrutiny. Leaders must also make decisions based on these assumptions to posture organizations to take advantage of opportunities and to mitigate risks. Making these decisions is fundamental to building future readiness.

Source: Evan Jensen, ARL

The TRADOC G-2 has made the following foundational assumptions about the future that can serve as launch points for important questions about capability requirements and capabilities under development. These assumptions are further described in An Advanced Engagement Battlespace: Tactical, Operational and Strategic Implications for the Future Operational Environment, published by our colleagues at Small Wars Journal.

1. Contested in all domains (air, land, sea, space, and cyber). Increased lethality, by virtue of ubiquitous sensors, proliferated precision, high kinetic energy weapons and advanced area munitions, further enabled by autonomy, robotics, and Artificial Intelligence (AI) with an increasing potential for overmatch. Adversaries will restrict us to temporary windows of advantage with periods of physical and electronic isolation.

Source: Army Technology

2. Concealment is difficult on the future battlefield. Hiding from advanced sensors — where practicable — will require dramatic reduction of heat, electromagnetic, and optical signatures. Traditional hider techniques such as camouflage, deception, and concealment will have to extend to “cross-domain obscuration” in the cyber domain and the electromagnetic spectrum. Canny competitors will monitor their own emissions in real-time to understand and mitigate their vulnerabilities in the “battle of signatures.” Alternately, “hiding in the open” within complex terrain clutter and near-constant relocation might be feasible, provided such relocation could outpace future recon / strike targeting cycles.   Adversaries will operate among populations in complex terrain, including dense urban areas.

3. Trans-regional, gray zone, and hybrid strategies with both regular and irregular forces, criminal elements, and terrorists attacking our weaknesses and mitigating our advantages. The ensuing spectrum of competition will range from peaceful, legal activities through violent, mass upheavals and civil wars to traditional state-on-state, unlimited warfare.

Source: Science Photo Library / Van Parys Media

4. Adversaries include states, non-state actors, and super-empowered individuals, with non-state actors and super empowered individuals now having access to Weapons of Mass Effect (WME), cyber, space, and Nuclear/Biological/ Chemical (NBC) capabilities. Their operational reach will range from tactical to global, and the application of their impact from one domain into another will be routine. These advanced engagements will also be interactive across the multiple dimensions of conflict, not only across every domain in the physical dimension, but also the cognitive dimension of information operations, and even the moral dimension of belief and values.

Source: Northrop Grumman

5. Increased speed of human interaction, events, and action, combined with democratized and rapidly proliferating capabilities, means constant co-evolution between competitors. Recon/strike effectiveness is a function of its sensors, shooters, their connections, and the targeting process driving decisions. Therefore, in a contest between peer competitors with comparable capabilities, advantage will fall to the one that is better integrated and makes better and faster decisions.
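One way to make this concrete is a simple back-of-the-envelope model (an illustrative sketch with assumed attrition values, not a validated combat simulation): two otherwise identical recon/strike complexes, differing only in how long each takes to close a sensor-to-shooter cycle.

```python
# Toy illustration (not a validated model): two peer recon/strike complexes,
# identical except for the time it takes to close one sensor-to-shooter cycle.
# Each completed cycle attrites a fixed fraction of the opposing force.

def engagement(cycle_a_min, cycle_b_min, attrition_per_cycle=0.05,
               duration_min=360, step_min=1.0):
    """Return remaining force fractions for sides A and B after duration_min."""
    force_a, force_b = 1.0, 1.0
    next_a, next_b = cycle_a_min, cycle_b_min
    t = 0.0
    while t < duration_min and force_a > 0.05 and force_b > 0.05:
        t += step_min
        if t >= next_a:                      # A completes a targeting cycle
            force_b -= attrition_per_cycle * force_a
            next_a += cycle_a_min
        if t >= next_b:                      # B completes a targeting cycle
            force_a -= attrition_per_cycle * force_b
            next_b += cycle_b_min
    return max(force_a, 0.0), max(force_b, 0.0)

# A closes its cycle in 10 minutes, B in 15 -- identical weapons, faster decisions.
a, b = engagement(cycle_a_min=10, cycle_b_min=15)
print(f"After 6 hours: A retains {a:.0%} of its force, B retains {b:.0%}")
```

The specific percentages are not the point; the shape of the outcome is. A modest edge in decision speed, applied repeatedly over an engagement, compounds into a decisive advantage.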

These assumptions become useful when they translate to potential decision criteria that Leaders can rely on when evaluating systems being developed for the future battlefield. Each of the following questions is fundamental to ensuring the Army is prepared to operate in the future.

Source: Lockheed Martin

1. How will this system operate when disconnected from a network? Units will be disconnected from their networks on future battlefields. Capabilities that require constant network timing and precise geo-locational data will be prioritized for disruption by adversaries with capable Electronic Warfare (EW) systems.

2. What signature does this system present to an adversary? It is difficult to hide on the future battlefield and temporary windows of advantage will require formations to reduce their battlefield signatures. Capabilities that require constant multi-directional broadcast and units with large mission command centers will quickly be targeted and neutralized.

Image credit: Alexander Kott

3. How does this system operate in dense urban areas? The physical terrain in dense urban areas and megacities creates concrete canyons that isolate units electronically and physically. Automated capabilities operating in densely populated areas might also increase the rate of false signatures, confusing, rather than improving, Commanders’ decision-making (see the short calculation following this list of questions). New capabilities must be able to operate disconnected in this terrain. Weapons systems must be able to slew and elevate rapidly to engage vertical targets. Automated systems and sensors will require significant training sets to reduce the rate of false signatures.

Source: Military Embedded Systems

4. How does this system take advantage of open and modular architectures? The rapid rate of technological innovation will offer great opportunities to militaries capable of rapidly integrating prototypes into formations. Capabilities developed with open and modular architectures can be upgraded with autonomous and AI enablers as they mature. Early investment in closed-system capabilities will freeze Armies in place during a period of rapid co-evolution and cede overmatch to more adaptable adversaries.

5. How does this capability help win in competition short of conflict with a near-peer competitor? Near-peer competitors will seek to achieve limited objectives short of direct conflict with the U.S. Army. Capabilities will need to be effective at operating in the gray zone as well as serving as a deterrent. They will need to be capable of strategic employment from CONUS-based installations.
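Returning to question 3 above, a quick base-rate calculation (with purely illustrative numbers, not drawn from any fielded sensor) shows why automated sensing in dense urban clutter can overwhelm Commanders with false signatures:

```python
# Illustrative base-rate arithmetic (assumed numbers, not from any fielded system):
# an automated sensor screening objects in a dense urban sector.

objects_scanned_per_hour = 100_000   # people, vehicles, devices in a megacity sector
true_threats_per_hour = 5            # actual targets among them
false_positive_rate = 0.01           # sensor flags 1% of non-threats
detection_rate = 0.95                # sensor catches 95% of real threats

false_alarms = (objects_scanned_per_hour - true_threats_per_hour) * false_positive_rate
true_detections = true_threats_per_hour * detection_rate

precision = true_detections / (true_detections + false_alarms)
print(f"Alerts per hour: {false_alarms + true_detections:.0f}")
print(f"Fraction of alerts that are real threats: {precision:.2%}")
# Roughly 1,000 false alarms per hour against ~5 real detections: under 1% of
# alerts are genuine, which is why large training sets and signature
# discrimination matter so much in dense population areas.
```

Even a seemingly excellent sensor produces a flood of false alarms when the ratio of clutter to genuine targets is this lopsided.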

If you enjoyed this post, check out the following items of interest:

    • Join SciTech Futures‘ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and on into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live 11 June 2018 — click here to learn more!

52. Potential Game Changers

The Mad Scientist Initiative brings together cutting-edge leaders and thinkers from the technology industry, research laboratories, academia, and across the military and Government to explore the impact of potentially disruptive technologies. Much like Johannes Gutenberg’s moveable type, these transformational game changers have the potential to impact how we live, create, think, and prosper. Understanding their individual and convergent impacts is essential to continued battlefield dominance in the Future Operational Environment. In accordance with The Operational Environment and the Changing Character of Future Warfare, we have divided this continuum into two distinct timeframes:

The Era of Accelerated Human Progress (Now through 2035):
The period where our adversaries can take advantage of new technologies, new doctrine, and revised strategic concepts to effectively challenge U.S. military forces across multiple domains. Game changers during this era include:

• Robotics: Forty-plus countries are developing military robots with some level of autonomy, with significant impacts on society and employment.
Vulnerable: To Cyber/Electromagnetic (EM) disruption and battery life limitations; raises ethical concerns without a man in the loop.
Formats: Unmanned/autonomous ground, air, surface, and subsurface systems; nano-weapons.
Examples: (Air) Hunter/killer Unmanned Aerial Vehicle (UAV) swarms; (Ground) Russian Uran: recon, ATGMs, SAMs.

• Artificial Intelligence: Human-Agent Teaming, where humans and intelligent systems work together to achieve either a physical or mental task. The human and the intelligent system will trade off cognitive and physical loads in a collaborative fashion.

• Swarms/Semi-Autonomous: Massed, coordinated, fast, collaborative, small, stand-off. Overwhelm target systems. Mass or disaggregate (see the toy coordination sketch following this list).



• Internet of Things (IoT): Trillions of internet-linked items create opportunities and vulnerabilities. Explosive growth in low Size, Weight, and Power (SWaP) connected devices (Internet of Battlefield Things), especially for sensor applications (situational awareness). Greater than 100 devices per human. Significant end-device processing (sensor analytics, sensor to shooter, supply chain management).
Vulnerable: To Cyber/EM/Power disruption. Privacy concerns regarding location and tracking.
Sensor to shooter: Accelerates the kill chain, data processing, and decision-making.

• Space: Over 50 nations operate in space, which is increasingly congested and difficult to monitor, endangering Positioning, Navigation, and Timing (PNT).

GPS Jamming/Spoofing: Increasingly sophisticated; used successfully in Ukraine.
Anti-Satellite: China has tested two direct-ascent anti-satellite missiles.
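As a toy illustration of the swarm entry above (a minimal consensus-style rendezvous sketch, not a description of any fielded system), the snippet below shows how a group of unmanned vehicles can mass on a common point with no central controller, each vehicle communicating only with its nearest neighbors:

```python
# Toy decentralized rendezvous (consensus averaging) -- illustrative only.
# Each vehicle repeatedly moves to the average of its own and its nearest
# neighbors' positions; the swarm contracts toward a common rally area
# without any central controller.
import random

NUM_VEHICLES = 20
NEIGHBORS = 3          # each vehicle only hears its 3 nearest peers
STEPS = 50

# Random initial positions in a 10 km x 10 km box.
positions = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(NUM_VEHICLES)]

def nearest(idx, points, k):
    """Indices of the k nearest other points to points[idx]."""
    x, y = points[idx]
    order = sorted((j for j in range(len(points)) if j != idx),
                   key=lambda j: (points[j][0] - x) ** 2 + (points[j][1] - y) ** 2)
    return order[:k]

for _ in range(STEPS):
    new_positions = []
    for i in range(NUM_VEHICLES):
        peers = nearest(i, positions, NEIGHBORS)
        xs = [positions[i][0]] + [positions[j][0] for j in peers]
        ys = [positions[i][1]] + [positions[j][1] for j in peers]
        new_positions.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    positions = new_positions  # all vehicles update in parallel

spread_x = max(p[0] for p in positions) - min(p[0] for p in positions)
spread_y = max(p[1] for p in positions) - min(p[1] for p in positions)
print(f"Swarm spread after {STEPS} rounds: {spread_x:.2f} km x {spread_y:.2f} km")
```

The design point is resilience: there is no single command node to target, and the loss of any individual vehicle barely changes the outcome, which is exactly what makes the “mass or disaggregate” behavior so difficult to counter.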

The Era of Contested Equality (2035 through 2050):
The period marked by significant breakthroughs in technology and convergences of capabilities, which lead to fundamental changes in the character of warfare. During this period, traditional aspects of warfare undergo dramatic, almost revolutionary changes which, at the end of this timeframe, may even challenge the very nature of warfare itself. Game changers during this era include:

• Hyper Velocity Weapons:
Rail Guns (Electrodynamic Kinetic Energy Weapons): Electromagnetic projectile launchers delivering very high velocity and kinetic energy (Mach 5 or higher) without explosive propellant (see the rough energy estimate following this list).
No Propellant: Easier to store and handle.
Lower Cost Projectiles: Potentially. Extreme G-force requires sturdy payloads.
Limiting factors: Power. Significant IR signature. Materials science.
Hyper Glide Vehicles: Less susceptible to anti-ballistic missile countermeasures.

• Directed Energy Weapons: Signature not visible without sensing technology; the beam must dwell on its target. Power requirements are currently problematic.
Potential: Tunable, lethal, and non-lethal effects.
Laser: Directed energy damages the intended target. Targets: Counter aircraft, UAS, missiles, projectiles, sensors, and swarms.
Radio Frequency (RF): Attacks targets across the frequency spectrum; targets are not limited to RF systems. Microwave weapons “cook” targets, including people and electronics.

• Synthetic Biology: Engineering / modification of biological entities.
Increased Crop Yield: Potential to reduce food scarcity.
Weaponization: Potential for micro-targeting; seek-and-destroy microbes that can target specific DNA. Potentially accessible to super-empowered individuals.
Medical Advances: Enhance Soldier survivability.
Genetic Modification: Disease resistance; potentially designer babies and super athletes/soldiers. Synthetic DNA can store digital data, and that data can be used for micro-targeting.
CRISPR: Genome editing.

• Information Environment: Use IoT and sensors to harness the flow of information for situational understanding and decision-making advantage.
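As a rough sanity check on the rail gun entry above (simple physics with assumed projectile figures, not the parameters of any specific program), the arithmetic below shows why a hypervelocity slug needs no explosive filler:

```python
# Back-of-the-envelope kinetic energy estimate -- assumed figures, not a real system.
projectile_mass_kg = 10.0          # assumed small solid slug
muzzle_velocity_ms = 2000.0        # roughly Mach 6 at sea level (Mach 5 is ~1,700 m/s)
tnt_joules_per_kg = 4.2e6          # standard TNT energy equivalence

kinetic_energy_j = 0.5 * projectile_mass_kg * muzzle_velocity_ms ** 2
print(f"Muzzle energy: {kinetic_energy_j / 1e6:.0f} MJ "
      f"(~{kinetic_energy_j / tnt_joules_per_kg:.1f} kg of TNT equivalent)")
# 0.5 * 10 * 2000^2 = 20 MJ, on the order of several kilograms of TNT delivered
# purely as kinetic energy -- which is why the projectile carries no explosive.
```

With that much energy arriving as pure velocity, power generation, thermal signature, and materials science, rather than warhead design, become the limiting factors.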




In envisioning Future Operational Environment possibilities, the Mad Scientist Initiative employs a number of techniques. We have found Crowdsourcing (i.e., the gathering of ideas, thoughts, and concepts from a wide variety of interested individuals) to be a particularly effective technique, as it assists us in diversifying thought and challenging conventional assumptions. To that end, we have published our latest 2-page compendium of Potential Game Changers here — we would like to hear your feedback regarding them. Please let us know your thoughts / observations by posting them in this blog post’s Comment box (found below, in the Leave a Reply section). Alternatively, you can also submit them to us via email at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil. Thank you in advance for your contributions!

51. Black Swans and Pink Flamingos

The Mad Scientist Initiative recently facilitated a workshop with thought leaders from across the Department of Defense, the Intelligence Community, other Government agencies, industry, and academia to address the unknown unknowns (i.e., Black Swans) and the known knowns (i.e., Pink Flamingos), synthesizing cross-agency thinking about possible disruptions to the Future Operational Environment.

Black Swans: In Nassim Nicholas Taleb’s original context, a black swan (an unknown unknown) is an event or situation which is unpredictable but has a major effect. For this workshop, we used a looser definition, identifying possibilities that are not likely, but that would have significant impacts on how we think about warfighting and security.

Pink Flamingos: Defined by Frank Hoffman, Pink Flamingos are the known knowns that are often discussed but ignored by Leaders trapped by organizational cultures and rigid bureaucratic decision-making structures. Peter Schwartz further describes Pink Flamingos as the “inevitable surprise.” Digital photography was a pink flamingo to Kodak.

At the workshop, attendees identified the following Black Swans:

Naturally Occurring Disaster: These events (e.g., a Carrington Event solar flare frying solid-state electronics, super volcano eruptions, earthquake swarms) would have an enormous impact on the Army and its ability to continue to operate, defend the nation, and support national recovery operations. While warning times have increased for many of these events, there are limited measures that can be implemented to mitigate their devastating effects.


Virtual Nations: While the primacy of Westphalian borders has been challenged and the power of traditional nation-states has been waning over the last decade, some political scientists have assumed that supranational organizations and non-state actors would take their place. One potential black swan is the emergence of virtual nations due to the convergence of blockchain technologies, crypto-currency, and the ability to project power and legitimacy through the virtual world. Virtual nations could be organized based on ideologies, business models, or single interests. Virtual nations could supersede, supplement, or compete with traditional, physical nations. The Army of the future may not be prepared to interact and compete with virtual nations.


Competition in Venues Other than Warfare (Economic, Technological, Demographic, etc.) Achieving Primacy: In the near future, war in the traditional sense may be less prevalent, while competition in other areas may become the driving force behind national rivalries. How does the Army need to prepare for an eventuality where armed conflict is not as important as it once was?


Alternate Internet — “Alternet”: A distinct entity, separate from the general commercial internet, only accessible with specific corresponding hardware. This technology would allow for unregulated and unmonitored communication and commerce, potentially granting safe haven to criminal and terrorist activities.

At the workshop, attendees identified the following Pink Flamingos:

Safe at Home: Army installations are no longer the sanctuaries they once were, as adversaries will be able to attack Soldiers and families through social media and other cyberspace means. Additionally, installations no longer merely house, train, and deploy Soldiers; unmanned combat systems are now controlled from home installations, a trend in virtual power projection that will only increase in the future. The Army needs a plan to harden our installations and train Soldiers and families to be resilient for this eventuality.


Hypersonics: High speed (Mach 5 or higher) and highly maneuverable missiles or glide vehicles that can defeat our air defense systems. The speed of these weapons is unmatched and their maneuverability allows them to keep their targets unknown until only seconds before impact, negating current countermeasures.
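To give a sense of the compressed timelines involved (simple arithmetic with assumed engagement ranges, not threat-specific data):

```python
# Illustrative timeline arithmetic -- assumed distances, not threat-specific data.
speed_of_sound_ms = 340.0            # approximate, sea level
mach = 5
speed_ms = mach * speed_of_sound_ms  # ~1,700 m/s

for distance_km in (500, 100, 20):
    seconds = distance_km * 1000 / speed_ms
    print(f"{distance_km:>4} km out -> about {seconds:>5.0f} s ({seconds/60:.1f} min) to impact")
# At Mach 5 the last 20 km take well under a minute; if the terminal maneuver
# only reveals the true aim point inside that window, the defender has seconds,
# not minutes, to respond.
```

The faster the weapon and the later it commits to its terminal trajectory, the less time any current countermeasure has to act.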


Generalized, Operationalized Artificial Intelligence (AI): Artificial intelligence is one of the most prominent pink flamingos throughout global media and governments. Narrow artificial intelligence is being addressed as rapidly as possible through ventures such as Project MAVEN. However, generalized and operationalized artificial intelligence – that can think, contextualize, and operate like a human – has the potential to disrupt not only operations, but also the military at its very core and foundation.


Space/Counterspace: Space is becoming increasingly congested, commercialized, and democratized. Disruption, degradation, and denial in space threatens to cripple multi-domain warfare operations. States and non-state actors alike are exploring options to counter one another, compete, and potentially even fight in space.


Quantum Sciences: Quantum science – communication, computing, and sensing – has the potential to solve some intractable but very specific problem sets. Quantum technology remains in its infancy. However, as the number of qubits in quantum computers continues to grow, so does the possibility of traditional encryption being utterly broken. Quantum sensing can allow for much more precise atomic clocks surpassing the precision timing of GPS, as well as quantum imaging that provides better results than classical imaging across a variety of wavelengths.
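As a minimal illustration of why growing quantum computers threaten traditional encryption (a toy example with a deliberately tiny key; real RSA moduli are 2048 bits or more, and the practical attack would come from Shor’s algorithm on a large fault-tolerant quantum computer, not from brute force), the sketch below shows that recovering an RSA private key reduces entirely to factoring the public modulus:

```python
# Toy RSA with a deliberately tiny modulus -- illustrative only.
# The private key falls out immediately once the modulus is factored, which is
# the step a sufficiently large quantum computer (via Shor's algorithm) would
# make easy for real, full-sized keys.
from math import gcd

p, q = 61, 53                      # tiny "secret" primes (real keys use ~1024-bit primes)
n = p * q                          # public modulus: 3233
e = 17                             # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                # private exponent (Python 3.8+ modular inverse)

message = 1234
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)

# "Attack": factor n by trial division -- trivial here, infeasible classically at scale.
factor = next(i for i in range(2, n) if n % i == 0)
p_found, q_found = factor, n // factor
d_recovered = pow(e, -1, (p_found - 1) * (q_found - 1))
recovered = pow(ciphertext, d_recovered, n)

print(f"n = {n}, factored as {p_found} x {q_found}")
print(f"recovered plaintext: {recovered} (original was {message})")
assert gcd(e, phi) == 1 and recovered == message
```

Factoring a 2048-bit RSA modulus is far beyond any classical computer, but it is precisely the step that a sufficiently large quantum computer would make tractable.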


Bioweapons/Biohacking: The democratization of biotechnology will mean that super-empowered individuals as well as nation-states will have the ability to engineer weapons and hacks that can augment friendly human forces or target and degrade enemy human forces (e.g., targeted disease or genetic modifications).


Personalized Warfare: Warfare is now waged on a personal level, where adversaries can attack the bank accounts of Soldiers’ families, infiltrate their social media, or even target them specifically by their genetics. The Army needs to understand that the individual Soldier can be exploited in many different ways, often through information publicly provided or stolen.

Source: ommbeu / Fotolia
Deep Fakes/Information Warfare: Information warfare and “fake news” have played a prominent role in global politics over the last several years and could dominate the relationship between societies, governments, politicians, and militaries in the future operational environment. Information operations, thanks to big data and humanity’s ever-growing digital presence, are targeted at an extremely personal and specific level. One of the more concerning aspects of this is an artificial intelligence-based human image/voice synthesis technique known as deep fakes. Deep fakes can essentially put words in the mouths of prominent or trusted politicians and celebrities.


Multi-Domain Swarming: Swarming is often thought about in terms of unmanned aerial systems (UAS), but one significant pink flamingo is swarming taking place across multiple domains with self-organizing, autonomous aerial, ground, maritime (sub and surface), and even subterranean unmanned systems. U.S. defense systems on a linear modernization and development model will not be capable of dealing with the saturation and complexity issues arising from these multi-domain swarms.


Lethal Autonomy: An autonomous system with the ability to track, target, and fire without the supervision or authority of a human in or on the loop. The U.S. Army will have to examine its own policy regarding these issues, as well as that of our adversaries, who may be less deterred by ethical and policy concerns.


Tactical Nuclear Exchange: While strategic nuclear war and mutually assured destruction have been discussed and addressed ad nauseam, not enough attention has been given to the potential for a tactical nuclear exchange between state actors. A single tactical nuclear attack, while not guaranteeing a nuclear holocaust, would bring about a myriad of problems for U.S. forces worldwide (e.g., the potential for escalation, fallout, contamination of water and air, and disaster response). Additionally, a high-altitude nuclear burst’s electromagnetic pulse has the potential to fry solid-state electronics across a wide area, with devastating results for the affected nation’s electrical grid, essential government services, and food distribution networks.

Leaders must anticipate these future possibilities in determining the character of future conflicts and in force design and equipping decisions. Using a mental model of black swans and pink flamingos provides a helpful framework for assessing the risks associated with these decisions.

For additional information on projected black swans for the next 20+ years, see the RAND Corporation’s Discontinuities and Distractions — Rethinking Security for the Year 2040.