191. Competition in 2035: Anticipating Chinese Exploitation of Operational Environments

[Editor’s Note:  In today’s post, Mad Scientist Laboratory explores China’s whole-of-nation approach to exploiting operational environments, synchronizing government, military, and industry activities to change geostrategic power paradigms via competition in 2035. Excerpted from products previously developed and published by the TRADOC G-2’s Operational Environment and Threat Analysis Directorate (see links below), this post describes China’s approach to exploitation and identifies the implications for the U.S. Army — Enjoy!]

The Operational Environment is envisioned as a continuum, divided into two eras: the Era of Accelerated Human Progress (now through 2035) and the Era of Contested Equality (2035 through 2050). This latter era is marked by significant breakthroughs in technology and convergences in capabilities, which lead to significant changes in the character of warfare. During this period, traditional aspects of warfare undergo dramatic, almost revolutionary changes that, at the end of this timeframe, may even challenge the very nature of warfare itself. In this era, no one actor is likely to have any long-term strategic or technological advantage, with aggregate power between the U.S. and its strategic competitors being equivalent, but not necessarily symmetric. Prevailing in this period will depend on an ability to synchronize multi-domain capabilities against an artificial intelligence-enhanced adversary with an overarching capability to visualize and understand the battlespace at even greater ranges and velocities. Equally important will be controlling information and the narrative surrounding the conflict. Adversaries will adopt sophisticated information operations and narrative strategies to change the context of the conflict and thus defeat U.S. political will.

The future strategic environment will be characterized by a persistent state of competition where global competitors seek to exploit the conditions of operational environments to gain advantage. Adversaries understand that the application of any or all elements of national power in competition just below the threshold of armed conflict is an effective strategy against the U.S.

Chinese DF-17 carrying the DF-ZF Hypersonic Glide Vehicle / Source: Bill Bostock, Business Insider Australia, via Wikimedia Commons

China is rapidly modernizing its armed forces and developing new approaches to warfare. Beijing has invested significant resources into research and development of a wide array of advanced technologies. Coupled with its time-honored practice of reverse engineering technologies or systems it purchases or acquires through espionage, this effort likely will allow China to surpass Russia as our most capable threat sometime around 2030.

China’s Approach to Exploitation

China’s whole-of-nation approach, which involves synchronization of actions across government, military, and industry, will facilitate exploitation of operational environments and enable it to gain global influence through economic exploitation.

China will leverage the international system to advance its own interests while attempting to constrain others, including the U.S.

Preferred Conditions and Methods

The following conditions and methods are conducive to exploitation by China, enabling it to shape the strategic environment in 2035:

    • Infrastructure Capacity Challenges:  China targets undeveloped and fragile environments where its capital investments, technology, and human capital can produce financial gains and generate political influence.
    • Interconnected Economies:  China looks for partners and opportunities to become a significant stakeholder in a wide variety of economies in order to capitalize on its investments as well as generate political influence.
    • Specialized Economies:  China looks for opportunities to partner with specialized markets and leverage their vulnerabilities for gain.
    • Technology Access Gaps:  China targets areas where its capital investments in technology provide partners with key resources and competitive advantages by filling technology gaps.

Implications for the U.S. Army:

The Chinese People's Liberation Army (PLA) deployed armored medical vehicles and personnel to Germany for the Combined Aid 2019 Joint Exercise with the Bundeswehr this past summer.

    • Traditional Army threat paradigms may not be sufficient for competition.

    • The Army could be drawn into unanticipated escalation as a result of China’s activities during the competition phase.
    • Army military partnerships will likely be undermined by China in 2035.
    • Army operations and engagements will be increasingly impacted by the pervasiveness of Chinese goods, technology, infrastructure, and systems.

If you enjoyed this post, please see the original paper and associated infographic of the same title, both by the TRADOC G-2's Operational Environment and Threat Analysis Directorate and hosted on their All Partners Access Network (APAN) site.

… and read the following MadSci Laboratory blog posts:

A View of the Future: 2035-2050

China’s Drive for Innovation Dominance and Quantum Surprise on the Battlefield?, by Elsa Kania

A Closer Look at China’s Strategies for Innovation: Questioning True Intent, by Cindy Hurst

Critical Projection: Insights from China’s Science Fiction, by Lt Col Dave Calder

190. Weaponized Information: One Possible Vignette

[Editor’s Note:  The Information Environment (IE) is the point of departure for all events across the Multi-Domain Operations (MDO) spectrum. It’s a unique space that demands our understanding, as the Internet of Things (IoT) and hyper-connectivity have democratized accessibility, extended global reach, and amplified the effects of weaponized information. Our strategic competitors and adversaries have been quick to grasp and employ it to challenge our traditional advantages and exploit our weaknesses.

    • Our near-peers confront us globally, converging IE capabilities with hybrid strategies to expand the battlefield across all domains and create hemispheric threats challenging us from home station installations (i.e., the Strategic Support Area) to the Close Area fight.
    • Democratization of weaponized information empowers regional hegemons and non-state actors, enabling them to target the U.S. and our allies and achieve effects at a fraction of the cost of conventional weapons, without risking armed conflict.
    • The IE enables our adversaries to frame the conditions of future competition and/or escalation to armed conflict on their own terms.

Today's post imagines one such vignette, with Russia exploiting the IE to successfully out-compete us and accomplish its political objectives, without expending a single bullet!]

Ethnic Russian minorities' agitation against their respective governments in Estonia, Lithuania, and Latvia spikes. Simultaneously, the Russian Government ratchets up tensions, with inflammatory statements of support for these ethnic Russian minorities in the Baltic States; coordinated movements and exercises by Russian ground, naval, and air forces adjacent to the region; and clandestine support to ethnic Russians in these States. The Russian Government launches a covert campaign to shape people's views about the threats against the Russian diaspora. More than 200,000 Twitter accounts send 3.6 million tweets trending #protectRussianseverywhere. This sprawling Russian disinformation campaign is focused on building internal support for the Russian President and a possible military action. The U.S. and NATO respond…
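A brief aside on the mechanics: coordinated amplification campaigns of the kind imagined above are typically surfaced by screening for signals such as implausible posting rates and hashtag convergence across accounts. The sketch below is a deliberately simplified illustration of that idea; the account data, field names, and threshold are hypothetical, not drawn from any real dataset or tool.

```python
# Minimal sketch of one crude screen for coordinated amplification:
# flag accounts with implausibly high posting rates, then see which
# hashtags those accounts converge on. All data here is hypothetical.
from collections import Counter

accounts = [
    # (account_id, tweets_in_last_24h, most_used_hashtag)
    ("user_001", 3,   "#weather"),
    ("bot_417",  212, "#protectRussianseverywhere"),
    ("bot_988",  198, "#protectRussianseverywhere"),
    ("user_302", 11,  "#football"),
]

RATE_THRESHOLD = 144  # more than one tweet every 10 minutes, around the clock (assumed cutoff)

high_rate = [acct for acct, rate, tag in accounts if rate > RATE_THRESHOLD]
converging_tags = Counter(tag for _, rate, tag in accounts if rate > RATE_THRESHOLD)

print("high-rate accounts:", high_rate)
print("hashtag they converge on:", converging_tags.most_common(1))
```

Real platforms and intelligence analysts use far richer features (account age, network structure, content similarity), but the underlying logic of looking for statistically implausible coordination is the same.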

The 2nd Cav Regt is placed on alert; as it prepares to roll out of garrison for Poland, several videos surface across social media, purportedly showing the sexual assault of several underage German nationals by U.S. personnel. These disturbingly graphic deepfakes appear to implicate key Leaders within the Regiment. German political and legal authorities call for an investigation and host nation protests erupt outside the gates of Rose Barracks, Vilseck, disrupting the unit’s deployment.

Simultaneously, in units comprising the initial Force Package earmarked to deploy to Europe, key personnel (and their dependents) are targeted, distracting troops from their deployment preparations and disrupting unit cohesion:

    • Social media accounts are hacked/hijacked, with false threats by dependents to execute mass/school shootings, accusations of sexual abuse, hate speech posts by Leaders about their minority troops, and revelations of adulterous affairs between unit spouses.
    • Bank accounts are hacked: some are credited with excessive amounts of cash followed by faux “See Something, Say Something” hotline accusations being made about criminal and espionage activities; while others are zeroed out, disrupting families’ abilities to pay bills.

Russia’s GRU (Military Intelligence) employs AI Generative Adversarial Networks (GANs) to create fake persona injects that mimic select U.S. Active Army, ARNG, and USAR commanders making disparaging statements about their confidence in our allies’ forces, the legitimacy of the mission, and their faith in our political leadership. Sowing these injects across unit social media accounts, Russian Information Warfare specialists seed doubt and erode trust in the chain of command amongst a percentage of susceptible Soldiers, creating further friction in deployment preparations.

As these units load at railheads or begin their road march towards their respective ports of embarkation, Supervisory Control and Data Acquisition (SCADA) attacks are launched on critical rail, road, port, and airfield infrastructures, snarling rail lines, switching yards, and crossings; creating bottlenecks at key traffic intersections; and spoofing navigation systems to cause sealift asset collisions and groundings at key maritime chokepoints. The fly-by-wire avionics are hacked on a departing C-17, causing a crash with the loss of all 134 Soldiers onboard. All C-17s are grounded, pending an investigation.

Salvos of personalized, "direct inject" psychological warfare attacks are launched against Soldiers via immersive media (Augmented, Virtual, and Mixed Reality; 360° Video/Gaming), targeting them while they await deployment and are in-transit to Theater. Similarly, attacks are vectored at spouses, parents, and dependents, with horrifying imagery of their loved ones' torn and maimed bodies on Artificial Intelligence-generated battlefields (based on scraped facial imagery from social media accounts).

Multi-Domain Operations has improved Jointness but exacerbated problems with "the communications requirements that constitute the nation's warfighting Achilles heel." As units arrive in Theater, adversaries exploit the seams within and between the U.S. and NATO Intelligence, Surveillance, and Reconnaissance; Fires; Sustainment; and Command and Control inter-connected and federated tactical networks that facilitate partner-to-partner data exchanges, planting specifically targeted false injects that sow doubt and distrust across the alliance in the Multi-Domain Common Operating Picture. Spoofing of these systems leads to accidental air defense engagements, resulting in Blue-on-Blue fratricide or the downing of a commercial airliner, with additional civilian deaths on the ground from spent ordnance, providing more opportunities for Russian Information Operations to spread acrimony within the alliance and create dissent in public opinion back home.

With the flow of U.S. forces into the Baltic Nations, real instances of ethnic Russians’ livelihoods being disrupted (e.g., accidental destruction of livestock and crops, the choking off of main routes to market, and damage to essential services [water, electricity, sewerage]) by maneuver units on exercise are captured on video and enhanced digitally to exacerbate their cumulative effects. Proliferated across the net via bots, these instances further stoke anti-Baltic / anti-U.S. opinion amongst Russian-sympathetic and non-aligned populations alike.

Following years of scraping global social media accounts and building profiles across the full political spectrum, artificial influencers are unleashed on-line that effectively target each of these profiles within the U.S. and allied civilian populations. Ostensibly engaging populations via key “knee-jerk” on-line affinities (e.g., pro-gun, pro-choice, etc.), these artificial influencers, ever so subtly, begin to shift public opinion to embrace a sympathetic position on the rights of the Russian diaspora to greater autonomy in the Baltic States.

The release of deepfake videos showing Baltic security forces massacring ethnic Russians creates further division and causes some NATO partners to hesitate, question, and withhold their support, as required under Article 5. The alliance is rent asunder — Checkmate!

Many of the capabilities described in this vignette are available now. Threats in the IE space will only increase in verisimilitude with augmented reality and multisensory content interaction. Envisioning what this Bot 2.0 Competition will look like is essential in building whole-of-government countermeasures and instilling resiliency in our population and military formations.

The Mad Scientist Initiative will continue to explore the significance of the IE to Competition and Conflict and information weaponization throughout our FY20 events — stay tuned to the MadSci Laboratory for more information. In anticipation of this, we have published The Information Environment:  Competition and Conflict anthology, a collection of previously published blog posts that serves as a primer on this topic and examines the convergence of technologies that facilitates information weaponization — Enjoy!

183. Ethics, Morals, and Legal Implications

[Editor’s Note: The U.S. Army Futures Command (AFC) and Training and Doctrine Command (TRADOC) co-sponsored the Mad Scientist Disruption and the Operational Environment Conference with the Cockrell School of Engineering at The University of Texas at Austin on 24-25 April 2019 in Austin, Texas. Today’s post is excerpted from this conference’s Final Report and addresses how the speed of technological innovation and convergence continues to outpace human governance. The U.S. Army must not only consider how best to employ these advances in modernizing the force, but also the concomitant ethical, moral, and legal implications their use may present in the Operational Environment (see links to the newly published TRADOC Pamphlet 525-92, The Operational Environment and the Changing Character of Warfare, and the complete Mad Scientist Disruption and the Operational Environment Conference Final Report at the bottom of this post).]

Technological advancement and subsequent employment often outpace moral, ethical, and legal standards. Governmental and regulatory bodies are then caught between technological progress and the evolution of social thinking. The Disruption and the Operational Environment Conference uncovered and explored several tension points that may challenge the Army in the future.

Space

Cubesats in LEO / Source: NASA

Space is one of the least explored domains in which the Army will operate; as such, we may encounter a host of associated ethical and legal dilemmas. In the course of warfare, if the Army or an adversary intentionally or inadvertently destroys commercial communications infrastructure – GPS satellites, for example – the ramifications for the economy, transportation, and emergency services would be dire and deadly. The Army will be challenged to consider how and where National Defense measures in space affect non-combatants and American civilians on the ground.

Per proclaimed Mad Scientists Dr. Moriba Jah and Dr. Diane Howard, there are ~500,000 objects orbiting the Earth posing potential hazards to our space-based services. We are currently able to track less than one percent of them — those that are the size of a smartphone / softball or larger. / Source: NASA Orbital Debris Office

International governing bodies may have to consider what responsibility space-faring entities – countries, universities, private companies – will have for mitigating orbital congestion caused by excessive launching and the aggressive exploitation of space. If the Army is judicious with its own footprint in space, it could reduce the risk of accidental collisions and unnecessary clutter and congestion. Cleaning up space debris is extremely expensive, and deconflicting active operations is essential. With each entity acting in its own self-interest, with limited binding law or governance and no enforcement, overuse of space could lead to a "tragedy of the commons" effect.1  The Army has the opportunity to more closely align itself with international partners to develop guidelines and protocols for space operations to avoid potential conflicts and to influence and shape future policy. Without this early intervention, the Army may face ethical and moral challenges in the future regarding its addition of orbital objects to an already dangerously cluttered Low Earth Orbit. What will the Army be responsible for in democratized space? Will there be a moral or ethical limit on space launches?
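To make the deconfliction problem concrete, the sketch below shows the simplest form of the screening arithmetic involved: estimating the time and distance of closest approach between two objects. It assumes straight-line relative motion over a short window and purely illustrative positions and velocities; real conjunction assessment relies on full orbital propagation and uncertainty data, not this toy calculation.

```python
# Minimal sketch of a closest-approach screen between two space objects,
# assuming (unrealistically) straight-line motion over a short window.
# Positions in km, velocities in km/s; all numbers are illustrative.
import numpy as np

r1, v1 = np.array([7000.0, 0.0, 0.0]), np.array([0.0, 7.5, 0.0])
r2, v2 = np.array([7005.0, 10.0, 0.0]), np.array([0.1, 7.4, 0.0])

dr, dv = r2 - r1, v2 - v1
t_closest = max(0.0, -np.dot(dr, dv) / np.dot(dv, dv))   # seconds from now
miss_distance = np.linalg.norm(dr + dv * t_closest)      # km at closest approach

print(f"closest approach in {t_closest:.1f} s at {miss_distance:.3f} km")
```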

Autonomy in Robotics

AFC’s Future Force Modernization Enterprise of Cross-Functional Teams, Acquisition Programs of Record, and Research and Development centers executed a radio rodeo with Industry throughout June 2019 to inform the Army of the network requirements needed to enable autonomous vehicle support in contested, multi-domain environments. / Source: Army.mil

Robotics has been pervasive and normalized in military operations in the post-9/11 Operational Environment. However, the burgeoning field of autonomy in robotics, with the potential to supplant humans in time-critical decision-making, will bring about significant ethical, moral, and legal challenges that the Army and the larger DoD are already beginning to confront. This issue will be exacerbated in the Operational Environment by increased utilization of and reliance on autonomy.

The increasing prevalence of autonomy will raise a number of important questions. At what point is it more ethical to allow a machine to make a decision that may save lives of either combatants or civilians? Where does fault, responsibility, or attribution lie when an autonomous system takes lives? Will defensive autonomous operations – air defense systems, active protection systems – be more ethically acceptable than offensive – airstrikes, fire missions – autonomy? Can Artificial Intelligence/Machine Learning (AI/ML) make decisions in line with Army core values?

Deepfakes and AI-Generated Identities, Personas, and Content


A new era of Information Operations (IO) is emerging due to disruptive technologies such as deepfakes – videos that are constructed to make a person appear to say or do something that they never said or did – and AI Generative Adversarial Networks (GANs) that produce fully original faces, bodies, personas, and robust identities.2  Deepfakes and GANs are alarming to national security experts as they could trigger accidental escalation, undermine trust in authorities, and cause unforeseen havoc. This is amplified by content such as news, sports, and creative writing similarly being generated by AI/ML applications.
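For readers unfamiliar with how a GAN works, the sketch below shows the adversarial training loop at its core, reduced to a toy one-dimensional example rather than faces or video. It assumes the open-source PyTorch library; the network sizes and data are illustrative only, but the same generator-versus-discriminator structure is what scales up to synthetic imagery and personas.

```python
# Minimal sketch of GAN training, assuming PyTorch. A toy generator learns to
# mimic a simple "real" data distribution (a 1-D Gaussian); the discriminator
# learns to tell real samples from generated ones, and each improves the other.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to a fake sample.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: scores how "real" a sample looks (1 = real, 0 = fake).
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0   # "real" data drawn from N(3, 0.5)
    fake = G(torch.randn(64, 8))

    # Train the discriminator to separate real from fake.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # Train the generator to fool the discriminator.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

print("mean of generated samples:", G(torch.randn(1000, 8)).mean().item())  # approaches 3.0
```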

This new era of IO has many ethical and moral implications for the Army. In the past, the Army has utilized industrial and early information age IO tools such as leaflets, open-air messaging, and cyber influence mechanisms to shape perceptions around the world. Today and moving forward in the Operational Environment, advances in technology create ethical questions such as: is it ethical or legal to use cyber or digital manipulations against populations of both U.S. allies and strategic competitors? Under what title or authority does the use of deepfakes and AI-generated images fall? How will the Army need to supplement existing policy to include technologies that didn’t exist when it was written?

AI in Formations

With the introduction of decision-making AI, the Army will be faced with questions about trust, man-machine relationships, and transparency. Does AI in cyber require the same moral benchmark as lethal decision-making? Does transparency equal ethical AI? What allowance for error in AI is acceptable compared to humans? Where does the Army allow AI to make decisions – only in non-combat or non-lethal situations?

Commanders, stakeholders, and decision-makers will need to gain a level of comfort and trust with AI entities exemplifying a true man-machine relationship. The full integration of AI into training and combat exercises provides an opportunity to build trust early in the process before decision-making becomes critical and life-threatening. AI often includes unintentional or implicit bias in its programming. Is bias-free AI possible? How can bias be checked within the programming? How can bias be managed once it is discovered and how much will be allowed? Finally, does the bias-checking software contain bias? Bias can also be used in a positive way. Through ML – using data from previous exercises, missions, doctrine, and the law of war – the Army could inculcate core values, ethos, and historically successful decision-making into AI.

If existential threats to the United States increase, so does pressure to use artificial and autonomous systems to gain or maintain overmatch and domain superiority. As the Army explores shifting additional authority to AI and autonomous systems, how will it address the second and third order ethical and legal ramifications? How does the Army rectify its traditional values and ethical norms with disruptive technology that rapidly evolves?

If you enjoyed this post, please see:

    • “Second/Third Order, and Evil Effects” – The Dark Side of Technology (Parts I & II) by Dr. Nick Marsella.
    • Ethics and the Future of War panel, facilitated by LTG Dubik (USA-Ret.) at the Mad Scientist Visualizing Multi Domain Battle 2030-2050 Conference, held at Georgetown University on 25-26 July 2017.

Just Published! TRADOC Pamphlet 525-92, The Operational Environment and the Changing Character of Warfare, 7 October 2019, describes the conditions Army forces will face and establishes two distinct timeframes characterizing near-term advantages adversaries may have, as well as breakthroughs in technology and convergences in capabilities in the far term that will change the character of warfare. This pamphlet describes both timeframes in detail, accounting for all aspects across the Diplomatic, Information, Military, and Economic (DIME) spheres to allow Army forces to train to an accurate and realistic Operational Environment.


1 Munoz-Patchen, Chelsea, “Regulating the Space Commons: Treating Space Debris as Abandoned Property in Violation of the Outer Space Treaty,” Chicago Journal of International Law, Vol. 19, No. 1, Art. 7, 1 Aug. 2018. https://chicagounbound.uchicago.edu/cgi/viewcontent.cgi?article=1741&context=cjil

2 Robitzski, Dan, “Amazing AI Generates Entire Bodies of People Who Don’t Exist,” Futurism.com, 30 Apr. 2019. https://futurism.com/ai-generates-entire-bodies-people-dont-exist

123. Decision in the 21st Century

[Editor’s Note: Mad Scientist Laboratory welcomes returning guest blogger Matthew Ader, whose submission builds upon his previous post regarding the demise of strategic and operational deception and surprise.  Given the ascendancy of finders, Mr. Ader argues for the use of profoundly decisive impacts, achieved through information operations and minimal kinetic force, to “generate maximum hysteria” and bend the will of our adversaries’ populations in order to achieve our objectives.]

The future battlespace will be dominated by the finders, not the hiders. Finder capabilities are effective and are only growing more so, leveraging cross-domain surveillance through cheap satellites, unmanned systems, and open source intelligence. This is augmented by the ongoing proliferation of precision long-range fires. In this environment, large unit manoeuvres to achieve decision favoured by the Joint Force will not be possible. Instead, kinetic action should be used to catalyse fear and dissatisfaction among the enemy civilian population, leading to pressure for a negotiated end to conflict.

Why is decisive kinetic manoeuvre no longer possible?

Operation Desert Storm required enormous logistics support / Source: Wikimedia Commons

Logistics. Specifically, the practicalities of supplying a force in a finder dominated environment. During Operation Desert Storm, the fuel consumption rate per day for the U.S. VII and XVIII ABN Corps was about 4.5 million gallons. Ammunition requirements were about 14,000 tons a day.1 Logistics support at this scale can neither be foraged nor arranged ad-hoc. Modern warfare depends on a robust supply network to deliver the requisite food, fuel, ammunition, and spare parts, when and where they are needed, to sustain the fight. In the First Gulf War, that was achieved by a handful of well provisioned logistics bases close to the line of advance. In the Second Gulf War, logistics ran on a just in time model, with supply dependent on “frequent, reliable distribution rather than on large forward stockpiles.”
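Some back-of-the-envelope arithmetic makes the scale of that requirement concrete. The vehicle capacities below (5,000-gallon fuel tankers, 10-ton cargo trucks) are assumptions chosen for illustration, not figures from the cited source.

```python
# Rough sketch of what the Desert Storm consumption figures above imply for
# daily lift, under hypothetical vehicle capacities.
fuel_gallons_per_day = 4_500_000
ammo_tons_per_day = 14_000

tanker_capacity_gal = 5_000   # assumed tanker capacity
truck_capacity_tons = 10      # assumed cargo truck capacity

tanker_loads = fuel_gallons_per_day / tanker_capacity_gal
truck_loads = ammo_tons_per_day / truck_capacity_tons

print(f"~{tanker_loads:.0f} tanker loads and ~{truck_loads:.0f} truckloads per day")
# Roughly 900 tanker loads and 1,400 truckloads every day, before food, water,
# and spare parts: a network that cannot be foraged or improvised.
```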

Depot explosion at a military base in Kalynivka, west of Kiev, Ukraine / Source: Gleb Garanich, Reuters

Both of these models are no longer viable in the future operating environment. Large logistics bases will be highly vulnerable to cruise, ballistic, and conventional artillery fire. Drone attacks will also pose a significant challenge, aptly demonstrated in Kalynivka, Ukraine in 2017, where a single Russian quadcopter ignited a Ukrainian depot, destroying over 83,000 tons of ammunition.  Challenges to air supremacy complicate the just in time delivery model. In a situation where units have only a few days of organic fuel and ammunition, a handful of missed convoys due to enemy air interdiction would prove disastrous. The unmanned threat is also present here. Autonomous ‘mobile mines’ could be deployed by air or artillery (à la Family of Scatterable Mines or FASCAM) onto lines of communication to complicate supply efforts.

This is not to say that logistics will be impossible. Promising innovations, particularly in using autonomous vehicles, could help with sustainment operations. Nevertheless, from a volume standpoint, the division-sized forces envisioned to achieve decision in a contested environment may not be viable.

What do we do instead?

War is about compelling our opponent to fulfil our will. Up to this point, the most efficient way to do this in a conventional war has been, bluntly, to kill people and blow things up until the enemy government surrenders. Due to the limitations on logistics imposed by the finder’s world, this is no longer possible. We need to find a new way to compel our opponent to fulfil our will.

On June 9, 2014, 150 ISIS militants routed the 75,000 Iraqi Army forces in Mosul / Source: Anadolu photo

Luckily, modern information technology provides the Army with a new way.  51% of people with social media access (about 2.5 billion) use it as a source for news. Both of these numbers are likely to grow as connectivity increases in the developing world. However, news on social media is often accompanied and preceded by a bow wave of hysteria, rumours, and conspiracy. This can have direct real-world impact – #AllEyesOnISIS caused much of the Iraqi force defending Mosul to flee before they saw the enemy. That was a profoundly decisive impact, achieved through minimal kinetic force.

The U.S. Army currently considers information operations to be an important adjunct to kinetic action. However, in a finder dominated environment, this should be flipped on its head. Small kinetic offensives (the smaller, the easier for likely highly degraded logistics networks to support) designed to generate maximum hysteria among the enemy population should be the watchword. The result will be viral fear and significant internal pressure to accede to U.S. demands.

In the digital, connected age, all the world is a stage. The Army must learn to weaponize theatrics.

If you enjoyed this post, please also read the following:

– Mr. Ader‘s previous post War Laid Bare.

– Our review of Mad Scientist P.W. Singer and co-author Emerson T. Brooking’s book LikeWar — The Weaponization of Social Media.

– COL Stefan J. Banach‘s complementary posts on Virtual War – A Revolution in Human Affairs (Parts I and II).

Mr. Matthew Ader is a first-year undergraduate taking War Studies at King’s College London.


1 Pagonis, LTG William G., with Cruikshank, Jeffrey L., Moving Mountains: Lessons in Leadership and Logistics from the Gulf War, Harvard Business Review Press, 1 August 1992.

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. "5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies," by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. "Should Evil AI Research Be Published? Five Experts Weigh In," by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the "AI Race and Societal Impacts" panel during last month's Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel's responses ranged from the controlling — "Don't publish it!" and treat it like a grenade, "one would not hand it to a small child, but maybe a trained soldier could be trusted with it"; to the altruistic — "publish [it]… immediately" and "there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good"; to the entrepreneurial — "sell the evil AGI to [me]. That way, they wouldn't have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there."

While no consensus of opinion was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras' game changing technologies. Unlike Nuclear, Biological, and Chemical weapons, no internationally agreed-upon and implemented control protocols can be applied to AI, as there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code; a digital will-o'-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operation (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC, with its Military-Civil Fusion Centers, and integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will "accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield" — the best defense against a nefarious AGI is a friendly AGI!


3. "Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018," The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. This conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and to help practitioners develop fair and privacy-preserving data analyses. Like ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commander's decisions) – especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, code the very programming that will enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases.  Read the full report here.
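As a concrete illustration of what a basic fairness check looks like in practice, the sketch below computes positive-decision rates by subgroup and applies a simple "four-fifths rule" style screen. It is not drawn from the ICML papers; the data, groups, and threshold are hypothetical.

```python
# Minimal sketch of a demographic-parity check: compare a model's rate of
# favorable decisions across subgroups and flag large disparities.
from collections import defaultdict

decisions = [  # (subgroup, model_decision) pairs; 1 = favorable outcome (hypothetical data)
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 0), ("B", 1), ("B", 0), ("B", 0),
]

totals, positives = defaultdict(int), defaultdict(int)
for group, decision in decisions:
    totals[group] += 1
    positives[group] += decision

rates = {g: positives[g] / totals[g] for g in totals}
print("positive-decision rate by group:", rates)

# "Four-fifths rule" style screen: flag any group whose rate is < 80% of the best.
best = max(rates.values())
flagged = [g for g, r in rates.items() if r < 0.8 * best]
print("groups flagged for possible disparate impact:", flagged)
```

Screens like this only surface a disparity; deciding whether it reflects unacceptable bias, and what to do about it, remains a human judgment of exactly the kind discussed above.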


4. "Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans," by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot's personality affected a human's decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down. The participants' actions were then recorded. Unexpectedly, a large number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant. It shows, beyond the unexpected result, that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if they mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and augmented heavily by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.


5. "Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars," by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought home computing and communication even deeper into global households, the revolution of smart phones brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans being connected to the global commons via their smart devices, but a multitude of devices, vehicles, and various accessories are being integrated into the Internet of Things (IoT). Previously, the IoT was addressed as a game changing technology. The IoT is composed of trillions of internet-linked items, creating opportunities and vulnerabilities. There has been explosive growth in low Size Weight and Power (SWaP) and connected devices (Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (e.g., appliances, home devices [such as Google Home, Alexa, etc.], various sensors) to approximately $520 billion. This is a massive investment into what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it is likely that it will expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data, allowing service members to use personal devices, and maintaining operations and network security and integrity (see the recent banning of personal fitness trackers). A segment of the IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. "Battlefield Internet: A Plan for Securing Cyberspace," by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” becoming increasingly imminent, intelligence officials warn of the rising danger of cyber attacks. Effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments as they act in the undefined gray zone between peace and all-out war. The military implications are quite clear: cyber attacks can cripple the military’s ability to function from a command and control aspect to intelligence communications and materiel and personnel networks. Besides the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others are reluctant to invest in cyber protection measures. In this way, civilians become affected by acts of cyber warfare, and attacks on a country may not be directed at the opposing military, but the civilian population of a state, as in the case of power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to most effectively combat cyber threats.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

71. Shaping Perceptions with Information Operations: Lessons for the Future

[Editor’s Note: Mad Scientist Laboratory is pleased to present today’s guest post by Ms. Taylor Galanides, TRADOC G-2 Summer Intern, exploring how the increasing momentum of human interaction, events, and actions, driven by the convergence of innovative technologies, is enabling adversaries to exploit susceptibilities and vulnerabilities to manipulate populations and undermine national interests.  Ms. Galanides examines contemporary Information Operations as a harbinger of virtual warfare in the future Operational Environment.]

More information is available than ever before. Recent and extensive developments in technology, media, communication, and culture – such as the advent of social media, 24-hour news coverage, and smart devices – allow people to closely monitor domestic and foreign affairs. In the coming decades, the increased speed of engagements, as well as the precise and pervasive targeting of both civilian and military populations, means that these populations and their respective nations will be even more vulnerable to influence and manipulation attempts, misinformation, and cyber-attacks from foreign adversaries.

The value of influencing and shaping the perceptions of foreign and domestic populations in order to pursue national and military interests has long been recognized. This can be achieved through the employment of information operations, which seek to affect the decision-making process of adversaries. The U.S. Army views information operations as an instrumental part of the broader effort to maintain an operational advantage over adversaries. Information operations is specifically defined by the U.S. Army as “The integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own.”

The U.S. Army Training and Doctrine Command (TRADOC) G-2’s The Operational Environment and the Changing Character of Future Warfare further emphasizes this increased attention to the information and cognitive domains in the future – in the Era of Contested Equality (2035 through 2050). As a result, it has been predicted that no single nation will hold hegemony over its adversaries, and major powers and non-state actors alike “… will engage in a fight for information on a global scale.” Winning preemptively in the competitive dimension before escalation into armed conflict through the use of information and psychological warfare will become key.


Part of the driving force that is changing the character of warfare includes the rise of innovative technologies such as computer bots, artificial intelligence, and smart devices. Such emerging and advancing technologies have facilitated the convergence of new susceptibilities to individual and international security; as such, it will become increasingly more important to employ defensive and counter information operations to avoid forming misperceptions or being deceived.

Harbinger of the Future:  Information Operations in Crimea

Russia's annexation of Crimea and subsequent invasion of eastern Ukraine in 2014 effectively serve as cautionary examples of Russia's evolving information operations and perception-shaping capabilities. In Crimea, Russia sought to create a "hallucinating fog of war" in an attempt to alter the analytical judgments and perceptions of its adversaries. With the additional help of computer hackers, bots, trolls, and television broadcasts, the Russian government was able to create a manipulated version of reality that claimed Russian intervention in Crimea was not only necessary, but humanitarian, in order to protect Russian speakers. Additionally, Russian cyberespionage efforts included the jamming or shutting down of telecommunication infrastructures, important Ukrainian websites, and cell phones of key officials prior to the invasion. Through the use of large demonstrations called "snap exercises," the Russians were able to mask military buildups along the border, as well as their political and military intentions. Russia further disguised its intentions and objectives by claiming adherence to international law, while also claiming victimization from the West's attempts to destabilize, subvert, and undermine the Russian nation.

By denying any involvement in Crimea until after the annexation was complete, distorting the facts surrounding the situation, and refraining from any declaration of war, Russia effectively infiltrated the international information domain and shaped the decision-making process of NATO countries to keep them out of the conflict.  NATO nations ultimately chose minimal intervention despite specific evidence of Russia's deliberate intervention in order to keep the conflict de-escalated. Despite the West's refusal to acknowledge the annexation of Crimea, it could be argued that Russia achieved its objective of expanding its sphere of influence.

Vulnerabilities and Considerations

Russia is the U.S.’ current pacing threat, and China is projected to overtake Russia as the Nation’s primary threat as early as 2035. It is important to continue to evaluate the way that the U.S. and its Army respond to adversaries’ increasingly technological attempts to influence, in order to maintain the information and geopolitical superiority of the Nation. For example, the U.S. possesses different moral and ethical standards that restrict the use of information operations. However, because adversarial nations like Russia and China pervasively employ influence and deceptive measures in peacetime, the U.S. and its Army could benefit from developing alternative methods for maintaining an operational advantage against its adversaries.


Adversarial nations can also take advantage of “the [Western] media’s willingness to seek hard evidence and listen to both sides of an argument before coming to a conclusion” by “inserting fabricated or prejudicial information into Western analysis and blocking access to evidence.” The West’s free press will continue to be the primary counter to constructed narratives. Additionally, extensive training of U.S. military and Government personnel, in conjunction with educating its civilian population about Russia and China’s deceitful narratives may decrease the likelihood of perceptions being manipulated:  “If the nation can teach the media to scrutinize the obvious, understand the military, and appreciate the nuances of deception, it may become less vulnerable to deception.” Other ways to exploit Russian and Chinese vulnerabilities could include taking advantage of poor operations security, as well as the use and analysis of geotags to refute and discredit Russian and Chinese propaganda narratives.
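As a simple illustration of the geotag analysis mentioned above, the sketch below reads GPS coordinates from an image's EXIF metadata so that the claimed location of a propaganda photo can be compared against where it was actually taken. It assumes the open-source Pillow library and an image file that still carries EXIF data (many platforms strip it on upload); the file name is hypothetical.

```python
# Hedged sketch of geotag extraction from image metadata, assuming Pillow.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def gps_info(path):
    """Return the GPS portion of an image's EXIF metadata with readable keys."""
    exif = Image.open(path)._getexif() or {}
    labeled = {TAGS.get(k, k): v for k, v in exif.items()}
    gps_raw = labeled.get("GPSInfo", {})
    return {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

def to_degrees(dms, ref):
    # EXIF stores degrees/minutes/seconds as rationals; convert to a signed float.
    deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    return -deg if ref in ("S", "W") else deg

info = gps_info("claimed_atrocity_photo.jpg")  # hypothetical file name
needed = {"GPSLatitude", "GPSLatitudeRef", "GPSLongitude", "GPSLongitudeRef"}
if needed <= info.keys():
    lat = to_degrees(info["GPSLatitude"], info["GPSLatitudeRef"])
    lon = to_degrees(info["GPSLongitude"], info["GPSLongitudeRef"])
    print(f"image geotag: {lat:.5f}, {lon:.5f}")
else:
    print("no usable GPS metadata present")
```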

A final consideration involves the formation of an interagency committee, similar to the Active Measures Working Group from the 1980s, for the identification and countering of adversarial disinformation and propaganda. The coordination of the disinformation efforts by manipulative countries like Russia is pervasive and exhaustive. Thus, coordination of information operations and counter-propaganda efforts is likewise important between the U.S. Government, the Army, and the rest of the branches of the military. The passing of the Countering Foreign Propaganda and Disinformation Act, part of the 2017 National Defense Authorization Act, was an important first step in the continuing fight to counter foreign information and influence operations that seek to manipulate the U.S. and its decision-makers and undermine its national interests.

For more information on how adversaries will seek to shape perception in the Future Operational Environment, read the following related blog posts:

Influence at Machine Speed: The Coming of AI-Powered Propaganda

Virtual War – A Revolution in Human Affairs (Part I)

Personalized Warfare

Taylor Galanides is a Junior at The College of William and Mary in Virginia, studying Psychology. She is currently interning at Headquarters, U.S. Army Training and Doctrine Command (TRADOC) with the G-2 Futures team.

64. Top Ten Takeaways from the Installations of the Future Conference

On 19-20 June 2018, the U.S. Army Training and Doctrine Command (TRADOC) Mad Scientist Initiative co-hosted the Installations of the Future Conference with the Office of the Assistant Secretary of the Army for Installations, Energy and Environment (OASA (IE&E)) and Georgia Tech Research Institute (GTRI).  Emerging technologies supporting the hyper-connectivity revolution will enable improved training capabilities, security, readiness support (e.g., holistic medical facilities and brain gyms), and quality of life programs at Army installations. Our concepts and emerging doctrine for multi-domain operations recognize this as increasingly important by including Army installations in the Strategic Support Area. Installations of the Future will serve as mission command platforms to project virtual power and expertise, as well as Army formations, directly to the battlefield.

We have identified the following “Top 10” takeaways related to our future installations:


1. Threats and Tensions. "Army Installations are no longer sanctuaries" — Mr. Richard G. Kidd IV, Deputy Assistant Secretary of the Army, Strategic Integration. There is a tension between openness and security that will need balancing to take advantage of smart technologies at our Army installations. The revolution in connected devices and the ability to virtually project power and expertise will increase the potential for adversaries to target our installations. Hyper-connectivity increases the attack surface for cyber-attacks and the access to publicly available information on our Soldiers and their families, making personalized warfare and the use of psychological attacks and deepfakes likely.

2. Exclusion vs. Inclusion. The role of and access to future Army installations depends on the balance between these two extremes. The connections between local communities and Army installations will increase potential threat vectors, but resilience might depend on expanding inclusion. Additionally, access to specialized expertise in robotics, autonomy, and information technologies will require increased connections with outside-the-gate academic institutions and industry.


3. Infrastructure Sensorization.  Increased sensorization of infrastructure runs the risk of driving efficiencies to the point of building in unforeseen risks. In the business world, these efficiencies are profit-driven, with clearer risks and rewards. Use of table top exercises can explore hidden risks and help Garrison Commanders to build resilient infrastructure and communities. Automation can cause cascading failures as people begin to fall “out of the loop.”

4. Army Modernization Challenge.  Installations of the Future is a microcosm of overarching Army Modernization challenges. We are simultaneously invested in legacy infrastructure that we need to upgrade, and making decisions to build new smart facilities. Striking an effective and efficient balance will start with public-private partnerships to capture the expertise that exists in our universities and in industry. The expertise needed to succeed in this modernization effort does not exist in the Army. There are significant opportunities for Army Installations to participate in ongoing consortiums like the “Middle Georgia” Smart City Community and the Global Cities Challenge to pilot innovations in spaces such as energy resilience.

5. Technology is outpacing regulations and policy. The sensorization and available edge analytics in our public space offers improved security but might be perceived as decreasing personal privacy. While we give up some personal privacy when we live and work on Army installations, this collection of data will require active engagement with our communities. We studied an ongoing Unmanned Aerial System (UAS) support concept to detect gunshot incidents in Louisville, KY, to determine the need to involve legislatures, local political leaders, communities, and multiple layers of law enforcement.

6. Synthetic Training Environment. The Installation of the Future offers the Army significant opportunities to divest itself of large brick and mortar training facilities and stove-piped, contractor support-intensive Training Aids, Devices, Simulations, and Simulators (TADSS).  MG Maria Gervais, Deputy Commanding General, Combined Arms Center – Training (DCG, CAC-T), presented the Army's Synthetic Training Environment (STE), incorporating Virtual Reality (VR) "big box" open-architecture simulations using a One World Terrain database, and reduced infrastructure and contractor-support footprints to improve Learning and Training.  The STE, delivering high-fidelity simulations and the opportunity for our Soldiers and Leaders to exercise all Warfighting Functions across the full Operational Environment with greater repetitions at home station, will complement the Live Training Environment and enhance overall Army readiness.


7. Security Technologies. Many of the security-oriented technologies (autonomous drones, camera integration, facial recognition, edge analytics, and Artificial Intelligence) that triage and fuse information will also improve our deployed Intelligence, Surveillance, and Reconnaissance (ISR) capabilities. The Chinese lead the world in these technologies today.


8. Virtual Prototyping. The U.S. Army Engineer Research and Development Center (ERDC) is developing a computational testbed using virtual prototyping to determine the best investments for future Army installations. The four drivers in planning for Future Installations are:  1) Initial Maneuver Platform (Force Projection); 2) Resilient Installations working with their community partners; 3) Warfighter Readiness; and 4) Cost effectiveness in terms of efficiency and sustainability.

9. Standard Approach to Smart Installations. A common suite of tools is needed to integrate smart technologies onto installations. While Garrison Commanders need mission command to take advantage of the specific cultures of their installations and surrounding communities, the Army cannot afford to have installations going in different directions on modernization efforts. A method is needed to rapidly pilot prototypes and then determine whether and how to scale the technologies across Army installations.

10. “Low Hanging Fruit.” There are opportunities for Army Installations to lead their communities in tech integration. Partnerships in energy savings, waste management, and early 5G infrastructure provide the Army with early adopter opportunities for collaboration with local communities, states, and across the nation. We must educate contracting officers and Government consumers to look for and seize upon these opportunities.

Videos from each of the Installations of the Future Conference presentations are posted here. The associated slides will be posted here within the week on the Mad Scientist All Partners Access Network site.

If you enjoyed this post, check out the following:

• Watch Mr. Richard Kidd IV discuss Installations of the Future on Government Matters.

• Read Mad Scientist Ed Blayney’s takeaways from the Installations of the Future Conference in his article, entitled We need more Mad Scientists in our Smart Cities.

• See the TRADOC G-2 Operational Environment Enterprise’s:

–  The Changing Character of Future Warfare video.

–  Evolving Threats to Army Installations video.

• Review our Call for Ideas winning submissions Trusting Smart Cities: Risk Factors and Implications by Dr. Margaret Loper, and Day in the Life of a Garrison Commander by the team at AT&T Global Public Sector — both are graciously hosted by our colleagues at Small Wars Journal.

• Re-visit our following blog posts: Smart Cities and Installations of the Future: Challenges and Opportunities and Base in a Box.

61. Base in a Box

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following guest blog post by Mr. Lewis Jones. Originally submitted as a “Letter Home” for the Call for Ideas associated with the Mad Scientist Installations of the Future Conference (see more information about this event at the end of this post), it offers Mr. Jones’ vision of a mid-Twenty-First Century forward deployed base. We hope that you will enjoy it.]

Hey Dad, guess who got new PCS orders!  From March 2042 I’ll be assigned to Joint Base Harris in Japan.  You spent your early career in Japan, right?  I’ll never forget your stories about Camp Zama, a sprawling installation housing hundreds of soldiers and civilians. I used to love hearing about the 2020s, when enemy sensors, drones, and artificial intelligence first wreaked havoc on operations there.

Source: John Lamb/The Image Bank/Getty Images

Remember the Garrison commander whose face was 3D-scanned by a rigged vending machine near the gate? The enemy released that humiliating video right before a major bilateral operation. By the time we proved it was fake, our partners had already withdrawn.

What about the incident at the intel battalion’s favorite TDY hotel with a pool-side storage safe? Soldiers went swimming and tossed their wallets into the safe, unaware that an embedded scanner would clone their SIPR tokens. To make matters worse, the soldiers secured the safe with a four-digit code… using the same numbers as their token PIN.

Source: CNN

Oh, and remember the Prankenstein A.I. attack? It scanned social media to identify Army personnel living off-base, then called local law enforcement with fake complaints. The computer-generated voice was very convincing, even giving physical descriptions based on soldiers’ actual photos. You said that one soured host-nation relations for years!

Or the drones that hovered over Camp Zama, broadcasting fake Wi-Fi hotspots. The enemy scooped up so much intelligence and — ah, you get the picture. Overseas bases were so vulnerable back then.


Well, the S1 sent me a virtual tour and the new base is completely different. When U.S. Forces Japan rebuilt its installations, those wide-open bases were replaced by miniature, self-contained fortresses. Joint Base Harris, for example, was built inside a refurbished shopping mall: an entire installation, compressed into a single building!

Source: The Cinephile Gardener

Here’s what I saw on my virtual tour:

Source: Gizmodo UK

• The roof has solar panels and battery banks for independent power. There’s also an enormous greenhouse, launch pads for drones and helos, and a running trail.

• The ground level contains a water plant that extracts and purifies groundwater, along with indoor hydroponic farms. Special filtration units scrub the air; they’re even rated against CBRN threats.

Source: tandemnsi.com

• What was once a multi-floor parking garage is now a motor pool, firing range, and fitness complex. The gym walls are smart-screens, so you can work out in a different environment every day.

• Communications are encrypted and routed through a satellite uplink. The base even has its own cellphone tower. Special mesh in the walls prevents anybody outside from eavesdropping on emissions — the entire base is a SCIF.

Source: fortune.com

• The mall’s shops and food court were replaced by all the features and functions of a normal base: nearly 2,000 Army, Air, and Cyber Force troops living, working, and training inside. They even have a kitchen-bot in the chow hall that can produce seven custom meals per minute!

• Supposedly, the base extends several floors underground, but the tour didn’t show that. I guess that’s where the really secret stuff happens.

Source: Gizmodo Australia

By the way, don’t worry about me feeling cooped up:  Soldiers are assigned top-notch VR specs during in-processing.  During the duty day, they’re only for training simulations. Once you’re off, personal use is authorized. I’ll be able to play virtual games, take virtual tours… MWR even lets you link with telepresence robots to “visit” family back home.

The sealed, self-contained footprint of this new base is far easier to defend in today’s high-tech threat environment. Some guys complain about being stuck inside, but you know what I think? If Navy sailors can spend months at sea in self-contained bases, then there’s no reason the Army can’t do the same on land!

Love,
Your Daughter

 

If you were intrigued by this vision of a future Army installation, please plan on joining us virtually at the Mad Scientist Installations of the Future Conference, co-sponsored by the Office of the Assistant Secretary of the Army for Installations, Energy and Environment (OASA (IE&E)); Georgia Tech Research Institute (GTRI); and Headquarters, U.S. Army Training and Doctrine Command (TRADOC),  at GTRI in Atlanta, Georgia, on 19-20 June 2018.  Click here to learn more about the conference and then participate in the live-streamed proceedings, starting at 0830 EDT on 19 June 2018.

Lewis Jones is an Army civilian with nearly 15 years of experience in the Indo-Pacific region. In addition to his Japanese and Chinese language studies, he has earned a Master’s in Diplomacy and International Conflict Management from Norwich University. He has worked as a headhunter for multinational investment banks in Tokyo, as a business intelligence analyst for a DOD contractor, and has supported the Army with cybersecurity program management and contract administration. Lewis writes about geopolitics, international relations, U.S. national security, and the effects of rapid advances in technology.

59. Fundamental Questions Affecting Army Modernization

[Editor’s Note:  The Operational Environment (OE) is the start point for Army Readiness – now and in the Future. The OE answers the question, “What is the Army ready for?”  Without the OE in training and Leader development, Soldiers and Leaders are “practicing” in a benign condition, without the requisite rigor to forge those things essential for winning in a complex, multi-domain battlefield.  Building the Army’s future capabilities, a critical component of future readiness, requires this same start point.  The assumptions the Army makes about the Future OE are the sine qua non start point for developing battlefield systems — these assumptions must be at the forefront of decision-making for all future investments.]

There are no facts about the future. Leaders interested in building future-ready organizations must develop assumptions about possible futures, and these assumptions require constant scrutiny. Leaders must also make decisions based on these assumptions to posture organizations to take advantage of opportunities and to mitigate risks. Making these decisions is fundamental to building future readiness.

Source: Evan Jensen, ARL

The TRADOC G-2 has made the following foundational assumptions about the future that can serve as launch points for important questions about capability requirements and capabilities under development. These assumptions are further described in An Advanced Engagement Battlespace: Tactical, Operational and Strategic Implications for the Future Operational Environment, published by our colleagues at Small Wars Journal.

1. Contested in all domains (air, land, sea, space, and cyber). Increased lethality, by virtue of ubiquitous sensors, proliferated precision, high kinetic energy weapons and advanced area munitions, further enabled by autonomy, robotics, and Artificial Intelligence (AI) with an increasing potential for overmatch. Adversaries will restrict us to temporary windows of advantage with periods of physical and electronic isolation.

Source: Army Technology

2. Concealment is difficult on the future battlefield. Hiding from advanced sensors — where practicable — will require dramatic reduction of heat, electromagnetic, and optical signatures. Traditional hider techniques such as camouflage, deception, and concealment will have to extend to “cross-domain obscuration” in the cyber domain and the electromagnetic spectrum. Canny competitors will monitor their own emissions in real time to understand and mitigate their vulnerabilities in the “battle of signatures.” Alternatively, “hiding in the open” within complex terrain clutter and near-constant relocation might be feasible, provided such relocation could outpace future recon / strike targeting cycles. Adversaries will operate among populations in complex terrain, including dense urban areas.

3. Trans-regional, gray zone, and hybrid strategies with both regular and irregular forces, criminal elements, and terrorists attacking our weaknesses and mitigating our advantages. The ensuing spectrum of competition will range from peaceful, legal activities through violent, mass upheavals and civil wars to traditional state-on-state, unlimited warfare.

Source: Science Photo Library / Van Parys Media

4. Adversaries include states, non-state actors, and super-empowered individuals, with non-state actors and super-empowered individuals now having access to Weapons of Mass Effect (WME), cyber, space, and Nuclear/Biological/Chemical (NBC) capabilities. Their operational reach will range from tactical to global, and the application of their impact from one domain into another will be routine. These advanced engagements will also be interactive across the multiple dimensions of conflict, not only across every domain in the physical dimension, but also the cognitive dimension of information operations, and even the moral dimension of belief and values.

Source: Northrop Grumman

5. Increased speed of human interaction, events, and action with democratized and rapidly proliferating capabilities means constant co-evolution between competitors. Recon / Strike effectiveness is a function of its sensors, shooters, their connections, and the targeting process driving decisions. Therefore, in a contest between peer competitors with comparable capabilities, advantage will fall to the one that is better integrated and makes better and faster decisions.
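As a back-of-the-envelope illustration of why faster, better-integrated decisions matter, the toy model below sums sensor, connection, decision, and shooter delays for two notional competitors and counts how many complete recon / strike cycles each can run in an hour. All latency figures are invented for illustration and do not describe any real system.

```python
# Toy decision-cycle comparison. All latency figures are invented for illustration.
def cycle_time(sense, connect, decide, shoot):
    """End-to-end recon/strike cycle time in seconds."""
    return sense + connect + decide + shoot

blue = cycle_time(sense=30, connect=10, decide=120, shoot=60)  # slower, human-heavy targeting
red  = cycle_time(sense=30, connect=5,  decide=40,  shoot=60)  # better integrated, faster decisions

window = 3600  # one hour of competition
print(f"Blue cycles/hour: {window // blue}")  # 16
print(f"Red  cycles/hour: {window // red}")   # 26
```

The point is not the specific numbers but the structure: shaving time from any link in the chain compounds across every cycle, which is why integration and decision speed confer advantage between otherwise comparable competitors.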

These assumptions become useful when they translate to potential decision criteria for Leaders to rely on when evaluating systems being developed for the future battlefield. Each of the following questions is fundamental to ensuring the Army is prepared to operate in the future.

Source: Lockheed Martin

1. How will this system operate when disconnected from a network? Units will be disconnected from their networks on future battlefields. Capabilities that require constant timing and precision geo-locational data will be prioritized for disruption by adversaries with capable EW systems.

2. What signature does this system present to an adversary? It is difficult to hide on the future battlefield and temporary windows of advantage will require formations to reduce their battlefield signatures. Capabilities that require constant multi-directional broadcast and units with large mission command centers will quickly be targeted and neutralized.

Image credit: Alexander Kott

3. How does this system operate in dense urban areas? The physical terrain in dense urban areas and megacities creates concrete canyons isolating units electronically and physically. Automated capabilities operating in dense population areas might also increase the rate of false signatures, confusing, rather than improving, Commander decision-making. New capabilities must be able to operate disconnected in this terrain. Weapons systems must be able to slew and elevate rapidly to engage vertical targets. Automated systems and sensors will require significant training sets to reduce the rate of false signatures.

Source: Military Embedded Systems

4. How does this system take advantage of open and modular architectures? The rapid rate of technological innovation will offer great opportunities to militaries capable of rapidly integrating prototypes into formations.  Capabilities developed with open and modular architectures can be upgraded with autonomous and AI enablers as they mature. Early investment in closed-system capabilities will freeze Armies in place during a period of rapid co-evolution and cede overmatch to adversaries.

5. How does this capability help win in competition short of conflict with a near-peer competitor? Near-peer competitors will seek to achieve limited objectives short of direct conflict with the U.S. Army. Capabilities will need to be effective at operating in the gray zone as well as serving as a deterrent. They will need to be capable of strategic employment from CONUS-based installations.

If you enjoyed this post, check out the following items of interest:

    • Join SciTech Futures’ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live 11 June 2018 — click here to learn more!

55. Influence at Machine Speed: The Coming of AI-Powered Propaganda

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following guest blog post by MAJ Chris Telley, U.S. Army, assigned to the Naval Postgraduate School, addressing how Artificial Intelligence (AI) must be understood as an Information Operations (IO) tool if U.S. defense professionals are to develop effective countermeasures and ensure our resilience to its employment by potential adversaries.]

AI-enabled IO present a more pressing strategic threat than the physical hazards of slaughter-bots or even algorithmically escalated nuclear war. IO are efforts to “influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries;” here, we’re talking about using AI to do so. AI-guided IO tools can empathize with an audience to say anything, in any way needed, to change the perceptions that drive those physical weapons. Future IO systems will be able to individually monitor and affect tens of thousands of people at once. Defense professionals must understand the fundamental influence potential of these technologies if they are to drive security institutions to counter malign AI use in the information environment.

Source: Peter Adamis / Abalinx.com

Programmatic marketing, using consumers’ data habits to drive real-time automated bidding on personalized advertising, has been used for a few years now. Cambridge Analytica’s Facebook targeting made international headlines using similar techniques, but digital electioneering is just the tip of the iceberg. An AI trained with data from users’ social media accounts, economic media interactions (Uber, Apple Pay, etc.), and their devices’ positional data can infer predictive knowledge of its targets. With that knowledge, emerging tools — like Replika — can truly befriend a person, allowing it to train that individual, for good or ill.

Source: Getty Creative

Substantive feedback is required to train an individual’s response; humans tend to respond best to content and feedback with which they agree. That content can be algorithmically mass produced. For years, Narrative Science tools have helped writers create sports stories and stock summaries, but it’s just as easy to use them to create disinformation. That’s just text, though; today, the AI can create fake video. A recent warning, ostensibly from former President Obama, provides an entertaining yet frightening demonstration of how Deepfakes will challenge our presumptions about truth in the coming years. The Defense Advanced Research Projects Agency (DARPA) is funding a project this summer to determine whether AI-generated Deepfakes will become impossible to distinguish from the real thing, even using other AI systems.

Even though malign actors can now employ AI to lie “at machine speed,” they still have to get the story to an audience. Russian bot armies continue to make headlines doing this very thing. The New York Times maintains about a dozen Twitter feeds and produces around 300 tweets a day, but Russia’s Internet Research Agency (IRA) regularly puts out 25,000 tweets in the same twenty-four hours. The IRA’s bots are really just low-tech curators; they collect, interpret, and display desired information to promote the Kremlin’s narratives.
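The scale disparity is easy to quantify; the arithmetic below simply restates the figures cited above.

```python
# Restating the cited volumes: ~300 human-curated tweets a day versus ~25,000 bot-curated tweets.
nyt_tweets_per_day = 300
ira_tweets_per_day = 25_000

print(f"Volume ratio: {ira_tweets_per_day / nyt_tweets_per_day:.0f}x")        # roughly 83x
print(f"Bot output per minute: {ira_tweets_per_day / (24 * 60):.1f} tweets")  # about 17 tweets per minute, around the clock
```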

Source: Josep Lago/AFP/Getty Images

Next-generation bot armies will employ far faster computing techniques and profit from an order of magnitude greater network speed when 5G services are fielded. If “Repetition is a key tenet of IO execution,” then this machine gun-like ability to fire information at an audience will, with empathetic precision and custom content, provide the means to change a decisive audience’s very reality. No breakthrough science is needed, no bureaucratic project office required. These pieces are already there, waiting for an adversary to put them together.

The DoD is looking at AI but remains focused on image classification and swarming quadcopters while ignoring the convergent possibilities of predictive audience understanding, tailored content production, and massive-scale dissemination. What little digital IO we’ve done, sometimes called social media “WebOps,” has been contractor-heavy and prone to naïve missteps. However, groups like USSOCOM’s SOFWERX and the students at the Naval Postgraduate School are advancing the state of our art. At NPS, future senior leaders are working on AI, now. A half-dozen of the school’s departments have stood up classes and events specifically aimed at operationalizing advanced computing. The young defense professionals currently working on AI should grapple with emerging influence tools and form the foundation of the DoD’s future institutional capabilities.

MAJ Chris Telley is an Army information operations officer assigned to the Naval Postgraduate School. His assignments have included theater engagement at U.S. Army Japan and advanced technology integration with the U.S. Air Force. Chris commanded in Afghanistan and served in Iraq as a United States Marine. He tweets at @chris_telley.

This blog post represents the opinions of the author and does not reflect the position of the Army or the United States Government.