80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from the controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial — “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While the panel reached no consensus, the discussion served as a useful exercise in illustrating how AI differs from previous eras’ game-changing technologies. Unlike nuclear, biological, and chemical weapons, AI admits no internationally agreed and implemented control protocols: there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted-access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to consist of nothing more than diffuse code: a digital will-o’-the-wisp that can permeate the global net to reach other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operations (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC’s Military-Civil Fusion approach, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions as soon as possible. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML), held in Stockholm, Sweden, in July 2018, explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and practitioners develop fair and privacy-preserving data analyses. As with ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commanders’ decisions), especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, write the very programs, and select the very training data, that enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
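The core mechanism of machine learning bias can be made concrete with a deliberately minimal sketch. The data below are hypothetical (not drawn from the ICML proceedings or any real insurer): a trivial model trained on biased historical decisions faithfully learns and reproduces that disparity, even though the learning procedure itself contains no prejudice.

```python
# Minimal sketch of bias propagation: a model trained on biased
# historical decisions reproduces the bias. Data are hypothetical.
from collections import defaultdict

# Hypothetical historical decisions: (group, approved) pairs in which
# group "A" was approved 80% of the time and group "B" only 40%.
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 40 + [("B", 0)] * 60)

# "Training": learn per-group approval frequencies (a trivial model).
counts = defaultdict(lambda: [0, 0])  # group -> [denied, approved]
for group, approved in history:
    counts[group][approved] += 1

def predict(group):
    """Predict the majority historical outcome for the group."""
    denied, approved = counts[group]
    return 1 if approved >= denied else 0

print(predict("A"))  # 1 -- the model favors group A
print(predict("B"))  # 0 -- and disadvantages group B, exactly as the data did
```

Nothing in the code "intends" to discriminate; the disparity enters entirely through the training data, which is why verifying and understanding model outputs, as the post argues, matters more than trusting the algorithm's neutrality.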

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. Participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers told the participants that they could switch the robot off if they wished. At that moment, the robot would make an impassioned plea to resist shutting it down, and the participants’ actions were recorded. Unexpectedly, more participants resisted shutting down the functional robots after their plea than the sociable ones. This is significant: beyond the unexpected result, it shows that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if those entities mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and heavily augmented by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought computing and communication ever deeper into global households, the smartphone revolution brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans connected to the global commons via their smart devices, but a multitude of devices, vehicles, and accessories are being integrated into the Internet of Things (IoT). We have previously addressed the IoT as a game-changing technology: composed of trillions of internet-linked items, it creates both opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (i.e., appliances, home devices [such as Google Home, Alexa, etc.], various sensors) to approximately $520 billion. This is a massive investment into what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it is likely that it will expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling to balance collecting, organizing, and using critical data; allowing service members to use personal devices; and maintaining operational and network security and integrity (see the recent banning of personal fitness trackers). A segment of IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With a “cyber Pearl Harbor” looking ever more likely, intelligence officials warn of the rising danger of cyber attacks, whose effects have already been felt around the world. Acting in the undefined gray zone between peace and all-out war, these attacks have the power to break the trust people place in institutions, companies, and governments. The military implications are clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence, communications, and materiel and personnel networks. Beyond the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others remain reluctant to invest in cyber protection measures. In this way, civilians are affected by acts of cyber warfare, and attacks on a country may be directed not at the opposing military but at the civilian population of a state, as in the power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to combat cyber threats most effectively.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

79. Character vs. Nature of Warfare: What We Can Learn (Again) from Clausewitz

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following post by guest blogger LTC Rob Taber, U.S. Army Training and Doctrine Command (TRADOC) G-2 Futures Directorate, clarifying the often confused character and nature of warfare, and addressing their respective mutability.]

No one is arguing that warfare is not changing. Where people disagree, however, is whether the nature of warfare, the character of warfare, or both are changing.

Source:  Office of the Director of National Intelligence

Take, for example, the National Intelligence Council’s assertion in “Global Trends: Paradox of Progress.” They state, “The nature of conflict is changing. The risk of conflict will increase due to diverging interests among major powers, an expanding terror threat, continued instability in weak states, and the spread of lethal, disruptive technologies. Disrupting societies will become more common, with long-range precision weapons, cyber, and robotic systems to target infrastructure from afar, and more accessible technology to create weapons of mass destruction.”[I]

Additionally, Brad D. Williams, in an introduction to an interview he conducted with Amir Husain, asserts, “Generals and military theorists have sought to characterize the nature of war for millennia, and for long periods of time, warfare doesn’t dramatically change. But, occasionally, new methods for conducting war cause a fundamental reconsideration of its very nature and implications.”[II] Williams then cites “cavalry, the rifled musket and Blitzkrieg as three historical examples”[III] from Husain and General John R. Allen’s (ret.) article, “On Hyperwar.”

Unfortunately, the NIC and Mr. Williams miss the reality that the nature of war is not changing, and it is unlikely to ever change. While these authors may have simply interchanged “nature” when they meant “character,” it is important to be clear on the difference between the two and the implications for the military. To put it more succinctly, words have meaning.

The nature of something is its basic makeup; it is, at core, what that “thing” is. The character of something is the combination of all the different parts and pieces that make up that thing. In the context of warfare, it is useful to ask every doctrine writer’s personal hero, Carl von Clausewitz, for his views on the matter.

Source: Tetsell’s Blog. https://tetsell.wordpress.com/2014/10/13/clausewitz/

He argues that war is “subjective,”[IV] “an act of policy,”[V] and “a pulsation of violence.”[VI] Put another way, the nature of war is chaotic, inherently political, and violent. Clausewitz then states that despite war’s “colorful resemblance to a game of chance, all the vicissitudes of its passion, courage, imagination, and enthusiasm it includes are merely its special characteristics.”[VII] In other words, all changes in warfare are those smaller pieces that evolve and interact to make up the character of war.

The argument that artificial intelligence (AI) and other technologies will enable military commanders to have “a qualitatively unsurpassed level of situational awareness and understanding heretofore unavailable to strategic commander[s]”[VIII] is a grand claim, but one that has been made many times in the past and remains unfulfilled. The chaos of war, with its fog, friction, and chance, will likely never be deciphered, regardless of what technology we throw at it. While it is certain that AI-enabled technologies will be able to gather, assess, and deliver heretofore unimaginable amounts of data, these technologies will remain vulnerable to the age-old practices of denial, deception, and camouflage.

The enemy gets a vote, and in this case, the enemy also gets to play with their AI-enabled technologies that are doing their best to provide decision advantage over us. The information sphere in war will be more cluttered and more confusing than ever.

Regardless of the tools of warfare, be they robotic, autonomous, and/or AI-enabled, they remain tools. And while they will be the primary tools of the warfighter, the decision to enable the warfighter to employ those tools will, more often than not, come from political leaders bent on achieving a certain goal with military force.

Drone Wars are Coming / Source: USNI Proceedings, July 2017, Vol. 143 / 7 /  1,373

Finally, the violence of warfare will not change. Certainly robotics and autonomy will enable machines that can think and operate without humans in the loop. Imagine the future in which the unmanned bomber gets blown out of the sky by the AI-enabled directed energy integrated air defense network. That’s still violence. There are still explosions and kinetic energy with the potential for collateral damage to humans, both combatants and civilians.

Source: Lockheed Martin

Not to mention the bomber carried a payload meant to destroy something in the first place. A military force, at its core, will always carry the mission to kill things and break stuff. What will be different is what tools they use to execute that mission.

To learn more about the changing character of warfare:

– Read the TRADOC G-2’s The Operational Environment and the Changing Character of Warfare paper.

– Watch The Changing Character of Future Warfare video.

Additionally, please note that the content from the Mad Scientist Learning in 2050 Conference at Georgetown University, 8-9 August 2018, is now posted and available for your review:

– Read the “Top Ten” Takeaways from the Learning in 2050 Conference.

– Watch videos of each of the conference presentations on the TRADOC G-2 Operational Environment (OE) Enterprise YouTube Channel here.

– Review the conference presentation slides (with links to the associated videos) on the Mad Scientist All Partners Access Network (APAN) site here.

LTC Rob Taber is currently the Deputy Director of the Futures Directorate within the TRADOC G-2. He is an Army Strategic Intelligence Officer and holds a Master of Science of Strategic Intelligence from the National Intelligence University. His operational assignments include 1st Infantry Division, United States European Command, and the Defense Intelligence Agency.

Note:  The featured graphic at the top of this post captures U.S. cavalrymen on General John J. Pershing’s Punitive Expedition into Mexico in 1916.  Less than two years later, the United States would find itself fully engaged in Europe in a mechanized First World War.  (Source:  Tom Laemlein / Armor Plate Press, courtesy of Neil Grant, The Lewis Gun, Osprey Publishing, 2014, page 19)

_______________________________________________________

[I] National Intelligence Council, “Global Trends: Paradox of Progress,” January 2017, https://www.dni.gov/files/documents/nic/GT-Full-Report.pdf, p. 6.
[II] Brad D. Williams, “Emerging ‘Hyperwar’ Signals ‘AI-Fueled, machine waged’ Future of Conflict,” Fifth Domain, August 7, 2017, https://www.fifthdomain.com/dod/2017/08/07/emerging-hyperwar-signals-ai-fueled-machine-waged-future-of-conflict/.
[III] Ibid.
[IV] Carl von Clausewitz, On War, ed. Michael Howard and Peter Paret (Princeton: Princeton University Press, 1976), 85.
[V] Ibid, 87.
[VI] Ibid.
[VII] Ibid, 86.
[VIII] John Allen, Amir Hussain, “On Hyper-War,” Fortuna’s Corner, July 10, 2017, https://fortunascorner.com/2017/07/10/on-hyper-war-by-gen-ret-john-allenusmc-amir-hussain/.

78. The Classified Mind – The Cyber Pearl Harbor of 2034

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following post by guest blogger Dr. Jan Kallberg, faculty member, United States Military Academy at West Point, and Research Scientist with the Army Cyber Institute at West Point. His post serves as a cautionary tale regarding our finite intellectual resources and the associated existential threat in failing to protect them!]

Preface: In my experience in cybersecurity, and in migrating to the broader cyber field, there have always been those exceptional individuals with an unreplicable ability to see the challenge early on, create a technical solution, and know how to play it in the right order for maximum impact. They are out there: the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of Artificial Intelligence increases our reliance on these highly capable individuals, because someone must set the rules and the boundaries, and point out the trajectory for Artificial Intelligence at its initiation.

Source: https://thebulletin.org/2017/10/neuroscience-and-the-new-weapons-of-the-mind/

As an industrial society, we tend to see technology, and the information that feeds it, as the weapons, and we ignore the few humans who have a large-scale direct impact. Even if identified as a weapon, how do you make a human mind classified? Can we protect these high-ability individuals who, in the digital world, are weapons in their own right, not tools but compilers of capability, or are we still focused on the tools? Why do we see only weapons of steel and electronics, and not the weaponized mind? I firmly believe that we underestimate the importance of Applicable Intelligence: the ability to play the cyber engagement in the optimal order. Adversaries are often good observers, because they are scouting for our weak spots. I set the stage for the following post in 2034, close enough to be realistic and far enough off for things to happen, when our adversaries are betting that we rely more on a few minds than we are willing to accept.

Post: In a not too distant future, on the 20th of August 2034, a peer adversary’s first strategic moves are the targeted killings of fewer than twenty individuals as they go about their daily lives: watching a 3-D printer make a protein sandwich at a breakfast restaurant; stepping off the downtown Chicago monorail; or taking a sip of a poison-laced retro Jolt Cola. In the gray zone, when the geopolitical temperature rises but we are not yet at war, our adversary acts quickly, expediting a limited number of targeted killings within the United States of persons who are unknown to the mass media and the general public, and who have only one thing in common: Applicable Intelligence (AI).

The ability to apply a technology is a far greater asset than the technology itself. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are publicly available; anyone can download them from the Internet and use them, but weaponization occurs when the tools are used by someone who understands how to play them in the optimal order. These minds are different because they see an opportunity to exploit in a digital fog of war where others don’t, or can’t, see it. They address problems unburdened by traditional thinking, in new and innovative ways, maximizing the dual-purpose nature of digital tools to create tangible cyber effects.
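The card-game point above can be reduced to a toy sketch. The two “tools” below are hypothetical arithmetic stand-ins, not real cyber tooling; the sketch only shows that applying the same pair of operations in a different order produces a different result, which is exactly why knowing the optimal order is itself a capability.

```python
# Toy sketch: the same two "tools" (hypothetical stand-ins), applied
# in different orders, produce different effects.
def add_access(capability):
    """Stand-in for a tool that adds a fixed amount of access."""
    return capability + 3

def escalate(capability):
    """Stand-in for a tool that multiplies whatever access exists."""
    return capability * 2

start = 1
print(escalate(add_access(start)))  # gain access first, then escalate -> 8
print(add_access(escalate(start)))  # escalate first, then gain access -> 5
```

The tools are identical in both runs; only the sequencing differs, and the sequencing determines the outcome.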

It is Applicable Intelligence (AI) that creates the procedures, applies the tools, and converges simple digital software, in sets or combinations, into digitally lethal weapons. This AI is the intelligence to mix, match, tweak, and arrange dual-purpose software. In 2034, it is as if you had the supernatural ability to build a thermonuclear bomb from what you can find at Kroger or Albertsons.

Sadly we missed it; we didn’t see it. We never left the 20th century. Our adversary saw it clearly and at the dawn of conflict killed off the weaponized minds, without discretion, and with no concern for international law or morality.

These intellects are weapons of growing strategic magnitude. In 2034, the United States missed the importance of these few intellects. This error left them unprotected.

All of our efforts instead focused on what these minds delivered, the applications and the technology, which were hidden in secret vaults and discussed only in sensitive compartmented information facilities. We classify to the highest level to ensure the confidentiality and integrity of our cyber capabilities; meanwhile, to the most critical component, the militarized intellect, we assign no value, because it is human. In a society marinated in an engineering mindset, humans are like desk space, electricity, and broadband: a commodity that is an input to the production of the technical machinery. The marveled-at technical machinery is the only thing we care about today, in 2018, and, as it turned out, in 2034 as well.

We are stuck in how we think, unable to see it coming; but our adversaries see it. At a systemic level, we are unable to see humans as weapons in themselves, perhaps because we like to see weapons as something tangible, painted black, tan, or green, that can be stored and brought to action when needed: the armory of the War of 1812, the stockpile of 1943, the launch pad of 2034. Arms are made of steel, or of fancier metals and electronics. In 2034, we failed to see weapons made of corn, steak, and an added combative intellect.

General Nakasone stated in 2017, “Our best ones [coders] are 50 or 100 times better than their peers,” and continued “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.” In reality, the success of cyber and cyber operations is highly dependent not on the tools or toolsets but instead upon the super-empowered individual that General Nakasone calls “the 50-x coder.”

Manhattan Project K-25 Gaseous Diffusion Process Building, Oak Ridge, TN / Source: atomicarchive.com

There were clear signals that we could have noticed before General Nakasone spelled it out in 2017. The United States’ Manhattan Project during World War II had, at its peak, 125,000 workers on the payroll, but the intellects that drove the project to success and completion were few. The difference between the Manhattan Project and the future of cyber is that we were unable to see the human as a weapon, locked in by our path dependency as an engineering society that hails the technology and forgets the importance of the humans behind it.

J. Robert Oppenheimer – the militarized intellect behind the  Manhattan Project / Source: Life Magazine

America’s endless love of technical innovations and advanced machinery is reflected in a nation that has celebrated mechanical wonders and engineered solutions since its creation. For America, technical wonders are a sign of prosperity, ability, self-determination, and advancement, a story that started in the early days of the colonies and ran through the transcontinental railroad, the Panama Canal, the manufacturing era, and the moon landing, all the way to today’s autonomous systems, drones, and robots. In this default mindset, there is always a tool, an automated process, a piece of software, or a set of technical steps that can solve a problem or execute an action.

The same mindset sees humans merely as an input to technology, so humans are interchangeable and replaceable. But in 2034, in an era of digital conflicts and wars between algorithms, with engagements occurring at machine speed and no time for leadership or human interaction, it is the intellects who design the systems and understand how to play them that matter. We didn’t see it.

In 2034, with fewer than twenty bodies piled up after the targeted killings, the Cyber Pearl Harbor has already happened. It was not imploding critical infrastructure, a tsunami of cyber attacks, or hackers flooding our financial systems, but traditional lead and gunpowder. The super-empowered individuals are gone, and we are stuck in a digital war at speeds we don’t understand, unable to play it in the right order, and with limited intellectual torque to see through the fog of war generated by an exploding kaleidoscope of nodes and digital engagements.

Source: Shutterstock

If you enjoyed this post, read our Personalized Warfare post.

Dr. Jan Kallberg is currently an Assistant Professor of Political Science with the Department of Social Sciences, United States Military Academy at West Point, and a Research Scientist with the Army Cyber Institute at West Point. He was earlier a researcher with the Cyber Security Research and Education Institute, The University of Texas at Dallas, and is a part-time faculty member at George Washington University. Dr. Kallberg earned his Ph.D. and MA from the University of Texas at Dallas and earned a JD/LL.M. from Juridicum Law School, Stockholm University. Dr. Kallberg is a certified CISSP, ISACA CISM, and serves as the Managing Editor for the Cyber Defense Review. He has authored papers in the Strategic Studies Quarterly, Joint Forces Quarterly, IEEE IT Professional, IEEE Access, IEEE Security and Privacy, and IEEE Technology and Society.

71. Shaping Perceptions with Information Operations: Lessons for the Future

[Editor’s Note: Mad Scientist Laboratory is pleased to present today’s guest post by Ms. Taylor Galanides, TRADOC G-2 Summer Intern, exploring how the increasing momentum of human interaction, events, and actions, driven by the convergence of innovative technologies, is enabling adversaries to exploit susceptibilities and vulnerabilities to manipulate populations and undermine national interests.  Ms. Galanides examines contemporary Information Operations as a harbinger of virtual warfare in the future Operational Environment.]

More information is available than ever before. Recent and extensive developments in technology, media, communication, and culture – such as the advent of social media, 24-hour news coverage, and smart devices – allow people to closely monitor domestic and foreign affairs. In the coming decades, the increased speed of engagements, as well as the precise and pervasive targeting of both civilian and military populations, means that these populations and their respective nations will be even more vulnerable to influence and manipulation attempts, misinformation, and cyber-attacks from foreign adversaries.

The value of influencing and shaping the perceptions of foreign and domestic populations in order to pursue national and military interests has long been recognized. This can be achieved through the employment of information operations, which seek to affect the decision-making process of adversaries. The U.S. Army views information operations as an instrumental part of the broader effort to maintain an operational advantage over adversaries. Information operations is specifically defined by the U.S. Army as “The integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own.”

The U.S. Army Training and Doctrine Command (TRADOC) G-2’s The Operational Environment and the Changing Character of Future Warfare further emphasizes this increased attention to the information and cognitive domains in the future, in the Era of Contested Equality (2035 through 2050). It predicts that no single nation will hold hegemony over its adversaries, and that major powers and non-state actors alike “… will engage in a fight for information on a global scale.” Winning preemptively in the competitive dimension, before escalation into armed conflict, through the use of information and psychological warfare will become key.

Source: Becoming Human – Artificial Intelligence Magazine

Innovative technologies such as computer bots, artificial intelligence, and smart devices are part of the driving force changing the character of warfare. These emerging and advancing technologies have also introduced new vulnerabilities to individual and international security; as such, it will become increasingly important to employ defensive and counter-information operations to avoid forming misperceptions or being deceived.

Harbinger of the Future:  Information Operations in Crimea

Russia’s annexation of Crimea and subsequent invasion of eastern Ukraine in 2014 serve as cautionary examples of Russia’s evolving information operations and perception-shaping capabilities. In Crimea, Russia sought to create a “hallucinating fog of war” in an attempt to alter the analytical judgments and perceptions of its adversaries. With the help of computer hackers, bots, trolls, and television broadcasts, the Russian government created a manipulated version of reality claiming that Russian intervention in Crimea was not only necessary but humanitarian, in order to protect Russian speakers. Russian cyberespionage efforts also included jamming or shutting down telecommunication infrastructure, important Ukrainian websites, and the cell phones of key officials prior to the invasion. Through large-scale maneuvers called “snap exercises,” the Russians were able to mask military buildups along the border, as well as their political and military intentions. Russia further disguised its intentions and objectives by claiming adherence to international law, while also claiming victimization from the West’s attempts to destabilize, subvert, and undermine it.

By denying any involvement in Crimea until after the annexation was complete, distorting the facts surrounding the situation, and refraining from any declaration of war, Russia effectively infiltrated the international information domain and shaped the decision-making process of NATO countries, keeping them out of the conflict. NATO nations ultimately chose minimal involvement, despite clear evidence of Russia’s deliberate actions, in order to keep the conflict from escalating. Although the West refused to recognize the annexation of Crimea, it could be argued that Russia achieved its objective of expanding its sphere of influence.

Vulnerabilities and Considerations

Russia is the U.S.’s current pacing threat, and China is projected to overtake Russia as the Nation’s primary threat as early as 2035. To maintain the Nation’s information and geopolitical superiority, it is important to continue to evaluate how the U.S. and its Army respond to adversaries’ increasingly technological influence attempts. For example, the U.S. adheres to moral and ethical standards that restrict its use of information operations. Because adversarial nations like Russia and China pervasively employ influence and deception measures in peacetime, however, the U.S. and its Army could benefit from developing alternative methods for maintaining an operational advantage over these adversaries.

Adversarial nations can also take advantage of “the [Western] media’s willingness to seek hard evidence and listen to both sides of an argument before coming to a conclusion” by “inserting fabricated or prejudicial information into Western analysis and blocking access to evidence.” The West’s free press will continue to be the primary counter to constructed narratives. Additionally, extensive training of U.S. military and Government personnel, combined with educating the civilian population about Russian and Chinese deceptive narratives, may decrease the likelihood of perceptions being manipulated:  “If the nation can teach the media to scrutinize the obvious, understand the military, and appreciate the nuances of deception, it may become less vulnerable to deception.” Other ways to exploit Russian and Chinese vulnerabilities include taking advantage of poor operations security, as well as the use and analysis of geotags to refute and discredit Russian and Chinese propaganda narratives.
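The geotag analysis mentioned above rests on a simple piece of arithmetic: GPS metadata embedded in photos (EXIF data) records coordinates as degrees, minutes, and seconds plus a hemisphere reference, which analysts convert to signed decimal degrees before placing a post on a map. The sketch below is a minimal, illustrative example in plain Python; the function name and the sample coordinates are hypothetical and not drawn from any tool referenced in this post.

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert a degrees/minutes/seconds geotag (the format EXIF GPS
    metadata uses) plus a hemisphere reference ('N', 'S', 'E', 'W')
    into signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative by convention.
    return -value if ref in ("S", "W") else value

# Illustrative coordinates only: roughly Simferopol, Crimea.
lat = dms_to_decimal(44, 57, 0.0, "N")
lon = dms_to_decimal(34, 6, 0.0, "E")
print(round(lat, 4), round(lon, 4))  # 44.95 34.1
```

Open-source analysts used exactly this kind of conversion in 2014 to show that photos posted by soldiers who claimed to be elsewhere carried coordinates inside Ukraine.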

A final consideration involves the formation of an interagency committee, similar to the Active Measures Working Group of the 1980s, to identify and counter adversarial disinformation and propaganda. Disinformation efforts by manipulative countries like Russia are pervasive and well coordinated; coordination of information operations and counter-propaganda efforts among the U.S. Government, the Army, and the other military branches is likewise important. The passage of the Countering Foreign Propaganda and Disinformation Act, part of the 2017 National Defense Authorization Act, was an important first step in the continuing fight to counter foreign information and influence operations that seek to manipulate the U.S. and its decision-makers and undermine its national interests.

For more information on how adversaries will seek to shape perception in the Future Operational Environment, read the following related blog posts:

Influence at Machine Speed: The Coming of AI-Powered Propaganda

Virtual War – A Revolution in Human Affairs (Part I)

Personalized Warfare

Taylor Galanides is a Junior at The College of William and Mary in Virginia, studying Psychology. She is currently interning at Headquarters, U.S. Army Training and Doctrine Command (TRADOC) with the G-2 Futures team.

61. Base in a Box

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following guest blog post by Mr. Lewis Jones. Originally a “Letter Home” submission to the Call for Ideas associated with the Mad Scientist Installations of the Future Conference (see more information about this event at the end of this post), we hope that you will enjoy Mr. Jones’ vision of a mid-Twenty First Century forward deployed base.]

Hey Dad, guess who got new PCS orders! From March 2042 I’ll be assigned to Joint Base Harris in Japan. You spent your early career in Japan, right? I’ll never forget your stories about Camp Zama, a sprawling installation housing hundreds of soldiers and civilians. I used to love hearing about the 2020s, when enemy sensors, drones, and artificial intelligence first wreaked havoc on operations there.

Source: John Lamb/The Image Bank/Getty Images

Remember the Garrison commander whose face was 3D-scanned by a rigged vending machine near the gate? The enemy released that humiliating video right before a major bilateral operation. By the time we proved it was fake, our partners had already withdrawn.

What about the incident at the intel battalion’s favorite TDY hotel, with its pool-side storage safe? Soldiers went swimming and tossed their wallets into the safe, unaware that an embedded scanner would clone their SIPR tokens. To make matters worse, the soldiers secured the safe with a four-digit code… using the same numbers as their token PIN.

Source: CNN

Oh, and remember the Prankenstein A.I. attack? It scanned social media to identify Army personnel living off-base, then called local law enforcement with fake complaints. The computer-generated voice was very convincing, even giving physical descriptions based on soldiers’ actual photos. You said that one soured host-nation relations for years!

Or the drones that hovered over Camp Zama, broadcasting fake Wi-Fi hotspots. The enemy scooped up so much intelligence and — ah, you get the picture. Overseas bases were so vulnerable back then.

Well, the S1 sent me a virtual tour and the new base is completely different. When U.S. Forces Japan rebuilt its installations, those wide open bases were replaced by miniature, self-contained fortresses. Joint Base Harris, for example, was built inside a refurbished shopping mall: an entire installation, compressed into a single building!

Source: The Cinephile Gardener

Here’s what I saw on my virtual tour:

Source: Gizmodo UK

• The roof has solar panels and battery banks for independent power. There’s also an enormous greenhouse, launch pads for drones and helos, and a running trail.

• The ground level contains a water plant that extracts and purifies groundwater, along with indoor hydroponic farms. Special filtration units scrub the air; they’re even rated against CBRN threats.

Source: tandemnsi.com

• What was once a multi-floor parking garage is now a motor pool, firing range, and fitness complex. The gym walls are smart-screens, so you can work out in a different environment every day.

• Communications are encrypted and routed through a satellite uplink. The base even has its own cellphone tower. Special mesh in the walls prevents anybody outside from eavesdropping on emissions: the entire base is a SCIF.

Source: fortune.com

• The mall’s shops and food court were replaced by all the features and functions of a normal base: nearly 2,000 Army, Air and Cyber Force troops living, working, and training inside. They even have a kitchen-bot in the chow hall that can produce seven custom meals per minute!

• Supposedly, the base extends several floors underground, but the tour didn’t show that. I guess that’s where the really secret stuff happens.

Source: Gizmodo Australia

By the way, don’t worry about me feeling cooped up: Soldiers are assigned top-notch VR specs during in-processing. During the duty day, they’re only for training simulations; once you’re off duty, personal use is authorized. I’ll be able to play virtual games, take virtual tours… MWR even lets you link with telepresence robots to “visit” family back home.

The sealed, self-contained footprint of this new base is far easier to defend in today’s high-tech threat environment. Some guys complain about being stuck inside, but you know what I think? If Navy sailors can spend months at sea in self-contained bases, then there’s no reason the Army can’t do the same on land!

Love,
Your Daughter

 

If you were intrigued by this vision of a future Army installation, please plan on joining us virtually at the Mad Scientist Installations of the Future Conference, co-sponsored by the Office of the Assistant Secretary of the Army for Installations, Energy and Environment (OASA (IE&E)); Georgia Tech Research Institute (GTRI); and Headquarters, U.S. Army Training and Doctrine Command (TRADOC), at GTRI in Atlanta, Georgia, on 19-20 June 2018. Click here to learn more about the conference and then participate in the live-streamed proceedings, starting at 0830 EDT on 19 June 2018.

Lewis Jones is an Army civilian with nearly 15 years of experience in the Indo-Pacific region. In addition to his Japanese and Chinese language studies, he has earned a Master’s in Diplomacy and International Conflict Management from Norwich University. He has worked as a headhunter for multinational investment banks in Tokyo, as a business intelligence analyst for a DOD contractor, and has supported the Army with cybersecurity program management and contract administration. Lewis writes about geopolitics, international relations, U.S. national security, and the effects of rapid advances in technology.

43. The Changing Character of Warfare: Takeaways for the Future

The Future Operational Environment (OE), as described in The Operational Environment and the Changing Character of Future Warfare, brings with it a series of inexorable trends that lead us to consider the following critical question:

What do these issues mean for the nature and character of warfare?

The nature of war, which has remained relatively constant from Thucydides, through Clausewitz, through the Cold War, and on into the present, will certainly remain constant through the Era of Accelerated Human Progress (i.e., now through 2035). War is still waged because of fear, honor, and interest, and remains an expression of politics by other means. As we move into the Era of Contested Equality (i.e., 2035-2050), however, the character of warfare will change in several key areas:

The Moral and Cognitive Dimensions are Ascendant.

The proliferation of high technology, coupled with the speed of human interaction and pervasive connectivity, means that no one nation will have an absolute strategic advantage in capabilities. When breakthroughs occur, the advantages they confer will be fleeting, as rivals quickly adapt. Under such conditions, the physical dimension of warfare may become less important than the cognitive and moral dimensions. As a result, some powers will place fewer self-imposed restrictions on the use of military force, and hybrid strategies involving information operations, direct cyber-attacks against individuals, segments of populations, or national infrastructure, terrorism, the use of proxies, and Weapons of Mass Destruction (WMD) will aim to prevail against an enemy’s will.

Integration across Diplomacy, Information, Military, and Economic (DIME).

Clausewitz’s timeless dictum that war is policy by other means takes on new importance as the distance between war and policy recedes; policy must also take into account the other elements of national power to form true whole-of-government and, when possible, collective security approaches to national security issues. The interrelationship across the DIME will require closer integration across all elements of government, and Joint decision-making bodies will need to deliver DIME effects quickly and effectively across the physical, cognitive, and moral dimensions. Military operations are an essential element of this equation, but may not necessarily be the decisive means of achieving an end state.

Limitations of Military Force.

While mid-Century militaries will have more capability than at any time in history, their ability to wage high-intensity conflict will become more limited. Force-on-force conflict will be so destructive, will be waged at the new speed of human and AI-enhanced interaction, and will occur at such extended ranges that exquisitely trained and equipped forces facing a peer or near-peer rival will rapidly suffer significant losses in manpower and equipment that are difficult to replace. Robotics, unmanned vehicles, and man-machine teaming offer partial solutions, but warfare will still revolve around increasingly vulnerable human beings. Military forces will need to consider how advances in AI, bio-engineering, man-machine interface, neuro-implanted knowledge, and other areas of enhanced human performance and learning can help reduce the long lead time in training and developing personnel.

The Primacy of Information.

In the timeless struggle between offense and defense, information will become the most important and most useful tool at all levels of warfare. Actors will increasingly be able to use information to target an enemy’s will without necessarily having to address its means. In the past, nations have tried to target an enemy’s will through kinetic attacks on its means – the enemy military – or by directly attacking national infrastructure or the populace itself. Sophisticated, nuanced information operations, taking advantage of the ability to directly reach an affected audience through cyber operations or other forms of influence operations, and reinforced by a credible, capable armed force, can bend an adversary’s will before battle is joined.

Expansion of the Battle Area.

Nations, non-state actors, and even individuals will be able to target military forces and civilian infrastructure at increasing, often intercontinental, ranges using a host of conventional and unconventional means. A force deploying to a combat zone will be vulnerable from the individual soldier’s personal residence to his or her installation and throughout the entire deployment. Adversaries will also have the ability to target or hold at risk non-military infrastructure and even populations with increasingly sophisticated, nuanced, and destructive capabilities, including WMD, hypersonic conventional weapons, and, perhaps most critically, cyber weapons and information warfare. WMD will not be the only threat capable of directly targeting or even destroying a society; cyber and information attacks can directly target infrastructure, banking, food supplies, power, and general ways of life. Limited wars waged between peers or near-peer adversaries over a limited area of operations will become more dangerous, as adversaries will have an unprecedented capability to broaden their attacks to the enemy’s homeland. The U.S. Homeland likely will not avoid the effects of warfare and will be vulnerable in at least eight areas.

Ethics of Warfare Shift.

Traditional norms of warfare, definitions of combatants and non-combatants, and even what constitutes military action or a national casus belli will be turned upside down and remain in flux at all levels of warfare:

– Does cyber activity, or information operations aimed at influencing national policy, rise to the level of warfare?

– Is using cyber capabilities to target a national infrastructure legal, if it has broad societal impacts?

– Can one target an electric grid that supports a civilian hospital, but also powers a military base a continent away from the battle zone from which unmanned systems are controlled?

– What is the threshold for WMD use?

– Is the use of autonomous robots against human soldiers legal?

These and other questions will arise, and likely will be answered differently by individual actors.

The changes in the character of war by mid-Century will be pronounced, and are directly related and traceable to our present. The natural progression of the changes in the character of war may be a change in the nature of war, perhaps towards the end of the Era of Contested Equality or in the second half of the Twenty First Century.

For additional information, watch the TRADOC G-2 Operational Environment Enterprise’s The Changing Character of Future Warfare video.