87. LikeWar — The Weaponization of Social Media

[Editor’s Note: Regular readers will note that one of our enduring themes is the Internet’s emergence as a central disruptive innovation. With the publication of proclaimed Mad Scientist P.W. Singer and co-author Emerson T. Brooking’s LikeWar – The Weaponization of Social Media, Mad Scientist Laboratory addresses what is arguably the most powerful manifestation of the internet — Social Media — and how it is inextricably linked to the future of warfare. Messrs. Singer and Brooking’s new book is essential reading if today’s Leaders (both in and out of uniform) are to understand, defend against, and ultimately wield the non-kinetic, yet violently manipulative effects of Social Media.]

“The modern internet is not just a network, but an ecosystem of 4 billion souls…. Those who can manipulate this swirling tide, steer its direction and flow, can…. accomplish astonishing evil. They can foment violence, stoke hate, sow falsehoods, incite wars, and even erode the pillars of democracy itself.”

As noted in The Operational Environment and the Changing Character of Future Warfare, Social Media and the Internet of Things have spawned a revolution that has connected “all aspects of human engagement where cognition, ideas, and perceptions, are almost instantaneously available.” While this connectivity has been a powerfully beneficial global change agent, it has also amplified human foibles and biases. Authors Singer and Brooking note that humans are by nature social creatures who tend to gravitate into like-minded groups. We “Like” and share things online that resonate with our own beliefs. We also tend to believe what resonates with us and our community of friends.

“Whether the cause is dangerous (support for a terrorist group), mundane (support for a political party), or inane (belief that the earth is flat), social media guarantees that you can find others who share your views and even be steered to them by the platforms’ own algorithms… As groups of like-minded people clump together, they grow to resemble fanatical tribes, trapped in echo chambers of their own design.”

Weaponization of Information

The advent of Social Media less than 20 years ago has changed how we wage war.

“Attacking an adversary’s most important center of gravity — the spirit of its people — no longer requires massive bombing runs or reams of propaganda. All it takes is a smartphone and a few idle seconds. And anyone can do it.”

Nation states and non-state actors alike are leveraging social media to manipulate like-minded populations’ cognitive biases to influence the dynamics of conflict. This continuous on-line fight for your mind represents “not a single information war but thousands and potentially millions of them.”

 

LikeWar provides a host of examples describing how contemporary belligerents are weaponizing Social Media to augment their operations in the physical domain. Regarding the battle to defeat ISIS and re-take Mosul, authors Singer and Brooking note that:

Social media had changed not just the message, but the dynamics of conflict. How information was being accessed, manipulated, and spread had taken on new power. Who was involved in the fight, where they were located, and even how they achieved victory had been twisted and transformed. Indeed, if what was online could swing the course of a battle — or eliminate the need for battle entirely — what, exactly, could be considered ‘war’ at all?

Even American gang members are entering the fray as super-empowered individuals, leveraging social media to instigate killings via “Facebook drilling” in Chicago or “wallbanging” in Los Angeles.

And it is only “a handful of Silicon Valley engineers,” together with their brother and sister technocrats in Beijing, St. Petersburg, and a few other global hubs of Twenty-first Century innovation, who are forging and then unleashing the code that is democratizing this virtual warfare.

Artificial Intelligence (AI)-Enabled Information Operations

Seeing is believing, right? Not anymore! Previously clumsy efforts to photoshop images, fabricate grainy videos, and produce poorly executed CGI have given way to sophisticated Deepfakes: AI algorithms that create nearly undetectable fake images, videos, and audio tracks, which then go viral on-line to dupe, deceive, and manipulate. This year, FakeApp was launched as free software, enabling anyone with an artificial neural network and a graphics processor to create and share bogus videos via Social Media. Each Deepfake video that:

“… you watch, like, or share represents a tiny ripple on the information battlefield, privileging one side at the expense of others. Your online attention and actions are thus both targets and ammunition in an unending series of skirmishes.”

Just as AI is facilitating these distortions in reality, the race is on to harness AI to detect and delete these fakes and prevent “the end of truth.”
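To make the detection side of this race concrete, the sketch below shows one minimal approach: a small convolutional network trained to label individual video frames as real or fake. This is an illustrative assumption rather than a description of any fielded system; the architecture, layer sizes, and stand-in random data are invented for the example, and PyTorch is assumed to be available.

import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    """Tiny frame-level classifier: real (class 0) vs. fake (class 1)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_step(model, optimizer, frames, labels):
    """One supervised update on a batch of labeled frames."""
    loss = nn.CrossEntropyLoss()(model(frames), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch with random stand-in data; a real effort would use labeled
# face crops extracted from authentic and synthesized videos.
model = FrameClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.rand(8, 3, 128, 128)     # stand-in batch of 128x128 RGB frames
labels = torch.randint(0, 2, (8,))      # stand-in real/fake labels
print(train_step(model, optimizer, frames, labels))

Even with far larger labeled corpora, detectors of this kind still struggle to generalize to generation techniques they were not trained on, which is why the detection race described above is unlikely to be won once and for all.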

If you enjoyed this post:

– Listen to the accompanying playlist curated by P.W. Singer to enjoy while reading LikeWar.

– Watch P.W. Singer’s presentation on Meta Trends – Technology, and a New Kind of Race from Day 2 of the Mad Scientist Strategic Security Environment in 2025 and Beyond Conference at Georgetown University, 9 August 2016.

– Read more about virtual warfare in the following Mad Scientist Laboratory blog posts:

— MAJ Chris Telley’s Influence at Machine Speed: The Coming of AI-Powered Propaganda

— COL(R) Stefan J. Banach’s Virtual War – A Revolution in Human Affairs (Parts I and II)

— Mad Scientist Initiative’s Personalized Warfare

— Ms. Marie Murphy’s Virtual Nations: An Emerging Supranational Cyber Trend

— Lt Col Jennifer Snow’s Alternet: What Happens When the Internet is No Longer Trusted?

85. Benefits, Vulnerabilities, and the Ethics of Soldier Enhancement

[Editor’s Note: The United States Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International at their Menlo Park, CA, campus on 8-9 March 2018, where participants discussed the advent of new biotechnologies and the benefits, vulnerabilities, and ethics associated with Soldier enhancement for the Army of the Future.  The following post is an excerpt from this conference’s final report.]

Source:  Max Pixel

Advances in synthetic biology likely will enhance future Soldier performance – speed, strength, endurance, and resilience – but will bring with them vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing the enhancement.

 

Emerging synthetic biology tools – e.g., CRISPR, TALEN, and ZFN – present an opportunity to engineer Soldiers’ DNA and enhance their abilities. Bioengineering is becoming easier and cheaper as a bevy of developments reduces biotechnology transaction costs in gene reading, writing, and editing. [1] Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Cognitive enhancement could make Soldiers more lethal, more decisive, and perhaps more resilient. Using neurofeedback, a process that allows users to see their brain activity in real time, one can identify ideal brain states and use them to enhance an individual’s mental performance. By mapping the brain states of identified experts and presenting them to novices, trainees can rapidly improve their acuity after just a few training sessions. [2] Further, studies are being conducted that explore the possibility of directly emulating those expert brain states with non-invasive EEG caps that could improve performance almost immediately. [3]  Dr. Amy Kruse, the Chief Scientific Officer at the Platypus Institute, referred to this phenomenon as “sitting on a gold mine of brains.”
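At its core, a neurofeedback loop samples the brain signal, estimates activity in the frequency bands of interest, and presents that estimate back to the trainee in near real time. The snippet below is a minimal sketch of the signal-processing step of that loop on a synthetic one-second EEG window; the sampling rate, band limits, and alpha/beta ratio are illustrative assumptions, and operational systems add calibrated hardware, artifact rejection, and validated training protocols.

import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(window, fs, low, high):
    # Estimate power in a frequency band from one short EEG window via the FFT.
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].mean()

# Synthetic one-second 'EEG' window: a 10 Hz (alpha) rhythm plus noise.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(FS)

alpha = band_power(window, FS, 8, 12)    # relaxed-focus band
beta = band_power(window, FS, 13, 30)    # active-concentration band
print(f"alpha/beta ratio fed back to the trainee: {alpha / beta:.2f}")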

There is also the potential to change and improve Soldiers’ physical attributes. Scientists can develop drugs, specific dietary plans, and potentially use genetic editing to improve speed, strength, agility, and endurance.

Source: Andrew Herr, CEO Helicase

In order to fully leverage the capability of human performance enhancement, Andrew Herr, CEO of Helicase and an Adjunct Fellow at CNAS, suggested that human performance R&D be moved out of the medical field and become its own research area due to its differing objectives and the convergence between varying technologies.

Soldiers, Airmen, Marines, and Sailors are already trying to enhance themselves with commercial products – often containing unknown or unsafe ingredients – so it is incumbent on the U.S. military to, at the very least, help those who want to improve.

However, a host of new vulnerabilities, at the genetic level, accompany this revolutionary leap in human evolution. If the human genome can be mapped and the brain more thoroughly scanned and understood, then genomes and brains can be targeted in the same ways. Soldiers could become incredibly vulnerable at the genomic level, forcing the Army not only to protect Soldiers using body armor and armored vehicles, but also to protect their identities, genomes, and physiologies.

Adversaries will exploit all biological enhancements to gain competitive advantage over U.S. forces. Targeted genome editing technology such as CRISPR will enable adversaries to employ super-empowered Soldiers on the battlefield and to target specific populations with bioweapons. U.S. adversaries may use these technologies recklessly to achieve short-term gains with no consideration of long-range effects. [4] [5]

Soldier enhancement raises numerous ethical questions, such as the moral acceptability of the Army making permanent enhancements to Soldiers, the responsibility for returning transitioning Soldiers to a “baseline human,” and how a “baseline human” is legally defined.

Transhumanism H+ symbol by Antonu / Source:  https://commons.wikimedia.org/wiki/File:Transhumanism_h%2B.svg

By altering, enhancing, and augmenting the biology of the human Soldier, the United States Army will potentially enter uncharted ethical territory. Instead of issuing items to Soldiers to complement their physical and cognitive assets, by 2050, the U.S. Army may have the will and the means to issue them increased biological abilities in those areas. The future implications and the limits or thresholds for enhancement have not yet been considered. The military is already willing to correct the vision of certain members – laser eye surgery, for example – a practice that could accurately be referred to as human enhancement, so clearly defining where the threshold lies will be important. It is already known that other countries, and possible adversaries, are willing to cross lines that we are not. Russia, most recently, was banned from competition in the 2018 Winter Olympics for widespread performance-enhancing drug violations that were believed to be supported by the Russian Government. [6] Those drugs violate the spirit of competition in the Olympics, but no such spirit exists in warfare.

Another consideration is whether or not the Soldier enhancements are permanent. By enhancing Soldiers’ faculties, the Army is, in fact, enhancing their lethality or their ability to defeat the enemy. What happens with these enhancements — whether the Army can or should remove them — when a Soldier leaves the Army is an open question. As stated previously, the Army is willing and able to improve eyesight, but does not revert that eyesight back to its original state after the individual has separated. Some possible moral questions surrounding Soldier enhancement include:

• If the Army were to increase a Soldier’s stamina, visual acuity, resistance to disease, and pain tolerance, making them a more lethal warfighter, is it incumbent upon the Army to remove those enhancements?

• If the Soldier later used those enhancements in civilian life for nefarious purposes, would the Army be responsible?

Answers to these legal questions are beyond the scope of this paper, but they should be considered now, before these new technologies become widespread.

Image by Leonardo da Vinci / Source: Flickr

If the Army decides to reverse certain Soldier enhancements, it likely will need to define a “baseline human.” This would establish norms for which features, traits, and abilities can be permanently enhanced and which must be removed before leaving service. This would undoubtedly involve both legal and moral challenges.

 

The complete Mad Scientist Bio Convergence and Soldier 2050 Final Report can be read here.

To learn more about the ramifications of Soldier enhancement, please go to:

– Dr. Amy Kruse’s Human 2.0 podcast, hosted by our colleagues at Modern War Institute.

– The Ethics and the Future of War panel discussion, facilitated by LTG Jim Dubik (USA-Ret.) from Day 2 (26 July 2017) of the Mad Scientist Visualizing Multi Domain Battle in 2030-2050 Conference at Georgetown University.


[1] Ahmad, Zarah and Stephanie Larson, “The DNA Utility in Military Environments,” slide 5, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[2] Kruse, Amy, “Human 2.0 Upgrading Human Performance,” Slide 12, presented at Mad Scientist Bio Convergence and the Soldier 2050 Conference, 8 March 2018.
[3]https://www.frontiersin.org/articles/10.3389/fnhum.2016.00034/full
[4] https://www.technologyreview.com/the-download/610034/china-is-already-gene-editing-a-lot-of-humans/
[5] https://www.c4isrnet.com/unmanned/2018/05/07/russia-confirms-its-armed-robot-tank-was-in-syria/
[6] https://www.washingtonpost.com/sports/russia-banned-from-2018-olympics-following-doping-allegations/2017/12/05/9ab49790-d9d4-11e7-b859-fb0995360725_story.html?noredirect=on&utm_term=.d12db68f42d1

82. Bias and Machine Learning

[Editor’s Note:  Today’s post poses four central questions to our Mad Scientist community of action regarding bias in machine learning and the associated ramifications for artificial intelligence, autonomy, lethality, and decision-making on future warfighting.]

“We thought that we had the answers, it was the questions we had wrong” – Bono, U2

Source: www.vpnsrus.com via flickr

As machine learning and deep learning algorithms become more commonplace, it is clear that the utopian ideal of a bias-neutral Artificial Intelligence (AI) is just that: an ideal. These algorithms have underlying biases embedded in their coding, imparted by their human programmers (either consciously or unconsciously), and they can develop further biases during the machine learning and training process.  Dr. Tolga Bolukbasi, Boston University, recently described algorithms as incapable of distinguishing right from wrong, unlike humans, who can judge their actions even when they act against ethical norms. For algorithms, data is the ultimate determining factor.
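A toy example makes the point that the data, not the programmer’s intent, ultimately drives the outcome. The sketch below trains a tiny logistic-regression model on synthetic “historical decisions” in which members of one group were approved less often despite identical qualifications; the model faithfully learns and reproduces that disparity. The groups, features, and rates are invented purely for illustration.

import numpy as np
from numpy.random import default_rng

rng = default_rng(0)

def make_data(n, group, qualified_rate, approval_rate):
    # Synthetic "historical decisions": qualification is identical across groups,
    # but qualified members of one group were approved less often in the past.
    qualified = rng.random(n) < qualified_rate
    approved = qualified & (rng.random(n) < approval_rate)
    features = np.column_stack([qualified.astype(float), np.full(n, float(group))])
    return features, approved.astype(float)

# Group 0 was approved 90% of the time when qualified; group 1 only 50%.
X0, y0 = make_data(5000, group=0, qualified_rate=0.6, approval_rate=0.9)
X1, y1 = make_data(5000, group=1, qualified_rate=0.6, approval_rate=0.5)
X, y = np.vstack([X0, X1]), np.concatenate([y0, y1])

# Tiny logistic regression fit by gradient descent on the biased history.
Xb = np.column_stack([X, np.ones(len(X))])   # add a bias column
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / len(y)

def score(qualified, group):
    # Model's predicted approval probability for a single applicant.
    return 1.0 / (1.0 + np.exp(-(np.array([qualified, group, 1.0]) @ w)))

# Identical, fully qualified applicants now receive different scores by group.
print(f"approval score, qualified member of group 0: {score(1.0, 0.0):.2f}")
print(f"approval score, qualified member of group 1: {score(1.0, 1.0):.2f}")

Nothing in this code is malicious, and no human ever weighted the group variable; the skew enters entirely through the training data, which is the sense in which “data is the ultimate determining factor.”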

Realizing that algorithms supporting future Intelligence, Surveillance, and Reconnaissance (ISR) networks and Commander’s decision support aids will have inherent biases — what is the impact on future warfighting? This question is exceptionally relevant as Soldiers and Leaders consider the influence of biases in man-machine relationships, and their potential ramifications on the battlefield, especially with regard to the rules of engagement (i.e., mission execution and combat efficiency versus the proportional use of force and minimizing civilian casualties and collateral damage).

“It is difficult to make predictions, particularly about the future.” This quote has been attributed to anyone ranging from Mark Twain to Niels Bohr to Yogi Berra. Point prediction is a sucker’s bet. However, asking the right questions about biases in AI is incredibly important.

The Mad Scientist Initiative has developed a series of questions to help frame the discussion regarding what biases we are willing to accept and in what cases they will be acceptable. Feel free to share your observations and questions in the comments section of this blog post (below) or email them to us at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil.

1) What types of bias are we willing to accept? Will a so-called cognitive bias that forgoes a logical, deliberative process be allowable? What about a programming bias that discriminates against specific genders, ethnicities, races, or even ages?

2) In what types of systems will we accept biases? Will machine learning applications in supposedly non-lethal warfighting functions like sustainment, protection, and intelligence be given more leeway with regards to bias?

3) Will the biases in machine learning programming and algorithms be more apparent and/or outweigh the inherent biases of humans-in-the-loop? How will perceived biases affect trust and reliance on machine learning applications?

4) At what point will the pace of innovation and introduction of this technology on the battlefield by our adversaries cause us to forego concerns of bias and rapidly field systems to gain a decisive Observe, Orient, Decide, and Act (OODA) loop and combat speed advantage on the Hyperactive Battlefield?

For additional information relevant to this important discussion, please see the following:

An Appropriate Level of Trust… blog post

Ethical Dilemmas of Future Warfare blog post

Ethics and the Future of War panel discussion video

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, the Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial — “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus of opinion was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game changing technologies. Unlike Nuclear, Biological, and Chemical weapons, no internationally agreed and implemented control protocols can be applied to AI, as there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code; a digital will-o’-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operation (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC with its Military-Civil Fusion Centers, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. This conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and practitioners develop fair and privacy-preserving data analyses. Like ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commander’s decisions) – especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, code the very programming that will enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases.  Read the full report here.
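One practical way to “verify and understand” such outputs before fielding them is to audit a model’s decisions with simple group fairness metrics. The sketch below computes per-group selection rates and the resulting disparate-impact ratio on made-up predictions; the 0.80 threshold is the familiar “four-fifths” rule of thumb from U.S. employment guidance, used here only as an illustrative benchmark, not as a proposed military standard.

import numpy as np

def disparate_impact(predictions, groups):
    # Ratio of the lowest group selection rate to the highest.
    # predictions: 0/1 model decisions; groups: group label for each record.
    rates = {g: predictions[groups == g].mean() for g in np.unique(groups)}
    return min(rates.values()) / max(rates.values()), rates

# Made-up audit data: one model decision per applicant, two groups.
preds = np.array([1, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0])
grps = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1])

ratio, rates = disparate_impact(preds, grps)
print("selection rate per group:", rates)
print(f"disparate impact ratio: {ratio:.2f} (flag for human review if below 0.80)")

Checks like this do not make a system fair by themselves, but they turn a vague worry about bias into a number a commander, tester, or regulator can interrogate.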

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down. The participants’ actions were then recorded. Unexpectedly, a large number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant. It shows, beyond the unexpected result, that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if they mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and augmented heavily by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought computing and communication ever deeper into global households, the smartphone revolution brought about constant personal interconnectivity. Today and into the future, not only are humans connected to the global commons via their smart devices, but a multitude of devices, vehicles, and accessories are being integrated into the Internet of Things (IoT). We have previously addressed the IoT as a game changing technology. The IoT is composed of trillions of internet-linked items, creating opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (i.e., appliances, home devices [such as Google Home, Alexa, etc.], various sensors) to approximately $520 billion. This is a massive investment into what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it is likely that it will expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data, allowing service members to use personal devices, and maintaining operations and network security and integrity (see the recent banning of personal fitness trackers). A segment of the IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” growing ever more likely, intelligence officials warn of the rising danger of cyber attacks. The effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments as they act in the undefined gray zone between peace and all-out war. The military implications are quite clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence communications and materiel and personnel networks. Besides the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others are reluctant to invest in cyber protection measures. In this way, civilians become affected by acts of cyber warfare, and attacks on a country may not be directed at the opposing military, but at the civilian population of a state, as in the case of power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to most effectively combat cyber threats.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

79. Character vs. Nature of Warfare: What We Can Learn (Again) from Clausewitz

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following post by guest blogger LTC Rob Taber, U.S. Army Training and Doctrine Command (TRADOC) G-2 Futures Directorate, clarifying the often confused character and nature of warfare, and addressing their respective mutability.]

No one is arguing that warfare is not changing. Where people disagree, however, is whether the nature of warfare, the character of warfare, or both are changing.

Source:  Office of the Director of National Intelligence

Take, for example, the National Intelligence Council’s assertion in “Global Trends: Paradox of Progress.” They state, “The nature of conflict is changing. The risk of conflict will increase due to diverging interests among major powers, an expanding terror threat, continued instability in weak states, and the spread of lethal, disruptive technologies. Disrupting societies will become more common, with long-range precision weapons, cyber, and robotic systems to target infrastructure from afar, and more accessible technology to create weapons of mass destruction.”[I]

Additionally, Brad D. Williams, in an introduction to an interview he conducted with Amir Husain, asserts, “Generals and military theorists have sought to characterize the nature of war for millennia, and for long periods of time, warfare doesn’t dramatically change. But, occasionally, new methods for conducting war cause a fundamental reconsideration of its very nature and implications.”[II] Williams then cites “cavalry, the rifled musket and Blitzkrieg as three historical examples”[III] from Husain and General John R. Allen’s (ret.) article, “On Hyperwar.”

Unfortunately, the NIC and Mr. Williams miss the reality that the nature of war is not changing, and it is unlikely to ever change. While these authors may have simply interchanged “nature” when they meant “character,” it is important to be clear on the difference between the two and the implications for the military. To put it more succinctly, words have meaning.

The nature of something is the basic make up of that thing. It is, at core, what that “thing” is. The character of something is the combination of all the different parts and pieces that make up that thing. In the context of warfare, it is useful to ask every doctrine writer’s personal hero, Carl Von Clausewitz, what his views are on the matter.

Source: Tetsell’s Blog. https://tetsell.wordpress.com/2014/10/13/clausewitz/

He argues that war is “subjective,”[IV] “an act of policy,”[V] and “a pulsation of violence.”[VI] Put another way, the nature of war is chaotic, inherently political, and violent. Clausewitz then states that despite war’s “colorful resemblance to a game of chance, all the vicissitudes of its passion, courage, imagination, and enthusiasm it includes are merely its special characteristics.”[VII] In other words, all changes in warfare are those smaller pieces that evolve and interact to make up the character of war.

The argument that artificial intelligence (AI) and other technologies will enable military commanders to have “a qualitatively unsurpassed level of situational awareness and understanding heretofore unavailable to strategic commander[s]”[VIII] is a grand claim, but one that has been made many times in the past, and remains unfulfilled. The chaos of war, its fog, friction, and chance will likely never be deciphered, regardless of what technology we throw at it. While it is certain that AI-enabled technologies will be able to gather, assess, and deliver heretofore unimaginable amounts of data, these technologies will remain vulnerable to age-old practices of denial, deception, and camouflage.

 

The enemy gets a vote, and in this case, the enemy also gets to play with their AI-enabled technologies that are doing their best to provide decision advantage over us. The information sphere in war will be more cluttered and more confusing than ever.

Regardless of the tools of warfare, be they robotic, autonomous, and/or AI-enabled, they remain tools. And while they will be the primary tools of the warfighter, the decision to enable the warfighter to employ those tools will, more often than not, come from political leaders bent on achieving a certain goal with military force.

Drone Wars are Coming / Source: USNI Proceedings, July 2017, Vol. 143 / 7 /  1,373

Finally, the violence of warfare will not change. Certainly robotics and autonomy will enable machines that can think and operate without humans in the loop. Imagine the future in which the unmanned bomber gets blown out of the sky by the AI-enabled directed energy integrated air defense network. That’s still violence. There are still explosions and kinetic energy with the potential for collateral damage to humans, both combatants and civilians.

Source: Lockheed Martin

Not to mention the bomber carried a payload meant to destroy something in the first place. A military force, at its core, will always carry the mission to kill things and break stuff. What will be different is what tools they use to execute that mission.

To learn more about the changing character of warfare:

– Read the TRADOC G-2’s The Operational Environment and the Changing Character of Warfare paper.

– Watch The Changing Character of Future Warfare video.

Additionally, please note that the content from the Mad Scientist Learning in 2050 Conference at Georgetown University, 8-9 August 2018, is now posted and available for your review:

– Read the “Top Ten” Takeaways from the Learning in 2050 Conference.

– Watch videos of each of the conference presentations on the TRADOC G-2 Operational Environment (OE) Enterprise YouTube Channel here.

– Review the conference presentation slides (with links to the associated videos) on the Mad Scientist All Partners Access Network (APAN) site here.

LTC Rob Taber is currently the Deputy Director of the Futures Directorate within the TRADOC G-2. He is an Army Strategic Intelligence Officer and holds a Master of Science of Strategic Intelligence from the National Intelligence University. His operational assignments include 1st Infantry Division, United States European Command, and the Defense Intelligence Agency.

Note:  The featured graphic at the top of this post captures U.S. cavalrymen on General John J. Pershing’s Punitive Expedition into Mexico in 1916.  Less than two years later, the United States would find itself fully engaged in Europe in a mechanized First World War.  (Source:  Tom Laemlein / Armor Plate Press, courtesy of Neil Grant, The Lewis Gun, Osprey Publishing, 2014, page 19)

_______________________________________________________

[I] National Intelligence Council, “Global Trends: Paradox of Progress,” January 2017, https://www.dni.gov/files/documents/nic/GT-Full-Report.pdf, p. 6.
[II] Brad D. Williams, “Emerging ‘Hyperwar’ Signals ‘AI-Fueled, machine waged’ Future of Conflict,” Fifth Domain, August 7, 2017, https://www.fifthdomain.com/dod/2017/08/07/emerging-hyperwar-signals-ai-fueled-machine-waged-future-of-conflict/.
[III] Ibid.
[IV] Carl Von Clausewitz, On War, ed. Michael Howard and Peter Paret (Princeton: Princeton University Press, 1976), 85.
[V] Ibid, 87.
[VI] Ibid.
[VII] Ibid, 86.
[VIII] John Allen, Amir Hussain, “On Hyper-War,” Fortuna’s Corner, July 10, 2017, https://fortunascorner.com/2017/07/10/on-hyper-war-by-gen-ret-john-allenusmc-amir-hussain/.

72. First Salvo on “Learning in 2050” – Continuity and Change

[Editor’s Note: The U.S. Army Training and Doctrine Command (TRADOC) G-2 is co-hosting the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies on 8-9 August 2018 in Washington, DC.  In advance of this conference, Mad Scientist Laboratory is pleased to present today’s post addressing what is necessary to truly transform Learning in 2050 by returning guest blogger Mr. Nick Marsella.  Read Mr. Marsella’s previous two posts addressing Futures Work at Part I and Part II]

Only a handful of years ago, a conference on the topic of learning in 2050 would spur discussions on needed changes in the way we formally educate and train people to live successful lives and be productive citizens.[I] Advocates in K-12 would probably argue for increasing investment in schools, better technology, and increased STEM education. Higher educators would raise many of the same concerns, pointing to the value of the “the academy” and its universities as integral to the nation’s economic, security, and social well-being by preparing the nation’s future leaders, innovators, and scientists.

Yet, times have changed. “Learning in 2050” could easily address how education and training must meet the immediate learning needs of the individual and support “lifelong learning” in an ever-changing and competitive world.[II] The conference could also address how new discoveries in learning and the cognitive sciences will inform the education and training fields, and potentially enhance individual abilities to learn and think.[III] “Learning in 2050” could also focus on how organizational learning will be even more important than today – spelling the difference between bankruptcy and irrelevancy – or, for military forces, victory or defeat. We must also address how to teach people to learn and organize themselves for learning.[IV]

Lastly, a “Learning in 2050” conference could also focus on machine learning and how artificial intelligence will transform not only the workplace, but will also have a major impact on national security.[V] Aside from understanding the potential and limitations of this transformative technology, we must increasingly train and educate people on how to use it to their advantage and understand its limitations for effective “human – machine teaming.” We must also provide opportunities for individuals to use newly fielded technologies and to learn when and how to trust them.[VI]

All of these areas would provide rich discussions and perhaps new insights. But just as LTG (ret) H.R. McMaster warned us when thinking about the challenges of future warfare, we must first acknowledge the continuities for this broad topic of “Learning in 2050” and its implications for the U.S. Army.[VII] Until the Army is replaced by robots, or knowledge and skills can be uploaded directly into the brain as shown in “The Matrix,” learning will remain a human process involving the Army’s Soldiers and its civilian workforce [not discounting organizational or machine learning].

Source: U.S. Army https://www.army.mil/article/206197/army_researchers_looking_to_neurostimulation_to_enhance_accelerate_soldiers_abilities

While much may change in the way the individual will learn, we must recognize that the focus of “Learning in 2050” is on the learner and the systems, programs/schools, or technologies adopted in the future must support the learner. As Herbert Simon, one of the founders of cognitive science and a Nobel laureate noted: “Learning results from what the student does and thinks and only from what the student does and thinks. The teacher can advance learning only by influencing what the student does to learn.”[VIII] To the Army’s credit, the U.S. Army Learning Concept for Training and Education 2020-2040 vision supports this approach by immersing “Soldiers and Army civilians in a progressive, continuous, learner-centric, competency-based learning environment,” but the danger is we will be captured by technology, procedures, and discussions about the utility and need for “brick and mortar schools.”[IX]

Learning results from what the student does and thinks and only from what the student does and thinks.

Learning is a process that involves changing knowledge, belief, behavior, and attitudes and is entirely dependent on the learner as he/she interprets and responds to the learning experience – in and out of the classroom.[X] Our ideas, concepts, or recommendations to improve the future of learning in 2050 must do at least one of the following: improve student learning outcomes, improve learning efficiency by accelerating learning, or improve the student’s motivation and engagement to learn.

“Learning in 2050” must identify external environmental factors which will affect what the student may need to learn to respond to the future, and also recognize that the generation of 2050 will be different from today’s student in values, beliefs, attitudes, and acceptance of technology.[XI] Changes in the learning system must be ethical, affordable, and feasible. To support effective student learning, learning outcomes must be clearly defined – whether a student is participating in a yearlong professional education program or a five-day field training exercise – and must be understood by the learner.[XII]

We must think big. For example, Professor of Cognition and Education at Harvard’s Graduate School of Education, Howard Gardner postulated that to be successful in the 21st Century requires the development of the “disciplined mind, the synthesizing mind, the creative mind, the respectful mind, and the ethical mind.”[XIII]

Approaches, processes, and organization, along with the use of technology and other cognitive science tools, must focus on the learning process. Illustrated below is the typical officer career timeline with formal educational opportunities sprinkled throughout the years.[XIV] While some form of formal education in “brick and mortar” schools will continue, one wonders if we will turn this model on its head – with more upfront education; shorter focused professional education; more blended programs combining resident/non-resident instruction; and continual access to experts, courses, and knowledge selected by the individual for “on demand” learning. Today, we often use education as a reward for performance (i.e., resident PME); in the future, education must be a “right of the Profession,” equally provided to all (to include Army civilians) – necessary for performance as a member of the profession of arms.

Source: DA Pam 600-3, Commissioned Officer Professional Development and Career Management, December 2014, p.27

The role of the teacher will change. Instructors will become “learning coaches” who help the learner identify gaps and needs in meaningful and dynamic individual learning plans. Like the Army’s Master Fitness Trainer, who advises and monitors a unit’s physical readiness, we must create in our units “Master Learning Coaches,” not simply training specialists who manage the schedule and records. One can imagine technology evolving to do some of this as the Alexas and Siris of today become the AI tutors and mentors of the future. We must also remember that any system or process for learning in 2050 must fit the needs of multiple communities: Active Army, Army National Guard, and Army Reserve forces, as well as Army civilians.

Just as the delivery of instruction will change, the assessment of learning will change as well. Simulations and gaming should aim to provide an “Ender’s Game” experience, where reality and simulation are indistinguishable. Training systems should enable individuals to practice repeatedly and, as Vince Lombardi noted, “Practice does not make perfect. Perfect practice makes perfect.” Experiential learning will reinforce classroom instruction, on-line instruction, or short intensive courses/seminars through the linkage of “classroom seat time” and “field time” at the Combat Training Centers, Warfighter, or other exercises or experiences.

Tell me and I forget; teach me and I may remember; involve me and I learn.  Benjamin Franklin[XV]

Of course, much will have to change in terms of policies and the way we think about education, training, and learning. If one moves back in time the same number of years that we are looking to the future – it is the year 1984. How much has changed since then?

In some ways, technology has transformed the learning process – e.g., typewriters to laptops; card catalogues to instant on-line access to the world’s literature from anywhere; and classes at brick and mortar schools to Massive Open Online Courses (MOOCs) and blended and on-line learning with Blackboard. Yet, as Mark Twain reportedly noted, “history doesn’t repeat itself – it rhymes,” and some things look the same as they did in 1984, with lectures and passive learning in large lecture halls, just as PowerPoint lectures are ongoing today for some passively undergoing PME.

If “Learning in 2050” is to be truly transformative – we must think differently. We must move beyond the industrial age approach of mass education with its caste systems and allocation of seats. To be successful in the future, we must recognize that our efforts must center on the learner to provide immediate access to knowledge to learn in time to be of value.

Nick Marsella is a retired Army Colonel and is currently a Department of the Army civilian serving as the Devil’s Advocate/Red Team for Training and Doctrine Command.

___________________________________________________________________

[I] While the terms “education” and “training” are often used interchangeably, I will use the oft quoted rule – training is about skills in order to do a job or perform a task, while education is broader in terms of instilling general competencies and to deal with the unexpected.

[II] The noted futurist Alvin Toffler is often quoted noting: “The illiterate of the 21st Century are not those who cannot read and write but those who cannot learn, unlearn, and relearn.”

[III] Sheftick, G. (2018, May 18). Army researchers look to neurostimulation to enhance, accelerate Soldier’s abilities. Retrieved from: https://www.army.mil/article/206197/army_researchers_looking_to_neurostimulation_to_enhance_accelerate_soldiers_abilities

[IV] This will become increasing important as the useful shelf life of knowledge is shortening. See Zao-Sanders, M. (2017). A 2×2 matrix to help you prioritize the skills to learn right now. Harvard Business Review. Retrieved from: https://hbr.org/2017/09/a-2×2-matrix-to-help-you-prioritize-the-skills-to-learn-right-now  — so much to learn, so little time.

[V] Much has been written on AI and its implications. One of the most recent and interesting papers was recently released by the Center for New American Security in June 2018. See: Scharre, P. & Horowitz, M.C. (2018). Artificial Intelligence: What every policymaker needs to know. Retrieved from: https://www.cnas.org/publications/reports/artificial-intelligence-what-every-policymaker-needs-to-know
For those wanting further details and potential insights see: Executive Office of the President, National Science and Technology Council, Committee on Technology Report, Preparing for the Future of Artificial Intelligence, October 2016.

[VI] Based on my anecdotal experiences, complicated systems, such as those found in command and control, have been fielded to units without sufficient training. Even when fielded with training, unless in combat, proficiency using the systems quickly lapses. See: Mission Command Digital Master Gunner, May 17, 2016, retrieved from https://www.army.mil/standto/archive_2016-05-17. See Freedberg, S. Jr. Artificial Stupidity: Fumbling the Handoff from AI to Human Control. Breaking Defense. Retrieved from: https://breakingdefense.com/2017/06/artificial-stupidity-fumbling-the-handoff/

[VII] McMaster, H.R. (LTG) (2015). Continuity and Change: The Army Operating Concept and Clear Thinking about Future War. Military Review.

[VIII] Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C. & Norman, M. K. (2010). How learning works: 7 research-based principles for smart teaching. San Francisco, CA: Jossey-Bass, p. 1.

[IX] U.S. Army Training and Doctrine Command. TRADOC Pamphlet 525-8-2. The U.S. Army Learning Concept for Training and Education 2020-2040.

[X] Ambrose, et al., p.3.

[XI] For example, should machine language be learned as a foreign language in lieu of a traditional foreign language (e.g., Spanish) – given the development of automated machine language translators (AKA = the Universal Translator)?

[XII] The point here is that we must clearly understand what we want the learner to learn, adequately define it, and ensure the learner knows what the outcomes are. For example, we continually espouse that we want leaders to be critical thinkers, but I challenge the reader to find the definitive definition and expected attributes of a critical thinker, given that ADRP 6-22, Army Leadership, FM 6-22 Army Leadership, and ADRP 5 and 6 describe it differently. At a recent higher education conference of leaders, administrators, and selected faculty, one member succinctly put it this way to highlight the importance of students’ understanding expected learning outcomes: “Teaching students without providing them with learning outcomes is like giving a 500 piece puzzle without an image of what they’re assembling.”

[XIII] Gardner, H. (2008). Five Minds for the Future. Boston, MA: Harvard Business Press. For application of Gardner’s premise see Marsella, N.R. (2017). Reframing the Human Dimension: Gardner’s “Five Minds for the Future.” Journal of Military Learning. Retrieved from: https://www.armyupress.army.mil/Journals/Journal-of-Military-Learning/Journal-of-Military-Learning-Archives/April-2017-Edition/Reframing-the-Human-Dimension/

[XIV] Officer education may differ due to a variety of factors, but the normal progression for Professional Military Education includes: Basic Officer Leader Course (BOLC B, to include ROTC/USMA/OCS which is BOLC A); Captains Career Course; Intermediate Level Education (ILE) and Senior Service College, as well as specialty training (e.g., language school), graduate school, and Joint schools. Extracted from the previous edition of DA Pam 600-3, Commissioned Officer Professional Development and Career Management, December 2014, p. 27, which is now obsolete. The graphic is provided as an example. For current policy, see DA PAM 600-3, dated 26 June 2017.

[XV] See https://blogs.darden.virginia.edu/brunerblog/

69. Demons in the Tall Grass

[Editor’s Note:  Mad Scientist is pleased to present Mr. Mike Matson‘s guest blog post set in 2037 — pitting the defending Angolan 6th Mechanized Brigade with Russian advisors and mercenaries against a Namibian Special Forces incursion supported by South African National Defence Force (SANDF) Special Operators.  Both sides employ autonomous combat systems, albeit very differently — Enjoy!]

Preface:  This story was inspired by two events. First, Boston Dynamics over the last year had released a series of short videos of their humanoid and animal-inspired robots which had generated a strong visceral Internet reaction. Elon Musk had commented about one video that they would “in a few years… move so fast you’ll need a strobe light to see it.” That visual stuck with me and I was looking for an opportunity to expand on that image.

The second event was a recent trip to the Grand Tetons. I had a black bear rise up out of an otherwise empty meadow less than 50 meters away. A 200-kilo predator which can run at 60kph and yet remain invisible in high grass left a strong impression. And while I didn’t see any gray wolves, a guide discussed how some of the packs, composed of groups of 45-kilogram sized animals, had learned how to take down 700-kilogram bison. I visualized packs of speeding robotic wolves with bear-sized robots following behind.

I used these events as the genesis to explore a completely different approach to designing and employing unmanned ground combat vehicles (GCVs). Instead of the Russian crewless, traditional-styled armored vehicles, I approached GCVs from the standpoint of South Africa, which may not have the same resources as Russia, but has an innovative defense industry. If starting from scratch, how might their designs diverge? What could they do with fewer resources? And how would these designs match up to “traditional” GCVs?

To find out what would happen, I pitted an Angolan mechanized brigade outfitted with Russian GCVs against South African special forces armed with a top secret indigenous GCV program. The setting is southern Angola in 2037, and there are Demons in the Tall Grass. As Mr. Musk said in his Tweet, sweet dreams!  Mike Matson

 

Source: Google Maps

(2230Z 25 May 2037) Savate, Angola

Paulo crouched in his slit trench with his squad mates.  He knew this was something other than an exercise.  The entire Angolan 6th Mechanized Brigade had road marched south to Savate, about 60 kilometers from the Namibian border. There, they were ordered to dig fighting positions and issued live ammunition.

Everyone was nervous. Thirty minutes before, one of their patrols a kilometer south of them had made contact.  A company had gone out in support and a massive firefight had ensued. A panicked officer could be heard on the net calling in artillery on their own position because they were being attacked by demons in the tall grass. Nobody had yet returned.

A pair of Uran-9s, line abreast; Source: RussianDefence.com / Lex Kitaev

Behind Paulo, the battalion commander came forward. With him were three Russian mercenaries.  Paulo knew the Russians had brought along two companies of robot tanks. The robot tanks sported an impressively large number of guns, missiles and lasers. Two of them had deployed with the quick reaction force.  Explosions suggested that they had been destroyed.

Paulo watched the Angolan officer carefully. Suddenly there was a screamed warning from down the trenches.  He whipped around and saw forms in the tall grass moving towards the trenches at a high rate of speed, spread out across his entire front. A dozen or more speeding lines headed directly towards the trenches like fish swimming just under the water.

“Fire!” Paulo ordered and started shooting, properly squeezing off three round bursts. The lines kept coming. Paulo had strobe light-like glimpses of bounding animals. Just before they burst from cover, piercingly loud hyena cries filled the night.  Paulo slammed his hand on the nearby clacker to detonate the directional mines to his front. The world exploded in noise and dust.

(Earlier That Morning) 25 Kilometers south of Savate

Captain Verlin Ellis, Bravo Group, SANDF, crouched with his NCO, his soldiers, and his Namibian SF counterpart at dawn under a tree surrounded by thick green bush.

“Listen up everyone, the operation is a go. Intelligence shows the brigade in a holding position south of Savate. We are to conduct a recon north until we can fix their position. Alpha and Charlie groups will be working their way up the left side. Charlie will hit their right flank with their predator package at the same time we attack from the south and Alpha will be the stopper group with the third group north of town. Once we have them located, we are to hold until nightfall, then attack.”

The tarps came off Bravo Group’s trucks and the men got to work unloading.

Source: BigDog / DeviantArt

First off were Bravo Group’s attack force of forty hyenas. Standing just under two feet high on their articulated legs, and weighing roughly 40 kilos, the small robots were off-loaded and their integrated solar panels were unfolded to top off their battery charges.

The hyenas operated in pack formations via an encrypted mesh network. While they could be directed by human operators if needed and could send and receive data via satellite or drone relay, they were designed to operate in total autonomy at ranges up to 40 kilometers from their handlers.

Each hyena had a swiveling front section like a head with four sensors and a small speaker. The sensors were a camera, a separate thermal camera, a range finder, and a laser designator/pointer. Built into the hump of the hyena’s back was a fixed rifle barrel in a bullpup configuration, chambered in 5.56mm, which fired in three-round bursts.

On each side was a pre-loaded 40mm double-tube grenade launcher. The guided, low-velocity grenades could be launched forward between 25 and 150 meters. The hyenas were loaded with a mix of HE, CS gas, HEAT, and thermite grenades. They could select targets themselves or have another hyena or human operator designate a target, in which case they were also capable of non-line-of-sight attacks. The attack variants each carried a five-kilo shaped-charge limpet mine for attaching to vehicles. There were 24 attack hyenas.

Source: Fausto De Martini / Kill Command

Second off came the buffalos, the heavy weapons support element. There were six of the 350-kilo beasts. They were roughly the same size as a water buffalo, hence their name. They retained the same basic head sensor suite as the hyenas and stood on larger, sturdier versions of the hyenas’ legs.

Three of them mounted an 81mm auto-loading mortar, and on their backs were 10 concave docking stations, each holding a three-ounce helicopter drone called a sparrow. The drone had a ten-minute flight endurance on its tiny motor. One ounce of the drone was plastic explosive. Each had a simple optical sensor and was designed to land and detonate on anything matching its picture-recognition algorithms, such as ammo crates, fuel cans, or engine hoods.

The fourth buffalo sported a small, sleek turret on its flat back mounting a 12.7mm machine gun, and it carried 500 rounds of armor-piercing tracer.

The fifth buffalo held an automatic grenade launcher with 200 smart rounds in a similar turret to the 12.7mm gun. The grenades were programmed as they fired and could detonate over trenches or beyond obstacles to hit men behind cover.

The sixth carried three anti-tank missiles in a telescoping turret. Like the mortars, their fire could be directed by hyenas, human operators, or self-directed.

Source: KhezuG / Deviantart.com

Once the hyenas and buffalos were charging, the last truck was carefully unloaded. Off came the boars — suicide bombs on legs. Each of the 15 machines was short, with stubbier legs for stability. Their outer shells were composed of pre-scarred metal and were overlaid with a layer of small steel balls for enhanced shrapnel. Inside they packed 75 kilos of high explosive. For tonight’s mission, each boar was loaded with different sounds to blare from its speakers, with choices ranging from Zulu war cries to lion roars to AC/DC’s Thunderstruck. Chaos was their primary mission.

Between the three Recce groups, nine machines failed warmup. That left 180 fully autonomous and cooperative war machines to hunt the 1,200-strong Angolan 6th Mechanized Brigade.

(One Hour after Attack Began) Savate

Paulo and his team advanced, following spoor through the bush.  The anti-tank team begged to go back but Paulo refused.

Suddenly a slight gap opened in the tall grass just as something in front of them, on the far side of a clearing, fired. It looked like a giant metal rhino with an automatic grenade launcher on its back. It fired a burst, then sat down on its haunches to hide.

So that’s why I can’t see them after they fire. Very clever, thought Paulo. He tried calling in fire support but all channels were jammed.

Paulo signaled with his hands for both gunners to shoot. The range was almost too close. Both gunners fired at the same time, striking the beast. It exploded with surprising fury, blowing them all off their feet and lighting up the sky. They lay there stunned as debris pitter-pattered in the dirt around them.

That was enough for Paulo and the men. They headed back to the safety of the trenches.

As they returned, eight armored vehicles appeared. On the left was an Angolan T-72 tank and three Russian robot tanks. On the right there was a BMP-4 and three more Russian robot tanks.

An animal-machine was trotting close to the vegetation outside the trenches when the laser on one of the Russian tanks swiveled and fired, emitting a loud hum. The animal-machine was cut in two. The tanks stopped near the trench to shoot at unseen targets in the dark as Paulo entered the trenches.

The hyena yipping increased in volume as the predators began to swarm around the armored force. Five or six circled the perimeter, yipping and firing grenades. Two others crept under some bushes 70 meters to Paulo’s right and lay down like dogs. A long, thin antenna with some small device on top rose out of the back of one of them. The tanks fired furiously at the fleeting targets circling them.

Mortar rounds burst around the armor, striking a Russian tank on the thin turret top, destroying it.

From a new direction, the ghost machine gun struck a Russian robot tank with a dozen exploding armor-piercing rounds. The turret was pounded and the externally mounted rockets were hit, bouncing the tank in place from the explosions. A robot tank popped smoke, instantly covering the entire armored force in a blinding white cloud which only added to the chaos. Suddenly the Russian turrets all stopped firing just as a third robot tank was hit by armor-piercing rounds in the treads and disabled.

Silent Ruin;  Source: Army Cyber Institute at West Point / Don Hudson & Kinsun Lo

If you enjoyed this blog post, read “Demons in the Tall Grass” in its entirety here, published by our colleagues at Small Wars Journal.

Mike Matson is a writer in Louisville, Kentucky, with a deep interest in national security and cyber matters. His writing focuses on military and intelligence-oriented science fiction. He has two previous articles published by Mad Scientist: the non-fiction “Complex Cyber Terrain in Hyper-Connected Urban Areas,” and the fictional story, “Gods of Olympus.”  In addition to Louisville, Kentucky, and Washington, DC, he has lived, studied, and worked in Brussels, Belgium, and Tallinn, Estonia. He holds a B.A. in International Studies from The American University and an M.S. in Strategic Intelligence from the National Intelligence University, both in Washington, DC. He can be found on Twitter at @Mike40245.

61. Base in a Box

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following guest blog post by Mr. Lewis Jones. Originally a “Letter Home” submission to the Call for Ideas associated with the Mad Scientist Installations of the Future Conference (see more information about this event at the end of this post), we hope that you will enjoy Mr. Jones’ vision of a mid-Twenty First Century forward deployed base.]

Hey Dad, guess who got new PCS orders!  From March 2042 I’ll be assigned to Joint Base Harris in Japan.  You spent your early career in Japan, right?  I’ll never forget your stories about Camp Zama, a sprawling installation housing hundreds of soldiers and civilians. I  used to love hearing about the 2020s, when enemy sensors, drones, and artificial intelligence first wreaked havoc on operations there.

Source: John Lamb/The Image Bank/Getty Images

Remember the Garrison commander whose face was 3D-scanned by a rigged vending machine near the gate? The enemy released that humiliating video right before a major bilateral operation. By the time we proved it was fake, our partners had already withdrawn.

What about the incident at the intel battalion’s favorite TDY hotel with a pool-side storage safe? Soldiers went swimming and tossed their wallets into the safe, unaware that an embedded scanner would clone their SIPR tokens. To make matters worse, the soldiers secured the safe with a four-digit code… using the same numbers as their token PIN.

Source: CNN

Oh, and remember the Prankenstein A.I. attack? It scanned social media to identify Army personnel living off-base, then called local law enforcement with fake complaints. The computer-generated voice was very convincing, even giving physical descriptions based on soldiers’ actual photos. You said that one soured host-nation relations for years!

Or the drones that hovered over Camp Zama, broadcasting fake Wi-Fi hotspots. The enemy scooped up so much intelligence and — ah, you get the picture. Overseas bases were so vulnerable back then.


Well, the S1 sent me a virtual tour and the new base is completely different. When U.S. Forces Japan rebuilt its installations, those wide open bases were replaced by miniature, self-contained fortresses. Joint Base Harris, for example, was built inside a refurbished shopping mall: an entire installation, compressed into a single building!

Source: The Cinephile Gardener

Here’s what I saw on my virtual tour:

Source: Gizmodo UK

  • The roof has solar panels and battery banks for independent power. There’s also an enormous greenhouse, launch pads for drones and helos, and a running trail.

  • The ground level contains a water plant that extracts and purifies groundwater, along with indoor hydroponic farms. Special filtration units scrub the air; they’re even rated against CBRN threats.

Source: tandemnsi.com

  • What was once a multi-floor parking garage is now a motor pool, firing range, and fitness complex. The gym walls are smart-screens, so you can work out in a different environment every day.

  • Communications are encrypted and routed through a satellite uplink. The base even has its own cellphone tower. Special mesh in the walls prevents anybody outside from eavesdropping on emissions — the entire base is a SCIF.

Source: fortune.com

  • The mall’s shops and food court were replaced by all the features and functions of a normal base: nearly 2,000 Army, Air, and Cyber Force troops living, working, and training inside. They even have a kitchen-bot in the chow hall that can produce seven custom meals per minute!

  • Supposedly, the base extends several floors underground, but the tour didn’t show that. I guess that’s where the really secret stuff happens.

Source: Gizmodo Australia

By the way, don’t worry about me feeling cooped up:  Soldiers are assigned top-notch VR specs during in-processing.  During the duty day, they’re only for training simulations. Once you’re off, personal use is authorized. I’ll be able to play virtual games, take virtual tours… MWR even lets you link with telepresence robots to “visit” family back home.

The sealed, self-contained footprint of this new base is far easier to defend in today’s high-tech threat environment. Some guys complain about being stuck inside, but you know what I think? If Navy sailors can spend months at sea in self-contained bases, then there’s no reason the Army can’t do the same on land!

Love,
Your Daughter

 

If you were intrigued by this vision of a future Army installation, please plan on joining us virtually at the Mad Scientist Installations of the Future Conference, co-sponsored by the Office of the Assistant Secretary of the Army for Installations, Energy and Environment (OASA (IE&E)); Georgia Tech Research Institute (GTRI); and Headquarters, U.S. Army Training and Doctrine Command (TRADOC),  at GTRI in Atlanta, Georgia, on 19-20 June 2018.  Click here to learn more about the conference and then participate in the live-streamed proceedings, starting at 0830 EDT on 19 June 2018.

Lewis Jones is an Army civilian with nearly 15 years of experience in the Indo-Pacific region. In addition to his Japanese and Chinese language studies, he has earned a Master’s in Diplomacy and International Conflict Management from Norwich University. He has worked as a headhunter for multinational investment banks in Tokyo and as a business intelligence analyst for a DOD contractor, and he has supported the Army with cybersecurity program management and contract administration. Lewis writes about geopolitics, international relations, U.S. national security, and the effects of rapid advances in technology.

60. Mission Engineering and Prototype Warfare: Operationalizing Technology Faster to Stay Ahead of the Threat

[Editor’s Note: Mad Scientist is pleased to present the following post by a team of guest bloggers from The Strategic Cohort at the U.S. Army Tank Automotive Research, Development, and Engineering Center (TARDEC). Their post lays out a clear and cogent approach to Army modernization, in keeping with the Chief of Staff of the Army GEN Mark A. Milley’s and Secretary of the Army Mark T. Esper’s guidance “to focus the Army’s efforts on delivering the weapons, combat vehicles, sustainment systems, and equipment that Soldiers need when they need it” and making “our Soldiers more effective and our units less logistically dependent.” — The Army Vision,  06 June 2018 ]

 

 

“Success no longer goes to the country that develops a new fighting technology first, but rather to the one that better integrates it and adapts its way of fighting….” The National Defense Strategy (2018).

 

 

Executive Summary
While Futures Command and legislative changes streamline acquisition bureaucracy, the Army will still struggle to keep pace with the global commercial technology marketplace and to innovate ahead of adversaries who are innovating themselves.

Chinese Lijian Sharp Sword Unmanned Combat Air Vehicle (UCAV) — Source: U.S. Naval Institute (USNI) News

Reverse engineering and technology theft make it possible for adversaries to inexpensively copy DoD-specific technology “widgets,” potentially resulting in a “negative return” on investment of DoD research dollars. Our adversaries’ pace of innovation further compounds our challenge. Thus the Army must not only equip the force to confront what is expected,

Northrop Grumman X-47B UCAV — Source: USNI News

but also equip the force to confront an adaptable enemy in a wide variety of environments. This paper proposes a framework that enables the identification of strategically relevant problems, provides solutions to those problems at the speed of relevance, and inverts the cost asymmetry.

To increase the rate of innovation, the future Army must learn to continually assimilate, produce, and operationalize technologies much faster than our adversaries to gain time-domain overmatch. The overarching goal is to create an environment that our adversaries cannot duplicate: the integration of advanced technologies with skilled Soldiers and well-trained teams. The confluence of two high-level concepts — the Office of the Secretary of Defense’s Mission Engineering and Robert Leonard’s Prototype Warfare (see his book The Principles of War for the Information Age) — paves the way to increasing the rate of innovation by operationalizing technology faster to stay ahead of the threat, while simultaneously reducing the cost of technology overmatch.

Mission Engineering
OSD’s Mission Engineering concept, proposed by Dr. Robert Gold, calls for acquisitions to treat the end-to-end mission as the system to optimize, in which individual systems are components. Further, the concept utilizes an assessment framework to measure progress towards mission accomplishment through test and evaluation in the mission context. In fact, all actions throughout the capability development cycle must tie back to the mission context through the assessment framework. It goes beyond just sharing data to consider functions and the strategy for trades, tools, cross-cutting functions, and other aspects of developing a system or system of systems.

Consider the example mission objective of an airfield seizure. Traditional thinking and methods would identify an immediate need for two identical air-droppable vehicles, thereby starting with a highly constrained platform engineering solution. Mission Engineering would instead start by asking: what is the best way to seize an airfield? What mix of capabilities is required to do so? What mix of vehicles and other assets (e.g., Soldiers, exoskeletons, robots) might you need within the space and weight constraints of the delivery aircraft? What should the individual performance requirements be for each piece of equipment?
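To make the contrast concrete, here is a minimal, notional sketch (in Python) of that mission-level trade-space question. The asset names, weights, payload limit, and mission-value scores are invented placeholders, not real data; the point is only that Mission Engineering frames the problem as choosing the best mix under mission constraints rather than specifying a single platform up front.

# Notional trade-space sketch for the airfield-seizure example (all values assumed).
# Choose the highest-value mix of payload "packages" that fits the delivery
# aircraft's weight limit: a simple 0/1 knapsack over candidate equipment.
from itertools import combinations

# (name, weight in kg, notional mission-value score)
candidates = [
    ("infantry_squad",      1200, 8),
    ("exoskeleton_team",    1500, 9),
    ("robotic_wingman_ugv", 2500, 7),
    ("mortar_ugv",          1800, 6),
    ("recon_drone_swarm",    400, 5),
]
MAX_PAYLOAD_KG = 4500  # assumed usable payload for one delivery aircraft

best_mix, best_value = (), 0
for r in range(1, len(candidates) + 1):
    for mix in combinations(candidates, r):
        weight = sum(item[1] for item in mix)
        value = sum(item[2] for item in mix)
        if weight <= MAX_PAYLOAD_KG and value > best_value:
            best_mix, best_value = mix, value

print("Best mix:", [item[0] for item in best_mix], "value:", best_value)

In practice the "value" would come from the assessment framework and mission simulation rather than a hand-assigned score, but even this toy version shows how the mission, not any one platform, becomes the unit of optimization.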

Mission Engineering breaks down cultural and technical “domain stovepipes” by optimizing for the mission instead of a ground, aviation, or cyber specific solution. There is huge innovation space between the conventional domain seams.

Source: www.defenceimages.mod.uk

For example, ground vehicle concepts could explore designs that look more like motherships deploying exoskeletons, drone swarms, or other ideas that have not been identified or presented because they have no clear home in a particular domain. It warrants stating twice: there is a whole set of mission-optimized solutions that have not been identified or presented because they have no clear home in the current construct. Focusing the enterprise on the mission context of the problem set will enable solutions development that is relevant and timely while also connecting a network of innovators who each hold only a piece of the whole picture.

Prototype Warfare

Prototype Warfare represents a paradigm shift from fielding large fleets of common, one-size-fits-all systems to rapidly fielding small quantities of tailored systems. Tailored systems focus on specific functions, specific geographic areas, or even specific fights, and they are inexpensively produced and possibly disposable.

MRZR with a tethered Hoverfly quadcopter unmanned aircraft system — Source: DefenseNews / Jen Judson

For example, vehicle needs differ across urban, desert, and mountain terrain. A single system is unlikely to excel in all three without employing exotic materials and technology, becoming expensive and exquisite in the process. Tailored systems could comprise the entire force or perform specific missions, as Hobart’s Funnies did during the D-Day landings.

A further advantage of tailored systems is that they will force the enemy to deal with a variety of unknown U.S. assets, perhaps seen for the first time. A tank platoon might field a heterogeneous mix of assets with different weapons and armor. Since their protection and lethality will be unknown to the enemy, it will be asymmetrically challenging for the enemy to develop tactics, techniques, and procedures or materiel to counter such new capabilities in a timely fashion.

Potential Enablers
Key technological advances present the opportunity to implement the Mission Engineering and Prototype Warfare concepts. Early Synthetic Prototyping (ESP), rapid manufacturing, and the burgeoning field of artificial intelligence (AI) provide ways to achieve these concepts. Each on its own would present significant opportunities. ESP, AI, and rapid manufacturing, when applied within the Mission Engineering/Prototype Warfare framework, create the potential for an innovation revolution.

Under development by the Army Capabilities Integration Center (ARCIC) and U.S. Army Research, Development, and Engineering Command (RDECOM), ESP is a physics-based persistent game network that allows Soldiers and engineers to collaborate on exploration of the materiel, force structure, and tactics trade space. ESP will generate 12 million hours of digital battlefield data per year.

Beyond the ESP engine itself, the Army still needs to invest in cutting-edge research in machine learning and big data techniques to derive useful insights on tactics and technical performance from that data. Understanding human intent and behaviors is difficult work for current computers, but the payoff is truly disruptive. Also, as robotic systems become more prominent on the battlefield, the country with the best AI to control them will have a great advantage, and the best AI depends on having the most training, experimental, and digitally generated data. The Army is also acutely aware of the challenges involved in testing and system safety for AI-enabled systems; understanding what these systems are intended to do in a mission context fosters debate on the subject within an agreed-upon problem space and associated assessment framework.
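As a purely illustrative example of what "deriving useful data on tactics" from ESP output might look like at its simplest, the Python sketch below groups synthetic engagement records by the tactic employed and computes win rates. The field names, tactics, and probabilities are invented stand-ins; real ESP data, schemas, and analytics would be far richer.

# Minimal sketch of mining digital battlefield logs for tactic performance.
# Synthetic records stand in for ESP output; all fields here are illustrative.
import random
import pandas as pd

random.seed(0)
TACTICS = ["flank_left", "frontal_assault", "infiltrate_night"]
ASSUMED_WIN_PROB = {"flank_left": 0.60, "frontal_assault": 0.35, "infiltrate_night": 0.70}

def simulate_engagement():
    """Fake one simulated engagement: tactic used, force ratio, and outcome."""
    tactic = random.choice(TACTICS)
    return {
        "tactic": tactic,
        "force_ratio": round(random.uniform(0.5, 2.0), 2),
        "win": random.random() < ASSUMED_WIN_PROB[tactic],
    }

records = [simulate_engagement() for _ in range(10_000)]
df = pd.DataFrame(records)

# First-pass analytics: win rate and average force ratio per tactic.
summary = df.groupby("tactic").agg(win_rate=("win", "mean"),
                                   avg_force_ratio=("force_ratio", "mean"))
print(summary.sort_values("win_rate", ascending=False))

The real research challenge the authors describe is several steps beyond this: inferring intent and behavior from millions of hours of play, not tallying outcomes, but the aggregation pattern is where such pipelines typically begin.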

Finally, to achieve this vision, the Army needs to invest in technology that allows rapid problem identification, engineering, and fielding of tailored systems. For over two decades, the Army has touted modularity as the path to system tailoring and flexibility. However, any time something is modularized, it adds some interface burden or complexity; a purpose-built system will always outperform a modular one. Research efforts are needed to understand the trade-offs of custom production versus modularity. The DoD also needs to strategically grow investment in new manufacturing technologies (to include 3D printing) and open architectures with industry.

Associated Implications
A hugely varied fleet of tailored systems creates new challenges, especially for logistics, training, and maintenance. One key is to develop a well-tracked digital manufacturing database of replacement parts. For maintenance, new technologies such as augmented reality might show mechanics who have never seen a system how to rapidly diagnose and make repairs.

Source: Military Embedded Systems

New Soldier interfaces for platforms should also be standardized and simplified so that operating different systems is as intuitive as operating an iPhone, iPad, or Mac, reducing and possibly eliminating the need for system-specific training. For example, imagine a future Soldier gets into a vehicle and inserts his or her common access card. A driving display populates with the Soldier’s custom widgets, similar to a smartphone display. The displays might also help Soldiers understand vehicle performance envelopes. For example, a line might be displayed over the terrain showing how sharply a Soldier could turn without a rollover.
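A hedged sketch of how such a rollover cue could be computed: using the classic static stability factor (track width over twice the center-of-gravity height) and ignoring suspension and terrain effects, the display could flag the tightest turn radius that keeps lateral acceleration below the static rollover threshold at the current speed. The vehicle dimensions below are notional assumptions, not any fielded platform.

# Notional rollover-limit cue: smallest steady-state turn radius before static
# rollover, assuming rigid-body physics (no suspension, flat ground).
G = 9.81  # gravitational acceleration, m/s^2

def min_turn_radius_m(speed_mps: float, track_width_m: float, cg_height_m: float) -> float:
    """Radius where lateral acceleration v^2/R reaches the static rollover threshold."""
    ssf = track_width_m / (2.0 * cg_height_m)  # static stability factor
    return speed_mps ** 2 / (ssf * G)

# Assumed light tactical vehicle: 1.9 m track width, 1.1 m center-of-gravity height.
for kph in (30, 50, 70):
    r = min_turn_radius_m(kph / 3.6, track_width_m=1.9, cg_height_m=1.1)
    print(f"{kph:>3} km/h -> keep turn radius above ~{r:.0f} m")

A production display would of course fuse live sensor data, terrain slope, and load state, but even this simple relationship is enough to draw an advisory line over the terrain ahead.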

Conclusion
The globalization of technology allows anyone with money to purchase “bleeding-edge,” militarizable commercial technology. This shifts the way we think about generating combat power to compete internationally from the physical domain to the time domain. Through the proposed Mission Engineering and Prototype Warfare framework, the Army can assimilate and operationalize technology more quickly to create an ongoing time-domain overmatch and invert the current cost asymmetry, which is adversely affecting the public’s will to fight. Placing human thought and other resources toward finding new ways to understand mission context and field new solutions will provide capability at the speed of relevance and help reduce operational surprise through a better understanding of what is possible.

Source: Defence Science and Technology Laboratory / Gov.UK

If you enjoyed this post, join SciTech Futures’ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live today — watch the associated video here and join the discussion here!

This article was written by Dr. Rob Smith, Senior Research Scientist; Mr. Shaheen Shidfar, Strategic Cohort Lead; Mr. James Parker, Associate Director; Mr. Matthew A. Horning, Mission Engineer; and Mr. Thomas Vern, Associate Director. Collectively, these gentlemen are a subset of The Strategic Cohort, a multi-disciplinary, independent group of volunteers located at TARDEC that studies the Army’s Operating Concept Framework to understand how we must change to survive and thrive in the future operating environment. The Strategic Cohort analyzes these concepts and other reference materials, then engages in disciplined debate to provide recommendations to improve TARDEC’s alignment with future concepts, educate our workforce, and create dialogue with the concept developers, providing a feedback loop for new ideas.

Further Reading:

Gold, Robert. “Mission Engineering.” 19th Annual NDIA Systems Engineering Conference, Oct. 26, 2016, Springfield, VA. Presentation.

Leonard, Robert R. The Principles of War for the Information Age, Presidio Press (2000).

Martin, A., & FitzGerald, B. “Process Over Platforms.” Center for a New American Security, Dec. 13, 2013.

FitzGerald, B., Sander, A. & Parziale, J. “Future Foundry: A New Strategic Approach to Military-Technical Advantage.” Center for a New American Security, Dec. 14, 2016.

Kozloski, Robert. “The Path to Prototype Warfare.” War on the Rocks, 17 July 2017.

Hammes, T.X. “The Future of Warfare: Small, Many, Smart vs. Few & Exquisite?” War on the Rocks, 7 Aug. 2015.

Smith, Robert E. “Tactical Utility of Tailored Systems.” Military Review (2016).

Smith, Robert E. and Vogt, Brian. “Early Synthetic Prototyping: Digital Warfighting for Systems Engineering.” Journal of Cyber Security and Information Systems 5.4 (2017).