122. The Guy Behind the Guy: AI as the Indispensable Marshal

[Editor’s Note: Mad Scientist Laboratory is pleased to present today’s guest blog post by Mr. Brady Moore and Mr. Chris Sauceda, addressing how Artificial Intelligence (AI) systems and entities conducting machine speed collection, collation, and analysis of battlefield information will free up warfighters and commanders to do what they do best — fight and make decisions, respectively. This Augmented Intelligence will enable commanders to focus on the battle with coup d’œil, or the “stroke of an eye,” maintaining situational awareness on future fights at machine speed, without losing precious time crunching data.]

Jon Favreau’s Mike character (left) is the “guy behind the guy” to Vince Vaughn’s Trent character (right) in Swingers, directed by Doug Liman, Miramax (1996) / Source: Pinterest

In the 1996 film Swingers, Trent (played by Vince Vaughn) and Mike (played by Jon Favreau) are a couple of young guys trying to make it in Hollywood. On a trip to Las Vegas, Trent introduces Mike as “the guy behind the guy” – implying that Mike’s value is that he has the know-how to get things done, acts quickly, and is therefore indispensable to a leading figure. Yes, I’m talking about Artificial Intelligence for Decision-Making on the future battlefield – and “the guy behind the guy” sums up how AI will provide a decisive advantage in Multi-Domain Operations (MDO).

Some of the problems commanders will have on future battlefields will be the same ones they have today and the same ones they had 200 years ago: the friction and fog of war. The rise of information availability and connectivity brings today’s challenges – of which most of us are aware. Advanced adversary technologies will bring future challenges for intelligence gathering, command, communication, mobility, and dispersion. Future commanders and their staffs must be able to deal with both perennial and novel challenges faster than their adversaries, in disadvantageous circumstances we can’t control. “The guy behind the guy” will need to be conversant in vast amounts of information and quick to act.

Louis-Alexandre Berthier was a French Marshal and Vice-Constable of the Empire, and Chief of Staff under Napoleon / oil portrait by Jacques Augustin Catherine Pajou (1766–1828), Source: Wikimedia Commons

In western warfare, the original “guy behind the guy” wasn’t Mike – it was this stunning figure. Marshal Louis-Alexandre Berthier was Napoleon Bonaparte’s Chief of Staff from the start of his first Italian campaign in 1796 until his first abdication in 1814, and was famous for rarely sleeping while on campaign. Paul Thiebault said of Berthier in 1796:

“Quite apart from his specialist training as a topographical engineer, he had knowledge and experience of staff work and furthermore a remarkable grasp of everything to do with war. He had also, above all else, the gift of writing a complete order and transmitting it with the utmost speed and clarity…No one could have better suited General Bonaparte, who wanted a man capable of relieving him of all detailed work, to understand him instantly and to foresee what he would need.”

Bonaparte’s military record, his genius for war, and skill as a leader are undisputed, but Berthier so enhanced his capabilities that even Napoleon himself admitted about his absence at Waterloo, “If Berthier had been there, I would not have met this misfortune.”

Augmented Intelligence, where intelligent systems enhance human capabilities (rather than systems that aspire to replicate the full scope of human intelligence), has the potential to act as a digital Chief of Staff to a battlefield commander. Just like Berthier, AI for decision-making would free up leaders to clearly consider more factors and make better decisions – allowing them to command more, and research and analyze less. AI should allow humans to do what they do best in combat – be imaginative, compel others, and act with an inherent intuition, while the AI tool finds, processes, and presents the needed information in time.

So Augmented Intelligence would filter information, prioritizing only the most relevant and timely items to help manage today’s information overload, and would help communicate intent quickly – but what about yesterday’s friction and fog, and tomorrow’s adversary technology? The future battlefield seems to be one where U.S. commanders will be starved of the kind of Intelligence, Surveillance, and Reconnaissance (ISR) and communications we are so used to today – a battlefield with a contested Electromagnetic Spectrum (EMS) and active cyber effects, whether known or unknown. How can commanders and their staffs begin to overcome challenges that war has not yet presented?

Average is Over: Powering America Beyond the Age of the Great Stagnation, by Tyler Cowen / Dutton, The Penguin Group, published in 2013

In his 2013 book Average is Over, economist Tyler Cowen examines the way freestyle chess players (who are free to use computers when playing the game) use AI tools to compete and win, and makes observations that are absolutely applicable to the future of warfare at every level. He finds that competitors must play against foes who have AI tools of their own, and that AI tools make chess move decisions that can be recognized (by people) and countered. The most successful freestyle players combine their own knowledge of the game with selective use of different kinds of AI, picking the times and situations for each throughout a game. Their opponents must then consider not only which AI is being used against them, but also the human operator’s overall strategy. This combination of an AI tool with natural human inclination and intuition will likely result in a powerful equilibrium of human and machine perception and analysis – and, ultimately, enhanced complex decision-making.
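The freestyle pattern Cowen describes – a human strategist deciding which AI tool to consult, and when – can be sketched in a few lines. The engine names, phases, and selection rules below are invented placeholders for illustration, not real chess software:

```python
# Toy illustration of the freestyle-chess pattern: the human supplies the
# strategy and decides which AI tool (if any) to consult in a given situation.
# Engine names and selection rules are invented for illustration.

ENGINES = {
    "tactical":   lambda pos: f"sharpest line in {pos}",   # stand-in for a tactics-heavy engine
    "positional": lambda pos: f"quiet plan for {pos}",     # stand-in for a positional engine
}

def human_in_the_loop(position, phase, trust_engine=True):
    """Return (advisor, recommendation); the human can override the engines entirely."""
    if not trust_engine:
        return ("human", f"intuition move in {position}")
    advisor = "tactical" if phase in ("opening", "middlegame") else "positional"
    return (advisor, ENGINES[advisor](position))

print(human_in_the_loop("Sicilian, move 18", "middlegame"))
print(human_in_the_loop("rook endgame", "endgame", trust_engine=False))
```

An opponent now has to read both layers at once: which tool is speaking, and why the human chose it.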

With a well-trained and versatile “guy behind the guy,” a commander and staff could employ different aspects of Augmented Intelligence at different times, based on need or appropriateness. Consider a company commander in a dense urban fight, equipped with an appropriate AI tool – a “guy behind the guy” that helps him make sense of the battlefield. What could that commander accomplish with his company? He could employ the tool to notice things humans don’t – or at least notice them faster and alert him. Changes in historic traffic patterns or electronic signals in an area could indicate an upcoming attack or a fleeing enemy, or the system could let the commander know that just a little more specific data could help establish a pattern where enemy data was scarce. And when the commander is presented with the very complex and large problems that characterize modern dense urban combat, the system could help shrink and sequence those problems to make them more solvable – for instance, finding a good subset of information to experiment with and helping prove a hypothesis before trying out a solution in the real world – risking bandwidth instead of blood.
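As a deliberately simplified sketch of the pattern-flagging described above: a z-score check against historical signal counts can surface the kind of sudden deviation a human would miss in raw data. The sectors, counts, and threshold here are invented for illustration:

```python
from statistics import mean, stdev

def flag_anomalies(history, current, threshold=3.0):
    """Flag current observations that deviate sharply from historical baselines.

    history: dict mapping a sector name to a list of past hourly signal counts.
    current: dict mapping the same sector names to the latest hourly count.
    Returns a list of (sector, z_score) pairs whose |z| exceeds the threshold.
    """
    alerts = []
    for sector, past in history.items():
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            continue  # flat baseline: no variation to compare against
        z = (current[sector] - mu) / sigma
        if abs(z) > threshold:
            alerts.append((sector, round(z, 1)))
    return alerts

# Invented example: electronic emissions per sector over the last six hours.
history = {
    "market_district": [42, 45, 40, 44, 43, 41],
    "river_crossing":  [12, 11, 13, 12, 10, 12],
}
current = {"market_district": 43, "river_crossing": 55}  # sudden spike

print(flag_anomalies(history, current))
```

A fielded system would of course need far richer models, but the division of labor is the point: the machine watches every baseline continuously; the human decides what the spike means.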

The U.S. strategy for MDO has already identified the critical need to observe, orient, decide, and act faster than our adversaries – multiple AI tools that hold all necessary information and can present it and act on it quickly will certainly be indispensable to leaders on the battlefield. An AI “guy behind the guy” – continuously sizing up the situation, finding the right information, and enabling better, faster decisions in difficult situations – is how Augmented Intelligence will best serve leaders in combat and provide battlefield advantage.

If you enjoyed this post, please also read:

… watch Juliane Gallina’s Arsenal of the Mind presentation at the Mad Scientist Robotics, AI, & Autonomy Visioning Multi Domain Battle in 2030-2050 Conference at Georgia Tech Research Institute, Atlanta, Georgia, on 7-8 March 2017

… and learn more about potential AI battlefield applications in our Crowdsourcing the Future of the AI Battlefield information paper.

Brady Moore is a Senior Enterprise Client Executive at Neudesic in New York City. A graduate of The Citadel, he is a former U.S. Army Infantry and Special Forces officer with service as a leader, planner, and advisor across Iraq, Afghanistan, Africa, and South Asia. After leaving the Army in 2011, he obtained an MBA at Penn State and worked as an IBM Cognitive Solutions Leader covering analytics, AI, and Machine Learning in National Security. He’s the Junior Vice Commander of VFW Post 2906 in Pompton Lakes, NJ, and Cofounder of the Special Forces Association Chapter 58 in New York City. He also works with Elite Meet as often as he can.

Chris Sauceda is an account manager within the U.S. Army Defense and Intel IBM account, covering Command and Control, Cyber, and Advanced Analytics/ Artificial Intelligence. Chris served on active duty and deployed in support of Operation Iraqi Freedom, and has been in the Defense contracting business for over 13 years. Focused on driving cutting edge technologies to the warfighter, he also currently serves as a Signal Officer in the Texas Military Department.

121. Emergent Global Trends Impacting on the Future Operational Environment

[Editor’s Note: Regular readers of the Mad Scientist Laboratory are familiar with a number of disruptive trends and their individual and convergent impacts on the Future Operational Environment (OE). In today’s post, we explore three recent publications to expand our understanding of these and additional emergent global trends.  We also solicit your input on any other trends that have the potential to transform the OE and change the character of future warfare.]

“The U.S. Army finds itself at a historical inflection point, where disparate, yet related elements of the Operational Environment (OE) are converging, creating a situation where fast-moving trends across the Diplomatic, Information, Military, and Economic (DIME) spheres are rapidly transforming the nature of all aspects of society and human life – including the character of warfare.” — The Operational Environment and the Changing Character of Future Warfare

Last year, the Mad Scientist Initiative published several products that envisioned these fast-moving trends and how they are transforming the Future OE. These products included our:

• Updated Potential Game Changers information sheet, identifying a host of innovative technologies with the potential to disrupt future warfare during The Era of Accelerated Human Progress (now through 2035) and The Era of Contested Equality (2035 through 2050).

• Black Swans and Pink Flamingos blog post, addressing both Black Swan events (i.e., unknown unknowns) which, though not likely, might have significant impacts on how we think about warfighting and security; and Pink Flamingos, the known knowns that are often discussed, but ignored by Leaders trapped by organizational cultures and rigid bureaucratic decision-making structures.

With the advent of 2019, three new predictive publications have both confirmed and expanded the Mad Scientist Initiative’s understanding of emergent trends and technologies:

• Government Accountability Office (GAO) Report to Congressional Committees: National Security Long Range Emerging Threats Facing the United States As Identified by Federal Agencies, December 2018

• Deloitte Insights Technology, Media, and Telecommunications Predictions 2019, January 2019

• World Economic Forum (WEF) The Global Risks Report 2019, 14th Edition, January 2019

Commonalities:

These three publications collectively confirmed Mad Scientist’s thoughts regarding the disruptive potential of Artificial Intelligence (AI), Quantum Computing, the Internet of Things (IoT), and Big Data; and individually echoed our concerns regarding Cyber, Additive Manufacturing, Space and Counterspace, Natural Disasters, and the continuing threat of Weapons of Mass Destruction. That said, the real value of these (and other) predictions is in informing us about the trends we might have missed, and expanding our understanding of those that we were already tracking.

New Insights:

From the GAO Report we learned:

Megacorporations as adversaries. Our list of potential adversaries must expand to include “large companies that have the financial resources and a power base to exert influence on par with or exceeding non-state actors.” Think super-empowered individual(s) enhanced further by the wealth, reach, influence, and cover afforded by a transnational corporation.

The rich population is shrinking, the poor population is not. “Working-age populations are shrinking in wealthy countries and in China and Russia, and are growing in developing, poorer countries…. [with] the potential to increase economic, employment, urbanization and welfare pressures, and spur migration.”

Climate change, environment, and health issues will demand attention. “More extreme weather, water and soil stress, and food insecurity will disrupt societies. Sea-level rise, ocean acidification, glacial melt, and pollution will change living patterns. Tensions over climate change will grow.”

Internal and International Migration. “Governments in megacities … may not have the capacity to provide adequate resources and infrastructure…. Mass migration events may occur and threaten regional stability, undermine governments, and strain U.S. military and civilian responses.”

Infectious Diseases. “New and evolving diseases from the natural environment—exacerbated by changes in climate, the movement of people into cities, and global trade and travel—may become a pandemic. Drug-resistant forms of diseases previously considered treatable could become widespread again…. Diminishing permafrost could expand habitats for pathogens that cause disease.”

From Deloitte Insights Predictions we learned:

Intuitive AI development services may not require specialized knowledge. “Baidu recently released an AI training platform called EZDL that requires no coding experience and works even with small data training sets…. Cloud providers have developed pre-built machine learning APIs [application-programming interfaces] for technologies such as natural language processing that customers can access instead of building their own.”
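Deloitte’s point is that the building blocks are now off the shelf. EZDL itself is a hosted service, but the same idea can be sketched with scikit-learn’s stock components: a working text classifier assembled entirely from pre-built parts and trained on a deliberately tiny, invented data set, with no model code written by hand:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set; every modeling component below is a stock
# library part, snapped together rather than written from scratch.
texts = [
    "great service, very satisfied", "works perfectly, love it",
    "excellent quality and fast shipping", "terrible, broke after a day",
    "very disappointed, poor quality", "awful experience, do not buy",
]
labels = ["pos", "pos", "pos", "neg", "neg", "neg"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["excellent service, fast and great"])[0])
```

The hosted offerings go one step further, hiding even this much code behind a web console or a single API call.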

Cryptocurrency growth may have driven Chinese semiconductor innovation. Chinese chipmakers’ Application-Specific Integrated Circuits (ASICs), initially designed to meet domestic bitcoin-mining demands, may also meet China’s growing demand for AI chipsets in place of Graphics Processing Units (GPUs). “Not only could these activities spark more domestic innovation… China just might be positioned to have a larger impact on the next generation of cognitive technologies.”

Quantum-safe security was important yesterday. “Malicious adversaries could store classically encrypted information today to decrypt in the future using a QC [Quantum Computer], in a gambit known as a ‘harvest-and-decrypt’ attack.”

From the WEF Report we learned:

This is an increasingly anxious, unhappy, and lonely world. “Anger is increasing and empathy appears to be in short supply…. Depression and anxiety disorders increased [globally] between 1990 and 2013…. It is not difficult to imagine such emotional and psychological disruptions having serious diplomatic—and perhaps even military—consequences.”

The risk from biological pathogens is increasing. “Outbreaks since 2000 have been described as a ‘rollcall of near-miss catastrophes’” and they are on the rise. “Biological weapons still have attractions for malicious non-state actors…. it [is] difficult to reliably attribute a biological attack… the direct effects—fatalities and injuries—would be compounded by potentially grave societal and political disruption.”

Use of weather manipulation tools stokes geopolitical tensions. “[These tools] could be used to disrupt … agriculture or military planning… if states decided unilaterally to use more radical geo-engineering technologies, it could trigger dramatic climatic disruptions.”

Food supply disruption emerges as a tool as geo-economic tensions intensify. “Worsening trade wars might spill over into high-stakes threats to disrupt food or agricultural supplies…. [This] could lead to disruptions of domestic and cross-border flows of food. At the extreme, state or non-state actors could target the crops of an adversary state… with a clandestine biological attack.”

Taps run dry on Water Day Zero. “Population growth, migration, industrialization, climate change, drought, groundwater depletion, weak infrastructure, and poor urban planning” all stress megacities’ ability to meet burgeoning demands, further exacerbating existing urban / rural divides, and could potentially lead to conflicts over remaining supply sources.

What Are We Missing?

The aforementioned trends are by no means comprehensive. Mad Scientist invites our readers to assist us in identifying any additional emergent global trends that could transform the OE and change the character of future warfare. Please share them with us and our readers by scrolling down to the “Leave a Reply” section at the bottom of this post, entering them in the Comment Box with an accompanying rationale, and then selecting the “Post Comment” button. Thank you in advance for all of your submissions!

If you enjoyed reading these assessments about future trends, please also see the Statement for the Record:  Worldwide Threat Assessment of the US Intelligence Community, 29 January 2019, from the U.S. Senate Select Committee on Intelligence.

120. Autonomous Robotic Systems in the Russian Ground Forces

[Editor’s Note: Mad Scientist Laboratory welcomes back returning guest blogger and proclaimed Mad Scientist Mr. Samuel Bendett with today’s post, addressing Russia’s commitment to mass produce independent ground combat robotic systems. Simon Briggs, professor of interdisciplinary arts at the University of Edinburgh, predicts that “in 2030 AI will be in routine use to fight wars and kill people, far more effectively than humans can currently kill.”  Mr. Bendett’s post below addresses the status of current operationally tested and fielded Russian Unmanned Ground Vehicle (UGV) capabilities, and their pivot to acquire systems able to “independently recognize targets, use weapons, and interact in groups and swarms.” (Note:  Some of the embedded links in this post are best accessed using non-DoD networks.)]

Russian Minister of Defense Sergei Shoigu / Source: Wikimedia Commons

Over the past several years, the Russian military has invested heavily in the design, production, and testing of unmanned combat systems. In March 2018, Russian Defense Minister Sergei Shoigu said that mass production of combat robots for the Russian army could begin as early as that year. Now, the Ministry of Defense (MOD) is moving ahead with creating plans for such systems to act independently on the battlefield.

According to Russian state media (TASS), Russian military robotic complexes (RBCs) will be able to independently recognize targets, use weapons, and interact in groups and swarms. These plans were laid out in an article by the staff of the 3rd Central Scientific Research Institute of the Russian Federation’s MOD.

Uran-6 Airborne Countermine System with flail / Source: Russian Federation MOD

Russia has already tested several Unmanned Ground Vehicles (UGVs) in combat. Its Uran-6, Scarab, and Sphera demining UGVs were rated well by the Russian engineering forces, and there are plans to start acquiring such vehicles. However, these systems were designed to have their operators close by. When it came to a UGV built from the outset to fight at a distance from its operator, things got more complicated.

Uran-9 engaging targets with its 30mm 2A72 autocannon on a test range.  Operational tests in Syria proved less successful.  / Source:  YouTube

Russia’s Uran-9 combat UGV experienced a large number of failures when tested in Syria – in transportation, communication, firing, and situational awareness. The lessons from the Uran-9 tests supposedly prompted the Russian military to consider placing more emphasis on using such UGVs as one-off attack vehicles against adversary hard points and stationary targets.

Russian ground combat forces conducting urban operations in Syria / Source: Wikimedia

Nonetheless, the aforementioned TASS article analyzes the general requirements for unmanned military systems employed by Russian ground forces. Among them is the ability to solve tasks in different combat conditions during day and night, under enemy fire, electronic and informational counteraction, in conditions of radiation, chemical contamination, and electromagnetic attack – as well as requirements such as modularity and multifunctionality. The article also points out “the [systems’] ability to independently perform tasks in conditions of ambiguity” – implying the use of Artificial Intelligence.

To achieve these requirements, the creation of an “intelligent decision-making system” is proposed, which will also supervise the use of weapons. “The way out of this situation is the intensification of research on increasing the autonomy of the RBCs and the introduction of intelligent decision-making systems at the control stages, including group, autonomous movement and use of equipment for its intended purpose, including weapons, into military robotics,” the article says.

An example of the complex, ambiguous environments that will challenge future Russian RBCs:  Russian troops in Aleppo, Syria / Source: Wikimedia Commons via article in the University of Melbourne’s Pursuit, “Why is Russia Still Supporting Syria?”

The TASS article states that in the near future, the MOD is planning to initiate work aimed at providing technical support for solving this problem set. This research will include domestic laser scanning devices for geographical positioning, the development of methods and equipment for determining the permeability of the soil on which the UGV operates, the development of methods for controlling the military robot in “unstable communications,” and the development of methods for analyzing combat environments such as recognizing scenes, images, and targets.

Successfully employing UGVs in combat requires complicated systems, something the aforementioned initiatives will seek to address. This work will probably rely on Russia’s Syrian experience, as well as on current projects and upgrades to Moscow’s growing fleet of combat UGVs. On 24 January 2018, the Kalashnikov Design Bureau that oversees the completion of Uran-9 work announced that this UGV had been accepted into military service. Although few details were given, the statement did include the fact that the vehicle will be further “refined” based on lessons learned during its Syria deployment, and that the Uran-9 presents “good scientific and technical groundwork for further products.” The extent of upgrades to the vehicle was not given; however, its numerous failures in Syrian trials imply that there is a lot of work ahead for this project. The statement also indicates that the Uran-9 may be a test-bed for further UGV development – an interesting fact, considering the country’s already diverse collection of combat UGVs.

As reported in DefenseOne, Russian Colonel Col. Oleg Pomazuev stated that the Nerekhta UGV “outperformed” manned systems in recent exercises / Source: DefenseOne and Sergey Ptichkin / RG

Today, the Russian military is testing and evaluating several systems, such as Nerekhta and Soratnik. The latter was also supposedly tested in “near-combat” conditions, presumably in Syria or elsewhere. The MOD has been testing smaller Platforma-M and large Vikhr combat UGVs, along with other unmanned vehicles. Yet the defining characteristic for these machines so far has been the fact that they were all remote-operated by soldiers, often in near proximity to the machine itself. Endowing these UGVs with more independent decision-making in the “fog of war” via an intelligent command and control system may exponentially increase their combat effectiveness — assuming that such systems can function as planned.

If you enjoyed this post, please also:

Read Mr. Bendett’s previous post, Russian Ground Battlefield Robots: A Candid Evaluation and Ways Forward

… and watch Zvezda Broadcasting’s video, showing a Vikhr unmanned, tele-operated BMP-3 maneuvering and shooting its 7.62mm MG, 30mm cannon, and automatic grenade launcher on a test range.

Automated lethality is but one of the many Future Operational Environment trends that the U.S. Army’s Mad Scientist Initiative is tracking. Mad Scientist seeks to crowdsource your visions of future combat with our Science Fiction Writing Contest 2019. Our deadline for submission is 1 APRIL 2019, so please review the contest details and associated release form here, get those creative writing juices flowing, and send us your visions of combat in 2030!  Selected submissions may be chosen for publication or a possible future speaking opportunity.

Samuel Bendett is a Researcher at CNA and a Fellow in Russia Studies at the American Foreign Policy Council. He is also a proud Mad Scientist.

119. The Queue

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our next edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Mad Scientist Initiative has come across during the previous month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment (OE). We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

1. How Satellites the Size of a Grilled Cheese Sandwich Could Change the World, by Aaron Pressman, Fortune (via Yahoo! Finance), 24 January 2019.

One of Swarm Technologies’ miniaturized satellites / Source:  Swarm Technologies

Space is rapidly democratizing, and tactical and operational surprise may be the casualty. Sara Spangelo and her startup, Swarm Technologies, are on a quest to deliver global communications at the lowest possible cost. They share this objective with ventures like Elon Musk’s Starlink, but that solution involves thousands of satellites requiring many successful rocket launches. Swarm Technologies pushes the commercial decrease in launch costs and the miniaturization of satellites to the max: its satellites will be the size of a grilled cheese sandwich and will harness the currents coursing through space to maneuver. This should reduce the cost and time required to create a worldwide network for texting and collecting Internet of Things (IoT) data to approximately 25 million dollars and eighteen months.

The work at Starlink and Swarm Technologies only represents a small part of a new space race led by companies rather than the governments that built and manage much of space capability today. In the recent Mad Sci blog post “War Laid Bare,” Matthew Ader described this explosion and how access to global communications and sensing might tip the scales of warfare in favor of the finder, providing an overwhelming advantage over competitors that require stealth or need to hide their signatures to be effective in 21st Century warfare.

Eliminating dead zones in global coverage / Source: Swarm Technologies

The impact of this level of global transparency weighs not only on governments and their militaries; businesses, too, will find it more difficult to hide from competitors and regulators. Cade Metz writes in the New York Times article “Businesses Will Not Be Able to Hide: Spy Satellites May Give Edge from Above” about the impact this will have on global competition. It is a brave new world – unless you have something to hide!

 

2. New Rules Takes the Guesswork out of Human Gene Editing, by Kristin Houser, Futurism, 14 December 2018.

Subtitled “This will fundamentally change the way we use CRISPR,” the subject article was published following Dr. He Jiankui’s announcement in November 2018 that he had successfully gene-edited two human babies. Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) associated protein 9, or CRISPR/Cas9, has become the “go to” tool for genomic engineering. When Dr. He announced that he had altered the genes of twin girls Lulu and Nana as embryos in order to make them HIV-resistant, there was a global outcry from scientists, bio-ethicists, and politicians alike, for a variety of reasons. One was the potential imprecision of the genetic editing performed, with the associated risk of unintended genomic damage leading to future health issues for the twins.

With the publication of “Target-Specific Precision of CRISPR-Mediated Genome Editing” in the scientific journal Molecular Cell by research scientists at The Francis Crick Institute in London, however, this particular concern appears to have been mitigated with a set of simple rules that determine the precision of CRISPR/Cas9 editing in human cells.

Per Crick researcher and group leader Paola Scaffidi, in the institute’s news release: “Until now, editing genes with CRISPR has involved a lot of guesswork, frustration and trial and error…. The effects of CRISPR were thought to be unpredictable and seemingly random, but by analysing hundreds of edits we were shocked to find that there are actually simple, predictable patterns behind it all. This will fundamentally change the way we use CRISPR, allowing us to study gene function with greater precision and significantly accelerating our science.”

As predicted by Stanford’s bio-ethicist Hank Greely at last March’s Mad Scientist Bio Convergence and Soldier 2050 Conference in Menlo Park, CA, “killer apps” like healthier babies will help overcome the initial resistance to human enhancement via genetic engineering. The Crick Institute’s discovery, with its associated enhanced precision and reliability, may pave the way for market-based designer eugenics. Ramifications for the Future Operational Environment include further societal polarization between the privileged few that will have access to the genomic protocols providing such enhancements and the majority that do not (as in the 2013 film Elysium); the potential for unscrupulous regimes, non-state actors, and super-empowered individuals to breed and employ cadres of genetically enhanced thugs, “button men,” and super soldiers; and the relative policing / combat disadvantage experienced by those powers that outlaw such human genetic enhancements.

 

3. Radical Speaker Series: Countering Weaponized Information, SOFWERX and USSOCOM / J5 Donovan Group, 14 December 2018.

SOFWERX, in collaboration with the USSOCOM / J5 Donovan Group, hosted a Radical Speaker Series on weaponized information. Mass influence operations, deep fakes, and social media metrics have been used by state and non-state actors in attempts to influence everything from public sentiment on policy issues to election results. The type and extent of these influence operations have laid bare policy and technology gaps. This represents an emerging threat vector for global competition.

As discussed in the TRADOC G-2’s The Operational Environment and the Changing Character of Future Warfare, Social Media and the Internet of Things have connected “all aspects of human engagement where cognition, ideas, and perceptions, are almost instantaneously available.” While this connectivity has been a global change agent, some are suggesting starting over and abandoning the internet as we know it in favor of alternative internet or “Alternet” solutions.  LikeWar authors Singer and Brooking provide examples of how our adversaries are weaponizing Social Media to augment their operations in the physical domain. One example is the defeat of ISIS and re-capture of Mosul: “… Who was involved in the fight, where they were located, and even how they achieved victory had been twisted and transformed. Indeed, if what was online could swing the course of a battle — or eliminate the need for battle entirely — what, exactly, could be considered ‘war’ at all?”

Taken to the next level in the battle for the brain, novel neuroweapons could grant adversaries (and perhaps the United States) the ability to disrupt, degrade, damage, kill, and even “hack” human brains to influence populations. The resulting confusion and panic could disrupt government and society, without mass casualties. These attacks against the human brain facilitate personalized warfare. Neuroweapons are “Weapons of Mass Disruption” that may characterize segments of warfare in the future. These capabilities come with a host of ethical and moral considerations — does affecting someone’s brain purposely, even temporarily, violate ethical codes, treaties, conventions, and international norms followed by the U.S. military? As posed by Singer and Brooking — “what, exactly, could be considered ‘war’ at all?”

 

4. Nano, short film directed by Mike Manning, 2017.

Nano / Source: IMDb

This short film noir focuses on invasive technology and explores themes of liberty, control, and what citizens are willing to trade for safety and security. In a future America, technology has progressed to the point where embedded devices in humans are not only possible and popular, but the norm. These devices, known as Nano, can sync with one’s physiology, alter genomes, change hair and eye color, and, most importantly to law enforcement and government entities, control motor functions. Nano has resulted in a safer society, with tremendous reductions in gun violence. In the film, a new law has passed mandating that all citizens must be upgraded to Nano 2.0 – this controversial move means that the Government will now have access to everyone’s location, will be able to monitor them in real-time, and control their physiology. The Government could, were they so inclined, change someone’s hair color remotely, without permission or, perhaps, more frighteningly, induce indefinite paralysis.

Nano explores and, in some cases, answers questions about future technologies and their potential impact on society. It illustrates how, alongside the advantages and services we gain through new technologies, we sometimes give up things just as valuable. Technology no longer operates in a vacuum – full control over ourselves no longer exists. When we use a cellphone, when we access a website, when we, in Nano, change the color of our hair, our actions are being monitored, logged, and tracked by something. With cellphone use, we accept that we give off a signature traceable by a number of agencies, including our service providers, judging the benefits a net positive that outweighs the associated negatives. But where does that line fall? How far would the average citizen go if they could have an embedded device installed that would heal minor wounds and lacerations? What becomes of privacy, and what would we be willing to give up? Nano shows the negative consequences of this progression and the dystopian nature of technological slavery. It poses questions of trust, both in the state and in individuals, and shows how blurred the lines can become, both in terms of freedoms and physical appearance.

 

5. “Artificial Intelligence and the Future of Humans,” by Janna Anderson, Lee Rainie, and Alex Luchsinger, The Pew Research Center, 10 December 2018 (reviewed by Ms. Marie Murphy).

Source: Flickr

The Pew Research Center canvassed a host of technology innovators and business and policy leaders on whether artificial intelligence (AI) and related technology will enhance human capabilities and improve human life, or lessen human autonomy and agency to a detrimental level. A majority of the experts who responded to this query agreed that AI will better the lives of most people, but qualified this by noting that significant negative outcomes will likely accompany the proliferation and integration of AI systems.

Most agree that AI will greatly benefit humanity and increase the quality of life for many, for example by eliminating poverty and disease, while conveniently supplementing human intelligence to help solve crucial problems. However, there are concerns that AI will conflict with and eventually overpower human autonomy, intelligence, decision-making, analysis, and many other uniquely “human” characteristics. Professionals in the field expressed concerns over the potential for data abuse and cybercrime, job loss, and a dependence on AI that erodes the ability to think independently.

Amy Webb, founder of the Future Today Institute and professor of strategic foresight at New York University, posits that the integration of AI will continue for the next 50 years, until every industry is reliant on AI systems and workers must possess hybrid skills to compete for jobs that do not yet exist. Simon Biggs, professor of interdisciplinary arts at the University of Edinburgh, predicts that the potential negative outcomes of AI will be the result of a failure of humanity, and that “in 2030 AI will be in routine use to fight wars and kill people, far more effectively than humans can currently kill,” and “we cannot expect our AI systems to be ethical on our behalf.”

As the U.S. Army continues to explore and experiment with how best to employ AI on the battlefield, the great challenge is ensuring these systems are used in the most effective and beneficial capacity, without reducing the efficiency and relevance of the humans working alongside the machines. Warfare will become more integrated with this technology, so monitoring the transition carefully is important to apply AI successfully to military strategy and operations while mitigating its potential negative effects.

 

6. “Automation Will Replace Production, Food, and Transportation Jobs First,” by James Dennin, INVERSE, 28 January 2019.

A newly released paper from the Brookings Institution indicates that the advent of autonomy and advanced automation will have unevenly distributed positive and negative effects across job and career sectors. According to the report, the three fields most vulnerable to reduction through automation are production, food service, and transportation jobs. Additionally, certain geographic categories (especially rural, less populated areas) will suffer graver effects of this continuous push towards autonomy.

Though automation is expected to displace labor in 72% of businesses in 2019, the prospects of future workers are not all doom and gloom. As the report notes, automation generally replaces tasks rather than entire jobs, although AI and autonomy make the specter of total job replacement more likely. The tasks that remain make humans even more critical, though there may be fewer of them. While a wide variety of workers are at risk, young people (16-24 year olds) face higher risks of labor displacement, partially because a large share of their jobs falls in the aforementioned sectors.

All of these automation impacts have significant implications for the Future Operational Environment, U.S. Army, and the Future of Warfare. An increase in automation and autonomy in production, food service, and transportation may mean that Soldiers can focus more exclusively on warfighting – moving, shooting, communicating – and in many cases will be complemented and made more lethal through automation. The dynamic nature of work due to these shifts could cause significant unrest requiring military attention in unexpected places. Additionally, the labor displacement of so much youth could be both a boon and a hindrance to the Army. On one hand, there could be a glut of new recruits due to poor employment outlook in the private sector; contrariwise, many of the freshly available recruits may not inherently have the required skills or even aptitude for becoming Warfighters.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future OE, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

118. The Future of Learning: Personalized, Continuous, and Accelerated

[Editor’s Note: At the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies in Washington, DC, leading scientists, innovators, and scholars gathered to discuss how humans will receive, process, and integrate information in the future. The convergence of technology, the speed of change, the generational differences of new Recruits, and the uncertainty of the Future Operational Environment will dramatically alter the way Soldiers and Leaders learn in 2050. One clear signal generated from this conference is that learning in the future will be personalized, continuous, and accelerated.]

Personalized Learning

“The principal consequence of individual differences is that every general law of teaching has to be applied with consideration of the particular person.” – E.L. Thorndike (1906)

The world is becoming increasingly personalized, and individual choice and preference drives much of daily life, from commerce, to transportation, to entertainment. For example, your Amazon account today can keep your payment information on file (one click away), suggest new products based on your purchase history, and allow you to shop from anywhere and ship to any place, all while tracking your purchase every step of the way, including providing photographic proof of delivery. Online retailers, personal transportation services, and streaming content providers track and maintain an unprecedented amount of specific individual information to deliver a detailed and personalized experience for the consumer.

There is an opportunity to improve the effectiveness in targeted areas of learning – skills training, foundational learning, and functional training, for example – if learning institutions and organizations, as well as learners, follow the path of personalization set by commerce, transportation, and entertainment.1 This necessitates an institutional shift in the way we educate Soldiers. Instead of training being administered based on rank or a pre-determined schedule, it would be conducted based on need, temporally optimized for maximum absorption and retention, delivered in a style that matches the learner, and implemented on the battlefield, if needed.

An important facet of personalized learning is personal attention to the learner. Tutors have been used in education for 60,000 years.2 However, tutoring has always been limited by how many educators could devote their attention to a single student. With advancements in AI, intelligent tutors could reduce the cost and manpower requirements associated with one-on-one instructor-to-student ratios. Research indicates that students who had access to tutors, as opposed to exclusively classroom instruction, were more effective learners, as seen in the chart below. In other words, the average tutored student performed better than 98 percent of the students in the traditional classroom.3 What was a problem of scale in the past – cost, manpower, time – can be alleviated in the future through the use of AI-enabled ubiquitous intelligent tutors.
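
The “98 percent” figure above is the well-known two-sigma result: the average tutored student scores about two standard deviations above the mean of a conventional classroom. A minimal sketch of where that percentile comes from, assuming normally distributed classroom scores:

```python
from statistics import NormalDist

# Bloom's "2 sigma problem": the average tutored student scores about two
# standard deviations above the mean of a conventional classroom. Assuming
# classroom scores are normally distributed, the percentile of a +2-sigma
# score is the standard normal CDF evaluated at 2.
percentile = NormalDist(mu=0, sigma=1).cdf(2.0)
print(f"A +2-sigma student outperforms {percentile:.1%} of the classroom")
```

`NormalDist(0, 1).cdf(2.0)` evaluates to roughly 0.977, consistent with the rounded “better than 98 percent” claim in the text.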

Another aspect of personalized learning is the diminishing importance of geo-location. Education, in general, has traditionally been executed in a “brick and mortar” setting. The students, learners, or trainees physically travel to the location of the teacher, expert, or trainer in order for knowledge to be imparted. Historically, this was the only viable option. However, a hyper-connected world with enabling technologies like virtual and augmented reality; high-bandwidth networks with low latency; high fidelity modeling, simulations, and video; and universal interfaces reduces or eliminates the necessity for physical co-location. This allows Soldiers to attend courses hosted virtually anywhere, participate in combined arms and Joint exercises globally, and experience a variety of austere and otherwise inaccessible environments through virtual and augmented reality.4

Based on these trends and emerging opportunities to increase efficiency, the Army may have to re-evaluate its educational and training frameworks and traditional operational practices to adjust for more individualized and personalized learning styles. When personalized learning is optimized, Soldiers could become more lethal, specially skilled, and decisive along a shorter timeline, using fewer budget resources and reduced manpower.

Continuous Learning

Continuous learning, or the process of repeatedly engaging in activities designed to learn new information or skills, is a natural process that will remain necessary for Soldiers and Leaders in 2050. The future workforce will define and drive when, where, and how learning takes place. Continuous learning has the advantage of allowing humans to learn from past mistakes and understand biases by “working the problem” – assessing and fixing biases, actively changing behavior to offset biases, moving on to decision-making, and then returning to work the problem again for further solutions. Learners must be given the chance to fail, and failure must be built into the continuous learning process so that the learner not only arrives at the solution organically, but practices critical thinking and evaluation skills.5

There are costs and caveats to successful continuous learning. After a skill is learned, it must be continually practiced and maintained. Amy Titus explained how skills perish after 3-5 years unless they are updated to meet present needs and circumstances. In an environment of rapidly changing technology and situational dynamics, keeping skills up to date must be a conscious and nonstop process. One of the major obstacles to continuous learning is that learning is work and requires a measure of self-motivation to execute. Learners only effectively learn if they are curious, so learning to pass a class or check a box does not yield the same result as genuine interest in the subject.6 New approaches such as gamification and experiential learning can help mitigate some of these limitations.

Accelerated Learning

The concept of accelerated learning, or using a compressed timeline and various approaches, methodologies, or technological means to maximize learning, opens up several questions: what kinds of technologies accelerate learning, and how does technology accelerate learning? Technologies useful for accelerated learning include the immersive reality spectrum – virtual reality/augmented reality (mixed reality) and haptic feedback – as well as wearables, neural stimulation, and brain mapping. These technologies and devices enable the individualization and personalization of learning. Individualization allows learners to identify their strengths and weaknesses in learning, retaining, and applying information, and provides a program structured to capitalize on their naturally favored learning style, maximizing the amount and depth of information presented in the most time- and cost-effective manner.

Digital learning platforms are important tools for tracking a Soldier’s progress. These platforms not only deliver individualized progress reports to superiors and instructors, but also allow the learner to remain up to date regardless of their physical location. Intelligent tutors may be integrated into a digital learning platform, providing real-time, individual feedback and suggesting areas for improvement or those in need of increased attention. Intelligent tutors and other technologies utilized in the accelerated learning process, such as augmented reality, can be readily adapted to a variety of situations, conforming to the needs of a specific unit or mission.

Besides external methods of accelerated learning, there are also biological techniques to increase the speed and accuracy of learning new skills. DARPA scientist Dr. Tristan McClure-Begley introduced Targeted Neuroplasticity Training (TNT), whereby the peripheral nervous system is artificially stimulated resulting in the rapid acquisition of a specific skill. Soldiers can learn movements and retain that muscle memory faster than the time it would take to complete many sets of repetitions by pairing nerve stimulation with the performance of a physical action.

Accelerated learning does not guarantee positive outcomes. There is a high initial startup cost to producing mixed, augmented, and virtual reality training programs, and these programs require massive amounts of data and inputs for the most realistic product.7 There are questions about the longevity and quality of retention when learning is delivered through accelerated means. About 40 percent of information that humans receive is forgotten after 20 minutes and another 40 percent is lost after 30 days if it is not reinforced.8
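
The retention figures above echo the classic forgetting curve, often modeled as exponential decay, R(t) = e^(-t/s). A minimal sketch, with the decay constant fitted only to the “40 percent forgotten after 20 minutes” figure (the functional form and fit are illustrative assumptions, not from the source):

```python
import math

# Exponential forgetting-curve model: R(t) = exp(-t / s), where s is a
# memory-stability constant. Fit s so that retention is 60% (40% forgotten)
# at t = 20 minutes, per the figure cited in the text.
t_known, r_known = 20.0, 0.60
s = -t_known / math.log(r_known)  # roughly 39 minutes

def retention(minutes: float) -> float:
    """Fraction of material retained after `minutes` without reinforcement."""
    return math.exp(-minutes / s)

for t in (0, 20, 60, 120):
    print(f"after {t:>3} min: {retention(t):.0%} retained")
```

Note that a single exponential cannot also match the second data point (another 40 percent lost over 30 days); practical spaced-repetition models reset or slow the decay each time the material is reinforced, which is exactly the role of reinforcement in the text.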

Most learners attribute mastery of a skill to practical application and not formal training programs.9 TNT attempts to mitigate this factor by allowing for multiple physical repetitions to be administered quickly. But this technique must be correctly administered, or psychological and physiological pairing may not occur correctly or occur between the wrong stimuli, creating maladaptive plasticity, which is training the wrong behavior.

An increased emphasis on continuous and accelerated learning could present the Army with an opportunity to have Soldiers that are lifelong learners capable of quickly picking up emerging required skills and knowledge. However, this focus would need to account for peak learner interest and long-term viability.

If you enjoyed this post, please also watch Dr. Dexter Fletcher‘s video presentation on Digital Mentors and Tutors and Dr. Tristan McClure-Begley‘s presentation on Targeted Neuroplasticity Training from the Mad Scientist Learning in 2050 Conference

… see the following related blog posts:

… and read The Mad Scientist Learning in 2050 Conference Final Report.


1 Smith-Lewis, Andrew, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

2 Fletcher, Dexter, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

3 https://www.edsurge.com/news/2014-08-10-personalization-and-the-2-sigma-problem

4 Titus, Amy, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

5 Taylor, Christopher, Mad Scientist Conference: Learning in 2050, Georgetown University, 9 August 2018

6 Masie, Elliott, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

7 Hill, Randall, Mad Scientist Conference: Learning in 2050, Georgetown University, 9 August 2018

8 Goodwin, Gregory, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

9 Masie, Elliott, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018

117. Old Human vs. New Human

[Editor’s Note: On 8-9 August 2018, the U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies in Washington, DC. Leading scientists, innovators, and scholars from academia, industry, and the government gathered to address future learning techniques and technologies that are critical in preparing for Army operations in the mid-21st century against adversaries in rapidly evolving battlespaces. One finding from this conference is that tomorrow’s Soldiers will learn differently from earlier generations, given the technological innovations that will have surrounded them from birth through their high school graduation.  To effectively engage these “New Humans” and prepare them for combat on future battlefields, the Army must discard old paradigms of learning that no longer resonate (e.g., those desiccated lectures delivered via interminable PowerPoint presentations) and embrace more effective means of instruction.]

The recruit of 2050 will be born in 2032 and will be fundamentally different from the generations born before them. Marc Prensky, the educational writer and speaker who coined the term digital native, asserts this “New Human” will stand in stark contrast to the “Old Human” in the ways they assimilate information and approach learning.1 Where humans today are born into a world with ubiquitous internet, hyper-connectivity, and the Internet of Things, each of these elements is generally external to the human. By 2032, these technologies likely will have converged and will be embedded or integrated into the individual, with connectivity literally on the tips of their fingers. The challenge for the Army will be to recognize the implications of this momentous shift and alter its learning methodologies, approach to training, and educational paradigm to account for these digital natives.

These New Humans will be accustomed to the use of artificial intelligence (AI) to augment and supplement decision-making in their everyday lives. AI will be responsible for keeping them on schedule, suggesting options for what and when to eat, delivering relevant news and information, and serving as an on-demand embedded expert. The Old Human learned to use these technologies and adapted their learning style to accommodate them, while the New Human will be born into them and their learning style will be a result of them. In 2018, 94% of Americans aged 18-29 owned some kind of smartphone.2 Compare that to 73% ownership for ages 50-64 and 46% for age 65 and above, and it becomes clear that there is a strong disconnect between the age groups in terms of employing technology. Both of the leading smartphone operating systems include a built-in artificially intelligent digital assistant, and at the end of 2017, nearly half of all U.S. adults used a digital voice assistant in some way.3 Based on these trends, in the future there likely will be an even greater technological wedge between New Humans and Old Humans.

http://www.pewinternet.org/fact-sheet/mobile/

New Humans will be information assimilators, where Old Humans were information gatherers. The techniques to acquire and gather information have evolved swiftly since the advent of the printing press, from user-intensive methods such as manual research, to a reduction in user involvement through Internet search engines. Now, narrow AI using natural language processing is transitioning to AI-enabled predictive learning. Through these AI-enabled virtual entities, New Humans will carry targeted, predictive, and continuous learning assistants with them. These assistants will observe, listen, and process everything of relevance to the learner and then deliver them information as necessary.

There is an abundance of research on the stark contrast between the three generations currently in the workforce: Baby Boomers, Generation X, and Millennials.4, 5 There will be similar fundamental differences between Old Humans and New Humans and their learning styles. The New Human likely will value experiential learning over traditional classroom learning.6 The convergence of mixed reality and advanced, high fidelity modeling and simulation will provide New Humans with immersive, experiential learning. For example, Soldiers learning military history and battlefield tactics will be able to experience it ubiquitously, observing how each facet of the battlefield affects the whole in real-time as opposed to reading about it sequentially. Soldiers in training could stand next to an avatar of General Patton and experience him explaining his command decisions firsthand.

There is an opportunity for the Army to adapt its education and training to these growing differences. The Army could—and eventually will need to—recruit, train, and develop New Humans by altering its current structure and recruitment programs. It will become imperative to conduct training with new tools, materials, and technologies that will allow Soldiers to become information assimilators. Additionally, the incorporation of experiential learning techniques will enrich Soldiers’ learning. There is an opportunity for the Army to pave the way and train its Soldiers with cutting-edge technology rather than trying to belatedly catch up to what is publicly available.

Evolution in Learning Technologies

If you enjoyed this post, please also watch Elliott Masie‘s video presentation on Dynamic Readiness and Marc Prensky‘s presentation on The Future of Learning from the Mad Scientist Learning in 2050 Conference

… see the following related blog posts:

… and read The Mad Scientist Learning in 2050 Final Report.


1 Prensky, Marc, Mad Scientist Conference: Learning in 2050, Georgetown University, 9 August 2018

2 http://www.pewinternet.org/fact-sheet/mobile/

3 http://www.pewresearch.org/fact-tank/2017/12/12/nearly-half-of-americans-use-digital-voice-assistants-mostly-on-their-smartphones/

4 https://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Generational-issues-in-the-workplace.aspx

5 https://blogs.uco.edu/customizededucation/2018/01/16/generational-differences-in-the-workplace/

6 https://www.apa.org/monitor/2010/03/undergraduates.aspx

116. Three Futurist Urban Scenarios

[Editor’s Note: Mad Scientist welcomes back returning guest blogger Dr. Nir Buras with today’s post.  We’ve found crowdsourcing (i.e., the gathering of ideas, thoughts, and concepts from a widespread variety of interested individuals) to be a very effective tool in enabling us to diversify our thoughts and challenge our assumptions.  Dr. Buras’ post takes the results from one such crowdsourcing exercise and extrapolates three future urban scenarios.  Given The Army Vision‘s clarion call to “Focus training on high-intensity conflict, with emphasis on operating in dense urban terrain,” our readers would do well to consider how the Army would operate in each of Dr. Buras’ posited future scenarios…]

The challenges of the 21st century have been forecast and are well known. In many ways we are already experiencing the future now. But predictions are hard to validate. A way around that is turning to slightly older predictions to illuminate the magnitude of the issues and the reality of their propositions.1 Futurists William E. Halal and Michael Marien’s predictions of 2011 have aged enough to be useful. Using an improved version of the Delphi method, they iteratively built consensus among participants. Halal and Marien balanced the individual judgments of over sixty well-qualified experts and thinkers, representing a range of technologies, against facilitated feedback from the others, translating their implicit or tacit know-how into qualified, quantitative, empirical predictions.2
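
The iterative consensus-building at the heart of the Delphi method can be sketched as a toy simulation: each round, every expert sees the group median and revises their estimate partway toward it, narrowing the spread over successive rounds. The data, the revision rule, and the 50% pull factor below are illustrative assumptions, not Halal and Marien’s actual protocol:

```python
from statistics import median

def delphi_rounds(estimates, rounds=3, pull=0.5):
    """Toy Delphi iteration: each round, every expert revises their
    estimate `pull` of the way toward the current group median."""
    for _ in range(rounds):
        m = median(estimates)
        estimates = [e + pull * (m - e) for e in estimates]
    return estimates

# Hypothetical expert forecasts (e.g., the year a technology matures).
forecasts = [2030, 2035, 2040, 2050, 2070]
result = delphi_rounds(forecasts)
print(f"median: {median(result):.0f}, spread: {max(result) - min(result):.1f}")
```

With this rule the group median is preserved while the spread halves every round, which is the qualitative behavior Delphi facilitators aim for: convergence toward a consensus without simply averaging away outlier expertise.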

From their research we can transpose three future urban scenarios: The High-Tech City, The Feral City, and Muddling Through.

The High-Tech City

The High-Tech City scenario is based primarily on futurist Jim Dator’s high-tech predictions. It envisions the continued growth of a technologically progressive, upwardly mobile, internationally dominant, science-guided, rich, leisure-filled, abundant, and liberal society. Widespread understanding of what works largely avoids energy shortages, climate change, and global conflict.3

The high-tech, digital megacity is envisaged as a Dubai on steroids. It is hyper-connected and energy-efficient, powered by self-sustaining, renewable resources and nuclear energy.4

 

Connected by subways and skyways, with skyscraping vertical gardens, the cities are ringed by elaborately managed green spaces and ecosystems. The city’s 50 to 150-story megastructures, “cities-in-buildings,” incorporate apartments, offices, schools and grocery stores, hospitals and shopping centers, sports facilities and cultural centers, gardens, and running tracks. Alongside them rise vertical farms housing animals and crops. The rooftop garden of the 2015 film High Rise depicts how aerial terraces up high provide a sense of suburban living in the high-tech city.5

On land, zero-emission driverless traffic zips about on intelligent highways. High-speed trains glide silently by. After dark, spider bots and snake drones automatically inspect and repair buildings and infrastructure.6

In the air, helicopters, drones, and flying cars zoom around. Small drones, mimicking insects and birds, and programmable nano-chips, some as small as “smart” dust, swarm over the city into any object or shape on command. To avoid surface traffic, inconvenience, and crime, wealthier residents fly everywhere.7

Dominated by centralized government and private sector bureaucracies wielding AI, these self-constructing robotic “cyburgs” have massive technology, robotics, and nanotechnology embedded in every aspect of their life, powered by mammoth fusion energy plants.8

Every unit of every component is embedded with at least one flea-size chip. Connected into a single worldwide digital network, trillions of sensors monitor countless parameters for the city and everything in it. The ruling AI, commanded directly by individual minds, autonomously creates, edits, and implements software, simultaneously processing feedback from a global network of sensors.9

 

Metropolis by Fritz Lang was the first film to show a city of the future as a modernist dystopia. / Produced by Ufa.

The High-Tech City is not a new concept. It goes back to Jules Verne, H. G. Wells, and Fritz Lang, who most inspired its urban look in the 1927 film Metropolis. The extrapolated growth of technology has long been the basis for predictions. But professional futurists surprisingly agree that a High-Tech Jetsons scenario has only a 0%-5% probability of being realized.10

Poignantly, the early predictors transmitted a message that the stressful lifestyle of the High-Tech City contradicts the intention of freedom from drudgery. Moreover, the High-Tech megacities’ appetite for minerals may lay waste to whole ecosystems. Much of the earth may become a feral wilderness. Massive, centralized AI Internet clouds and distribution systems give a false sense of cultural robustness. People become redundant and democracy meaningless. The world may fail to react to accelerated global crises, with disastrous consequences. The paradoxical obsolescence of high-tech could slide humanity into a new Dark Age.11

The Feral City

Futurists disturbingly describe a Decline to Disaster scenario as five times more likely to happen than the high-tech one. From Tainter’s theory of collapse and Jane Jacobs’s Dark Age Ahead we learn that the cycles of urban problem-solving lead to more problems and ultimately failures. If Murphy’s Law kicks in, futurists predict a 60% chance that large parts of the world may be plunged into an Armageddon-type techno-dystopian scenario, typified by the films Mad Max (1979) and Blade Runner (1982).12

Apocalyptic feral cities, once vital components in national economies, are routinely imagined as vast, sprawling urban environments defined by blighted buildings. Immense petri dishes of both ancient and new diseases, these cities have seen the rule of law long since replaced by gang anarchy, and the only security available in them is attained through brute power.13

Neat suburban areas were long ago stripped for their raw materials. Daily life in feral cities is characterized by a ubiquitous specter of murder, bloodshed, and war, of the militarization of young men, and the constant threat of rape to females. Urban enclaves are separated by wild zones, fragmented habitats consisting of wild nature and subsistence agriculture. With minimal or no sanitation facilities, a complete absence of environmental controls, and massive populations, feral cities suffer from extreme air pollution from vehicles and the use of open fires and coal for cooking and heating. In effect toxic-waste dumps, these cities pollute vast stretches of land, poisoning coastal waters, watersheds, and river systems throughout their hinterlands.14

Pollution is exported outside the enclaves, where the practices of the desperately poor, and the extraction of resources for the wealthy, induce extreme environmental deterioration. Rivers flow with human waste and leached chemicals from mining, contaminating much of the soil on their banks.15

Globally connected, a feral city might possess a modicum of commercial linkages, and some of its inhabitants might have access to advanced communication and computing. In some areas, agriculture might be forcibly devoted to high-yield, GMO, and biomass crops. But secure long-distance travel nearly disappears, undertaken mostly by the super-rich and otherwise powerful.16

Dystopian reality and dystopian art: (a) Bangladeshis hanging from a train at Ijtema in 2017 already live the dystopian future. (b) A dystopian city

Futurists backcasting from 2050 say that the current urbanization of violence and war is a harbinger of the feral city scenario. But feral cities have long been present. The Warsaw Ghetto in World War Two was among them, as were Los Angeles’s Watts neighborhood in the 1960s and 1990s, Mogadishu in 2003, and Gaza repeatedly.17

Walled City of Kowloon

Conflict and crime changed once charming, peaceful Aleppo, Bamako, Caracas, Erbil, Mosul, Tripoli, and Salvador into feral cities. Medieval San Gimignano was one. Spectacularly, from 1898 to 1994, the ghastly spaces of Hong Kong’s singular urban phenomenon, the Walled City of Kowloon, provided a living example.18

Muddling Through

The good news is that futurists tend to assign a 65%-85% probability to a Muddling Through scenario. Despite interlinked, cascading catastrophes, they suggest that technologies may gain some ground on the problems, somehow securing a sustainable world for 9 billion people by 2050. The world, they suggest, will be massively changed, yet still livable.19

Lending credibility to the Muddling Through scenario is that it blends numerous hypotheses. It predicts that people living in rural communities will tend the land scientifically. Its technological salvation hypothesis posits that science will come to the rescue. Its free market hypothesis assumes that commerce will drive technological advancements.20

It pictures a “conserver” society tinged by Marxism, a neo-puritan “ecotopia,” colored by both the high-tech and feral scenarios. Tropical diseases, corruption, capitalism, socialism, inequality, and war are not eradicated. But nationalism, tribalism, and xenophobia are reduced after global traumas. Though measurably poorer, most people will still have a reasonable level of wellbeing.21

According to the Muddling Through scenario, large cities retract and densify around their old centers and waterfronts. Largely self-sufficient, small towns and cities survive amid the ruins of suburban sprawl, separated by resurgent forests and fields. Shopping malls, office towers and office parks, town dumps, tract homes, and abandoned steel and glass buildings are stripped for their recyclables. Unsalvageable downtowns in some cases go feral.22

A mix of high and low tech fosters digital communication with those at a distance. There would be drip irrigation, hydroponic farming, aquaculture, and grey water recycling, overlaid with artificial intelligence, biotechnology and biomimicry, nuclear power, geoengineering, and oil from algae.23

In some places, rail links are maintained, but cars are a rarity, and transportation is greatly reduced. Collapsed or dismantled freeways and bridges return to the forest or desert. While flying still exists, it is rarer. But expanded virtual mobility offering “holodeck” experiences subsumes tourism. Cosmopolitanism happens on the porch with an iPad.24

Surprisingly, the Muddling Through scenario ends up with an urban fabric similar in its properties to what homeostatic planning would have produced had it been pursued intentionally. Work is a short walk from home. Corner stores pop up, as do rudimentary cafés, bistros, and other gathering places. Forty percent of food is produced in or around cities on small farms. Wildlife returns to roam freely. Groups of travelers move on surviving “high roads.” Communities meet at large sports venues situated in the countryside between them.25

Sea level rise is met with river and sea walls. At their base, vast new coral beds and kelp forests grow over the skeletons of submerged districts and towns. In a matter of years, rivers and seas build new beaches. Their flood plains are populated with new plants. Smaller scale trade waterfronts are reactivated for shipping, and some ships are even powered by sail. Cities occupying harbors, rivers, and railroad junctions reconnect to distant supply chains, mostly for non-quotidian (i.e., luxury) goods.26

Learning from Rome to Understand Detroit

Rome’s deterioration from a third-century city of more than 1,000,000 people started long before it was acknowledged. An unnoticed population drop to 800,000 was accompanied by ever larger buildings of decreasing beauty and craft, including the huge Baths of Diocletian (298-306 CE). Anticipating barbarian invasions, Romans built the city’s walls (271-275 CE). Rome was sacked twice (410 and 455 CE).27

But as if in a dream, fifth-century life for the diminished but still substantial population continued as normal. Invading Goths maintained Rome’s Senate, taxes, and police. But administrative and military infrastructure evaporated. An unraveling education system led to rising illiteracy. Noble families turned to mob politics, economic and social linkages broke down, travel and transportation became unsafe, and manufacturing collapsed.28

Rome when it was empty: Campo Vaccino (Empty Field), Claude Lorrain (1604/1605–1682), 1636 / Source: Louvre, Paris

By 500 CE, Rome had fewer than 100,000 people. Systematic agriculture disappeared, and much land returned to forest. The Pope and nobility pillaged abandoned public buildings for their materials. The expansive city was reduced to small groups of inhabited buildings, interspersed among large areas of abandoned ruins and overgrown vegetation. In the 12th and 13th centuries the population of Rome was possibly as few as 20,000 people.29

The long journey from first cities, to Ancient Greece, Rome, and the Middle Ages, through Paris, Washington, and Shanghai, helps us understand how our cities might end up. Holding Rome up to the mirrors of history reads like backcasting Rome’s decline and survival in a Muddling Through scenario from today’s view. Halal predicted that muddling would start about 2023 to 2027 and that if we weren’t muddling by then, collapse would set in by 2029.30

Detroit started muddling in 1968. New York proved to be a fragile city during blackouts, as did Dubai in its 2009 financial crisis. Since the 1970s, most of America’s ten “dead cities,” many formerly among its largest and most vibrant, have come disturbingly close to being feral. The overlapping invisibilities of heavily armed warlords and brutal police make the favelas of Medellín and Rio de Janeiro virtually feral.31

Today we are at a tipping point. We can wait for the collapse of systems to reach homeostasis or attain it intentionally by applying Classic Planning principles.32

If you enjoyed this post, please also see Dr. Buras’ other posts:

Nir Buras is a PhD architect and planner with over 30 years of in-depth experience in strategic planning, architecture, and transportation design, as well as teaching and lecturing. His planning, design, and construction experience includes East Side Access at Grand Central Terminal, New York; International Terminal D, Dallas-Fort Worth; the Washington, DC, Dulles Metro line; and work on the US Capitol and the Senate and House Office Buildings in Washington. Projects he has worked on have been published in the New York Times, the Washington Post, local newspapers, and trade magazines. Buras, whose original degree was in architecture and town planning, learned his first lesson in urbanism while planning military bases in the Negev Desert in Israel. Engaged in numerous projects since then, Buras has watched first-hand how urban planning impacts architecture. After a decade of applying in practice the classical method he learned in post-doctoral studies, his book, *The Art of Classic Planning* (Harvard University Press, 2019), presents the urban design and planning method of Classic Planning as a path forward for homeostatic, durable urbanism.


1 Population growth, clean water, compromised resilience of infrastructures, drug-resistant microbes, pandemics, possible famine, authoritarian regimes, social breakdowns, terrestrial cataclysms, terrorist mischief, nuclear mishaps, perhaps major war, inequity, education and healthcare collapse, climate change, ecological devastation, biodiversity loss, ocean acidification, world confusion, institutional gridlock, failures of leadership, failure to cooperate. Sources include: Glenn, Jerome C., Theodore J. Gordon, Elizabeth Florescu, 2013-14 State of the Future Millennium Project: Global Futures Studies and Research, Millennium-project.org (website), Washington, DC, 2014; Cutter, S. L. et al., Urban Systems, Infrastructure, and Vulnerability, in Climate Change Impacts in the United States: The Third National Climate Assessment, in Melillo, J. M. et al., (eds.), U.S. Global Change Research Program, 2014, Ch. 11, pp. 282-296; Kaminski, Frank, A review of James Kunstler’s The Long Emergency 10 years later, Mud City Press (website), Eugene, OR, 9 March 2015; Urban, Mark C., Accelerating extinction risk from climate change, Science Magazine, Vol. 348, Issue 6234, 1 May 2015, pp. 571-573; Kunstler, J.H., Clusterfuck Nation: A Glimpse into the Future, Kunstler.com (website), 2001b; US Geological Survey, Materials Flow and Sustainability, Fact Sheet FS-068-98, June 1998; Klare, M. T., The Race for What’s Left, Metropolitan Books, New York, 2012; Drielsma, Johannes A. et al., Mineral resources in life cycle impact assessment – defining the path forward, International Journal of Life Cycle Assessment, 21 (1), 2016, pp. 85-105; Meinert, Lawrence D. 
et al., Mineral Resources: Reserves, Peak Production and the Future, Resources 5(14), 2016; OECD World Nuclear Agency and International Atomic Energy Agency, 2004; Tahil, William, The Trouble with Lithium Implications of Future PHEV Production for Lithium Demand, Meridian International Research, 2007; Turner, Graham, Cathy Alexander, Limits to Growth was right. New research shows we’re nearing collapse, Guardian, Manchester, 1 September 2014; Kelemen, Peter, quoted in Cho, Renee, Rare Earth Metals: Will We Have Enough?, in State of the Planet, News from the Earth Institute, Earth Institute, Columbia University, September 19, 2012; Griffiths, Sarah, The end of the world as we know it? CO2 levels to reach a ‘tipping point’ on 6 June – and Earth may never recover, expert warns, Daily Mail, London, 12 May 2016; van der Werf, G.R. et al., CO2 emissions from forest loss, Nature Geoscience, Volume 2, November 2009, pp. 737–738; Global Deforestation, Global Change Program, University of Michigan, January 4, 2006; Arnell, Nigel, Future worlds: a narrative description of a plausible world following climate change, Met Office, London, 2012; The End, Scientific American, Special Issue, Sept 2010; Dator, Jim, Memo on mini-scenarios for the pacific island region, 3, November, 1981b, quoted in Bezold, Clement, Jim Dator’s Alternative Futures and the Path to IAF’s Aspirational Futures, Journal of Futures Studies, 14(2), November 2009, pp. 123 – 134.

2 Halal, William, Through the megacrisis: the passage to global maturity, Foresight Journal, Vol. 15, No. 5, 2013a, pp. 392-404; Halal, William E., and Michael Marien, Global MegaCrisis Four Scenarios, Two Perspectives, The Futurist, Vol. 45, No. 3, May-June 2011; Halal, William E., Forecasting the technology revolution: Results and learnings from the TechCast project, Technological Forecasting and Social Change, 80.8, 2013b, pp. 1635-1643; TechCast Project, George Washington University, TechCast.org (website), Washington, DC, N.D.; National Research Council, Persistent Forecasting of Disruptive Technologies—Report 2, The National Academies Press, Washington, DC, 2010; Halal, William E., Technology’s Promise: Expert Knowledge on the Transformation of Business and Society, Palgrave Macmillan, London, 2008; Halal et al., The GW Forecast of Emerging Technologies, Technology Forecasting & Social Change, Vol. 59, 1998, pp. 89-110. The name was inspired by the oracle at Delphi (8th century BCE to 390 CE). The modern Delphi Method helps uncover data and collect and distill the judgments of experts using rounds of questionnaires, interspersed with feedback. Each round is developed based on the results of the previous, until the research question is answered, a consensus is reached, theoretical saturation is achieved, or sufficient information has been exchanged. Linstone, Harold A., & Murray Turoff (eds.), The Delphi method: Techniques and applications, Addison-Wesley, London, 1975; Halal, William E., Business Strategy for the Technology Revolution: Competing at the Edge of Creative Destruction, Journal of Knowledge Economics, Springer Science+Business Media, New York, September 2012. The author consolidated both of Halal and Marien’s muddling scenarios into one. The uncertainty of each particular forecast element was about 20%-30%.

3 Dator, James, Advancing Futures, Westport: Ct, Praeger, 2002; Bezold, 2009.

4 Chan, Tony, in Reubold, Todd, Envision 2050: The Future of Cities, Ensia.com (website), 16 June, 2014; Kunstler, James Howard, Back to the Future, Orion Magazine, June 23, 2011. Urry, John et al., Living in the City, Foresight, Government Office for Science, London, 2014; Hoff, Mary, Envision 2050: The Future of Transportation, Ensia.com (website), 31 March, 2014.

5 Kaku, Michio, The World in 2100, New York Post, New York, 20 March 2011. Tonn, Bruce E., LeCorbusier Meets the Jetsons in Anytown U.S.A. in the Year 2050: Glimpses of the Future, Planning Forum, Community and, Regional Planning, Volume 8, School of Architecture, The University of Texas, Austin, 2002; Urry et al., 2014.

6 Kaku, 2011; Hon, 2016. Rubbish bins will send alarms when they are nearly full. Talking garbage bins will reward people with poems, aphorisms, and songs for placing street rubbish in the bin. Heinonen, 2013.

7 Urry et al., 2014.

8 Heinonen, 2013. The prefix cy*, an abbreviation of cybernetics, relates to computers and virtual reality. The suffix *burg means city, fortified town. Urrutia, Orlando, Eco-Cybernetic City of the Future, Pacebutler.com (website), 12 February 2010; Tonn, 2002.

9 Shepard, M., Sentient City: Ubiquitous Computing, Architecture, And The Future of Urban Space. MIT Press, Cambridge, 2011; Kurzweil, Ray, The Singularity is Near, Penguin Group, New York, 2005. Some futurists predict that the energy required to keep a “global brain” operating may so deplete energy that it will bankrupt society and cause total collapse. Heinonen, 2013. The terms smart city, intelligent city, and digital city are sometimes synonymous, but the digital or intelligent city is considered heavily technological. Heinonen, 2013; Giffinger, Rudolf et al., Smart cities – Ranking of European medium-sized cities. Centre of Regional Science, Vienna UT, October 2007; Kaku, 2011; Vermesan, Ovidiu and Friess, Peter, Internet of Things: Converging Technologies for Smart Environments and Integrated Ecosystems, River Publishers, Aalborg DK, 2013; Cooper, G., Using Technology to Improve Society, The Guardian, Manchester, 2010; Heinonen, 2013. Typical smart city programs utilize traffic data visualization, smart grids, smart water and e-government solutions, The Internet, smartphones, inexpensive sensors, and mobile devices. Amsterdam, Dubai, Cairo, Edinburg, Malaga, and Yokohama have smart city schemes. Webb, Molly et al., Information Marketplaces: The New Economics of Cities, The Climate Group, ARUP, Accenture and The University of Nottingham, 2011.

10 Dator, 2002; Bezold, 2009. The Jetsons originally ran a single season in 1962-63. It was revived but not resuscitated in 1985. The term Jetsons today stands for “unlikely, faraway futurism.” Novak, Matt, 50 Years of the Jetsons: Why The Show Still Matters, Smithsonian.Com, 19 September 2012.

11 Perrow, Charles, Normal Accidents: Living with High-Risk Technologies, Basic Books, New York, 1984. By adding complexity, including conventional engineering warnings, precautions, and safeguards, systems failure not only becomes inevitable, but it may help create new categories of accidents, such as those of Bhopal, the Challenger disaster, Chernobyl, and Fukushima. Deconcentrating high-risk populations, corporate power, and critical infrastructures is suggested. Perrow, Charles, The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters, Princeton University Press, Princeton, 2011; Turner, 2014; Jacobs, Jane, Dark Age Ahead, Random House, New York, 2004, p.24.

12 Jacobs, 2004; Dirda, Michael, A living urban legend on the sorry way we live now, Washington Post, Washington DC, 6 June, 2004; Dator, 2002; Bezold, 2009; Dator, James, Alternative futures & the futures of law, in Dator, James & Clement Bezold (eds.), Judging the future, University of Hawaii Press, Honolulu, 1981. pp.1-17; Halal, 2013b.

13 The term feral city was coined in Norton, Richard J., Feral Cities, Naval War College Review, Vol. LVI, No. 4, Autumn 2003. See also Brunn, Stanley D. et al., Cities of the World: World Regional Urban Development, Rowman & Littlefield, Lanham, MD, 2003, pp. 5–14, chap. 1.

14 Norton, 2003.

15 Urry, J., Offshoring. Polity, Cambridge, 2014; Gallopin, G., A. Hammond, P. Raskin, R. Swart, Branch Points, Global Scenario Group, Stockholm Environment Institute, Stockholm, 1997, p. 34. Norton, 2003.

16 Tonn, 2002; Urry et al., 2014.

17 Backcasting is future hindsight. Kilcullen, David, Out of the Mountains: The Coming Age of the Urban Guerrilla, Oxford University Press, Oxford, 2013.

18 Heterotopia, in Foucault, Michel, The Order of Things, Vintage Books, New York, 1971; Foucault, M., Of Other Spaces, Diacritics 16, 1986, pp. 22-27. Girard, Greg, and Ian Lambot, City of Darkness: Life in Kowloon Walled City, Watermark, Chiddingfold, 1993, 2007, 2014; Tan, Aaron Hee-Hung, Kowloon Walled City: Heterotopia in a Space of Disappearance (Master’s Thesis), Harvard University, Cambridge, MA, 1993; Sinn, Elizabeth, Kowloon Walled City: Its Origin and Early History (PDF). Journal of the Hong Kong Branch of the Royal Asiatic Society, 27, 1987, pp. 30–31; Harter, Seth, Hong Kong’s Dirty Little Secret: Clearing the Walled City of Kowloon, Journal of Urban History 27, 1, 2000, pp. 92-113; Grau, Lester W. and Geoffrey Demarest, Diehard Buildings: Control Architecture a Challenge for the Urban Warrior, Military Review, Combined Arms Center, Fort Leavenworth, Kansas, September / October 2003; Kunstler, James Howard, A Reflection on Cities of the Future, Energy Bulletin, Post Carbon Institute, 28 September, 2006; ArenaNet Art Director Daniel Dociu wins Spectrum 14 gold medal!, Guild Wars.com (website), 9 March 2007. Authors, game designers, and filmmakers used the Walled City to convey a sense of feral urbanization. It was the setting for Jean-Claude Van Damme’s 1988 film Bloodsport; Jackie Chan’s 1993 film Crime Story was partly filmed there during among genuine scenes of building demolition; and the video game Shadowrun: Hong Kong features a futuristic Walled City. Today the location of the former Kowloon Walled City is occupied by a park modelled on early Qing Dynasty Jiangnan gardens.

19 Halal, 2013a; Wright, Austin Tappan, Islandia, Farrar & Rinehart, New York, Toronto, 1942; Tonn, Bruce E., Anytown U.S.A. in the Year 2050: Glimpses of the Future, Planning Forum, Community and, Regional Planning, Volume 8, School of Architecture, The University of Texas, Austin, 2002; Porritt, Jonathon, The World We Made: Alex McKay’s Story from 2050, Phaidon Press, London, 2013. World Made by Hand novels by James Howard Kunstler: World Made By Hand, Grove Press, New York, 2008; The Witch of Hebron, Atlantic Monthly Press, 2010; A History of the Future, Atlantic Monthly, 2014; The Harrows of Spring, Atlantic Monthly Press, 2016

20 Turner, 2014.

21 Dator, 2002; Bezold, 2009; Dator & Bezold, 1981; Dator, 1981a; Dator, 1981b; Dator, James, The Unholy Trinity, Plus One (Preface), Journal of Futures Studies, University of Hawaii, 13(3), February 2009, pp. 33 – 48; McDonough, William & Michael Braungart, Cradle to Cradle: Remaking the Way We Make Things, Macmillan, New York, 2002; Porritt, 2013; Urry et al., 2014.

22 Wright, 1942; Kunstler, 2011; Givens, Mark, Bring It On Home: An Interview with James Howard Kunstler, Urban Landscapes and Environmental Psychology, Mung Being (website), Issue 11, N.D., p. 30; Kunstler, World Made by Hand series.

23 Tonn, 2002; Mollison, B. C. Permaculture: A Designer’s Manual. Tagari Publications, Tyalgum, Australia, 1988; Holmgren, D. and B. Mollison, Permaculture One, Transworld Publishers, Melbourne, 1978; Holmgren, D., Permaculture: Principles and Pathways beyond Sustainability, Holmgren Design Services, Hepburn, Victoria, Australia, 2002; Holmgren, David, Future Scenarios: How Communities Can Adopt to Peak Oil and Climate Change, Chelsea Green Publishing White River Junction, Vermont, 2009; Walker, L., Eco-Village at Ithaca: Pioneering a Sustainable Culture, New Society Publishers, Gabriola Island, 2005; Hopkins, R., The Transition Handbook: From Oil Dependency to Local Resilience, Green Books, Totnes, Devon, 2008; Urry et al., 2014; Porritt, 2013.

24 Urry et al., 2014; Porritt, 2013; Caletrío, Javier, “The world we made. Alex McKay’s story from 2050” by Jonathon Porritt (review), Mobile Lives Forum, forumviesmobiles.org (website), 21 May 2015.

25 Kunstler, 2001b.

26 Kunstler, 2006; Williams, 2014.

27 Krautheimer, Richard, Rome: Profile of A City, 312-1308, Princeton University Press, Princeton, 1980.

28 Palmer, Ada, The Shape of Rome, exurbe.com (website), Chicago, 15 August 2013.

29 Procopius of Caesarea (c. 490/507 - c. 560s), Dewing, H. B., and Glanville Downey (trans.), Procopius, Harvard University Press, Cambridge, MA, 2000. On the Wars in eight books (Polemon or De bellis) was published in 552, with an addition in 554; Storey, Glenn R., The population of ancient Rome, Antiquity, December 1, 199; Wickham, Chris, Medieval Rome: Stability and Crisis of a City, 900-1150, Oxford Studies in Medieval European History, Oxford University Press, New York, Oxford, 2015. Population numbers are uncertain well into the Renaissance. Krautheimer, 1980.

30 Porritt, 2013; Alexander, Samuel, Resilience through Simplification: Revisiting Tainter’s Theory of Collapse, Simplicity Institute Report, Melbourne (?), 2012b; Palmer, 2013: Halal, 2013a, 2013b.

31 America’s “Ten Dead Cities” in 2010: Buffalo; Flint; Hartford; Cleveland; New Orleans; Detroit; Albany; Atlantic City; Allentown, and Galveston. McIntyre, Douglas A., America’s Ten Dead Cities: From Detroit to New Orleans, 24/7 Wall Street (website), 23 August, 2010; Gibson, Campbell, Population of The 100 Largest Cities And Other Urban Places In The United States: 1790 To 1990, Population Division, U.S. Bureau of the Census, Washington, DC, June 1998. See also “America’s 150 forgotten cities.” Hoyt, Lorlene and André Leroux, Voices from Forgotten Cities Innovative Revitalization Coalitions in America’s Older Small Cities, MIT, Cambridge, MA, 2007; Manaugh, Geoff, Cities Gone Wild, Bldgblog.com (website), 1 December 2009.

32 Buras, Nir, The Art of Classic Planning for Beautiful and Enduring Communities, Harvard University Press, Cambridge, 2019.

115. War Laid Bare

[Editor’s Note:  Multi-Domain Operations (MDO) describes how the U.S. Army, as part of the joint force, can counter and defeat a near-peer adversary capable of contesting the U.S. in all domains, in both competition and armed conflict.  MDO provides commanders numerous options for executing simultaneous and sequential operations using surprise and the rapid and continuous integration of capabilities across all domains to present multiple dilemmas to an adversary in order to gain physical and psychological advantages and influence and control over the operational environment.

Today’s guest blog post by Mr. Matthew Ader, however, addresses the advent of inexpensive CubeSats, capable of providing global surveillance at a fraction of the cost of legacy spy satellites, and how they could usher in the end of covert movement for combat units and their associated logistical support, and with that the demise of strategic and operational deception and surprise.]

One of the key factors of war since time immemorial has been uncertainty. The dispositions of the enemy, the strength of their industry, the will of their people – all have been guessed at, but rarely known for certain. Thanks to a pair of companies in California, that is about to change. Deception is dying.

Planet Labs operates a constellation of around 180 CubeSats – shoebox-sized satellites in low earth orbit. Each day, they photograph the entirety of the globe, sending 6 terabytes of data to Earth for use. That capacity alone is valuable, but the sheer volume of data makes it impossible to analyze quickly.
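To put that volume in rough perspective, here is a back-of-envelope sketch using only the two figures quoted above (180 satellites, 6 terabytes per day); the even per-satellite split is an illustrative assumption, not a Planet Labs figure:

```python
# Rough scale of the daily downlink described above:
# ~180 CubeSats producing ~6 TB of imagery per day in total.
TB = 10**12  # decimal terabyte, in bytes

num_satellites = 180
daily_total_bytes = 6 * TB

# Assume an even split across the constellation (illustrative only).
per_satellite_gb = daily_total_bytes / num_satellites / 10**9
print(f"~{per_satellite_gb:.0f} GB of imagery per satellite per day")  # ~33 GB
```

Tens of gigabytes per satellite, arriving every single day, is precisely the workload that overwhelms human analysts and invites machine analysis.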

For a human.

Artificial Intelligence (AI) image analysis is not so limited. This has been recognized and operationalized by Orbital Insight, a company specializing in AI image analysis. Partnering with Planet Labs, Orbital Insight delivers unique intelligence – for example, counting cars in parking lots to determine market movements. If they can count cars, they can certainly count tanks.

And, unlike conventional satellites, CubeSat imagery is cheap. It costs about US$100,000 to put one into orbit. The cost of a Planet Labs satellite is not easily available, but a similarly sized CubeSat costs an estimated US$30,000. A 180-satellite constellation would therefore cost about US$23.4 million, around a third of the price of a single F-35. If more timely imagery is required, buying more satellites is not an obstacle. It’s harder to find solid numbers for AI, but Project Maven, the DoD’s flagship image analysis research program, was budgeted at $93 million a year.
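The cost comparison can be checked directly from the per-unit figures above; the F-35 unit price used here (~US$80 million) is an outside assumption for illustration, not a figure from the text:

```python
# Constellation cost from the per-unit numbers cited above.
launch_cost = 100_000   # ~US$100k to put one CubeSat into orbit
build_cost = 30_000     # estimated build cost of a similarly sized CubeSat
n_sats = 180

constellation_cost = n_sats * (launch_cost + build_cost)
print(f"Constellation: ${constellation_cost / 1e6:.1f}M")  # $23.4M

f35_unit_cost = 80_000_000  # assumed rough F-35 unit price, for comparison
print(f"Roughly {constellation_cost / f35_unit_cost:.0%} of one F-35")
```

Because the cost scales linearly with the number of satellites, doubling the constellation for fresher revisit times roughly doubles a total that is still small next to a single combat aircraft.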

Therefore, it’s not implausible that given some years for technology to mature and a few billion dollars investment1, any national military will have the capability to persistently surveil the entire Earth. A combination of camouflage and low-resolution satellite cameras will probably preserve tactical deception. But strategic and operational deception, the covert movement of battalions and carrier strike groups, will be impossible. That is a revolution in military affairs.

In particular, logistics will become very difficult. The depots and truck convoys required to sustain a modern army will be easily visible. Long range, uninterceptable hypersonic weapons can then strike these targets with impunity. Even absent high-tech hypersonics, conventional missiles and rocket artillery can still have a serious impact. The result is that deploying and sustaining any sizeable force against an enemy with a large CubeSat constellation will be very difficult.

In trying to predict the future of war, it is easy to fall prey to LTG H.R. McMaster’s ‘vampire fallacy’ of thinking new technology will deliver bloodless, decisive victory. Certainly, there is a range of factors that could mitigate the incredible intelligence advantages of CubeSat constellations. These range from better cyberwarfare to degrade enemy intelligence sharing, to more effective missile defense, to directly attacking the CubeSats themselves.

These mitigating factors will not emerge on their own. It will take years of hard work to develop and deploy them. The U.S. military, in partnership with its allies, must take the lead in developing its own CubeSat constellations and countermeasures. Because if it doesn’t, someone else will – and the results for U.S. power could be catastrophic.

If you enjoyed reading this post, please also see our related discussions regarding Concealment in the Fundamental Questions Affecting Army Modernization post and Finders vs Hiders in our Timeless Competitions post…

… and our Star Wars 2050 post by Ms. Marie Murphy.

Mr. Matthew Ader is a first-year undergraduate reading War Studies at King’s College London.


1 Accounting for ground receiving stations, CubeSat replacements, additional staff and other associated costs.

114. Mad Scientist Science Fiction Writing Contest 2019

Futuristic tank rendering  / Source: U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC)

[Editor’s Note:  Storytelling is a powerful tool that allows us to envision how innovative technologies could be employed and operationalized in the Future Operational Environment.  Mad Scientist is seeking your visions of future combat with our Science Fiction Writing Contest 2019.  Our deadline for submission is 1 APRIL 2019, so please review the contest details below, get those creative writing juices flowing, and send us your visions of combat in 2030!]

Still from “The Future of the Soldier” video / Source:  U.S. Army Natick Soldier Research Development and Engineering Center

Background: The U.S. Army finds itself at a historical inflection point, where disparate, yet related elements of an increasingly complex Operational Environment (OE) are converging, creating a situation where fast moving trends are rapidly transforming the nature of all aspects of society and human life – including the character of warfare. It is important to take a creative approach to projecting and anticipating both transformational and enduring trends that will lend themselves to the depiction of the future. In this vein, the U.S. Army Mad Scientist Initiative is seeking your creativity and unique ideas to describe a battlefield that does not yet exist.

Illustration from “Silent Ruin” by Don Hudson & Kinsun Lo / Source:   U.S.  Army Cyber Institute at West Point

Task: Write about the following scenario – On March 17th, 2030, the country of Donovia, after months of strained relations and covert hostilities, invades neighboring country Otso. Donovia is a wealthy nation that is a near-peer competitor to the United States. Like the United States, Donovia has invested heavily in disruptive technologies such as robotics, AI, autonomy, quantum information sciences, bio enhancements and gene editing, space-based weapons and communications, drones, nanotechnology, and directed energy weapons. The United States is a close ally of Otso and is compelled to intervene due to treaty obligations and historical ties. The United States is about to engage Donovia in its first battle with a near-peer competitor in over 80 years…

Three ways to approach:
1) Forecasting – Description of the timeline and events leading up to the battle.
2) Describing – Account of the battle while it’s happening.
3) Backcasting – Retrospective look after the battle has ended (i.e., After Action Review or lessons learned).

Three questions to consider while writing (U.S., adversaries, and others):
1) What will forces and Soldiers look like in 2030?
2) What technologies will enable them or be prevalent on the battlefield?
3) What do Multi-Domain Operations look like in 2030?

Submission Guidelines:
– No more than 5000 words in length
– Provide your submission in .doc or .docx format
– Please use conventional text formatting (e.g., no columns) and have images “in line” with text
– Submissions from Government and DoD employees must be cleared through their respective PAOs prior to submission
– MUST include completed release form (on the back of the contest flyer)
– CANNOT have been previously published

Selected submissions may be chosen for publication or a possible future speaking opportunity.

Contact: Send your submissions to: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil

For additional storytelling inspiration, please see the following blog posts:


113. Connected Warfare

[Editor’s Note: As stated previously here in the Mad Scientist Laboratory, the nature of war remains inherently humanistic in the Future Operational Environment. Today’s post by guest blogger COL James K. Greer (USA-Ret.) calls on us to stop envisioning Artificial Intelligence (AI) as a separate and distinct end state (oftentimes in competition with humanity) and to instead focus on preparing for future connected competitions and wars.]

The possibilities and challenges for future security, military operations, and warfare associated with advancements in AI are proposed and discussed with ever-increasing frequency, both within formal defense establishments and informally among national security professionals and stakeholders. One is confronted with a myriad of alternative futures, including everything from a humanity-killing variation of Terminator’s SkyNet to uncontrolled warfare à la WarGames to Deep Learning used to enhance existing military processes and operations. And, of course, legal and ethical issues surrounding the military use of AI abound.

Source: tmrwedition.com

Yet in most discussions of the military applications of AI and its use in warfare, we have a blind spot in our thinking about technological progress toward the future. That blind spot is that we think about AI largely as disconnected from humans and the human brain. Rather than thinking about AI-enabled systems as connected to humans, we think about them as parallel processes. We talk about human-in-the-loop or human-on-the-loop largely in terms of control over autonomous systems, rather than comprehensive connection to and interaction with those systems.

But even while significant progress is being made in the development of AI, almost no attention is paid to the military implications of advances in human connectivity. Experiments have already been conducted connecting the human brain directly to the internet, which of course connects the human mind not only to the Internet of Things (IoT), but potentially to every computer and AI device in the world. Such connections will be enabled by a chip in the brain that provides connectivity while enabling humans to perform all normal functions, including all those associated with warfare (as envisioned by John Scalzi’s BrainPal in “Old Man’s War”).

Source: Grau et al.

Moreover, experiments in connecting human brains to each other are ongoing. Brain-to-brain connectivity has occurred in a controlled setting enabled by an internet connection. And, in experiments conducted to date, the brain of one human can be used to direct the weapons firing of another human, demonstrating applicability to future warfare. While experimentation in brain-to-internet and brain-to-brain connectivity is not as advanced as the development of AI, it is easy to see that the potential benefits, desirability, and frankly, market forces are likely to accelerate the human side of connectivity development past the AI side.

Source: tapestrysolutions.com

So, when contemplating the future of human activity, of which warfare is unfortunately a central component, we cannot and must not think of AI development and human development as separate, but rather as interconnected. Future warfare will be connected warfare, with implications we must now begin to consider. How would such connected warfare be conducted? How would mission command be exercised between man and machine? What are the leadership implications of the human leader’s brain being connected to those of their subordinates? How will humans manage information for decision-making without being completely overloaded and paralyzed by overwhelming amounts of data? What are the moral, ethical, and legal implications of connected humans in combat, as well as responsibility for the actions of machines to which they are connected? These and thousands of other questions and implications related to policy and operation must be considered.

The power of AI resides not just in that of the individual computer, but in the connection of each computer to literally millions, if not billions, of sensors, servers, computers, and smart devices employing thousands, if not millions, of software programs and apps. The consensus is that at some point the computing and analytic power of AI will surpass that of the individual. And therein lies a major flaw in our thinking about the future. The power of AI may surpass that of a human being, but it won’t surpass the learning, thinking, and decision-making power of connected human beings. When a future human is connected to the internet, that human will have access to the computing power of all AI. But, when that same human is connected to several (in a platoon), or hundreds (on a ship), or thousands (in multiple headquarters) of other humans, then the power of AI will be exceeded by multiple orders of magnitude. The challenge, of course, is being able to think effectively under those circumstances, with your brain connected to all those sensors, computers, and other humans. This is what Ray Kurzweil terms “hybrid thinking.” Imagine how that is going to change every facet of human life, to include every aspect of warfare, and how everyone in our future defense establishment, uniformed or not, will have to be capable of hybrid thinking.

Source: Genetic Literacy Project

So, what will the military human bring to warfare that the AI-empowered computer won’t? Certainly, one of the major challenges with AI thus far has been its inability to demonstrate human intuition. AI can replicate some derivative tasks involving intuition using what is now called “Artificial Intuition.” These tasks are primarily the intuitive decisions that result from experience: AI generates this experience through some large number of iterations, which is how Google’s AlphaGo was able to beat the human world Go champion. Still, this is only a small part of the capacity of humans in terms not only of intuition, but of “insight” – what we call the “light bulb moment.” Humans will also bring emotional intelligence to connected warfare. Emotional intelligence, including aspects such as empathy, loyalty, and courage, is critical in the crucible of war; these are not capabilities that machines can provide the Force, not today and perhaps not ever.

Warfare in the future is not going to be conducted by machines, no matter how far AI advances. Warfare will instead be connected human to human, human to internet, and internet to machine in complex, global networks. We cannot know today how such warfare will be conducted or what characteristics and capabilities of future forces will be necessary for victory. What we can do is cease developing AI as if it were something separate and distinct from, and often envisioned in competition with, humanity and instead focus our endeavors and investments in preparing for future connected competitions and wars.

If you enjoyed this post, please read the following Mad Scientist Laboratory blog posts:

… and watch Dr. Alexander Kott’s presentation The Network is the Robot, presented at the Mad Scientist Robotics, Artificial Intelligence, & Autonomy: Visioning Multi-Domain Battle in 2030-2050 Conference, at the Georgia Tech Research Institute, 8-9 March 2017, in Atlanta, Georgia.

COL James K. Greer (USA-Ret.) is the Defense Threat Reduction Agency (DTRA) and Joint Improvised Threat Defeat Organization (JIDO) Integrator at the Combined Arms Command. A former cavalry officer, he served thirty years in the US Army, commanding at all levels from platoon through brigade. Jim served in operational units in CONUS, Germany, the Balkans, and the Middle East. He served in US Army Training and Doctrine Command (TRADOC), primarily focused on leader, capabilities, and doctrine development. He has significant concept development experience, co-writing concepts for Force XXI, Army After Next, and Army Transformation. Jim was the Army representative to the OSD Net Assessment 20XX Wargame Series, developing concepts for OSD and the Joint Staff. He is a former Director of the Army School of Advanced Military Studies (SAMS) and instructor in tactics at West Point. Jim is a veteran of six combat tours in Iraq, Afghanistan, and the Balkans, including serving as Chief of Staff of the Multi-National Security Transition Command – Iraq (MNSTC-I). Since leaving active duty, Jim has led the conduct of research for the Army Research Institute (ARI) and designed, developed, and delivered instruction in leadership, strategic foresight, design, and strategic and operational planning. Dr. Greer holds a Doctorate in Education; his dissertation examined US Army leader self-development. A graduate of the United States Military Academy, he has a Master’s Degree in Education, with a concentration in Psychological Counseling, as well as Master’s Degrees in National Security from the National War College and Operational Planning from the School of Advanced Military Studies.