86. Alternet: What Happens When the Internet is No Longer Trusted?

[Editor’s Note: Mad Scientist Laboratory is pleased to present a post by Mad Scientist and guest blogger Lt Col Jennifer “JJ” Snow, addressing the emergence of a post-internet world.]

“The Internet of the 1990s was about choosing your own adventure. The Internet of right now, over the last 10 years, is about somebody else choosing your adventure for you.” – Cindy Cohn, Executive Director of the Electronic Frontier Foundation

The internet was designed in its earliest iteration to provide a new form of communication, a way for people to connect, to share information in real-time, to provide a positive tool for collaboration and learning. Looking back at those early ideas, many of the founding fathers of the internet express disappointment in what it has become: a place where privacy and people are abused, information is wielded like a weapon, nations and corporations alike battle each other and other nefarious actors in the digital shadows, and fake news dominates the taglines in hopes of grabbing the most dollars per click. In light of what technologists, ethical hackers, and the public view as a potentially irrecoverable situation, many are suggesting starting over and abandoning the internet as we know it in favor of alternative internet options or “Alternet.” [1]

These initiatives are nascent but are increasingly gaining traction as they offer users the option to manage their own identity and information online; choose what they do and don’t want to share digitally; provide transparency as a currency, meaning users can view rules, policies, and protocols openly at any time and see when changes are made in real time; and allow users to be their own data authority. While progress in this space will be slow but steady over the next two years, expect that “Alternets” will become a publicly recognized substitute for the big internet companies in five years and a commonplace feature of the web in 10 years, as users become more disenchanted and distrustful, decide they want greater control, attribution, or anonymity as needed, and desire an internet that meets their normative, cultural, and community preferences.

There are several interesting challenges that come with the fracturing of the internet in this manner.

First, Alternets will be more insular and require individual verification to join; users will need to buy special equipment, such as a community-specific encrypted router, or use a particular blockchain variant to access the web.

Second, Alternets may serve to fracture the internet itself in interesting ways that could impact how data and users are able to digitally traverse the globe.

Third, Alternets will provide both the attribution many desire in social media to stop cyber bullying, scammers, and fake news, and the anonymity features that allow both dissident and terror groups to operate safely in virtual spaces. As with all technologies, there will always be opportunities for both positive and malicious use.

Fourth, the development and spread of Alternets may serve to further polarize various interests, organizations, and nations as like-minded communities will group together rather than strive to engage in constructive discourse, further reducing the opportunity for bridging entities to be effective negotiators.

Fifth, such online fracturing may also manifest physically in real life, leading to both digital and physical conflict, and may enhance the weaponization of cyber in new ways. These could include citizen cyber militias actively operating in defense of their communities and/or their nations, or offensive attacks by nations operating from their own “Alternet,” separate from the existing DNS system, unregulated, and not easily targetable by competitor nations, thus limiting those competitors’ ability to counterstrike and creating an overmatch situation. [2]

Current examples of “Alternets” that exist today include the private citizen efforts of the MetaCurrency Project, called Holo; the Russian independent internet for the BRICS bloc of nations; the PRC alternative, which has also been installed in Tanzania, Nigeria, and Vietnam; BitDust, a decentralized, encrypted, anonymous storage and communication solution; Mastodon, a decentralized, personally hosted, microblogging solution used in the Middle East, Africa, and Asia; and Hyperboria, which was born out of the DarkNet and is an encrypted, distributed, peer-to-peer IPv6 network with Distributed Hash Table (DHT)-based source routing.
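Hyperboria is a useful example of how different these overlays are under the hood: its cjdns protocol pairs end-to-end encryption with a Kademlia-style Distributed Hash Table for finding paths between nodes. The snippet below is a deliberately simplified, illustrative sketch of the DHT-lookup idea only (numeric node IDs, XOR “distance,” and a greedy hop toward the target); it is not the actual cjdns routing or encryption logic, and every name and parameter in it is invented.

```python
# Minimal, illustrative sketch of a DHT-style lookup of the kind that underpins
# overlays like Hyperboria: nodes have numeric IDs, "closeness" is XOR distance,
# and a lookup hops greedily toward the target via each node's partial peer list.
import random

random.seed(1)

NODE_COUNT = 50
node_ids = random.sample(range(2 ** 16), NODE_COUNT)

# Each node knows only a random handful of peers (its partial view of the network).
routing_tables = {
    n: random.sample([m for m in node_ids if m != n], 8) for n in node_ids
}

def xor_distance(a: int, b: int) -> int:
    """Kademlia-style distance metric between two node IDs."""
    return a ^ b

def iterative_lookup(start: int, target: int, max_hops: int = 20) -> int:
    """Greedily hop toward the target ID, asking each hop for its closest known peer."""
    current = start
    for _ in range(max_hops):
        candidates = routing_tables[current] + [current]
        closest = min(candidates, key=lambda n: xor_distance(n, target))
        if closest == current:  # no known peer is closer; the lookup converges here
            return current
        current = closest
    return current

target = node_ids[0]
found = iterative_lookup(start=node_ids[-1], target=target)
print(f"lookup from node {node_ids[-1]} converged on {found}; target was {target}")
```

In a real overlay, structured routing tables (k-buckets), signed node records, and encrypted source-routed forwarding are what turn this toy lookup into a usable network.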

A full listing of “Alternet” projects and tools can be found in the footnotes. [3]

To learn more about the security ramifications associated with the rise of Alternets, read the following blog posts:

The Future of the Cyber Domain

Virtual Nations: An Emerging Supranational Cyber Trend, by Marie Murphy

JJ Snow is an Air Force Lt Colonel assigned as the U.S. Special Operations Command Innovation Officer and the J5 Donovan Group Future Plans and Strategy Deputy Director. In her current role, JJ serves as the government representative for technology outreach and engagement on behalf of the command and 756 interagency action officers spanning 40 different government agencies.

She is responsible for maintaining a network of non-traditional experts across industry, academia and ethical hackers/technologists to provide government with critical access, expertise and capacity across a broad spectrum of technologies to rapidly identify best of breed while also proactively responding to potential threat aspects of concern to Special Operations and national security. She supports senior government leadership in process innovation, innovation planning in big government, and the development of smart technology policy and advises senior government representatives on emerging disruptive technologies.

She holds an MS in Defense Analysis with distinction from the Naval Postgraduate School (NPS) and an MA in Strategic Intelligence with honors from American Military University.


[1] Saldana et al., “Alternative Networks: Toward Global Access to the Internet for All.” IEEE Communications Magazine, vol. 55, no. 9, pp. 187-193, 2017.
Lafrance, Adrienne, “The Promise of a New Internet.” The Atlantic (10 JUN 2014)
Finley, Klint, “The Pied Piper’s New Internet Isn’t Just Possible – It’s Almost Here.” Wired (1 JUN 2017)

[2] Harris-Braun, Eric, Nicolas Luck, and Arthur Brock, “Holochain: Scalable Agent-Centric Distributed Computing.” Holo (15 FEB 2018)
DeGeurin, Mack, “Russia’s Alternate Internet.” NY Magazine (13 JUL 2018)
Sacks, Samm, “Beijing Wants to Rewrite the Rules of the Internet.” The Atlantic (18 JUN 2018)

[3] The following links are included to provide the reader with options for exploring some of the additional alternative internet projects that exist and are in use today. A big thank-you to Ross Jones in recognition of his detailed GitHub wiki on alternative-internet solutions and tools, captured in the last link:  https://hyperboria.net/, https://github.com/redecentralize/alternative-internet

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from the controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial – “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game-changing technologies. Unlike Nuclear, Biological, and Chemical weapons, no internationally agreed-upon and implemented control protocols can be applied to AI, as there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted-access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code; a digital will-o’-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operations (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC with its Military-Civil Fusion Centers, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. This conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and practitioners develop fair and privacy-preserving data analyses. Like ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commanders’ decisions) – especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, write the code and curate the data that enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
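To make that verification step concrete, the toy sketch below (synthetic, invented data – not drawn from the ICML papers) computes one simple fairness check, the demographic parity gap: the difference in a model’s positive-decision rate across subgroups.

```python
# Toy fairness check: "demographic parity" compares a model's positive-decision
# rate across subgroups. A large gap is a red flag that the system may be
# discriminating on a protected attribute, even if that attribute is never an
# explicit input. Data below is invented purely for illustration.
from collections import defaultdict

# Hypothetical (subgroup, model_decision) pairs, where 1 = approved and 0 = denied.
decisions = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
]

totals = defaultdict(int)
positives = defaultdict(int)
for group, outcome in decisions:
    totals[group] += 1
    positives[group] += outcome

rates = {g: positives[g] / totals[g] for g in totals}
gap = max(rates.values()) - min(rates.values())

for group, rate in rates.items():
    print(f"{group}: positive-decision rate = {rate:.2f}")
print(f"demographic parity gap = {gap:.2f}")  # 0.00 would be parity on this metric
```

Demographic parity is only one of several competing fairness definitions (equalized odds and calibration are others), which is precisely why human review of machine outputs matters.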

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published by PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down. The participants’ actions were then recorded. Unexpectedly, a large number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant. It shows, beyond the unexpected result, that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if those entities mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and augmented heavily by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought home computing and communication ever deeper into global households, the revolution of smart phones brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans being connected to the global commons via their smart devices, but a multitude of devices, vehicles, and various accessories are being integrated into the Internet of Things (IoT). We previously addressed the IoT as a game-changing technology. The IoT is composed of trillions of internet-linked items, creating opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) devices and connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (i.e., appliances, home devices [such as Google Home, Alexa, etc.], various sensors) to approximately $520 billion. This is a massive investment into what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it is likely that it will expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data; allowing service members to use personal devices; and maintaining operational and network security and integrity (see the recent banning of personal fitness trackers). A segment of the IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising some critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” becoming increasingly imminent, intelligence officials warn of the rising danger of cyber attacks. Effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments as they act in the undefined gray zone between peace and all-out war. The military implications are quite clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence communications and materiel and personnel networks. Besides the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others are reluctant to invest in cyber protection measures. In this way, civilians become affected by acts of cyber warfare, and attacks on a country may not be directed at the opposing military but at the civilian population of a state, as in the case of the power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to most effectively combat cyber threats.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

79. Character vs. Nature of Warfare: What We Can Learn (Again) from Clausewitz

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following post by guest blogger LTC Rob Taber, U.S. Army Training and Doctrine Command (TRADOC) G-2 Futures Directorate, clarifying the often confused character and nature of warfare, and addressing their respective mutability.]

No one is arguing that warfare is not changing. Where people disagree, however, is whether the nature of warfare, the character of warfare, or both are changing.

Source:  Office of the Director of National Intelligence

Take, for example, the National Intelligence Council’s assertion in “Global Trends: Paradox of Progress.” They state, “The nature of conflict is changing. The risk of conflict will increase due to diverging interests among major powers, an expanding terror threat, continued instability in weak states, and the spread of lethal, disruptive technologies. Disrupting societies will become more common, with long-range precision weapons, cyber, and robotic systems to target infrastructure from afar, and more accessible technology to create weapons of mass destruction.”[I]

Additionally, Brad D. Williams, in an introduction to an interview he conducted with Amir Husain, asserts, “Generals and military theorists have sought to characterize the nature of war for millennia, and for long periods of time, warfare doesn’t dramatically change. But, occasionally, new methods for conducting war cause a fundamental reconsideration of its very nature and implications.”[II] Williams then cites “cavalry, the rifled musket and Blitzkrieg as three historical examples”[III] from Husain and General John R. Allen’s (ret.) article, “On Hyperwar.”

Unfortunately, the NIC and Mr. Williams miss the reality that the nature of war is not changing, and it is unlikely to ever change. While these authors may have simply interchanged “nature” when they meant “character,” it is important to be clear on the difference between the two and the implications for the military. To put it more succinctly, words have meaning.

The nature of something is the basic make up of that thing. It is, at core, what that “thing” is. The character of something is the combination of all the different parts and pieces that make up that thing. In the context of warfare, it is useful to ask every doctrine writer’s personal hero, Carl Von Clausewitz, what his views are on the matter.

Source: Tetsell’s Blog. https://tetsell.wordpress.com/2014/10/13/clausewitz/

He argues that war is “subjective,”[IV] “an act of policy,”[V] and “a pulsation of violence.”[VI] Put another way, the nature of war is chaotic, inherently political, and violent. Clausewitz then states that despite war’s “colorful resemblance to a game of chance, all the vicissitudes of its passion, courage, imagination, and enthusiasm it includes are merely its special characteristics.”[VII] In other words, all changes in warfare are those smaller pieces that evolve and interact to make up the character of war.

The argument that artificial intelligence (AI) and other technologies will enable military commanders to have “a qualitatively unsurpassed level of situational awareness and understanding heretofore unavailable to strategic commander[s]”[VIII] is a grand claim, but one that has been made many times in the past, and remains unfulfilled. The chaos of war, its fog, friction, and chance will likely never be deciphered, regardless of what technology we throw at it. While it is certain that AI-enabled technologies will be able to gather, assess, and deliver heretofore unimaginable amounts of data, these technologies will remain vulnerable to age-old practices of denial, deception, and camouflage.

 

The enemy gets a vote, and in this case, the enemy also gets to play with their AI-enabled technologies that are doing their best to provide decision advantage over us. The information sphere in war will be more cluttered and more confusing than ever.

Regardless of the tools of warfare, be they robotic, autonomous, and/or AI-enabled, they remain tools. And while they will be the primary tools of the warfighter, the decision to enable the warfighter to employ those tools will, more often than not, come from political leaders bent on achieving a certain goal with military force.

Drone Wars are Coming / Source: USNI Proceedings, July 2017, Vol. 143 / 7 /  1,373

Finally, the violence of warfare will not change. Certainly robotics and autonomy will enable machines that can think and operate without humans in the loop. Imagine the future in which the unmanned bomber gets blown out of the sky by the AI-enabled directed energy integrated air defense network. That’s still violence. There are still explosions and kinetic energy with the potential for collateral damage to humans, both combatants and civilians.

Source: Lockheed Martin

Not to mention the bomber carried a payload meant to destroy something in the first place. A military force, at its core, will always carry the mission to kill things and break stuff. What will be different is what tools they use to execute that mission.

To learn more about the changing character of warfare:

– Read the TRADOC G-2’s The Operational Environment and the Changing Character of Warfare paper.

– Watch The Changing Character of Future Warfare video.

Additionally, please note that the content from the Mad Scientist Learning in 2050 Conference at Georgetown University, 8-9 August 2018, is now posted and available for your review:

– Read the “Top Ten” Takeaways from the Learning in 2050 Conference.

– Watch videos of each of the conference presentations on the TRADOC G-2 Operational Environment (OE) Enterprise YouTube Channel here.

– Review the conference presentation slides (with links to the associated videos) on the Mad Scientist All Partners Access Network (APAN) site here.

LTC Rob Taber is currently the Deputy Director of the Futures Directorate within the TRADOC G-2. He is an Army Strategic Intelligence Officer and holds a Master of Science of Strategic Intelligence from the National Intelligence University. His operational assignments include 1st Infantry Division, United States European Command, and the Defense Intelligence Agency.

Note:  The featured graphic at the top of this post captures U.S. cavalrymen on General John J. Pershing’s Punitive Expedition into Mexico in 1916.  Less than two years later, the United States would find itself fully engaged in Europe in a mechanized First World War.  (Source:  Tom Laemlein / Armor Plate Press, courtesy of Neil Grant, The Lewis Gun, Osprey Publishing, 2014, page 19)

_______________________________________________________

[I] National Intelligence Council, “Global Trends: Paradox of Progress,” January 2017, https://www.dni.gov/files/documents/nic/GT-Full-Report.pdf, p. 6.
[II] Brad D. Williams, “Emerging ‘Hyperwar’ Signals ‘AI-Fueled, machine waged’ Future of Conflict,” Fifth Domain, August 7, 2017, https://www.fifthdomain.com/dod/2017/08/07/emerging-hyperwar-signals-ai-fueled-machine-waged-future-of-conflict/.
[III] Ibid.
[IV] Carl Von Clausewitz, On War, ed. Michael Howard and Peter Paret (Princeton: Princeton University Press, 1976), 85.
[V] Ibid., 87.
[VI] Ibid.
[VII] Ibid., 86.
[VIII] John Allen, Amir Hussain, “On Hyper-War,” Fortuna’s Corner, July 10, 2017, https://fortunascorner.com/2017/07/10/on-hyper-war-by-gen-ret-john-allenusmc-amir-hussain/.

78. The Classified Mind – The Cyber Pearl Harbor of 2034

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the following post by guest blogger Dr. Jan Kallberg, faculty member, United States Military Academy at West Point, and Research Scientist with the Army Cyber Institute at West Point. His post serves as a cautionary tale regarding our finite intellectual resources and the associated existential threat in failing to protect them!]

Preface: In my experience in cybersecurity, and later across the broader cyber field, there have always been those exceptional individuals who have an ability that cannot be replicated: to see the challenge early on, create a technical solution, and know how to play it in the right order for maximum impact. They are out there – the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of Artificial Intelligence increases our reliance on these highly capable individuals – because someone must set the rules, the boundaries, and point out the trajectory for Artificial Intelligence at initiation.

Source: https://thebulletin.org/2017/10/neuroscience-and-the-new-weapons-of-the-mind/

As an industrialized society, we tend to see technology and the information that feeds it as the weapons – and ignore the few humans that have a large-scale direct impact. Even if identified as a weapon, how do you make a human mind classified? Can we protect these high-ability individuals who, in the digital world, are weapons – not tools, but compilers of capability – or are we still focused on the tools? Why do we see only weapons made of steel and electronics and not the weaponized mind? I believe firmly that we underestimate the importance of Applicable Intelligence – the ability to play the cyber engagement in the optimal order. Adversaries are often good observers because they are scouting for our weak spots. I set the stage for the following post in 2034, close enough to be realistic and far enough out for things to happen, when our adversaries are betting that we rely more on a few minds than we are willing to accept.

Post:  In a not-too-distant future, on the 20th of August 2034, a peer adversary’s first strategic moves are the targeted killings of fewer than twenty individuals as they go about their daily lives:  watching a 3-D printer making a protein sandwich at a breakfast restaurant; stepping out from the downtown Chicago monorail; or taking a taste of a poison-filled retro Jolt Cola. In the gray zone, when the geopolitical temperature increases but we are still not at war, our adversary acts quickly and expedites a limited number of targeted killings within the United States of persons who are unknown to mass media and the general public and who have only one thing in common – Applicable Intelligence (AI).

The ability to apply is a far greater asset than the technology itself. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are publicly available; anyone can download them from the Internet and use them, but the weaponization of the tools occurs when they are used by someone who understands how to play them in an optimal order. These minds are different because they see an opportunity to exploit in a digital fog of war where others don’t or can’t see it. They address problems unburdened by traditional thinking, in new, innovative ways, maximizing the dual-purpose nature of digital tools, and can create tangible cyber effects.

It is the Applicable Intelligence (AI) that creates the procedures and the application of tools, turning simple digital software, in sets or combinations, into digitally lethal weapons. This AI is the intelligence to mix, match, tweak, and arrange dual-purpose software. In 2034, it is as if you had the supernatural ability to create a thermonuclear bomb from what you can find at Kroger or Albertsons.

Sadly we missed it; we didn’t see it. We never left the 20th century. Our adversary saw it clearly and at the dawn of conflict killed off the weaponized minds, without discretion, and with no concern for international law or morality.

These intellects are weapons of growing strategic magnitude. In 2034, the United States missed the importance of these few intellects. This error left them unprotected.

All of our efforts were instead focused on what they delivered, the application and the technology, which was hidden in secret vaults and only discussed in sensitive compartmented information facilities. We classify to the highest level to ensure the confidentiality and integrity of our cyber capabilities. Meanwhile, we put no value on the most critical component, the militarized intellect, because it is human. In a society marinated in an engineering mindset, humans are like desk space, electricity, and broadband: a commodity that is an input to the production of the technical machinery. The marveled-at technical machinery is the only thing we care about today, in 2018, and, as it turned out, in 2034 as well.

We are stuck in how we think, and we are unable to see it coming, but our adversaries see it. At a systematic level, we are unable to see humans as the weapon itself, maybe because we like to see weapons as something tangible, painted black, tan, or green, that can be stored and brought to action when needed. As the armory of the war of 1812, as the stockpile of 1943, and as the launch pad of 2034. Arms are made of steel, or fancier metals, with electronics – we failed in 2034 to see weapons made of corn, steak, and an added combative intellect.

General Nakasone stated in 2017, “Our best ones [coders] are 50 or 100 times better than their peers,” and continued “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.” In reality, the success of cyber and cyber operations is highly dependent not on the tools or toolsets but instead upon the super-empowered individual that General Nakasone calls “the 50-x coder.”

Manhattan Project K-25 Gaseous Diffusion Process Building, Oak Ridge, TN / Source: atomicarchive.com

There were clear signals that we could have noticed before General Nakasone pointed it out clearly in 2017. The United States’ Manhattan Project during World War II had at its peak 125,000 workers on the payroll, but the intellects that drove the project to success and completion were few. The difference between the Manhattan Project and the future of cyber is that we were unable to see the human as a weapon, being locked in by our path dependency as an engineering society where we hail the technology and forget the importance of the humans behind it.

J. Robert Oppenheimer – the militarized intellect behind the  Manhattan Project / Source: Life Magazine

America’s endless love of technical innovations and advanced machinery is reflected in a nation that has celebrated mechanical wonders and engineered solutions since its creation. For America, technical wonders are a sign of prosperity, ability, self-determination, and advancement, a story that started in the early days of the colonies and continued through the transcontinental railroad, the Panama Canal, the manufacturing era, the moon landing, and all the way to autonomous systems, drones, and robots. In this default mindset, there is always a tool, an automated process, a software package, or a set of technical steps that can solve a problem or act.

The same mindset sees humans merely as an input to technology, so humans are interchangeable and can be replaced. In 2034, in an era of digital conflict and war between algorithms, with engagements occurring at machine speed and no time for leadership or human interaction, it is the intellects that design these systems and understand how to play them that matter. We didn’t see it.

In 2034, with fewer than twenty bodies piled up after targeted killings, resides the Cyber Pearl Harbor. It was not imploding critical infrastructure, a tsunami of cyber attacks, nor hackers flooding our financial systems, but instead traditional lead and gunpowder. The super-empowered individuals are gone, and we are stuck in a digital war at speeds we don’t understand, unable to play it in the right order, and with limited intellectual torque to see through the fog of war provided by an exploding kaleidoscope of nodes and digital engagements.

Source: Shutterstock

If you enjoyed this post, read our Personalized Warfare post.

Dr. Jan Kallberg is currently an Assistant Professor of Political Science with the Department of Social Sciences, United States Military Academy at West Point, and a Research Scientist with the Army Cyber Institute at West Point. He was earlier a researcher with the Cyber Security Research and Education Institute, The University of Texas at Dallas, and is a part-time faculty member at George Washington University. Dr. Kallberg earned his Ph.D. and MA from the University of Texas at Dallas and earned a JD/LL.M. from Juridicum Law School, Stockholm University. Dr. Kallberg is a certified CISSP, ISACA CISM, and serves as the Managing Editor for the Cyber Defense Review. He has authored papers in the Strategic Studies Quarterly, Joint Forces Quarterly, IEEE IT Professional, IEEE Access, IEEE Security and Privacy, and IEEE Technology and Society.

59. Fundamental Questions Affecting Army Modernization

[Editor’s Note:  The Operational Environment (OE) is the start point for Army Readiness – now and in the Future. The OE answers the question, “What is the Army ready for?”  Without the OE in training and Leader development, Soldiers and Leaders are “practicing” in a benign condition, without the requisite rigor to forge those things essential for winning in a complex, multi-domain battlefield.  Building the Army’s future capabilities, a critical component of future readiness, requires this same start point.  The assumptions the Army makes about the Future OE are the sine qua non start point for developing battlefield systems — these assumptions must be at the forefront of decision-making for all future investments.]

There are no facts about the future. Leaders interested in building future ready organizations must develop assumptions about possible futures and these assumptions require constant scrutiny. Leaders must also make decisions based on these assumptions to posture organizations to take advantage of opportunities and to mitigate risks. Making these decisions is fundamental to building future readiness.

Source: Evan Jensen, ARL

The TRADOC G-2 has made the following foundational assumptions about the future that can serve as launch points for important questions about capability requirements and capabilities under development. These assumptions are further described in An Advanced Engagement Battlespace: Tactical, Operational and Strategic Implications for the Future Operational Environment, published by our colleagues at Small Wars Journal.

1. Contested in all domains (air, land, sea, space, and cyber). Increased lethality, by virtue of ubiquitous sensors, proliferated precision, high kinetic energy weapons and advanced area munitions, further enabled by autonomy, robotics, and Artificial Intelligence (AI) with an increasing potential for overmatch. Adversaries will restrict us to temporary windows of advantage with periods of physical and electronic isolation.

Source: Army Technology

2. Concealment is difficult on the future battlefield. Hiding from advanced sensors — where practicable — will require dramatic reduction of heat, electromagnetic, and optical signatures. Traditional hider techniques such as camouflage, deception, and concealment will have to extend to “cross-domain obscuration” in the cyber domain and the electromagnetic spectrum. Canny competitors will monitor their own emissions in real time to understand and mitigate their vulnerabilities in the “battle of signatures.” Alternately, “hiding in the open” within complex terrain clutter and near-constant relocation might be feasible, provided such relocation could outpace future recon / strike targeting cycles. Adversaries will operate among populations in complex terrain, including dense urban areas.

3. Trans-regional, gray zone, and hybrid strategies with both regular and irregular forces, criminal elements, and terrorists attacking our weaknesses and mitigating our advantages. The ensuing spectrum of competition will range from peaceful, legal activities through violent, mass upheavals and civil wars to traditional state-on-state, unlimited warfare.

Source: Science Photo Library / Van Parys Media

4. Adversaries include states, non-state actors, and super-empowered individuals, with non-state actors and super-empowered individuals now having access to Weapons of Mass Effect (WME), cyber, space, and Nuclear/Biological/Chemical (NBC) capabilities. Their operational reach will range from tactical to global, and the application of their impact from one domain into another will be routine. These advanced engagements will also be interactive across the multiple dimensions of conflict, not only across every domain in the physical dimension, but also the cognitive dimension of information operations, and even the moral dimension of belief and values.

Source: Northrop Grumman

5. Increased speed of human interaction, events and action with democratized and rapidly proliferating capabilities means constant co-evolution between competitors. Recon / Strike effectiveness is a function of its sensors, shooters, their connections, and the targeting process driving decisions. Therefore, in a contest between peer competitors with comparable capabilities, advantage will fall to the one that is better integrated and makes better and faster decisions.

These assumptions become useful when they translate to potential decision criteria for Leaders to rely on when evaluating systems being developed for the future battlefield. Each of the following questions is fundamental to ensuring the Army is prepared to operate in the future.

Source: Lockheed Martin

1. How will this system operate when disconnected from a network? Units will be disconnected from their networks on future battlefields. Capabilities that require constant timing and precision geo-locational data will be prioritized for disruption by adversaries with capable EW systems.

2. What signature does this system present to an adversary? It is difficult to hide on the future battlefield and temporary windows of advantage will require formations to reduce their battlefield signatures. Capabilities that require constant multi-directional broadcast and units with large mission command centers will quickly be targeted and neutralized.

Image credit: Alexander Kott

3. How does this system operate in dense urban areas? The physical terrain in dense urban areas and megacities creates concrete canyons isolating units electronically and physically. Automated capabilities operating in dense population areas might also increase the rate of false signatures, confusing, rather than improving, Commander decision-making. New capabilities must be able to operate disconnected in this terrain. Weapons systems must be able to slew and elevate rapidly to engage vertical targets. Automated systems and sensors will require significant training sets to reduce the rate of false signatures.

Source: Military Embedded Systems

4. How does this system take advantage of open and modular architectures? The rapid rate of technological innovations will offer great opportunities to militaries capable of rapidly integrating prototypes into formations.  Capabilities developed with open and modular architectures can be upgraded with autonomous and AI enablers as they mature. Early investment in closed-system capabilities will freeze Armies in a period of rapid co-evolution and lead to overmatch.

5. How does this capability help win in competition short of conflict with a near peer competitor? Near peer competitors will seek to achieve limited objectives short of direct conflict with the U.S. Army. Capabilities will need to be effective at operating in the gray zone as well as serving as deterrence. They will need to be capable of strategic employment from CONUS-based installations.

If you enjoyed this post, check out the following items of interest:

    • Join SciTech Futures‘ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and on into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live 11 June 2018 — click here to learn more!

55. Influence at Machine Speed: The Coming of AI-Powered Propaganda

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following guest blog post by MAJ Chris Telley, U.S. Army, assigned to the Naval Postgraduate School, addressing how Artificial Intelligence (AI) must be understood as an Information Operations (IO) tool if U.S. defense professionals are to develop effective countermeasures and ensure our resilience to its employment by potential adversaries.]

AI-enabled IO present a more pressing strategic threat than the physical hazards of slaughter-bots or even algorithmically-escalated nuclear war. IO are efforts to “influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries;” here, we’re talking about using AI to do so. AI-guided IO tools can empathize with an audience to say anything, in any way needed, to change the perceptions that drive those physical weapons. Future IO systems will be able to individually monitor and affect tens of thousands of people at once. Defense professionals must understand the fundamental influence potential of these technologies if they are to drive security institutions to counter malign AI use in the information environment.

Source: Peter Adamis / Abalinx.com

Programmatic marketing, using consumers’ data habits to drive real-time automated bidding on personalized advertising, has been used for a few years now. Cambridge Analytica’s Facebook targeting made international headlines using similar techniques, but digital electioneering is just the tip of the iceberg. An AI trained with data from users’ social media accounts, economic media interactions (Uber, Applepay, etc.), and their devices’ positional data can infer predictive knowledge of its targets. With that knowledge, emerging tools — like Replika — can truly befriend a person, allowing it to train that individual, for good or ill.
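To make the mechanism concrete, the sketch below trains a toy engagement-scoring model on synthetic behavioral features – the same basic pattern programmatic-marketing platforms apply at vastly larger scale. Every feature name, number, and threshold is invented for illustration, and it assumes NumPy and scikit-learn are available.

```python
# Toy sketch of "predictive audience understanding": a classifier trained on a
# few behavioral features scores how likely each user is to engage with a given
# kind of content. All data and feature names here are synthetic and invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented features per user: [daily minutes on social platforms,
# political posts shared per week, late-night activity ratio].
X = rng.normal(loc=[120.0, 3.0, 0.3], scale=[40.0, 2.0, 0.1], size=(500, 3))

# Invented labels: 1 = engaged with this kind of content before, 0 = did not.
y = (0.01 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 1.0, 500) > 3.0).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new batch of users and rank them for targeting.
new_users = rng.normal(loc=[120.0, 3.0, 0.3], scale=[40.0, 2.0, 0.1], size=(5, 3))
scores = model.predict_proba(new_users)[:, 1]
for rank, idx in enumerate(np.argsort(scores)[::-1], start=1):
    print(f"rank {rank}: user {idx} predicted engagement probability = {scores[idx]:.2f}")
```

Swap the engagement label for “susceptible to narrative X” and feed the ranked list to an automated content pipeline, and you have the outline of the AI-guided IO tooling described above – which is exactly why defenders need to understand how ordinary the ingredients are.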

Source: Getty Creative

Substantive feedback is required to train an individual’s response; humans tend to respond best to content and feedback with which they agree. That content can be algorithmically mass produced. For years, Narrative Science tools have helped writers create sports stories and stock summaries, but it’s just as easy to use them to create disinformation. That’s just text, though; today, the AI can create fake video. A recent warning, ostensibly from former President Obama, provides an entertaining yet frightening demonstration of how Deepfakes will challenge our presumptions about truth in the coming years. The Defense Advanced Research Projects Agency (DARPA) is funding a project this summer to determine whether AI-generated Deepfakes will become impossible to distinguish from the real thing, even using other AI systems.

Even though malign actors can now employ AI to lie “at machine speed,” they still have to get the story to an audience. Russian bot armies continue to make headlines doing this very thing. The New York Times maintains about a dozen Twitter feeds and produces around 300 tweets a day, but Russia’s Internet Research Agency (IRA) regularly puts out 25,000 tweets in the same twenty-four hours. The IRA’s bots are really just low-tech curators; they collect, interpret, and display desired information to promote the Kremlin’s narratives.

Source: Josep Lago/AFP/Getty Images

Next-generation bot armies will employ far faster computing techniques and profit from an order of magnitude greater network speed when 5G services are fielded. If “Repetition is a key tenet of IO execution,” then this machine gun-like ability to fire information at an audience will, with empathetic precision and custom content, provide the means to change a decisive audience’s very reality. No breakthrough science is needed, no bureaucratic project office required. These pieces are already there, waiting for an adversary to put them together.

The DoD is looking at AI but remains focused on image classification and swarming quadcopters while ignoring the convergent possibilities of predictive audience understanding, tailored content production, and massive scale dissemination. What little digital IO we’ve done, sometimes called social media “WebOps,” has been contractor heavy and prone to naïve missteps. However, groups like USSOCOM’s SOFWERX and the students at the Naval Postgraduate School are advancing the state of our art. At NPS, future senior leaders are working on AI, now. A half-dozen of the school’s departments have stood up classes and events specifically aimed at operationalizing advanced computing. The young defense professionals currently working on AI should grapple with emerging influence tools and form the foundation of the DoD’s future institutional capabilities.

MAJ Chris Telley is an Army information operations officer assigned to the Naval Postgraduate School. His assignments have included theater engagement at U.S. Army Japan and advanced technology integration with the U.S. Air Force. Chris commanded in Afghanistan and served in Iraq as a United States Marine. He tweets at @chris_telley.

This blog post represents the opinions of the author and does not reflect the position of the Army or the United States Government.

54. A View of the Future: 2035-2050

[Editor’s Note: The following post addresses the Era of Contested Equality (2035-2050) and is extracted from the U.S. Army Training and Doctrine Command (TRADOC) G-2’s The Operational Environment and the Changing Character of Future Warfare, published last summer. This seminal document provides the U.S. Army with a holistic and heuristic approach to projecting and anticipating both transformational and enduring trends that will lend themselves to the depiction of the future.]

Changes encountered during the Future Operational Environment’s Era of Accelerated Human Progress (the present through 2035) begin a process that will re-shape the global security situation and fundamentally alter the character of warfare. While its nature remains constant, the speed, automation, ranges, both broad and narrow effects, its increasingly integrated multi-domain conduct, and the complexity of the terrain and social structures in which it occurs will make mid-century warfare both familiar and utterly alien.

During the Era of Contested Equality (2035-2050), great powers and rising challengers have converted hybrid combinations of economic power, technological prowess, and virulent, cyber-enabled ideologies into effective strategic strength. They apply this strength to disrupt or defend the economic, social, and cultural foundations of the old Post-World War II liberal order and assert or dispute regional alternatives to established global norms. State and non-state actors compete for power and control, often below the threshold of traditional armed conflict – or shield and protect their activities under the aegis of escalatory WMD, cyber, or long-range conventional options and doctrines.

It is not clear whether the threats faced in the preceding Era of Accelerated Human Progress will persist, although it is likely that China and Russia will remain key competitors, and that some form of non-state, ideologically motivated extremist groups will exist. Other threats may have fundamentally changed their worldviews, or may not even exist by mid-century, while other states and combinations of states will rise and fall as challengers during the 2035-2050 timeframe. The security environment in this period will be characterized by conditions that facilitate competition and conflict among rivals and lead to endemic strife and warfare, and it will have several defining features.

The nation-state perseveres. The nation-state will remain the primary actor in the international system, but it will be weaker both domestically and globally than it was at the start of the century. Trends of fragmentation, competition, and identity politics will challenge global governance and broader globalization, with both collective security and globalism in decline. States share their strategic environments with networked societies which increasingly circumvent governments unresponsive to their citizens’ needs. Many states will face challenges from insurgents and global identity networks – ethnic, religious, regional, social, or economic – which either resist state authority or ignore it altogether.

Super-Power Diminishes. Early-century great powers will lose their dominance in command and control, surveillance, and precision-strike technologies, as even non-state actors will acquire and refine their own application of these technologies in conflict and war. Rising competitors will be able to acquire capabilities through broad knowledge diffusion, cyber intellectual property theft, and their own targeted investments, without having to sink massive research costs of their own. This diffusion of knowledge and capability and the aforementioned erosion of long-term collective security will lead to the formation of ad hoc communities of interest. The costs of maintaining global hegemony at the mid-point of the century will be too great for any single power, meaning that the world will be multi-polar and dominated by complex combinations of short-term alliances, relations, and interests.

This era will be marked by contested norms and persistent disorder, in which multiple state and non-state actors assert alternative rules and norms and, when these are contested, back them with military force, often in a dimension short of traditional armed conflict.

For additional information on the Future Operational Environment and the Era of Contested Equality:

•  Listen to Modern War Institute‘s podcast where Retired Maj. Gen. David Fastabend and Mr. Ian Sullivan address Technology and the Future of Warfare

•  Watch the TRADOC G-2 Operational Environment Enterprise’s The Changing Character of Future Warfare video.

51. Black Swans and Pink Flamingos

The Mad Scientist Initiative recently facilitated a workshop with thought leaders from across the Department of Defense, the Intelligence Community, other Government agencies, industry, and academia to address the unknown unknowns (i.e., Black Swans) and the known knowns (i.e., Pink Flamingos) to synthesize cross-agency thinking about possible disruptions to the Future Operational Environment.

Black Swans: In Nassim Nicholas Taleb’s original context, a black swan (an unknown unknown) is an event or situation which is unpredictable, but has a major effect. For this conference, we used a looser definition, identifying possibilities that are not likely, but might have significant impacts on how we think about warfighting and security.

Pink Flamingos: Defined by Frank Hoffman, Pink Flamingos are the known knowns that are often discussed, but ignored by Leaders trapped by organizational cultures and rigid bureaucratic decision-making structures. Peter Schwartz further describes Pink Flamingos as the “inevitable surprise.” Digital photography was a pink flamingo to Kodak.

At the workshop, attendees identified the following Black Swans:

Naturally Occurring Disaster: These events (e.g., a Carrington Event — a solar flare frying solid-state electronics, super volcano eruptions, earthquake swarms, etc.) would have an enormous impact on the Army and its ability to continue to operate and defend the nation and support national recovery operations. While warning times have increased for many of these events, there are limited measures that can be implemented to mitigate their devastating effects.


Virtual Nations: While the primacy of Westphalian borders has been challenged and the power of traditional nation-states has been waning over the last decade, some political scientists have assumed that supranational organizations and non-state actors would take their place. One potential black swan is the emergence of virtual nations due to the convergence of blockchain technologies, crypto-currency, and the ability to project power and legitimacy through the virtual world. Virtual nations could be organized based on ideologies, business models, or single interests. Virtual nations could supersede, supplement, or compete with traditional, physical nations. The Army of the future may not be prepared to interact and compete with virtual nations.


Competition in Venues Other than Warfare (Economic, Technological, Demographic, etc.) Achieving Primacy: In the near future, war in the traditional sense may be less prevalent, while competitions in other areas may be the driving forces behind national oppositions. How does the Army need to prepare for an eventuality where armed conflict is not as important as it once was?


Alternate Internet — “Alternet”: A distinct entity, separate from the general commercial internet, only accessible with specific corresponding hardware. This technology would allow for unregulated and unmonitored communication and commerce, potentially granting safe haven to criminal and terrorist activities.

At the workshop, attendees identified the following Pink Flamingos:

Safe at Home: Army installations are no longer the sanctuaries they once were, as adversaries will be able to attack Soldiers and families through social media and other cyberspace means. Additionally, installations no longer merely house, train, and deploy Soldiers; unmanned combat systems are controlled from home installations, a trend in virtual power that will increase in the future. The Army needs a plan to harden our installations and train Soldiers and families to be resilient for this eventuality.


Hypersonics: High-speed (Mach 5 or higher) and highly maneuverable missiles or glide vehicles that can defeat our air defense systems. The speed of these weapons is unmatched, and their maneuverability conceals their intended targets until only seconds before impact, negating current countermeasures.
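
To make the compressed timelines concrete, the back-of-the-envelope sketch below converts Mach number and a notional detection range into warning time. The detection ranges and the sea-level speed-of-sound reference are illustrative assumptions, not sensor performance data.

```python
# Rough illustration only: Mach number referenced to the sea-level speed of sound
# (~343 m/s); detection ranges are notional, not real sensor figures.
SPEED_OF_SOUND_M_S = 343.0

def warning_time_seconds(mach, detection_range_km):
    """Time from detection to impact for a vehicle flying straight at the defended point."""
    speed_m_s = mach * SPEED_OF_SOUND_M_S
    return detection_range_km * 1000.0 / speed_m_s

for det_km in (400, 200, 100):
    t = warning_time_seconds(5, det_km)
    print(f"Mach 5, detected at {det_km} km: ~{t:.0f} s of warning")

# Maneuvering shrinks this margin further, because the defender cannot be sure
# which point the vehicle is actually heading for until late in the flight.
```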


Generalized, Operationalized Artificial Intelligence (AI): Artificial intelligence is one of the most prominent pink flamingos throughout global media and governments. Narrow artificial intelligence is being addressed as rapidly as possible through ventures such as Project MAVEN. However, generalized and operationalized artificial intelligence – that can think, contextualize, and operate like a human – has the potential to disrupt not only operations, but also the military at its very core and foundation.


Space/Counterspace: Space is becoming increasingly congested, commercialized, and democratized. Disruption, degradation, and denial in space threaten to cripple multi-domain warfare operations. States and non-state actors alike are exploring options to counter one another, compete, and potentially even fight in space.


Quantum Sciences: Quantum science – communication, computing, and sensing – has the potential to solve some intractable but very specific problem sets. Quantum technology remains in its infancy. However, as the number of qubits in quantum computers continues to grow, so does the possibility that traditional encryption will be broken outright. Quantum sensing can allow for much more precise atomic clocks, surpassing the precision timing of GPS, as well as quantum imaging that provides better results than classical imaging across a variety of wavelengths.
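
As a rough illustration of why encryption is the headline risk, the sketch below applies two widely cited results: Grover's algorithm roughly halves the effective strength of symmetric keys, and Shor's algorithm breaks RSA and elliptic-curve cryptography outright on a sufficiently large, fault-tolerant quantum computer (a machine that does not yet exist). Key names and sizes are illustrative.

```python
# Back-of-the-envelope sketch. Assumptions: Grover's quadratic speedup applies to
# brute-force key search (halving effective symmetric key strength), and Shor's
# algorithm defeats RSA/ECC once a large fault-tolerant quantum computer exists.
symmetric_keys = {"AES-128": 128, "AES-256": 256}
for name, bits in symmetric_keys.items():
    effective = bits // 2  # Grover: search cost drops from ~2^n to ~2^(n/2)
    print(f"{name}: ~2^{effective} quantum search steps (vs ~2^{bits} classically)")

for name in ("RSA-2048", "ECC P-256"):
    # Shor: factoring and discrete logs become polynomial-time problems, so
    # longer keys no longer buy exponential protection.
    print(f"{name}: broken in polynomial time by Shor's algorithm")
```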


Bioweapons/Biohacking: The democratization of biotechnology will mean that super-empowered individuals as well as nation-states will have the ability to engineer weapons and hacks that can augment friendly human forces or target and degrade enemy human forces (e.g., targeted disease or genetic modifications).


Personalized Warfare: Warfare is now waged on a personal level, where adversaries can attack the bank accounts of Soldiers’ families, infiltrate their social media, or even target them specifically by their genetics. The Army needs to understand that the individual Soldier can be exploited in many different ways, often through information publicly provided or stolen.

Deep Fakes/Information Warfare: Information warfare and “fake news” have played a prominent role in global politics over the last several years and could dominate the relationship between societies, governments, politicians, and militaries in the future operational environment. Information operations, thanks to big data and humanity’s ever-growing digital presence, are targeted at an extremely personal and specific level. One of the more concerning aspects of this is an artificial intelligence-based human image/voice synthesis technique known as deep fakes. Deep fakes can essentially put words in the mouths of prominent or trusted politicians and celebrities.
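
For readers unfamiliar with the mechanics, a common face-swap approach trains one shared encoder with a separate decoder per identity; swapping decoders at inference time transplants one person's likeness onto another's pose and expression. The PyTorch sketch below is a hypothetical, untrained toy with made-up dimensions, shown only to convey the architecture; production pipelines use convolutional networks and large face datasets.

```python
# Hypothetical shared-encoder / per-identity-decoder sketch (toy dimensions, no training).
import torch
import torch.nn as nn

class Encoder(nn.Module):          # compresses a face image to a latent code
    def __init__(self, latent=128):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64 * 3, 512), nn.ReLU(),
                                 nn.Linear(512, latent))
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):          # reconstructs a face for one specific identity
    def __init__(self, latent=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent, 512), nn.ReLU(),
                                 nn.Linear(512, 64 * 64 * 3), nn.Sigmoid())
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

shared_encoder = Encoder()
decoder_a = Decoder()   # would be trained on faces of person A
decoder_b = Decoder()   # would be trained on faces of person B

# The swap: encode a frame of person A, then decode it with person B's decoder,
# producing B's face with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)
fake_frame = decoder_b(shared_encoder(frame_of_a))
print(fake_frame.shape)  # torch.Size([1, 3, 64, 64])
```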


Multi-Domain Swarming: Swarming is often thought about in terms of unmanned aerial systems (UAS), but one significant pink flamingo is swarming taking place across multiple domains with self-organizing, autonomous aerial, ground, maritime (sub and surface), and even subterranean unmanned systems. U.S. defense systems on a linear modernization and development model will not be capable of dealing with the saturation and complexity issues arising from these multi-domain swarms.
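
The phrase "self-organizing" is doing real work here: swarm behavior can emerge from a few local rules with no central controller. The toy simulation below, a minimal 2-D "boids"-style sketch with invented weights and a single agent type, is meant only to illustrate that point, not to model any real multi-domain system.

```python
# Toy self-organization sketch: cohesion, separation, and alignment rules only.
import numpy as np

rng = np.random.default_rng(0)
pos = rng.uniform(0, 100, (30, 2))   # 30 agents in a 100x100 area
vel = rng.uniform(-1, 1, (30, 2))

def step(pos, vel, r=15.0):
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (d < r) & (d > 0)
        if nbrs.any():
            cohesion   = pos[nbrs].mean(axis=0) - pos[i]       # move toward neighbors
            separation = (pos[i] - pos[nbrs]).sum(axis=0)      # avoid crowding
            alignment  = vel[nbrs].mean(axis=0) - vel[i]       # match headings
            new_vel[i] += 0.01 * cohesion + 0.05 * separation + 0.05 * alignment
    speed = np.linalg.norm(new_vel, axis=1, keepdims=True)
    new_vel = np.where(speed > 2.0, new_vel / speed * 2.0, new_vel)  # cap speed
    return pos + new_vel, new_vel

for _ in range(100):
    pos, vel = step(pos, vel)
print(pos[:3])  # sample positions after 100 steps; no central controller involved
```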


Lethal Autonomy: An autonomous system with the ability to track, target, and fire without the supervision or authority of a human in or on the loop. The U.S. Army will have to examine its own policies regarding these issues, as well as those of our adversaries, who may be less deterred by ethical and policy concerns.


Tactical Nuclear Exchange: While strategic nuclear war and mutually assured destruction have been discussed and addressed ad nauseam, not enough attention has been given to the potential for a tactical nuclear exchange between state actors. One tactical nuclear attack, while not guaranteeing a nuclear holocaust, would bring about a myriad of problems for U.S. forces worldwide (e.g., the potential for escalation, fallout, contamination of water and air, and disaster response). Additionally, the electromagnetic pulse from a high-altitude nuclear burst has the potential to fry solid-state electronics across a wide area, with devastating results for the affected nation’s electrical grid, essential government services, and food distribution networks.

Leaders must anticipate these future possibilities in determining the character of future conflicts and in force design and equipping decisions. Using a mental model of black swans and pink flamingos provides a helpful framework for assessing the risks associated with these decisions.

For additional information on projected black swans for the next 20+ years, see the RAND Corporation’s Discontinuities and Distractions — Rethinking Security for the Year 2040.

47. Quanta of Competition

(Editor’s Note: Mad Scientist Laboratory is pleased to present the following post by repeat guest blogger Mr. Victor R. Morris. Strap in and prepare yourselves for a mind-expanding discussion on the competition field’s application of quantum field theory to political warfare and the extended battlefield!
Mr. Morris’ previous post addressing the cross-domain effects of human-machine networks may be read here.)

The competition field is a field of fields. It is the unification of the physical, information, electromagnetic and cyber, political warfare, and extended military battle fields, manifested through cross-field synergy and an information feedback loop.

The competition field interacts with the physical, information, and cyber and electromagnetic fields. Political warfare and extended military battle are field quanta and reach excitable states due to cross-field synergy and information exchange. These excitable states are unpredictable, yet measurable via probability in the competition continuum. The measurements correlate to the information feedback loop of relative and finite information. The feedback loop results from system interactions, decision-making, effects, and learning. Learning drives interactions, ensuring information exchange in the competition continuum.

The competition field concept was developed from quantum mechanics, multi-domain battle operational frameworks, and geostrategic competition fundamentals to address grand strategy design, long-term, strategic inter-state competition, and non-state actor considerations in macro scale and spacetime.

The concept applies quantum field theory to political warfare and the “extended battlefield,” where Joint and multinational systems are the quanta of these fields, prone to excitable states like field quanta. In quantum mechanics, a “quantum” is the minimum amount of a physical entity involved in an interaction, such as a photon or a bit. The concept also unites the “Gray Zone” with the political warfare field interacting with the extended military battlefield.

Multi-domain battle and gray zone phenomena result from interactions in the extended military battle and political warfare fields. In quantum field theory, “interactions” refer to particles and corresponding underlying quantum fields. The competition field is the fundamental starting point for strategy design and system of systems thinking.

War/conflict, “Gray Zone,” and peace manifest based on uncertain, yet probability-determined interactions that drive decision-making, effects, and learning to continue the feedback loop of finite information. In the competition field, competition is relative or relational to information. Information does not measure what is known, but the probabilities of something. The competition field correlates the scientific and granular notions of information with the Operational Environment’s fields (also called domains) and physical systems during interactions. Systems are quantized like subatomic particles in the form of Centers of Gravity (COG), subsystems, critical factors, flows, nodes, and entities.
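
One concrete way to read "information measures probabilities" is Shannon's formulation, offered here as an illustrative gloss rather than something the author cites: the expected information gained from observing an outcome depends only on the probability distribution over the possible outcomes.

```python
# Minimal illustration: information as a function of probabilities, not of what is known.
import math

def shannon_entropy(probabilities):
    """Expected information, in bits, gained by observing one outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: maximum uncertainty for two outcomes
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: a near-certain outcome carries little information
```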

System and particle interactions are uncertain; they are not the deterministic predictions described in exporting security as a preventive war strategy or in Newtonian physics. Measures short of war and war itself (i.e., violent or armed competition) are interactions in the competition field based on convergence, acceleration, force, distance, time, and other variables. Systems or things do not enter into relations; rather, relations ground the notion of the system.

The information environment is also a field of fields. It exists with the physical, electromagnetic, cyberspace, and space-time fields in the competition field. In Joint doctrine, this is the holistic operational environment. Quantum mechanics’ granularity, relationality, and uncertainty within this field are described in the cognitive, informational, and physical dimensions.

These dimensions or fields include the quanta of human beings, Internet of Things (IoT), data, and individual or group decision-making. The cognitive dimension encompasses the minds of those who transmit, receive, and respond to or act on information.

The cognitive dimension is the most important component of the information environment and influences decision-making in the competition field. The scientific notion of information and the measurement of probabilities of occurrence are the largest contributors to understanding both quantum physics and the concept of competition.

Colonel John Boyd, a military strategist, was a student of Sun Tzu and Clausewitz and studied military history to see where concepts overlapped and diverged. He knowingly or unknowingly described the postulates of quantum mechanics when he critiqued Clausewitz’s center of gravity concept. He suggested finding the thing that allows the organic whole to stay connected and breaking down those connections.

In theories of quantum gravity, that “thing” is the quanta of gravity, hypothetically called a graviton. In this assessment, it is the quanta of competition. The quanta of competition are not in competition; they are themselves competition and are described by links and the relation they express. The quanta of competition are also suited for quantum biology, since they involve both biological and environmental objects and problem sets.

Additionally, what Clausewitz described as polarity, intelligence, and friction are information at the quantum state. Position, momentum, spin, and the polarization of entangled particles are measured and correlated. The constant exchange of relevant and irrelevant information occurs as competition field quanta interact in the competition continuum.

In this vision, Joint and multinational systems are their own fields, oscillating in the political and extended military battle fields. Interactions manifest forces to exploit windows of superiority, seize the initiative, and attain positions of relative advantage in the competition continuum. Interagency and intergovernmental systems are also manifested in granular and relational manners to enable these objectives. This is only possible through combination, cooperation, and information.

The competition field attempts to explain the relationship between the holistic operational environment and physical systems, bridging quantum mechanics and geostrategic competition constructs.

Clausewitz said, “War is merely a continuation of policy by other means.” Policy is a continuation of processes and events between interactions. Lethal or non-lethal effects are based on the measurement of possible alternatives enumerated by reciprocal information and the ability to make decisions in the competition field.

Victor R. Morris is a civilian irregular warfare and threat mitigation instructor at the Joint Multinational Readiness Center (JMRC) in Germany.

44. Megacities: Future Challenges and Responses

“Cities now sprawl over large areas of the globe and contain almost two-thirds of the world’s population. These numbers will only increase. Some megacities will become more important politically and economically than the nation-state in which they reside…. Furthermore, the move of large numbers of people to large urban areas and megacities will strain resources, as these areas will become increasingly reliant on rural areas for food, water, and even additional power. From a military perspective, cities represent challenges, opportunities, and unique vulnerabilities.” The Operational Environment and the Changing Character of Future Warfare

The U.S. Army Training and Doctrine Command (TRADOC) G-2, in partnership with U.S. Army Pacific (USARPAC) and the Australian Army, facilitated the Multi-Domain Battle (MDB) in Megacities Conference on April 3-4, 2018 at Fort Hamilton, New York. Briefings and videos from this event are now posted on the Mad Scientist APAN Site’s MDB in Megacities Conference Page and the TRADOC G-2 Operational Environment Enterprise YouTube Channel.

To whet your appetite while we await publication of the preliminary results from the aforementioned conference, the Mad Scientist Laboratory has extracted and reiterated below key findings from the Mad Scientist Megacities and Dense Urban Areas Initiative in 2025 and Beyond Conference Final Report. This conference, facilitated in April 2016 by the TRADOC G-2, Arizona State University Research Enterprise (ASURE), Army Capabilities Integration Center (ARCIC), and the Army’s Intelligence Center of Excellence (ICoE), sought to ensure that no U.S. Army Soldier will ever be disadvantaged when operating in an urban environment. The future challenges and responses identified at this conference are presented below:

Future challenges that U.S. forces will face when operating in a megacity environment include:

• Rapid growth in urban areas will produce more demand on the infrastructure and flow systems, more waste, and increased urban density.

• A major challenge of megacities is density (data, people, and infrastructure).

• The absence of clearly demarcated boundaries for the area of operations will be problematic.

• The Army will have to consider the rural and regional areas around megacities as well as the worldwide implications of operations within megacities.

• The proliferation of advanced weaponry, coupled with the rapid digital spread of information and ideology, allows anyone to be a threat and will lead to growing instability in many parts of the world.

• Changing infrastructure, subcultures, and places to “hide in plain sight” present a particular challenge to data gathering.

• Megacities in close proximity to large bodies of water are more susceptible to natural and manmade disasters. Extreme water events caused by floods, hurricanes, typhoons, and tsunamis will exacerbate life-threatening situations in areas of increased urbanization.

• Urban vertical and subterranean warfare significantly complicates Army operations, freedom of movement, and force protection.

• Disease in megacities can result in catastrophic, global outcomes. Infectious disease will interact with urbanization, impacting military missions (e.g., warfare, humanitarian missions, and force protection). Rapid growth of dense urban areas in developing countries will continue to push people into environments that put them in greater contact with animal reservoirs of disease. Denial, fear, misinformation, decontamination, and disposal are among the many factors future military forces may have to contend with.

(Note: many of these were highlighted at last week’s MDB in Megacities Conference)

Future Army Concepts and Doctrine should account for the following areas:

• Adoption of a city as a system-of-systems perspective will require adaptation of a significant portion of Army doctrine, resulting in an urban analytic framework tailored to address the operational data layers found within urban centers, their environmental dynamism, and their state of connectedness (a toy sketch of such a layered representation follows this list).

• The dynamic nature of urban environments demands an expansion of traditional Intelligence Preparation of the Battlefield (IPB) thinking. IPB often fails to capture the dynamics between the components of problems within an interactively complex system and is not well suited to an interactively complex Operational Environment. The basic definition of IPB often does not take into account how the variables explaining Dense Urban Areas are increasingly interconnected, offers little instruction on how to address a complex, multidimensional environment, and provides little operational advice or few examples.

• Megacities research needs to better address the likelihood of more lethal competitors. Current mental models are stuck on non-hybrid, warrior-like opponents.

• Changes in doctrine are needed to enable the development of knowledge experts on megacities, with personnel assigned to monitor specific cities.

• Greater emphasis must be placed on strategically supporting, manipulating, and/or undermining the flows, infrastructure, and systems of the megacity, as opposed to the current emphasis on kinetic, military tasks.

• The Army must change its thinking to focus more on rigorous big data-driven analysis, instead of relying largely on the same reductionist models that limit holistic thinking.

• The Army must change its attitude towards cyberwarfare and innovate new ideas and concepts for warfare. This is especially important in cities with high densities of smart technology, where the Internet of Things (IoT) might provide a wealth of intelligence information.

• A shift in how medical data is defined, stored, captured, visualized, and shared is needed for more easily transportable semi-autonomous and autonomous Tactical Combat Casualty Care capabilities to support future missions. This will require a paradigm shift in the practice of operational medicine from an “art” that employs subjective measures to assess and treat, to a “science” based on employing objective, quantifiable measures.

• Faster technological iteration and adaptation is needed, as opposed to large, long-term development, acquisition, and sustainment programs. Smaller, faster, and more flexible systems will be needed to supplement or supersede existing weapons and other systems, with rapid prototyping, small automated production runs, remote software updates, and development and deployment cycles that can upgrade a Soldier’s tools in months or weeks.
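
As a purely illustrative companion to the system-of-systems point above, the sketch below shows one way operational data layers and their cross-layer dependencies might be represented; every layer, node, and link name is a placeholder invented for the example, not drawn from the conference report.

```python
# Toy "system of systems" sketch: a city as named data layers plus cross-layer links.
city = {
    "infrastructure": {                      # physical flow systems
        "nodes": ["power_plant", "water_treatment", "port", "hospital"],
        "edges": [("power_plant", "water_treatment"), ("power_plant", "hospital")],
    },
    "population": {                          # demographic / social layer
        "nodes": ["district_north", "district_south", "informal_settlement"],
        "edges": [("district_north", "district_south")],
    },
    "information": {                         # digital / media layer
        "nodes": ["cell_network", "social_media", "local_tv"],
        "edges": [("cell_network", "social_media")],
    },
}

# Cross-layer dependencies are where cascading effects live: degrade a node in one
# layer and trace which nodes in other layers are exposed.
cross_layer = [
    ("power_plant", "cell_network"),
    ("cell_network", "district_south"),
]

def exposed_by(node):
    """Nodes in other layers that depend (directly) on the given node."""
    return [dst for src, dst in cross_layer if src == node]

print(exposed_by("power_plant"))   # ['cell_network']
```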

For additional insights regarding combat in urban terrain, please listen to the following podcasts, hosted by our colleagues at Modern War Institute:

The Battle for Mosul, with Col. Pat Work

The Future Urban Battlefield, with Dr. Russell Glenn

See Dr. Russell Glenn’s guest blog post, “Megacities: The Time is Nigh”

Also see the TRADOC G-2 Operational Environment Enterprise (OEE) Red Diamond Threats Newsletter, Volume 9, Issue 1, January-February 2018, pages 18-21, for Manila: An Exemplar of Dense Urban Terrain. This article “illustrates the complex political and civil-military challenges that would impact potential operations or activities in megacities.”

Please also see Jeremy D. McLain’s article (submitted in response to our Soldier 2050 Call for Ideas) entitled, Full-Auto Teddy Bear: Non-Lethal Automatons and Lethal Human Teaming to Increase Overall ‘Lethality’ in Complex Urban Environments, published by our colleagues at Small Wars Journal.