[Editor’s Note: In today’s post, Mad Scientist Laboratory explores China’s whole-of-nation approach to exploiting operational environments, synchronizing government, military, and industry activities to change geostrategic power paradigms via competition in 2035. Excerpted from products previously developed and published by the TRADOC G-2’s Operational Environment and Threat Analysis Directorate (see links below), this post describes China’s approach to exploitation and identifies the implications for the U.S. Army — Enjoy!]
The Operational Environment is envisioned as a continuum, divided into two eras: the Era of Accelerated Human Progress (now through 2035) and the Era of Contested Equality (2035 through 2050). This latter era is marked by significant breakthroughs in technology and convergences in terms of capabilities, which lead to significant changes in the character of warfare. During this period, traditional aspects of warfare undergo dramatic, almost revolutionary changes which at the end of this timeframe may even challenge the very nature of warfare itself. In this era, no one actor is likely to have any long-term strategic or technological advantage, with aggregate power between the U.S. and its strategic competitors being equivalent, but not necessarily symmetric. Prevailing in this period will depend on an ability to synchronize multi-domain capabilities against an artificial intelligence-enhanced adversary with an overarching capability to visualize and understand the battlespace at even greater ranges and velocities. Equally important will be controlling information and the narrative surrounding the conflict. Adversaries will adopt sophisticated information operations and narrative strategies to change the context of the conflict and thus defeat U.S. political will.
The future strategic environment will be characterized by a persistent state of competition where global competitors seek to exploit the conditions of operational environments to gain advantage. Adversaries understand that the application of any or all elements of national power in competition just below the threshold of armed conflict is an effective strategy against the U.S.
China is rapidly modernizing its armed forces and developing new approaches to warfare. Beijing has invested significant resources into research and development of a wide array of advanced technologies. Coupled with its time-honored practice of reverse engineering technologies or systems it purchases or acquires through espionage, this effort likely will allow China to surpass Russia as our most capable threat sometime around 2030.
China’s Approach to Exploitation
China’s whole-of-nation approach, which involves synchronization of actions across government, military, and industry, will facilitate exploitation of operational environments and enable it to gain global influence through economic exploitation.
China will leverage the international system to advance its own interests while attempting to constrain others, including the U.S.
Preferred Conditions and Methods
The following conditions and methods are conducive to exploitation by China, enabling it to shape the strategic environment in 2035:
Infrastructure Capacity Challenges: China targets undeveloped and fragile environments where its capital investments, technology, and human capital can produce financial gains and generate political influence.
Interconnected Economies: China looks for partners and opportunities to become a significant stakeholder in a wide variety of economies in order to capitalize on its investments as well as generate political influence.
Specialized Economies: China looks for opportunities to partner with specialized markets and leverage their vulnerabilities for gain.
Technology Access Gaps: China targets areas where its capital investments in technology provide partners with key resources and competitive advantages by filling technology gaps.
Implications for the U.S. Army:
Traditional Army threat paradigms may not be sufficient for competition.
The Army could be drawn into unanticipated escalation as a result of China’s activities during the competition phase.
Army military partnerships will likely be undermined by China in 2035.
Army operations and engagements will be increasingly impacted by the pervasiveness of Chinese goods, technology, infrastructure, and systems.
If you enjoyed this post, please see the original paper and associated infographic of the same title, both by the TRADOC G-2’s Operational Environment and Threat Analysis Directorate and hosted on their All Partners Access Network (APAN) site…
… and read the following MadSci Laboratory blog posts:
[Editor’s Note: The United States Army faces multiple, complex challenges in tomorrow’s Operational Environment (OE), confronting strategic competitors in an increasingly contested space across every domain (land, air, maritime, space, and cyberspace). The Mad Scientist Initiative, the U.S. Army Training and Doctrine Command (TRADOC) G-2 Futures, and Army Futures Command (AFC) Future Operational Environment Cell have collaborated with representatives from industry, academia, and the Intelligence Community to explore the blurring lines between competition and conflict, and the character of great power warfare in the future. Today’s post captures our key findings regarding the OE and what will be required to successfully compete, fight, and win in it — Enjoy!]
Alternative Views of Warfare: The U.S. Army’s view of the possible return to Large Scale Combat Operations (LSCO) and capital systems warfare might not be the future of warfare. Near-peer competitors will seek to achieve national objectives through competition short of conflict, and regional competitors and non-state actors will effectively compete and fight with smaller, cheaper, and greater numbers of systems against our smaller number of exquisite systems. However, preparation for LSCO and great state warfare may actually contribute to its prevention.
Competition and Conflict are Blurring: The dichotomy of war and peace is no longer a useful construct for thinking about national security or the development of land force capabilities. There are no longer defined transitions from peace to war and competition to conflict. This state of simultaneous competition and conflict is continuous and dynamic, but not necessarily cyclical. Potential adversaries will seek to achieve their national interests short of conflict, employing a range of actions – from cyber attacks to kinetic strikes against unmanned systems – that walk up to the line of a short or protracted armed conflict. Authoritarian regimes can more easily ensure unity of effort and whole-of-government coordination than Western democracies can, and they work to exploit fractures and gaps in decision-making, governance, and policy.
The globalization of the world – in communications, commerce, and belligerence (short of war) – as well as the fragmentation of societies and splintering of identities has created new factions and “tribes,” and opened the aperture on who has offensive capabilities that were previously limited to state actors. Additionally, the concept of competition itself has broadened as social media, digital finance, smart technology, and online essential services add to a growing target area.
Adversaries seek to shape public opinion and influence decisions through targeted information operations campaigns, often relying on weaponized social media. Competitors invest heavily in research and development in burgeoning technology fields – Artificial Intelligence (AI), quantum sciences, and biotech – and engage in technology theft to weaken U.S. technological superiority. Cyber attacks and probing are used to undermine confidence in financial institutions and critical government and public functions – Supervisory Control and Data Acquisition (SCADA), voting, banking, and governance. Competition and conflict are occurring across all instruments of power throughout the entirety of the Diplomatic, Information, Military, and Economic (DIME) model.
Cyber actions raise the question of what is the threshold to be considered an act of war. If an adversary launches a cyber attack against a critical financial institution and an economic crisis results – is it an act of war? There is a similar concern regarding unmanned assets. While the kinetic destruction of an unmanned system may cost millions, no lives are lost. How much damage without human loss of life is acceptable?
Nuclear Deterrence limits Great Power Warfare: Multi-Domain Operations (MDO) is predicated on a return to Great Power warfare. However, nuclear deterrence could make that eventuality less likely. The U.S. may be competing more often below the threshold of conventional war and the decisive battles of the 20th Century (e.g., Midway and Operation Overlord). The two most threatening adversaries – Russia and China – have substantial nuclear arsenals, as does the United States, which will continue to make Great Power conventional warfare a high risk / high cost endeavor. The availability of non-nuclear capabilities that can deliver regional and global effects is a new attribute of the OE. This further complicates the deterrence value of militaries and the escalation theory behind flexible deterrent options. The inherent implications of cyber effects in the real world – especially in economies, government functions, and essential services – further exacerbate the blurring between competition and conflict.
Hemispheric Competition and Conflict: Over the last twenty years, Russia and China have been viewed as regional competitors in Eurasia or Southeast Asia. These competitors will seek to undermine and fracture traditional Western institutions, democracies, and alliances. Both are transitioning to a hemispheric threat with a primary focus on challenging the U.S. Army all the way from its home station installations (i.e., the Strategic Support Area) to the Close Area fight. We can expect cyber attacks against critical infrastructure, the use of advanced information warfare such as deep fakes targeting units and families, and the possibility of small scale kinetic attacks during what were once uncontested administrative actions of deployment. There is no institutional memory for this threat, and simply adding time and deployment-speed requirements to exercises is not enough to rehearse MDO.
Disposable versus Exquisite: Current thinking espouses technologically advanced and expensive weapons platforms over disposable ones, which brings with it an aversion to employing these exquisite platforms in contested domains and an inability to rapidly reconstitute them once they are committed and subsequently attrited. In LSCO with a near-peer competitor, the ability to reconstitute will be imperative. The Army (and larger DoD) may need to shift away from large and expensive systems to cheap, scalable, and potentially even disposable unmanned systems (UxS). Additionally, increases in miniaturized computing power in cheaper systems, coupled with advances in machine learning, could enable massed precision rather than forcing a trade-off between precision and mass.
This challenge is exacerbated by the ability for this new form of mass to quickly aggregate/disaggregate, adapt, self-organize, self-heal, and reconstitute, making it largely unpredictable and dynamic. Adopting these capabilities could provide the U.S. Army and allied forces with an opportunity to use mass precision to disrupt enemy Observe, Orient, Decide, and Act (OODA) loops, confuse kill chains/webs, overwhelm limited adversary formations, and exploit vulnerabilities in extended logistics tails and advanced but immature communication networks.
Human-Starts-the-Loop: There have been numerous discussions and debate over whether armed forces will continue to have a “man-in-the-loop” regarding Lethal Autonomous Weapons Systems (LAWS). Lethal autonomy in future warfare may instead be “human-starts-the-loop,” meaning that humans will be involved in the development of weapons/targeting systems – establishing rules and scripts – and will initiate the process, but will then allow the system to operate autonomously. It has been stated that it would be ethically disingenuous to remain constrained by “human-on-the-loop” or “human-in-the-loop” constructs when our adversaries are unlikely to similarly restrict their own autonomous warfighting capabilities. Further, the employment of this approach could impact the Army’s MDO strategy. The effects of “human-starts-the-loop” on the kill chain – shortening, flattening, or otherwise dispersing – would necessitate changes in force structuring that could maximize resource allocation in personnel, platforms, and materiel. This scenario presents the Army with an opportunity to execute MDO successfully with increased cost savings, by: 1) Conducting independent maneuver – more agile and streamlined units moving rapidly; 2) Employing cross-domain fires – efficiency and speed in targeting and execution; 3) Maximizing human potential – putting capable Warfighters in optimal positions; and 4) Fielding in echelons above brigade – flattening command structures and increasing efficiency.
Emulation and the Accumulation of Advantages: China and Russia are emulating many U.S. Department of Defense modernization and training initiatives. China now has Combat Training Centers. Russia has programs that mirror the Army’s Cross Functional Team initiatives and the Artificial Intelligence (AI) Task Force. China and Russia are undergoing their own versions of force modernization to better professionalize the ranks and improve operational reach. Within these different technical spaces, both China and Russia are accumulating advantages that they envision will blunt traditional U.S. combat advantages and the tenets described in MDO. However, both nations remain vulnerable and dependent on U.S. innovations in microelectronics, as well as the challenges of incorporating these technologies into their own doctrine, training, and cultures.
Our “Tenth Man” – Challenging our Assumptions about the Operational Environment and Warfare posts, where Part 1 discusses whether the future fight will necessarily even involve LSCO and Part 2 addresses the implications of a changed or changing nature of war.
[Editor’s Note: The U.S. Army Futures Command (AFC) and Training and Doctrine Command (TRADOC) co-sponsored the Mad Scientist Disruption and the Operational Environment Conference with the Cockrell School of Engineering at The University of Texas at Austin on 24-25 April 2019 in Austin, Texas. Today’s post is excerpted from this conference’s Final Report and addresses how the speed of technological innovation and convergence continues to outpace human governance. The U.S. Army must not only consider how best to employ these advances in modernizing the force, but also the concomitant ethical, moral, and legal implications their use may present in the Operational Environment (see links to the newly published TRADOC Pamphlet 525-92, The Operational Environment and the Changing Character of Warfare, and the complete Mad Scientist Disruption and the Operational Environment Conference Final Report at the bottom of this post).]
Technological advancement and subsequent employment often outpaces moral, ethical, and legal standards. Governmental and regulatory bodies are then caught between technological progress and the evolution of social thinking. The Disruption and the Operational Environment Conference uncovered and explored several tension points that the Army may be challenged by in the future.
Space is one of the least explored domains in which the Army will operate; as such, we may encounter a host of associated ethical and legal dilemmas. In the course of warfare, if the Army or an adversary intentionally or inadvertently destroys commercial communications infrastructure – GPS satellites – the ramifications to the economy, transportation, and emergency services would be dire and deadly. The Army will be challenged to consider how and where National Defense measures in space affect non-combatants and American civilians on the ground.
International governing bodies may have to consider what responsibility space-faring entities – countries, universities, private companies – will have for mitigating orbital congestion caused by excessive launching and the aggressive exploitation of space. If the Army is judicious with its own footprint in space, it could reduce the risk of accidental collisions and unnecessary clutter and congestion. It is extremely expensive to clean up space debris and deconflicting active operations is essential. With each entity acting in their own self-interest, with limited binding law or governance and no enforcement, overuse of space could lead to a “tragedy of the commons” effect.1 The Army has the opportunity to more closely align itself with international partners to develop guidelines and protocols for space operations to avoid potential conflicts and to influence and shape future policy. Without this early intervention, the Army may face ethical and moral challenges in the future regarding its addition of orbital objects to an already dangerously cluttered Low Earth Orbit. What will the Army be responsible for in democratized space? Will there be a moral or ethical limit on space launches?
Autonomy in Robotics
Robotics has become pervasive and normalized in military operations in the post-9/11 Operational Environment. However, the burgeoning field of autonomy in robotics, with the potential to supplant humans in time-critical decision-making, will bring about significant ethical, moral, and legal challenges that the Army and the larger DoD are currently facing. This issue will be exacerbated in the Operational Environment by increased utilization of and reliance on autonomy.
The increasing prevalence of autonomy will raise a number of important questions. At what point is it more ethical to allow a machine to make a decision that may save lives of either combatants or civilians? Where does fault, responsibility, or attribution lie when an autonomous system takes lives? Will defensive autonomous operations – air defense systems, active protection systems – be more ethically acceptable than offensive – airstrikes, fire missions – autonomy? Can Artificial Intelligence/Machine Learning (AI/ML) make decisions in line with Army core values?
Deepfakes and AI-Generated Identities, Personas, and Content
A new era of Information Operations (IO) is emerging due to disruptive technologies such as deepfakes – videos that are constructed to make a person appear to say or do something that they never said or did – and Generative Adversarial Networks (GANs) that produce fully original faces, bodies, personas, and robust identities.2 Deepfakes and GANs are alarming to national security experts as they could trigger accidental escalation, undermine trust in authorities, and cause unforeseen havoc. This is amplified by content such as news, sports, and creative writing similarly being generated by AI/ML applications.
This new era of IO has many ethical and moral implications for the Army. In the past, the Army has utilized industrial and early information age IO tools such as leaflets, open-air messaging, and cyber influence mechanisms to shape perceptions around the world. Today and moving forward in the Operational Environment, advances in technology create ethical questions such as: is it ethical or legal to use cyber or digital manipulations against populations of both U.S. allies and strategic competitors? Under what title or authority does the use of deepfakes and AI-generated images fall? How will the Army need to supplement existing policy to include technologies that didn’t exist when it was written?
AI in Formations
With the introduction of decision-making AI, the Army will be faced with questions about trust, man-machine relationships, and transparency. Does AI in cyber require the same moral benchmark as lethal decision-making? Does transparency equal ethical AI? What allowance for error in AI is acceptable compared to humans? Where does the Army allow AI to make decisions – only in non-combat or non-lethal situations?
Commanders, stakeholders, and decision-makers will need to gain a level of comfort and trust with AI entities exemplifying a true man-machine relationship. The full integration of AI into training and combat exercises provides an opportunity to build trust early in the process before decision-making becomes critical and life-threatening. AI often includes unintentional or implicit bias in its programming. Is bias-free AI possible? How can bias be checked within the programming? How can bias be managed once it is discovered and how much will be allowed? Finally, does the bias-checking software contain bias? Bias can also be used in a positive way. Through ML – using data from previous exercises, missions, doctrine, and the law of war – the Army could inculcate core values, ethos, and historically successful decision-making into AI.
If existential threats to the United States increase, so does pressure to use artificial and autonomous systems to gain or maintain overmatch and domain superiority. As the Army explores shifting additional authority to AI and autonomous systems, how will it address the second and third order ethical and legal ramifications? How does the Army rectify its traditional values and ethical norms with disruptive technology that rapidly evolves?
The “Ethics and the Future of War” panel, facilitated by LTG Dubik (USA-Ret.) at the Mad Scientist Visualizing Multi Domain Battle 2030-2050 Conference, held at Georgetown University on 25-26 July 2017.
Just Published! TRADOC Pamphlet 525-92, The Operational Environment and the Changing Character of Warfare, 7 October 2019, describes the conditions Army forces will face and establishes two distinct timeframes characterizing near-term advantages adversaries may have, as well as breakthroughs in technology and convergences in capabilities in the far term that will change the character of warfare. This pamphlet describes both timeframes in detail, accounting for all aspects across the Diplomatic, Information, Military, and Economic (DIME) spheres to allow Army forces to train to an accurate and realistic Operational Environment.
[Editor’s Note: Mad Scientist Laboratory is pleased to publish today’s post by guest blogger Zachary Kallenborn. In the first of a series of posts, Mr. Kallenborn addresses how the convergence of emerging technologies is eroding barriers to terrorist organizations acquiring the requisite equipment, materiel, and expertise to develop and deliver chemical, biological, radiological, and nuclear (CBRN) agents in an attack. Learn about the challenges that (thankfully) remain and the ramifications for the operational environment. (Note: Some of the embedded links in this post are best accessed using non-DoD networks.)]
On the evening of July 15, 2034, 264 West Point cadets reported to the hospital with a severe, but unknown illness. West Point Military Police (MP) investigated the incident and discovered video footage of two men launching several autonomous drones from a pickup truck near the base, then driving off. A suspicious fire the same night at a local apartment complex revealed remnants of 3D printers and synthetic biology kits. The investigation remains ongoing…
Such a scenario is fantasy, but increasingly plausible.
Various emerging technologies reduce the barriers to chemical, biological, radiological, and nuclear (CBRN) terrorism — bioterrorism in particular. The convergence of these technologies may allow terrorists to acquire CBRN weapons with minimal identifiable signatures. Although these technologies exist today, their sophistication, availability, and terrorist interest in their use are likely to grow over the coming decades. For example, the first powered model airplane was flown in 1937; however, terrorists did not attempt to use drones until 1994.1 Thankfully, major challenges will still inhibit truly catastrophic CBRN terror.
CBRN weapon acquisition is a difficult task for terrorist organizations. Terrorists must acquire significant specialized equipment, materiel, expertise, and the organizational capabilities to support the acquisition of such weapons and a physical location to assemble them. Even supposed successes like Aum Shinrikyo’s attack on the Tokyo subway were not nearly as impactful as they could have been. Aum’s biological weapons program was also a notable failure. In one instance, a member of the cult fell into a vat of Clostridium botulinum (the bacteria that produces the botulinum toxin) and emerged unharmed.2 As a result, only 1-2% of terrorist organizations pursue or use CBRN weapons.3 But these barriers are eroding.
3D printing may ease the acquisition of some equipment and materiel. 3D printers can be used to create equipment components at reduced cost and have been used to create bioreactors, microscopes, and other key elements.4 Bioprinters can also create tissue samples to test weapons agents.5 The digital build-files for 3D printed items can also be sent and received online, perhaps from black market sellers or individuals sympathetic to the terrorist’s ideology.6
Synthetic biology offers improved access to biological weapons agents, especially to otherwise highly controlled agents. Synthetic biology can be used to create new or modify existing organisms.7 According to the World Health Organization, synthetic biology techniques could plausibly allow recreation of the variola virus (smallpox).8 That is especially significant because the virus only exists in two highly secure laboratories.9
Delivery of a CBRN agent can also be a challenge. CBRN agents useful for mass casualty attacks rely on the air to carry the agent to an adversary (nuclear weapons are an obvious exception, but the likelihood of a terrorist organization acquiring a nuclear weapon is extremely low). Poor wind conditions, physical barriers, rain, and other environmental conditions can inhibit delivery. Biological weapons also require spray systems that can create droplets of an appropriate size, so that the agent is light enough to float in the air, but heavy enough to enter the lungs (approximately 1-10 microns).
Drones also make CBRN agent delivery easier. Drones offer terrorists access to the air. Terrorists can use them to fly over physical barriers, such as fencing or walls to carry out an attack. Drones also give terrorists more control over where they launch an attack: they can choose a well-defended position or one proximate to an escape route. Although small drone payload sizes limit the amount of agent that can be delivered, terrorists can acquire multiple drones.
Advances in drone autonomy allow terrorists to control more drones at once.10 Autonomy also allows terrorists to launch more complex attacks, perhaps directing autonomous drones to multiple targets or follow a path through multiple, well-populated areas. Greater autonomy also reduces the risks to the terrorists, because they can flee more readily from the area.
3D printing can also help with CBRN agent delivery. Spray-tanks and nozzles subject to export controls can be 3D printed.11 3D printers can also be used to make drones.12 3D printers also provide customizability to adapt these systems for CBRN agent delivery.
CBRN weapons acquisition also requires significant technical expertise. Terrorist organizations must correctly perform complex scientific procedures, know which procedures to use, know which equipment and materials are needed, and operate the equipment. They must do all of that without harming themselves or others (harming innocents may not seem like a concern for an organization intent on mass harm; however, it would risk exposure of the larger plot.) Much of this knowledge is tacit, meaning that it is based on experience and cannot be easily transferred to other individuals.
Emerging technologies do not drastically reduce this barrier, though experts disagree. For example, genome-synthesis requires significant tacit knowledge that terrorists cannot easily acquire without relevant experience.13 Likewise, 3D printers are unlikely to spit out a completely assembled piece of equipment. Rather, 3D printers may provide parts that need to be assembled into a final result. However, some experts argue that as technologies become more ubiquitous, they will be commercialized and made easier to use.14 While this technology is likely to become more accessible, physical limitations will place an upper bound on how accessible it can become.
The Future Operational Environment
If CBRN terrorism is becoming easier, U.S. forces can expect to be at greater risk of CBRN attack and to face such attacks more frequently. An attack with infectious biological weapons from afar would not likely be discovered until well after the attack took place. Although still quite unlikely, a major biological attack could cause massive harm. Timed correctly, a CBRN terror attack could delay deployment of troops to a combat zone, inhibit launch of close-air support assets, or harm morale by delaying delivery of delicious pizza MREs.15 Off the battlefield, troops may have less access to protective gear and be at greater risk of harm. Even a poorly made agent can harm military operations: quarantines must still be established and operations limited until the risk is neutralized or at least determined to be non-harmful.
However, counter-intuitively, terrorist demand for CBRN weapons may actually decrease, because emerging technologies also offer easier pathways to mass casualties. These risks will be explored in the next article in this series.
Zachary Kallenborn is a freelance researcher and analyst, specializing in Chemical, Biological, Radiological, and Nuclear (CBRN) weapons, CBRN terrorism, drone swarms, and emerging technologies writ large. His research has appeared in the Nonproliferation Review, Studies in Conflict and Terrorism, Defense One, the Modern War Institute at West Point, and other outlets. His most recent study, Swarming Destruction: Drone Swarms and CBRN Weapons, examines the threats and opportunities of drone swarms for the full scope of CBRN weapons.
Disclaimer: The views expressed in this blog post do not necessarily reflect those of the Department of Defense, Department of the Army, Army Futures Command (AFC), or Training and Doctrine Command (TRADOC).
7 Committee on Strategies for Identifying and Addressing Potential Biodefense Vulnerabilities Posed by Synthetic Biology, “Biodefense in the Age of Synthetic Biology,” (Washington DC: The National Academies Press, 2018), 9.
[Editor’s Note: Mad Scientist Laboratory welcomes returning guest blogger and proclaimed Mad Scientist Mr. Howard R. Simkin with his submission to our Mad Scientist Crowdsourcing topic from earlier this summer on The Operational Environment: What Will Change and What Will Drive It – Today to 2035? Mr. Simkin’s post addresses the military challenges posed by Splinternets. Competition during Multi-Domain Operations is predicated on our Forces’ capability to conduct cyber and influence operations against and inside our strategic competitors’ networks. In a world of splinternets, our flexibility to conduct and respond to non-kinetic engagements is challenged by this new reality in the operational environment. (Note: Some of the embedded links in this post are best accessed using non-DoD networks.)]
This paper discusses the splintering of the Internet that is currently underway – the creation of what are commonly being called splinternets. Most versions of the future operational environment assume an Internet that is largely accessible to all. Recent trends point to a splintering effect as various nation states or multi-state entities seek to regulate access to or isolate their portion of the Internet.1,2 This paper will briefly discuss the impacts of those tendencies and propose an operational response.
What are the impacts of a future operational environment in which the Internet has fractured into a number of mutually exclusive subsets, referred to as splinternets?
Splinternets threaten both access to data and the exponential growth of the Internet as a global commons. Two drivers, both rooted in politics, are fracturing the Internet: regulation and isolationism. Counterbalancing this fracturing is the Distributed Web (DWeb).
Regulation is usually driven by revenue or internal security concerns. While often admirable in intent, regulation casts a chill over the growth and health of the Internet.3 Even well-intentioned regulations become a burden that forces smaller operators out of business or into noncompliance. Depending on the country involved, activity which was perfectly legal can become illegal by bureaucratic fiat, further driving users to alternative platforms. An example is the European Union (EU) General Data Protection Regulation (GDPR), which came into effect on 25 May 2018. It includes a number of provisions which make it far more difficult to collect data, and it covers not only entities based in the EU but also those with users in the EU.4 U.S. companies such as Facebook have scrambled to comply so as to maintain access to the EU virtual space.5
China is the leader in efforts to isolate its portion of the Internet from outside influence.6 To accomplish this, it has received help from its own tech giants as well as from U.S. companies such as Google.7 The Chinese have made it very difficult for outside entities to penetrate the “Great Firewall” while maintaining the ability of the People’s Liberation Army (PLA) to conduct malign activities across the Internet.8 Recently, Eric Schmidt, the former CEO of Google, opined that China would succeed in splitting the Internet in the not-too-distant future.9
Russia has also proposed a similar strategy, which it would extend to the BRICS states (Brazil, Russia, India, China, and South Africa). The reason given is the “dominance of the US and a few EU states concerning Internet regulation,” which Russia sees as a “serious danger” to its safety, RosBiznesKonsalting (RBK)10 quotes from minutes taken at a meeting of the Russian Security Council. Having its own root servers would make Russia independent of monitors like the Internet Corporation for Assigned Names and Numbers (ICANN) and protect the country in the event of “outages or deliberate interference.” “Putin sees [the] Internet as [a] CIA tool.”11
Distributed Web (DWeb).
The DWeb is “a peer-to-peer Internet that is free from firewalls, government regulation, and spying.” Admittedly, building the DWeb is a difficult problem. However, both the University of Michigan and a private firm, MaidSafe, claim to be close to a solution.12 Brewster Kahle, founder of the Internet Archive and organizer of the first Decentralized Web Summit two years ago, recently advocated a “DWeb Camp.” Should a DWeb become a reality, many of the current efforts by governments to control or regulate the Internet would founder.
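The property that makes a DWeb hard to regulate can be sketched with content addressing, the technique used by peer-to-peer systems such as IPFS: data is named by the cryptographic hash of its bytes rather than by a server location, so any peer holding the bytes can serve them, and no single host can be compelled to take them down. A minimal illustrative sketch (the `ContentStore` class is invented for this example):

```python
import hashlib

class ContentStore:
    """Minimal content-addressed store: data is keyed by the SHA-256
    hash of its bytes, so the address is independent of any server."""

    def __init__(self):
        self._blocks = {}

    def put(self, data: bytes) -> str:
        address = hashlib.sha256(data).hexdigest()
        self._blocks[address] = data
        return address

    def get(self, address: str) -> bytes | None:
        data = self._blocks.get(address)
        # A reader can verify integrity without trusting the peer that
        # served the bytes -- the hash either matches or it doesn't.
        if data is not None and hashlib.sha256(data).hexdigest() != address:
            return None
        return data

peer_a, peer_b = ContentStore(), ContentStore()
addr = peer_a.put(b"uncensorable pamphlet")
# Any peer holding the same bytes answers for the same address:
assert peer_b.put(b"uncensorable pamphlet") == addr
assert peer_b.get(addr) == b"uncensorable pamphlet"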
Our operational response should involve Special Operations Forces (SOF), Space, and Cyber forces. The creation of splinternets places a premium on the ability to gain physical access to the splinternet’s internal networks. SOF is an ideal force to perform this operation because of their ability to work in politically sensitive and denied environments with or through indigenous populations. Once SOF gains physical access, Space would be the most logical means to send and receive data. Cyber forces would then perform operations within the splinternet.
Most versions of the future operational environment assume an Internet that is largely accessible to all. Therefore, splinternets are an important ‘alternative future’ to consider. In conjunction with Space and Cyber forces, SOF can play a key role in the operational response to allow the Joint Force to continue to operate against splinternet capable adversaries.
If you enjoyed this post, please see:
– Mr. Simkin’s previous Mad Scientist Laboratory posts:
… as well as his winning Call for Ideas presentation The Future ODA (Operational Detachment Alpha) 2035-2050, delivered at the Mad Scientist Bio Convergence and Soldier 2050 Conference, co-hosted with SRI International on 8–9 March 2018 at their Menlo Park campus in California.
Howard R. Simkin is a Senior Concept Developer in the DCS, G-9 Capability Development & Integration Directorate, U.S. Army Special Operations Command. He has over 40 years of combined military, law enforcement, defense contractor, and government experience. He is a retired Special Forces officer with a wide variety of special operations experience. He is also a proclaimed Mad Scientist.
Disclaimer: This is a USASOC G9 Gray Paper that has already been cleared for unlimited release. Distribution is unlimited. The views expressed in this blog post are those of the author, and do not necessarily reflect those of the Department of Defense, Department of the Army, U.S. Army Special Operations Command (USASOC), Army Futures Command (AFC), or Training and Doctrine Command (TRADOC).
On 8-9 August 2018, the U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Learning in 2050 Conference with Georgetown University’s Center for Security Studies in Washington, DC. Leading scientists, innovators, and scholars from academia, industry, and the government gathered to address future learning techniques and technologies that are critical in preparing for Army operations in the mid-21st century against adversaries in rapidly evolving battlespaces. The new and innovative learning capabilities addressed at this conference will enable our Soldiers and Leaders to act quickly and decisively in a changing Operational Environment (OE) with fleeting windows of opportunity and more advanced and lethal technologies.
We have identified the following “Top 10” takeaways related to Learning in 2050:
1. Many learning technologies built around commercial products are available today (Amazon Alexa, Smart Phones, Immersion tech, Avatar experts) for introduction into our training and educational institutions. Many of these technologies are part of the Army’s concept for a Synthetic Training Environment (STE), and there are nascent manifestations already. For these technologies to be widely available to the future Army, the Army of today must be prepared to address:
– The cultural challenges associated with changing the dynamic between learners and instructors, teachers, and coaches; and
– Adequate funding to produce capabilities at scale, so that digital tutors or other technologies (Augmented Reality [AR] / Virtual Reality [VR], etc.) and the skills required in a dynamic future, like critical thinking and groupthink mitigation, are widely available or perhaps ubiquitous.
2. Personalization and individualization of learning in the future will be paramount, and training that today takes place in physical schools will become the exception, with learning occurring at the point of need. This transformation will not be limited to lesson plans or even just learning styles:
– Project-oriented learning: when today’s high school students are building apps, they are asked, “What positive change do you want to have?” One example is an open table for Bully Free Tables. In the future, learners will learn through working on projects;
– Project-oriented learning will lead to a convergence of learning and operations, creating a chicken (learning) or the egg (mission/project) relationship; and
– Learning must be adapted to consciously address the desired, or extant, culture.
3. Some jobs and skill sets have not even been articulated yet. Hobbies and recreational activities engaged in by kids and enthusiasts today could become occupations or Military Occupational Specialties (MOS’s) of the future (e.g., drone creator/maintainer, 3-D printing specialist, digital and cyber fortification construction engineer — think Minecraft and Fortnite with real-world physical implications). Some emerging trends in personalized warfare, big data, and virtual nations could bring about the necessity for more specialists that don’t currently exist (e.g., data protection and/or data erasure specialists).
4. The New Human (who will be born in 2032 and is the recruit of 2050) will be fundamentally different from the Old Human. The Chief of Staff of the Army (CSA) in 2050 is currently a young Captain in our Army today. While we are arguably cyborgs today (with integrated electronics in our pockets and on our wrists), the New Humans will likely be cyborgs in the truest sense of the word, with some having embedded sensors. How will those New Humans learn? What will they need to learn? Why would they want to learn something? These are all critical questions the Army will continue to ask over the next several decades.
5. Learning is continuous and self-initiated, while education is a point in time and is “done to you” by someone else. Learning may result in a certificate or degree – similar to education – or can lead to the foundations of a skill or a deeper understanding of operations and activity. How will organizations quantify learning in the future? Will degrees or even certifications still be the benchmark for talent and capability?
6. Learning isn’t slowing down; it’s speeding up. More and more things are becoming instantaneous, and humans have no concept of extreme speed. Tesla cars can update their own software, with owners getting into an effectively different car each day. What happens to our Soldiers when military vehicles change much more iteratively? This may force a paradigm shift wherein learning means tightening local and global connections (tough to do considering government/military network securities, firewalls, vulnerabilities, and constraints); viewing technology as extended brains all networked together (similar to Dr. Alexander Kott’s look at the Internet of Battlefield Things [IoBT]); and leveraging these capabilities to enable Soldier learning at extremely high speeds.
7. While there are a number of emerging concepts and technologies to improve and accelerate learning (TNT, extended reality, personalized learning models, and intelligent tutors), the focus, training stimuli, data sets, and desired outcomes all have to be properly tuned and aligned, or the Learner could end up losing correct behavior habits (developing maladaptive plasticity), developing incorrect or skewed behaviors (per the desired capability), or internalizing unintended cognitive biases.
8. Geolocation may become increasingly less important when it comes to learning in the future. If Apple required users to go to Silicon Valley to get trained on an iPhone, they would be exponentially less successful. But this is how the Army currently trains. The ubiquity of connectivity, the growth of the Internet of Things (and eventually Internet of Everything), the introduction of universal interfaces (think one XBOX controller capable of controlling 10 different types of vehicles), major advances in modeling and simulations, and social media innovation all converge to minimize the importance of teachers, students, mentors, and learners being collocated at the same physical location.
9. Significant questions must be asked about the specificity of training children at a young age: we may be overemphasizing STEM from early on and not helping them learn across a wider spectrum. We need Transdisciplinarity in the coming generations.
10. 3-D reconstructions of bases, training areas, cities, and military objectives coupled with mixed reality, haptic sensing, and intuitive controls have the potential to dramatically change how Soldiers train and learn when it comes to not only single performance tasks (e.g., marksmanship, vehicle driving, reconnaissance, etc.) but also in dense urban operations, multi-unit maneuver, and command and control.
During the next two weeks, we will be posting the videos from each of the Learning in 2050 Conference presentations on the TRADOC G-2 Operational Environment (OE) Enterprise YouTube Channel and the associated slides on our Mad Scientist APAN site — stay connected here at the Mad Scientist Laboratory.
One of the main thrusts in the Mad Scientist lines of effort is harnessing and cultivating the Intellect of the Nation. In this vein, we are asking Learning in 2050 Conference participants (both in person and online) to share their ideas on the presentations and topic. Please consider:
– What topics were most important to you personally and professionally?
– What were your main takeaways from the event?
– What topics did you want the speakers to extrapolate more on?
– What were the implications for your given occupation/career field from the findings of the event?
Your input will be of critical importance to our analysis and products that will have significant impact on the future of the force in design, structuring, planning, and training! Please submit your input to Mad Scientist at: firstname.lastname@example.org.
Mad Scientist Laboratory is pleased to announce that Headquarters, U.S. Army Training and Doctrine Command (TRADOC) is co-sponsoring the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies this week (Wednesday and Thursday, 8-9 August 2018) in Washington, DC.
Future learning techniques and technologies are critical to the Army’s operations in the 21st century against adversaries in rapidly evolving battlespaces. The ability to effectively respond to a changing Operational Environment (OE) with fleeting windows of opportunity is paramount, and Leaders must act quickly to adjust to different OEs and more advanced and lethal technologies. Learning technologies must enable Soldiers to learn, think, and adapt using innovative synthetic environments to accelerate learning and attain expertise more quickly. Looking to 2050, learning enablers will become far more mobile and on-demand.
Looking at Learning in 2050, topics of interest include, but are not limited to: Virtual, Augmented, and Mixed Realities (VR/AR/MR); interactive, autonomous, accelerated, and augmented learning technologies; gamification; skills needed for Soldiers and Leaders in 2050; synthetic training environments; virtual mentors; and intelligent artificial tutors. Advanced learning capabilities present the opportunity for Soldiers and Leaders to prepare for operations and operate in multiple domains while improving current cognitive load limitations.
Plan to join us virtually at the conference as leading scientists, innovators, and scholars from academia, industry, and government gather to discuss:
1) How will emerging technologies improve learning or augment intelligence in professional military education, at home station, while deployed, and on the battlefield?
2) How can the Army accelerate learning to improve Soldier and unit agility in rapidly changing OEs?
3) What new skills will Soldiers and Leaders require to fight and win in 2050?
– Read our Learning in 2050 Call for Ideas finalists’ submissions here, graciously hosted by our colleagues at Small Wars Journal.
– Starting Tuesday, 7 August 2018, see the conference agenda’s list of presentations and the associated world-class speakers’ biographies here.
Join us at the conference on-line here via live-streaming audio and video, beginning at 0840 EDT on Wednesday, 08 Aug 2018; submit your questions to each of the presenters via the moderated interactive chat room; and tag your comments @TRADOC on Twitter with #Learningin2050.
On 19-20 June 2018, the U.S. Army Training and Doctrine Command (TRADOC) Mad Scientist Initiative co-hosted the Installations of the Future Conference with the Office of the Assistant Secretary of the Army for Installations, Energy and Environment (OASA (IE&E)) and Georgia Tech Research Institute (GTRI). Emerging technologies supporting the hyper-connectivity revolution will enable improved training capabilities, security, readiness support (e.g., holistic medical facilities and brain gyms), and quality of life programs at Army installations. Our concepts and emerging doctrine for multi-domain operations recognize this as increasingly important by including Army installations in the Strategic Support Area. Installations of the Future will serve as mission command platforms to project virtual power and expertise as well as Army formations directly to the battlefield.
We have identified the following “Top 10” takeaways related to our future installations:
1. Threats and Tensions. “Army Installations are no longer sanctuaries” — Mr. Richard G. Kidd IV, Deputy Assistant Secretary of the Army, Strategic Integration. There is a tension between openness and security that will need balancing to take advantage of smart technologies at our Army installations. The revolution in connected devices and the ability to virtually project power and expertise will increase the potential for adversaries to target our installations. Hyper-connectivity increases the attack surface for cyber-attacks and the access to publicly available information on our Soldiers and their families, making personalized warfare and the use of psychological attacks and deep fakes likely.
2. Exclusion vs. Inclusion. The role of and access to future Army installations depends on the balance between these two extremes. The connections between local communities and Army installations will increase potential threat vectors, but resilience might depend on expanding inclusion. Additionally, access to specialized expertise in robotics, autonomy, and information technologies will require increased connections with outside-the-gate academic institutions and industry.
3. Infrastructure Sensorization. Increased sensorization of infrastructure runs the risk of driving efficiencies to the point of building in unforeseen risks. In the business world, these efficiencies are profit-driven, with clearer risks and rewards. Tabletop exercises can explore hidden risks and help Garrison Commanders build resilient infrastructure and communities. Automation can cause cascading failures as people begin to fall “out of the loop.”
4. Army Modernization Challenge. Installations of the Future is a microcosm of overarching Army Modernization challenges. We are simultaneously invested in legacy infrastructure that we need to upgrade, and making decisions to build new smart facilities. Striking an effective and efficient balance will start with public-private partnerships to capture the expertise that exists in our universities and in industry. The expertise needed to succeed in this modernization effort does not exist in the Army. There are significant opportunities for Army Installations to participate in ongoing consortiums like the “Middle Georgia” Smart City Community and the Global Cities Challenge to pilot innovations in spaces such as energy resilience.
5. Technology is outpacing regulations and policy. The sensorization of, and available edge analytics in, our public spaces offer improved security but might be perceived as decreasing personal privacy. While we give up some personal privacy when we live and work on Army installations, this collection of data will require active engagement with our communities. We studied an ongoing Unmanned Aerial System (UAS) support concept to detect gunshot incidents in Louisville, KY, to determine the need to involve legislatures, local political leaders, communities, and multiple layers of law enforcement.
6. Synthetic Training Environment. The Installation of the Future offers the Army significant opportunities to divest itself of large brick and mortar training facilities and stove-piped, contractor support-intensive Training Aids, Devices, Simulations, and Simulators (TADSS). MG Maria Gervais, Deputy Commanding General, Combined Arms Center – Training (DCG, CAC-T), presented the Army’s Synthetic Training Environment (STE), incorporating Virtual Reality (VR), “big box” open-architecture simulations using a One World Terrain database, and reduced infrastructure and contractor-support footprints to improve Learning and Training. The STE, delivering high-fidelity simulations and the opportunity for our Soldiers and Leaders to exercise all Warfighting Functions across the full Operational Environment with greater repetitions at home station, will complement the Live Training Environment and enhance overall Army readiness.
7. Security Technologies. Many of the security-oriented technologies (autonomous drones, camera integration, facial recognition, edge analytics, and Artificial Intelligence) that triage and fuse information will also improve our deployed Intelligence, Surveillance, and Reconnaissance (ISR) capabilities. The Chinese lead the world in these technologies today.
8. Virtual Prototyping. The U.S. Army Engineer Research and Development Center (ERDC) is developing a computational testbed using virtual prototypingto determine the best investments for future Army installations. The four drivers in planning for Future Installations are: 1) Initial Maneuver Platform (Force Projection); 2) Resilient Installations working with their community partners; 3) Warfighter Readiness; and 4) Cost effectiveness in terms of efficiency and sustainability.
9. Standard Approach to Smart Installations. A common suite of tools is needed to integrate smart technologies onto installations. While Garrison Commanders need mission command to take advantage of the specific cultures of their installations and surrounding communities, the Army cannot afford to have installations going in different directions on modernization efforts. A method is needed to rapidly pilot prototypes and then determine whether and how to scale the technologies across Army installations.
10. “Low Hanging Fruit.” There are opportunities for Army Installations to lead their communities in tech integration. Partnerships in energy savings, waste management, and early 5G infrastructure provide the Army with early adopter opportunities for collaboration with local communities, states, and across the nation. We must educate contracting officers and Government consumers to look for and seize upon these opportunities.
Videos from each of the Installations of the Future Conference presentations are posted here. The associated slides will be posted here within the week on the Mad Scientist All Partners Access Network site.
If you enjoyed this post, check out the following:
[Editor’s Note: The Operational Environment (OE) is the start point for Army Readiness – now and in the Future. The OE answers the question, “What is the Army ready for?” Without the OE in training and Leader development, Soldiers and Leaders are “practicing” in a benign condition, without the requisite rigor to forge those things essential for winning in a complex, multi-domain battlefield. Building the Army’s future capabilities, a critical component of future readiness, requires this same start point. The assumptions the Army makes about the Future OE are the sine qua non start point for developing battlefield systems — these assumptions must be at the forefront of decision-making for all future investments.]
There are no facts about the future. Leaders interested in building future ready organizations must develop assumptions about possible futures and these assumptions require constant scrutiny. Leaders must also make decisions based on these assumptions to posture organizations to take advantage of opportunities and to mitigate risks. Making these decisions is fundamental to building future readiness.
1. Contested in all domains (air, land, sea, space, and cyber). Increased lethality, by virtue of ubiquitous sensors, proliferated precision, high kinetic energy weapons and advanced area munitions, further enabled by autonomy, robotics, and Artificial Intelligence (AI), with an increasing potential for overmatch. Adversaries will restrict us to temporary windows of advantage with periods of physical and electronic isolation.
2. Concealment is difficult on the future battlefield. Hiding from advanced sensors — where practicable — will require dramatic reduction of heat, electromagnetic, and optical signatures. Traditional hider techniques such as camouflage, deception, and concealment will have to extend to “cross-domain obscuration” in the cyber domain and the electromagnetic spectrum. Canny competitors will monitor their own emissions in real-time to understand and mitigate their vulnerabilities in the “battle of signatures.” Alternately, “hiding in the open” within complex terrain clutter and near-constant relocation might be feasible, provided such relocation could outpace future recon / strike targeting cycles. Adversaries will operate among populations in complex terrain, including dense urban areas.
3. Trans-regional, gray zone, and hybrid strategies with both regular and irregular forces, criminal elements, and terrorists attacking our weaknesses and mitigating our advantages. The ensuing spectrum of competition will range from peaceful, legal activities through violent, mass upheavals and civil wars to traditional state-on-state, unlimited warfare.
4. Adversaries include states, non-state actors, and super-empowered individuals, with non-state actors and super-empowered individuals now having access to Weapons of Mass Effect (WME), cyber, space, and Nuclear/Biological/Chemical (NBC) capabilities. Their operational reach will range from tactical to global, and the application of their impact from one domain into another will be routine. These advanced engagements will also be interactive across the multiple dimensions of conflict, not only across every domain in the physical dimension, but also the cognitive dimension of information operations, and even the moral dimension of belief and values.
5. Increased speed of human interaction, events, and action with democratized and rapidly proliferating capabilities means constant co-evolution between competitors. Recon / Strike effectiveness is a function of its sensors, shooters, their connections, and the targeting process driving decisions. Therefore, in a contest between peer competitors with comparable capabilities, advantage will fall to the one that is better integrated and makes better and faster decisions.
These assumptions become useful when they translate into potential decision criteria for Leaders to rely on when evaluating systems being developed for the future battlefield. Each of the following questions is fundamental to ensuring the Army is prepared to operate in the future.
1. How will this system operate when disconnected from a network? Units will be disconnected from their networks on future battlefields. Capabilities that require constant timing and precision geo-locational data will be prioritized for disruption by adversaries with capable EW systems.
2. What signature does this system present to an adversary? It is difficult to hide on the future battlefield and temporary windows of advantage will require formations to reduce their battlefield signatures. Capabilities that require constant multi-directional broadcast and units with large mission command centers will quickly be targeted and neutralized.
3. How does this system operate in dense urban areas? The physical terrain in dense urban areas and megacities creates concrete canyons isolating units electronically and physically. Automated capabilities operating in dense population areas might also increase the rate of false signatures, confusing, rather than improving, Commander decision-making. New capabilities must be able to operate disconnected in this terrain. Weapons systems must be able to slew and elevate rapidly to engage vertical targets. Automated systems and sensors will require significant training sets to reduce the rate of false signatures.
4. How does this system take advantage of open and modular architectures? The rapid rate of technological innovation will offer great opportunities to militaries capable of rapidly integrating prototypes into formations. Capabilities developed with open and modular architectures can be upgraded with autonomous and AI enablers as they mature. Early investment in closed-system capabilities will freeze Armies in place during a period of rapid co-evolution and invite overmatch by adversaries.
5. How does this capability help win in competition short of conflict with a near peer competitor? Near peer competitors will seek to achieve limited objectives short of direct conflict with the U.S. Army. Capabilities will need to be effective at operating in the gray zone as well as serving as deterrence. They will need to be capable of strategic employment from CONUS-based installations.
If you enjoyed this post, check out the following items of interest:
Join SciTech Futures’ community of experts, analysts, and creatives on 11-18 June 2018 as they discuss the logistical challenges of urban campaigns, both today and on into 2035. What disruptive technologies and doctrines will blue (and red) forces have available in 2035? Are unconventional forces the future of urban combat? Their next ideation exercise goes live 11 June 2018 — click here to learn more!
[Editor’s Note: Mad Scientist Laboratory is pleased to publish our latest iteration of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the previous month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]
There are no facts about the future and the future is not a linear extrapolation from the present. We inherently understand this about the future, but Leaders oftentimes seek to quantify the unquantifiable. Eliot Peper opens his Harvard Business Review article with a story about one of the biggest urban problems in New York City at the end of the 19th century – it stank! Horses were producing 45,000 tons of manure a month. The urban planners of 1898 convened a conference to address this issue, but the experts failed to find a solution. More importantly, they could not envision a future 14 years hence, when cars would outnumber horses. The urban problem of the future was not horse manure, but motor vehicle-generated pollution and road infrastructure. All quantifiable data available to the 1898 urban planners only extrapolated to more humans, horses, and manure. It is likely that any expert sharing an assumption about cars over horses would have been laughed out of the conference hall. Flash forward a century and the number one observation from the 9/11 Commission was that the Leaders and experts responsible for preventing such an attack lacked imagination. Storytelling and the science fiction genre allow Leaders to imagine beyond the numbers and broaden the assumptions needed to envision possible futures. Storytelling also helps Leaders and futurists to envision the human context around emerging technologies. For more on Science Fiction and futuring, watch Dr. David Brin‘s Mad Scientist presentation.
2. “Automated Valor,” by August Cole, Proceedings Magazine, U.S. Naval Institute, May 2018.
Fellow Mad Scientist August Cole’s short story, commissioned by the British Army Concepts Branch, explores the future of urban warfare from a refreshingly new, non-US perspective. Sparking debate about force development and military operations in the 2030s, this story portrays a vivid combat scenario in a world where autonomous weapons have proliferated. Mr. Cole’s story embraces a number of Future Operational Environment themes familiar to Mad Scientists, including combat leadership and team identity (Soldier and machine), human trust of AI decision-making, virtual and earned citizenship, deep fakes, small unit tactical operations, and multi-national Joint operations against an expansionist Chinese superpower. Visualizing the future fight from this British Commonwealth perspective provides a new twist in storytelling, describing what it will mean to be a Soldier on the battlefield in 2039, depending on machine teammates in the close fight.
3. Altered Carbon, Netflix series, 2018 (based upon a 2002 novel by Richard K. Morgan) — submitted by Mad Scientist Pat Filbert.
Set more than 300 years in the future, the show's main character, or more to the point, his "cortical stack" (alien technology, reverse-engineered for human use, that records the sum total of an individual's consciousness), has been "imprisoned" for 250 years and is "released" back into the general population to solve a mysterious murder. In this future, AI exists in and fully interacts with both the physical and cyber domains. The show incorporates a number of aspects related to trust in AI and technology. Such aspects enable a future where combat is fought by "stored soldiers" on distant worlds using advanced technological capabilities. Some humans have accepted AI projections as near-peers, so the trust factor comes up repeatedly between the humans who accept and embrace this technology and those who remain skeptical, like Will Smith's character in I, Robot. The implications of AI becoming sentient and capable of violence are at the core of the morality argument against AI technology. The popular acceptance of AI possessing human-like qualities would definitely be a "leap forward" in more than just technology. For additional insights on this topic, watch Mad Scientist Linda MacDonald Glenn's presentation.
4. "SOCOM's Top 10 Technologies" Podcast, National Defense Magazine, National Defense Industrial Association, 3 May 2018 — submitted by Marie Murphy.
This podcast summarizes some of the primary emerging technologies that the United States Special Operations Command (SOCOM) and the Department of Defense are developing for military application. Highlights include exoskeletons and commercial drone use in the immediate future, and quantum computing and China's rise to dominate the microelectronics market by 2030 in the deep future. Stew Magnuson, Editor-in-Chief of National Defense Magazine, states that technology is nearing the end of the applicability of Moore's Law. Consequently, private, profit-driven industry has become a major consideration in the development of new scientific and technical advancements and will certainly be responsible for future cutting-edge technologies. Given that many innovations the military uses or seeks to apply now stem from private sector innovation, what happens when Moore's Law expires and technology moves too quickly for military research and adaptation?
Researchers analyzed the decision-making habits of gamers who play League of Legends in order to identify and build mental models. Identifying these models will help researchers understand how they are built and, more importantly, how they change over time as players gain proficiency from novice to expert. The researchers analyzed survey responses based on the game and compared the differences between novices, journeymen, and experts. There were clear differences in the way the mental models were organized based on experience, with experts making abstract connections and even showing signs of subnetworks. The researchers plan to use this information for better game design and the development / tailoring of training programs. The Army could leverage the potential of these mental models with neural feedback to accelerate Soldier learning, breaking the tyranny of the 10,000-hour rule of expertise. That said, this information could also prove to be a weapon in the hands of an adversary. What happens to game theory if the adversary knows how your mind works, what your proclivities are, and which courses of action you are likely to favor? What happens if the adversary can identify, based on your actions, who in your unit is a novice and who is an expert, and targets them accordingly (i.e., focusing on defeating the experts first, while leaving the less experienced)? Accessing this information could provide an adversary with an advantage that may prove the difference between success and defeat. Learn more about cognitive enhancement in fellow Mad Scientist Dr. Amy Kruse's podcast, Human 2.0, hosted by our colleagues at the Modern War Institute.
Researchers at the University of California, Berkeley, have exploited mainstream commercial Artificial Intelligence (AI) assistants (e.g., Siri, Alexa, Google Assistant) in order to secretly send them commands. The researchers embedded secret messages, undetectable to the human ear, in an existing audio track. When the track was played, the AI could be told to do any number of things, from transferring money to adding an item to a shopping list or opening a malicious website. The adversarial applications of this are immense and abundant. A nefarious actor could surreptitiously activate a device, mute it, and then send and receive information stored on it, or even use it to unlock doors, start cars, or call other devices. As the Army becomes more reliant on AI and automation, its vulnerability to Personalized Warfare attacks via these axes will increase. Will the Army ever be able to use voice-activated devices that can be so easily compromised by an undetectable source?
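The key intuition behind such attacks is that a signal can carry information while remaining far too quiet for a listener to notice. The toy sketch below is not the Berkeley team's actual method (which crafts adversarial perturbations tuned against a specific speech-recognition model); it simply mixes a hypothetical "command-carrying" signal into an audible tone at roughly -60 dB, to show how numerically small such a modification can be:

```python
import math

# Toy illustration only: an adversarial audio perturbation stays
# imperceptible by keeping its amplitude far below the carrier signal,
# here roughly -60 dB relative to a 440 Hz tone. The 3 kHz "command"
# signal is a stand-in, not a real encoded voice command.
SAMPLE_RATE = 16_000
NUM_SAMPLES = 160  # 10 ms of audio is enough to illustrate the idea

def db_to_linear(db: float) -> float:
    """Convert a decibel level to a linear amplitude ratio."""
    return 10 ** (db / 20)

# Audible carrier: a plain 440 Hz sine tone at full amplitude.
carrier = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
           for n in range(NUM_SAMPLES)]

# Hypothetical perturbation, scaled down to -60 dB (0.001 of full scale).
scale = db_to_linear(-60)
perturbation = [scale * math.sin(2 * math.pi * 3_000 * n / SAMPLE_RATE)
                for n in range(NUM_SAMPLES)]

# The "poisoned" track is the carrier plus the tiny perturbation.
poisoned = [c + p for c, p in zip(carrier, perturbation)]

# Per-sample, the poisoned track differs from the original by at most
# 0.001 of full amplitude -- inaudible next to the tone itself.
max_diff = max(abs(a - b) for a, b in zip(carrier, poisoned))
print(f"max per-sample change: {max_diff:.6f}")
```

The unsettling part of the research is that a machine-learning recognizer can still decode a perturbation this small, even though signal-level measures say the track is essentially unchanged.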
At a recent workshop, the Mad Scientist community was informed of the constraints associated with neural embedded man-machine interfaces – namely, that conventional electrode materials degrade relatively quickly via corrosion brought on by the human brain's inflammatory immune response. This challenge may have been overcome by researchers at Carnegie Mellon University, funded by the Defense Advanced Research Projects Agency (DARPA), who have developed a "flexible, squishy silicon-based hydrogel that sticks to neural tissue, bringing non-invasive electrodes to the brain's surface." As a tissue analog, this hydrogel is less likely to trigger the brain's natural defensive response, thus potentially revolutionizing the integration of prosthetics and medical devices with patients' brains. As with most disruptive technologies, preliminary niche applications (in this case, medical) may jump first to the leading edge, then ripple throughout society. The advent of hydrogel-based electrodes has the potential to accelerate the current transhumanism movement and facilitate direct brain-machine interfaces, as envisioned in Mr. Howard Simkin's Sine Pari post. Projected forward, the possibility of an Internet of Everything and Everyone may prove to be a two-edged sword, facilitating both the direct upload of knowledge on demand and the direct hacking of individuals.
If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at: email@example.com — we may select it for inclusion in our next edition of “The Queue”!