74. Mad Scientist Learning in 2050 Conference

Mad Scientist Laboratory is pleased to announce that Headquarters, U.S. Army Training and Doctrine Command (TRADOC) is co-sponsoring the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies this week (Wednesday and Thursday, 8-9 August 2018) in Washington, DC.

Future learning techniques and technologies are critical to the Army’s operations in the 21st century against adversaries in rapidly evolving battlespaces. The ability to effectively respond to a changing Operational Environment (OE) with fleeting windows of opportunity is paramount, and Leaders must act quickly to adjust to different OEs and more advanced and lethal technologies. Learning technologies must enable Soldiers to learn, think, and adapt using innovative synthetic environments to accelerate learning and attain expertise more quickly. Looking to 2050, learning enablers will become far more mobile and on-demand.

Looking at Learning in 2050, topics of interest include, but are not limited to: Virtual, Augmented, and Mixed Realities (VR/AR/MR); interactive, autonomous, accelerated, and augmented learning technologies; gamification; skills needed for Soldiers and Leaders in 2050; synthetic training environments; virtual mentors; and intelligent artificial tutors. Advanced learning capabilities present the opportunity for Soldiers and Leaders to prepare for operations and operate in multiple domains while improving current cognitive load limitations.

Plan to join us virtually at the conference as leading scientists, innovators, and scholars from academia, industry, and government gather to discuss:

1) How will emerging technologies improve learning or augment intelligence in professional military education, at home station, while deployed, and on the battlefield?

2) How can the Army accelerate learning to improve Soldier and unit agility in rapidly changing OEs?

3) What new skills will Soldiers and Leaders require to fight and win in 2050?

Get ready…

– Read our Learning in 2050 Call for Ideas finalists’ submissions here, graciously hosted by our colleagues at Small Wars Journal.

– Review the following blog posts: First Salvo on “Learning in 2050” – Continuity and Change, and Keeping the Edge.

– Starting Tuesday, 7 August 2018, see the conference agenda’s list of presentations and the associated world-class speakers’ biographies here.

and Go!

Join us at the conference on-line here via live-streaming audio and video, beginning at 0840 EDT on Wednesday, 08 Aug 2018; submit your questions to each of the presenters via the moderated interactive chat room; and tag your comments @TRADOC on Twitter with #Learningin2050.

See you all there!

 

72. First Salvo on “Learning in 2050” – Continuity and Change

[Editor’s Note: The U.S. Army Training and Doctrine Command (TRADOC) G-2 is co-hosting the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies on 8-9 August 2018 in Washington, DC.  In advance of this conference, Mad Scientist Laboratory is pleased to present today’s post addressing what is necessary to truly transform Learning in 2050 by returning guest blogger Mr. Nick Marsella.  Read Mr. Marsella’s previous two posts addressing Futures Work at Part I and Part II]

Only a handful of years ago, a conference on the topic of learning in 2050 would spur discussions on needed changes in the way we formally educate and train people to live successful lives and be productive citizens.[I] Advocates in K-12 would probably argue for increasing investment in schools, better technology, and increased STEM education. Higher educators would raise many of the same concerns, pointing to the value of “the academy” and its universities as integral to the nation’s economic, security, and social well-being by preparing the nation’s future leaders, innovators, and scientists.

Yet, times have changed. “Learning in 2050” could easily address how education and training must meet the immediate learning needs of the individual and support “lifelong learning” in a rapidly changing and competitive world.[II] The conference could also address how new discoveries in learning and the cognitive sciences will inform the education and training fields, and potentially enhance individual abilities to learn and think.[III] “Learning in 2050” could also focus on how organizational learning will be even more important than it is today – spelling the difference between survival and bankruptcy or irrelevancy, or, for military forces, between victory and defeat. We must also address how to teach people to learn and to organize themselves for learning.[IV]

Lastly, a “Learning in 2050” conference could also focus on machine learning and how artificial intelligence will not only transform the workplace but also have a major impact on national security.[V] Aside from understanding the potential and limitations of this transformative technology, we must increasingly train and educate people on how to use it to their advantage and how to recognize its limits for effective “human-machine teaming.” We must also provide opportunities to use newly fielded technologies and for individuals to learn when and how to trust them.[VI]

All of these areas would provide rich discussions and perhaps new insights. But just as LTG (ret) H.R. McMaster warned us when thinking about the challenges of future warfare, we must first acknowledge the continuities in this broad topic of “Learning in 2050” and its implications for the U.S. Army.[VII] Until the Army is replaced by robots, or knowledge and skills can be uploaded directly into the brain as shown in “The Matrix,” learning will remain about humans – the Army’s Soldiers and its civilian workforce – and about the learning process itself [not discounting organizational or machine learning].

Source: U.S. Army https://www.army.mil/article/206197/army_researchers_looking_to_neurostimulation_to_enhance_accelerate_soldiers_abilities

While much may change in the way the individual will learn, we must recognize that the focus of “Learning in 2050” is on the learner, and that the systems, programs/schools, or technologies adopted in the future must support the learner. As Herbert Simon, one of the founders of cognitive science and a Nobel laureate, noted: “Learning results from what the student does and thinks and only from what the student does and thinks. The teacher can advance learning only by influencing what the student does to learn.”[VIII] To the Army’s credit, the U.S. Army Learning Concept for Training and Education 2020-2040 vision supports this approach by immersing “Soldiers and Army civilians in a progressive, continuous, learner-centric, competency-based learning environment,” but the danger is that we will be captured by technology, procedures, and debates about the utility of and need for “brick and mortar” schools.[IX]

Learning results from what the student does and thinks and only from what the student does and thinks.

Learning is a process that involves changing knowledge, beliefs, behaviors, and attitudes, and it depends entirely on the learner as he/she interprets and responds to the learning experience – in and out of the classroom.[X] Our ideas, concepts, or recommendations to improve the future of learning in 2050 must do at least one of the following: improve student learning outcomes, improve student learning efficiency by accelerating learning, or improve the student’s motivation and engagement to learn.

“Learning in 2050” must identify external environmental factors that will affect what the student may need to learn to respond to the future, and also recognize that the generation of 2050 will be different from today’s students in values, beliefs, attitudes, and acceptance of technology.[XI] Changes in the learning system must be ethical, affordable, and feasible. To support effective student learning, learning outcomes must be clearly defined – whether a student is participating in a yearlong professional education program or a five-day field training exercise – and must be understood by the learner.[XII]

We must think big. For example, Howard Gardner, Professor of Cognition and Education at Harvard’s Graduate School of Education, postulated that success in the 21st Century requires the development of the “disciplined mind, the synthesizing mind, the creative mind, the respectful mind, and the ethical mind.”[XIII]

Approaches, processes, and organization, along with the use of technology and other cognitive science tools, must focus on the learning process. Illustrated below is the typical officer career timeline with formal educational opportunities sprinkled throughout the years.[XIV] While some form of formal education in “brick and mortar” schools will continue, one wonders if we will turn this model on its head – with more upfront education; shorter focused professional education; more blended programs combining resident/non-resident instruction; and continual access to experts, courses, and knowledge selected by the individual for “on demand” learning. Today, we often use education as a reward for performance (i.e., resident PME); in the future, education must be a “right of the Profession,” equally provided to all (to include Army civilians) – necessary for performance as a member of the profession of arms.

Source: DA Pam 600-3, Commissioned Officer Professional Development and Career Management, December 2014, p.27

The role of the teacher will change. Instructors will become “learning coaches” who help the learner identify gaps and needs in meaningful and dynamic individual learning plans. Like the Army’s Master Fitness Trainer, who advises on and monitors a unit’s physical readiness, we must create “Master Learning Coaches” in our units – not simply training specialists who manage the schedule and records. One can imagine technology evolving to do some of this as the Alexas and Siris of today become the AI tutors and mentors of the future. We must also remember that any system or process for learning in 2050 must fit the needs of multiple communities: Active Army, Army National Guard, and Army Reserve forces, as well as Army civilians.

Just as the delivery of instruction will change, the assessment of learning will change as well. Simulations and gaming should aim to provide an “Ender’s Game” experience, where reality and simulation are indistinguishable. Training systems should enable individuals to practice repeatedly, and as Vince Lombardi noted – “Practice does not make perfect. Perfect practice makes perfect.” Experiential learning will reinforce classroom instruction, on-line instruction, or short intensive courses/seminars by linking “classroom seat time” and “field time” at the Combat Training Centers, Warfighter, or other exercises or experiences.

Tell me and I forget; teach me and I may remember; involve me and I learn. – Benjamin Franklin[XV]

Of course, much will have to change in terms of policies and the way we think about education, training, and learning. If one moves back in time the same number of years that we are looking to the future – it is the year 1984. How much has changed since then?

In some ways technology has transformed the learning process – e.g., typewriters to laptops; card catalogues to instant on-line access to the world’s literature from anywhere; and classes at brick and mortar schools to Massive Open Online Courses (MOOCs) and blended and on-line learning with Blackboard. Yet, as Mark Twain reportedly noted – “history doesn’t repeat itself, but it rhymes” – some things look the same as they did in 1984, with lectures and passive learning in large lecture halls, just as PowerPoint lectures are ongoing today for some passively undergoing PME.

If “Learning in 2050” is to be truly transformative – we must think differently. We must move beyond the industrial age approach of mass education with its caste systems and allocation of seats. To be successful in the future, we must recognize that our efforts must center on the learner, providing immediate access to knowledge so that learning occurs in time to be of value.

Nick Marsella is a retired Army Colonel and is currently a Department of the Army civilian serving as the Devil’s Advocate/Red Team for Training and Doctrine Command.

___________________________________________________________________

[I] While the terms “education” and “training” are often used interchangeably, I will use the oft-quoted rule – training is about skills needed to do a job or perform a task, while education is broader, instilling general competencies and the ability to deal with the unexpected.

[II] The noted futurist Alvin Toffler is often quoted noting: “The illiterate of the 21st Century are not those who cannot read and write but those who cannot learn, unlearn, and relearn.”

[III] Sheftick, G. (2018, May 18). Army researchers look to neurostimulation to enhance, accelerate Soldier’s abilities. Retrieved from: https://www.army.mil/article/206197/army_researchers_looking_to_neurostimulation_to_enhance_accelerate_soldiers_abilities

[IV] This will become increasingly important as the useful shelf life of knowledge is shortening. See Zao-Sanders, M. (2017). A 2×2 matrix to help you prioritize the skills to learn right now. Harvard Business Review. Retrieved from: https://hbr.org/2017/09/a-2×2-matrix-to-help-you-prioritize-the-skills-to-learn-right-now — so much to learn, so little time.

[V] Much has been written on AI and its implications. One of the most recent and interesting papers was recently released by the Center for New American Security in June 2018. See: Scharre, P. & Horowitz, M.C. (2018). Artificial Intelligence: What every policymaker needs to know. Retrieved from: https://www.cnas.org/publications/reports/artificial-intelligence-what-every-policymaker-needs-to-know
For those wanting further details and potential insights see: Executive Office of the President, National Science and Technology Council, Committee on Technology Report, Preparing for the Future of Artificial Intelligence, October 2016.

[VI] Based on my anecdotal experiences, complicated systems, such as those found in command and control, have been fielded to units without sufficient training. Even when fielded with training, unless in combat, proficiency using the systems quickly lapses. See: Mission Command Digital Master Gunner, May 17, 2016, retrieved from https://www.army.mil/standto/archive_2016-05-17. See Freedberg, S. Jr. Artificial Stupidity: Fumbling the Handoff from AI to Human Control. Breaking Defense. Retrieved from: https://breakingdefense.com/2017/06/artificial-stupidity-fumbling-the-handoff/

[VII] McMaster, H.R. (LTG) (2015). Continuity and Change: The Army Operating Concept and Clear Thinking about Future War. Military Review.

[VIII] Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C. & Norman, M. K. (2010). How learning works: 7 research-based principles for smart teaching. San Francisco, CA: Jossey-Bass, p. 1.

[IX] U.S. Army Training and Doctrine Command. TRADOC Pamphlet 525-8-2. The U.S. Army Learning Concept for Training and Education 2020-2040.

[X] Ambrose, et al., p.3.

[XI] For example, should machine language be learned as a foreign language in lieu of a traditional foreign language (e.g., Spanish) – given the development of automated machine language translators (AKA = the Universal Translator)?

[XII] The point here is that we must clearly understand what we want the learner to learn, adequately define it, and ensure the learner knows what the outcomes are. For example, we continually espouse that we want leaders to be critical thinkers, but I challenge the reader to find a definitive definition and the expected attributes of a critical thinker, given that ADRP 6-22, Army Leadership; FM 6-22, Army Leadership; and ADRP 5 and 6 each describe it differently. At a recent higher education conference of leaders, administrators, and selected faculty, one member succinctly put it this way to highlight the importance of students’ understanding of expected learning outcomes: “Teaching students without providing them with learning outcomes is like giving them a 500-piece puzzle without an image of what they’re assembling.”

[XIII] Gardner, H. (2008). Five Minds for the Future. Boston, MA: Harvard Business Press. For application of Gardner’s premise see Marsella, N.R. (2017). Reframing the Human Dimension: Gardner’s “Five Minds for the Future.” Journal of Military Learning. Retrieved from: https://www.armyupress.army.mil/Journals/Journal-of-Military-Learning/Journal-of-Military-Learning-Archives/April-2017-Edition/Reframing-the-Human-Dimension/

[XIV] Officer education may differ due to a variety of factors, but the normal progression for Professional Military Education includes: Basic Officer Leader Course (BOLC B, to include ROTC/USMA/OCS, which is BOLC A); Captains Career Course; Intermediate Level Education (ILE); and Senior Service College, as well as specialty training (e.g., language school), graduate school, and Joint schools. Extracted from the previous edition of DA Pam 600-3, Commissioned Officer Professional Development and Career Management, December 2014, p. 27, which is now obsolete; the graphic is provided as an example. For current policy, see DA PAM 600-3, dated 26 June 2017.

[XV] See https://blogs.darden.virginia.edu/brunerblog/

56. An Appropriate Level of Trust…

The Mad Scientist team participates in many thought exercises, tabletops, and wargames associated with how we will live, work, and fight in the future. A consistent theme in these events is the idea that a major barrier to the integration of robotic systems into Army formations is a lack of trust between humans and machines. This assumption rings true as we hear the media and opinion polls describe how society doesn’t trust some disruptive technologies, like driverless cars or the robots coming for our jobs.

In his recent book, Army of None, Paul Scharre describes an event that nearly led to a nuclear confrontation between the Soviet Union and the United States. On September 26, 1983, LTC Stanislav Petrov, a Soviet officer serving in a bunker outside Moscow, was alerted to a U.S. missile launch by a recently deployed space-based early warning system. The Soviet officer trusted his “gut” – or experientially informed intuition – that this was a false alarm. His gut was right, and the world was saved from an inadvertent nuclear exchange because this officer did not over trust the system. But is this the rule or an exception to how humans interact with technology?

The subject of trust between Soldiers, Soldiers and Leaders, and the Army and society is central to the idea of the Army as a profession. At the most tactical level, trust is seen as essential to combat readiness as Soldiers must trust each other in dangerous situations. Humans naturally learn to trust their peers and subordinates once they have worked with them for a period of time. You learn what someone’s strengths and weaknesses are, what they can handle, and under what conditions they will struggle. This human dynamic does not translate to human-machine interaction and the tendency to anthropomorphize machines could be a huge barrier.

We recommend that the Army explore the possibility that Soldiers and Leaders could over trust AI and robotic systems. Over trust of these systems could blunt human expertise, judgement, and intuition thought to be critical to winning in complex operational environments. Also, over trust might lead to additional adversarial vulnerabilities such as deception and spoofing.

In 2016, a research team at the Georgia Institute of Technology revealed the results of a study entitled “Overtrust of Robots in Emergency Evacuation Scenarios”. The research team put 42 test participants into a fire emergency with a robot responsible for escorting them to an emergency exit. As the robot passed obvious exits and got lost, 37 participants continued to follow the robot and an additional 2 stood with the robot and didn’t move towards either exit. The study’s takeaway was that roboticists must think about programs that will help humans establish an “appropriate level of trust” with robot teammates.

In Future Crimes, Marc Goodman writes of the idea of “In Screen We Trust” and the vulnerabilities this trust builds into our interaction with our automation. His example of the cyber-attack against the Iranian uranium enrichment centrifuges highlights the vulnerability of experts believing or trusting their screens against mounting evidence that something else might be contributing to the failure of centrifuges. These experts over trusted their technology or just did not have an “appropriate level of trust”. What does this have to do with Soldiers on the future battlefield? Well, increasingly we depend on our screens and, in the future, our heads-up displays to translate the world around us. This translation will only become more demanding on the future battlefield with war at machine speed.

So what should our assumptions be about trust and our robotic teammates on the future battlefield?

1) Soldiers and Leaders will react differently to technology integration.

2) Capability developers must account for trust building factors in physical design, natural language processing, and voice communication.

3) Intuition and judgement remain critical components of human-machine teaming and operating on the future battlefield. Speed becomes a major challenge as humans become the weak link.

4) Building an “appropriate level of trust” will need to be part of Leader Development and training. Mere expertise in a field does not prevent over trust when interacting with our robotic teammates.

5) Lastly, lack of trust is not a barrier to AI and robotic integration on the future battlefield. These capabilities will exist in our formations as well as those of our adversaries. The formation that develops the best concepts for effective human-machine teaming, with trust being a major component, will have the advantage.

Interested in learning more on this topic? Watch Dr. Kimberly Jackson Ryan (Draper Labs).

[Editor’s Note:  A special word of thanks goes out to fellow Mad Scientist Mr. Paul Scharre for sharing his ideas with the Mad Scientist team regarding this topic.]

55. Influence at Machine Speed: The Coming of AI-Powered Propaganda

[Editor’s Note: Mad Scientist Laboratory is pleased to present the following guest blog post by MAJ Chris Telley, U.S. Army, assigned to the Naval Postgraduate School, addressing how Artificial Intelligence (AI) must be understood as an Information Operations (IO) tool if U.S. defense professionals are to develop effective countermeasures and ensure our resilience to its employment by potential adversaries.]

AI-enabled IO present a more pressing strategic threat than the physical hazards of slaughter-bots or even algorithmically-escalated nuclear war. IO are efforts to “influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries;” here, we’re talking about using AI to do so. AI-guided IO tools can empathize with an audience to say anything, in any way needed, to change the perceptions that drive those physical weapons. Future IO systems will be able to individually monitor and affect tens of thousands of people at once. Defense professionals must understand the fundamental influence potential of these technologies if they are to drive security institutions to counter malign AI use in the information environment.


Programmatic marketing, which uses consumers’ data habits to drive real-time automated bidding on personalized advertising, has been in use for a few years now. Cambridge Analytica’s Facebook targeting made international headlines using similar techniques, but digital electioneering is just the tip of the iceberg. An AI trained with data from users’ social media accounts, economic media interactions (Uber, Apple Pay, etc.), and their devices’ positional data can infer predictive knowledge of its targets. With that knowledge, emerging tools — like Replika — can truly befriend a person, allowing them to train that individual, for good or ill.
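
To make the inference step above concrete, here is a deliberately minimal sketch of how behavioral signals might be combined into an “affinity” score for a given message theme. The feature names, weights, and bias are invented for illustration; they do not describe how any actual marketing or influence tool works.

```python
import math

# Toy illustration only: hypothetical behavioral features and hand-picked
# weights. A real audience-inference system would learn its weights from
# large labeled datasets rather than use values like these.
def receptiveness_score(features, weights, bias=-1.0):
    """Return a logistic score in [0, 1] estimating affinity for a message theme."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

weights = {
    "shares_on_topic": 1.8,            # past shares of related content
    "minutes_on_topic_per_day": 0.05,  # time spent on related content
    "network_affinity": 1.2,           # fraction of contacts engaging with the topic
}

user = {"shares_on_topic": 3, "minutes_on_topic_per_day": 25, "network_affinity": 0.4}
print(f"estimated receptiveness: {receptiveness_score(user, weights):.2f}")
```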


Substantive feedback is required to train an individual’s response; humans tend to respond best to content and feedback with which they agree. That content can be algorithmically mass-produced. For years, Narrative Science tools have helped writers create sports stories and stock summaries, but it is just as easy to use them to create disinformation. That’s just text, though; today, AI can create fake video. A recent warning, ostensibly from former President Obama, provides an entertaining yet frightening demonstration of how Deepfakes will challenge our presumptions about truth in the coming years. The Defense Advanced Research Projects Agency (DARPA) is funding a project this summer to determine whether AI-generated Deepfakes will become impossible to distinguish from the real thing, even using other AI systems.
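
The mass-production point is easy to illustrate. The toy sketch below fills invented templates with structured data, in the spirit of the automated sports recaps and stock summaries mentioned above; it is not how Narrative Science or any other product actually works, just a minimal demonstration that fluent-sounding text can be generated at scale from a data feed.

```python
import random

# Invented templates and game data; real natural-language-generation systems
# use far richer models, but the mass-production principle is the same.
TEMPLATES = [
    "{winner} beat {loser} {w_score}-{l_score} behind a strong fourth quarter.",
    "{loser} fell to {winner}, {w_score}-{l_score}, dropping their third straight.",
    "{winner} held off {loser} for a {w_score}-{l_score} win on the road.",
]

def generate_recap(game):
    """Fill a randomly chosen template with structured game data."""
    return random.choice(TEMPLATES).format(**game)

games = [
    {"winner": "Riverton", "loser": "Lakeside", "w_score": 88, "l_score": 81},
    {"winner": "Hilltop", "loser": "Bayview", "w_score": 64, "l_score": 59},
]

for g in games:
    print(generate_recap(g))
```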

Even though malign actors can now employ AI to lie “at machine speed,” they still have to get the story to an audience. Russian bot armies continue to make headlines doing this very thing. The New York Times maintains about a dozen Twitter feeds and produces around 300 tweets a day, but Russia’s Internet Research Agency (IRA) regularly puts out 25,000 tweets in the same twenty-four hours. The IRA’s bots are really just low-tech curators; they collect, interpret, and display desired information to promote the Kremlin’s narratives.


Next-generation bot armies will employ far faster computing techniques and profit from an order of magnitude greater network speed when 5G services are fielded. If “Repetition is a key tenet of IO execution,” then this machine gun-like ability to fire information at an audience will, with empathetic precision and custom content, provide the means to change a decisive audience’s very reality. No breakthrough science is needed, no bureaucratic project office required. These pieces are already there, waiting for an adversary to put them together.

The DoD is looking at AI but remains focused on image classification and swarming quadcopters while ignoring the convergent possibilities of predictive audience understanding, tailored content production, and massive scale dissemination. What little digital IO we’ve done, sometimes called social media “WebOps,” has been contractor heavy and prone to naïve missteps. However, groups like USSOCOM’s SOFWERX and the students at the Naval Postgraduate School are advancing the state of our art. At NPS, future senior leaders are working on AI, now. A half-dozen of the school’s departments have stood up classes and events specifically aimed at operationalizing advanced computing. The young defense professionals currently working on AI should grapple with emerging influence tools and form the foundation of the DoD’s future institutional capabilities.

MAJ Chris Telley is an Army information operations officer assigned to the Naval Postgraduate School. His assignments have included theater engagement at U.S. Army Japan and advanced technology integration with the U.S. Air Force. Chris commanded in Afghanistan and served in Iraq as a United States Marine. He tweets at @chris_telley.

This blog post represents the opinions of the author and does not reflect the position of the Army or the United States Government.

52. Potential Game Changers

The Mad Scientist Initiative brings together cutting-edge leaders and thinkers from the technology industry, research laboratories, academia, and across the military and Government to explore the impact of potentially disruptive technologies. Much like Johannes Gutenberg’s movable type, these transformational game changers have the potential to impact how we live, create, think, and prosper. Understanding their individual and convergent impacts is essential to continued battlefield dominance in the Future Operational Environment. In accordance with The Operational Environment and the Changing Character of Future Warfare, we have divided this continuum into two distinct timeframes:

The Era of Accelerated Human Progress (Now through 2035):
The period where our adversaries can take advantage of new technologies, new doctrine, and revised strategic concepts to effectively challenge U.S. military forces across multiple domains. Game changers during this era include:

• Robotics: Forty-plus countries are developing military robots with some level of autonomy. Impact on society, employment.
Vulnerable: To Cyber/Electromagnetic (EM) disruption, battery life, ethics without man in the loop.
Formats: Unmanned/Autonomous; ground/air vehicles/subsurface/sea systems. Nano-weapons.
Examples: (Air) Hunter/killer Unmanned Aerial Vehicle (UAV) swarms; (Ground) Russian Uran: Recon, ATGMs, SAMs.

• Artificial Intelligence: Human-Agent Teaming, where humans and intelligent systems work together to achieve either a physical or mental task. The human and the intelligent system will trade-off cognitive and physical loads in a collaborative fashion.

• Swarms/Semi Autonomous: Massed, coordinated, fast, collaborative, small, stand-off. Overwhelm target systems. Mass or disaggregate.



• Internet of Things (IoT): Trillions of internet-linked items create opportunities and vulnerabilities. Explosive growth in low Size, Weight, and Power (SWaP) connected devices (Internet of Battlefield Things), especially for sensor applications (situational awareness). Greater than 100 devices per human. Significant end-device processing (sensor analytics, sensor to shooter, supply chain management); a minimal edge-processing sketch follows this entry.
Vulnerable: To Cyber/EM/Power disruption. Privacy concerns regarding location and tracking.
Sensor to shooter: Accelerate kill chain, data processing, and decision-making.
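
As a loose illustration of that end-device processing, the hypothetical sketch below shows a sensor node filtering raw readings locally and forwarding only salient events, which is one way the Internet of Battlefield Things could reduce bandwidth and accelerate sensor-to-shooter timelines. The threshold and data format are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float  # e.g., normalized acoustic or seismic intensity

# Invented threshold for illustration; a real deployment would tune or learn it.
ALERT_THRESHOLD = 0.8

def process_at_edge(readings):
    """Keep bulk processing local; forward only readings worth an operator's attention."""
    return [r for r in readings if r.value >= ALERT_THRESHOLD]

raw = [Reading("node-17", v) for v in (0.12, 0.95, 0.40, 0.88)]
for event in process_at_edge(raw):
    print(f"forwarding alert from {event.sensor_id}: intensity {event.value}")
```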

• Space: Over 50 nations operate in space, which is increasingly congested and difficult to monitor, endangering Positioning, Navigation, and Timing (PNT).

GPS Jamming/Spoofing: Increasingly sophisticated, used successfully in Ukraine.
Anti-Satellite: China has tested two direct-ascent anti-satellite missiles.

The Era of Contested Equality (2035 through 2050):
The period marked by significant breakthroughs in technology and convergences in terms of capabilities, which lead to significant changes in the character of warfare. During this period, traditional aspects of warfare undergo dramatic, almost revolutionary changes which at the end of this timeframe may even challenge the very nature of warfare itself. Game changers during this era include:

• Hyper Velocity Weapons:
Rail Guns (Electrodynamic Kinetic Energy Weapons): Electromagnetic projectile launchers. High velocity/energy (Mach 5 or higher). Not powered by explosives; a rough kinetic-energy example follows this entry.
No Propellant: Easier to store and handle.
Lower Cost Projectiles: Potentially. Extreme G-force requires sturdy payloads.
Limiting factors: Power. Significant IR signature. Materials science.
Hyper Glide Vehicles: Less susceptible to anti-ballistic missile countermeasures.
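
A rough, back-of-the-envelope calculation shows why a rail gun needs no explosive fill to be destructive. The projectile mass and Mach number below are illustrative assumptions only, not the specifications of any program:

```python
# Back-of-the-envelope kinetic energy of a hypothetical rail gun round.
# Mass and Mach number are illustrative assumptions, not program specifications.
SPEED_OF_SOUND_MPS = 343.0     # sea-level speed of sound, m/s
projectile_mass_kg = 10.0      # hypothetical solid projectile
mach = 7.0                     # "Mach 5 or higher" per the entry above

velocity_mps = mach * SPEED_OF_SOUND_MPS               # ~2,400 m/s
kinetic_energy_j = 0.5 * projectile_mass_kg * velocity_mps ** 2

# Compare with TNT equivalent (1 kg TNT ~ 4.184 MJ) to show why no explosive
# fill is needed for destructive effect.
print(f"velocity: {velocity_mps:,.0f} m/s")
print(f"kinetic energy: {kinetic_energy_j / 1e6:.1f} MJ "
      f"(~{kinetic_energy_j / 4.184e6:.1f} kg of TNT equivalent)")
```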

• Directed Energy Weapons: Signature not visible without technology, must dwell on target. Power requirements currently problematic.
Potential: Tunable, lethal, and non-lethal.
Laser: Directed energy damages intended target. Targets: Counter Aircraft, UAS, Missiles, Projectiles, Sensors, Swarms.
Radio Frequency (RF): Attack targets across the frequency spectrum. Targets: not just RF systems; microwave weapons can “cook” targets, including people and electronics.

• Synthetic Biology: Engineering / modification of biological entities
Increased Crop Yield: Potential to reduce food scarcity.
Weaponization: Potential for micro-targeting; seek-and-destroy microbes that can target DNA. Potentially accessible to super-empowered individuals.
Medical Advances: Enhance soldier survivability.
Genetic Modification: Disease resistant, potentially designer babies and super athletes/soldiers. Synthetic DNA stores digital data. Data can be used for micro-targeting.
CRISPR: Genome editing.

• Information Environment: Use IoT and sensors to harness the flow of information for situational understanding and decision-making advantage.




In envisioning Future Operational Environment possibilities, the Mad Scientist Initiative employs a number of techniques. We have found Crowdsourcing (i.e., the gathering of ideas, thoughts, and concepts from a wide variety of interested individuals) to be a particularly effective technique, as it helps us diversify our thinking and challenge conventional assumptions. To that end, we have published our latest 2-page compendium of Potential Game Changers here — we would like to hear your feedback regarding them. Please let us know your thoughts / observations by posting them in this blog post’s Comment box (found below, in the Leave a Reply section). Alternatively, you can also submit them to us via email at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil. Thank you in advance for your contributions!

43. The Changing Character of Warfare: Takeaways for the Future

The Future Operational Environment (OE), as described in The Operational Environment and the Changing Character of Future Warfare, brings with it an inexorable series of movements that leads us to consider the following critical question:

What do these issues mean for the nature and character of warfare?

The nature of war, which has remained relatively constant from Thucydides, through Clausewitz, through the Cold War, and on into the present, will certainly remain constant through the Era of Accelerated Human Progress (i.e., now through 2035). War is still waged because of fear, honor, and interest, and remains an expression of politics by other means. However, as we move into the Era of Contested Equality (i.e., 2035-2050), the character of warfare will change in several key areas:

The Moral and Cognitive Dimensions are Ascendant.

The proliferation of high technology, coupled with the speed of human interaction and pervasive connectivity, means that no one nation will have an absolute strategic advantage in capabilities. When breakthroughs occur, the advantages they confer will be fleeting, as rivals quickly adapt. Under such conditions, the physical dimension of warfare may become less important than the cognitive and the moral. As a result, some powers will place fewer self-imposed restrictions on the use of military force, and hybrid strategies – involving information operations; direct cyber-attacks against individuals, segments of populations, or national infrastructure; terrorism; the use of proxies; and Weapons of Mass Destruction (WMD) – will aim to prevail against an enemy’s will.

Integration across the Diplomatic, Information, Military, and Economic (DIME) Elements.

Clausewitz’s timeless dictum that war is policy by other means takes on new importance as the distance between war and policy recedes; policy must also take into account other elements of national power to form true whole-of-government and, when possible, collective security approaches to national security issues. The interrelationship across the DIME will require closer integration across all elements of government, and Joint decision-making bodies will need to quickly and effectively deliver DIME effects across the physical, cognitive, and moral dimensions. Military operations are an essential element of this equation, but may not necessarily be the decisive means of achieving an end state.

Limitations of Military Force.

While mid-Century militaries will have more capability than at any time in history, their ability to wage high-intensity conflict will become more limited. Force-on-force conflict will be so destructive, will be waged at the new speed of human and AI-enhanced interaction, and will occur at such extended ranges that exquisitely trained and equipped forces facing a peer or near-peer rival will rapidly suffer significant losses in manpower and equipment that will be difficult to replace. Robotics, unmanned vehicles, and man-machine teaming activities offer partial solutions, but warfare will still revolve around increasingly vulnerable human beings. Military forces will need to consider how advances in AI, bio-engineering, man-machine interface, neuro-implanted knowledge, and other areas of enhanced human performance and learning can quickly help reduce the long lead time in training and developing personnel.

The Primacy of Information.

In the timeless struggle between offense and defense, information will become the most important and most useful tool at all levels of warfare. Actors will increasingly be able to use information to target an enemy’s will without necessarily having to address its means. In the past, nations have tried to target an enemy’s will through kinetic attacks on its means – the enemy military – or through the direct targeting of the will by attacking the national infrastructure or a national populace itself. Sophisticated, nuanced information operations – taking advantage of an ability to directly target an affected audience through cyber operations or other forms of influence operations, and reinforced by a credible, capable armed force – can bend an adversary’s will before battle is joined.

Expansion of the Battle Area.

Nations, non-state actors, and even individuals will be able to target military forces and civilian infrastructure at increasing – often intercontinental – ranges using a host of conventional and unconventional means. A force deploying to a combat zone will be vulnerable from the individual soldier’s personal residence, to his or her installation, and during his or her entire deployment. Adversaries also will have the ability to target or hold at risk non-military infrastructure and even populations with increasingly sophisticated, nuanced, and destructive capabilities, including WMD, hypersonic conventional weapons, and, perhaps most critically, cyber weapons and information warfare. WMD will not be the only threat capable of directly targeting and even destroying a society, as cyber and information attacks can directly target infrastructure, banking, food supplies, power, and general ways of life. Limited wars focusing on a limited area of operations waged between peers or near-peer adversaries will become more dangerous as adversaries will have an unprecedented capability to broaden their attacks to their enemy’s homeland. The U.S. Homeland likely will not avoid the effects of warfare and will be vulnerable in at least eight areas.

Ethics of Warfare Shift.
Traditional norms of warfare, definitions of combatants and non-combatants, and even what constitutes military action or national casus belli will be turned upside down and remain in flux at all levels of warfare.


– Does cyber activity, or information operations aimed at influencing national policy, rise to the level of warfare?

– Is using cyber capabilities to target a national infrastructure legal, if it has broad societal impacts?

– Can one target an electric grid that supports a civilian hospital, but also powers a military base a continent away from the battle zone from which unmanned systems are controlled?

– What is the threshold for WMD use?

– Is the use of autonomous robots against human soldiers legal?

These and other questions will arise, and likely will be answered differently by individual actors.

The changes in the character of war by mid-Century will be pronounced, and are directly related and traceable to our present. The natural progression of the changes in the character of war may be a change in the nature of war, perhaps towards the end of the Era of Contested Equality or in the second half of the Twenty-First Century.

For additional information, watch the TRADOC G-2 Operational Environment Enterprise’s The Changing Character of Future Warfare video.

42. China’s Drive for Innovation Dominance

“While the U.S. military may not necessarily have to fight Russia or China, it is likely that U.S. forces through 2050 will encounter their advanced equipment, concepts, doctrine, and tactics in flashpoints or trouble spots around the globe.” — extracted from The Operational Environment and the Changing Character of Future Warfare

The Future Operational Environment’s Era of Contested Equality (i.e., 2035 through 2050) will be marked by significant breakthroughs in technology and convergences, resulting in revolutionary changes that challenge the very nature of warfare itself. No one actor is likely to have any long-term strategic or technological advantage during this period of enduring competition. Prevailing in this environment will depend on an ability to synchronize multi-domain capabilities against an artificial intelligence-enhanced adversary with an overarching capability to visualize and understand the battlespace at even greater ranges and velocities.

Ms. Elsa Kania, Adjunct Fellow, Technology and National Security Program, Center for a New American Security (CNAS), presented “People’s Liberation Army (PLA) Human-Machine Integration” at last month’s Bio Convergence and Soldier 2050 Conference. In this presentation, Ms. Kania addressed China’s on-going initiatives that seek to change military power paradigms via competition and innovation in a number of key technologies. This post summarizes Ms. Kania’s presentation.

Under President Xi Jinping‘s leadership, China is becoming a major engine of global innovation, second only to the United States. China’s national strategy of “innovation-driven development” places innovation at the forefront of economic and military development. These efforts are beginning to pay off, as Beijing is becoming as innovative as Silicon Valley. China continues to strengthen its military through a series of ambitious Science and Technology (S&T) plans and investments, focusing on disruptive and radical innovations that will enable it to seize the high ground with decisive technologies (e.g., AI, hypervelocity, and biotechnology).

President Xi leads China’s Central Military-Civil Fusion Development Commission, whose priorities include intelligent unmanned systems, biology and cross-disciplinary technologies, and quantum S&T. Through the implementation of a “whole of nation” strategy, President Xi is leveraging private sector advances for military applications. This strategy includes the establishment of Joint Research Institutes to promote collaborative R&D; new national labs focused on achieving dual-use advances; and collaboration within national military-civil fusion innovation demonstration zones. Major projects concentrate on quantum communications and computing, brain science, and brain-inspired research.

By 2030, China aims to be the world’s premier Artificial Intelligence (AI) innovation center. Building upon lessons drawn from AlphaGo’s successes, the PLA is seeking to establish a “Battlefield Singularity,” leveraging AI’s potential in planning, operational command and control, decision support tools, wargaming, and brain-computer interfaces controlling unmanned systems. It will deepen military-civil fusion AI initiatives with Baidu, Alibaba Group, Tencent, and iFLYTEK. AI is seen by the Chinese as a potential game-changer and a way to offset perceived military shortcomings.

This focused initiative on innovation may result in China’s First Offset, characterized by integrating quantum satellites with fiber optic communication networks; human-machine interfaces; drone swarms able to target carrier task forces; naval rail guns; and quantum computing.

Potential areas for biotechnology and AI convergences include:

“Intelligentized” Command Decision-Making: The Joint Staff Department of the Central Military Commission (CMC) has called for the PLA to leverage the “tremendous potential” of AI in planning, operational command, and decision support. Ongoing research is focusing on command automation and “intelligentization,” with experimental demonstrations of an “external brain” for commanders and decision support systems for fighter pilots and submarines.

Brain-Computer Interfaces: Active research programs in brain-computer interfaces are underway (e.g., at PLA Information Engineering University, Tsinghua University), enabling “brain control” of robotic and “unmanned” systems and potentially facilitating brain networking.


Military Exoskeletons: Several prototype exoskeletons have been tested and demonstrated to date, augmenting soldiers’ physical capabilities, with the latest generations being more capable and closer to being fielded by the PLA.

CRISPR in China: Gene editing is currently underway with animals and human embryos due to less stringent regulatory requirements in the PRC. BGI (a would-be “bio-Google”) is currently soliciting DNA from Chinese geniuses in an attempt to understand the genomic basis for intelligence.

Chinese Superintelligence: The Chinese aspire to develop “brain-like” or human-level AI. Their new National Engineering Laboratory for Brain-Inspired Intelligence Technologies and Applications, with Baidu involvement, is focusing on learning from the human brain to tackle AI, advancing next-generation AI technologies.

While technological advantage has been a key pillar of U.S. military power and national competitiveness, China is rapidly catching up. Future primacy in AI and biotech, likely integral to future warfare, could remain contested between the U.S. and China. The PLA will continue to explore and invest in these key emerging technologies in its on-going drive for innovation dominance.

For more information regarding the PLA’s on-going innovation efforts:

Watch Ms. Kania’s video presentation and read the associated slides from the Bio Convergence and Soldier 2050 Conference.

Listen to Ms. Kania’s China’s Quest for Enhanced Military Technology podcast, hosted by our colleagues at Modern War Institute.

Read Ms. Kania’s “Battlefield Singularity: Artificial Intelligence, Military Revolution, and China’s Future Military Power,” which can be downloaded here.

Check out Ms. Kania’s Battlefield Singularity website.

33. Can TV and Movies Predict the Battlefield of the Future?

(Editor’s Note: Mad Scientist Laboratory is pleased to present Dr. Peter Emanuel’s guest blog post, illustrating how popular culture often presages actual technological advancements and scientific breakthroughs.)

Did Dick Tracy’s wrist watch telephone or Star Trek’s communicator inspire future generations of scientists and engineers to build today’s smartphone? Or were they simply depicting the inevitable manifestation of future technology? If we look back on old issues of Superman comic books that depict a 3D printer half a century before it was invented, we can see popular media has foreshadowed future technology, time and time again. Clearly, there are many phenomena, from time travel to force fields, that have not, and may not ever see the light of day; however, there are enough examples to suggest that dedicated and forward thinking scientists, trying to defend the United States, should consider this question:

Can comic books, video games, television, and movies give us a glimpse into the battlefield of the future?

For today’s Mad Scientist blog, consider what the future may hold for defense against weapons of mass destruction.

Let’s get the 800 lb. gorilla out of the room first! Or, perhaps, the 800 lb. dinosaur, by talking about biological warfare in the future. The movie Jurassic Park depicts the hubris of man trying to control life by “containing” its DNA. Our deeper understanding of DNA shows us that life is programmed to be redundant and error-prone; that is actually a fundamental feature that drives evolution. In the year 2050, if we are to control our genetically modified products, we must master containment and control of a system designed since the dawn of time to NOT be contained. Forget bio-terror… What about bio-Error?! Furthermore, the lesson in Jurassic Park from the theft of the frozen dinosaur eggs shows us the asymmetric impact that theft of genetic products can yield. Today, our adversaries amass databases on our genetic histories through theft and globalization, and one only has to ask, “What do they know that we should be worried about?”

Let’s move from biology to chemistry. A chemist will argue that biology is just chemistry, and at some level it’s true. Like the movie Outlander and anime like Cowboy Bebop, today’s Middle East battlefield shows the use of CAPTAGON, an addictive narcotic blend used to motivate and subjugate radical Islamists. In 2050, our mastery of tailored chemistry will likely lead to more addictive or targeted drug use that could elicit unpredictable or illogical behaviors. Controlled delivery of mood/behavior altering drugs will frustrate efforts to have a military workforce managed by reliability programs and will require layered and redundant controls even on trusted populations. Such vulnerabilities will likely be a justification for placing weapons and infrastructure under some level of artificial intelligence in the year 2050. Imagine this is the part of the blog where we talked about the Terminator and CyberDyne Systems.

Today, the thought of man-machine interfaces depicted by the Borg from Star Trek and TV shows such as Aeon Flux and Ghost in the Shell may make our skin crawl. In 2050, societal norms will likely evolve to embrace these, driven by the competitive advantage that implants and augmentation afford. Cyborgs and genetic chimeras will blur the line between what is man and what is machine; this will usher in an era when a computer virus can kill, and it will further complicate our ability to identify friend from foe in a way best depicted by the recent Battlestar Galactica TV show. Will the point-of-need manufacturing systems of the future be soulless biological factories like those depicted in Frank Herbert’s book series, “Dune”? As we prepare to engage in a multi-domain battlespace by extending our eyes and ears over the horizon with swarming autonomous drones, are we opening a window into the heart and mind of our future fighting force?

Some final thoughts for the year 2050, when we maintain a persistent presence off planet Earth. As Robert Heinlein predicted, and recent NASA experiments indicate, our DNA changes during prolonged exposure to altered gravity. What of humans who have never set foot on Earth’s surface, as shown in the recent movie The Fate of our Stars? Eventually, non-terrestrially based populations will diverge from the gene pool, perhaps kindling a debate on what is truly human. Will orbiting satellites with hyperkinetic weapons, such as those pictured in GI Joe Retaliation, add another dimension to the cadre of weapons of mass destruction? I would argue that popular media can help spur these discussions and give future mad scientists a glimpse into the realm of the possible. To that end, I think we can justify a little binge watching in the name of national security!

If you enjoyed this post, please check out the following:

– Headquarters, U.S. Army Training and Doctrine Command (TRADOC) is co-sponsoring the Bio Convergence and Soldier 2050 Conference with SRI International at Menlo Park, California, on 08-09 March 2018. Click here to learn more about the conference, the associated on-line game, and then watch the live-streamed proceedings, starting at 0840 PST / 1140 EST on 08 March 2018.

– Our friends at Small Wars Journal are continuing to publish the finalists from our most recent Call for Ideas — click here to check them out!


Dr. Peter Emanuel is the Army’s Senior Research Scientist (ST) for Bioengineering. In this role, he advises Army Leadership on harnessing the opportunities that synthetic biology and biotechnology can bring to National Security.

22. Speed, Scope, and Convergence Trends

“Speed is the essence of war. Take advantage of the enemy’s unpreparedness; travel by unexpected routes and strike him where he has taken no precautions.” — Sun Tzu

This timeless observation from The Art of War resonates through the millennia and is of particular significance to the Future Operational Environment.

Mad Scientist Laboratory has addressed the impact of Autonomy, Artificial Intelligence (AI), and Robotic Trends in previous posts. Consequential in their own right, particularly in the hands of our adversaries, the impact of these technology trends is exacerbated by their collective speed, scope, and convergence, leading ultimately to man-machine co-evolution.

Speed. Some Mad Scientists posit that the rate of progress in these technologies will be “faster than Moore’s law.” As our adversaries close the technology gap and potentially overtake us in select areas, there is clearly a “need for speed” as cited in the Defense Science Board (DSB) Report on Autonomy. The speed of actions and decisions will need to increase at a much higher pace over time.

“… the study concluded that autonomy will deliver substantial operational value across an increasingly diverse array of DoD missions, but the DoD must move more rapidly to realize this value. Allies and adversaries alike also have access to rapid technological advances occurring globally. In short, speed matters—in two distinct dimensions. First, autonomy can increase decision speed, enabling the U.S. to act inside an adversary’s operations cycle. Secondly, ongoing rapid transition of autonomy into warfighting capabilities is vital if the U.S. is to sustain military advantage.” — DSB Summer Study on Autonomy, June 2016 (p. 3)

Scope. It may be necessary to increase not only the pace but also the scope of these decisions if these technologies generate the “extreme future” characterized by Mad Scientist Dr. James Canton as “hacking life” / “hacking matter” / “hacking the planet.” In short, no aspect of our current existence will remain untouched. Robotics, artificial intelligence, and autonomy – far from narrow topics – are closely linked to a broad range of enabling / adjunct technologies identified by Mad Scientists, to include:

• Computer Science, particularly algorithm design and software engineering
• Man-Machine Interface, to include Language / Speech and Vision
• Sensing Technologies
• Power and Energy
• Mobility and Manipulation
• Material Science to include revolutionary new materials
• Quantum Science
• Communications
• 3D (Additive) Manufacturing
• Positioning, Navigation and Timing beyond GPS
• Cyber

Science and Technological Convergence. Although 90% of the technology development will occur in the very fragmented, uncontrolled private sector, there is still a need to view robotics, artificial intelligence and autonomy as a holistic, seamless system. Technology convergence is a recurring theme among Mad Scientists. They project that we will alter our fundamental thinking about science because of the “exponential convergence” of key technologies, including:

• Nanoscience and nanotechnology
• Biotechnology and Biomedicine
• Information Technology
• Cognitive Science and Neuroscience
• Quantum Science




This convergence of technologies is already leading to revolutionary achievements with respect to sensing, data acquisition and retrieval, and computer processing hardware. These advances in turn enable machine learning to include reinforcement learning and artificial intelligence. They also facilitate advances in hardware and materials, 3D printing, robotics and autonomy, and open-sourced and reproducible computer code. Exponential convergence will generate “extremely complex futures” that include capability “building blocks” that afford strategic advantage to those who recognize and leverage them.

Co-Evolution. Clearly humans and these technologies are destined to co-evolve. Humans will be augmented in many ways: physically, via exoskeletons; perceptually, via direct sensor inputs; genetically, via AI-enabled gene-editing technologies such as CRISPR; and cognitively, via AI “COGs” and “Cogni-ceuticals.” Human reality will be a “blended” one in which physical and digital environments, media, and interactions are woven together in a seamless integration of the virtual and the physical. As daunting – and worrisome – as these technological developments might seem, there will be an equally daunting challenge in the co-evolution between man and machine: the co-evolution of trust.

Trusted man-machine collaboration will require validation of system competence, a process that will take our legacy test and verification procedures far beyond their current limitations. Humans will nonetheless expect autonomy to be “directable,” and will expect autonomous systems to be able to explain the logic for their behavior, regardless of the complexity of the deep neural networks that drive it. These technologies in turn must be able to adapt to user abilities and preferences, and attain some level of human awareness (e.g., cognitive, physiological, and emotional state; situational knowledge; intent recognition).
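
What “directable” autonomy with explainable logic might look like at the interface level can be sketched very simply. The rules, field names, and thresholds below are hypothetical, and a fielded system would be vastly more complex; the point is only that a recommendation can carry its own rationale and that a human override always wins:

```python
# Hypothetical and greatly simplified: a recommendation that carries its own
# rationale and accepts an operator override. Rule names, thresholds, and the
# track format are invented for illustration only.
def recommend(track):
    reasons, score = [], 0.0
    if track["closing_speed_mps"] > 300:
        score += 0.5
        reasons.append("closing speed exceeds 300 m/s")
    if track["emitter_match"]:
        score += 0.4
        reasons.append("emitter signature matches a known threat profile")
    action = "alert_operator" if score >= 0.7 else "continue_monitoring"
    return {"action": action, "confidence": score, "rationale": reasons}

def execute(recommendation, operator_override=None):
    """Directability in miniature: the human's override always wins."""
    return operator_override or recommendation["action"]

rec = recommend({"closing_speed_mps": 410, "emitter_match": True})
print(rec["action"], "-", "; ".join(rec["rationale"]))
print("final action:", execute(rec, operator_override="continue_monitoring"))
```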

For additional information on The Convergence of Future Technology, see Dr. Canton’s presentation from the Mad Scientist Robotics, Artificial Intelligence, & Autonomy Conference at Georgia Tech Research Institute last March.