140. A Closer Look at China’s Strategies for Innovation: Questioning True Intent

[Editor’s Note: Mad Scientist Laboratory is pleased to publish today’s guest blog post by Ms. Cindy Hurst, addressing China’s continued drive for dominance regarding innovative technologies. The asymmetry between China’s benign, altruistic publicly stated policies and its whole-of-government commitment to modernization and the development of disruptive technologies will remain a key component of multi-domain competition.]

One of China’s most important initiatives is to become an innovative society — but at what cost? In February 2019, the Center for a New American Security published a paper entitled Understanding China’s AI Strategy: Clues to Chinese Strategic Thinking on Artificial Intelligence and National Security. Its author, Gregory Allen, explains that the Chinese government sees Artificial Intelligence (AI) as a “high strategic priority” and is therefore devoting resources “to cultivate AI expertise and strategic thinking among its national security community.” He further urges careful tracking of China’s progress in AI.

Indeed, it would behoove the West to stay abreast of what China is doing in AI, and not just militarily but in all areas, since there is a clear overlap between civilian and military applications. According to countless official statements, publications, and strategic plans, such as the 13th Five-Year National Science and Technology Innovation Plan, China has placed great emphasis on developing AI, along with other cutting-edge technologies, which it views as “majorly influential disruptive technologies” capable of altering “the structure of science and technology, the economy, society, and the ecology, to win a competitive advantage in the new round of industry transformation.”1

“Know your enemy and know yourself and in 100 battles you will not be in peril” is one of the key principles of Sun Tzu. The compelling reasons behind China’s goal of becoming a strong global force can be explained by understanding its history and ancient strategies, which are still studied today. The Middle Kingdom was once a seafaring power that contributed world-class innovations at different points over its 5,000-year history. More recently, during the 19th and 20th centuries, China endured what it refers to as the “century of humiliation” — a period in which it was carved up by Western forces during the Opium Wars and then pummeled by Japanese forces in the 1930s.

After the Communist Party’s defeat of the Kuomintang, who retreated to Taiwan, Communist Party Chairman Mao Zedong proclaimed the establishment of the People’s Republic of China in 1949. Since then, the country has vowed never again to be vulnerable to outside forces. It would press forward, making its own path and suffering bumps and bruises along the way. However, it was the United States’ crushing defeat of Iraqi forces during the Persian Gulf War in 1991 that served as the real wakeup call that China lagged far behind Western forces in military capabilities. Since then, generals working at the Academy of Military Science in Beijing and others have studied every aspect of the U.S. revolution in military affairs, including advances in microprocessors, sensors, communications, and Joint operations.2

In its effort to make headway in technology, China has been accused of stealing massive amounts of foreign intellectual property over the past few decades. Its methods have included acquisition and reverse engineering, joint ventures that share research and development, spying, and hacking into government and corporate computer systems. According to a report by CNBC, one in five North American-based corporations on the CNBC Global CFO Council claimed that Chinese companies had stolen their intellectual property within the last year.3 Such thefts and acquisitions make it easier for China to catch up on technology at low cost. While the United States spends billions of dollars on research and development, China benefits without having to expend similar amounts of capital.

Artificial intelligence, quantum information, and the Internet of Things are three examples of disruptive technologies that are shaping the future and in which China aspires to one day hold a large or controlling stake. In his speech delivered at the 19th National Congress of the Communist Party of China in October 2017, President Xi Jinping stated that “innovation is the primary driving force behind development” and “it is the strategic underpinning for building a modernized economy.”4

However, while Xi and other Chinese officials outwardly push for international cooperation in AI technology, their efforts and methods have raised concern among some analysts. China openly promotes international cooperation in research and development, but one might consider possible alternative intentions behind that push. For example, Allen explains that Fu Ying, the Vice-Chair of the Foreign Affairs Committee of the National People’s Congress, had stated that “we should cooperate to preemptively prevent the threat of AI.” Fu further said that China was interested in “playing a leading role in creating norms to mitigate” the risks. A PLA think-tank scholar reportedly expressed support for “mechanisms that are similar to arms control.”5 How sincere are the Chinese in this sentiment? Should China join forces with foreign states to come up with control mechanisms, would it abide by them or act in secret, continuing its forward momentum to gain the edge? After all, if China and the United States ended up on an even playing field, it would run counter to China’s objectives, at least if one subscribes to the concept outlined by Michael Pillsbury in his book, The Hundred-Year Marathon: China’s Secret Strategy to Replace America as the Global Superpower.

While China’s spoken objectives might be sincere, it is prudent to continually review a few of the ancient strategies and stratagems developed during the Warring States period, which are still studied and applied in China today. Some examples include:

1. Cross the sea without the emperor’s knowledge: Hide your true intentions by using the ruse of fake intentions… until you achieve your real intentions.

2. Kill with a borrowed sword: Use the enemy’s strength against them or the strength of another to conquer your enemy.

3. Hide a dagger behind a smile: Charm and ingratiate yourself with your enemy until you have gained his trust… and then move against him in secret.

In his article, Allen cites a recent Artificial Intelligence Security White Paper, written by “an influential Chinese government think tank,” calling upon China’s government to “avoid Artificial Intelligence arms races among countries,” adding that China will “deepen international cooperation on AI laws and regulations, international rules, and so on…” However, as Allen points out, “China’s behavior of aggressively developing, utilizing, and exporting increasingly autonomous robotic weapons and surveillance AI technology runs counter to the country’s stated goals of avoiding an AI arms race.” China may have good intentions. However, its opaque nature breeds skepticism.

Another interesting point worth expanding upon, and one that Allen touched on in his article, is the effect of disruptive technologies on societies. According to a Chinese think tank scholar, “China believes that the United States is likely to spend too much to maintain and upgrade mature systems and underinvest in disruptive new systems that make America’s existing sources of advantage vulnerable and obsolete…” When considering the Chinese stratagem “Sacrifice the plum tree to preserve the peach tree,” it is easy to argue that China will not be easily swayed from developing disruptive technologies, despite possible repercussions and damaging effects. For example, the development of autonomous systems brings unemployment and a steep learning curve. It is inherent in Chinese culture to sacrifice short-term objectives in order to obtain long-term goals; sustaining initial, short-term repercussions is seen as necessary before China can achieve some of its long-term production goals. Allen explains, “modernization is a top priority, and there is a general understanding that many of its current platforms and approaches are obsolete and must be replaced regardless.”

Particularly intriguing in Allen’s article is his discussion of SenseTime, a “world leader in computer vision AI.” The author states that “China’s government and leadership is enthusiastic about using AI for surveillance.” He goes on to say that one Chinese scholar had told him that he “looks forward to a world in AI” in which it will be “impossible to commit a crime without being caught.” While this may seem like an ideal scenario if the technology is put into the hands of a level-headed and fair law enforcement agency, should it be turned over to an authoritarian dictatorship, such a technology could prove disastrous to private citizens. Government control and scare tactics could further suppress citizens’ basic rights and freedoms.

In conclusion, while China openly promotes its modernization efforts as a win-win, peaceful development strategy, a careful study of Chinese strategies that have been around for millennia may point to a different scenario, bringing skepticism into the equation. It would be easy to fall prey to an ideology that preaches peace, mutual development, and mutual respect. However, it is important to ask two questions: “Is this real?” and “What, if anything, are their ulterior motives?”

If you enjoyed this post, please see:

China’s Drive for Innovation Dominance

Quantum Surprise on the Battlefield?

Cindy Hurst is a research analyst under contract for the Foreign Military Studies Office, Fort Leavenworth, Kansas. Her focus has been primarily on China, with a recent emphasis on research and development, China’s global expansion efforts, and Chinese military strategy. She has published nearly three dozen major papers and countless articles in a variety of journals, magazines, and online venues.

Disclaimer:  The views expressed in this article are Ms. Hurst’s alone and do not imply endorsement by the U.S. Army Training and Doctrine Command, the U.S. Army, the Department of Defense, or the U.S. Government.  This piece is meant to be thought-provoking and does not reflect the current position of the U.S. Army.


1 “Notice of the State Council Regarding the Issuance of the 13th Five-Year National Science and Technology Innovation Plan,” State Council Issuance (2016) No. 43, 28 March 2017, http://www.gov.cn/zhengce/content/2016-08/08/content_5098072.htm.

2 “Neither War Nor Peace,” The Economist, 25 January 2018, https://www.economist.com/special-report/2018/01/25/neither-war-nor-peace.

3 Eric Rosenbaum, “1 in 5 Corporations Say China Has Stolen Their IP within the Last Year: CNBC CFO Survey,” CNBC, 1 March 2019, https://www.cnbc.com/2019/02/28/1-in-5-companies-say-china-stole-their-ip-within-the-last-year-cnbc.html.

4 Xi Jinping, “Secure a Decisive Victory in Building a Moderately Prosperous Society in All Respects and Strive for the Great Success of Socialism with Chinese Characteristics for a New Era,” transcript of speech delivered at the 19th National Congress of the Communist Party of China, 18 October 2017.

5 Gregory Allen, “Understanding China’s AI Strategy,” Center for a New American Security, 6 February 2019, https://www.cnas.org/publications/reports/understanding-chinas-ai-strategy.

121. Emergent Global Trends Impacting on the Future Operational Environment

[Editor’s Note: Regular readers of the Mad Scientist Laboratory are familiar with a number of disruptive trends and their individual and convergent impacts on the Future Operational Environment (OE). In today’s post, we explore three recent publications to expand our understanding of these and additional emergent global trends.  We also solicit your input on any other trends that have the potential to transform the OE and change the character of future warfare.]

“The U.S. Army finds itself at a historical inflection point, where disparate, yet related elements of the Operational Environment (OE) are converging, creating a situation where fast-moving trends across the Diplomatic, Information, Military, and Economic (DIME) spheres are rapidly transforming the nature of all aspects of society and human life – including the character of warfare.” — The Operational Environment and the Changing Character of Future Warfare

Last year, the Mad Scientist Initiative published several products that envisioned these fast-moving trends and how they are transforming the Future OE. These products included our:

• Updated Potential Game Changers information sheet, identifying a host of innovative technologies with the potential to disrupt future warfare during The Era of Accelerated Human Progress (now through 2035) and The Era of Contested Equality (2035 through 2050).

• Black Swans and Pink Flamingos blog post, addressing both Black Swan events (i.e., unknown unknowns) which, though not likely, might have significant impacts on how we think about warfighting and security; and Pink Flamingos, the known knowns that are often discussed but ignored by Leaders trapped by organizational cultures and rigid bureaucratic decision-making structures.

With the advent of 2019, three new predictive publications have both confirmed and expanded the Mad Scientist Initiative’s understanding of emergent trends and technologies:

• Government Accountability Office (GAO) Report to Congressional Committees: National Security: Long-Range Emerging Threats Facing the United States As Identified by Federal Agencies, December 2018

• Deloitte Insights Technology, Media, and Telecommunications Predictions 2019, January 2019

• World Economic Forum (WEF) The Global Risks Report 2019, 14th Edition, January 2019

Commonalities:

These three publications collectively confirmed Mad Scientist’s thoughts regarding the disruptive potential of Artificial Intelligence (AI), Quantum Computing, the Internet of Things (IoT), and Big Data; and individually echoed our concerns regarding Cyber, Additive Manufacturing, Space and Counterspace, Natural Disasters, and the continuing threat of Weapons of Mass Destruction. That said, the real value of these (and other) predictions is in informing us about the trends we might have missed, and expanding our understanding of those that we were already tracking.

New Insights:

From the GAO Report we learned:

Megacorporations as adversaries. Our list of potential adversaries must expand to include “large companies that have the financial resources and a power base to exert influence on par with or exceeding non-state actors.” Think super-empowered individual(s) enhanced further by the wealth, reach, influence, and cover afforded by a transnational corporation.

The rich population is shrinking, the poor population is not. “Working-age populations are shrinking in wealthy countries and in China and Russia, and are growing in developing, poorer countries…. [with] the potential to increase economic, employment, urbanization and welfare pressures, and spur migration.”

Climate change, environment, and health issues will demand attention. “More extreme weather, water and soil stress, and food insecurity will disrupt societies. Sea-level rise, ocean acidification, glacial melt, and pollution will change living patterns. Tensions over climate change will grow.”

Internal and International Migration. “Governments in megacities … may not have the capacity to provide adequate resources and infrastructure…. Mass migration events may occur and threaten regional stability, undermine governments, and strain U.S. military and civilian responses.”

Infectious Diseases. “New and evolving diseases from the natural environment—exacerbated by changes in climate, the movement of people into cities, and global trade and travel—may become a pandemic. Drug-resistant forms of diseases previously considered treatable could become widespread again…. Diminishing permafrost could expand habitats for pathogens that cause disease.”

From Deloitte Insights Predictions we learned:

Intuitive AI development services may not require specialized knowledge. “Baidu recently released an AI training platform called EZDL that requires no coding experience and works even with small data training sets…. Cloud providers have developed pre-built machine learning APIs [application-programming interfaces] for technologies such as natural language processing that customers can access instead of building their own.”
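To make the “no specialized knowledge required” point concrete, here is a minimal sketch (not from the Deloitte report) using the open-source Hugging Face transformers library, whose pipeline API is one widely available example of a pre-built model that can be consumed without designing or training anything; the library choice and example sentences are illustrative assumptions, not the specific Baidu or cloud offerings cited above.

```python
# Minimal sketch: consuming a pre-built NLP model rather than building one from scratch.
# Assumes the open-source "transformers" package is installed (pip install transformers torch).
from transformers import pipeline

# Downloads a default pre-trained sentiment-analysis model; no model design, labeling,
# or training code is required of the user.
classifier = pipeline("sentiment-analysis")

results = classifier([
    "The new training platform is remarkably easy to use.",
    "The documentation is confusing and incomplete.",
])
for result in results:
    print(result["label"], round(result["score"], 3))
```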

Cryptocurrency growth may have driven Chinese semiconductor innovation. Chinese chipmakers’ Application-Specific Integrated Circuits (ASICs), initially designed to meet domestic bitcoin-mining demands, may also meet China’s growing demand for AI chipsets vice Graphics Processing Units (GPUs). “Not only could these activities spark more domestic innovation… China just might be positioned to have a larger impact on the next generation of cognitive technologies.”

Quantum-safe security was important yesterday. “Malicious adversaries could store classically encrypted information today to decrypt in the future using a QC [Quantum Computer], in a gambit known as a ‘harvest-and-decrypt’ attack.”
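One common way to reason about this exposure is Mosca’s rule of thumb: if the years the data must remain secret plus the years needed to migrate to quantum-safe cryptography exceed the years until a cryptographically relevant quantum computer arrives, data harvested today is already at risk. The sketch below simply encodes that inequality; the planning values are hypothetical illustrations, not estimates from the Deloitte report.

```python
# Illustrative sketch of Mosca's inequality (x + y > z) for "harvest-and-decrypt" risk planning.
def harvest_decrypt_risk(secrecy_years: float, migration_years: float,
                         years_to_quantum: float) -> bool:
    """Return True if data encrypted today could still be sensitive once it becomes decryptable."""
    return secrecy_years + migration_years > years_to_quantum

# Hypothetical values: data must stay secret 15 years, migration takes 5, QC assumed in 12.
print(harvest_decrypt_risk(15, 5, 12))  # True  -> start migrating to quantum-safe schemes now
print(harvest_decrypt_risk(3, 2, 12))   # False -> this data ages out before the threat matures
```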

From the WEF Report we learned:

This is an increasingly anxious, unhappy, and lonely world. “Anger is increasing and empathy appears to be in short supply…. Depression and anxiety disorders increased [globally] between 1990 and 2013…. It is not difficult to imagine such emotional and psychological disruptions having serious diplomatic—and perhaps even military—consequences.”

The risk from biological pathogens is increasing. “Outbreaks since 2000 have been described as a ‘rollcall of near-miss catastrophes’” and they are on the rise. “Biological weapons still have attractions for malicious non-state actors…. it [is] difficult to reliably attribute a biological attack… the direct effects—fatalities and injuries—would be compounded by potentially grave societal and political disruption.”

Use of weather manipulation tools stokes geopolitical tensions. “Could be used to disrupt … agriculture or military planning… if states decided unilaterally to use more radical geo-engineering technologies, it could trigger dramatic climatic disruptions.”

Food supply disruption emerges as a tool as geo-economic tensions intensify. “Worsening trade wars might spill over into high-stakes threats to disrupt food or agricultural supplies…. Could lead to disruptions of domestic and cross-border flows of food. At the extreme, state or non-state actors could target the crops of an adversary state… with a clandestine biological attack.”

Taps run dry on Water Day Zero. “Population growth, migration, industrialization, climate change, drought, groundwater depletion, weak infrastructure, and poor urban planning” all stress megacities’ ability to meet burgeoning demands, further exacerbating existing urban / rural divides, and could potentially lead to conflicts over remaining supply sources.

What Are We Missing?

The aforementioned trends are by no means comprehensive. Mad Scientist invites our readers to assist us in identifying any additional emergent global trends that could potentially transform the OE and change the character of future warfare. Please share them with us and our readers by scrolling down to the bottom of this post to the “Leave a Reply” section, entering them in the Comment Box with an accompanying rationale, and then selecting the “Post Comment” button. Thank you in advance for all of your submissions!

If you enjoyed reading these assessments about future trends, please also see the Statement for the Record:  Worldwide Threat Assessment of the US Intelligence Community, 29 January 2019, from the U.S. Senate Select Committee on Intelligence.

110. Future Jobs and Skillsets

[Editor’s Note:  On 8-9 August 2018, the U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Learning in 2050 Conference with Georgetown University’s Center for Security Studies in Washington, DC.  Leading scientists, innovators, and scholars from academia, industry, and the government gathered to address future learning techniques and technologies that are critical in preparing for Army operations in the mid-21st century against adversaries in rapidly evolving battlespaces.  Today’s post is extracted from this conference’s final report (more of which is addressed at the bottom of this post).]

The U.S. Army currently has more than 150 Military Occupational Specialties (MOSs), each requiring a Soldier to learn unique tasks, skills, and knowledges. The emergence of a number of new technologies – drones, Artificial Intelligence (AI), autonomy, immersive mixed reality, big data storage and analytics, etc. – coupled with the changing character of future warfare means that many of these MOSs will need to change, while others will need to be created. This already has been seen in the wider U.S. and global economy, where the growth of internet services, smartphones, social media, and cloud technology over the last ten years has introduced a host of new occupations that previously did not exist. The future will further define and compel the creation of new jobs and skillsets that have not yet been articulated or even imagined. Today’s hobbies (e.g., drones) and recreational activities (e.g., Minecraft/Fortnite) that potential recruits engage in every day could become MOSs or Additional Skill Identifiers (ASIs) of the future.

Training eighty thousand new Recruits a year on existing MOSs is a colossal undertaking.  A great expansion in the jobs and skillsets needed to field a highly capable future Army, replete with modified or new MOSs, adds a considerable burden to the Army’s learning systems and institutions. These new requirements, however, will almost certainly present an opportunity for the Army to capitalize on intelligent tutors, personalized learning, and immersive learning to lessen costs and save time in Soldier and Leader development.

The recruit of 2050 will be born in 2032 and will be fundamentally different from the generations born before them. Marc Prensky, the educational writer and speaker who coined the term digital native, asserts this “New Human” will stand in stark contrast to the “Old Human” in the ways they learn and approach learning.1 Where humans today are born into a world with ubiquitous internet, hyper-connectivity, and the Internet of Things, each of these elements is generally external to the human. By 2032, these technologies likely will have converged and will be embedded or integrated into the individual, with connectivity literally on the tips of their fingers.

Some of the newly required skills may be inherent within the next generation(s) of these Recruits. Many of the games, drones, and other everyday technologies that are already or soon to be very common – narrow AI, app development and general programming, and smart devices – will yield a variety of intrinsic skills that Recruits will have prior to entering the Army. Just as we no longer train Soldiers on how to use a computer, games like Fortnite, with no formal relationship to the military, will provide players with militarily useful skills such as communications, resource management, foraging, force structure management, and fortification and structure building, all while attempting to survive against persistent attack. Due to these trends, Recruits may come into the Army with fundamental technical skills and baseline military thinking attributes that flatten the learning curve for Initial Entry Training (IET).2

While these new Recruits may have a set of some required skills, there will still be a premium placed on premier skillsets in fields such as AI and machine learning, robotics, big data management, and quantum information sciences. Due to the high demand for these skillsets, the Army will have to compete for talent with private industry, battling them on compensation, benefits, perks, and a less restrictive work environment – limited to no dress code, flexible schedule, and freedom of action. In light of this, the Army may have to consider adjusting or relaxing its current recruitment processes, business practices, and force structuring to ensure it is able to attract and retain expertise. It also may have to reconsider how it adapts and utilizes its civilian workforce to undertake these types of tasks in new and creative ways.

The Recruit of 2050 will need to be engaged much differently than today. Potential Recruits may not want to be contacted by traditional methods3 – phone calls, in person, job fairs – but instead likely will prefer to “meet” digitally first. Recruiters already are seeing this today. In order to improve recruiting efforts, the Army may need to look for Recruits in non-traditional areas such as competitive online gaming. There is an opportunity for the Army to use AI to identify Recruit commonalities and improve its targeted advertisements in the digital realm to entice specific groups who have otherwise been overlooked. The Army is already exploring this avenue of approach through the formation of an eSports team that will engage young potential Recruits and attempt to normalize their view of Soldiers and the Army, making them both more relatable and enticing.4 This presents a broader opportunity to close the chasm that exists between civilians and the military.

The overall dynamic landscape of the future economy, the evolving labor market, and the changing character of future warfare will create an inflection point for the Army to re-evaluate longstanding recruitment strategies, workplace standards, and learning institutions and programs. This will bring about an opportunity for the Army to expand, refine, and realign its collection of skillsets and MOSs, making Soldiers more adapted for future battles, while at the same time challenging the Army to remain prominent in attracting premier talent in a highly competitive environment.

If you enjoyed this extract, please read the comprehensive Learning in 2050 Conference Final Report

… and see our TRADOC 2028 blog post.


1 Prensky, Marc, Mad Scientist Conference: Learning in 2050, Georgetown University, 9 August 2018.

2 Schatz, Sarah, Mad Scientist Conference: Learning in 2050, Georgetown University, 8 August 2018.

3 Davies, Hans, Mad Scientist Conference: Learning in 2050, Georgetown University, 9 August 2018.

4 Garland, Chad, “Uncle Sam wants you — to play video games for the US Army,” Stars and Stripes, 9 November 2018, https://www.stripes.com/news/uncle-sam-wants-you-to-play-video-games-for-the-us-army-1.555885.

101. TRADOC 2028

[Editor’s Note:  The U.S. Army Training and Doctrine Command (TRADOC) mission is to recruit, train, and educate the Army, driving constant improvement and change to ensure the Total Army can deter, fight, and win on any battlefield now and into the future. Today’s post addresses how TRADOC will need to transform to ensure that it continues to accomplish this mission with the next generation of Soldiers.]

Per The Army Vision:

“The Army of 2028 will be ready to deploy, fight, and win decisively against any adversary, anytime and anywhere, in a joint, multi-domain, high-intensity conflict, while simultaneously deterring others and maintaining its ability to conduct irregular warfare. The Army will do this through the employment of modern manned and unmanned ground combat vehicles, aircraft, sustainment systems, and weapons, coupled with robust combined arms formations and tactics based on a modern warfighting doctrine and centered on exceptional Leaders and Soldiers of unmatched lethality.” GEN Mark A. Milley, Chief of Staff of the Army, and Dr. Mark T. Esper, Secretary of the Army, June 7, 2018.

In order to achieve this vision, the Army of 2028 needs a TRADOC 2028 that will recruit, organize, and train future Soldiers and Leaders to deploy, fight, and win decisively on any future battlefield. This TRADOC 2028 must account for: 1) the generational differences in learning styles; 2) emerging learning support technologies; and 3) how the Army will need to train and learn to maintain cognitive overmatch on the future battlefield. The Future Operational Environment, characterized by the speeding up of warfare and learning, will challenge the artificial boundaries between institutional and organizational learning and training (e.g., Brigade mobile training teams [MTTs] as a Standard Operating Procedure [SOP]).

Soldiers will be “New Humans” – beyond digital natives, they will embrace embedded and integrated sensors, Artificial Intelligence (AI), mixed reality, and ubiquitous communications. “Old Humans” adapted their learning style to accommodate new technologies (e.g., Classroom XXI). New Humans’ learning style will be a result of these technologies, as they will have been born into a world where they code, hack, rely on intelligent tutors and expert avatars (think the nextgen of Alexa / Siri), and learn increasingly via immersive Augmented / Virtual Reality (AR/VR), gaming, simulations, and YouTube-like tutorials, rather than the desiccated lectures and interminable PowerPoint presentations of yore. TRADOC must ensure that our cadre of instructors know how to use (and more importantly, embrace and effectively incorporate) these new learning technologies into their programs of instruction, until their ranks are filled with “New Humans.”

Delivering training for new, as yet undefined MOSs and skillsets. The Army will have to compete with Industry to recruit the requisite talent for Army 2028. These recruits may enter service with fundamental technical skills and knowledges (e.g., drone creator/maintainer, 3-D printing specialist, digital and cyber fortification construction engineer) that may result in a flattening of the initial learning curve and facilitate more time for training “Green” tradecraft. Cyber recruiting will remain critical, as TRADOC will face an increasingly difficult recruiting environment as the Army competes to recruit new skillsets, from training deep learning tools to robotic repair. Initiatives to appeal to gamers (e.g., the Army’s eSports team) will have to be reflected in new approaches to all TRADOC Lines of Effort. AI may assist in identifying potential recruits with the requisite aptitudes.

“TRADOC in your ruck.” Personal AI assistants bring Commanders and their staffs all of the collected expertise of today’s institutional force. Conducting machine speed collection, collation, and analysis of battlefield information will free up warfighters and commanders to do what they do best — fight and make decisions, respectively. AI’s ability to quickly sift through and analyze the plethora of input received from across the battlefield, fused with the lessons learned data from thousands of previous engagements, will lessen the commander’s dependence on having had direct personal combat experience with conditions similar to his current fight when making command decisions.

Learning in the future will be personalized and individualized, with targeted learning at the point of need. Training must be customizable and temporally optimized in a style that matches the individual learner, versus a one-size-fits-all approach. These learning environments will need to bring gaming and micro simulations to individual learners for them to experiment. Similar tools could improve tactical war-gaming and support the Commander’s decision making. This will disrupt the traditional career maps that have defined success in the current generation of Army Leaders. In the future, courses will be much less defined by the rank/grade of the Soldiers attending them.

Geolocation of Training will lose importance. We must stop building and start connecting. Emerging technologies – many accounted for in the Synthetic Training Environment (STE) – will connect experts and Soldiers, creating a seamless training continuum from the training base to home station to the foxhole. Investment should focus on technologies connecting and delivering expertise to the Soldier rather than brick-and-mortar infrastructure. This vision of TRADOC 2028 will require “Big Data” to effectively deliver personalized, immersive training to our Soldiers and Leaders at the point of need, and it comes with associated privacy issues that will have to be addressed.

In conclusion, TRADOC 2028 sets the conditions to win warfare at machine speed. This speeding up of warfare and learning will challenge the artificial boundaries between institutional and organizational learning and training.

If you enjoyed this post, please also see:

– Mr. Elliott Masie’s presentation on Dynamic Readiness from the Learning in 2050 Conference, co-hosted with Georgetown University’s Center for Security Studies in Washington, DC, on 8-9 August 2018.

– “Top Ten” Takeaways from the Learning in 2050 Conference.

82. Bias and Machine Learning

[Editor’s Note:  Today’s post poses four central questions to our Mad Scientist community of action regarding bias in machine learning and the associated ramifications for artificial intelligence, autonomy, lethality, and decision-making on future warfighting.]

“We thought that we had the answers, it was the questions we had wrong” – Bono, U2

Source: www.vpnsrus.com via flickr

As machine learning and deep learning algorithms become more commonplace, it is clear that the utopian ideal of a bias-neutral Artificial Intelligence (AI) is exactly that: an ideal. These algorithms have underlying biases embedded in their coding, imparted by their human programmers (either consciously or unconsciously), and they can develop further biases during the machine learning and training process. Dr. Tolga Bolukbasi, Boston University, recently described algorithms as not being capable of distinguishing right from wrong, unlike humans, who can judge their actions even when they act against ethical norms. For algorithms, data is the ultimate determining factor.
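As a minimal, self-contained illustration of that last point (not drawn from Dr. Bolukbasi’s work), the sketch below “trains” a trivial frequency-based classifier on deliberately skewed historical decisions; the skew in the data reappears directly in the model’s predictions.

```python
# Minimal sketch: a trivial model that learns nothing but label frequencies per group.
# Whatever skew exists in the historical data is reproduced in its predictions.
from collections import Counter, defaultdict

# Hypothetical, deliberately skewed training records: (group, historical decision)
training_data = ([("A", "approve")] * 90 + [("A", "deny")] * 10 +
                 [("B", "approve")] * 40 + [("B", "deny")] * 60)

counts = defaultdict(Counter)
for group, label in training_data:
    counts[group][label] += 1

def predict(group: str) -> str:
    """Predict the most frequent historical decision for the given group."""
    return counts[group].most_common(1)[0][0]

print(predict("A"))  # approve -- the model simply mirrors the bias baked into its data
print(predict("B"))  # deny
```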

Realizing that algorithms supporting future Intelligence, Surveillance, and Reconnaissance (ISR) networks and Commander’s decision support aids will have inherent biases — what is the impact on future warfighting? This question is exceptionally relevant as Soldiers and Leaders consider the influence of biases in man-machine relationships, and their potential ramifications on the battlefield, especially with regard to the rules of engagement (i.e., mission execution and combat efficiency versus the proportional use of force and minimizing civilian casualties and collateral damage).

“It is difficult to make predictions, particularly about the future.” This quote has been attributed to everyone from Mark Twain to Niels Bohr to Yogi Berra. Point prediction is a sucker’s bet. However, asking the right questions about biases in AI is incredibly important.

The Mad Scientist Initiative has developed a series of questions to help frame the discussion regarding what biases we are willing to accept and in what cases they will be acceptable. Feel free to share your observations and questions in the comments section of this blog post (below) or email them to us at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil.

1) What types of bias are we willing to accept? Will a so-called cognitive bias that forgoes a logical, deliberative process be allowable? What about a programming bias that is discriminatory towards any specific gender(s), ethnicity(ies), race(s), or even age(s)?

2) In what types of systems will we accept biases? Will machine learning applications in supposedly non-lethal warfighting functions like sustainment, protection, and intelligence be given more leeway with regards to bias?

3) Will the biases in machine learning programming and algorithms be more apparent and/or outweigh the inherent biases of humans-in-the-loop? How will perceived biases affect trust and reliance on machine learning applications?

4) At what point will the pace of innovation and introduction of this technology on the battlefield by our adversaries cause us to forego concerns of bias and rapidly field systems to gain a decisive Observe, Orient, Decide, and Act (OODA) loop and combat speed advantage on the Hyperactive Battlefield?

For additional information impacting on this important discussion, please see the following:

An Appropriate Level of Trust… blog post

Ethical Dilemmas of Future Warfare blog post

Ethics and the Future of War panel discussion video

80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial – “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game-changing technologies. Unlike Nuclear, Biological, and Chemical weapons, no internationally agreed-to and implemented control protocols can be applied to AI, as there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted-access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to be composed of nothing more than diffuse code; a digital will-o’-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operation (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC with its Military-Civil Fusion Centers, integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. This conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and to help practitioners develop fair and privacy-preserving data analyses. Like ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commander’s decisions) – especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, code the very programming that will enable machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
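One simple form such verification can take, sketched below with made-up numbers rather than anything from the ICML proceedings, is to compare the rate of favorable model outputs across population subgroups (a demographic parity check) before a system is fielded.

```python
# Minimal sketch: demographic parity check across two subgroups of model outputs.
def favorable_rate(decisions):
    """Fraction of decisions that are favorable (encoded as 1)."""
    return sum(decisions) / len(decisions)

# Hypothetical model outputs (1 = favorable decision) for two population subgroups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # 75.0% favorable
group_b = [1, 0, 0, 1, 0, 0, 0, 1]   # 37.5% favorable

gap = abs(favorable_rate(group_a) - favorable_rate(group_b))
print(f"Demographic parity difference: {gap:.3f}")  # a large gap flags the model for human review
```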

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down, and the participants’ actions were then recorded. Unexpectedly, a larger number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant. It shows, beyond the unexpected result, that decision-making is affected by robotic personality. Humans will form an emotional connection to artificial entities, despite knowing they are robotic, if they mimic and emulate human behavior. If the Army believes its Soldiers will be accompanied and augmented heavily by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought computing and communication ever deeper into households around the globe, the smartphone revolution brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans being connected to the global commons via their smart devices, but a multitude of devices, vehicles, and various accessories are being integrated into the Internet of Things (IoT). Previously, the IoT was addressed as a game-changing technology. The IoT is composed of trillions of internet-linked items, creating both opportunities and vulnerabilities. There has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (i.e., appliances, home devices [such as Google Home, Alexa, etc.], various sensors) to approximately $520 billion. This is a massive investment into what will likely become the Internet of Everything (IoE). While growth is focused on known devices, it is likely that it will expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling with the balance between collecting, organizing, and using critical data, allowing service members to use personal devices, and maintaining operations and network security and integrity (see the recent banning of personal fitness trackers). A segment of IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, inciting some critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading blue force’s ability to sense the battlefield and “see” the enemy, thereby creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” becoming increasingly imminent, intelligence officials warn of the rising danger of cyber attacks. Effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments as they act in the undefined gray zone between peace and all-out war. The military implications are quite clear: cyber attacks can cripple the military’s ability to function from a command and control aspect to intelligence communications and materiel and personnel networks. Besides the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have felt the effects of cyber attacks, while others are reluctant to invest in cyber protection measures. In this way, civilians become affected by acts of cyber warfare, and attacks on a country may not be directed at the opposing military, but the civilian population of a state, as in the case of power and utility outages seen in eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to most effectively combat cyber threats.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

76. “Top Ten” Takeaways from the Learning in 2050 Conference

On 8-9 August 2018, the U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Learning in 2050 Conference with Georgetown University’s Center for Security Studies in Washington, DC.  Leading scientists, innovators, and scholars from academia, industry, and the government gathered to address future learning techniques and technologies that are critical in preparing for Army operations in the mid-21st century against adversaries in rapidly evolving battlespaces.  The new and innovative learning capabilities addressed at this conference will enable our Soldiers and Leaders to act quickly and decisively in a changing Operational Environment (OE) with fleeting windows of opportunity and more advanced and lethal technologies.

We have identified the following “Top 10” takeaways related to Learning in 2050:

1. Many learning technologies built around commercial products are available today (Amazon Alexa, Smart Phones, Immersion tech, Avatar experts) for introduction into our training and educational institutions. Many of these technologies are part of the Army’s concept for a Synthetic Training Environment (STE) and there are nascent manifestations already.  For these technologies to be widely available to the future Army, the Army of today must be prepared to address:

– The collection and exploitation of as much data as possible;

– The policy concerns with security and privacy;

– The cultural challenges associated with changing the dynamic between learners and instructors, teachers, and coaches; and

– The adequate funding to produce capabilities at scale so that digital tutors or other technologies (Augmented Reality [AR] / Virtual Reality [VR], etc.) and skills required in a dynamic future, like critical thinking/group think mitigation, are widely available or perhaps ubiquitous.

2. Personalization and individualization of learning in the future will be paramount; some training that today takes place in physical schools will become the exception, with learning occurring at the point of need. This transformation will not be limited to lesson plans or even just learning styles:

– Intelligent tutors, Artificial Intelligence (AI)-driven instruction, and targeted mentoring/tutoring;

– Tailored timing and pacing of learning (when, where, and for what duration best suits the individual learner or group of learners?);

– Collaborative learners will be teams partnering to learn;

Targeted Neuroplasticity Training / Source: DARPA

– Various media and technologies that enable enhanced or accelerated learning (Targeted Neuroplasticity Training (TNT), haptic sensors, AR/VR, lifelong personal digital learning partners, pharmaceuticals, etc.) at scale;

– Project-oriented learning; when today’s high school students are building apps, they are asked “What positive change do you want to have?” One example is an open table for Bully Free Tables. In the future, learners will learn through working on projects;

– Project-oriented learning will lead to a convergence of learning and operations, creating a chicken (learning) or the egg (mission/project) relationship; and

– Learning must be adapted to consciously address the desired, or extant, culture.

Drones Hanger / Source: Oshanin

3. Some jobs and skill sets have not even been articulated yet. Hobbies and recreational activities engaged in by kids and enthusiasts today could become occupations or Military Occupational Specialties (MOS’s) of the future (e.g., drone creator/maintainer, 3-D printing specialist, digital and cyber fortification construction engineer — think Minecraft and Fortnite with real-world physical implications). Some emerging trends in personalized warfare, big data, and virtual nations could bring about the necessity for more specialists that don’t currently exist (e.g., data protection and/or data erasure specialists).

Mechanical Animal / Source: Pinterest

4. The New Human (who will be born in 2032 and is the recruit of 2050) will be fundamentally different from the Old Human. The Chief of Staff of the Army (CSA) of 2050 is a young Captain in our Army today. While we are arguably cyborgs today (with integrated electronics in our pockets and on our wrists), the New Humans will likely be cyborgs in the truest sense of the word, with some having embedded sensors. How will those New Humans learn? What will they need to learn? Why would they want to learn something? These are all critical questions the Army will continue to ask over the next several decades.

Source: iLearn

5. Learning is continuous and self-initiated, while education is a point in time and is “done to you” by someone else. Learning may result in a certificate or degree – similar to education – or can lead to the foundations of a skill or a deeper understanding of operations and activity. How will organizations quantify learning in the future? Will degrees or even certifications still be the benchmark for talent and capability?

Source: The Data Feed Toolbox

6. Learning isn’t slowing down, it’s speeding up. More and more things are becoming instantaneous, and humans have little concept of such extreme speed. Tesla cars can update their software overnight, so owners effectively get into a different car each day. What happens to our Soldiers when military vehicles change much more iteratively? This may force a paradigm shift wherein learning means tightening local and global connections (tough to do considering government/military network security, firewalls, vulnerabilities, and constraints); viewing technology as extended brains all networked together (similar to Dr. Alexander Kott’s look at the Internet of Battlefield Things [IoBT]); and leveraging these capabilities to enable Soldier learning at extremely high speeds.

Source: Connecting Universes

7. While there are a number of emerging concepts and technologies to improve and accelerate learning (TNT, extended reality, personalized learning models, and intelligent tutors), the focus, training stimuli, data sets, and desired outcomes all have to be properly tuned and aligned or the Learner could end up losing correct behavior habits (developing maladaptive plasticity), developing incorrect or skewed behaviors (per the desired capability), or assuming inert cognitive biases.

Source: TechCrunch

8. Geolocation may become increasingly less important when it comes to learning in the future. If Apple required users to go to Silicon Valley to get trained on an iPhone, they would be exponentially less successful. But this is how the Army currently trains. The ubiquity of connectivity, the growth of the Internet of Things (and eventually Internet of Everything), the introduction of universal interfaces (think one XBOX controller capable of controlling 10 different types of vehicles), major advances in modeling and simulations, and social media innovation all converge to minimize the importance of teachers, students, mentors, and learners being collocated at the same physical location.

Transdisciplinarity at Work / Source: https://www.cetl.hku.hk

9. Significant questions have to be asked regarding the specificity of training children at a young age; we may be overemphasizing STEM early on and not helping them learn across a wider spectrum. We need Transdisciplinarity in the coming generations.

10. 3-D reconstructions of bases, training areas, cities, and military objectives, coupled with mixed reality, haptic sensing, and intuitive controls, have the potential to dramatically change how Soldiers train and learn, not only in single performance tasks (e.g., marksmanship, vehicle driving, and reconnaissance) but also in dense urban operations, multi-unit maneuver, and command and control.
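
To make item 6’s “different car each day” point more concrete, below is a minimal, purely illustrative Python sketch of a platform that pulls capability updates from a networked feed between uses. The class, field names, and the in-memory update feed are assumptions made for illustration only; they do not describe any fielded Army or commercial system.

    # Illustrative sketch only: a networked platform that pulls capability
    # updates between uses, so its behavior can change from one mission to
    # the next. The class, fields, and in-memory "update feed" are all
    # hypothetical stand-ins, not any fielded Army or commercial system.
    from dataclasses import dataclass, field


    @dataclass
    class Platform:
        name: str
        software_version: int = 1
        capabilities: set = field(default_factory=lambda: {"drive"})

        def apply_update(self, update: dict) -> None:
            # Only apply updates that are newer than what is installed.
            if update["version"] > self.software_version:
                self.software_version = update["version"]
                self.capabilities |= set(update["new_capabilities"])
                print(f"{self.name}: now v{self.software_version}, "
                      f"capabilities {sorted(self.capabilities)}")


    # Stand-in for a secure over-the-air update feed.
    UPDATE_FEED = [
        {"version": 2, "new_capabilities": ["lane_keeping"]},
        {"version": 3, "new_capabilities": ["convoy_follow", "threat_alerts"]},
    ]

    vehicle = Platform("training_vehicle_01")
    for update in UPDATE_FEED:  # each cycle, the "same" vehicle behaves differently
        vehicle.apply_update(update)

The only point of the sketch is that the equipment a Soldier operates can gain or change functions between uses, so learning would have to keep pace with an update cycle rather than a fielding schedule.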
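
Item 8’s “one controller for many vehicle types” idea is, at bottom, an interface-abstraction problem. The rough Python sketch below illustrates the concept with hypothetical class names and control mappings; it is not an existing Army or game-console API.

    # Illustrative sketch only: one set of controller inputs mapped onto
    # different vehicle types through a shared interface. The class names
    # and control mappings are invented for this example and do not reflect
    # any existing Army or game-console API.
    from abc import ABC, abstractmethod


    class Controllable(ABC):
        """Anything that can be operated through the same universal inputs."""

        @abstractmethod
        def handle_input(self, stick_x: float, stick_y: float, trigger: float) -> str:
            ...


    class GroundVehicle(Controllable):
        def handle_input(self, stick_x, stick_y, trigger):
            return f"ground vehicle: steer {stick_x:+.1f}, throttle {trigger:.1f}"


    class QuadcopterDrone(Controllable):
        def handle_input(self, stick_x, stick_y, trigger):
            return f"drone: roll {stick_x:+.1f}, pitch {stick_y:+.1f}, climb {trigger:.1f}"


    def universal_controller(platform: Controllable, stick_x: float,
                             stick_y: float, trigger: float) -> str:
        # The same physical inputs work regardless of the platform behind them.
        return platform.handle_input(stick_x, stick_y, trigger)


    for platform in (GroundVehicle(), QuadcopterDrone()):
        print(universal_controller(platform, stick_x=0.2, stick_y=-0.5, trigger=0.8))

Because every platform honors the same input contract, training on the controller could, in principle, transfer across platforms without the learner traveling to wherever each new vehicle is fielded, which is the point the item makes about geolocation.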

Heavy Duty by rOEN911 / Source: DeviantArt

During the next two weeks, we will be posting the videos from each of the Learning in 2050 Conference presentations on the TRADOC G-2 Operational Environment (OE) Enterprise YouTube Channel and the associated slides on our Mad Scientist APAN site — stay connected here at the Mad Scientist Laboratory.

One of the main thrusts in the Mad Scientist lines of effort is harnessing and cultivating the Intellect of the Nation. In this vein, we are asking Learning in 2050 Conference participants (both in person and online) to share their ideas on the presentations and topic. Please consider:

– What topics were most important to you personally and professionally?

– What were your main takeaways from the event?

– What topics did you want the speakers to expand on further?

– What were the implications for your given occupation/career field from the findings of the event?

Your input will be critically important to our analysis and to products that will significantly shape the future force’s design, structure, planning, and training! Please submit your input to Mad Scientist at: usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil.

68. Bio Convergence and Soldier 2050 Conference Final Report

[Editor’s Note: The U.S. Army Training and Doctrine Command (TRADOC) co-hosted the Mad Scientist Bio Convergence and Soldier 2050 Conference with SRI International on 8–9 March 2018 at their Menlo Park campus in California. This conference explored bio convergence, what the Army’s Soldier of 2050 will look like, and how they will interact and integrate with their equipment. The following post is an excerpt from this conference’s final report.]

Source: U.S. Army photo by SPC Joshua P. Morris

While the technology and concepts defining warfare have continuously and rapidly transformed, the primary actor in warfare – the human – has remained largely unchanged. Soldiers today may be physically larger, more thoroughly trained, and better equipped than their historical counterparts, but their underlying human capabilities and performance remain very similar.

These limitations in human performance, however, may recede over the next 30 years, as advances in biotechnology and human performance enhancement will likely expand the boundaries of what is possible for humans to achieve. We may see Soldiers – not just their equipment – with superior vision, enhanced cognitive abilities, disease/virus resistance, and increased strength, speed, agility, and endurance. As a result, these advances could provide the Soldier with an edge to survive and thrive on the hyperactive, constantly changing, and increasingly lethal Multi-Domain Battlespace.

Source: The Guardian and Lynsey Irvine/Getty

In addition to potentially changing the individual physiology and abilities of the future Soldier, there are many technological innovations on the horizon that will impact human performance. The convergence of these technologies – artificial intelligence (AI), robotics, augmented reality, brain-machine interfaces, nanotechnologies, and biological and medical improvements to the human – is referred to as bio convergence. Soldiers of the future will have enhanced capabilities due to technologies that will be installed, instilled, and augmented. This convergence will also force the Army to come to terms with what kinds of bio-converged technologies will be accepted in new recruits.

The conference generated the following key findings:

Source: RodMartin.org

• The broad advancement of biotechnologies will provide wide access to dangerous and powerful bioweapons and human enhancements. The low-cost, low-expertise entry point into gene editing, human performance enhancement, and bioweapon production has spurred a string of new explorations into this arena by countries with large defense budgets (e.g., China), non-state criminal and terrorist organizations (e.g., ISIS), and even super-empowered individuals willing to subject their bodies to experimental and risky treatments.

Source: Shutterstock

• Emerging synthetic biology tools (e.g., CRISPR, TALEN, and ZFN) present an opportunity to engineer Soldiers’ DNA and enhance their performance, providing greater speed, strength, endurance, and resilience. These tools, however, will also create new vulnerabilities, such as genomic targeting, that can be exploited by an adversary and/or potentially harm the individual undergoing enhancement. Bioengineering is becoming easier and cheaper as a bevy of developments reduces biotechnology transaction costs in gene reading, writing, and editing. Due to the ever-increasing speed and lethality of the future battlefield, combatants will need cognitive and physical enhancement to survive and thrive.

Source: Getty Images

• Ensuring that our land forces are ready to meet future challenges requires leveraging advancements in biotechnology and neuroscience. Designer viruses and diseases will be highly volatile, mutative, and extremely personalized, potentially challenging an already stressed Army medical response system and its countermeasures. Synthetic biology provides numerous applications that will bridge capability gaps and enable future forces to fight effectively; these defense applications range from sensing capabilities to rapidly developed vaccines and therapeutics.

Source: Rockwell Collins / Aviation Week

• Private industry and academia have become the driving forces behind innovation. While there are some benefits to this – such as shorter development times – there are also risks. For example, investments in industry are driven mainly by market demand, which can lead to a lack of investment in areas that are vital to National Defense but have little to no consumer demand. In academia, a majority of graduate students in STEM fields are foreign nationals, making up over 80% of enrollment in electrical and petroleum engineering programs. The U.S. will need to find a way to maintain its technological superiority even if most of this expertise eventually leaves the country.

Source: World Health Organization

• The advent of new biotechnologies will give rise to moral, regulatory, and legal challenges for the Army of the Future, affecting its business practices, recruiting requirements, Soldier standards, and structure. The rate of technology development in the synthetic biology field is increasing rapidly. Private individuals or small start-ups with minimal capital can create a new organism for which there is no current countermeasure, and developing one will likely take years. This possibility creates a dilemma: how to swiftly craft effective policy and regulation that addresses these concerns without stifling the creativity and productivity of those conducting legitimate research. Current regulation may not be sufficient, and bureaucratic inflexibility prevents quick reactive and proactive change. Our adversaries may not move as readily to adopt stricter regulations in the biotechnology arena. Rather than focusing on short-term solutions, it may be beneficial to take a holistic approach centered on a world in which biotechnology interacts with everyday life. The U.S. may have to work from a relative “disadvantage,” using safe and legal methods of enhancement, while our adversaries may choose to operate below our defined legal threshold.

Bio Convergence is incredibly important to the Army of the Future because the future Soldier is the Bio. The Warrior of tomorrow’s Army will be given more responsibility, will be asked to do more, will be required to be more capable, and will face more challenges and complexities than ever before. These Soldiers must be able to quickly adapt, change, and connect to and disconnect from a multitude of networks – digital and otherwise – all while carrying out multiple mission-sets in an increasingly disrupted, degraded, and arduous environment marred by distorted reality, information warfare, and attacks of a personalized nature.

For additional information regarding this conference:

• Review the Lessons Learned from the Bio Convergence and Soldier 2050 Conference preliminary assessment.

• Read the entire Mad Scientist Bio Convergence and Soldier 2050 Conference Final Report.

• Watch the conference’s video presentations.

• See the associated presentations’ briefing slides.

• Check out the associated “Call for Ideas” writing contest finalist submissions, hosted by our colleagues at Small Wars Journal.