80. “The Queue”

[Editor’s Note:  Mad Scientist Laboratory is pleased to present our August edition of “The Queue” – a monthly post listing the most compelling articles, books, podcasts, videos, and/or movies that the U.S. Army’s Training and Doctrine Command (TRADOC) Mad Scientist Initiative has come across during the past month. In this anthology, we address how each of these works either informs or challenges our understanding of the Future Operational Environment. We hope that you will add “The Queue” to your essential reading, listening, or watching each month!]

Gartner Hype Cycle / Source:  Nicole Saraco Loddo, Gartner

1. “5 Trends Emerge in the Gartner Hype Cycle for Emerging Technologies,” by Kasey Panetta, Gartner, 16 August 2018.

Gartner’s annual hype cycle highlights many of the technologies and trends explored by the Mad Scientist program over the last two years. This year’s cycle added 17 new technologies and organized them into five emerging trends: 1) Democratized Artificial Intelligence (AI), 2) Digitalized Eco-Systems, 3) Do-It-Yourself Bio-Hacking, 4) Transparently Immersive Experiences, and 5) Ubiquitous Infrastructure. Of note, many of these technologies have a 5–10 year horizon until the Plateau of Productivity. If this time horizon is accurate, we believe these emerging technologies and five trends will have a significant role in defining the Character of Future War in 2035 and should have modernization implications for the Army of 2028. For additional information on the disruptive technologies identified between now and 2035, see the Era of Accelerated Human Progress portion of our Potential Game Changers broadsheet.

[Gartner disclaimer:  Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.]

Artificial Intelligence by GLAS-8 / Source: Flickr

2. “Should Evil AI Research Be Published? Five Experts Weigh In,” by Dan Robitzski, Futurism, 27 August 2018.

The following rhetorical (for now) question was posed to the “AI Race and Societal Impacts” panel during last month’s Joint Multi-Conference on Human-Level Artificial Intelligence in Prague, Czech Republic:

“Let’s say you’re an AI scientist, and you’ve found the holy grail of your field — you figured out how to build an artificial general intelligence (AGI). That’s a truly intelligent computer that could pass as human in terms of cognitive ability or emotional intelligence. AGI would be creative and find links between disparate ideas — things no computer can do today.

That’s great, right? Except for one big catch: your AGI system is evil or could only be used for malicious purposes.

So, now a conundrum. Do you publish your white paper and tell the world exactly how to create this unrelenting force of evil? Do you file a patent so that no one else (except for you) could bring such an algorithm into existence? Or do you sit on your research, protecting the world from your creation but also passing up on the astronomical paycheck that would surely arrive in the wake of such a discovery?”

The panel’s responses ranged from controlling — “Don’t publish it!” and treat it like a grenade, “one would not hand it to a small child, but maybe a trained soldier could be trusted with it”; to the altruistic — “publish [it]… immediately” and “there is no evil technology, but there are people who would misuse it. If that AGI algorithm was shared with the world, people might be able to find ways to use it for good”; to the entrepreneurial — “sell the evil AGI to [me]. That way, they wouldn’t have to hold onto the ethical burden of such a powerful and scary AI — instead, you could just pass it to [me and I will] take it from there.”

While no consensus was reached, the panel discussion served as a useful exercise in illustrating how AI differs from previous eras’ game-changing technologies. Unlike nuclear, biological, and chemical weapons, AI lends itself to no internationally agreed-upon and implemented control protocols: there are no analogous gas centrifuges, fissile materials, or triggering mechanisms; no restricted-access pathogens; no proscribed precursor chemicals to control. Rather, when AGI is ultimately achieved, it is likely to consist of nothing more than diffuse code, a digital will-o’-the-wisp that can permeate across the global net to other nations, non-state actors, and super-empowered individuals, with the potential to facilitate unprecedentedly disruptive Information Operations (IO) campaigns and Virtual Warfare, revolutionizing human affairs. The West would be best served by emulating the PRC, with its Military-Civil Fusion Centers, and integrating the resources of the State with the innovation of industry to achieve its own AGI solutions soonest. The decisive edge will “accrue to the side with more autonomous decision-action concurrency on the Hyperactive Battlefield” — the best defense against a nefarious AGI is a friendly AGI!

Scales Sword Of Justice / Source: https://www.maxpixel.net/

3. “Can Justice be blind when it comes to machine learning? Researchers present findings at ICML 2018,” The Alan Turing Institute, 11 July 2018.

Can justice really be blind? The International Conference on Machine Learning (ICML) was held in Stockholm, Sweden, in July 2018. The conference explored the notion of machine learning fairness and proposed new methods to help regulators provide better oversight and help practitioners develop fair and privacy-preserving data analyses. As with the ethical discussions taking place within the DoD, there are rising legal concerns that commercial machine learning systems (e.g., those associated with car insurance pricing) might illegally or unfairly discriminate against certain subgroups of the population. Machine learning will play an important role in assisting battlefield decisions (e.g., the targeting cycle and commanders’ decisions), especially lethal decisions. There is a common misperception that machines will make unbiased and fair decisions, divorced from human bias. Yet the issue of machine learning bias is significant because humans, with their host of cognitive biases, write the very code that enables machines to learn and make decisions. Making the best, unbiased decisions will become critical in AI-assisted warfighting. We must ensure that machine-based learning outputs are verified and understood to preclude the inadvertent introduction of human biases. Read the full report here.
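
As a concrete illustration of what verifying machine-based learning outputs for subgroup bias can look like, the minimal sketch below compares a model’s favorable-decision rates across two groups (a simple demographic-parity check). This is our own hedged illustration in Python, not a method or data drawn from the ICML 2018 papers; the predictions, group labels, and the notion of what counts as a “large” gap are hypothetical placeholders.

```python
# Minimal sketch of a demographic-parity audit (illustrative only; the data below
# are hypothetical and not drawn from the ICML 2018 papers).
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Largest difference in favorable-decision rates between any two subgroups."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        favorable[group] += int(pred)
    rates = {g: favorable[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical audit: model decisions (1 = favorable) for members of two subgroups.
preds  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
gap, rates = demographic_parity_gap(preds, groups)
print(rates)                      # {'A': 0.6, 'B': 0.4}
print(f"parity gap: {gap:.2f}")   # 0.20 -- a large gap flags the model for closer human review
```

A check like this is only a first filter: different fairness definitions can conflict with one another, so even verified and understood outputs still require human judgment about which notion of fairness applies to a given decision.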

Robot PNG / Source: pngimg.com

4. “Uptight robots that suddenly beg to stay alive are less likely to be switched off by humans,” by Katyanna Quach, The Register, 3 August 2018.

In a study published in PLOS ONE, researchers found that a robot’s personality affected a human’s decision-making. In the study, participants were asked to dialogue with a robot that was either sociable (chatty) or functional (focused on the task). At the end of the study, the researchers let the participants know that they could switch the robot off if they wanted to. At that moment, the robot would make an impassioned plea to the participant to resist shutting it down, and the participants’ actions were then recorded. Unexpectedly, a large number of participants resisted shutting down the functional robots after they made their plea, as opposed to the sociable ones. This is significant: beyond the unexpected result, it shows that decision-making is affected by robotic personality. Humans will form emotional connections to artificial entities that mimic and emulate human behavior, despite knowing they are robotic. If the Army believes its Soldiers will be accompanied and heavily augmented by robots in the near future, it must also understand that human-robot interaction will not be the same as human-computer interaction. The U.S. Army must explore how to attain the appropriate level of trust between Soldiers and their robotic teammates on the future battlefield. Robots must be treated more like partners than tools, with trust, cooperation, and even empathy displayed.

IoT / Source: Pixabay

5. “Spending on Internet of Things May More Than Double to Over Half a Trillion Dollars,” by Aaron Pressman, Fortune, 8 August 2018.

While the advent of the Internet brought computing and communication ever deeper into households around the globe, the smartphone revolution brought about the concept of constant personal interconnectivity. Today and into the future, not only are humans connected to the global commons via their smart devices, but a multitude of devices, vehicles, and accessories are being integrated into the Internet of Things (IoT). We have previously addressed the IoT as a game-changing technology. The IoT is composed of trillions of internet-linked items, creating both opportunities and vulnerabilities, and there has been explosive growth in low Size, Weight, and Power (SWaP) connected devices (the Internet of Battlefield Things), especially for sensor applications (situational awareness).

Large companies are expected to quickly grow their spending on Internet-connected devices (e.g., appliances, home devices such as Google Home and Alexa, and various sensors) to approximately $520 billion. This is a massive investment in what will likely become the Internet of Everything (IoE). While growth is currently focused on known devices, it will likely expand to embedded and wearable sensors – think clothing, accessories, and even sensors and communication devices embedded within the human body. This has two major implications for the Future Operational Environment (FOE):

– The U.S. military is already struggling to balance collecting, organizing, and using critical data; allowing service members to use personal devices; and maintaining operational and network security and integrity (see the recent banning of personal fitness trackers). A segment of IoT sensors and devices may be necessary or critical to the function and operation of many U.S. Armed Forces platforms and weapons systems, raising critical questions about supply chain security, system vulnerabilities, and reliance on micro sensors and microelectronics.

– The U.S. Army of the future will likely have to operate in and around dense urban environments, where IoT devices and sensors will be abundant, degrading the blue force’s ability to sense the battlefield and “see” the enemy, and creating a veritable needle in a stack of needles.

6. “Battlefield Internet: A Plan for Securing Cyberspace,” by Michèle Flournoy and Michael Sulmeyer, Foreign Affairs, September/October 2018. Review submitted by Ms. Marie Murphy.

With the possibility of a “cyber Pearl Harbor” becoming increasingly likely, intelligence officials warn of the rising danger of cyber attacks. The effects of these attacks have already been felt around the world. They have the power to break the trust people have in institutions, companies, and governments, as they act in the undefined gray zone between peace and all-out war. The military implications are clear: cyber attacks can cripple the military’s ability to function, from command and control to intelligence, communications, and materiel and personnel networks. Beyond the military and government, private companies’ use of the internet must be accounted for when discussing cyber security. Some companies have already felt the effects of cyber attacks, while others remain reluctant to invest in cyber protection measures. In this way, civilians are affected by acts of cyber warfare, and attacks on a country may be directed not at the opposing military but at the civilian population of a state, as in the power and utility outages seen in Eastern Europe. Any actor with access to the internet can inflict damage, and anyone connected to the internet is vulnerable to attack, so public-private cooperation is necessary to combat cyber threats most effectively.

If you read, watch, or listen to something this month that you think has the potential to inform or challenge our understanding of the Future Operational Environment, please forward it (along with a brief description of why its potential ramifications are noteworthy to the greater Mad Scientist Community of Action) to our attention at:  usarmy.jble.tradoc.mbx.army-mad-scientist@mail.mil — we may select it for inclusion in our next edition of “The Queue”!

77. “The Tenth Man” — Russia’s Era Military Innovation Technopark

[Editor’s Note: Mad Scientist Laboratory is pleased to publish the second in our series of “The Tenth Man” posts (read the first one here). This Devil’s Advocate or contrarian approach serves as a form of alternative analysis and is a check against groupthink and mirror imaging. The Mad Scientist Laboratory offers it as a platform for the contrarians in our network to share their alternative perspectives and analyses regarding the Future Operational Environment.

Today’s post is by guest blogger Mr. Ray Finch addressing Russia’s on-going efforts to develop a military innovation center —  Era Military Innovation Technopark — near the city of Anapa (Krasnodar Region) on the northern coast of the Black Sea.  Per The Operational Environment and the Changing Character of Future Warfare, “Russia can be considered our ‘pacing threat,’ and will be our most capable potential foe for at least the first half of the Era of Accelerated Human Progress [now through 2035]. It will remain a key adversary through the Era of Contested Equality [2035-2050].” So any Russian attempts at innovation to create “A Militarized Silicon Valley in Russia” should be sounding alarms throughout the NATO Alliance, right?  Well, maybe not….]

(Please note that several of Mr. Finch’s embedded links in the post below are best accessed using non-DoD networks.)

Only a Mad Russian Scientist could write the paragraph below:

Russia Resurgent, Source: Bill Butcher, The Economist

If all goes according to plan, in October 2035 the Kremlin will host a gala birthday party to commemorate President Putin’s 83d birthday. Ever since the Russian leader began receiving special biosynthetic plasma developed by military scientists at the country’s premier Era Technopolis Center in Anapa, the president’s health and overall fitness have come to resemble those of a 45-year-old. This development was just one in a series of innovations which have helped to transform – not just the Kremlin leader – but the entire country. By focusing its best and brightest on new technologies, Russia has become the global leader in information and telecommunication systems, artificial intelligence, robotic complexes, supercomputers, technical vision and pattern recognition, information security, nanotechnology and nanomaterials, energy tech and the technology life-support cycle, as well as bioengineering, biosynthetic, and biosensor technologies. In many respects, Russia is now the strongest country in the world.

While this certainly echoes the current Kremlin propaganda, a more sober analysis regarding the outcomes of the Era Military Innovation Technopark in Anapa (Krasnodar Region) ought to consider those systemic factors which will likely retard its future development. Below are five reasons why Putin and Russia will likely have less to celebrate in 2035.

President Putin and Defense Minister Shoigu being briefed on Technopark-Era, Kremlin, 23 Feb 2018. Source: http://kremlin.ru/events/president/news/56923, CC BY 4.0.

You can’t have milk without a cow

The primary reason that the Kremlin’s attempt to create breakthrough innovations at the Era Technopark will end in disappointment stems from the lack of a robust social structure to support such innovation. And it’s not simply the absence of good roads or adequate healthcare. As the renowned MIT scientist Dr. Loren R. Graham recently pointed out, the Kremlin leadership wants to enjoy the “milk” of technology without worrying about maintaining the system needed to support a “cow.” Graham elaborates on this observation by pointing out that even though Russian scientists have often been at the forefront of technological innovation, the country’s poor legal system prevents these discoveries from ever bearing fruit. Stifling bureaucracy and a broken legal system prevent Russian scientists and innovators from profiting from their discoveries. This dilemma leads to the second factor.

Brain drain

Despite all of the Kremlin’s patriotic hype over the past several years, many young and talented Russians are voting with their feet and pursuing careers abroad. As the senior Russia analyst Dr. Gordon M. Hahn has noted, “instead of voting for pro-democratic forces and/or fomenting unrest, Russia’s discontented, highly educated, highly skilled university graduates tend to move abroad to find suitable work.” And even though the US is maligned on a daily basis in the Kremlin-supported Russian media, many of these smart, young Russians are moving to America. Indeed, according to a recent Radio Free Europe/Radio Liberty (RFE/RL) report, “the number of asylum applications by Russian citizens in the United States hit a 24-year high in 2017, jumping nearly 40 percent from the previous year and continuing an upward march that began after Russian President Vladimir Putin returned to the Kremlin in 2012.” These smart, young Russians believe that their country is headed in the wrong direction and are looking for opportunities elsewhere.

Everything turns out to be a Kalashnikov

There’s no doubt that Russian scientists and technicians are capable of creating effective weapon systems. President Putin’s recent display of military muscle was not a mere campaign stratagem, but rather a reminder to his Western “partners” that, since Russia remains armed to the teeth, his country deserves respect. And there’s little question that the new Era Technopark will help to create advanced weapon systems of which “there is no analogous version in the world.” But that’s just the point. While Russia is famous for its tanks, artillery, and rocket systems, it has struggled to create anything that might qualify as a technological marvel in the civilian sector. As some Russian observers have put it, “no matter what the state tries to develop, it ends up being a Kalashnikov.”

Soviet AK-47. Type 2 made from 1951 to 1954/55. Source: http://www.dodmedia.osd.mil Public Domain

The Boss knows what’s best

The current Kremlin leadership now parades itself as being at the forefront of a global conservative and traditional movement. In their favorite narrative, the conniving US is forever trying to weaken Russia (and other autocratic countries) by infecting them with a liberal bacillus, often referred to as a “color revolution.” In their rendition, Russia was contaminated by this democratic disease during the 1990s, only to find itself weakened and taken advantage of by America.

Since then, the Kremlin leadership has retained the form of democracy but removed its essence. Elections are held, ballots are cast, but the winner is pre-determined from above. So far, the Russian population has played along with this charade, but at some point, perhaps during an economic crisis, the increasingly plugged-in Russian population might demand a more representative form of government. Regardless, while this top-down, conservative model is ideal for maintaining control and staging major events, it lacks the essential freedom that innovation requires. Moreover, such a quasi-autocratic system tends to promote Russia’s most serious challenge.

The cancer of corruption

Despite the façade of a uniformed, law-governed state, Russia continues to rank near the bottom of the global corruption index. According to a recent Russian report, “90 percent of entrepreneurs have encountered corruption at least once.” Private Russian companies will likely think twice before deciding to invest in the Era Technopark, unless, of course, the Kremlin makes them an offer they cannot refuse. Moreover, as suggested earlier, the young Era scientists may not be fully committed, understanding that the “milk” of their technological discoveries will likely be expropriated by their uniformed bosses.

Technopark Era is not scheduled to be fully operational until 2020, and the elevated rhetoric over its innovative mandate will likely prompt concern among some US defense officials. While the center could advance Russian military technology over the next 15-25 years, it is doubtful that Era will usher in a new era for Russia.

If you enjoyed this edition of the “Tenth Man”:

– Learn more about Russia’s Era Military Innovation Technopark in the April 2018 edition of the TRADOC G-2’s Foreign Military Studies Office (FMSO) OE Watch, Volume 8, Issue 4, pages 10-11.

– Read Mad Scientist Sam Bendett‘s guest blog post on Russian Ground Battlefield Robots: A Candid Evaluation and Ways Forward.

Ray Finch works as a Eurasian Analyst at the Foreign Military Studies Office. He’s a former Army officer (Artillery and Russian FAO).


43. The Changing Character of Warfare: Takeaways for the Future

The Future Operational Environment (OE), as described in The Operational Environment and the Changing Character of Future Warfare, brings with it an inexorable series of movements that leads us to consider the following critical question:

What do these issues mean for the nature and character of warfare?

The nature of war, which has remained relatively constant from Thucydides, through Clausewitz, through the Cold War, and on into the present, certainly remains constant through the Era of Accelerated Human Progress (i.e., now through 2035). War is still waged because of fear, honor, and interest, and remains an expression of politics by other means. However, as we move into the Era of Contested Equality (i.e., 2035-2050), the character of warfare will change in several key areas:

The Moral and Cognitive Dimensions are Ascendant.

The proliferation of high technology, coupled with the speed of human interaction and pervasive connectivity, means that no one nation will have an absolute strategic advantage in capabilities. When breakthroughs occur, the advantages they confer will be fleeting, as rivals quickly adapt. Under such conditions, the physical dimension of warfare may become less important than the cognitive and the moral. As a result, some powers will place fewer self-imposed restrictions on the use of military force, and hybrid strategies involving information operations; direct cyber-attacks against individuals, segments of populations, or national infrastructure; terrorism; the use of proxies; and Weapons of Mass Destruction (WMD) will aim to prevail against an enemy’s will.

Integration across the Diplomatic, Information, Military, and Economic (DIME) Elements of National Power.

Clausewitz’s timeless dictum that war is policy by other means takes on new importance as the distance between war and policy recedes; policy must also take into account the other elements of national power to form true whole-of-government and, when possible, collective security approaches to national security issues. The interrelationships across the DIME will require closer integration across all elements of government, and Joint decision-making bodies will need to quickly and effectively deliver DIME effects across the physical, cognitive, and moral dimensions. Military operations are an essential element of this equation, but may not necessarily be the decisive means of achieving an end state.

Limitations of Military Force.

While mid-Century militaries will have more capability than at any time in history, their ability to wage high-intensity conflict will become more limited. Force-on-force conflict will be so destructive, will be waged at the new speed of human and AI-enhanced interaction, and will occur at such extended ranges that exquisitely trained and equipped forces facing a peer or near-peer rival will rapidly suffer significant losses in manpower and equipment that will be difficult to replace. Robotics, unmanned vehicles, and man-machine teaming activities offer partial solutions, but warfare will still revolve around increasingly vulnerable human beings. Military forces will need to consider how advances in AI, bio-engineering, man-machine interfaces, neuro-implanted knowledge, and other areas of enhanced human performance and learning can quickly help reduce the long lead time in training and developing personnel.

The Primacy of Information.

In the timeless struggle between offense and defense, information will become the most important and most useful tool at all levels of warfare. Actors will increasingly be able to use information to target an enemy’s will without necessarily having to address its means. In the past, nations have tried to target an enemy’s will through kinetic attacks on its means – the enemy military – or by directly targeting the will through attacks on the national infrastructure or a national populace itself. Sophisticated, nuanced information operations, taking advantage of the ability to directly target an affected audience through cyber operations or other forms of influence, and reinforced by a credible, capable armed force, can bend an adversary’s will before battle is joined.

Expansion of the Battle Area.

Nations, non-state actors, and even individuals will be able to target military forces and civilian infrastructure at increasing – often intercontinental – ranges using a host of conventional and unconventional means. A force deploying to a combat zone will be vulnerable from the individual soldier’s personal residence, to his or her installation, and throughout his or her entire deployment. Adversaries also will have the ability to target or hold at risk non-military infrastructure and even populations with increasingly sophisticated, nuanced, and destructive capabilities, including WMD, hypersonic conventional weapons, and, perhaps most critically, cyber weapons and information warfare. WMD will not be the only threat capable of directly targeting and even destroying a society, as cyber and information attacks can directly target infrastructure, banking, food supplies, power, and general ways of life. Limited wars focused on a limited area of operations waged between peers or near-peer adversaries will become more dangerous, as adversaries will have an unprecedented capability to broaden their attacks to their enemy’s homeland. The U.S. Homeland likely will not avoid the effects of warfare and will be vulnerable in at least eight areas.

Ethics of Warfare Shift.

Traditional norms of warfare, definitions of combatants and non-combatants, and even what constitutes military action or national casus belli will be turned upside down and remain in flux at all levels of warfare.

– Does cyber activity, or information operations aimed at influencing national policy, rise to the level of warfare?

– Is using cyber capabilities to target a national infrastructure legal, if it has broad societal impacts?

– Can one target an electric grid that supports a civilian hospital, but also powers a military base a continent away from the battle zone from which unmanned systems are controlled?

– What is the threshold for WMD use?

– Is the use of autonomous robots against human soldiers legal?

These and other questions will arise, and likely will be answered differently by individual actors.

The changes in the character of war by mid-Century will be pronounced, and they are directly related and traceable to our present. The natural progression of these changes may be a change in the nature of war itself, perhaps towards the end of the Era of Contested Equality or in the second half of the Twenty-First Century.

For additional information, watch the TRADOC G-2 Operational Environment Enterprise’s The Changing Character of Future Warfare video.