9. Autonomy Threat Trends

The United States’ post-Cold War military superiority is eroding. Having witnessed U.S. military operations over the past three decades, our potential adversaries have realized that this superiority can be offset by game-changing advances in a number of technologies. During the Robotics, Artificial Intelligence & Autonomy Conference, hosted by the Georgia Tech Research Institute (GTRI) on 7-8 March 2017, Mad Scientists addressed autonomy threat trends that will challenge the United States (from both nation states and Violent Non-State Actors [VNSAs]) in the future Operational Environment.

Russia. The Russians view robotics, artificial intelligence, and autonomy technologies as key to the coming “Sixth Kondratiev Wave.” They have developed and are testing the URAN-9 Unmanned Ground Vehicle (UGV), an unmanned combat vehicle equipped with a 30mm cannon, a 7.62mm machine gun, and anti-tank and anti-aircraft missiles. At 9 tons, it is readily air-transportable and demonstrates the advantageous trade space opened when human survivability requirements are reduced or eliminated, shrinking system size, weight, and power (SWAP) requirements.

China. China is also investing heavily in advanced research, through both overseas purchases and impressive domestic research investments, particularly in the field of quantum computing – a potential breakthrough enabler for artificial intelligence. China has fielded armed UAVs and has also developed UGVs, such as the Snow Leopard 10, which can detect and detonate bombs. According to the Defense Science Board, “every major manufacturer for the Chinese military has a research center devoted to unmanned systems.”

VNSAs. As these technologies are increasingly embedded into our human infrastructure, a wide range of VNSA and super-empowered individual threats becomes increasingly feasible. Terrorists are traditionally conservative and imitative with respect to technology, but they are increasingly drawn to robotics, artificial intelligence, and autonomy for multiple reasons. Some groups have an ideological orientation towards technology – seeking either to leverage it or to undermine it. Many groups find existing methods insufficient to achieve their aims; these technologies “lengthen the levers of asymmetry.” VNSAs also need to circumvent protective measures, and these technologies make it more difficult to apprehend or kill their operatives. Their use reinforces the psychological impact of terrorism and enhances the competitive status of the group employing them. As these technologies permeate our infrastructure, a very high level of exploitable resources will be available (e.g., driverless cars), and the costs associated with adopting new technology are often low; the marginal cost to proliferate an AI capability through software replication, for example, is close to zero.

Some VNSAs are taking on complex engineering tasks and will train, hire, or kidnap the human capital they need to staff their R&D entities. The Syrian conflict has emerged as an innovation incubator that features VNSA use of drones, teleoperated rifles, remote gun turrets, and chemical weapons. Unburdened by acquisition bureaucracies, some adversaries are closing the technology gap faster than we are widening it.

Cyber and Dark Networks. Cyber networks are the “Battlespace” of “Cyber Agents” (i.e., software code that incorporates automation and artificial intelligence to act in the cyber domain). VNSAs frequently leverage Dark Networks, a component of cyberspace that is integral to the development of these technologies as threats. Global Dark Networks are mature, largely self-organizing, and include competing supply chains that circumvent regulatory controls.

“Algorithmic Warfare.” Conflict is extending below the platform level, below the platform component level, and even below the electronic chipset level as logic solutions compete “algorithm vs. algorithm.” Algorithms are the subtle ‘secret sauce’ that powers these technologies; their centrality underscores the need for robust STEM programs to ensure that the appropriate intellectual talent is available to devise the most innovative, effective, and efficient code.

Rogue Technology. The future will include the threat of rogue technology that is agile, high-velocity, complex, and networked, and that will “pop up” in unexpected, non-linear events. Future adversaries may be both human and AI. Although hotly debated, there are a number of pathways by which fully sentient artificial consciousness (strong AI) could be achieved in the 2030-2050 timeframe. Early detection and control might be the only available avenues to preclude existential rogue AI threats.

Mr. Alvin Wilby, Vice President of Research at Thales, the French defense firm, was reported by BBC News to have stated that the “genie is out of the bottle” with regard to smart technology:

“The technological challenge of scaling it up to swarms and things like that doesn’t need any inventive step…. It’s just a question of time and scale and I think that’s an absolute certainty that we should worry about.”

Questions for consideration:
• During the past 100 years, a number of efforts to contain or restrict arms proliferation have been consigned to the ash heap of history (e.g., the Washington Naval Treaty, the Convention on Certain Conventional Weapons). Are efforts to accomplish the same for autonomous weapons (i.e., last month’s Geneva Conference) similarly doomed?

• Tight controls on fissile materials have limited the proliferation of nuclear weapons. With the proliferation of code and dark networks, do you concur with Mr. Wilby’s statement that the “genie is out of the bottle,” especially with regard to VNSAs?

For more on this autonomy threat trend, see the presentation by Dr. Gary Ackerman, National Consortium for the Study of Terrorism and Responses to Terrorism, University of Maryland, delivered at the Robotics, Artificial Intelligence & Autonomy Conference at GTRI.
