[Editor’s Note: Mad Scientist Laboratory welcomes guest blogger Dr. Stephen J. Cimbala, Distinguished Professor of Political Science at Penn State Brandywine, with his submission addressing the potentially destabilizing effects of cyber warfare on nuclear crisis management.
In Army of None, proclaimed Mad Scientist Paul Scharre recounts how, on September 26, 1983, Lt Col Stanislav Petrov, a Soviet Air Defense Forces officer serving in a bunker outside Moscow, was alerted to a U.S. missile launch by a recently deployed space-based early warning system. The Soviet watch officer trusted his “gut” – or experientially informed intuition – that this was a false alarm. His gut was right and the world was saved from an inadvertent nuclear exchange because this officer did not overly trust the system.
But what happens if we should lose trust in key elements of the U.S. National Command Authority during a period of heightened tensions? How vulnerable are we, during a crisis, to an adversary hacking into our nuclear command and control elements and introducing mis- or disinformation? With the additional stresses introduced by hypersonic weapons and, ironically, downsized inventories of nuclear weapons, the stakes have never been higher — Read on! (Please review this post via a non-DoD network in order to access all of the embedded links — Thank you!)]
Faster and more evasive offensive threats; vulnerable nuclear warning and Command, Control, and Communications (C3) systems; and advanced technology for cyberwar will complicate future efforts in nuclear crisis management. New technology for waging conflict in the cyber domain is only part of the problem. The principal danger for nuclear-strategic stability lies in the interactions between instruments for cyberwar and the sinews of nuclear decision making. During the Cold War and the first nuclear age, expectations about crisis management and deterrence stability were based on relatively static models of nuclear exchanges and “black box” assumptions about the decision-making processes of states and leaders. In the middle decades of the 21st century, software (including people and organizations) matters as much as, or more than, hardware. States’ efforts to approach the brink without crossing the nuclear threshold will depend upon their ability to fulfill the objective requirements for successful crisis management, as discussed herein, despite a new matrix of embedded uncertainties created by the information age.
Crisis management, including nuclear crisis management, is both a competitive and cooperative endeavor between military adversaries. A crisis is, by definition, a time of great tension and uncertainty. Threats are in the air, and time pressure on policymakers seems intense. Each side has objectives that it wants to attain and values that it deems important to protect. During a crisis, state behaviors are especially interactive and interdependent with those of another state. It would not be too far-fetched to refer to this interdependent stream of interstate crisis behaviors as a system, provided the term “system” is not understood as an entity completely separate from the state or individual behaviors that make it up. The system aspect implies reciprocal causation of the crisis behaviors of “A” by “B,” and vice versa.
One aspect of crisis management is the deceptively simple question: what defines a crisis as such? All crises are characterized to some extent by a high degree of threat, short time for decision, and a “fog of crisis” reminiscent of Clausewitz’s “fog of war” that confuses crisis participants about what is happening. Before the discipline of crisis management was ever invented by modern scholarship, historians had captured the rush-to-judgment character of much crisis decision-making among great powers. The influence of nuclear weapons on crisis decision-making is therefore not easy to measure or document because the avoidance of war can be ascribed to many causes. The presence of nuclear forces obviously influences the degree of destruction that can be done should crisis management fail. Short of that catastrophe, the greater interest of scholars is in how the presence of nuclear weapons might affect the decision-making process itself in a crisis. The problem is conceptually elusive: there are so many potentially important causal factors relevant to a decision with regard to war or peace. History is full of dependent variables in search of competing explanations.
Ironically, the downsizing of US and post-Soviet Russian strategic nuclear arsenals since the end of the Cold War, while a positive development from the perspectives of nuclear arms control and nonproliferation, makes the concurrence of cyber and nuclear attack capabilities more alarming. The supersized deployments of missiles, bombers, and weapons fielded by the Americans and Soviets during the Cold War had at least one virtue. Those arsenals provided so much redundancy against first-strike vulnerability that relatively linear systems for nuclear attack warning, command-control, and responsive launch under, or after, attack sufficed. At the same time, Cold War tools for military cyber mischief were primitive compared to those available now. In addition, countries and their armed forces were less dependent on the fidelity of their information systems for national security. Thus the reduction of US, Russian, and possibly other forces to the size of “minimum deterrents” might compromise nuclear flexibility and resilience in the face of kinetic attacks preceded or accompanied by cyber war. Moreover, although the mathematics of minimum deterrence would shrink the size of attackers’ as well as defenders’ arsenals, defenders with smaller forces might fear absolute losses more than relative ones and, therefore, be more prone to preemption-dependent strategies than defenders with larger forces.
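One way to read that arithmetic is sketched below with purely hypothetical numbers; the force sizes and the scale of the notional counterforce strike are illustrative assumptions, not estimates of any real arsenal or plan. The point is simply that the same absolute loss leaves a large force with an ample retaliatory reserve but strips a minimum deterrent nearly bare.

```python
# A purely hypothetical comparison of how the same absolute loss affects a
# large force versus a "minimum deterrent." Force sizes and the number of
# weapons destroyed are illustrative placeholders, not real-world figures.

def surviving_force(deployed_weapons, weapons_destroyed):
    """Return the surviving warheads and the fraction of the force lost."""
    survivors = max(deployed_weapons - weapons_destroyed, 0)
    loss_fraction = min(weapons_destroyed / deployed_weapons, 1.0)
    return survivors, loss_fraction

# The same notional counterforce strike (250 weapons destroyed) against two
# different force sizes.
for label, force_size in [("Cold War-sized force", 1500), ("minimum deterrent", 300)]:
    survivors, lost = surviving_force(force_size, weapons_destroyed=250)
    print(f"{label}: {survivors} weapons survive ({lost:.0%} of the force lost)")

# The large force keeps 1,250 weapons (17% lost); the minimum deterrent keeps
# only 50 (83% lost). The identical absolute loss is far more threatening to
# the smaller force, which is the logic behind greater preemption pressure.
```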
Offensive and defensive information warfare, as well as other cyber-related activities, are obviously very much on the minds of U.S. military leaders and others in the American and allied national security establishments. On the other hand, arms control for cyber is apt to run into daunting security and technical issues, even assuming a successful navigation of the political trust required for matters as sensitive as these. Of special significance is whether cyber arms control negotiators can certify that hackers within their own states are sufficiently under control to permit cyber verification and transparency. Both Russia and China reportedly use ad hoc, unofficial hackers to conduct operations that their governments prefer to keep officially deniable.
Cyberwar can also destroy or disrupt communication channels necessary for successful crisis management. One way it can do this is by disrupting communication links between policymakers and military commanders during a period of high threat and severe time pressure. Unanticipated problems, from the standpoint of civil-military relations, are possible under these conditions. Political leaders may have predelegated limited authority for nuclear release or launch under restrictive conditions: only when these few conditions obtain, according to the protocols of predelegation, would military commanders be authorized to employ nuclear weapons distributed within their command. Clogged, destroyed, or disrupted communications could prevent top leaders from knowing that military commanders perceived a situation to be far more desperate, and thus permissive of nuclear initiative, than it really was. For example, during the Cold War, disrupted communications between the U.S. National Command Authority and ballistic missile submarines, once the latter came under attack, could have resulted in a joint decision by submarine officers and crew to launch in the absence of contrary instructions.
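The civil-military hazard described above has a simple conditional structure, which the toy model below makes explicit. Everything in it (the condition names and the rule that predelegated authority matters only when contact with national leaders is lost) is invented for illustration and does not describe any actual procedure.

```python
# Toy model of the predelegation hazard discussed above. The conditions and
# the decision rule are entirely hypothetical and illustrative; they do not
# represent any real command-and-control protocol.

def commander_may_employ(predelegated, perceives_attack_underway, comms_with_nca_intact):
    """Hypothetical rule: predelegated authority matters only when the link to
    national leaders is lost AND the commander believes an attack is underway."""
    if comms_with_nca_intact:
        return False  # with communications intact, employment requires explicit orders
    return predelegated and perceives_attack_underway

# The danger: cyber disruption severs the link, so the commander's perception
# of the situation (which may be mistaken) becomes the deciding factor.
print(commander_may_employ(predelegated=True,
                           perceives_attack_underway=True,   # perception, possibly wrong
                           comms_with_nca_intact=False))     # link degraded by cyberattack
```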
Cyber warfare during a crisis will almost certainly increase the time pressure under which political leaders operate. It may do this literally, or it may affect the perceived timelines within which the policymaking process can make its decisions. U.S., Russian, and Chinese interest in hypersonic weapons, including the possible deployment of hypersonic boost-glide vehicles and hypersonic cruise missiles among strategic nuclear forces, has the potential to create additional stresses on already time-constrained capabilities for warning, attack assessment, response selection, and transmission of appropriate orders through the chain of command. Once either side sees parts of its command, control, and communications system being subverted by phony information or extraneous cyber-noise, its sense of panic at the possible loss of military options will be enormous. In the case of U.S. Cold War nuclear war plans, for example, disruption of even portions of the strategic command, control, and communications system could have prevented competent execution of parts of the strategic nuclear war plan. Cold War nuclear war plans depended upon finely orchestrated time-on-target estimates and precise damage expectancies against various classes of targets. Partially misinformed or disinformed networks and communications centers would have led to redundant attacks against the same target sets and, quite possibly, unplanned attacks on friendly military or civilian installations. Even in the post-Cold War world of flexible Nuclear Response Plans, the potential slide toward preemption, based on mistaken or exaggerated fears of command-control vulnerability, casts a shadow over deterrence stability.
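The time squeeze can be sketched as a simple, purely notional subtraction: take the warning time a given threat allows and deduct fixed costs for detection, attack assessment, conferencing, and transmission of orders. The durations below are placeholders chosen only to illustrate the compression, not estimates of any real system or weapon.

```python
# Notional decision-timeline sketch. Every duration here is a placeholder used
# only to illustrate how compressed warning times squeeze deliberation; none of
# these numbers describe actual systems, procedures, or weapons.

def minutes_left_to_deliberate(warning_minutes, detection=2, assessment=5,
                               conferencing=5, order_transmission=3):
    """Subtract notional process overhead from the available warning time."""
    overhead = detection + assessment + conferencing + order_transmission
    return max(warning_minutes - overhead, 0)

# A classic ~30-minute ballistic warning versus a shorter, later-detected
# profile of the kind hypersonic delivery might produce (placeholder value).
for label, warning in [("~30 minutes of warning", 30),
                       ("~15 minutes of warning", 15)]:
    print(f"{label}: {minutes_left_to_deliberate(warning)} minutes left to deliberate")
```

With the shorter warning time, the placeholder overhead alone consumes the entire window, which is exactly the kind of pressure that pushes decision-making toward preset responses and, potentially, preemption.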
If you enjoyed this post, review the comprehensive source document from which it was excerpted here
… and check out the following related posts:
An Appropriate Level of Trust…
The Future of the Cyber Domain
Warfare in the Parallel Cambrian Age, by Chris O’Connor
Setting the Army for the Future (Part I), by Gary Phillips
Insights from the Mad Scientist Weaponized Information Series of Virtual Events
Sub-threshold Maneuver and the Flanking of U.S. National Security, by Dr. Russell Glenn
The Need for Speed (and Maneuverability), by proclaimed Mad Scientist Seth Gnesin
>>>> REMINDER: Be sure to register for Mad Scientist’s next virtual event — Climate Change – Threats, Resilience, and Adaptation — on Tuesday, 13 April 2021 (1430-1600 EDT). Join our panelists:
- Dr. Elizabeth Chalecki, Associate Professor of International Relations, University of Nebraska – Omaha, and Wilson Fellow, Woodrow Wilson International Center for Scholars
- Dr. Anne Marie Baylouny, Associate Professor of National Security Affairs, Naval Postgraduate School
- Damarys Acevedo-Mackey, Environmental Engineer, U.S. Army Corps of Engineers
- Devabhaktuni “Sri” Srikrishna, Founder, PatientKnowhow.com
- … as they present their unique perspectives regarding Climate Change’s impact on the Operational Environment and the associated implications for the U.S. Army, and then answer questions from registered participants.
Check out this event’s 5W’s here, and then register here [via a non-DoD network] to participate in this informative event!
Stephen J. Cimbala is Distinguished Professor of Political Science at Penn State Brandywine. An award-winning Penn State teacher, Dr. Cimbala has authored numerous works in the fields of international security, nuclear arms control, and other topics. His publications include The United States, Russia and Nuclear Peace (Palgrave Macmillan, 2020).