[Editor’s Note: Mad Scientist Laboratory is pleased to publish the first of a two-part series by returning guest blogger, Dr. Nick Marsella, addressing the duty we have to examine our assumptions about emergent warfighting technologies / capabilities and their associated implications to identify potential second / third order and evil effects. This critical, yet too frequently neglected responsibility enables us to identify and mitigate any associated vulnerabilities or undesirable effects, precluding them from being exploited by our competitors and adversaries — Enjoy!]
As the resident red teamer for a large military organization, I have long advocated for military planners and those involved in modernization to examine their assumptions and to identify the potential second/third order and “evil” effects in their plans, programs, and efforts. While the common use of the word “evil” means “profoundly immoral and wicked,” I use the term more broadly. “Evil,” in my usage for this essay, implies the unexpected and profound implications of a policy or the adoption of a technology – often with significant negative (but not necessarily immoral) consequences.
While many professionals in and out of the military would agree that we should examine our assumptions and potentially harmful implications – specifically in developing or adopting technologies or capabilities – I am frequently disappointed in how rarely we actually do it and in how shallowly we drill down into the details of a technology’s implications. Like doctors, staffs have a responsibility to do no harm; they should identify assumptions and 2nd/3rd order and “evil” effects, inform decision-makers, and incorporate these considerations into their cost/benefit analyses and risk estimates.
TAKING RESPONSIBILITY
I am comforted, however, by the increasing recognition of the importance of challenging our thinking and moving beyond a sole focus on the “perceived” benefits of a technology to considering its dark or evil potential effects.
In his commencement address to the Class of 2019 at Stanford University last June, Apple’s CEO, Tim Cook, offered the following thoughts:
– “Technology magnifies who we are, the good and the bad.”
– “If you want credit for the good, take responsibility for the bad” – highlighting the fact that Silicon Valley’s revolutionary inventions connecting people around the world have also enabled data breaches, privacy violations, hate speech, and fake news.
– “Too many seem to think that good intentions excuse harmful outcomes.”
Succinctly, Mr. Cook offered, “Taking responsibility means having the courage to think things through.”1 His remarks are worthy of our consideration.
IDENTIFY EFFECTS
While I am sure that Mr. Cook wouldn’t have us throw out our iPhones and Macs, neither am I recommending disregarding the advantages of modernity by returning to manual typewriters in lieu of laptops or returning to less sophisticated medical procedures (e.g., refuting the benefits of applying machine learning to CT scans, X-rays, and other procedures).2
However, as we incorporate automation, machine learning, elements of artificial intelligence, data analytics, the concept of the Internet of Things (IoT), and robotics into society and military operations, we should do it with our eyes wide open and with a sense of humility in our ability to foresee future implications.
Two examples amplify this point.
Many of us remember the first fielding of GPS devices in the 1990s, which enabled leaders to accurately and instantly determine their location. The advantages of this capability are many, but the costs included a dependency on technology to navigate from point to point; a reduction in Soldiers’ proficiency in map reading; and perhaps the loss of an appreciation and understanding of terrain. Now extend this growing dependency on technology and the network across the Army and Joint force – have we fully identified the implications, and do we have workarounds in place?3
Even simple uses of technology, such as student computer usage in the college classroom, have implications. Increasingly, faculty banish personal laptops and other electronic devices from their classrooms because of their distractive nature: rather than listening to a lecture or participating fully in a discussion or workshop, students connect with friends via social media or engage in other online activities. Moreover, recent studies indicate that “pen and paper” notetaking enhances learning.
In one formal study of the use of technology in the classroom and its effects on learning, researchers examined a sophomore introductory economics class at the United States Military Academy. The researchers divided the course sections into three random groups: in some sections, electronics were banned; in others, the use of laptops and other devices was allowed; the remaining sections were permitted to use only tablets, provided they were laid flat so professors could observe their use. All sections underwent the same instruction and testing; however, the students in the sections where electronics were allowed scored significantly lower on tests.4 Other studies and commentary back up this finding.5
In summary, while we should pursue and field technology that helps us accomplish our mission and improve lives, we must recognize the 2nd/3rd order effects. As I’ve highlighted before in this blog – “every new capability begets a new vulnerability.” As a corollary to this rule, and as noted historian Williamson Murray observed, “capabilities create dependencies, and dependencies create vulnerabilities.”6 We need to identify these effects and vulnerabilities before others do, while ensuring we keep an open mind to the potential “evil” effects.
If you enjoyed this post, please see Dr. Marsella’s other posts:
– Some Thoughts on Futures Work for the Military Professional (Parts I and II)
– First Salvo on “Learning in 2050” – Continuity and Change
Dr. Nick Marsella is a retired Army Colonel and is currently a Department of the Army civilian serving as the Devil’s Advocate/Red Team for the U.S. Army’s Training and Doctrine Command.
Disclaimer: The views expressed in this article do not imply endorsement by the U.S. Army Training and Doctrine Command, the U.S. Army, the Department of Defense, or the U.S. Government. This piece is meant to be thought-provoking and does not reflect the current position of the U.S. Army.
1 Cook, Tim. (2019, June 16). 2019 Commencement Address by Apple CEO Tim Cook to Stanford’s 128th Commencement. Retrieved from: https://news.stanford.edu/2019/06/16/remarks-tim-cook-2019-stanford-commencement.
2 Retrieved from: cs231n.stanford.edu/reports/2017/pdfs/527.pdf
3 TRADOC Pamphlet 525-3-8. U.S. Army Concept: Multi-Domain Combined Arms Operations at Echelons Above Brigade 2025-2045, pp. 73-74. The Army concept lists four major general risks to the implementation of the 100-page concept in a page and a half – namely: the future Army communications network may not fully support the concept; overreliance on technological capabilities; semi-fixed formations provide an illusion of permanency; and imprudent application of the mission command philosophy.
4 Dynarski, Susan M. (2017, August 10). For better learning in college lectures, lay down the laptop and pick up a pen. Brookings Report. Retrieved from: https://www.brookings.edu/research/for-better-learning-in-college-lectures-lay-down-the-laptop-and-pick-up-a-pen/
5 Lombrozo T. (2016, July 11). Is it time to ban computers from classrooms? NPR. Retrieved from: http://www.npr.org/sections/13.7/2016/07/11/485490818/is-it-time-to-ban-computers-from-classrooms. Also see: May, C. (2017, July 11). Students are Better Off without a Laptop in the Classroom. Scientific American. Retrieved from: https://www.scientificamerican.com/article/students-are-better-off-without-a-laptop-in-the-classroom/
6 Murray, Williamson. (2017). America and the Future of War: The Past as Prologue. Stanford, CA: Hoover Institution Press, p. 177.