(Editor’s Note: The Mad Scientist Laboratory is pleased to present the following guest blog post by Mr. Nick Marsella. If you are interested in submitting a guest post, please select “Guest Bloggers” from the menu above and review the submission instructions)
For more than a decade, I have been both an observer of and a participant in various efforts to examine the future [to include Mad Scientist events]. First, I have often been struck by how differently people approach the task; how they describe their work [using such terms as “predicting, assessing, speculating, or forecasting” – pick your favorite]; and the varying degrees of rigor in their analysis. Second, I have been astounded by the sheer volume of research and publications devoted to the future across the public and private sectors – and how often we seem to “rediscover” the same insights (e.g., the world is increasingly becoming urban).1
I suspect most people would agree that organizations must plan for the “near, mid, and far,” and that the farther out the “far” lies, the greater the uncertainty. As the chart below illustrates, “futures work” informs senior leaders’ decisions on the design of strategies and organizations and on the acquisition of technologies for the near and mid-term. While work on the “far future” carries greater uncertainty (and is often speculative), it informs leaders on where to invest resources for the research and development of ideas and technologies – which, in turn, helps shape the exploitation of future opportunities.
Unfortunately, futures work is not immune to three deadly sins: estimates or forecasts of future developments can fall prey to cognitive bias; the work sometimes lacks the clarity or specificity the decision maker needs; and the work is not tied to solving a problem or to shaping the future of an organization.
While there are many cognitive challenges to futures work, a brief survey of the literature suggests the most challenging are: hubris, lack of imagination/paradigm blindness, blind faith in trends, and mirror imaging.2
Hubris, or overconfidence, often translates into “this is the sole best solution or idea.” Hubris is also evident in overconfidence that an estimate will hold or that a technology will develop along a predictable time frame. One only needs to reflect on the cancellation of major programs, such as the Army’s Future Combat Systems (FCS), or on the catastrophic losses of two Space Shuttles, as illustrations that we sometimes place too much confidence in our ability to predict the rate of technology development or the resilience of that technology. Among the solutions to hubris are: remain open-minded; maintain a healthy skeptical attitude; consult a friend or a devil’s advocate to help red team the idea; rigorously challenge assumptions; and look for disconfirming information. Keep in mind the old adage – “every new technology begets a new vulnerability.”
Paradigm blindness leads us to accept what we already know as the answer, at the expense of considering or exploring other options. We must continually re-examine our answers and options – a practice that, as the historian Barbara Tuchman noted, is “as rare as rubies in the backyard.” The oft-cited quote from the New York Times editorial of 8 December 1903 illustrates this cognitive error: “A man-carrying airplane will eventually be built, but only if mathematicians and engineers work steadily for the next ten million years.” Nine days later the Wright Brothers completed their first flight.
While the study of trends has many useful purposes and is a methodology futurists often use, trends can be deceiving. As many have noted, the future is unknowable, and history – while valuable – is an imperfect guide.3 For example, it was a reliable trend that the Chicago Cubs would never win the World Series – until they did. Similarly, any technology remains useful and continues to develop only until it is replaced by something better. Moore’s Law – the observation that computing power doubles roughly every two years – is a trend that some think will end in the near future.
Mirror imaging occurs when we ascribe our own beliefs or ideas to our competitors. A corollary to mirror imaging is the concept of “railroading,” where we assume that competitors are developing technology at a similar pace and along the same track that we are. Mirror imaging places a premium on the notion that our way is the only way – discounting history and organizational, strategic, geographic, and cultural differences, as well as dismissing ideas that others might have.
Thinking about the future is hard work, requiring us to continually examine the rigor of these efforts and to avoid the cognitive biases inherent in futures work.
We must balance imagination with realism. We must also avoid the “sunk cost syndrome,” in which we become afraid to kill off less productive research, projects, or investments – which then, as one author noted, become “zombies”: absorbing resources, difficult to kill, taking on a life of their own.
As many senior leaders have noted, our predictions and assessments of the future will never be completely accurate; we can only aspire not to be too wrong. While this is true, and the future is always uncertain, our processes, mindsets, assumptions, and actions should not add to that uncertainty.
Nick Marsella is a retired Army Colonel and is currently a Department of the Army civilian serving as the Devil’s Advocate/Red Team for Training and Doctrine Command.
Part II of this blog post will examine the purposes of futures work and why efforts [like Mad Scientist] are important in informing senior leaders about the future and in driving informed decisions.
1 Duplication is not necessarily bad, since the work may provide insights from different vantage points or perspectives, and repetition of findings over time may add credibility to them and their conclusions. Yet all too often, efforts within an organization continue to rediscover the same insights over multiple years – resulting in continual admiration of the problem. In my view, “rediscovery” often results from the failure to conduct a comprehensive literature review, which includes identifying insights or lessons already captured.
2 This is not to dismiss other challenges – such as confirmation bias and poor qualitative or quantitative methodologies, among many others – that result in invalid conclusions. See the National Research Council of the National Academies report, Persistent Forecasting of Disruptive Technologies (2010), or the many Defense Science Board reports on future technology development and red teaming.
3 Gray, C.S. (2015). Executive Summary. Thucydides Was Right: Defining the Future Threat. Strategic Studies Institute and U.S. Army War College Press. Summary can be found here.
The views expressed are the author’s and do not reflect the official position of the Department of Defense, Department of the Army, or Training and Doctrine Command.