In April 2019, international space agencies participating in the Planetary Defense Conference launched an exercise designed to simulate the response to an asteroid strike on Earth. The exercise is the result of increasing efforts over the past two decades to improve humanity's chances of surviving an extinction-level event caused by the impact of a large space object, similar to the one considered to have led to the mass extinction of the dinosaurs. Commenting on the purpose of these international agencies, Rüdiger Jehn, Head of Planetary Defense for the European Space Agency (ESA), said, "The first step in protecting our planet is knowing what’s out there. Only then, with enough warning, can we take the steps needed to prevent an asteroid strike altogether, or to minimise the damage it does on the ground.”
Our interest in these events stems from the lessons we believe they offer to organizations striving to improve their own ability to survive existential threats through effective Strategic Risk Governance and Management. As Mr. Jehn observes, the first step is to “know what is out there” – in other words, to find the “known unknowns” that may represent a threat. This corresponds to the essential process of Anticipation. We urge our clients to search the “Realm of Ignorance” and “forage for surprise” – to actively look for the few key Strategic Risks* that should concern them.
NASA’s Center for Near Earth Object Studies (CNEOS) was charged by Congress in 2005 with finding 90% of the objects whose size, distance from Earth, and trajectory make them a potential threat to the planet. Since then, CNEOS funding has risen from $4m to nearly $200m p.a. as the number of such objects found has grown. 20,000 have been identified to date, and the number is still growing at about 150 per month. Without such active search, humankind would still be living in profound ignorance of the potential scale of the threat from these space objects – the same kind of profound ignorance that many boards of directors experience when they rely on imperfect strategic assumptions and imperfect information about risk, fostering their Strategic Risk blindness.
Once potential threats have been identified, there is still great uncertainty about whether or when they will occur. NASA and other agencies seek to identify two parameters: the date of potential impact and the probability of occurrence. At first sight this may seem analogous to the approach taken in companies' conventional risk register assessments. This is a mistake – when NASA computes a probability of impact, it bases the calculation on the known measurement imperfections in an object's size, position, and trajectory. In other words, there is a robust basis for a computed probability. This is not the case for the overwhelming majority of Strategic Risks identified by businesses; these are uncertainties, frequently a manifestation of the complex economic environment in which businesses operate, for which, as John Maynard Keynes observed in 1937, “…there is no scientific basis on which to form any calculable probability.”
The clue to the relevant Assessment of Strategic Risk lies in Mr. Jehn’s comment: “…with enough warning”. What matters for boards is not a subjective and unreliable probability but when the threat might materialise. It might be argued that assessing the time to impact of an asteroid is as difficult as assessing the probability of impact. This is true – for asteroids. But when businesses focus on the expected time to an event threshold for a Strategic Risk, there is much more data and evidence that can be applied to make a best estimate. Moreover, by applying the lessons learned from the Good Judgement project**, the estimation algorithm we use is amenable to update and adjustment over time. This is just what NASA, ESA and other agencies do to improve their estimates of time to, and probability of, impact. The key metric remains time, since time is what either facilitates or limits any potential response to the threat. Moreover, if the observed time to event starts to shorten at an accelerating rate, this is a vital indicator of the need for action.
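One simple way to make the "rate of change of the observed time to event" concrete is to compare the latest revision of a time-to-event estimate with the running trend of previous revisions. This is an illustrative sketch only, not the estimation algorithm described above; the review series, interval, and threshold are invented for the example:

```python
def average_revision(estimates):
    """Mean change per review in a chronological series of
    time-to-event estimates taken at equal review intervals."""
    deltas = [b - a for a, b in zip(estimates, estimates[1:])]
    return sum(deltas) / len(deltas)

# Invented example: estimated years until a threat materialises,
# revised at four successive quarterly reviews.
history = [8.0, 7.5, 6.5, 4.5]
latest = history[-1] - history[-2]   # -2.0 years lost at the last review
trend = average_revision(history)    # about -1.17 years lost per review
if latest < trend:
    # the estimate is shrinking faster than its own trend
    print("time to event is accelerating: escalate for action")
```

The point of the sketch is that the signal for action is not the estimate itself but its second difference: a board reviewing a threat quarterly can act on acceleration long before the estimate reaches zero.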
The action required for Adaptation to, or mitigation of, the effects of an observed threat is clearly determined by its nature and severity. For NASA and ESA such actions are “… the steps needed to prevent an asteroid strike altogether, or to minimise the damage it does on the ground.” For business organizations, having strategies that are robust to their assumptions (i.e. that may survive those assumptions being wrong), or having built resilience, for example by maintaining key strategic reserves, are among the methods that can be employed to adapt to an anticipated threat. The lessons from corporate failures nearly always demonstrate that a failure to adapt in time to an emerging existential threat underlies their eventual demise. It is just such a failure that NASA, ESA and others hope to avoid by running their simulation exercises.
When boards and executive teams apply these proven Anticipation, Assessment and Adaptation processes to their own Strategic Risks, they may, like NASA and ESA, avoid an Extinction Level Event!
* We define Strategic Risk as any event with the potential to cause a serious adverse effect on strategic goals, up to and including the death of the corporation.
** Tom Coyne was a member of the winning team in the Intelligence Advanced Research Projects Activity’s forecasting tournament, as described by Philip Tetlock and Dan Gardner in their book “Superforecasting”
Now in its seventh year, the Protiviti/NC State University survey is particularly interesting because it reports the views of board directors and C-Suite executives from around the world (this year there were 825 respondents).
The survey asks participants to rate (on a ten point scale) the potential impact on their company of 30 pre-selected risk issues over the next year.
Risks are further divided into three categories: macroeconomic risks to potential growth opportunities, risks to the validity of the current corporate strategy for pursuing those opportunities, and operational risks to the implementation of that strategy.
One of this year's key findings was that, on average, board members reported a higher potential risk impact on their organizations than CEOs, who in turn reported a higher average risk impact than their respective management teams. This isn't surprising, given that CEO and management team incentives are much more heavily skewed towards achieving success than avoiding failure, compared with board members' incentives. There may also be an experience factor at work, whereby board members with more years of experience (and accumulated scar tissue) are more attuned to the dangers that lurk in uncertainty and ignorance than less experienced managers, who are naturally more focused on risks they believe they can control.
Here is the list of the survey's top risks for 2019:
(1) "Existing operations meeting performance expectations, competing against 'born digital' firms."
(2) "Succession challenges and the ability to attract and retain top talent."
(3) "Regulatory changes and regulatory scrutiny."
(4) "Cyber threats."
(5) "Resistance to change operations."
(6) "Rapid speed of disruptive innovation and new technologies."
(7) "Privacy/identity management and information security."
(8) "Inability to utilize analytics and big data."
(9) "Organization culture may not sufficiently encourage timely identification and escalation of risk issues."
(10) "Sustaining customer loyalty and retention."
At Britten Coyne Partners, we use a number of different frameworks and methodologies to help clients better anticipate, more accurately assess, and adapt in time to emerging threats. It is interesting to use them to evaluate the "top risks" identified in this survey.
One of these methods divides threats into four categories, based on the location and likely increasing severity of their impact, because of the relative difficulty in adapting to them. These categories include: (a) the competitiveness of a firm's value proposition; (b) the size of its served and potential market; (c) its business model design and economics; and (d) the social, economic, national security, and political context in which a company exists and competes.
Five of the survey's "top risks" seem to be in the value proposition category:
(1) Sustaining customer loyalty and retention
(2) Existing operations meeting performance expectations, competing against 'born digital' firms.
(3) Inability to use analytics and big data
(4) Privacy/Identity management and information security
(5) Cyber threats
Arguably, the last two might also have a negative impact on the overall size of a potentially served market (e.g., if rising privacy and identity protection concerns caused a whole market to shrink). That category might also include "regulatory change and regulatory scrutiny."
Five risks seem to represent threats to business model design and economics:
(6) Regulatory change and regulatory scrutiny.
(7) Rapid speed of disruptive innovations and new technologies.
(8) Resistance to change operations.
(9) Succession challenges and ability to attract and retain top talent.
(10) Organization culture may not sufficiently encourage timely identification and escalation of risk issues.
It is interesting that no risks from the social, economic, national security, and political context made the top ten, as such factors are key drivers of many of the risks that did. However, this may well have been due to the survey limiting macro risks to those that affect potential growth opportunities.
Another way to categorize these top ten risks is by the nature of the organizational issues that underlie them, including timely anticipation of emerging threats, accurate assessment of their potential consequences and likely speed of development, and the ability to adapt to them in time.
Fears of inadequate organizational ability to anticipate emerging threats likely underlie "rapid speed of disruptive innovations and new technologies," "cyber threats," "privacy/identity management and information security," "regulatory changes and regulatory scrutiny," and "meeting performance expectations when competing against born digital firms."
Anxiety about the accuracy and timeliness of assessments of emerging threats is indicated by "organization's culture may not encourage the timely identification and escalation of risk issues", "inability to use analytics and big data", and also concerns with "sustaining customer loyalty and retention."
Worries about a company's ability to adapt in time to emerging threats are clear in "timely escalation of risk issues", "resistance to change operations", and "succession challenges and ability to attract and retain top talent."
Last but not least, it is critical to note that directors' and executives' top ten risks are not risks at all, in the classical sense of discrete events whose historical frequencies can be observed, future probabilities of occurrence measured, and potential negative consequences priced and transferred to others (e.g., via insurance or financial derivative contracts).
Rather, they reflect a combination of uncertainties (about the nature, likelihood, timing, and impact of potential threats), and/or concerns about the potential extent of one's ignorance (e.g., about future regulatory changes or the speed of disruptive innovations and new technologies).
As always, the Protiviti/NC State survey provides a good overview of what risks most worry directors and management teams, and why they are important. But that is only the starting point.
Just as important, and far more difficult, are the challenges of how to better anticipate and more accurately assess the threats they pose, and then adapt to them in time. The good news for boards and management teams is that for the past seven years, this has been the focus of our work at Britten Coyne Partners.
The root causes of this tendency to treat uncertainties as insurable risks are undoubtedly complex, but they no doubt include our exposure to statistics courses and to insurance concepts, like definable hazards whose potential negative impact can be mitigated (for a price) by transferring them to others.
Our evolutionary past is also to blame. Long before writing and mathematics appeared, we used stories to explain the past and anticipate the future -- and stories are usually focused on people and events with emotional power that cause us to retain them (and the lessons they contain) in our individual and collective memory.
Yet one of the great lessons of history is that it usually isn't the occurrence of discrete events that sinks companies and countries. To be sure, such events are often cited as the proximate cause of failure. But in reality, they mark the end of a much longer process, involving interacting trends, decisions, and randomness, from which new threats emerge, evolve, and sometimes reach a critical threshold, producing the events that cause catastrophic failures.
Put differently, it is the continuous variables in a system that should attract our interest, not just the discrete ones; we should focus on emergence, not just occurrence. In turn, this requires that we adopt new mental models that seek to understand and adapt to uncertainty, not just risk -- that are focused on estimating the remaining time before a critical system threshold is reached, and not just the probability that an event will occur.
In truth, these concepts are actually related, even though we often fail to see them as such. Consider, for example, a simple system model that does not evolve over time, in which there is a 5% probability each year that a given event of magnitude X will occur. What we usually fail to consider is how that probability increases as the time horizon lengthens. Over five years, there is a 23% chance the event will occur; over 10 years, a 40% chance, and over 20, a 64% chance.
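The arithmetic behind these figures is the complement of the event never occurring: over n years, the cumulative probability is 1 − (1 − p)^n. A minimal sketch in Python (the 5% annual probability is the article's illustrative assumption):

```python
def cumulative_probability(annual_p: float, years: int) -> float:
    """Probability that an event with a constant annual probability
    of occurrence happens at least once within the given horizon."""
    return 1 - (1 - annual_p) ** years

for horizon in (5, 10, 20):
    p = cumulative_probability(0.05, horizon)
    print(f"over {horizon} years: {p:.0%}")  # 23%, 40%, 64%
```

Even this static model shows why a one-year risk-assessment horizon understates the danger; the article's next point is that real systems are worse, because the annual probability itself evolves.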
In practice, however, the challenge is usually much greater, because the complex socio-technical systems that produce emergent threats and catastrophic events are themselves constantly evolving. Most of the time, we operate in the realm of true uncertainty, not risk, and the mental models we use to make sense of the world too often fail to recognize this critical distinction.
The lesson of this short story is this: If our goal is to avoid catastrophic failures, we must constantly struggle to repress our natural tendency to focus only on the probability of discrete events occurring within the next year, and instead pay much more attention to the continuous interaction of forces within our world that give rise to the emergent threats which pose the greatest danger to the survival and success of both organizations and investment strategies.
After 80-plus years of combined experience, we at Britten Coyne Partners have arrived at a definition of strategy that is broad enough to apply to a wide range of public, private, and non-profit sector organizations, from the smallest to the largest scale:
"Strategy is a causal theory that exploits one or more decisive asymmetries to achieve an organization's most important goals with limited resources, in the face of uncertainty, constraints, and opposition."
For the past eight years, our focus has been on the “in the face of uncertainty” part of this definition.
As our world has become more complex and connected, uncertainty has increased exponentially, making strategy vastly more challenging.
For example, processes that have a social aspect tend to produce a power law distribution of outcomes, due to our human tendencies towards imitation and conformity in the face of uncertainty. Because these processes themselves heighten uncertainty, highly connected, complex social systems naturally evolve to a so-called critical state, where small changes can quickly produce very large effects.
Three recommendations are often made to manage this uncertainty.
The first is to ensure that a strategy is robust enough to achieve its goals under a wide range of possible future conditions.
The second is to build sufficient organizational and business model resiliency to ensure survival when robustness fails.
And the third is to develop the ability to adapt effectively and succeed in a radically changed environment, after resiliency has absorbed the initial shock of its arrival.
All three make sense. But all three are also based on a common underlying assumption: that an organization will have sufficient foresight to identify the range of possible futures to which its strategy must be robust, to identify the critical areas where limited resources must be deployed to build resiliency, and to invest in options that can speed adaptation.
In short, all three means of managing uncertainty assume an organizational ability to detect emerging threats to a strategy’s success and an organization’s survival.
Unfortunately, both human nature and organizational culture militate against this.
Evolution has biased us towards over-optimism, overconfidence, and a tendency to seek and overweight information that confirms our biased beliefs. And too many organizational cultures and leaders reinforce these tendencies.
As a result, too many strategies and organizations fail because they are either surprised by and/or too slow to react to new threats.
In sum, another important lesson we’ve learned over the years is that achieving success and avoiding failure are not merely two sides of the same coin. They are entirely different phenomena, which in most organizations receive very different levels of attention and investment.
And too many organizations, leaders, and employees end up paying a high price for that.
In “Scenario Planning and Strategy in the Pentagon”, Michael Fitzsimmons pulls no punches, either about the problems encountered since the Pentagon began to formally use scenario planning in 2002, or how the current process can be improved.
He begins with an observation familiar to many, if not all organizations today: “Students and practitioners of national security policy have long understood that uncertainty about the future is a central challenge of strategy.”
He then notes that, “Scenario planning should be one of the Department of Defense’s (DoD) most important tools for developing strategy under uncertainty.” However, many have judged scenario planning to be a big disappointment. Fitzsimmons digs deep to understand the underlying root causes of this outcome.
He examines “six debates that have complicated the execution of scenario planning in the DoD over the years.”
“Likelihood versus plausibility as an appropriate planning factor. How likely does a scenario need to be to compel planning? And how likely is any given scenario in the first place? Despite the use of much scientific-sounding arguments on the subject, and despite superficial deference to the intelligence community as an authority on the subject of likelihood and plausibility, the answers to these questions are entirely subjective and unverifiable. Everyone has an opinion, and few can be disproved. This means that, despite the scenarios’ purpose to serve as test cases rather than predictions, a nearly endless number of uncertainties can be cause for legitimate debate in making scenario assumptions.”
“High-resolution analysis of a small number of cases versus low-resolution analysis of a large number of cases. Should the scenario planning process focus on studying a few scenarios in-depth or many scenarios with less detail?”
“Long, structured timelines for data development and analysis versus the need to be responsive to senior leader guidance. The more complex scenarios and associated data become and the more organizations required to review and approve the content, the longer it takes for the system to produce and approve those products. This is a challenge regardless of which end of the spectrum identified in the previous point the system tends toward (i.e., many simple scenarios or few complex scenarios).”
“Transparent, collaborative process versus innovative exploration of new concepts and capabilities. It is no secret that bureaucratic processes are enemies of innovation…In the case of DoD, the natural dynamics and politics of developing collaborative products across multiple organizations with differing incentives tend to produce compromises that elide difficult strategic choices rather than confront them and suppress experimental ideas rather than nurture them.”
“Appropriateness of operational plans versus scenarios as the basis for strategy development and force planning…Because the operational planning (focused on near-term employment of existing capabilities) and force planning (focused on supporting budgets and programs well into the future) processes are so segregated, the claims of operational plans and future scenarios often end up being more competitive with each other than complementary when it comes to strategic resource allocation.”
“Prerogatives of civilian planning guidance versus military operational art. Finally, the DoD process has experienced a constant struggle, as do many Pentagon processes, in defining a boundary between those prerogatives and judgments for which civilian guidance predominates and those in which military operational expertise predominates [analogous to corporate conflicts between strategy departments and line managers]. Both perspectives are essential to the process, but it is often ambiguous whether and when one’s deference is due to the other.”
On balance, Fitzsimmons concludes that the use of fewer, more detailed scenarios has been more successful in supporting near-term operational planning needs, but less successful in supporting long-term strategy development.
The author notes that, “Strategy and force structure development [i.e., investment in capabilities] comprise the questions that preoccupy the DoD’s most senior leaders, especially the secretary and the chairman. These questions address the largest elements of force structure, major resource trade-offs, global posture, alliance relationships, rationales for technology investment strategies, and the like. Problems in these areas are extremely complex and unstructured. As a result, decision-making on strategy and force structure tends to follow a highly inductive path.”
“Decision-makers faced with these questions must think very broadly and consider many potential variations in strategic-level assumptions. In part due to these requirements of breadth and variation, the level of analytic detail that is relevant or even digestible on such questions is sharply limited. Decision-makers involved in strategy and force structure development need to be able to think creatively and consider a full range of possible solutions to strategic problems relatively unconstrained by current doctrine, official intelligence estimates, and programs.”
Our key takeaway from this excellent analysis is this: Strategy development has a longer time horizon than operational planning, and is thus a less constrained process that requires a greater range of less detailed scenarios. Trying to use the same scenarios for both operational planning and strategy decisions invites the frustration that has occurred at DoD.