Occurrence versus Emergence

Even after many years of work in the areas of strategy and risk -- from financial services to energy and most recently at Britten Coyne Partners and The Index Investor -- I am still amazed at how most people's notion of "risk" is subconsciously linked to the probability that a discrete event will occur within a defined period of time (usually the next 12 months).

The root causes of this phenomenon are complex, but they surely include our exposure to statistics courses and insurance concepts, such as definable hazards whose potential negative impact can be mitigated (for a price) by transferring them to others.

Our evolutionary past is also to blame. Long before writing and mathematics appeared, we used stories to explain the past and anticipate the future -- and stories are usually focused on people and events with emotional power that cause us to retain them (and the lessons they contain) in our individual and collective memory.

Yet one of the great lessons of history is that it usually isn't the occurrence of events that sinks companies and countries. To be sure, such events are often cited as the proximate cause of failure. But in reality, they mark the end of a much longer process, involving interacting trends, decisions, and randomness, from which new threats emerge, evolve, and sometimes reach a critical threshold that produces the events that cause catastrophic failures.

Put differently, it is the continuous variables in a system that should attract our interest, not just the discrete ones; we should focus on emergence, not just occurrence. In turn, this requires that we adopt new mental models that seek to understand and adapt to uncertainty, not just risk -- that are focused on estimating the remaining time before a critical system threshold is reached, and not just the probability that an event will occur.
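
To make the "time remaining before a threshold" framing concrete, consider a toy model -- our own minimal sketch, not a client tool, with all names and numbers hypothetical: a system variable that drifts noisily toward a critical level, for which we estimate the expected time to crossing by simulation.

```python
import random

def expected_time_to_threshold(x0: float, threshold: float, drift: float,
                               noise: float, trials: int = 5_000,
                               max_steps: int = 100_000) -> float:
    """Monte Carlo estimate of the mean number of periods before a noisy,
    drifting system variable first crosses a critical threshold."""
    total_steps = 0
    for _ in range(trials):
        x, t = x0, 0
        while x < threshold and t < max_steps:
            x += drift + random.gauss(0, noise)  # trend plus randomness
            t += 1
        total_steps += t
    return total_steps / trials

# Hypothetical numbers: variable at 0, critical threshold at 10, drift 0.5/period.
print(f"Estimated time remaining: {expected_time_to_threshold(0, 10, 0.5, 2.0):.1f} periods")
```

The point is the change of question: not "what is the chance of the event this year?" but "how long before the system crosses the line?"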

In truth, these concepts are related, even though we often fail to see them as such. Consider, for example, a simple system model that does not evolve over time, in which there is a 5% probability each year that a given event of magnitude X will occur. What we usually fail to consider is how that probability compounds as the time horizon lengthens: the chance of at least one occurrence over n years is 1 - (1 - 0.05)^n. Over five years, there is a 23% chance the event will occur; over 10 years, a 40% chance; and over 20, a 64% chance.
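
For readers who want to verify the arithmetic, here is a quick check in Python (the function name is ours, purely for illustration):

```python
def prob_at_least_once(p_annual: float, years: int) -> float:
    """Chance of at least one occurrence over the horizon, assuming a
    constant, independent annual probability of occurrence."""
    return 1 - (1 - p_annual) ** years

for horizon in (5, 10, 20):
    print(f"{horizon:>2} years: {prob_at_least_once(0.05, horizon):.0%}")
# Prints: 5 years: 23%, 10 years: 40%, 20 years: 64%
```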

In practice, however, the challenge is usually much greater, because the complex socio-technical systems that produce emergent threats and catastrophic events are themselves constantly evolving. Most of the time, we operate in the realm of true uncertainty, not risk, and the mental models we use to make sense of the world too often fail to recognize this critical distinction.

The lesson of this short essay is this: If our goal is to avoid catastrophic failures, we must constantly struggle to repress our natural tendency to focus only on the probability of discrete events occurring within the next year, and instead pay much more attention to the continuous interaction of forces within our world that gives rise to the emergent threats which pose the greatest danger to the survival and success of both organizations and investment strategies.

Strategy in a World of Radical Uncertainty

In theory, if too seldom in practice, one of the benefits of painful experience is wisdom, which often takes a surprisingly simple form.

After 80-plus years of combined experience, at Britten Coyne Partners we’ve arrived at a definition of strategy that is broad enough to apply to a wide range of public, private, and non-profit sector organizations, from the smallest to the largest:

"Strategy is a causal theory that exploits one or more decisive asymmetries to achieve an organization's most important goals with limited resources, in the face of uncertainty, constraints, and opposition."

For the past eight years, our focus has been on the “in the face of uncertainty” part of this definition.

As our world has become more complex and connected, uncertainty has increased exponentially, making strategy vastly more challenging.

For example, processes that have a social aspect tend to produce a power law distribution of outcomes, due to our human tendencies towards imitation and conformity in the face of uncertainty. Because these processes themselves heighten uncertainty, highly connected, complex social systems naturally evolve to a so-called critical state, where small changes can quickly produce very large effects.
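
A toy simulation makes the imitation mechanism concrete. The sketch below is a minimal Simon-style "innovate or imitate" process of our own construction (names and parameters are hypothetical): each new actor occasionally adopts a novel option but usually copies an earlier actor, so already-popular options attract ever more copies.

```python
import random
from collections import Counter

def imitation_process(n_actors: int = 100_000, p_innovate: float = 0.1,
                      seed: int = 42) -> Counter:
    """Each actor either innovates (adopts a brand-new option) or imitates a
    uniformly chosen earlier actor -- a rich-get-richer dynamic that yields a
    heavy-tailed, power-law-like distribution of option popularity."""
    rng = random.Random(seed)
    choices = [0]      # the first actor innovates option 0
    next_option = 1
    for _ in range(n_actors - 1):
        if rng.random() < p_innovate:
            choices.append(next_option)          # innovate
            next_option += 1
        else:
            choices.append(rng.choice(choices))  # imitate an earlier actor
    return Counter(choices)

popularity = imitation_process()
top5 = sum(count for _, count in popularity.most_common(5))
print(f"Top 5 of {len(popularity)} options account for {top5 / 100_000:.0%} of choices")
```

Plotting the resulting popularity counts on log-log axes produces the roughly straight line that is the signature of a power law.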

Three recommendations are often made to manage this uncertainty.

The first is to ensure that a strategy is robust enough to achieve its goals under a wide range of possible future conditions.

The second is to build sufficient organizational and business model resiliency to ensure survival when robustness fails.

And the third is to develop the ability to adapt effectively and succeed in a radically changed environment after resiliency has absorbed the initial shock of its arrival.

All three make sense. But all three are also based on a common underlying assumption: that an organization will have sufficient foresight to identify the range of possible futures to which its strategy must be robust; to identify the critical areas where limited resources must be deployed to build resiliency; and to invest in options that can speed adaptation.

In short, all three means of managing uncertainty assume an organizational ability to detect emerging threats to a strategy’s success and an organization’s survival.

Unfortunately, both human nature and organizational culture militate against this.

Evolution has biased us towards over-optimism, overconfidence, and a tendency to seek out and overweight information that confirms our existing beliefs. And too many organizational cultures and leaders reinforce these tendencies.

As a result, too many strategies and organizations fail because they are surprised by new threats, too slow to react to them, or both.

In sum, another important lesson we’ve learned over the years is that achieving success and avoiding failure are not merely two sides of the same coin. They are entirely different phenomena, which in most organizations receive very different levels of attention and investment.

And too many organizations, leaders, and employees end up paying a high price for that.

Lessons from 16 Years of Scenario Planning at the US Department of Defense

The Strategic Studies Institute at the US Army War College has just published a monograph on issues that will be familiar to many users of scenario planning methodologies.

In “Scenario Planning and Strategy in the Pentagon”, Michael Fitzsimmons pulls no punches, either about the problems encountered since the Pentagon began to formally use scenario planning in 2002, or about how the current process can be improved.

He begins with an observation familiar to many, if not all organizations today: “Students and practitioners of national security policy have long understood that uncertainty about the future is a central challenge of strategy.”

He then notes that, “Scenario planning should be one of the Department of Defense’s (DoD) most important tools for developing strategy under uncertainty.” However, many have judged scenario planning to be a big disappointment. Fitzsimmons digs deep to understand the root causes of this outcome.

He examines “six debates that have complicated the execution of scenario planning in the DoD over the years.”

“Likelihood versus plausibility as an appropriate planning factor. How likely does a scenario need to be to compel planning? And how likely is any given scenario in the first place? Despite the use of much scientific-sounding arguments on the subject, and despite superficial deference to the intelligence community as an authority on the subject of likelihood and plausibility, the answers to these questions are entirely subjective and unverifiable. Everyone has an opinion, and few can be disproved. This means that, despite the scenarios’ purpose to serve as test cases rather than predictions, a nearly endless number of uncertainties can be cause for legitimate debate in making scenario assumptions.”

“High-resolution analysis of a small number of cases versus low-resolution analysis of a large number of cases. Should the scenario planning process focus on studying a few scenarios in-depth or many scenarios with less detail?”

“Long, structured timelines for data development and analysis versus the need to be responsive to senior leader guidance. The more complex scenarios and associated data become and the more organizations required to review and approve the content, the longer it takes for the system to produce and approve those products. This is a challenge regardless of which end of the spectrum identified in the previous point the system tends toward (i.e., many simple scenarios or few complex scenarios).”

“Transparent, collaborative process versus innovative exploration of new concepts and capabilities. It is no secret that bureaucratic processes are enemies of innovation…In the case of DoD, the natural dynamics and politics of developing collaborative products across multiple organizations with differing incentives tend to produce compromises that elide difficult strategic choices rather than confront them and suppress experimental ideas rather than nurture them.”

“Appropriateness of operational plans versus scenarios as the basis for strategy development and force planning…Because the operational planning (focused on near-term employment of existing capabilities) and force planning (focused on supporting budgets and programs well into the future) processes are so segregated, the claims of operational plans and future scenarios often end up being more competitive with each other than complementary when it comes to strategic resource allocation.”

“Prerogatives of civilian planning guidance versus military operational art. Finally, the DoD process has experienced a constant struggle, as do many Pentagon processes, in defining a boundary between those prerogatives and judgments for which civilian guidance predominates and those in which military operational expertise predominates [analogous to corporate conflicts between strategy departments and line managers]. Both perspectives are essential to the process, but it is often ambiguous whether and when one’s deference is due to the other.”

On balance, Fitzsimmons concludes that the use of fewer, more detailed scenarios has been more successful in supporting near-term operational planning needs but less successful in supporting long-term strategy development.

The author notes that, “Strategy and force structure development [i.e., investment in capabilities] comprise the questions that preoccupy the DoD’s most senior leaders, especially the secretary and the chairman. These questions address the largest elements of force structure, major resource trade-offs, global posture, alliance relationships, rationales for technology investment strategies, and the like. Problems in these areas are extremely complex and unstructured. As a result, decision-making on strategy and force structure tends to follow a highly inductive path.”

“Decision-makers faced with these questions must think very broadly and consider many potential variations in strategic-level assumptions. In part due to these requirements of breadth and variation, the level of analytic detail that is relevant or even digestible on such questions is sharply limited. Decision-makers involved in strategy and force structure development need to be able to think creatively and consider a full range of possible solutions to strategic problems relatively unconstrained by current doctrine, official intelligence estimates, and programs.”

Our key takeaway from this excellent analysis is this: Strategy development has a longer time horizon than operational planning, and is thus a less constrained process that requires a greater range of less detailed scenarios. Trying to use the same scenarios for both operational planning and strategy decisions invites the frustration that has occurred at DoD.



The UK Has Just Raised the Risk Governance Bar for Company Directors

With the news today that the UK government intends to replace the Financial Reporting Council – a non-regulatory, voluntarily funded body – with a statutory regulator whose powers should be “feared” by the organisations it regulates (i.e., all publicly quoted companies in the UK), together with moves by the UK Pensions Regulator to impose sanctions on company directors who put their employees’ pensions at risk, the risk governance environment is becoming significantly tougher for company directors.

Thus, it is timely to reflect on the findings of a recent research project by the Universities of Liverpool and Leeds on “Business Judgement and the Courts”. It is a commonly held perception that business judgement is immune from judicial review. It is true that approaches to the judicial accountability of directors in civil (i.e., not criminal) cases vary across the “Anglosphere”, with, for example, greater protection in the US and less in Australia. Yet the question of what exactly the courts consider “business judgement”, and whether or not it is or can be subject to judicial oversight, was not clearly understood.

The research project examined (mostly) UK legal precedent to try to clarify the reality, through a database of over 100 cases. The results were illuminating, and of relevance to all company boards in the UK and possibly elsewhere. First, the widely held perception that “business judgement” is a shield against judicial review was shown to be very wrong. The number of cases taken to law against company directors is increasing, and since the UK Companies Act 2006 set out directors’ accountability in statute, it has accelerated further. The new moves mentioned above are likely to reinforce this trend. Moreover, when cases are taken to law, the chances of liability being found against directors are over 60%. So much for the protection afforded by “business judgement”!

When cases were successfully defended, the research found, it was because directors could provide clear evidence of diligent and comprehensive decision-making processes. These go well beyond the simple documentation of procedures represented by board agendas or minutes. The cases where director liability was found were based on decision-making process failures, including failures to identify relevant factors or issues, failures to take advice or seek relevant information, and failures to act due to recklessness or “blind optimism”.

It seems clear that the bar is being raised in the governance environment, including and perhaps especially in risk governance. Directors and boards who take comfort from having a “risk management” procedure in place are likely practising false optimism on two counts: first, that their procedures are a substitute for effective Strategic Risk Governance; and second, that if called upon to defend themselves in court, reliance on their judgement will be sufficient.

Our work frequently highlights how “Risk Blindness” builds over time through familiarity with (sometimes very) imperfect information. We help boards and directors reduce their Strategic Risk blindness with proven decision-making processes supporting thorough Anticipation and Assessment of and Adaptation to Strategic Risk. It seems that in addition to mitigating the chances of corporate extinction, these processes may also improve the chances of directors successfully defending a judicial review of their decisions, if or when things do go badly wrong.


RAND’s New Analysis of Strategic Warning Challenges Facing the Intelligence Community Applies to the Private Sector Too

As our clients know, Britten Coyne Partners’ methodologies draw on a wide variety of sources, including the military and intelligence communities. In this blog post we will summarize important insights from RAND Corporation’s new analysis, “Perspectives and Opportunities in Intelligence for US Leaders.”

Chapter 2 is titled “Reconstituting Strategic Warning for the Digital Age.” RAND begins by noting that, “since 2014, events have raised questions about both the Intelligence Community’s (IC’s) strategic warning effectiveness and the policy community’s understanding of warning and its ability to command action in response.”

They then make a point that we have often made ourselves: “The strategic warning mission is bedeviled by two inherent challenges…In slowly developing situations it is often hard to stimulate action. The second challenge is how to alert policymakers to something that has rarely, if ever, been seen before…The inherent challenge in providing insight to policymakers about future developments is in making sure that warning is heeded, but does not cause undue alarm.”

RAND also cites well-known CIA analyst Jack Davis’ famous quote on another aspect of the warning challenge: “Waiting for evidence the enemy is at the gate usually fails the timeliness test; prediction of potential crises without hard evidence can fail the credibility test.”

In sum, “warning is fundamentally a problem of both sensemaking and effectively communicating insights to busy policymakers – an inherently difficult challenge.”

The warning challenge has become exponentially greater because of the speed with which events often develop in today’s complex and densely interconnected world.

As RAND notes, today the IC is “frequently confronting rapidly evolving situations that have never been seen before.” Under these conditions, effective “warning is dependent on much more than simply having experts who know the issues associated with the topic. Experts in a field tend to see trends based on what they have seen before, and they have difficulty imagining discontinuities that surprise…Experts are naturally wedded to their previous assessments” (a point also made by Philip Tetlock in his excellent book “Expert Political Judgment: How Good Is It?”).

Given this, RAND concludes that “expertise has to be complemented with a diversity of views,” from internal and external sources. To which we can only add, “Amen.”

It is also unlikely that artificial intelligence and other technologies will ever be able to replace human beings when it comes to sensemaking in highly complex situations, particularly when the challenge is to make accurate estimates of the ways those situations could evolve in the future. At best, such technologies may increasingly be able to augment human cognition.

In summing up the nature of the strategic warning challenge facing the intelligence community, the Director of National Intelligence’s strategic plan states that, “our anticipated strategic environment models closely on chaos theory: initial conditions are key, trends are non-linear, and challenges emerge suddenly due to unpredictable systems behavior…We believe our customers will seek out inputs on what may surprise them, if we are capable of placing such inputs in a larger context and demonstrating rigor in our analytic approach to complexity.”

RAND concludes that “a new warning tradecraft that combines indicators with techniques to test assumptions, identify event drivers, consider plausible alternative explanations and outcomes, and aggregate expert forecasts provides a good foundation, but it must be applied throughout the IC to achieve fundamental change in the current approach to warning.”
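
RAND does not prescribe a specific method for aggregating expert forecasts, so the following is only an illustration of one common approach from the forecasting literature: averaging expert probabilities in log-odds space (the geometric mean of odds), which keeps a single extreme voice from dominating while still letting the pooled forecast move decisively away from 50/50. The expert numbers below are hypothetical.

```python
import math

def pool_forecasts(probs: list[float]) -> float:
    """Aggregate expert probability forecasts by averaging their log-odds,
    then converting the mean back into a probability."""
    mean_log_odds = sum(math.log(p / (1 - p)) for p in probs) / len(probs)
    return 1 / (1 + math.exp(-mean_log_odds))

experts = [0.10, 0.25, 0.30, 0.60]  # hypothetical warning probabilities
print(f"Pooled warning probability: {pool_forecasts(experts):.0%}")  # ~28%
```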

We believe that the points raised by RAND are just as applicable to the strategic risk management, governance, and warning challenges faced today by private sector corporations as they are to those facing the Intelligence Community. At Britten Coyne Partners, our mission is to help organizations successfully meet them.