Black Swan Events

Black Swan events – should we even bother? The October issue of the Harvard Business Review had a special spotlight on risk, featuring an article by Nassim N. Taleb, Daniel G. Goldstein and Mark W. Spitznagel on The Six Mistakes Executives Make in Risk Management. The article discusses so-called Black Swan events, or Low Probability High Impact events, and the fact that these events are practically impossible to predict. So instead of spending our efforts on quantifying and estimating them, maybe we should simply accept that they will happen and focus on reducing our vulnerability to them when they do. A very interesting thought…

Misunderstood Risk Management

It was a post on Michael Herrera’s BCP blog that alerted me to this article in the HBR, and it is a good article indeed. Risk management, so Taleb & Co. say, should be about lessening the impact of what we don’t understand. It should not be a futile attempt at developing sophisticated techniques that only succeed in making us believe that we can predict our social, technological, natural and economic environment, when in fact we cannot.

Six Mistakes

The article discusses six mistakes that many executives make in risk management:

  • We think we can manage risk by predicting extreme events
  • We are convinced that studying the past will help us manage risk
  • We don’t listen to advice about what we shouldn’t do
  • We assume that risk can be measured by standard deviation
  • We don’t appreciate that what is mathematically equivalent isn’t psychologically so
  • We are taught that efficiency and maximizing shareholder value don’t tolerate redundancy

Six answers

Instead of calculating the odds of whether things will happen, rather think about what you will do if they do happen. That is actually very basic in crisis management planning. If you have planned and exercised for one event, you can rest assured that most other events are covered by the same plan. People who can handle one type of crisis have the potential to handle many types of crises.

Today’s world, let alone the future world, will never be like the past. There will always be something “unprecedented”. Hindsight can never be converted into foresight. Lessons should be learned from the past, yes, but the past should not be used to predict the future.

A dollar not lost is a dollar earned. Preventing losses is perhaps a better strategy than chasing profits, and risk management activities are in fact profit-generating activities. There is no separation between the two.

Statistics is a complicated science, and the often-cited standard deviation is not the average variation, but the square root of the average squared variations. In a perfect world of randomness, most events fall within -1 and +1 standard deviations, but in real life (“Black Swan” territory), movements can easily exceed 10, 20 or 30 standard deviations.
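To illustrate what that distinction means (a small worked formula of my own, not taken from the HBR article): for observations $x_1, \dots, x_n$ with mean $\bar{x}$, the average variation and the standard deviation are

$$\text{average deviation} = \frac{1}{n}\sum_{i=1}^{n}\lvert x_i - \bar{x}\rvert \qquad\qquad \sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2}$$

Because the deviations are squared before they are averaged, a single extreme observation inflates $\sigma$ far more than it inflates the average deviation, which is exactly why our intuition about “typical” variation breaks down when the Black Swan arrives.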

In absolute, quantified numbers, two events may be equal, but subjectively, within a group of people, the same event may carry a very different value for each person and may be understood entirely differently. After all, numbers are just numbers; feelings, values and risk acceptance are not so uniform.

Optimizing processes makes them vulnerable to change. Robust processes can survive change, but resilience comes at a price. On the other hand, if you are too highly leveraged, specialized or optimized, you have no fallback if something fails: a recipe for disaster.

Achilles’ heel

The biggest risk, so the authors conclude, lies within us:

We overestimate our abilities
and underestimate what can go wrong.

The message is, I guess: expect the unexpected, and always prepare for the worst. And most of all: recognize your limits. And don’t be like Henry Kissinger, who said, “There can’t be a crisis today. My schedule is already full.”

Disasters – prepare or react?

This article carries an interesting notion, not unlike the recent BBC World Debate on Disasters. Should we actually bother to spend time and money on disaster mitigation, or should we rather focus on preparing for disaster recovery? Is being reactive better than being proactive?

Black Swan Events – the book

On a side note, Nassim Taleb is also the author of the book The Black Swan: The Impact of the Highly Improbable.

Paul Kleindorfer

Low Probability High Impact events are a specialty of Paul Kleindorfer, whose work has been reviewed on this blog previously.

Reference

Taleb, N. N., Goldstein, D. G. & Spitznagel, M. W. (2009) The Six Mistakes Executives Make in Risk Management. Harvard Business Review, 87(10), 78-81.
