Hey, we’re all professionals, right? We are the experts on Risk, right? We study, we read books… we even read blogs and fly to conferences, right? But! We oftentimes get it wrong – we miss risks and threats that we should have seen coming, thereby losing our stakeholders’ confidence; or, we focus our time, our resources, our “political capital” in our organizations on risks and threats that didn’t deserve our attention and our stakeholders’ faith in us. Why?
Of course, that’s in part the nature of our business. Risks and threats are not guaranteed. Probability will always play a part. Otherwise we’d all be World Poker champions, Casino-beaters and Wall Street millionaires. However, we as humans come psychologically hard-wired (each to different degrees) with certain cognitive biases that influence our understanding of Probability and Risk. Recognizing these biases and building frameworks and controls into our programs to correct for them can make our Risk Management and Business Continuity programs more accurate, more targeted and better able to meet our organizations’ needs.
Here are several cognitive biases to recognize and resist:
The Gambler’s Fallacy
This one’s easy to understand so we’ll lead with it. This is the misperception that past events influence the probability of future events. Given an honest coin toss (with a 50% probability of Heads or Tails), if you flip a coin nine times and it comes up Heads every time, what is the probability that the next toss will also be Heads? Many people’s heads will say 50% and they’d be right, but many people’s hearts will say less – the probability of ten Heads in a row is 0.5 to the 10th power, or less than 0.1%, so they figure Tails is “due”. The math is correct, but what’s happened? The first nine flips are already in the books; the only probability still in play is Flip Ten’s, and it’s still 50%. This is why casinos get rich off people (including Risk Managers).
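If you don’t trust your head over your heart, a quick simulation settles it. This is just an illustrative sketch (the trial count is arbitrary): flip ten coins many times, keep only the runs that start with nine Heads, and check how often Flip Ten comes up Heads.

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

trials = 1_000_000
nine_heads = 0   # sequences whose first nine flips were all Heads
tenth_heads = 0  # ...and whose tenth flip was also Heads

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(10)]
    if all(flips[:9]):
        nine_heads += 1
        if flips[9]:
            tenth_heads += 1

# The conditional probability hovers around 0.5 -- Tails is never "due".
print(f"Runs starting with nine Heads: {nine_heads}")
print(f"P(Heads on flip ten | nine Heads so far): {tenth_heads / nine_heads:.3f}")
```

Roughly one run in 500 starts with nine Heads (0.5 to the 9th power), but among those runs the tenth flip is Heads about half the time, exactly as the math says.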
The Anchoring Fallacy
This is the tendency to stay focused, or ‘anchored’, on a previously-understood risk to the exclusion of its current state or evolving threats. Current airport security illustrates this misperception. Because of 9/11, box cutters and other small sharp instruments will be forever banned for all passengers. Theoretically, compensating controls such as hardened cockpit doors, armed pilots, flight marshals, do-not-open and do-not-negotiate protocols (plus an aware passenger population that will fight back) have driven the success probability of a small-knife terrorist attack to zero. However, security is still ‘anchored’ on pocketknives and box-cutters. Why is this relevant to Business Resiliency? Many companies base their program on “Remember that incident in 19XX? That’s why we invest in Business Resiliency!” Good to have management that sees the need, but it’s up to us as risk professionals to keep the program current by ensuring that we contemplate and factor in emerging threats and global trends. The only guarantee in our business (besides death and taxes) is that the next event will be different from the last event.
Normalcy Bias
This is the tendency to discount the probability of events where the person making the judgment has no experience or familiarity with the type of event – it’s not ‘normal’ to them. I had a discussion with an executive of a Seattle-based company who said, “Why should we contemplate earthquakes in our Risk Management? It’s not like we’re based in California!” Seismically speaking, the Pacific Northwest is one of the highest-threat regions in North America if not the world (the Pacific Ring of Fire and all that). However, quakes were not in their experience so they were not ‘normal’ to them. Ask a native of Florida and a new arrival about the threat of hurricanes and you will get different answers.
Zero-Risk Bias
This is the tendency to prefer total elimination of a risk to a higher overall reduction across a threat surface. Given choices for actions or controls to reduce risk, people will more likely select an action or control that results in elimination of a five-point risk over one that results in a one-point reduction across six risks.
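To make that trade-off concrete, here’s a sketch with an invented risk register (the risk names and scores are hypothetical, purely for illustration): Option A zeroes out the single five-point risk; Option B shaves one point off each of six other risks.

```python
# Hypothetical risk register: name -> risk score (higher = worse)
risks = {"fire": 5, "flood": 4, "outage": 4, "fraud": 3,
         "strike": 3, "theft": 2, "quake": 2}

# Option A: eliminate the five-point risk entirely
option_a = dict(risks)
option_a["fire"] = 0

# Option B: reduce six other risks by one point each
option_b = dict(risks)
for name in ("flood", "outage", "fraud", "strike", "theft", "quake"):
    option_b[name] -= 1

reduction_a = sum(risks.values()) - sum(option_a.values())
reduction_b = sum(risks.values()) - sum(option_b.values())
print(f"Option A total reduction: {reduction_a}")  # 5 points
print(f"Option B total reduction: {reduction_b}")  # 6 points
```

Option B removes more total risk, yet zero-risk bias pulls people toward Option A because “fire” drops off the list entirely.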
Availability Bias
This one’s almost the converse of Normalcy Bias: the tendency to overestimate risks of vivid or publicly-prominent occurrences, or occurrences that are psychologically ‘available’. People are acutely aware of high-profile tragedies such as child abductions or airline crashes, and therefore are wary of them. In reality, they are (thankfully) rare and lower-risk than perceived. The odds of a flight ending in a crash are about 10,000,000:1, and 95% of those involved in crashes have survived – if one boards a plane, one will most likely walk away from the flight. Yet, how many companies still have policies prohibiting multiple executives from flying together? How many companies have no policy around executives sharing a car, where the probability of accident or casualty is far greater?
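Putting the article’s two figures together shows just how small the per-flight risk is. A back-of-the-envelope sketch (using only the rough numbers quoted above, not authoritative aviation statistics):

```python
# Rough figures from the text above -- illustrative, not official statistics
p_crash = 1 / 10_000_000   # odds of a given flight ending in a crash
p_survive_crash = 0.95     # share of those involved who have survived

# Chance of not walking away from any one flight
p_fatality_per_flight = p_crash * (1 - p_survive_crash)
print(f"Rough P(fatality) per flight: {p_fatality_per_flight:.1e}")
```

That works out to about five in a billion per flight, which is the kind of number our hearts refuse to believe when the crash footage is fresh on the news.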
The Texas Sharpshooter Fallacy
This one’s one of my favorites, and very relevant to Business Resiliency. A Texan shoots at the side of a barn, runs up and paints a bull’s-eye around the bullet hole and exclaims, “Yee-Ha, I’m a Sharpshooter!” This is the fallacy of starting with an outcome and constructing the hypothesis around it after the fact. Why is this relevant to how we manage risk? Many executives point to an event’s outcome and use it to justify whether their investment (or non-investment) in Business Resiliency is warranted. I had a discussion with an executive whose European headquarters experienced a two-alarm fire (smoke condition). The building had to be evacuated for about an hour. Thankfully, they were able to return. His assessment was, “Yes, we evacuated to our meeting point and waited for an hour, but we were able to return so it was not really a Business Resiliency event.” Yee-Ha! My assessment back to him was, “OK, so this company was forced out of its European Headquarters, lost all vital records, lost all technology assets, work came to a halt and hundreds of people’s productivity was reduced to zero, and at 45 minutes into the event you did not know when you were returning, you did not know if you were returning, and had no contingency for non-return?” His rejoinder? “Well, bloody hell, we could see the building from the meeting point, it was still there!” Business Resiliency is not about the last non-material risk, it’s about the next, perhaps-material risk.
There are many cognitive biases, such as Zero-Choice Bias, Base-Rate Bias, Confirmation Bias, Choice-Supportive Bias and others. Some are highly relevant to Risk Management and Business Continuity. All are proof that there are factors impeding our understanding of Risk and Probability and therefore Resiliency. The takeaway is that prudent leading-edge Business Resiliency understands the existence of these biases and takes actionable steps to minimize their effect and work around them.
How can we make our programs immune to these factors? ARSC is happy to have this conversation with you!
(This article also appears on the World Conference of Disaster Management blog, where Howard Mannella is presenting Black Swans, Grey Ash and Turkeys: How to Plan for Un-Plannable Events in Toronto June 17, 2014)