Too Safe For Our Own Good

If cars were more dangerous, we would drive more safely. At least that is what Gordon Tullock postulated in his famous thought experiment, which became known as “Tullock’s Spike.” Tullock suggested that if we were serious about reducing car-related casualties, we should install a spike in the center of the steering wheel of all cars. Risk compensation would then lead to safer driving and therefore actually reduce fatalities.

Risk compensation is the tendency we have to adjust our behavior based on our perception of risk. The riskier we think a situation is, the more careful we will be. Conversely, if we perceive we are relatively safe or protected, we will be less careful. In the case of driving, as vehicles have gotten safer, people’s behavior has become riskier, offsetting a portion of the harm reduction we should have experienced.

Part of the mechanism of risk compensation is optimism bias. This bias causes us to overestimate the chance that good things will happen to us and underestimate the likelihood of bad things happening. This serves to increase our perception that we are safe, and perception is what matters, not reality. If we believe we are safe from harm, we will do risky things.

As leaders, we cannot afford optimism bias.

Leadership requires realism. Realism says that things exist regardless of our perception of them. In other words, there is a reality that we can see if we set aside our biases. Actually, that’s not right. There is a reality we can partially see if we can get past our preconceptions. We will never see everything.

This is where humility and community become the best tools a leader can have. Humility allows me to learn from other people. Community is where I find those other people. For this to work, though, the community needs to be shedding its optimism bias (and a bunch of other biases, too), with each person seeing as much of this reality as they can. Getting a more complete picture from the partial views each person has is called the wisdom of the crowd.

So, how do we create a culture where people are less likely to succumb to their biases and more likely to “see” things as they really are? Optimism bias helps us feel like we are in control, brightens our outlook, improves our chances of succeeding, and even benefits our health, so it’s not all bad. However, when it interferes with our ability to accurately evaluate the circumstances surrounding critical decisions, it becomes a liability.

One way to build personal and corporate habits that reduce optimism bias is to take the “outside view.” We practice stepping outside the scenario and looking for “base rates.” If you’ve never walked 10 miles before, knowing that the average person walks about 3 to 4 miles per hour, and would therefore need roughly 2 hours 30 minutes to 3 hours 20 minutes to cover 10 miles, could keep you from optimistically assuming a shorter time. This works for projects, timelines, predicting success rates, and much more.

Another way to combat optimism bias is the “postmortem” approach. (Only we are going to do the postmortem before the patient dies, so it’s really more of a pre-mortem.) The individual or team imagines a point in the future where the project has failed or the decision has turned out to be wrong, then tries to list all the reasons it went south. This helps us acknowledge the risks we already knew were there but were ignoring in favor of an unrealistic optimism.

It is good to be optimistic about the diligence, creativity, and tenacity of your team. It is also good to be realistic about the actual risks and potential failure points so you can plan for and mitigate them. Humble leaders can get a clearer picture from their team when they trust that team with the real risks instead of making the world seem too safe for its own good. Healthy teams (and healthy leaders) have a good balance of realism and optimism because it works, and it is The Bison Way.