Safety Leadership: I can’t be wrong – can I?


EDITOR’S NOTE: Achieving and sustaining an injury-free workplace demands strong leadership. In this monthly column, experts from global consulting firm DEKRA Organizational Safety and Reliability share their point of view on what leaders need to know to guide their organizations to safety excellence.

Our brains are designed to be efficient, so they look for shortcuts. This drive for efficiency shapes our thinking, and we sometimes miss things in plain sight, leaving us susceptible to mental shortcuts called “cognitive biases.”

We’ve all seen the injuries – even fatalities – that have resulted from taking procedural shortcuts. Cognitive biases can be even more dangerous because the shortcut is occurring internally in how we think and the decisions we consider. A quick Google search shows hundreds of biases, but let's focus on the category I call the “I can’t be wrong” biases.

The human brain experiences being right as pleasure and being wrong as pain and loss. Even if there are no consequences to being right or wrong, the brain doesn’t know the difference and encodes the experience as causing pleasure or pain. As such, “We are motivated to believe we are right regardless of reality. … It requires effort and thinking to weigh arguments logically.” (Lieberman, Rock, Halverson and Cox, 2015).

One of the most common “I can’t be wrong” errors is confirmation bias, in which we acknowledge only information that confirms our underlying beliefs and ignore evidence that conflicts with them. Incident investigations, sadly, are filled with examples in which confirmation bias was a contributing cause.

Take, for example, a worker trying to release material caught in a conveyor belt. He believed that the belt was de-energized but didn’t seek to validate this belief. Instead, he sought data confirming his belief, leading to a mistaken assumption and an injury. What if he had been “successful,” meaning he cleared the jam and restarted the line without incident? His “successful” outcome would confirm that he was right all along, possibly affecting future decisions in similar situations.

In addition to confirmation bias, we tend to be overconfident in our abilities, which is why a survey of self-rated driving ability found that 93 percent of participants placed themselves in the top 50 percent of drivers – a statistical impossibility.1 We also tend to think that bad things will not happen to us. As a result, we go into jobs, tasks and decision points with an underlying belief that the risk of an injury or incident is remote.

Similarly, we hate losses of any kind. Think of the times when your driving is riskiest. In most of those situations, you probably were running late. We often view being late as a loss and seek to avoid it, taking risks to catch up. The brain’s reward system primes us to watch for potential losses, such as having to redo steps or wasting time.

For example, imagine you spent 90 minutes preparing for a hydro-blasting job. You did all the paperwork, got all the sign-offs, put on your personal protective equipment and then spent the past four hours doing the work. You take a break, but when you return, you notice a small section was missed. The procedure requires you to redo the permitting, but you know you can just give the spot a few blasts and be done. What do you do? This is an example of sunk cost bias, which appears when we already have invested time and energy in a course of action and are reluctant to backtrack.

Confirmation, overconfidence, sunk cost and optimism biases all are the results of the pleasure we feel when we’re right and the pain we feel when we’re wrong. What can we do to keep ourselves from falling prey to them? We can educate ourselves about how the brain functions and our tendency to fall into these traps. Leaders can make workplaces safer by allowing workers to pause the job and seek a second opinion. Lastly, we can conduct more pre-mortems to anticipate incidents and take countermeasures before they occur.

A pre-mortem (Klein, 2011) is a technique in which a worker imagines the most likely types of injuries or incidents that can occur and their possible causes. This allows him or her to view the job or decision differently. The pre-mortem technique helps to avoid confirmation, optimism and sunk cost biases.

Cognitive biases are shortcuts the brain takes to improve efficiency. Although these shortcuts generally are helpful, they can cause us to make mistakes. The “I can’t be wrong” biases are all too common in workplaces where we selectively weigh evidence when deciding whether a problem is resolved. By understanding these biases and employing countermeasures such as pre-mortems, we minimize the chance of succumbing to these traps.

Reference
1. Svenson, Ola (February 1981). “Are We All Less Risky and More Skillful Than Our Fellow Drivers?” Acta Psychologica. 47 (2): 143–148.

This article represents the views of the author and should not be construed as a National Safety Council endorsement.

Michael Mangan, Ph.D., is vice president of research and development for DEKRA Organizational Safety and Reliability (dekra-insight.com). He is responsible for facilitating global thought leadership and generating new innovations across the organization.

