
Are Cognitive Biases Impacting Workplace Safety?


If the world were full of rational people making rational decisions based on complete and accurate information, would the likelihood and severity of workplace incidents increase or decrease? In the real world, we know that each person has a different set of intellectual gifts and tendencies, that not all people make rational choices, and that we often have to make decisions based on only partial information. Here in the real world, we often find ourselves asking why a person took the course of action they did. If only they had more information, or were more aware of the hazard, they might not have made the choices that resulted in a tragedy. Congratulations, you have just discovered the 20/20 Hindsight Bias!


– Nicholson of “The Australian” newspaper: www.nicholsoncartoons.com.au

Consider the following problem: you observe that your company is not in compliance with “Regulation X”, the minimum standard set out by law in your jurisdiction. For the situation to be addressed properly, management will have to allocate resources to equipment design, training, PPE, or administration. Depending on the management in your organization, the person holding the purse strings may prefer to ignore the problem. Usually you will encounter one of the Top 3 Excuses for not working safely: Time, Laziness, and Cost. Implementation always involves an allocation of time, effort, and budget, so if people can find a reason not to believe the problem exists, then no effort is required to address it. Here are some examples of limiting beliefs HSE professionals run into out in the field:

1. The Gambler’s Fallacy – An industry veteran will weigh your idea of regulatory compliance against his own 37 years of experience by stating, “I’ve been doing it this way for 37 years and haven’t gotten hurt.” This is a glitch in human thinking related to how we weight events in the past and come to erroneous conclusions about the future. It is akin to believing that a VLT machine is “due” for a payout because you’ve already sunk $1,500 into it. The perfect example is a coin toss: if we flip a coin 5 times and get heads on every flip, it would be easy to think that the 6th toss will be tails. In reality, though, the odds are still 50/50.
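If it helps to see the point rather than argue it, here is a minimal Python sketch of the coin-toss scenario above. It is an illustration only, not tied to any incident data: a streak of five heads tells you nothing about the sixth flip.

```python
# Simulate many 6-flip sequences; among those that open with five heads,
# count how often flip 6 also lands heads. Independence says roughly 50%.
import random

random.seed(42)                # fixed seed so the run is repeatable
trials = 1_000_000
streaks = 0                    # sequences that opened with five heads
heads_after_streak = 0         # of those, sequences where flip 6 was heads

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):
        streaks += 1
        if flips[5]:
            heads_after_streak += 1

print(f"P(heads on flip 6 | five heads first): {heads_after_streak / streaks:.3f}")
# Prints roughly 0.500 -- the coin is never "due" for tails.
```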

2. Reduction to Absurdity – While this is more of a rhetorical device than a type of bias, it is still rooted in bias. Opponents will take your idea for complying with Regulation X beyond its logical extreme and then turn it into a Straw Man that is easily demolished: “If we put a guard on that machine, we’ll have to put a guard on this machine. Maybe we should all wear faceshields, and while we’re at it, cover ourselves in bubble wrap, too. You see how stupid this is? See what you’re saying here?”

3. Projection Bias – A worker sustains an injury in the workplace, and people are more than willing to point out, “He chose to do A, when he should have done B.” When you hear people using the word “should” with respect to a workplace incident, they are imparting a value judgement based on their own set of experiences. The Projection Bias is the false assumption that other people think just like us, and that we, in the same set of circumstances, would have responded differently. Most often, the Projection Bias shows up when individuals make an appeal to “Common Sense”.

4. Observation Selection Bias – The Observation Selection Bias occurs when an event changes your perceptual filter and causes you to notice something more than you did previously. For example, a pregnant woman will notice more pregnant women around her, or the guy who bought a lime green Jeep suddenly notices more lime green Jeeps on the highway. Most people don’t recognize this as a bias; nothing in the world has actually changed, only what they pay attention to. This bias isn’t always a bad thing. Just be aware that when you talk about a topic in the workplace (say, ladder use), individuals will start reporting hazards related to ladder use.

5. 20/20 Hindsight Bias – After an incident occurs, it’s usually very obvious what a person could have done differently to prevent the incident from occurring. But just because the course of action is obvious after the fact, it does not follow that it was obvious at the time to the person who made that decision. This is the tendency to see events that have already occurred as more predictable than they were before they took place. You’ll see the 20/20 Hindsight Bias when people say, “I told that guy not to do that, I knew it all along, it was inevitable.”

6. Conservatism (Bayesian) – Thomas Bayes is one of those historical figures who deserves more credit than he has received. His simple concept states that when we receive new information, a rational person should update their prior assumptions and beliefs in light of that new information. So let’s say an industry workplace fatality occurs and a safety alert is issued, and the root cause was related to the worker wearing loose clothing while using a grinder. If we fail to examine grinder use in our own workplace and take measures to prevent a similar incident, then we have not updated our prior assumptions about the probability of it occurring at our own company. The consequence of a conservatism bias may show up within the law courts as “negligence”.
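As a rough illustration of what “updating” means, here is a minimal sketch of Bayes’ theorem in Python. The prior and both likelihoods are entirely hypothetical numbers invented for the example, not figures from any incident data; the point is only how the belief should move once the alert arrives.

```python
# Bayes' theorem: P(hazard | alert) = P(alert | hazard) * P(hazard) / P(alert)
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis given one piece of evidence."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

prior = 0.05               # hypothetical prior belief that our grinder practice is hazardous
p_alert_if_hazard = 0.60   # hypothetical chance of such an industry alert if it is hazardous
p_alert_if_safe = 0.10     # hypothetical chance of seeing the alert anyway

posterior = bayes_update(prior, p_alert_if_hazard, p_alert_if_safe)
print(f"Belief after the alert: {posterior:.2f}")   # ~0.24, up from the 0.05 prior
# Conservatism bias is carrying on as if the number were still 0.05.
```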

7. The Curse of Knowledge – The Curse of Knowledge causes experienced workers to falsely conclude that the next generation of workers just doesn’t get it and that any training efforts are not worth the investment. The result is that the workplace loses out on the transmission of knowledge from experienced workers to the younger generation. In every workplace, more experienced workers will lament the younger generation’s lack of skills or ability to learn on the job. These experienced workers may have simply forgotten that one day they were a neophyte tradesman, too. Instead of going back in time and remembering their first day on the job, the more experienced workers will blame some outside force such as video games, divorce, or the school system. It’s easier to do that than it is to empathize with the new worker and remember the days when you didn’t know the difference between a crescent wrench and a pipe wrench.

8. The Money Illusion – This is the tendency of people to think in terms of the nominal value of money rather than its real purchasing power. You’ll see this at work when old timers tell stories like, “I remember back in 1982, I was earning 8 bucks an hour as a Roughneck. I worked 8 weeks straight in the bush, went to the truck dealership in Wetaskiwin and bought myself a new Chevy, fully loaded. Paid cash.” However, when you look at the facts, $8 an hour in 1982 is only $18.87 in 2014 dollars. See it for yourself at the Bank of Canada. In reality, Roughnecks on the drilling rigs are earning $30.70 as a *minimum* wage, according to the CAODC. Furthermore, new vehicles today have more bells and whistles, they get better mileage, and they have more safety features. Then there are televisions: the 80-inch plasma television of today simply didn’t exist in 1982, along with a whole host of technology that just gets cheaper by the day. The reality is that a roughneck working today has a far higher standard of living than the roughneck of 1982. When these stories are told in the workplace about how good it was in the old days, it leads the younger workforce to conclude that it’s time to find another industry.
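The arithmetic behind that comparison is simple. Here is a minimal sketch in which the cumulative inflation factor is back-derived from the $8 to $18.87 figure quoted above; for a real calculation you would use the CPI series behind the Bank of Canada’s numbers.

```python
# Convert a 1982 nominal wage into 2014 dollars and compare it with the
# CAODC minimum cited above. The inflation factor is an assumption derived
# from the $8 -> $18.87 figure in the text, not an official CPI value.
nominal_1982 = 8.00                      # quoted roughneck wage, $/hour, 1982
inflation_factor_1982_to_2014 = 2.359    # assumed cumulative factor (18.87 / 8.00)
caodc_minimum_2014 = 30.70               # minimum roughneck wage cited in the text

real_2014 = nominal_1982 * inflation_factor_1982_to_2014
print(f"1982 wage in 2014 dollars: ${real_2014:.2f}")                    # ~$18.87
print(f"Today's minimum is {caodc_minimum_2014 / real_2014:.1f}x that")  # ~1.6x
```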

9. Pro-Innovation Bias – Be careful of this one the next time you go see the vendors at the safety conference. You go to a safety conference and find the “Hot-Damn Widget”, so called because it works like a hot-damn. It’s the silver bullet to a significant workplace hazard. They even have Vince from the Slap-Chop commercials, so you know it’s going to be good. Imbued with this new sense of enthusiasm, you buy a dozen Hot-Damn Widgets, deploy them in the workplace, and the result is a total failure. The pro-innovation bias leads us to assume that the newer product must be better than the old one, without evidence that it actually is. The best way to mitigate this bias is through research, case studies, and customer references (not testimonials).

10. Status Quo Bias – We’ve all heard the clichés: change is inevitable, the only constant is change, change for the sake of change. This is the flip side of the pro-innovation bias: if it ain’t broke, don’t fix it. Let’s say you’re on a drilling rig and you have some ideas for mitigating hand injuries. Management is opposed based on the limiting belief that when you combine heavy iron and people, someone will always get hurt. If we chose to believe otherwise and looked at the drilling rig in terms of what it would take to limit or eliminate worker exposure to hand injuries, we would have to think about it, modify equipment, buy new equipment, and change job procedures. All of that takes up resources (again: Time, Laziness, Cost). Instead, we tell workers to “be more careful” or “don’t put your hands where you wouldn’t put your johnson” in the hopes that this will drive down incident rates, result in lower WCB claims costs, and somehow impact our TRIF.