Experience can be a powerful teacher. How we choose to learn from past experiences will influence the decisions we make in the future.
However, the lessons we learn aren’t always positive. Past experiences can embolden the kinds of risky behaviors that make us less safe.
One research study has shown that pilots can be prone to judging the success of a decision by its outcome rather than by the quality of the decision itself. Allowing the end to justify the means is more formally known as outcome bias. In other words, committing an unsafe act and experiencing no adverse outcome can lead people to believe, in hindsight, that the act itself was justified.
Outcome bias can be costly to people and organizations, especially in terms of safety. Valuable opportunities to learn lessons from “close calls” are lost, and risky behavior could become normalized.
Outcome bias can also create a barrier to safety reporting when hazardous situations are present. When people are emboldened by their past experiences, they may underestimate the severity of the hazards around them. As a result, there is little motivation to report those hazards. Not knowing what hazards exist can be costly to flight departments that strive to be proactive in addressing safety deficiencies.
Even in business settings, an overemphasis on "outcomes" can create an outcome-centric culture, perpetuating workarounds, routine violations, and/or unsafe practices within an organization.
Fortunately, there are some simple tips for recognizing and mitigating outcome bias, which we will discuss in this post. But first, let’s take a look at an example of outcome bias in action.
It’s a frosty winter morning and a crew is getting ready to depart on a flight. A thin layer of frost covers a few portions of the wings and fuselage. The crew recall from training that there are specific locations where frost is, and isn’t, permitted on the aircraft. However, in a rush to make their estimated departure clearance time (EDCT), they decide to skip de-icing altogether. “It’s only a thin layer of frost, after all,” they reason.
What happens next is a toss-up.
In one scenario, the crew take off and complete the flight uneventfully. But in another scenario, the crew take off and are met with the stall warning.
Although the outcomes differ, both crews made the same mistake. So, would the crew that completed their flight without issue be considered more “competent” or “safer?”
The answer is no.
The decision to forgo de-icing was equally risky in both scenarios; only the outcomes differed. The crew that were lucky enough to depart uneventfully could now be under the mistaken impression that it’s acceptable to take off with contamination on portions of the wings.
The next time they forgo de-icing, they might not be as lucky.
A study carried out in New Zealand showed that outcome bias influences pilots’ perceptions of unsafe events, particularly when making weather decisions.
Researchers presented a group of 142 pilots with a hypothetical scenario: a non-instrument-rated private pilot departed in marginal Visual Flight Rules (VFR) conditions, with a forecast for temporary Instrument Meteorological Conditions (IMC). The pilots were then split into groups, and each group was told a different outcome of the flight: a positive outcome, a close call, a negative outcome, or no outcome at all.
Pilots rated the decision-making ability as “better” in the close call and positive outcome scenarios than in the negative or no outcome scenarios. Even though each scenario involved the same decision to take off into deteriorating weather, the study highlights how much influence a successful outcome has on our perception of safety.
According to the researchers, “What was of particular interest was that pilots interpreted events that led to a close call very similar to those that had positive outcomes, which may reinforce risky behavior.”
Consider the statement above in the context of a Safety Management System (SMS).
If people perceive a problem only when there is a negative outcome, what does this mean for safety reporting? If close calls are not being observed or reported, how does the flight department know what hazards threaten the operation?
When flight departments know the hazards and close calls plaguing their frontline employees, safety managers can take a proactive stance in mitigating safety concerns before they escalate into damage, injury, or other serious consequences.
However, one research study in the medical field revealed that doctors and nurses were more likely to report unsafe acts or mishaps only if they led to a bad outcome. In contrast, the same unsafe actions that resulted in innocuous outcomes were less likely to be reported.
These findings might mean that safety issues are slipping through the cracks, only to be brought to the organization’s attention when - eventually - damage, injury, or other serious consequences occur.
Safety managers have a part to play in helping personnel look beyond the outcome and think about what could have happened or what almost happened. Promoting the reporting of hazards and “close calls,” along with creating a culture of trust, is a significant first step.
You can read more about creating a strong safety culture in one of our previous blogs here!
The problem with cognitive biases is that they can grow and develop without us being conscious of them. Slipping into outcome bias can be all too easy and tempting. “Well, it worked out last time,” you might think before you depart without de-icing.
Stopping to consider your thought process and paying attention to your inner monologue can help you identify outcome bias. Outcome bias may present itself in thoughts such as:
THOUGHT FROM SCHEDULER - “We had a crew complete a 16-hour day last week; why can’t this crew do it today?”
REMEDY - “I am going to see what is different about this crew today. Have they had the same amount of rest?”
THOUGHT FROM PILOT - “I skipped the walk around last week because the client showed up early. This plane is in good shape. I can skip it again.”
REMEDY - “I don’t know what condition the plane is in unless I inspect it.”
THOUGHT FROM MECHANIC - “I don’t need a safety harness; I saw the supervisor working on the engine last time without one.”
REMEDY - “It is a long drop, and if I fall, there is a high probability that I could seriously hurt myself.”
Fortunately, there are a few things we can do to guard against biases in ourselves and others.
The following excerpt from Boeing’s Maintenance Error Decision Aid (MEDA) investigation process underscores the importance of reporting:
Data from the U.S. Navy shows that the contributing factors to low-cost/no-injury events were the same contributing factors that caused high-cost/personal-injury events. Therefore, addressing the contributing factors to lower-level events can prevent higher-level events. 
In the safety-critical world of aviation, flight crews and frontline workers make dozens of important decisions every day. The reality is that many of our decisions can be time-sensitive or made while under a high workload. So naturally, we rely on our previous experiences or “rules of thumb” to guide the decision-making process.
This is when we must be mindful of outcome bias and remember that just because something has “worked out before” doesn’t mean it was safe or that it will work out again. Said another way, repeating the same risky act doesn’t always yield the same results.
If we have the opportunity to make a prudent decision on the ground before we even take off, it might spare us from facing some much more challenging decisions once airborne.
If you would like help or information on any of the topics mentioned in this article or assistance writing bulletins for your flight department, we’re always happy to help. Contact us today.