Lessons Unlearned: How Past Experiences Could Make Us Less Safe
Claire Ealding
Feb 19, 2021

Experience can be a powerful teacher. How we choose to learn from past experiences will influence the decisions we make in the future.  


However, the lessons we learn aren’t necessarily always positive. Past experiences can embolden the types of risky behaviors that make us less safe.


One research study has shown that pilots can be prone to judging the success of a decision by its outcome rather than by the quality of the decision itself [1]. Letting the end justify the means in this way is more formally known as outcome bias. In other words, committing an unsafe act and experiencing no adverse outcomes can lead people to believe, in hindsight, that the act itself was justified.


Outcome bias can be costly to people and organizations, especially in terms of safety. Valuable opportunities to learn lessons from “close calls” are lost, and risky behavior could become normalized.


Outcome bias can also create a barrier to safety reporting when hazardous situations are present [2]. When people are emboldened by their past experiences, they may underestimate the severity of the hazards around them and, as a result, have little motivation to report them. Not knowing what hazards exist can be costly to flight departments that strive to be proactive in addressing safety deficiencies.


Even in business settings, an overemphasis on "outcomes" can create an outcome-centric culture, perpetuating workarounds, routine violations, and/or unsafe practices within an organization.  


Fortunately, there are some simple tips for recognizing and mitigating outcome bias, which we will discuss in this post. But first, let’s take a look at an example of outcome bias in action.



Same Decision, Different Outcomes



It’s a frosty winter morning and a crew is getting ready to depart on a flight. A thin layer of frost covers a few portions of the wings and fuselage. The crew recall from training that there are specific locations where frost is, and isn’t, allowed to be on the aircraft. However, they decide to skip de-icing altogether in a rush to make their estimated departure clearance time (EDCT). “It’s only a thin layer of frost, after all,” they reason.


What happens next is a toss-up.


In one scenario, the crew take off and complete the flight uneventfully. But in another scenario, the crew take off and are met with the stall warning.


Although the outcomes differ, both crews made the same mistake. So, would the crew that completed their flight without issue be considered more "competent" or "safer"?


The answer is no.


The decision to forgo de-icing was equally risky in both scenarios; only the outcomes were different. The crew that were lucky enough to depart uneventfully could now be under the mistaken impression that it's acceptable to take off with contamination on portions of the wings.


The next time they forgo de-icing, they might not be as lucky.


The Science Behind the Bias

A study carried out in New Zealand showed that outcome bias influences pilots' perceptions of unsafe events, particularly when making weather decisions [1].


Researchers presented a group of 142 pilots with a hypothetical scenario: a non-instrument-rated private pilot departed in marginal Visual Flight Rules (VFR) conditions, with a forecast for temporary Instrument Meteorological Conditions (IMC). The pilots were then split into groups and given different outcomes of the flight:

  • Group 1: Positive outcome - the pilot made it safely to their destination (probably by “scud running”).
  • Group 2: Close call outcome - the pilot inadvertently entered IMC, turned around, and returned to the departure airfield.
  • Group 3: Negative outcome - the pilot inadvertently entered IMC, became disoriented, and crashed.
  • Group 4: No outcome given - judge quality of decision without knowing the outcome.


Pilots rated the decision-making ability as “better” in the close call and positive outcome scenarios than in the negative or no outcome scenarios. Even though each scenario involved the same decision to take off into deteriorating weather, the study highlights how much influence a successful outcome has on our perception of safety.

[Table from Walmsley & Gilbey (2019), "Understanding the Past: Investigating the Role of Availability, Outcome and Hindsight Bias in Visual Pilots' Weather-Related Decision Making"]


According to the researchers, "What was of particular interest was that pilots interpreted events that led to a close call very similar to those that had positive outcomes, which may reinforce risky behavior."


Consider the statement above in the context of a Safety Management System (SMS).


If people perceive a problem only when there is a negative outcome, what does this mean for safety reporting? If close calls are not being observed or reported, how does the flight department know what hazards threaten the operation?



Blissfully Unaware


When flight departments know the hazards and close calls plaguing their frontline employees, safety managers can take a proactive stance in mitigating safety concerns before they escalate into damage, injury, or other serious consequences.


However, one research study in the medical field revealed that doctors and nurses tended to report unsafe acts or mishaps only if they led to a bad outcome. The same unsafe acts that resulted in innocuous outcomes were less likely to be reported [2].


These findings might mean that safety issues are slipping through the cracks, only to be brought to the organization’s attention when - eventually - damage, injury, or other serious consequences occur.


Safety managers have a part to play in helping personnel look beyond the outcome and think about what could have happened or what almost happened. Promoting the reporting of hazards and "close calls," along with creating a culture of trust, is a significant first step.


You can read more about creating a strong safety culture in one of our previous blogs here!



The Devil on your Shoulder


The problem with cognitive biases is that they can grow and develop without us being conscious of them. Slipping into outcome bias can be all too easy and tempting. “Well, it worked out last time,” you might think before you depart without de-icing.


Stopping to consider your thought process and paying attention to your inner monologue can help you spot outcome bias as it arises. Outcome bias may present itself in thoughts such as:

THOUGHT FROM SCHEDULER - “We had a crew complete a 16-hour day last week; why can’t this crew do it today?”

REMEDY - “I am going to see what is different about this crew today. Have they had the same amount of rest?”


THOUGHT FROM PILOT - “I skipped the walk around last week because the client showed up early. This plane is in good shape. I can skip it again.”

REMEDY - “I don’t know what condition the plane is in unless I inspect it.”

THOUGHT FROM MECHANIC - “I don’t need a safety harness; I saw the supervisor working on the engine last time without one.”

REMEDY - “It is a long drop, and if I fall, there is a high probability that I could seriously hurt myself.”


Fortunately, there are a few things we can do to guard against biases in ourselves and others.



Four tips for preventing outcome bias



  1. Share “never again” stories. We all have our version of an “I will never do that again” story. If you feel like people could learn a lesson from your experiences, share your story! Escalating these valuable lessons learned from informal “hangar talk” to a safety report can also make tremendous improvements to safety.


  2. Debrief. Debriefing after a flight is a great way to reinforce learning. In your debrief, identify any close calls, discuss things that could have gone better, and think about what you can do differently next time. Don't forget to discuss what went well, too! Making a habit of self-reflecting and opening a dialogue with your crew allows you to continually assess your decision-making and performance. And remember to report any identified mistakes in your SMS!

  3. Eliminate the outcome during self-reflection. It can be challenging to be objective when self-critiquing our performance. When reflecting on whether a decision was good or not, try to eliminate the outcome from the equation. Instead, focus on the quality of the decision when it was made. In other words, focus less on the results and more on how you got there.


  4. Be liberal with safety reporting. The overall goal of an SMS is to identify hazards before they manifest into a poor outcome. So, safety reports are not just for "when something bad happened." They are even more critical when "something almost happened." Not only will your organization benefit from the safety data from close calls, but you could be sharing vital information that helps prevent costly mistakes.


The following excerpt from the Boeing Maintenance Error Decision Aid (MEDA) investigation process underscores the importance of reporting:


Data from the U.S. Navy shows that the contributing factors to low-cost/no-injury events were the same contributing factors that caused high-cost/personal-injury events. Therefore, addressing the contributing factors to lower-level events can prevent higher-level events. [3]



Conclusion


In the safety-critical world of aviation, flight crews and frontline workers make dozens of important decisions every day. The reality is that many of our decisions can be time-sensitive or made while under a high workload. So naturally, we rely on our previous experiences or “rules of thumb” to guide the decision-making process.


This is when we must be mindful of outcome bias, and realize that just because something has “worked out before” doesn’t necessarily mean it was safe or will work out again. Said another way, repeating the same task doesn’t always yield the same results.


If we have the opportunity to make a prudent decision on the ground before we even take off, it might spare us from facing some much more challenging decisions once airborne.



If you would like help or information on any of the topics mentioned in this article or assistance writing bulletins for your flight department, we’re always happy to help. Contact us today.  


References

  1. Walmsley, S. and Gilbey, A. (2019) Understanding the past: Investigating the role of availability, outcome, and hindsight bias and close calls in visual pilots' weather‐related decision making. Applied Cognitive Psychology. 33(6): 1124-1136.
  2. Lawton, R. and Parker, D. (2002) Barriers to incident reporting in a healthcare system. Quality and Safety in Health Care. 11(1): 15-18.
  3. Rankin, W. (2007) MEDA Investigation Process. Boeing Aero Issue 26, Q02.
