Mistakes Are an Opportunity
In any industry, there are bound to be mistakes. In aviation, mistakes can lead to devastating accidents, an outcome that we hope to prevent by implementing Safety Management Systems (SMS) in our organizations.
While SMS encourages us to move toward predictive analysis to prevent damage and injury, it’s impossible for risks to be entirely mitigated. Even the best systems will sometimes experience a mistake or an accident.
When mistakes do occur, it can be unfortunate or even tragic, but there are always lessons to take away. For example, consider the following story.
A Story behind Every Control
I noticed the door had two separate locks, and my flight instructor was religious in making sure they were both locked. She would make me close the door, then lock it once. After we both heard the first click, she would tell me to engage the second lock.
After we both heard the second click, she would look behind my back at the door to make sure it was properly sealed. Then she would repeat the same procedure with her own door.
To me, it seemed like overkill. I couldn’t fathom ever doing such a ritual on my own because I really didn’t see the purpose. When I asked my flight instructor why she insisted on closing the doors like this, she told me.
Someone had gone up flying in the 162 and hadn’t locked the doors properly. On a 152, this wouldn’t have been a major issue, because the wind would have pushed the door up against the plane. On the 162, however, the doors open upward. This meant that the wind caught the unlatched door and ripped it up and off.
I was so surprised that something so major had happened at the club I flew at, with the plane I was flying. It was sobering. I thought my flight instructor was done there, but she wasn’t. She told me about how later, someone was flying alone and had forgotten to latch the passenger door properly.
The exact same thing happened. I remember my eyes got wide when my instructor told me this. It wasn’t a one-time fluke or some defect.
I remember looking at the doors of that plane and thinking about what would happen if they fell off, and how easy it would be for them to fall off.
- Story contributed by Alex Nevin, 2016
Mistakes Provide Reference Points
Accidents are bound to happen from time to time, no matter how much effort we put into our SMS. A more realistic goal would be to prevent the mistake from happening again.
Sometimes mistakes will have one clear cause, but more often it’s a series of events and circumstances that leads up to an accident. In SMS, we call those events and circumstances contributing factors.
Contributing factors are clearly identifiable elements of an SMS program that:
- Directly cause a mistake or
- Create an environment where mistakes are inevitable
In the first instance of Alex’s story, perhaps a safety manager blamed a single latch and added the second as a redundancy. Perhaps in the second, the pilot was distracted and the double-check protocol Alex described was implemented.
These are simple speculations, and don’t take into account any data beyond Alex’s story, but even so, we can draw a lesson from them.
The single latch and distraction would both be contributing factors to the incidents that Alex’s instructor talked about. For another example, look at this InFO from the FAA about the contributing factors in wrong runway takeoffs.
Human factors like distraction are among the most commonly cited contributing factors. If you’d like a quick and easy resource to share with your employees or colleagues on this important topic, check out this ready-to-distribute newsletter. It’s free!
While an accident in and of itself may be unfortunate or even tragic, safety managers and teams can analyze the contributing factors to implement controls that mitigate the risk in the future. These controls should be documented and monitored, which provides safety data for our SMS.
How Useful Is Accident Data?
The data collected in the wake of a mistake can be used in many ways. In the old model of reactive safety, blame might be assigned and then the data forgotten. With a functioning SMS, we’ll put that data to work to implement proactive and predictive safety.
In my thought experiment earlier, I proposed that after the second incident, the club instituted a protocol for making sure both latches were secured. Simply blaming the distracted pilot would have been a reactive answer. Adding a protocol is a proactive measure.
Beyond using the data for proactive measures like a safety protocol, the data from the accident can also be put to use with predictive methods. By adding the data about this accident to an SMS database, it adds to the accuracy of predictive models that can further improve safety.
That’s a lot of value from the data by itself, but it can still be put to use in another area of our SMS. By building a lessons learned library, accident data can also be used as a training resource.
Build a Lessons Learned Library
The lessons learned library is different from the database we use for predictive analysis. Instead, it focuses on the takeaways of safety managers and teams. Think of it like a collection of after action reports.
By reporting the lessons they learned in investigating an accident and then implementing controls, the safety team adds value to their safety community. These lessons can then be passed on to new safety leaders as well as other employees.
Building a lessons learned library provides:
- A demonstrated commitment to safety
- Documentation of SMS for auditors
- Transparency that fosters safety culture
- Opportunities for training and self-improvement
By building a lessons learned library, you stand to gain great benefits for your SMS with information you’re already gathering.
Better yet, make your lessons learned library available to the community at large, so that safety professionals everywhere can learn from your experience. Not only will you be helping your colleagues, but also the next generation of safety managers.
What Lessons Have You Learned?
I’m sure my readers have learned many lessons over the years and we’d love to hear about them. If you’ve ever learned a valuable lesson from a mistake or accident, please share it in the comments below.
Editor’s Note: This article originally appeared as “Lessons Learned Databank: Essential for Aviation Safety Managers” by Tyler Britton in February of 2016. I’ve expanded on his original piece and added Alex’s story, which originally appeared the same month, titled “Aviation Safety Programs Save Lives - Learn From Others Events,” to add context and depth to Tyler’s piece. I’d like to thank both of them for their contributions to this article.
Do you have an Emergency Response Plan in place?
Use this checklist to create or review your ERP!