In any industry, mistakes are bound to happen. Mistakes can be either willful or unintentional, but both are equally dangerous to aviation safety.
In aviation, unintentional mistakes can lead to devastating accidents, an outcome that we hope to prevent by implementing Safety Management Systems (SMS) in our organizations.
Aviation SMS implementations are expected to reduce both direct and indirect costs associated with accidents and minor incidents. The finance folks prefer to look at hard numbers associated with accidents, such as:
- aircraft damage and repair costs;
- medical expenses;
- insurance deductibles and premium increases; and
- legal fees and regulatory fines.
Finance folks and upper management are more receptive to SMS implementation costs when you have historical cost data to make the business case. How does one place a value on a "preventable accident" or "costs saved because we didn't have an accident?" How does a safety team directly attribute the SMS implementation to the avoidance of one or two accidents per year?
This is a very difficult argument for safety managers to win. Does simply having implemented an SMS mean you won't experience adverse effects from risk? We must be realistic.
While SMS encourages us to move toward predictive analysis to prevent damage and injury, risks cannot be entirely mitigated. Even the best systems will sometimes experience a mistake or an accident. Aviation SMS risk management processes therefore focus on identifying hazards, assessing the associated risks, and implementing and monitoring risk controls.
Mistakes help drive the SMS's risk management processes. However, mistakes must be identified and reported in order to fix the "system design" and reinforce risk controls that prevent a recurrence.
When mistakes do occur, they can be unfortunate or even tragic, but there are always lessons to take away. For example, consider the following story.
I noticed the door had two separate locks, and my flight instructor was religious in making sure they were both locked. She would make me close the door, then lock it once. After we both heard the first click, she would tell me to engage the second lock.
After we both heard the second click, she would look behind my back at the door to make sure it was properly sealed. Then she would repeat the same procedure with her own door.
To me, it seemed like overkill. I couldn’t fathom ever doing such a ritual on my own because I really didn’t see the purpose. When I asked my flight instructor why she insisted on closing the doors like this, she told me.
Someone had gone up flying in the 162 and hadn’t locked the doors properly. On a 152, this wouldn’t have been a major issue, because the wind would have pushed the door up against the plane. On the 162, however, the doors open up. This meant that the wind caught the door and ripped it up and off.
I was so surprised that something so major had happened at the club I flew at, with the plane I was flying. It was sobering. I thought my flight instructor was done there, but she wasn’t. She told me about how later, someone was flying alone and had forgotten to latch the passenger door properly.
The exact same thing happened. I remember my eyes got wide when my instructor told me this. It wasn’t a one-time fluke or some defect.
I remember looking at the doors of that plane and thinking about what would happen if they fell off, and how easy it would be for them to fall off.
- Story contributed by Alex Nevin, 2016
Accidents are bound to happen from time to time, no matter how much effort we put into our SMS. A more realistic goal would be to prevent the same mistake from happening again.
Sometimes a mistake will have one clear cause, but more often it's a series of events and circumstances that leads up to an accident. In the SMS, we call those events and circumstances contributing factors.
Contributing factors are clearly identifiable elements of the conditions leading up to an event that influenced its outcome: they made the event more likely, more severe, or both.
In the first instance of Alex’s story, perhaps a safety manager blamed a single latch and added the second as a redundancy. Perhaps in the second, the pilot was distracted and the double-check protocol Alex described was implemented.
These are simple speculations and don’t take into account any data beyond Alex’s story, but even so, we can draw a lesson from them.
The single latch and distraction would both be contributing factors in the incidents that Alex's instructor described. For another example, look at this information from the FAA about contributing factors in wrong-runway takeoffs.
Human factors like distraction are one of the most commonly cited contributing factors. If you’d like a quick and easy resource to share with your employees or colleagues on this important topic, check out this ready-to-distribute newsletter. It’s free!
While an accident in and of itself may be unfortunate or even tragic, safety managers and teams can analyze the contributing factors to implement risk controls that mitigate the risk in the future. These risk controls should be documented and monitored, which provides safety data for our SMS.
The data collected in the wake of a mistake can be used in many ways. In the old model of reactive safety, blame might be assigned and the data then forgotten. With a functioning SMS, we put that data to work to drive proactive and predictive risk management processes.
In my thought experiment earlier, I proposed that after the second incident, the club instituted a protocol for making sure both latches were secured. Simply blaming the distracted pilot would have been a reactive answer. Adding a protocol is a proactive measure.
Beyond using the data for proactive measures like a safety protocol, the data from the accident can also be put to use with predictive methods. Adding the data about this accident to an SMS database improves the accuracy of predictive models that can further improve safety by analyzing historical records.
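To make that concrete, here is a minimal sketch of one way historical incident records might be aggregated to surface recurring contributing factors. The record structure, factor names, and severity values are hypothetical, for illustration only, not drawn from any particular SMS product.

```python
from collections import Counter

# Hypothetical incident records; in practice these would come from the
# SMS database of reported safety issues.
incident_records = [
    {"factors": ["pilot distraction", "single latch design"], "severity": 3},
    {"factors": ["pilot distraction"], "severity": 2},
    {"factors": ["fatigue", "weather"], "severity": 4},
]

# Count how often each contributing factor appears across all records.
factor_counts = Counter(
    factor for record in incident_records for factor in record["factors"]
)

# Recurring factors are candidates for new or reinforced risk controls.
for factor, count in factor_counts.most_common():
    print(f"{factor}: {count} occurrence(s)")
```

Even a simple frequency count like this can flag which factors deserve attention before a full predictive model is in place.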
That’s a lot of value from the incident data by itself, but the collected data can still be put to use in another area of the SMS. By building a Lessons Learned library, accident data can also be used as a training resource.
Safety promotion is one of the most under-utilized tools available to safety managers. By most, I'm not talking about 45 or 50 percent of safety managers; more like 80 to 90 percent of companies do not execute safety promotion campaigns that come even close to realizing their full potential. The Lessons Learned Library is simply one safety promotion tool that can be exploited to improve aviation safety.
A lessons learned library is different from the data analytics we use for predictive risk analysis. It focuses on the takeaways of safety managers and investigation teams. Think of it as a collection of after-action reports.
By reporting the lessons they learned in investigating an accident and then implementing relevant risk controls, the safety team adds value to their safety community. These lessons can then be passed on to new safety leaders as well as other employees either by email or by providing a searchable database of "Lessons Learned" in the organization's SMS database.
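As a sketch only, a lesson learned record might capture fields like these. The schema below is an assumption for illustration, not SMS Pro's actual data model, and the sample values are invented, loosely based on Alex's story above.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LessonLearned:
    """One entry in a lessons learned library (illustrative schema)."""
    title: str
    event_date: date
    summary: str                                # what happened
    contributing_factors: list[str] = field(default_factory=list)
    risk_controls: list[str] = field(default_factory=list)  # controls added afterward
    takeaway: str = ""                          # the lesson for future readers

# Illustrative values only.
lesson = LessonLearned(
    title="In-flight door separation after improper latching",
    event_date=date(2016, 2, 1),                # hypothetical date
    summary="An improperly latched door was caught by the wind and torn off.",
    contributing_factors=["single latch engaged", "pilot distraction"],
    risk_controls=["two-click latch verification before taxi"],
    takeaway="Verify both latches audibly and visually on every flight.",
)
```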
Providing and managing a lessons learned library offers:
- a searchable record of past events and the risk controls that followed;
- a ready-made training resource for new employees and incoming safety leaders; and
- an ongoing safety promotion channel that strengthens safety culture.
By building a Lessons Learned library, you stand to gain great benefits for your SMS with the information you’re already gathering.
Better yet, make your lessons learned library available to the community at large so that safety professionals everywhere can learn from your experience. Not only will you be helping your colleagues, but the next generation of safety managers as well.
Formal investigations of reported safety issues offer great amounts of useful data to share with employees in the form of lessons learned. Yet many aviation SMS implementations neglect lessons learned. Why?
Based on ten years of empirical evidence from hundreds of small to medium-sized aviation service providers, only 15-18% of aviation SMS implementations routinely create lessons learned and publish them to their lessons learned libraries. Another 30% of operators have published more than two lessons learned for their employees. That leaves roughly 50 percent who never create and publish lessons learned for their employees at all.
Why do operators not make better use of the Lessons Learned Library in aviation SMS? Here are my thoughts; they are only thoughts, as I have not conducted any surveys to verify my suspicions.
Lack of time to prepare lessons learned
In many cases, the safety manager role is part-time, and the manager doesn't have time to prepare a lesson learned after conducting an investigation. Instead, the safety manager is putting out SMS fires or contributing to production activities that generate revenue.
Lack of available resources to prepare lessons learned
This may be related to the part-time safety manager. Based on an analysis of ten years' worth of safety investigations, lessons learned are more often created by full-time safety managers or by teams that have full-time safety managers on staff.
Lack of policy guidelines or ambiguous risk management procedures
SMS risk management procedures do not require a lesson learned to be created in any instance. When it is not required, and management does not explicitly endorse spending additional company resources on drafting lessons learned, safety managers are less inclined to follow this "best practice," even when the company possesses a "Lessons Learned Library" to store and retrieve them.
Lack of safety team acknowledgment (no incentive to create lessons learned)
Safety teams spend considerable time preparing safety meetings, documenting risk management activities for reported safety issues and audits, providing SMS training, and running safety promotion activities to enhance safety culture. Since most operators have never had a Lessons Learned Library, they likely don't know how to use one, nor do they expect employees to find their way to the library to conduct research. When safety teams are not rewarded for "extra-curricular," unseen, or unmonitored safety activities, they are less inclined to participate in them.
Lack of top management support
Upper management either does not know how important a Lessons Learned Library is to safety culture, or they are not providing the oversight required to ensure lessons are regularly added to the library.
Most safety professionals and managers agree that knowledge can be gained from reviewing past accidents and minor incidents. To take a case in point, consider the ubiquitous "case study" that appears in so many college textbooks: case studies endure because students demonstrably learn from them.
We have the technology to present case studies to employees using SMS databases. An SMS database is also useful for conducting accident investigations, risk assessments, and everything else related to an organization's risk management efforts. Furthermore, an SMS database may also have a tool that helps safety teams easily create and disseminate "Lessons Learned."
How do I know this? SMS Pro has had a Lessons Learned Library in its database for the past ten years. This is how I know how often lessons learned are created among hundreds of aviation service providers.
An integrated SMS database is an excellent way to manage lessons learned. Note that I said "Integrated" SMS database. This implies that all risk management data is easily within reach of the safety team performing the investigation and drafting the resulting "lesson learned."
A good lessons learned library will allow users to search for lessons learned by attributes such as event type, date, aircraft type, and department.
Another useful feature of a lessons learned library is full-text search. When an employee wishes to research a particular topic, the ability to query the SMS database for related accidents or incidents is very useful. The Lessons Learned Library is an exceptional safety promotion tool that helps build a proactive safety culture.
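For illustration, here is a minimal sketch of full-text search over lessons learned using SQLite's built-in FTS5 extension, assuming the SQLite build bundled with your Python includes FTS5. The table layout and sample entry are hypothetical, not any vendor's actual implementation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# FTS5 virtual table; both columns are indexed for full-text search.
conn.execute("CREATE VIRTUAL TABLE lessons USING fts5(title, takeaway)")
conn.execute(
    "INSERT INTO lessons VALUES (?, ?)",
    (
        "In-flight door separation after improper latching",
        "Verify both latches audibly and visually before taxi.",
    ),
)

# An employee's topic query, e.g. everything mentioning 'latch'.
for (title,) in conn.execute(
    "SELECT title FROM lessons WHERE lessons MATCH 'latch'"
):
    print(title)
```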
I’m sure my readers have learned many lessons over the years and we’d love to hear about them. If you’ve ever learned a valuable lesson from a mistake or accident, please share it in the comments below.
Do you have an Emergency Response Plan in place? Use this checklist to create or review your ERP!
Editor’s Note: This article originally appeared as “Lessons Learned Databank: Essential for Aviation Safety Managers” by Tyler Britton in February of 2016. I’ve expanded on his original piece and added Alex’s story, which originally appeared the same month, titled “Aviation Safety Programs Save Lives - Learn from Others Events,” to add context and depth to Tyler’s piece. I’d like to thank both of them for their contributions to this article.
- Nichole Kruger
Last updated August 2024.