I was recently asked to review a draft of a paper discussing safety culture within a major Air Navigation Service Provider/Regulator and how it measures safety, along with related topics: compliance vs. safety, balancing effectiveness and safety, quality assurance vs. quality control, human performance, and managing risk. The draft made many references to “before and after” specific dates. (I’m not identifying the organization, since almost any name could be inserted without significantly altering the content.)
The paper made me realize we have been addressing the same issues for at least 40 years. So I thought I would share my perspective on why we’re still looking for answers.
From the U.S. alone, I can recall reports centered on “before and after” August 3, 1981, the date of the U.S. air traffic controllers’ strike; the 1997 report by the National Civil Aviation Review Commission (NCARC); and countless reports following September 11, 2001. I recall numerous internal FAA reviews and joint government-industry efforts, with countless recommendations over the years. Yet we still search for answers.
After 48 years in the industry working in North America and Europe – including a 30-year career with the FAA as an air traffic controller at several of the busiest facilities in the U.S. and as an FAA accident investigator, plus experience as an Airline Transport Pilot/Flight Instructor – I’ve come to the opinion that there is a common characteristic affecting us all. The irony is that it’s the same characteristic that has made aviation as successful as it is today, with the milestones and achievements we have seen over the past 40+ years. That characteristic is our inability to recognize our resistance to change and a single-minded focus on “us,” whoever “us” may be, instead of on the overall system.
The Systems Foundation
Aviation is not simple. It’s a complicated matrix of multiple people, organizations, systems, and cultures that come together to make the worldwide aviation system a marvel of human achievement. It requires an individual and organizational “can do” attitude that has created cultures that would never be described as indecisive. Ironically, while we can adapt quickly to solve an immediate problem, we are not receptive to change.
The old expression was “stove pipe,” and it was an accurate analogy. In a 1984 interview, FAA Administrator Don Engen, describing the FAA Air Traffic culture, said: “We are too autocratic. ‘Turn right now to 360, climb to 15,000 feet’ – you get used to barking out those orders and you ascend to become a supervisor and you’re autocratic. And the supervisors become managers.”
Similar patterns could be seen in flight training, airline operations, and maintenance. Success was based on performance within a very rigid and controlled environment of specialization. That rigid, single-minded focus allowed us to build a complex system requiring multiple people and organizations to work as a team – even though most didn’t realize it – but it also created a foundation that makes it almost impossible to admit we made a mistake or, even worse, that someone else may have a better idea.
The 600-pound Gorillas In Our Room
There are a number of classic aviation topics we normally resist discussing. Consider if you will…
Aviation always emphasizes safety, and generally states, “Safety is our number one goal.” I submit that we have been misusing and oversimplifying this term for years and, in doing so, creating an expectation no system can ever achieve. We’re not being honest.
One definition of “safe” is “freedom from the occurrence or risk of injury, danger, or loss” – in other words, the expectation that nobody will get hurt. While a noble goal, it ignores the reality that there is no such thing as being “perfectly safe.” Aviation, like all modes of transportation, involves risk, and you have to work at staying safe.
But if you are indoctrinated to believe everything is already safe, you will be less inclined to see risk and even less inclined to report it. The use of the term “risk” within Safety Management Systems – introduced over 25 years ago – provided aviation with one of its greatest leaps in safety. While nobody is willing to say, “This operation is unsafe,” we are all willing to say, “I think we need to talk about how risky this could be…” and then mitigate and control those risks. But we still resist looking at something we’ve been doing “forever” and acknowledging that our risk analysis may be biased by history. Again, we’re not being honest with ourselves.
Professional safety personnel understand this, but they are not the target audience for this example. The intended audience is the people who do the same job day after day and who have a different view of what’s normal and expected.
Regarding the reporting of safety events, we need to accept that humans make mistakes. Unfortunately, lacking the understanding of human factors we have today, early investigations focused on “what happened” and on who made the mistake that caused the event.
Predictably, this led to individual failures being blamed for events, with associated actions taken against the individual. But doing so is the fastest way to shut off our best source of dynamic safety data in a system where multiple hazards and threats must be managed as a matrix.
The Regulator – A Hidden Risk?
Regulations, procedures, practices, etc., are an important part of the “checks and balances” of safety. But regulations are generally drafted after a significant event, when organizational and political pressure demanded corrective action. The regulatory process is rarely proactive and is normally influenced by input from various organizations, which involves a series of compromises.
To be candid, people making regulations are just people; they too can make mistakes, and new regulations can create unintended risks to the system that were not foreseen by the regulatory process. Unfortunately, when the regulator is questioned, it’s the rare organization that will say, “We made a mistake.”
The point is that operational organizations should not simply focus on compliance, believing compliance means they are safe. No regulator understands the overall system risks as well as the operators, who are ultimately responsible for accurate risk assessments.
Safety & Efficiency – The Classic Debate
Just as there is a relationship between compliance and safety, there is a historical relationship between system capacity and safety. There has always been tension – or worse, conflict – between system efficiency and system safety. Operational people may view safety people as obstructionists to efficiency, while safety people may view some operational actions as unwise and question why safety was not consulted. Yet instead of opposing perspectives, the combination of the two kinds of experience can become a force-multiplier for overall efficiency executed safely.
There is also a subliminal risk in the safety vs. efficiency debate. That risk is found in the confusion people experience when they receive mixed messages – when emphasis is placed on “safety first” but subsequent actions demonstrate a focus on economics. The truth is that safety and economics are linked, and most people understand this. The mitigation for this risk is a clear policy on priorities that is executed consistently.
Not So Fast
Finally, we should examine what might be the most difficult challenge to managing safety – the natural desire to take action and fix something immediately.
Safety takes patience. Understanding all the factors involved in a complex system requires an investment of time and expertise from multiple sources – one person doesn’t know everything anymore (if anyone ever did). Patience conflicts with society’s need to know immediately why something happened, whose fault it was, and how to make sure it never happens again. That is a naïve and myopic goal, and it undermines taking the proper action to maintain safety.
No Time Left For Subtlety
It seems some outside experts are becoming frustrated by the slow pace of change in the system. In what struck me as some of the boldest (bluntest) language I can recall, a recent (2020) MITRE report discussing FAA culture states: “Safety culture has returned to an event-driven foundation, despite the desire to move away from that being part of the impetus for the changes implemented in 2012. This thirst, or perceived thirst, for event information by each level of leadership continues to drive event-driven behaviors which negatively (and profoundly) affect safety efforts at every step along the QA/QC process.” The section goes on to say, “This event-driven culture needs to stop, and the facilities must be allowed to recover from an event.”
As I indicated, I’ve formed these opinions over years of experience with many “600-pound Gorillas.” Our global system is improving – a tribute to the dedication of all the people who contribute to the success of aviation – but I think we could do more, and faster.
I also suspect aviation is not the only industry with these issues. Rail, maritime, and medicine probably have similar experiences. If that’s the case, perhaps we can learn from each other?