Lessons from an eavesdropper

On my way home from the airport, returning from the Telluride East conference outside of Washington, DC, I overheard a woman on the phone talking about a harrowing experience she had just had on an airplane. The conversation went something like this:

We were so lucky to have had a great pilot. We were on the runway and cleared for takeoff when another jet appeared out of nowhere in front of us! If our pilot hadn’t slammed on the brakes, we would have crashed into the other plane. I’m so glad that the pilot was paying attention! He really saved us all.

After a long discussion at Telluride East about high-reliability organizations, I found this fragment of conversation extremely interesting. The airline industry is often held up as a model of safety, a prime example of a high-reliability organization. And yet here was a woman describing a narrowly averted disaster on the runway!

To me, this episode reinforced that, even in high-reliability organizations, humans make mistakes. Eliminating human error is impossible. Preventing harm by recognizing this fact and designing a system to accommodate human error, however, is not.

On another level, the woman’s attitude toward the error was very telling. All credit for saving the plane went to the pilot. He was smart and aware; he had saved the day all by himself! Some bad pilot had erred, but the good pilot had prevented disaster. Never mind that the pilot no doubt had extensive training. Never mind that the pilot was not the only one in the cockpit. What was the co-pilot’s role? What were the air traffic controllers doing during this time? Perhaps the situation was exactly as the woman imagined it: someone made a mistake, and the pilot, through sheer force of will, saved both planes from disaster. But I suspect that the situation was far more complicated than that, and that there was not simply one “bad” actor and one “good” actor.

I also suspect that, had the unthinkable occurred and the planes collided, the pilots (whether or not they survived) would have borne the brunt of the blame in the eyes of the public. And that would be a real shame, because it would create a barrier to growth and change within the organization. I hope and believe that, in the spirit of a true high-reliability organization, the crew debriefed the event and systemic changes were made to avoid such a near miss in the future. I also hope and believe that, as long as the errors made were reasonable human errors and did not result from flagrant protocol violations, the pilots were neither “blamed” nor “shamed” for the near-miss event.

Clearly, health care has a lot to learn from high-reliability organizations such as the aviation industry. No force on earth can prevent humans from making mistakes. Only by recognizing human fallibility can organizations put systemic measures in place to prevent harm.