It always saddens me when a medical mistake is reported in the media, and the immediate reaction of so many people is: “Someone that careless and incompetent should be fired. There is absolutely no excuse for hurting a patient by omitting such a basic step as calculating the right dose / making sure you have the right patient / checking for allergies / etc.”
My immediate response to this is always: “Have you ever locked yourself out of your house?” Human beings make mistakes. Human beings sometimes omit basic, crucial steps from tasks they supposedly have mastered. We make small, stupid, high-consequence mistakes so predictably, so inevitably that an entire industry (locksmithing) is dedicated to undoing the small, stupid, high-consequence mistake of not ensuring we have our keys when and where we need them.
The keys analogy is actually a great way to think about both how errors happen and how we can prevent them. Remembering your keys before you leave the house is a basic, crucial component of the process known as Adulthood. We expert Adults do this step every day, usually without error or even thought. We don’t think about remembering our keys even though the potential consequences of forgetting are quite serious. Without keys, we can’t perform Adult activities such as showering and sleeping in our own beds. Competent Adults should always make sure they have their keys before they leave the house.
And yet… everyone locks themselves out at some point. No matter how long we have owned keys, no matter how many reminder signs we put by the door, no matter how many seminars about memory tips and tricks we attend: sometimes, we just forget our keys.
But there is hope! If, as we have discussed at length here in Telluride, we not only expect mistakes but plan for them, we can design systems that minimize the harm these inevitable mistakes cause. When do patient-harming errors in medicine occur? Quite often, under the same circumstances that errors occur in ordinary life. We forget our keys when we are rushed, when we are tired, when we are multitasking, or when our basic routine has been altered for some reason.
And how do we create safe, reliable systems? (Hint: not by hoping to never again rush, be tired, multitask, or change up our routine.) We build methods for preventing mistakes into the system itself. Each time I leave my apartment, I pause and ask myself “Wallet? Phone? Keys?” Without ever meaning to, I have created my very own pre-procedure checklist.
What about when, despite my excellent time-out system, I still make a mistake? Fortunately, my friend Maggie down the street has a set of spare keys, so getting into my apartment does not depend on my infallibility. Yes, I’ve had several near misses, but so far the checklist + Maggie backup system has prevented an actual adverse lockout event.
However, the Maggie system is frankly less reliable on nights and weekends, so my system still has some obvious gaps. I therefore plan to build in an additional safeguard: hiding a set of keys in the yard, so that I am protected from harm even if, all on the same night, I forget my keys, I run through the checklist improperly, and Maggie stays at the library until 2am. Doesn’t having this safeguard (one that doesn’t depend on human infallibility) make so much more sense than simply hoping that these three failures (each of which will likely happen at some point) never all happen at once?
Human beings make mistakes. We can protect patients by acknowledging this truth and by fighting the outdated culture in healthcare that insists providers are infallible. Designing systems based on how we would like the world to be (a world where medical professionals never make mistakes) hurts patients. Designing systems based on how the world actually is keeps our patients safe. I would go so far as to call creating such systems a professional and moral mandate.
I am so happy to be here in Telluride, with words such as “human factors” and “causal chains” being tossed around with nerdy abandon. I hope that as this week continues, we will learn even more ways to design systems that prevent human error and, more importantly, that catch and mitigate inevitable human errors before they result in patient harm. I hope that every single one of us here never forgets the main reason we care about creating safe systems: not just to follow rules and regulations, not just to reach “100 days without a serious safety event”, not just because patient safety discussions are nerdtastically awesome.

We are here at the Telluride Patient Safety Roundtable because we are members of a healing profession, a healing vocation. We want our patients to walk away healthier and more cared for than when they first entrusted themselves to our care. To err is human, but learning and improving are also part of being human. We can learn from the mistakes of our past to anticipate the mistakes of our future, and we can design systems that keep our patients safe all the same.