Saturday, January 26, 2013
We Must Learn From Others
For many years, while in the role of instructor/evaluator, I have used one principal facilitation technique to measure the progress of those I was working with: when they succeeded, they had to articulate how they created that successful outcome. It is a version of the Socratic method. Just pointing out errors, although common and easy to do, does not prepare anyone to be successful after leaving the training environment. Nor does it foster effective self-evaluation when students are on their own. Knowing what works, and how to apply that knowledge, is the key.
I try to avoid comparisons between the medical profession and air transportation. Managing human health and providing safe transportation are very dissimilar pursuits. However, there are some narrow areas that overlap. One of these is error recognition and mitigation. Air transportation, as I pointed out in a previous post, has been a leader in this area. The medical profession, particularly in the hospital environment, has not done well with either threat management or error recognition and mitigation.
National Transportation Safety Board Chairwoman Deborah Hersman says back-to-back battery incidents aboard Boeing 787 Dreamliners in the United States and Japan are "an unprecedented event" and a "very serious safety concern." I believe the same focus that has been placed on the 787 safety issues should be directed at the medical community. Perhaps Dr. Regina Benjamin, the Surgeon General, could apply an approach similar to Ms. Hersman's to preventable deaths from medical errors and infections. This is not an attempt to point out areas of ineffective management, but rather an invitation to learn from others' successes.
The watershed accident for airborne electrical fires was Swissair 111 on September 2, 1998. All 215 passengers and 14 crewmembers died when an electrical fault ignited an uncontained fire aboard the MD-11 aircraft, resulting in an unsurvivable crash off the coast of Halifax, Nova Scotia. Everyone agreed that scenario was unacceptable. The accident initiated profound changes in entertainment systems aboard airliners, as well as in the associated training, reporting, and oversight. The system asked, “What went wrong, and how do we fix it so it doesn’t happen again?” The aggressive approach to the elusive 787 battery problems is a direct result of that accident and those deaths.
In contrast, hospital-acquired infections (HAIs) are estimated by the CDC to cause about 100,000 deaths per year. That figure, unfortunately, has remained relatively constant for the last decade. There is a very insensitive saying that speaks to this point: “If you’re not sick when you go in the hospital, you probably will be before you get out.” Many of these HAI deaths could be avoided by simple, very low-cost changes in hygiene practices.
In addition, other preventable medical errors (treating the wrong patient, treating the wrong body part, administering the wrong medication, leaving surgical supplies inside the body, and so on) are responsible for a significant number of fatalities. It has been stated that if medical errors were classified as a disease, they would be the sixth leading cause of death in the U.S. This scenario, I believe, is also unacceptable.
The resolution strategy has been known for some time, but it remains elusive because it takes courage to “do it differently.” The first step, a new mindset, is always the most difficult. A paradigm shift in the culture of medicine is needed. This is not new territory outside of medicine: the aviation community went through a similar process before it could begin to effectively manage its own errors. It is still difficult, however. Some airlines remain hesitant to look objectively at their weaknesses as well as their strengths.
Dr. Atul Gawande, a surgeon and professor at Harvard Medical School, spoke to error recognition and mitigation in his 2011 commencement address at the school. He pointed out three skills the medical profession could focus on to reduce unnecessary patient deaths. These are the same skills that have been adopted by effective airlines to address human error.
“For one, you must acquire an ability to recognize when you’ve succeeded and when you’ve failed…”
“Second, you must grow an ability to devise solutions for the system problems that data and experience uncover.”
The third skill set Dr. Gawande described combines humility, discipline, and teamwork:
“They include humility, an understanding that no matter who you are, how experienced or smart, you will fail. They include discipline, the belief that standardization, doing certain things the same way every time, can reduce your failures. And they include teamwork, the recognition that others can save you from failure, no matter who they are in the hierarchy.”
Dr. Gawande didn’t reference him specifically, but these are the skills and values pioneered by Dr. Robert Helmreich, founder and director of The University of Texas Human Factors Research Project. The aviation community was an early adopter of that project’s work, and some in the medical community are starting to follow his recommendations. His team’s research was the genesis of Threat and Error Management principles, the Aviation Safety Action Program (ASAP), and the Line Operations Safety Audit (LOSA) program. These methodologies and programs have been very successful in reducing the negative consequences of human error in air transportation, and they have made a dramatic difference in the culture of the aviation community. Those results are a concrete validation of the message in Dr. Gawande’s speech.