Peter Pronovost, MD, PhD, is an anesthesiologist at my hospital and a specialist in patient safety, quality of medical care, and the reduction of medical errors. He became interested in safety and quality while he was a college student, after his father's cancer was misdiagnosed. By the time a second opinion provided the correct diagnosis, it was too late to give the appropriate therapy, and his father died "a horrible death, writhing in pain." In 2008, Dr. Pronovost was awarded a MacArthur Fellowship, the so-called "genius grant."
Even physicians are human, and therefore prone to imperfection and the occasional mistake. But Dr. Pronovost believes it is possible to put protocols and strategies in place to reduce, or in some cases eliminate, mistakes and harm to patients. He has proven that a number of these approaches are successful.
A matter of teamwork
About a year ago, we were discussing the causes of serious problems (like death) that occur in the operating room (OR). Dr. Pronovost told me that having a disaster in the OR is analogous to having a problem in a cockpit leading to a plane crash. When you read in the newspaper that the crash was the result of “pilot error,” almost always the subsequent investigation shows the same thing—the people flying the plane did not know each other and had never flown together before. Each one had different presumptions about who knew what and who was doing what. Because they did not know each other, communication was poor. Almost always, prompt communication among the people in the cockpit could have prevented the crash.
Because the data on plane crashes were so compelling, Dr. Pronovost thought it made no sense at all for surgeons, anesthesiologists, and nurses who did not know each other to be involved in complex surgeries. If everyone knew and was comfortable communicating with each other, then at the first hint of trouble the issue could be addressed promptly and a delay in management avoided. If people don't communicate (say, the nurse is intimidated by a surgeon he or she has not met before, or the surgeon incorrectly presumes the anesthesiologist is doing something the usual anesthesiologist would do), then problems may not be articulated or fixed quickly.
That discussion came to mind when I learned that Asiana Flight 214 had crashed while landing at San Francisco airport. Sadly, three passengers died and many others were seriously injured. A television news report described how the plane struck the ground short of the runway, its tail section was torn off, and a fire soon threatened the passengers. According to the report, the plane had been flying too low and too slowly.
“I bet the people in the cockpit of that plane had never flown together before,” I said to the person next to me, as we listened to the newscast. Before long, the reporter shared the information that, in fact, there had been three pilots in the cockpit on that transoceanic flight and none of the three had ever flown together.
“That’s amazing that you knew that,” said my companion.
“Yes, it is,” I replied.
The lesson of flight 214 for us in medicine is that we should be creating teams of people (physicians, nurses, anesthesiologists, technicians, etc.) who know each other, can anticipate what is coming next, and communicate well. We should not put people together in ORs or intensive-care units who are strangers or not “team players.” High-volume ophthalmic surgeons already know this, as they invariably have “favorite” scrub nurses and anesthesiologists. To do otherwise is to put patients at unnecessary risk.