    Barbara Davis, MA

    In honor of the recent Patient Safety Awareness Week, I wanted to share my own patient experience as an example of how we as patient safety champions can improve the safety of patient care at a systems level. While there was no harm in my case, I experienced a number of “near misses” that signified opportunities for improvement.
     
In a “near miss,” an error is committed, but the patient experiences no clinical harm, thanks to early detection or sheer luck. How can healthcare organizations learn from “near misses” so that errors are prevented and no harm to patients occurs?
     
    This story is about my outpatient surgery and the cascade of “near misses” that followed, with my analysis of the errors through the lens of safety literature and a framework that categorizes system-related errors.
     
I have had a number of surgeries over the last two years, including one outpatient surgery to remove hardware following a tibial plateau fracture I suffered on the snowy slopes of Colorado. After my injury in December 2016, I was treated in a local ER and saw my surgeon about a week later; I then underwent a surgical procedure requiring internal fixation with a plate and seven screws. Even after two months of prescribed non-weight bearing to allow for healing, I was in tremendous pain, and my leg remained bowed in the months that followed. In August 2017, the surgeon and I discussed the next course of treatment: removal of the hardware, followed by a knee replacement in November. After researching the incidence of hardware removal, I found that it is a very common surgery, with about 14% of patients having their hardware removed.
     
The plan was for the surgeon to use the original incision site to remove the plate and screws. That part went well. All the preoperative and intraoperative processes were flawless, as far as I could tell.
     
When I got to the PACU, my heart rate was low and I needed to be monitored longer. I could tell the team was worried, and so was I. After some time, I was cleared by cardiology and discharged from the PACU. But when I went to put on my clothes, I couldn’t find them. The normal preoperative process was to put my clothes in a hospital belongings bag, label it with a pre-printed label, and place it on a wire shelf outside the OR/PACU so that when I was ready, the transport nurse could easily pick up the bag and place it on the stretcher.
     
    A simple misplacement led to tremendous distraction
Once my bag of clothes went missing, a cascade of events occurred that could have caused me harm and that I would now label a “near miss.” The nurses were so focused on finding my clothes that they became distracted and forgot the other discharge steps, perhaps assuming that someone else had already done them.
     
When my clothes were not found, two nurses went upstairs to the hospital rooms of patients who had recently been discharged from the PACU. After checking a number of rooms, they finally found the bag in the room of a woman whose husband said, “I wondered whose they were and why they were in my wife’s room.” Although he could have said something sooner, it certainly was not his responsibility. My guess is that he, too, was worried: if the hospital couldn’t keep track of someone’s clothes, how could it keep track of his wife?
     
When the two nurses returned, they gave me my clothes and sent me on my way, forgetting the discharge instructions and wound care teaching. As a patient, I was eager to get home. Even as a strong advocate for my own care, I did not realize what had been missed until after I left the hospital.
     
    The burden of communication should not lie with the patient
When I got home and looked at my leg, I found the dressing very peculiar: the surgical wrap covered my entire leg from thigh to ankle, and it was longer than and different from the dressing used in my original internal fixation surgery. I reviewed my packet of information for wound care instructions and didn’t find any. I am embarrassed to say that it took me a week to call my doctor’s office. I knew this was not right, and in hindsight, I would have greatly benefited from some form of communication from the hospital or my doctor.
     
The nurse at my doctor’s office was alarmed when he learned of my situation and calmly walked me through removing the wrap bandage. After I sent him a picture of the wound, he told me to go to the drugstore and buy sterile pads and tape. About $75 later, I came home and, under the nurse’s direction, changed my dressing.
     
When I reported this to the hospital’s patient representative, I received a call back from the nurse manager, who said, “I reviewed your medical record, and you were right. You didn’t receive any wound care instructions.” To me, it seemed as though she had doubted my word. Had I not taken control of my care, it is easy to imagine how an infection or other adverse event might have occurred.
     
What went wrong? Looking back now, 18 months later, through the lens of patient safety, I see several safety issues that had the potential to cause harm but didn’t. Maybe I was lucky. Here is my dissection of the anatomy of these safety errors:
     
    Framework of system safety
James Reason, PhD, the grandfather of safety literature and error categorization, has developed a body of research around the two approaches to human error: the person approach and the system approach. He acknowledges that human errors are inevitable and occur in large part because of poorly designed systems. Errors are the consequence, not the cause, of design flaws upstream of the error itself. He recommends “countermeasures” that help mitigate errors and change the systems in which humans work; these countermeasures don’t change the human condition, but they change the conditions in which we work.
     
In healthcare, we tend to focus on the person approach and the unsafe acts (errors and procedural violations) of people on the front line. We tend to attribute unsafe acts to mental processes such as forgetfulness, poor motivation, carelessness, and perhaps negligence. In response, we write a new procedure, post a new poster, or name, blame, and shame an employee or staff member. In retrospect, I have been in plenty of meetings as a hospital quality leader where we faulted an individual instead of examining the contributions of the system.
     
There is also the system approach, which accepts that errors will occur and puts safeguards in place to prevent them from causing harm. One of Dr. Reason’s 12 principles of effective error management is aiming for continuous reform, not local fixes. This continuous striving for system reform is the real product of safety work: it reflects the organizational culture and its underlying values.
     
As Dr. Reason states, error management has three components:

1. Reduction: Identify potential errors and put countermeasures in place to prevent them.
2. Containment: If an error does occur, manage it, and then look for other ways, locations, and conditions in which the same or a similar error could occur.
3. Management for effectiveness: Once the single error has been resolved, continue to seek ways in which the system can be made error-proof.

     
    My patient experience through a safety lens
At the sharp end of healthcare, my surgeon closed the wound and left the OR, leaving the wound care to the OR team. In my own mind, I now ask questions such as: Was it a new team member who placed the gauze, cut a piece of what looked like gaffer’s tape, and wrapped my leg from thigh to ankle? Had he or she ever been instructed on the proper dressing protocol for this type of surgery? Had he or she been supervised and given feedback? Where was the supervisor who could have done a visual check of the dressing before I left the OR?
     
My bag of clothes, although labeled, did not appear to have been shelved in alphabetical order; the bags looked as though they had been placed randomly, though I could be wrong. With the 20/20 vision afforded by reflection, I can see that the nurses were distracted by my missing clothes, and I wonder if this led them to abandon the usual steps in the discharge process, which included wound care teaching.
     
Even though the hospital typically has automated postoperative discharge calls go out to patients 24 to 48 hours after surgery, there was a two-week pause in the program. These calls serve as a safety net for patients during care transitions, asking simple questions about access to medications, understanding of discharge instructions, and scheduling a follow-up appointment.

The decision to pause the calls was made for an unrelated reason, and no manual calls were made to fill the gap. By chance, I was discharged during this two-week window, and I would have greatly benefited from the follow-up.
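The missing countermeasure here is a workflow question, and it can be made concrete. Below is a minimal sketch, in Python, of how a discharge workflow could reroute patients to a manual call list whenever the automated program is paused, so that an upstream decision never silently removes the safety net. Every name in it (Patient, schedule_automated_call, create_manual_call_task, schedule_followup) is hypothetical; this illustrates the safety-net concept, not the hospital’s or any vendor’s actual system.

    from dataclasses import dataclass

    @dataclass
    class Patient:
        name: str
        phone: str

    def schedule_automated_call(patient: Patient) -> None:
        # Hypothetical stand-in for the automated outreach platform's scheduler.
        print(f"Automated 24-48 hour follow-up call queued for {patient.name}")

    def create_manual_call_task(patient: Patient) -> None:
        # Hypothetical stand-in for adding the patient to a nurse's manual call list.
        print(f"Manual follow-up call task created for {patient.name}")

    def schedule_followup(patient: Patient, automation_paused: bool) -> None:
        # The countermeasure: pausing the automated program reroutes every
        # discharge to a manual call list instead of dropping it entirely.
        if automation_paused:
            create_manual_call_task(patient)
        else:
            schedule_automated_call(patient)

    # During the two-week pause, a discharge still generates outreach.
    schedule_followup(Patient(name="J. Doe", phone="555-0100"), automation_paused=True)

The design point is simply that the fallback lives inside the discharge workflow itself, so the choice is between automated and manual outreach, never between outreach and nothing.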
     
In hindsight, I am curious whether other patients missed the follow-up calls and had issues during this period. Reflecting on the safety culture, I realize that when decisions like this are made, they can have real consequences for real patients. I didn’t receive the safety-net call; how many other patients didn’t either?
     
    My analysis
    The lack of organization with my belongings could be described as a “latent error,” defined by Dr. Reason as “accidents waiting to happen — failures of organization or design that allow the inevitable active errors to cause harm.” In healthcare, there are thousands of latent errors just waiting to happen due to the failure of effective design.
     
The omission of my wound care instructions can be characterized as a “slip,” the type of error that results from a lapse of concentration. Dr. Reason describes this as “…the many activities we perform reflexively, or as if acting on autopilot. In this construct, slips represent failures of schematic behaviors, or lapses in concentration, and occur in the face of competing sensory or emotional distractions, fatigue, or stress.”
     
The decision to pause the post-discharge outreach calls was made upstream from the people doing the work, and it did not factor in manual calls as a backup safety measure. I have found that administrators can make decisions through a single filter, without looking at the whole picture of a decision’s impact. Manual calls for this two-week period would have been a more costly and less efficient approach to post-discharge follow-up, but they would at least have preserved the safety net for patients in need.
     
Lastly, as an informed consumer of healthcare, I failed to advocate for myself and call my doctor when I first realized something was wrong. When I finally did, I sent him the picture of the dressing, and he provided feedback and teaching to the OR team on completing wound care properly in the OR. I wonder about all the patients who simply don’t feel comfortable advocating for themselves in similar situations.
     
    In what Dr. Reason calls the Swiss cheese model, I fell right through the holes in the system. My clothes fell victim to a latent error in design, I didn’t receive discharge instructions due to distraction, and I missed a postoperative outreach call because of an administrative decision.
     
I am sharing my own experience to highlight the gap that still exists in patient safety, even when care teams have the best of intentions and the patient is a knowledgeable healthcare consumer. I am grateful for the clinical teams who cared for me when I couldn’t advocate for myself in the OR and PACU. Yet even after 30 years of working with physicians, nurses, and care teams, I didn’t call when I suspected that something wasn’t right. I accept responsibility for my own lack of action, and I can’t help but think about others: What about patients who haven’t worked in healthcare? Would they have fallen through the “cheese” and experienced system errors that resulted in harm?
     
When we look at patient safety in this way, it is easy to see the domino effect of what seems like a simple decision. Reflecting on Patient Safety Awareness Week, I encourage you to shed light on these issues and identify ways of mitigating risk and ensuring that “near misses” become fewer and farther between.

     
To learn more about the science of safety, see https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1070929/ and http://aerossurance.com/helicopters/james-reasons-12-principles-error-management/.


    Written By

    Barbara Davis, MA

Barbara Davis, MA, has been involved in healthcare quality for over 30 years, most recently with SCL Health System in Denver, Colo. She has worked in complex healthcare environments, including a university health system, an HMO health plan, and a multi-hospital system. She joined CipherHealth in January 2016 as Vice President of Accounts and is now Vice President, Clinical Services. Her areas of expertise include quality improvement and Lean, service excellence focusing on organizational culture and the patient experience, patient safety and reliability, regulatory issues, and organizational strategic goal setting. She is passionate about improving patients’ and families’ experiences in healthcare and believes that safety is the cornerstone of a good patient experience.

She was a Malcolm Baldrige National Quality Award examiner for nine years and founded the regional quality award Rocky Mountain Performance Excellence. She is certified in Lean through the Society of Manufacturing Engineers. She taught at Regis University for nine years, developing the Quality and Performance Improvement in Healthcare class, the Advanced Concepts class (which focuses on improvement methods associated with Lean and Six Sigma), and the four-course Certificate in Quality and Patient Safety. She currently co-chairs the Patient and Family Advisory Council at Saint Joseph Hospital in Denver.

She received her undergraduate degree from the University of Iowa and her graduate degree in healthcare administration from George Washington University.

