Vital Signs: A Podcast for Sentara Providers
This podcast series is designed to meet healthcare professionals where they are, delivering didactic lectures and engaging interviews that improve their knowledge and skills across a wide variety of topics relevant to their patients' healthcare needs. Sentara's Vital Signs – A Podcast for Sentara Providers covers a range of subjects, including diagnostic and treatment methods, the evolving challenges in the provision of healthcare, and expert opinions on ways providers can improve their own wellness and mindfulness as they care for their patients. *Please note: only Sentara-employed physicians and allied healthcare professionals are eligible to claim credit.
Vital Signs: A Podcast for Sentara Providers
Patient Safety Series - Episode 1: Joel Bundy, MD
Sentara is accredited by the Southern States CME Collaborative to provide continuing medical education for physicians.
Sentara designates this enduring material for a maximum of 0.25 AMA PRA Category 1 Credit(s)™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.
Sentara Continuing Medical Education adheres to ACCME Standards for Integrity and Independence in Accredited Continuing Education. Any individuals in a position to control the content of an accredited activity, including faculty, planners, reviewers or others are required to disclose all relevant financial relationships with ineligible entities (commercial interests). All relevant conflicts of interest have been mitigated prior to the commencement of the activity.
Topics for the Vital Signs podcast will include Sentara safety and quality initiatives that focus on improving patient care at Sentara facilities. Our faculty will discuss the most recent developments in healthcare and share best practices from across the system.
To Claim Credit for listening to this episode:
1. Click here and enter 13720 as the Activity ID (number).
2. Then go to the MY CME tab and complete the evaluation.
3. Credit hours will be reflected on your transcript, or you may download your certificate.
For more information, click here.
Music: "Phoenix Rising" (https://www.purple-planet.com)
Reach out to Continuing Medical Education at CME@sentara.com for more information.
Speaker 1: You are listening to Vital Signs, a podcast for Sentara providers. Welcome to episode one of the Patient Safety Series. In today's episode, we are joined by Dr. Joel Bundy, Chief Quality and Safety Officer. Before we turn things over to Dr. Bundy, let's go over some important CME announcements. This episode has been accredited for AMA PRA Category 1 Credits™. For detailed accreditation and designation information, along with disclosure information, please visit the show notes. This information can also be found on our website, www.sentara.com/physicianeducation, and you can always reach us by email at physicianeducation@sentara.com. Now here's Dr. Bundy.
Speaker 2: Many reference the United States Institute of Medicine's seminal work, To Err Is Human, as the beginning of the modern patient safety movement. In that 1999 publication, it was estimated that between 44,000 and 98,000 people died each year from medical errors, equal to the numbers expected if a Boeing 747 crashed every single day. It has been estimated that 3.7% of hospitalized patients experienced adverse events, with negligence causing 27.6% of those events. In the same study, Dr. Lucian Leape of Harvard Medical School found that 58% of adverse events were attributed to errors in management.

Over the subsequent three decades, there continued to be reminders that patients were still not as safe as many of those early visionaries had hoped, and some have described the patient safety movement as having made only meager progress. If one considers a broader definition of preventable errors, the incidence rate for patient deaths from preventable harm is estimated to be between 210,000 and 400,000 per year. This estimate came from a literature review of preventable adverse events that included errors of commission, omission, communication, and diagnosis. Equally concerning, serious harm was estimated to be 10 to 20 times more likely than preventable death. A 2016 Johns Hopkins study calculated a mean rate of death from medical error of 251,454 a year, using the studies reported since that 1999 Institute of Medicine report and then extrapolating to the total number of US hospitals and admissions in 2013. What makes this report noteworthy was the declaration that medical error was the third leading cause of death in the United States. This resonated with the public and media, similar to the Boeing 747 analogy of estimated deaths from the IOM report.

With this limited progress, there has been a search for new and different ways to improve patient safety. Many thought to look outside of healthcare, factoring in mechanisms of cognitive error and solutions from the fields of psychology and human factors research. It was further recommended that medicine specifically study lessons from highly reliable organizations. The concept of high reliability has been present for many years, and high reliability organizations, or HROs, were noted to have effective management of innately risky technologies through organizational control. HROs such as nuclear power plants, flight operations aboard US Navy aircraft carriers, and air traffic control have been extensively studied. HROs are defined as those where accidents would normally be expected to occur due to operational risk and task complexity, but do not occur because of the organization's devotion to zero rates of error, or what we would call in healthcare, zero harm.

To understand how HROs prevented errors, researchers proposed several models to better understand harm events and how they occurred. One such model is the cause-and-effect model, where human error directly leads to injury. This linear model explains some workforce injuries, like slips and falls, but typically doesn't explain many of the other recognizable harm events. Dr. James Reason proposed a Swiss cheese model of defenses by adding complexity to that linear harm event model. In this model, there are active failures by people at the sharp end, at the bedside, coupled with system problems. Reason labeled these system problems latent conditions; examples of latent conditions could be poor design or shortfalls in training.
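[For listeners curious about the arithmetic behind that 251,454 figure, here is a minimal sketch of the extrapolation under stated assumptions. The pooled death rate and the 2013 admissions total below are our illustrative assumptions recalled from the published study, not figures given in this episode; only the final result is quoted above.]

```python
# Minimal sketch of the 2016 extrapolation described above (illustrative inputs).
pooled_death_rate = 0.0071        # ASSUMPTION: pooled deaths from medical error per admission (~0.71%)
us_admissions_2013 = 35_416_020   # ASSUMPTION: total US hospital admissions in 2013

deaths_per_year = pooled_death_rate * us_admissions_2013
print(round(deaths_per_year))     # ~251,454, the annual figure cited in the episode
```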
Another error model was proposed where errors are actually caused by the system. This has been described as the sharp end model, and in this model, system errors operating in a nonlinear, complex system then trigger the Swiss cheese effect. Within this framework of errors, it is imperative that there is an understanding of human performance. The Department of Energy Human Performance Handbook lays out five principles, which reinforce the importance of the system over that of the individual. Number one, people are fallible, and even the best people make mistakes. Number two, error-likely situations are predictable, manageable, and preventable. Number three, individual behavior is influenced by organizational processes and values. Number four, people achieve high levels of performance because of the encouragement and reinforcement received from leaders, their peers, and subordinates. And number five, events can be avoided through an understanding of the reasons mistakes occur and application of the lessons learned from past events. The operative word there is learning. This handbook also states, and I quote: "Error is universal. No one is immune, regardless of age, experience, or educational level. The saying 'to err is human' is indeed a truism. It is human nature to be imprecise, or to err. Consequently, errors will happen. No amount of counseling, training, or motivation can alter a person's fallibility." End of quote.

Since errors are inevitable, it requires building capacity and resiliency into the system, with an underpinning of just culture. When dealing with individuals, mistakes, and harm, thinking needs to shift from who failed to what failed, in the pursuit of system solutions. Within safety science, researchers studied those HROs that had fewer accidents than would be expected given the complexity of the work. Key characteristics from these industries, in areas of both prevention and organizational resilience, were defined, encompassing both the anticipation of errors that should not happen and containment when errors do occur. Prevention of errors requires a focus on removal of unwanted variation in practice. Knowing that humans will make mistakes, building capacity and resilience into the system will mitigate against injury or harm, and that is paramount in reliability systems. It requires a system to bounce back and to learn from these events and, importantly, to spread that learning throughout the rest of the system, as reliability is a dynamic non-event that is very, very hard to measure.

It requires an organizational culture that fosters respectful interactions between the various members of the team. This is called mindful organizing, where the team works in an infrastructure built on trust. There is a shared understanding around mission and around operations, effectively improving the collective action of the organization, even with team turnover. As seen aboard United States Navy aircraft carriers, sustainability occurs with constant operational training and retraining to improve skills and as a means of socialization across a continuously learning organization. According to some researchers, HROs exhibit five principles that make up mindful organizing, and you've heard these before: number one, preoccupation with failure; number two, reluctance to simplify; number three, sensitivity to operations; number four, a commitment to resilience; and number five, deference to expertise.
The first three of these principles focus on prevention of errors, and the last two on organizational resilience. Additional characteristics from research in the safety culture of aviation and healthcare revolve around system barriers that could achieve Six Sigma levels of safety, that is, 3.4 errors per 1 million opportunities. This includes the building of guardrails, effective teamwork, evidence-based practices, optimizing safety strategies throughout the organization, and simplifying burdensome and overly complex rules to improve safety and system performance.

Healthcare organizations have begun to incorporate such high reliability organizing into their safety programs through change initiatives, by building the specific structures of a safety management system within a learning organization. These HRO principles can be incorporated into strategy and daily operations; they mirror known foundational requirements to create system safety, such as leadership engagement and learning systems. HROs have a safety management structure with multiple informational inputs, such as safety culture surveys; incident reporting systems, such as our STARS program; leadership walk rounds; physician and nursing peer review; and serious safety events with root cause and apparent cause analysis. More recently, we've been looking at our grievance and complaint reviews and, of course, outcome measures with external benchmarks against other systems or the Medicare population as a whole.

Execution of these principles can be done through bundled skills for both leaders and staff. These leadership skills are a practical means to accomplish HRO principles. Leadership walk rounds, for example, allow the leader to be aligned with operations at the bedside and to learn from those with expertise who are doing the work. In addition to these leadership bundles, sharp end staff, or those at the bedside, are taught universal skills for error prevention. These universal skills include things like attention on task, communicating clearly, thinking critically, adhering to protocols, and, more important than many of these, speaking up and escalating. Then specific tools are taught for each of the safety habits.

There are a growing number of healthcare organizations that have demonstrated improved safety performance with implementation of high reliability organizing, concentrating on the HRO attributes of respectful interactions and heedful interrelating. One hospital in Brooklyn, New York, created a code of mutual respect for their medical staff and employees; effectively speaking up and handling disrespectful behavior improved post-implementation. The Children's Hospitals' Solutions for Patient Safety leveraged HRO principles and has been able to save 15,589 children from serious harm, with 300 fewer central line associated bloodstream infections and estimated cost savings of $20 million across 110 hospitals in 2019. An additional example comes from the Connecticut Hospital Association, where implementation of high reliability principles led to a 50% reduction in serious safety events. According to the authors, elimination of preventable harm through a high reliability framework requires system readiness to classify and capture safety events and to use champions to change the safety culture across the organization.
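[As a side note on that Six Sigma yardstick, here is a minimal sketch of how defects per million opportunities (DPMO) and the corresponding sigma level are conventionally computed. The function names and the example counts are ours for illustration, not from the episode; the 1.5-sigma shift is the customary Six Sigma convention.]

```python
from scipy.stats import norm  # inverse normal CDF for the sigma-level conversion

def dpmo(defects: int, opportunities: int) -> float:
    """Defects per million opportunities."""
    return defects / opportunities * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Conventional Six Sigma level, including the customary 1.5-sigma shift."""
    return norm.ppf(1 - dpmo_value / 1_000_000) + 1.5

# Hypothetical example: 17 errors observed across 5,000,000 opportunities
d = dpmo(17, 5_000_000)
print(d)               # 3.4 DPMO
print(sigma_level(d))  # ~6.0, i.e., Six Sigma performance
```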
High reliability is being implemented within healthcare today, with beneficial results for patients and organizations alike. By applying these principles throughout an organization, high reliability organizing can be the chassis that leads to enterprise transformation. Sentara Healthcare began our HRO journey in 2002 towards improved patient safety across all divisions in the organization. Through implementation of these high reliability principles, serious safety events decreased by 80% over the next seven years.

In 2009, there was organizational growth through mergers, and the original seven hospitals grew to 12, with 2,641 acute care beds across Virginia and North Carolina. Cultural and organizational barriers, executive succession, and staff turnover all resulted in a lack of sustainment, evidenced by increases in wrong-site events, serious safety events, central line associated bloodstream infections, and falls with injuries. Variability was seen in patient satisfaction, employee engagement, and application of just culture, both with employees and the medical staff. Were we using relevant and practical high reliability tools across all clinical and nonclinical staff and the physicians and providers in the acute care divisions, the medical group, and the health plan? Were these concepts less well understood within areas such as information technology, finance, and facilities? It became evident to Sentara that we needed to refocus on doing what we had done well back in 2002; we had to address the drift.

So now let's talk about safety habits and tools, leadership skills, and some areas where we still need to focus, to dig deeper and to keep our patients and each of us safe. We'll do that next time. The HRO journey never ends.
Speaker 1: Thank you for joining us, and be on the lookout for episode two of the Patient Safety Series with Dr. Joel Bundy. You've been listening to Sentara Healthcare's Vital Signs, a podcast for Sentara providers. As a reminder, review today's show notes for information about claiming your continuing education credits. Well, that's it for now, but we'll be back soon with another episode of Vital Signs, a podcast for Sentara providers, the podcast that provides evidence-based education programs for physicians and healthcare providers on the go.