CIHQ-ARS Article
H.O.P.E. Consulting Partners with University of North Texas Health Science Center – SafeCare Podcast

CIHQ’s May Partner article was provided by Rey Gonzalez of H.O.P.E. Consulting, the company of choice for those in continuous pursuit of sustained, reliable operations. Rey collaborated with the University of North Texas Health Science Center in Fort Worth, TX to record podcasts about human performance and how it relates to patient safety. H.O.P.E. Consulting provides effective solutions that enhance performance and reduce costly errors through assessment, training, observation, and coaching. Rey’s background is in plant operations, quality assurance, and high-risk industry work. In these podcasts, Rey discusses the importance of the personal, personnel, and environmental safety necessary to provide quality care. The discussion of error precursors and the error traps that follow begins at 7:50 (first link). At about minute 14, Rey introduces a technique, ‘Stop the Line’, as part of a two-pronged approach to minimizing errors.
Want to listen to the podcast? Here is the link: Why "Stopping the Line" May Not Work with Rey Gonzalez
UNTHSC – SafeCare Podcast – Why “Stopping the Line” May Not Work
Five Podcast Discussion Points:
- Before we get started, can you please explain a few potential harms one might encounter at a nuclear power plant?
- Background includes auxiliary operator, control room operator, work management, quality assurance, operations training, and human performance improvement as a coordinator or manager of the program.
- Commercial nuclear power simply provides an abundance of electricity using nuclear fuel as the heat source. Water is boiled with this heat to create steam, which turns a turbine and an electrical generator to produce electricity.
- Because nuclear fuel is used, radiological exposure can be a hazard, much like overexposing a patient during X-rays or other uses of nuclear medicine. The fuel used at a power plant is low-grade, with a uranium concentration of 2-3%, versus weapons-grade material at over 90%, so the danger is not a nuclear-weapon type of hazard. The hazards are radiological over-exposure and the spread of radiological material into unwanted areas, which is called contamination.
- Still, because nuclear fuel is used, every nuclear power plant operator (companies and personnel) keeps the protection of the public, personnel, and the environment at the forefront. Using nuclear fuel is a privilege and a huge responsibility that is taken very seriously. Public safety is a core value.
- Most hazards are in line with any other industrial setting: high-heat sources such as steam, electrical shock, and all the other personal-safety hazards that come with industrial work. So, working outside of office settings requires hardhats, safety glasses, safety shoes, hearing protection, and gloves for certain activities. These are occupational hazards.
- Describe “error precursors” and provide examples.
- After listening to your Jan. 12th podcast with Jessica, and the story shared about an error that resulted from personnel sharing a password [RECAP?], I thought it would be great to piggy-back on it and talk about “error precursors,” sometimes called “error traps” or “local factors”. These are conditions or situations in the work environment that increase error-likeliness. Examples among many include time pressure, high workload, a change in plan, and simultaneous or multiple tasks. Do these exist in healthcare? Of course.
- In the case of sharing passwords, what local conditions or situations could have existed at that time? At a minimum, there could have been time pressure and high workload. Time pressure has the same effect whether it is perceived or real. We probably know that we are fallible humans; in fact, studies show that we make 3–7 errors an hour. If this number seems high, it’s because most errors are inconsequential and therefore go unnoticed or unrecognized. By inconsequential I mean easily recoverable and low in risk or hazard, such as forgetting your lunch or identification badge before leaving the house. You notice it when you’re a block away from home, so you turn around and retrieve it. No big deal.
- Two things that must be understood:
- Error precursors/traps can increase error rates 2 – 50x (and we are already error-prone)
- The good news is that all error precursors are manageable. Let’s talk about how time pressure is managed. Whose responsibility is it to manage error precursors? Whose responsibility is it to minimize errors and how is that accomplished?
- In nuclear power, we were taught various human performance tools and techniques like time-outs, huddles, verification techniques (such as second checks), and many more (they just had different names). Under a strong safety culture, it’s incumbent on the individual performer to learn all they can about the proper application of these tools and techniques, especially at critical junctions in their daily work. We call these “critical steps”. A critical step is defined as a step that can result in immediate, intolerable, and irreversible harm if it or its preceding actions are performed incorrectly.
- This is a good time to share that there’s an ongoing conference that was birthed in the nuclear industry but today is attended by many high-risk industries, such as airlines, other transportation, manufacturing, oil & gas, and yes, the medical industry. The organizer is “The Community of Human and Organizational Learning”. This year it will be in Colorado Springs, June 14–16, and information can be found at: https://www.cholearning.org/ I would encourage your listeners to check them out. The authors of a new book titled “Critical Steps”, along with myself, will be there providing several learning presentations on the subject.
- Patient Safety Officers advise to “stop the line” when unsafe conditions arise that may cause harm. Explain why this known strategy to prevent harm may not work.
- It boils down to our human nature.
- As humans we tend to proceed in the face of uncertainty for many reasons.
- First, we humans come to work to do a good job. (Can we agree to this?)
- We want to serve others with quality care, and we also want to get the “job done”. Because of this, it’s not inherent in our human nature to STOP. We take pride in our work, which leads us into other conditions that drive us to proceed in the face of uncertainty.
- When it comes to uncertainty, it originates from various sources:
- The first is a HUGE question: “How do I know to STOP when I’m uncertain if I don’t know I’m uncertain?” This can be learned if you are trained to identify the visible cues that exist prior to an undesirable event. What we call Trigger Training sensitizes us to these visible cues; that would be a discussion for a later time.
- Something else that promotes uncertainty is a lack of confidence, which can stem from a lack of experience (being new in your profession), a lack of training (within your area of expertise), or merely a more introverted personality. There is nothing wrong with that, and many introverts won’t hesitate to speak up when they see something wrong or risky, but some will not.
- Another reason people proceed in the face of uncertainty is that they don’t want to stand out as the guy or gal who is slowing things down, possibly be seen as “bucking the system”, or be labeled a “trouble-maker”.
- There are several other influences that drive us to proceed in the face of uncertainty. I’ll list them and then we can discuss just a few. These can be, but are not limited to:
- The Halo Effect
- Peer Pressure
- Schedule or Time Pressure
- Sign/Label Blindness
- Habituation
- Pollyanna Principle
- Rationalization
- Risk Perception
All of these can drive “organizational silence”. This is defined as “the natural reluctance people have to bring unwelcome news to, or to contradict, their actual or perceived superiors.”
Can these drivers that we discussed result in organizational silence?
“Organizational Deafness” – a failure to listen. 2004 elevator hydraulic fluid story
- So again, it boils down to our human nature.
- When an error occurs in nuclear power, how is it managed?
- It’s all about learning and sharing. The nuclear industry is committed to learning all it can from undesirable events.
- It begins with reporting, which is critical because you don’t know what you don’t know; if errors and near-miss events are not reported, then we cannot learn. A tremendous amount of psychological safety must be fostered in any high-reliability organization. This moves HROs to build a “Just Culture” into their management style so that the industry can learn.
- It took years for the commercial nuclear industry to learn this, and I have a quick story about it. While working as a Human Performance coordinator, I was pleased to assist management with a new direction they wanted to head in… “reporting all errors”. This initiative was touted as “no blame”: “tell us about your error and we will support you and fix any resulting issues due to the error”. All power plants go through a heavy period of physical work activities called an outage, when the power plant is shut down to conduct needed maintenance and testing so that it can return to service for another year (or more) of operation.
- One day during the outage, carpenters (vendors or subcontractors) were building a wooden bridge over some large pipes that encircled a large water tank. They were cutting boards to size using a circular power saw; however, instead of placing the boards on a sawhorse to cut them, they used the stainless-steel pipes they were building over. At some point they nicked a pipe, which was noticeable when they saw some sparks. Well, they did the right thing and stopped, then notified their foreman, who notified the company’s management. The carpenters had heard about the “no blame” policy, so they didn’t hesitate to come forward. However, the result was the same typical “knee-jerk” reaction as in the past, and they were fired. And as a result, where do you think future reporting went?
- What was needed was a “Just Culture” approach and a commitment to learning versus blame and punishment. I subsequently used this situation in management’s first Human Performance training class for leaders while describing what is called the “Blame Cycle”. After I shared that story with the class, the Plant Manager stood up, and I thought my career was over. He said, “What Rey just shared was me; I was the one who made the decision to fire those guys. I didn’t realize the impact of that decision, and thanks to Rey, I will never make that mistake again.” Whew!
- What healthcare factors prevent adopting nuclear power plants’ safety standards?
- There shouldn’t be any reason not to adopt similar safety standards; however, they must be:
- Customized to fit the highly complex work in healthcare. This is where a collaboration of social scientists and subject matter experts in healthcare can produce positive results.
- Applied non-superficially. Past efforts to adapt error-minimization techniques from the airline industry, such as checklists, have led to some improvement, yet the focus remains on individual performer behaviors versus a comprehensive evaluation of the systems (protocols and procedures), culture, and social dynamics that influence how we conduct work.
Some current thinking in the medical industry holds that it is too unique and specialized to use any processes, protocols, or philosophies that come from other industries. Yet all high-risk organizations are run and operated by humans, and because we are all fallible, other industries have learned through safety and social science that fallibility can be managed to minimize fatalities (or, in medical terms, sentinel events) and other undesirable events. As an example, the nuclear industry does not hesitate to reach out to other industries for assistance because of the realization that you simply can’t know everything, and because of its tremendous responsibility for public safety.
- I just read a fantastic book called “Still Not Safe” that addresses this question very well. I’ll summarize some of the authors’ thoughts and my own in this way:
- The medical industry is still steeped in old ways of thinking that come from traditionalists who were taught in ways that no longer apply to the new technologies, procedures, and evolving diseases, viruses, and other afflictions that exist today. This lack of adaptation will hopefully be overcome by the newer generations coming into the industry; however, their training needs to be updated.
- “First, do no harm” (i.e., public safety) – The nuclear industry takes a two-pronged approach that has its roots in what Erik Hollnagel describes as “Safety-I and Safety-II”. When the patient safety movement was birthed, it was easiest to focus on individual behaviors (and in many cases blame) versus a more holistic view of the organizational drivers that can shape behaviors (such as how systems and processes are set up) … a systems view, or Systems Thinking. That shift to systems thinking is still lacking in the medical industry; it is a failure to take a holistic approach to the complexities of managing and working in healthcare.