Group Management For Just Culture Assignment Paper
“Company culture is a key component of employee happiness and engagement, and it is driven by top company leaders, whose impact operates at the organizational level,” says Brandon Smith, the author of The Happiness at Work Formula.
Company leaders boost company culture and decrease employee burnout through modeling and promoting the value of work-life balance.
People make errors. Errors can cause accidents. In healthcare, errors and accidents result in morbidity and adverse outcomes and sometimes in mortality.
But what if you saw one of your coworkers make a mistake? No matter what occurred, you might not have told your supervisor because you were afraid of getting your co-worker in trouble or even getting reprimanded yourself. Cumbersome paperwork might have been required. Or you might have felt it would be a waste of time to say anything because you didn’t think anything would change to correct the problem.
Fast forward to modern-day health care. While patient safety is always the top priority, some of the errors that do occur may go unreported — often for many of the same reasons, experts say.
But as concepts in patient safety have evolved in the last decade, health care is moving toward modern safety models that focus on learning from errors to improve safety.
Acknowledging that even experienced professionals make mistakes can lead to an open and safe reporting system where everyone can speak up without fear of reprisal. This can lead to shared learning from errors and an eventual culture shift that prevents errors from occurring again. This is “just culture.”
A term coined by safety experts, just culture is a philosophy and concept that is increasingly being adopted by health care institutions, including radiology departments.
“A fair and just culture improves patient safety by empowering employees to proactively monitor the workplace and participate in safety efforts in the work environment,” said David B. Larson, MD, MBA, associate professor of pediatric radiology and vice chair of education and clinical operations, Stanford University School of Medicine, Stanford, CA, who has authored numerous studies on patient safety in radiology (see Web Extras).
Anyone can make a mistake, but in industries such as aviation and health care, one small mistake can lead to dire consequences and loss of life. In the past, a person who made a mistake or reported an error was punished. That is now changing, thanks to the “Just Culture” concept being pioneered by David Marx.
“It’s like that whack-a-mole game—individuals are punished for errors,” said Marx, CEO of Outcome Engenuity, a company that teaches industries how to investigate events and produce better outcomes in a nonpunitive manner. “We need a better way to deal with human fallibility rather than whacking people into submission.”
Under the Just Culture concept, safety is promoted by facilitating open communication within an organization in addition to a system of accountability for safe behavioral choices among staff. Just Culture is about holding human beings accountable, but it is also about digging deeper and looking at the system to determine why a particular error occurred.
“The core message for pharmacists is that you are a fallible human being and you’re going to make mistakes—it’s inescapable,” Marx said in an interview with Pharmacy Today. “We can influence the system design to change the rate of errors, but perfection just isn’t in the cards.”
Marie Link, PharmD, System Medication Safety Officer for University Hospitals in Cleveland, evaluates medication use across all clinical disciplines to improve standardization, safety, and the quality of patient care. Just Culture plays an integral role in the success of these objectives.
“The Just Culture concepts have changed our environment. We have evolved to a mindset where the error is treated as a system error rather than an error caused by an individual,” Link explained. “Just Culture allows staff, from the bottom to the top, to communicate and discuss sensitive issues more freely; this is [critical] to improving patient care.”
Just Culture at University Hospitals health system was rolled out in 2009 to support efforts to improve transparency and increase medication event reporting by staff. “It is well established that voluntary reporting only captures the tip of the iceberg,” said Link. “How can an organization uncover and learn from the additional information that lies below the surface?”
Safety Culture in the Pharmacy
I have never met a pharmacist who intentionally set out to make an error. Most pharmacists are detail-oriented individuals who take their roles seriously.
After all, pharmacists are the umpires of the health care game. They enter, verify, and triple check prescriptions, orders, and final products until they are satisfied.
Pharmacists make sure that the correct medication is going to the correct patient. I signed up for this when I applied to pharmacy school in 1993.
At the time, I didn’t know what I was signing up for except a nice salary. I had no idea about the culture of safety in many medical jobs, or that a career in pharmacy required perfectionism.
Fate would have it that I married a man in safety, as well. While he reduces on-the-job accidents along with the Occupational Safety and Health Administration (OSHA) and other safety organizations, I work in a hospital where helping patients become well is the goal.
Nevertheless, the Journal of Patient Safety estimates that more than 400,000 people die each year due to harm in the hospital, which would make it the third leading cause of death in the United States. If this were any other industry, the organization would be shut down until the cause of harm was fixed, but hospitals simultaneously save lives, and so they stay open.
Hospitals have cultures that blame people rather than processes. Blaming people reduces error reporting, which shuts down improvement in processes.
Health care needs to view all errors as opportunities to improve systems and processes to catch mistakes caused by human error. Keep in mind that humans build processes, as well.
But will blaming people instead of processes ever change?
I asked a pharmacist once why he didn’t report errors. He told me that he only reports the errors that matter.
Don’t they all matter, though? Picking and choosing which errors to report is looking through a punitive lens rather than a process lens.
I try to make it my practice to report all errors, even my own, because it is the only way to shed light on things that need to be adjusted in the system. If missed duplications occur regularly and a trend develops, system analysts can figure out how to adjust the alerts so they catch the problem.
Changing how pharmacists check for errors could help, but if we don’t report, then they don’t know. In the meantime, we shouldn’t pick and choose what we report.
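Reporting everything is what makes trends visible in the first place. As a toy illustration (not any real reporting system; the report categories and the threshold below are invented for this sketch), a handful of filed reports is enough to surface a repeat problem — but only if every report is actually filed:

```python
from collections import Counter

# Hypothetical error reports; categories and threshold are illustrative only.
reports = [
    "duplicate therapy alert missed",
    "wrong-patient selection",
    "duplicate therapy alert missed",
    "duplicate therapy alert missed",
    "look-alike label mix-up",
]

TREND_THRESHOLD = 3  # arbitrary cutoff for this sketch

counts = Counter(reports)
trends = [category for category, n in counts.items() if n >= TREND_THRESHOLD]
print(trends)  # -> ['duplicate therapy alert missed']
```

If pharmacists filter out the reports they judge unimportant, the counts never reach the threshold and the trend stays invisible to the analysts.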
In the automotive industry, safety falls under human resources. Many times, an employee safety group is developed to look at the issues affecting the company.
Hospitals should employ the same type of safety group that not only encompasses risk management, information technology, and nursing, but also includes actual clinicians who work with the systems and interact with patients and their orders.
There should be multiple pathways for employees to bring suggestions and concerns to the group, so it can examine the system and improve it, rather than simply logging errors with no follow-up and no breakdown of the processes that led to a particular mishap.
We have processes and rules in place to make hospitals safer, but the culture can be so tainted that no one follows the protocols that are in place. It is true that when you start looking at safety through the lens of culture, you see how challenging it is to change.
Safety culture starts at the highest level of an organization and trickles down. If management does not have safety as a priority, then I guarantee you that no one else will.
One of the most damaging messages a pharmacist can receive comes from leadership mishandling a medical error. If our leaders do not take the time to investigate the systems involved in the error and how the error happened, and instead rush to punitive action toward the clinician, then staff members will become more jaded and less involved.
Medical errors are almost always the result of systemic flaws rather than a person’s incompetence. Rushing to judgment rarely improves safety culture in a hospital and only deepens that disengagement.
Here’s what a safety culture in the pharmacy would look like:
Order entry and/or verification would not be in an area where distractions are abundant. There would be a telephone, but mainly for outbound calls. Order entry/verification would be in a quieter environment separate from where phones are ringing. Why host tasks that require perfection in an area that isn’t conducive to patient safety? If the room isn’t separate, then there will be constant interruptions. Every interruption while pharmacists are in the middle of doing their job is a recipe for disaster, just as it is for a nurse on the floor.
There would be continuity of care with work assignments. If pharmacists or nurses are changing hospitals every day, then they never really learn their patients. Processes could also vary from one hospital to another, which can lead to confusion for the clinician. If a pharmacist regularly works in the same environment, then he or she is able to see what processes need to change to ensure patient safety. Relationships between nursing and physicians would improve due to continuity of care.
Nurses and pharmacists would report every single error, no matter how small. Only situations involving blatant disregard of policy or unsafe acts would be punished. A near miss would be met with praise for catching the error before the wrong care reached the patient. The system should be designed to catch errors at multiple levels, rather than relying on one step of the process.
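The value of catching errors at multiple levels can be made concrete with a little arithmetic. The sketch below assumes, purely for illustration, that each independent check misses an error 10% of the time; the per-layer miss rate and the layers named in the comment are our assumptions, not measured figures:

```python
# Illustrative only: assume each independent check layer misses an error
# with probability p_miss. An error reaches the patient only if every
# layer misses it, so the probabilities multiply.
p_miss = 0.10   # assumed per-layer miss rate (hypothetical)
layers = 3      # e.g., order entry check, pharmacist verification, nurse check

p_reaches_patient = p_miss ** layers
print(f"{p_reaches_patient:.4f}")  # 0.0010 -> far lower than any single layer's 0.10
```

The multiplication only works if the layers fail independently, which is exactly why relying on one step of the process, or letting every layer adopt the same shortcut, defeats the design.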
A safety focus group would be set up where issues and processes are analyzed on a routine basis and changes are evaluated based on these analyses. This focus group in the pharmacy could report to a larger group in the hospital with each department represented if a particular issue affects other departments.
If a pharmacist sees a particular error trend with a medication or identifies areas where patients could be at risk, “Just Culture promotes a positive environment to encourage the pharmacist to raise their concerns,” said Link.
With Just Culture, University Hospitals has seen an increase in the number of medication events reported and has enhanced its root cause analysis (RCA) process. “Part of the success of gathering information is in creating ways for all staff to be involved in improving patient care,” added Link.
When a medication error or safety event occurs, “it is important to discuss it,” Link told Pharmacy Today. “An RCA is scheduled with a multidisciplinary group to walk through the event, determine how it happened, and identify the changes needed to prevent it from happening again.”
The primary ethical obligation of a pharmacist is to avoid harm by filling each prescription correctly. For this reason, pharmacies, pharmacy organizations, and boards of pharmacy have adopted and espoused the principles of continuous quality improvement.
What are the organization’s primary and secondary values? An organization operating within a Just Culture has defined its primary (high) and secondary values to ensure that workers know how to prioritize their work. Safety should always be a primary value. Values such as efficiency and productivity should be considered secondary values. Overzealous commitment to these and other secondary values can threaten safety and confuse workers, particularly if they are not provided with direction regarding which value takes precedence. In general, it should be clear to workers that safety should not be sacrificed to achieve secondary goals such as productivity. Yet, 26% of respondents from hospitals that participated in the AHRQ culture survey said that, whenever pressure builds up, managers want staff to work faster, even if it means taking shortcuts. Fifty percent said they work in crisis mode, trying to do too much too quickly, and 36% reported that safety is sacrificed to get more work done.
Do managers’ behaviors demonstrate that safety is a primary (high) value? The best way to influence the day-to-day decisions that staff make—which, in turn, affects patient safety—is through employee observations of leaders’ and managers’ practices and behaviors. Open discussion of safety as a high value, and seeing leaders and managers behave in a manner that demonstrates that safety comes first, encourages and supports staff decisions to do the same. But, about a quarter of respondents to the AHRQ survey reported that managers overlook repetitive safety problems and do not act in a way that demonstrates to staff that safety is a top priority. Behaviors that send mixed messages (e.g., safety vs. productivity) create confusion and promote unsafe behavioral choices.
Is safety a value or a priority? Many healthcare organizations have made patient safety a priority that deserves their utmost attention right now. But priorities can easily shift, and once again, patient safety could take a back seat to other dimensions of quality, leaving tragic patient injuries in its wake. Patient safety should be a sustained primary value associated with every healthcare priority, not a priority that can be reordered based on competing demands.
Justice and Safety
How does the organization respond to human error, at-risk behavior, and reckless behavior? Three types of behavior should be anticipated in an organization: human error, at-risk behavior, and reckless behavior. Each type of behavior has a different cause, so a different response is required.
Human error involves unintentional and unpredictable behavior that causes or could cause an undesirable outcome; it is not a behavioral choice—we don’t choose to make errors. Since most human errors arise from weaknesses in the system, they are managed within a Just Culture through system redesigns that reduce the risk of errors. Discipline is not warranted or productive because the worker did not intend the action or any undesirable outcome that resulted. In a Just Culture, the only just option is to console the worker who made the error and to redesign systems to prevent further errors. Unfortunately, the AHRQ survey results uncover a different reality in many hospitals. Half of respondents feel like mistakes are held against them; 65% worry that their mistakes are kept in their personnel file; and 54% feel like the person is being written up, not the problem, when events are reported.
At-risk behaviors are different than human errors. Behavioral research shows that we are programmed to drift into unsafe habits, to lose perception of the risk attached to everyday behaviors, or mistakenly believe the risk to be justified. Our decisions about what is important are typically based on the immediate desired outcomes, not delayed and uncertain consequences. Over time, as perceptions of risk fade away and workers try to do more with less, they take shortcuts, violate policies, and drift away from behaviors they once knew were safer. These at-risk behaviors, often the norm among groups, are considered to be “the way we do things around here.” In a Just Culture, the solution is not to punish those who engage in at-risk behaviors, but to uncover and remedy the system-based reasons for their behavior and decrease staff tolerance for taking these risks through coaching.
In comparison to at-risk behaviors, workers who behave recklessly always perceive the risk they are taking and understand that it is substantial. They behave intentionally and are unable to justify the behavior (i.e., do not mistakenly believe the risk is justified). They know others are not engaging in the behavior (i.e., it is not the norm). The behavior represents a conscious choice to disregard what they know to be a substantial and unjustifiable risk. In a Just Culture, reckless behavior is blameworthy behavior. As such, it should be managed through remedial or disciplinary actions according to the organization’s human resources policies.
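The three behavior types and their distinct responses amount to a simple decision rule. The sketch below is our own illustrative summary of the distinction drawn above (the names and structure are invented for this sketch, not part of any Just Culture toolkit):

```python
from enum import Enum

class Behavior(Enum):
    HUMAN_ERROR = "human error"     # unintentional slip; not a behavioral choice
    AT_RISK = "at-risk behavior"    # risk not perceived, or mistakenly believed justified
    RECKLESS = "reckless behavior"  # substantial, unjustifiable risk consciously disregarded

def just_culture_response(behavior: Behavior) -> str:
    """Map each behavior type to the response described in the text."""
    if behavior is Behavior.HUMAN_ERROR:
        return "console the individual; redesign the system to reduce error risk"
    if behavior is Behavior.AT_RISK:
        return "coach the individual; remedy the system-based causes of the drift"
    return "apply remedial or disciplinary action per the organization's HR policies"

print(just_culture_response(Behavior.AT_RISK))
```

The point of the rule is that the response is selected by the nature of the behavioral choice, never by how bad the outcome happened to be.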
Are individual accountabilities documented in job descriptions, performance evaluations, and/or policies, and communicated to staff? Organizations that operate within a Just Culture have defined and communicated individual accountabilities so all staff understand what is expected of them. In a Just Culture, staff at all levels are held accountable to perform at the highest level of personal reliability while conscious of human limitations. They are accountable for making safe behavioral choices and decisions that promote safety. They are responsible for identifying patient safety and other organizational risks, including system vulnerabilities, human errors, at-risk behaviors, and reckless behaviors. They must work with others to identify and manage everyday risks and coach individuals who are engaging in at-risk behaviors.
Managers and administrators have additional responsibilities to continually assess the behavioral choices of staff, monitor systems and processes, design and redesign systems to improve safety, investigate the causes of risk and errors, and to respond fairly and consistently to staff who make human errors or engage in at-risk or reckless behavior. In a Just Culture, all workers know that safety is a primary value in the organization, and they continually look for risks that pose a threat. They are thoughtful about their behavioral choices and always thinking about the most reliable ways to get the job done right.
Does the potential or actual severity of an outcome play a role in how staff are treated when evaluating risk and errors? An organization operating within a Just Culture does not employ an outcome-based model of accountability, meaning there is no severity bias—the potential or actual severity of the outcome plays no role in determining how staff are treated. Instead, staff are judged on the quality of their behavioral choices, not the outcome or potential outcome of a hazard or mishap. When patients are harmed, this is a difficult but worthwhile stance, as an outcome-based accountability model often results in a “no harm, no foul” approach to staff, with missed opportunities to console employees for human error, coach individuals regarding at-risk behaviors, or to redesign systems to prevent human errors from reaching patients. If an error happens, employees should know that they will be treated fairly when they report their mistakes, and that they will be accountable for the quality of their choices, and not simply the outcome.
Management of At-Risk Behaviors
Is the culture tolerant of at-risk behaviors? Human behavior runs counter to safety because the rewards for risk taking are often immediate and positive (e.g., saved time), while the punishment (e.g., patient harm) is often delayed and remote. As a result, even the most educated and careful healthcare professional will learn to master dangerous shortcuts, particularly when faced with an unanticipated system problem (e.g., technology glitches, time urgency). Staff will drift from safe and controlled processes, as first learned, to unsafe and automatic processes. Over time, the risk associated with these processes fades and the entire culture becomes tolerant to these risks.
For example, if you’re an experienced pharmacist, you might not think twice about answering the phone and managing special requests at the pharmacy window while entering complex medication orders. You may no longer check the patient’s full drug profile, allergies, and weight before entering medication orders. You may now rush past drug interaction messages with barely a notice, and fill medication orders using the label, not the order. If you’re an experienced nurse, you may believe it’s acceptable to maintain unauthorized stashes of medications on your unit, prepare IV admixtures instead of waiting for pharmacy to dispense them, and administer medications to patients before pharmacy has reviewed the order. You may borrow another patient’s medications for quick administration to your patient and leave medications at the bedside. You may no longer bring the patient’s medication administration record to the bedside if you are just administering a prn medication. Successful outcomes foster continuance and tolerance to the risks, particularly when others ‘look the other way’ or begin imitating the at-risk behavior.
Does the organization tend to punish safe behavior and/or reward at-risk behavior? When organizational tolerance to risk is high, safe behavioral choices may actually invoke criticism, and at-risk behaviors may invoke rewards. For example, a nurse who takes longer to administer medications may be criticized, even if the additional time is attributed to safe practice habits and patient education. But a nurse who is able to handle four new admissions in the course of a shift may be admired, and others may follow her example, even if dangerous shortcuts must have been taken to accomplish the work. A pharmacist who dispenses a “missing” medication quickly is more likely to receive positive reinforcement from the awaiting nurse than a pharmacist who fully investigates the reason for the request, thus delaying receipt of the missing medication. The pharmacist who typically ignores those “nuisance” alerts and is able to enter a large volume of orders without a backlog may receive a better performance evaluation than a pharmacist who takes longer because he evaluates the significance of all alerts. In fact, shortcuts like these and many others could even be labeled as efficient behavior.
One key to the successful implementation of safety regulation is to attain a “just culture” reporting environment within aviation organisations, regulators and investigation authorities. This effective reporting culture depends on how those organisations handle blame and punishment.
What is at stake
Just culture does not mean complete protection of front-line operators in the event of aviation incidents and accidents. Nobody can be above the law, and interpreting acceptable or unacceptable behaviour or actions remains the responsibility of the national judiciary.
Indeed, the administration of justice, in particular in the domain of criminal law, constitutes one of the pillars of a State’s sovereign functions. Both at International Civil Aviation Organization (ICAO) level (see ICAO Annex 13, Aircraft Accident and Incident Investigation) and in Europe, effective and focused rules and regulations on the protection of safety data already exist or are under discussion, but there are limits to what can be addressed by safety legislation.
In this context, a just culture signifies the growing recognition of the need to establish communication and training initiatives and advance arrangements amongst those in the aviation safety sector, regulators, law enforcement and the judiciary in order to avoid unnecessary interference and to engender mutual trust in and an understanding of the relevance of their respective activities and responsibilities.
A just culture requires an understanding and appreciation of the various processes and commitments by both safety specialists and the judiciary. A just culture is based on the assumption that controllers and pilots can make mistakes and that only a member of the judiciary can decide what is an “honest mistake” and what constitutes intentional or reckless behaviour.
The concept of a just culture represents the fundamental recognition that both aviation safety and the administration of justice would benefit from a carefully established equilibrium, moving away from fears of criminalisation. That is easier said than done, of course, but the time has come to seriously question the added value of endless and generally unsuccessful efforts at international level to “protect” controllers and pilots against legal action by creating standards, regulations and laws which are supposed to shield them from intervention by the justice system.
Only a very small proportion of human actions that are unsafe are deliberate (e.g. criminal activity, substance abuse, use of controlled substances, reckless noncompliance, sabotage, etc.) and as such deserve sanctions of appropriate severity. A blanket amnesty on all unsafe acts would lack credibility in the eyes of employees and could be seen to oppose natural justice. A “no-blame” culture per se is therefore neither feasible nor desirable.
What is needed is a “just culture”, an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information – but in which they are also clear about where the line must be drawn between acceptable and unacceptable behaviour.
There is a need to learn from accidents and incidents through safety investigation, so that appropriate action can be taken to prevent the repetition of such events. In addition, it is important that even apparently minor occurrences are investigated, to prevent them from becoming catalysts for major accidents. Safety analysis and investigation is a necessary and effective means of improving safety, by learning the appropriate lessons from safety occurrences and adopting preventative actions. It is therefore important that an environment exists in which occurrences are reported and the necessary processes are in place for investigation and for the development of preventative actions such as re-training and improved supervision.
How do healthcare providers establish a culture that encourages open reporting of adverse events and risky situations, yet hold people and organizations accountable in a just manner?
Just Culture reshapes our understanding of accountability, the role of the system, and the role of human behavior. It provides a framework to support consistent management of operational systems and behaviors.
CPS Supports Culture Improvement
The Center for Patient Safety (CPS) recognizes that a “just” culture is an important and necessary element of every healthcare organization. An accountable culture supports open communication of errors in a non-punitive environment (a “just” culture) and leads to greater improvement in patient safety.
CPS supports a model of Just/Accountable Culture. We provide education and coaching for its implementation throughout the continuum of care.
CPS Assists with Implementation and Sustainability
Many organizations are eager to adopt a just culture program, but then feel overwhelmed. CPS can guide successful and efficient implementation that is structured to work in your unique environment.
Our implementation and sustainability plan is designed to fit any size organization, from a small EMS agency to a large health system. As part of that plan, CPS can:
provide and analyze patient safety culture survey assessments to determine strengths and opportunities, and measure improvement over time.
provide onsite identification and consultation support to determine your unique needs.
help develop a realistic timeline for deployment of Just/Accountable Culture at your organization.
provide ongoing virtual and onsite implementation and sustainability support for six months, one year, or more.
Every organization is different, but consultation and training costs are affordable and designed to fit your needs. We want you to be successful today, tomorrow, and in the future!
CPS Administers Just/Accountable Culture Training
Are you interested in training only, or perhaps a refresher course? We can design training to support your internal efforts.
Introduction to Just/Accountable Culture: Would you and/or your staff like an overview of Just Culture? This is delivered in a concise and informative webinar format.
Just/Accountable Culture Manager Training: This day-long class provides the hands-on training, tools, and resources necessary to implement Just Culture within an organization. One or more individuals from an organization can become the Champions for the program. Minimum class size is 20 people and can be taught to an entire organization or offered to several individuals from different organizations within a similar geographic location. This class is scheduled based on demand.
Just/Accountable Culture ReEducate: Has a previous implementation plateaued or do you need support invigorating a previous implementation? We can develop a customized approach based on your needs. This starts with a phone consultation and the development of a plan specific to your organization.
At-risk behaviors represent the greatest risk to patients given that reckless behaviors are rare and human errors usually present as single isolated failures. The faded perception of risk, the habitual nature of the behaviors, and upside-down rewards that discourage safe behaviors and encourage unsafe at-risk behaviors make it difficult to change the behaviors. In a Just Culture, the solution is not to punish those who engage in at-risk behaviors, but to identify and report these behaviors, determine the scope of the behavior, uncover and remedy any upside-down rewards and the system-based causes for the behaviors, and decrease staff tolerance to risk-taking.
A just culture recognizes that individual practitioners should not be held accountable for system failings over which they have no control. A just culture also recognizes that many individual or active errors represent predictable interactions between human operators and the system in which they work. However, in contrast to a culture that touts no blame as its governing principle, a just culture does not tolerate conscious disregard of clear risks to patients or gross misconduct, such as falsifying a record, performing professional duties while intoxicated, etc.
Dr. Lucian Leape, a member of the Quality of Health Care in America Committee at the Institute of Medicine and adjunct professor at the Harvard School of Public Health, stated that the single greatest impediment to error prevention in the medical industry is “that we punish people for making mistakes.” Leape (2009) indicated that in most hospitals, at least six major changes are required to begin the journey to a culture of safety:
We need to move from looking at errors as individual failures to realizing they are caused by system failures
We must move from a punitive environment to a just culture
We must move from secrecy to transparency
Care must change from being provider-centered (doctor-centered) to being patient-centered
We must move our models of care from reliance on independent, individual performance excellence to interdependent, collaborative, interprofessional teamwork
Accountability must be universal and reciprocal, not top-down
People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. But if we find out who made the errors and punish them, are we solving the problems? No. The problem is seldom the fault of an individual; it is the fault of the system. Changing the people without changing the system will perpetuate the problems.
How can we change systems to encourage individuals to report errors and learn from their mistakes? A just culture seeks to create an environment that encourages individuals to report mistakes so that the precursors to error can be better understood in order to fix the system issues. Individual practitioners should not be held accountable for system failings over which they have no control. In a just culture, individuals are continually learning, designing safe systems, and managing behavioral choices. Events are not things to be fixed, but opportunities to improve understanding of the system.
How do you get started with a just culture initiative and ensure that all staff members feel free to report errors? There needs to be an administration that supports the concepts of a just culture and encourages staff to report errors. Highly reliable industries foster mindfulness in their workers. Mindfulness is defined by Weick and Sutcliffe (2001) as being composed of five components:
A constant concern about the possibility of failure even in the most successful endeavors
Deference to expertise regardless of rank or status
An ability to adapt when the unexpected occurs (commitment to resilience)
An ability to concentrate on a specific task while having a sense of the bigger picture (sensitivity to operations)
An ability to alter and flatten hierarchy as best fits the situation
Health organizations are now writing and promoting just culture policies and documents. The Joint Commission leadership standards (Schyve, 2009) address leadership and safety specifically relating to the organization's governing body (the CEO, senior management, and medical and clinical staff leaders). The Joint Commission (formerly JCAHO) suggests instituting an organization-wide policy of transparency that sheds light on all adverse events and patient safety issues within the organization, thereby creating an environment where it is safe for everyone to talk about real and potential organizational vulnerabilities and to support each other in an effort to report vulnerabilities and failures without fear of reprisal.
Is there visible evidence of coaching around at-risk behaviors? Within a Just Culture, at-risk behaviors are reduced by removing the barriers to safe behavioral choices, removing the rewards for at-risk behaviors, and coaching staff to reduce their tolerance to risk and encourage a decision-making process that results in the desired safe behavioral choices. Coaching involves helping another see risk that was not seen or misread as being insignificant or justifiable. It entails a productive discussion between individuals about the risks vs. rewards of certain behaviors and the decision-making process for behaviors under the control of the worker. Unlike “counseling,” which is typically a boss-to-employee discussion that entails putting the employee on notice regarding potential disciplinary action, coaching involves manager-to-staff, peer-to-peer, and staff-to-manager coaching. Staff willingness to coach peers and managers and to be coached by others can be a strong indicator of a Just Culture. Yet, the AHRQ culture survey results suggest that only about half of respondents feel free to question the decisions or actions of those with more authority, and 37% reported that they are afraid to speak up when something doesn’t look right.
Bodies representing the pharmacy profession have made several recommendations in their responses to NHS Improvement’s consultation on patient safety.
The Royal Pharmaceutical Society noted several barriers to achieving a “just culture” in the NHS, including policies that punish human error and a fear of authority
Pharmacy organisations have warned of barriers to developing a “just culture” within the NHS in their responses to NHS Improvement’s proposals for a new patient safety strategy.
The Community Pharmacy Patient Safety Group (CPPSG) and the Pharmaceutical Services Negotiating Committee (PSNC) also expressed concerns that NHS Improvement’s proposals were focused too much on secondary care and should also consider other healthcare settings, including community pharmacy.
In response to an NHS Improvement consultation, which closed on 15 February 2019, the Royal Pharmaceutical Society (RPS) said the development of a “just culture” in the NHS was inhibited by several factors, including a lack of awareness of the need to distinguish human error from at-risk or reckless behaviour; historical and culture barriers; policies based on punishment of human error; and fear of authority such as the regulator or employer.
The CPPSG and PSNC also said that an understanding of what a “just culture” meant in practice was not clear to all healthcare professionals, and that previous experiences of unjust treatment could affect an individual’s approach to open reporting and sharing behaviour in future.
The CPPSG said it was vital that all undergraduate programmes for healthcare professionals embed patient safety throughout learning, rather than teaching it as a standalone or bolt-on module. The PSNC recommended that organisations ensure that learning is built into the training of all relevant staff and that the public be provided with information to drive better understanding of the actions that are taken when incidents occur.
It also said that feedback following the reporting of incidents was essential to drive improvement and motivate staff to keep up reporting activity, and that financial incentives delivered through NHS contracts could support ongoing engagement and continuous improvement in practice.
The RPS said that success measures and progress indicators for safety would be valuable so that teams and organisations could monitor whether they are on the right track.
The RPS, the PSNC and the CPPSG welcomed NHS Improvement proposals to develop a Patient Safety Incident Management System to build on the National Reporting and Learning System and the Strategic Executive Information System. However, the RPS said that reports from the professional regulators should be included as “there is learning to be had from these as well”.
The PSNC also suggested the introduction of a safety alert system, similar to that in the recently announced changes to the GP contract, where alerts are distributed directly to community pharmacies so that they are always received in a timely manner.
The RPS said that it "strongly supported" proposals to develop a network of senior patient safety specialists, which could include pharmacists, but said that the specialists should be a minimum band 8a or have the equivalent skills and knowledge of patient safety culture.
Regardless of the effort and time a pharmacy puts into developing and implementing its continuous quality improvement (CQI) program, one truth remains – there will still be mistakes and medication errors. No system will eliminate all errors. Anytime a medication error reaches a patient, there is a chance a patient will be injured by the mistake. Risk is the nature of the profession of pharmacy. It is an imperfect science and an awesome responsibility.
When a mistake is made, the first reaction of those in authority is to blame someone. There must be someone to punish. We look for the last person who worked on the prescription, and we heap shame on that person. It is easy to lapse into the 17th century mentality of “burn the witch.” Boards of pharmacy often find themselves placed in this position – they must find and punish the culprit.
CQI, however, teaches that the way to reduce medication errors is to improve the process and the workflow. The CQI theory is to look for the root cause of the error and change the system to eliminate that cause. If all we do is “burn the witch” and fire the pharmacist, then the next time the same sequences of events line up, we must find a new witch. Eventually there is no one left to fire, no one reports mistakes, and there is no improvement.
Most mistakes in the pharmacy are the result of simple human errors, which any pharmacist and technician can make. As long as human beings play any part in the practice of pharmacy, there will be human errors. We could no more eliminate all mistakes than we could stop being human. CQI systems are necessary because pharmacists and technicians are human beings. The root cause is that thing which failed to prevent our act of human frailty.
For perhaps 90% of all the medication errors that pharmacists and pharmacy technicians make, the CQI theory of eliminating the “blame and shame” of being human works. We eliminate fear of reporting and with each reported error there is a search for the root cause and the system is improved. The risk of the next error is reduced.
There are a few medication errors, however, for which there needs to be blame assessed and for which punishment is appropriate. Very occasionally the root cause is not a process flaw or a workflow difficulty. Sometimes the root cause is the pharmacist or pharmacy technician’s at-risk behavior. At-risk behavior may be multi-tasking or trying to fill too many prescriptions at one time.
Still less common, there are times when the person’s behavior is not just at-risk, but reckless. The individual has shown a reckless disregard for the safety of his or her patients. Among examples of recklessness is the person who arrives at work drunk or high. Sometimes we say this person has demonstrated that he or she just doesn’t care.
If ethics includes justice, then it is incumbent upon managers, supervisors, and boards of pharmacy members to understand the differences in each of these types of action. When a medication error is the result of simple human error, then the system needs improvement. When the pharmacist or pharmacy technician exhibits at-risk behavior, then education is appropriate. Actual discipline, however, should be reserved for those persons who exhibit a reckless disregard for the safety of patients and the system. Punishment cannot be meted out according to the harm that results, but by the actions that caused it. It is easy to punish – it is hard to determine which person to punish and why.
Pharmacists are key players in a root cause analysis (RCA) because "they understand potential for harm related to the use of medications and likelihood of potential errors," said Link. "Pharmacists can provide valuable recommendations for resolution and prevention."
In healthcare and many other industries, we jump to assign blame to an individual when a mistake is made. Then, we turn to technology or education as a solution, possibly firing the mistake-maker as well. This may feel like the right thing to do, but it rarely works.
The “blame game” does not reduce error. In fact, it increases your chance for error! What works in a healthcare setting is a “Just Culture” – an environment where workers trust each other, are rewarded for providing safety information and are clear about their responsibility to make safe behavioral choices. In a nutshell, healthcare professionals must feel safe to report errors, including their own, and administrators must be driven to learn what causes mistakes and how to reduce their number and severity.
A “Just Culture” approach doesn’t mean no one ever is punished or even fired. This isn’t a “no responsibility” idea. To the contrary, we focus intently on types of error, so that we can correctly understand causes and what actions may be taken to reduce errors.
There are three basic types of errors – and it’s a mistake to lump them into one basket:
Human error: This is a lapse, a pure mistake, an inadvertent action. For example, reading a prescription wrong or not catching a prescription that was written wrong.
At-risk behavior: Humans tend to drift, to underestimate risk, to ease up, especially when other pressures bear down and shortcuts have been taken in the past without harm. It could be failing to check a prescription after it has been filled by a reliable pharmacy technician, or working without a lunch break so that your attention is compromised.
Reckless behavior: This is a choice – like driving drunk. A medication error that results from a conscious disregard of a substantial, unjustified risk must be addressed appropriately.
To reduce errors, you have to understand why they were made. A pharmacist who makes a human error needs to be consoled – not punished. Everyone makes mistakes. In a "Just Culture," we accept this reality and encourage openness and responsibility. When an honest mistake occurs, ask what processes or procedures need to be changed, not what penalty can be applied to a particular person.
In contrast, reckless behavior must be punished even if harm doesn't result. In other words, the drunk driver should face consequences, even if no one is killed. Otherwise, the wrong incentives are in place and errors won't be reduced.
Let me tell you a story that illustrates why a “Just Culture” is so essential to reducing errors. I once worked with a hospital system that had issues with nurses not scanning wristbands for positive patient identification when providing medications. The nurses developed many clever workarounds to ensure they still achieved (on paper) the required 98% scanning compliance rate. Nurses were often reprimanded for not scanning wristbands, but no one ever asked why they weren’t scanning as required. Well, my team finally did. We discovered the wristband system was nearly impossible to adhere to in the real world. Wristbands wouldn’t scan if they got wet, ripped or bent. Nurses weren’t scanning because the wristbands wouldn’t scan.
Finally, a seasoned nurse risked her job and spoke up. She told us: I'll tell you why I've found multiple workarounds. I want to keep my job and I know my scanning rate has to be close to 98%, so I scan stickers on charts, carts, bedside tables – anything to make it look like I actually scanned the patient. A few months ago I tried scanning my patient and after several failed attempts, I looked over at my patient and saw that she was silently crying. Tears were rolling down her cheeks. When I asked her why she was crying, she told me she was afraid because I obviously had the wrong medicines for her. After that day, I decided I would never trust this system!
If managers had taken the time to ask the nurses why they didn't scan the patients' wristbands, they could have worked to create a better system. Sometimes — and this can be hard to accept — our policies and procedures don't work. There is always a cause behind both human and at-risk behavior. A leader must not only investigate the system but also determine the causes behind the behavior.
In 2011, the Agency for Healthcare Research and Quality found that only 44% of hospital employees felt comfortable reporting errors. That illustrates how far we have to go in creating a “Just Culture” and, therefore, a safer system for patients. Fear of reporting allows errors to be repeated, sometimes with devastating results. The good news is that treating people based on behavioral choices – rather than outcomes – will provide an eye-opening and liberating way to improve the safety of our health care system.
“As humans, we are raised to understand that if you make a mistake, you are going to be punished, whether you meant to make the mistake or not,” said Natasha Nicol, PharmD, FASHP, Director of Medication Safety at Cardinal Health in Dublin, OH. “Just Culture goes against that and says no, we need to understand and know and accept and learn that human beings make mistakes. We need to plan for that in order to understand the causes behind the mistakes and build risk reduction strategies to stop errors from happening.”
Nicol studied under Marx, is a certified Just Culture trainer, and teaches hospitals across the nation how to implement and sustain Just Culture practices.
An important element of Just Culture in health care is to make sure that hospital staff has the safest possible systems, procedures, and policies in place. At the same time, health care personnel should understand that they need to make safe behavioral choices.
“This is where people drift. They say OK, I know I’m supposed to do it this way, but I do it a different way because it’s faster,” Nicol explained.
An organization needs to look at the entire system and determine if a policy or procedure makes sense. “They need to understand why people make the behavioral choices they do,” said Nicol.
For example, Nicol is working with a hospital system in Alabama to implement Just Culture and improve medication processes, from the time a physician thinks about writing an order until the medication is being monitored in the patient's body. Nicol reviewed the hospital's scanning rates for bedside bar coding. Although the hospital's leadership thought nurses were doing a great job of scanning, "it turns out that the nurses hardly ever truly scanned the arm band on a patient," she said. "We had to look deeper to understand the reasons behind this behavioral choice."
It turns out that the nurses weren’t scanning the bar codes on the patients’ armbands because often the bar codes don’t work. Sometimes this is a result of a printer error or the band getting wet or torn.
“The nurses said, ‘I know I need to confirm and scan the bar codes, but the system doesn’t work for me,'” said Nicol. Once she identified the barriers to complying with the bar code scanning process, the hospital began working on fixing the underlying problems so the process will work for the nurses.
Nicol believes that a Just Culture is necessary to effect positive change. “Without Just Culture, hospitals will not be able to make medication use safer, and without Just Culture, people don’t feel comfortable talking, explaining, and sharing ideas about errors,” she said. For pharmacists, the key is to “move toward more standardization of procedures to help reduce errors.”
Get over the notion that you can be perfect every day. “Don’t fret about making a mistake. If you want to worry, worry about whether you are making the right choices when you step into the pharmacy,” said David Marx, the father of Just Culture and a founder of the Just Culture Community.
According to Marx, human fallibility can be organized into three categories: human error, at-risk behavior, and reckless behavior.
These three types of fallibility are each managed in a specific way. Human error, such as a slip, lapse, or mistake, is often a product of an organization’s current system design and is managed by consoling the individual with support and compassion, followed by an evaluation of the system for possible improvements that can be made to the processes, procedures, training, or environment.
By far the most prevalent behavior of the three, at-risk behavior—where an individual believes the risk is insignificant or justified—is managed by coaching. Removing incentives for at-risk behavior, creating incentives for safer choices, and realigning the individual’s perception of the risks they are taking can improve the individual’s behavioral choices.
Reckless behavior involves an individual consciously disregarding a substantial and unjustifiable risk. Reckless behavior is rare and is effectively managed through disciplinary measures and punitive actions.
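The three-way mapping described above can be summarized as a small decision table. The sketch below is purely illustrative — the function and its names are assumptions for this paper, not part of any real Just Culture toolkit — but it captures the key point that the management response tracks the behavior category, not the outcome:

```python
# Hypothetical triage helper for Marx's three behavior categories.
# Categories and responses come from the text; the code is an illustration only.
RESPONSES = {
    "human_error": "console",   # support the individual, then improve the system
    "at_risk": "coach",         # realign risk perception, remove bad incentives
    "reckless": "discipline",   # conscious disregard of a substantial risk
}

def triage(behavior: str) -> str:
    """Return the Just Culture management response for a behavior category."""
    try:
        return RESPONSES[behavior]
    except KeyError:
        raise ValueError(f"unknown behavior category: {behavior!r}")

print(triage("at_risk"))  # prints "coach"
```

Note that the severity of the patient outcome appears nowhere in the table: a human error that causes harm is still consoled, and reckless behavior that causes no harm is still disciplined.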
“Although a person may be sorry that a mistake occurred, we need to figure out why that mistake occurred,” said Marx. “We need to look at the error and [understand] that it is really an outcome. The outcome is a product of the system and the performance-shaping factors within that system.”
Medication errors continue to be a concern of health care providers and the public, in particular how to prevent harm from medication mistakes. Many health care workers are afraid to report errors for fear of retribution including the loss of professional licensure and even imprisonment. Most health care workers are silent, instead of admitting their mistake and discussing it openly with peers. This can result in further patient harm if the system causing the mistake is not identified and fixed; thus self-denial may have a negative impact on patient care outcomes. As a result, pharmacy leaders, in collaboration with others, must put systems in place that serve to prevent medication errors while promoting a “Just Culture” way of managing performance and outcomes. This culture must exist across disciplines and departments. Pharmacy leaders need to understand how to classify behaviors associated with errors, set realistic expectations, instill values for staff, and promote accountability within the workplace. This article reviews the concept of Just Culture and provides ways that pharmacy directors can use this concept to manage the degree of error in patient-centered pharmacy services.
A substantial body of evidence from international literature points to the potential risks to patient safety posed by medication errors and the resulting adverse drug events. In the United States, medication errors are estimated to harm at least 1.5 million patients per year, with about 400,000 preventable adverse events.1 In Australian hospitals, about 1% of all patients suffer an adverse event as a result of a medication error.2 Of 1,000 consecutive claims reported to the Medical Protection Society in the UK from July 1, 1996, 193 were associated with prescribing medications.3 Medication errors are also costly to health care systems, patients and their families, and clinicians.4,5 Preventing medication errors has therefore become a high priority worldwide.
A fundamental problem inhibiting the reporting of errors is the variation in how errors are defined, what information is reported, and who is required to report. In the 1970s, a physician’s prescription for a dose of a medication not appropriate for a patient’s renal function was not considered a medication error. Only after the safety movements of the 1990s were these prescribing errors, attributed mostly to physicians, recognized as medication errors. Near misses are often not reported. Health care workers who believe that an error or near miss is unimportant or causes no harm might decide not to report it. Medication error reports are often difficult to complete and take around 15 to 20 minutes. A busy clinician may not take the time to fill in error details, some of which may not be readily retrievable. In addition, the lack of standardization in the information reported makes it difficult to identify trends in the data.
Many errors go unreported by health care workers; reporting of medication errors in large academic medical centers averages 100 per month (Joseph Melucci, personal communication, February 22, 2017). Given the numbers of doses dispensed by most hospital pharmacies, this reporting percentage is quite low. Only serious or harmful medication errors are reported; errors that do not cause harm but necessitate a systems fix to prevent them in the future are not reported. The major reason errors are not reported is that self-reporting will result in repercussions.6 Health care workers may suffer worry, guilt, anxiety, self-doubt, blame, and depression following serious errors, both for themselves (for disciplinary actions) and for the patient who has been harmed. Support for health care workers in these situations often rests with family members, while some hospitals have programs for “second victims” of medication errors.7 Most health care workers hide the pain of their mistake with silence, instead of admitting their mistake and discussing it openly with peers. Hiding errors can result in further patient harm if the mistake is not identified and fixed; thus self-denial may have a negative impact on patient care outcomes.
The ideal safety system has a robust, easy reporting process for errors and a culture that does not assign blame for errors and reporting; transparent discussion is rewarded. This system increases error reports of all types and establishes a continuous cycle of problem identification and process improvement. Communication and collaboration with risk managers, safety officers, and pharmacy leaders are necessary to provide quality care and encourage a culture of safety. In a culture of safety, open communication facilitates reporting and disclosure among stakeholders and is considered the norm. Even some of the most advanced organizations in terms of safety culture continue to struggle with the balance between personal accountability and a no-blame approach to medication errors.
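The "continuous cycle of problem identification" described above depends on reports being standardized enough to aggregate. As a minimal sketch — the record fields and sample data here are hypothetical, invented only to illustrate the idea — trend identification can be as simple as counting reports per root cause:

```python
# Illustrative only: standardized incident records make trends visible.
from collections import Counter

# Hypothetical report records; field names and values are assumptions.
reports = [
    {"type": "near_miss", "root_cause": "look-alike packaging"},
    {"type": "error", "root_cause": "interrupted verification"},
    {"type": "near_miss", "root_cause": "look-alike packaging"},
]

def trend(reports):
    """Count reports per root cause so recurring system issues surface."""
    return Counter(r["root_cause"] for r in reports)

print(trend(reports).most_common(1))  # prints [('look-alike packaging', 2)]
```

This is why near misses matter: they feed the same counts as harmful errors, so a recurring system weakness can be fixed before it reaches a patient.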
The shift to a blame-free culture occurred in the mid-1990s with the acknowledgement of human fallibility and the idea that no practice is without error. During this time, the focus moved from the individual to the system processes that allow errors to occur. A series of key papers by physician leaders acknowledged that even the most experienced and knowledgeable employee had the capacity to make an error that would result in harm to a patient.8,9 Although a culture that does not place blame was a step in the right direction, it was not without its faults. This model failed to confront individuals who willfully and repeatedly made unsafe behavioral clinical practice choices. Disciplining health care workers for honest mistakes is counterproductive, but the failure to discipline workers who are involved in repetitive errors poses a danger to patients. A blame-free culture holds no one accountable and any conduct can be reported without any consequences. Finding a balance between punishment and blamelessness is the basis for developing a Just Culture.
State Boards of Pharmacy were slow to change their perspective. The following case in Ohio clearly demonstrated a blaming approach to medication errors. A pharmacist was jailed after he was accused of negligence in failing to detect a pharmacy technician's chemotherapy mixing error that resulted in the death of 2-year-old Emily Jerry.10 There was outcry and concern voiced on both sides of this case, and it caused anxiety and fear in pharmacists that limited the reporting of errors.
With all processes, human factors are often the cause of mistakes. In the Emily Jerry case, human error resulted in serious consequences for all parties involved. Medication errors create other consequences, including lost income and wages, loss of trust in the health care system, decreased morale, and physical and psychological pain. The majority of medication errors result not from reckless behavior but from faulty systems and processes. Pharmacy leaders should strive to put systems in place that serve to minimize human error and implement a Just Culture way of thinking both inside and outside of the pharmacy department. To do this, pharmacy leaders must understand how to classify behaviors associated with errors, set realistic expectations, instill values for staff, and promote accountability within the workplace.
This article reviews the concept of a Just Culture of safety and its implications for pharmacy leaders involved in balancing accountability and system failures resulting from medication errors. The specific aims of this paper are to (a) review the various behaviors involved in any error, (b) describe the fundamental leadership approaches in establishing a Just Culture of safety, and (c) describe how the Just Culture algorithm is applied to a medication error. While operational and clinical effectiveness is important in developing patient-centered pharmacy services, promoting and establishing a Just Culture of safety provides a framework for open dialogue and continuous process improvement to ultimately prevent serious and harmful medication errors.
Adopting one model of shared accountability
Learning from mistakes vs. blaming individuals
Managing behavioral choices (human error, at-risk behavior, reckless behavior)
Designing safety into all clinical systems and processes
Commitment of organization/leadership to shared goals
Just culture originated in the 1980s in the aviation industry, where safety errors can have catastrophic results.
To help reduce aircraft accidents, there was systematic review of the technology, the training and the culture in aviation. It was recognized that the conditions for accidents were often known by people in the workplace who were afraid to speak up for fear of being reprimanded or humiliated.
“Improving the safety of an organization depends on learning the details from near-miss events, including specifics often known only by the people most closely involved with the incident,” Dr. Larson said. “If staff members do not feel safe to speak freely without adverse consequence, they tend to hide, cover up or simply not report those details.
“Learning only happens when errors and problems are recognized, disclosed, analyzed and openly discussed, which requires that managers move beyond the ‘blame and shame’ approach to dealing with errors,” he explained.
For example, in health care, a just culture can be especially beneficial when shortcuts are taken that can adversely affect the level of care provided, according to Nadja Kadom, MD, associate professor and director of pediatric neuroradiology and pediatric radiology quality, Emory University School of Medicine and Children’s Healthcare of Atlanta.
“When staff members take shortcuts that put patients at risk, writing them up is not the solution. Taking shortcuts — an “at-risk” behavior — requires coaching the individual on proper procedure. In addition, many improvement opportunities may be uncovered when we try to identify why this person feels the need to take shortcuts in the first place,” Dr. Kadom said.
She added that shifting from blaming individuals to looking at the system as a whole in a just culture ultimately leads to a more productive organization and a more enjoyable work environment.
Employees want to do the right thing, particularly when that means voicing concerns about safety. A just culture fosters an environment in which individuals are not afraid to disclose and discuss a mistake.
“By encouraging open communication of error in a non-punitive environment, just culture holds institutions and providers accountable for actions and establishes a uniform and fair approach to improving patient care,” said Priscilla J. Slanetz, MD, MPH, associate professor of radiology, Harvard Medical School, Boston.
Just culture is not a tool or a method limited to certain aspects of the work, according to Dr. Larson. “A just culture must permeate the organization,” he said. “It is a shared mindset that asserts that everyone will focus much more on learning from errors and adverse events than on assigning blame.”
In a just culture, there is a shared recognition that adverse events may be multifactorial and systems and processes play an important role in increasing the likelihood of individual human errors. A just culture stresses the importance of creating systems and processes that decrease that likelihood.
Making the shift from a culture of blame to one of safety requires strong, compassionate and caring leadership.
“Leaders must embrace staff that speak up and identify issues in their departments,” Dr. Slanetz said. “Staff who raise concerns must be viewed as providing opportunities for departments and organizations to grow. Only then can health care organizations add value, and most importantly, be able to provide the highest quality of care.”
Leaders who want to move to a just culture must realize that when it comes to managing safety, they often cannot trust their instincts, according to Dr. Larson. Reading about just culture and learning from other organizations that have adopted the approach is probably the best way to start aligning policies and practices to support a just culture.
“Leaders often have to unlearn their whole approach to safety, because the instinctive reaction to an adverse event is to look for a single cause and blame the person closest to the event with the severity of the punishment tied to the result of the error,” Dr. Larson said.
Dr. Kadom, who recently developed several workshops on just culture at Children’s Healthcare of Atlanta, would agree.
“Starting the discussion is important and leaders can do that by setting an example,” Dr. Kadom said. “Sharing stories of their own mistakes can go a long way in creating an open, non-punitive environment. It will remind staff that managers are humans too.”
Just Culture in Radiology
Since so many areas of radiology depend on human performance, the specialty stands to benefit greatly from just culture, according to Dr. Larson. But it can’t be turned on and off only when errors occur.
“Just culture requires an organizational commitment to embrace and maintain it,” Dr. Larson said. “The culture must be carefully protected, since it is based on trust, which takes time to develop and is easily lost.”
How far along is radiology in embracing just culture? It’s not quite a tsunami of change yet, according to Dr. Slanetz.
“Nearly every institution has created codes of conduct and committees focused on quality and safety,” Dr. Slanetz said. “However, at the individual level, the ‘blame game’ is not gone and creating a culture where everyone feels free to speak up is still lacking at some facilities. The tools might be in place, but we still need to figure out how to use those tools effectively to shift the culture successfully.”
While there are often up-front costs associated with adopting such a widespread culture change, such as culture assessment and staff education, those costs can be outweighed by the benefits to safety and quality of care.
“Just culture adds substantial value and saves money as fewer medical errors have innumerable benefits, beyond just better patient outcomes,” Dr. Slanetz said. “It helps health care providers continually improve and create institutional systems that are more streamlined and effective.”
Dr. Larson suggests that just culture also has the potential to affect the psychological cost of medical errors. Patient harm events are intrinsically accompanied by feelings of guilt, he explains, which can be emotionally distressing to the individual involved. Adding a reprimand can greatly exacerbate the impact.
“The combination can be psychologically devastating to individuals involved in an event,” Dr. Larson said. “Therefore, a great deal of fear often surrounds patient safety programs. Just culture is about driving out that fear so that learning can occur.”
Focusing on the human interaction with systems can help organizations begin to adopt just culture, according to Dr. Slanetz. “By creating strong, more integrated systems for accountability, organizations can create environments focused on learning and openness — all in a culture of accountability,” Dr. Slanetz said.
Ultimately, just culture leads to both a more productive organization and a more enjoyable work environment, so it is in the best interest of radiology to adopt it, Dr. Larson said.
One organizational approach has been to seek out errors and identify the responsible individual. Individual punishment follows. This punitive approach does not solve the problem. People function within systems designed by an organization. An individual may be at fault, but frequently the system is also at fault. Punishing people without changing the system only perpetuates the problem rather than solving it.
Implementing a just culture
First and foremost, it goes without saying that we all want to do the right thing, at the right time, for the right patient, the first time and every time thereafter, without ever making a mistake or causing harm. And in a perfect world, that is what would always happen: we would take good care of our patients, who in turn would always have great outcomes.
Unfortunately, it’s not a perfect world and every healthcare provider will make at least one error during the course of his or her career. Whether it will actually reach a patient is irrelevant. The fact is, an error will still be made, which results in at least one of the following outcomes:
Actual event, with patient harm
Actual event, with no patient harm
Near miss, with no patient harm, but that has the potential to cause harm
We also know now that the majority of errors result from systems failures or process problems, and that if we focus on and fix the process, we will have more success in achieving safer patient care conditions than if we target the people involved or punish providers for making those errors. It is well known that how an organization manages its event reporting system tells a lot about the organization’s culture. It follows that an organization with a robust event reporting database in all likelihood has a positive culture of patient safety, one in which staff members are comfortable reporting because they know that no punitive action will be taken against them. However, we do not discount accountability when an error is made, particularly in light of our understanding of the set of algorithms that help us determine whether the behavior was human error only, at-risk behavior, or reckless behavior. We also know that organizations that have instituted a culture of patient safety seem to have followed several strategies to ensure successful implementation.
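The classification just described, often called the Just Culture algorithm after David Marx's work, maps each category of behavior to a different management response. The following is an illustrative sketch only, not an official implementation: the category and response labels are assumptions chosen for clarity, and real cases require judgment rather than a lookup table.

```python
# Sketch of the Just Culture behavioral-choice model (after Marx).
# Labels are illustrative assumptions; actual application requires judgment.

RESPONSES = {
    "human_error": "console",   # inadvertent slip or lapse: support the person, fix the system
    "at_risk": "coach",         # risky shortcut whose risk was not recognized: coach, remove incentives
    "reckless": "discipline",   # conscious disregard of a substantial, unjustifiable risk
}

def recommended_response(behavior: str) -> str:
    """Return the model's recommended management response for a classified behavior."""
    if behavior not in RESPONSES:
        raise ValueError(f"unknown behavior category: {behavior!r}")
    return RESPONSES[behavior]
```

For instance, a nurse who selects a look-alike vial has made a human error, so the model points toward consoling the individual and redesigning the system (labeling, storage), not punishment.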
A healthcare organization’s staff members, from leadership to frontline staff, have to understand the concept of a just culture, agree with the principles, and practice it every day. However, to ensure a hospital is operated 100% in a just culture manner, there are steps to take at the beginning of your journey. A healthcare organization would also be wise to review these steps occasionally and ensure everything is being done to support a just culture.
Step one: Leadership buy-in
If the culture of the organization is to truly be patient safety-focused, then it is absolutely critical that leadership set the example by communicating to all employees, including managers, medical staff, and board members, that patient safety is a priority organizational strategic goal. There are many steps leadership can take to ensure that they mean business when they say patient safety is an important component of success for the organization. The following is a checklist for leadership to ensure they are communicating this priority effectively:
Make sure patient safety issues are discussed at senior leadership and board meetings
Do walk-arounds and talk to staff about what they perceive as serious patient safety issues
Ensure senior leadership representatives are involved in performance improvement, patient safety, and risk management committees; their attendance and participation should be required
Ensure patient safety education is presented at new staff and medical staff orientation programs, as well as at regularly scheduled educational sessions
Provide a “Lessons Learned” forum where staff can present actual adverse events to their colleagues and share with the group their plans for preventing these issues from happening again
If a plan was successful, ask staff to share how this was accomplished with the rest of the organization and promote these successes throughout the organization
Make patient safety a part of competencies and performance evaluations for every hospital employee, the medical staff, and the governing board
Step two: Formation of a patient safety culture committee
Form a patient safety committee, task force, or initiative composed of representatives from nursing, ancillary clinical and nonclinical departments, middle and senior leadership, and the medical staff. This team of committed individuals will eventually serve as just culture champions for the entire organization, as well as a resource for assistance and guidance to support the staff’s efforts. Members of this group will also be crucial in helping to build awareness of what the staff, as well as leadership, perceives the current culture to be and will serve as cheerleaders during the patient safety culture assessment survey process, encouraging survey completion housewide. The results are usually an initial surprise to leadership, because their perception of the current culture will most likely differ from that of the frontline staff.
If it hasn’t been done by now, it is time to assess the organization’s current culture of patient safety. The best practice is to resurvey every six months, focusing on those areas most needing immediate improvement, until measurement determines that the organization has at least met its minimum target for improvement. Once improvements are identified as being more stable with respect to the rate of progress, then the survey should be conducted every year.
The next step is to educate senior leadership, members of the board, and other key operating managers through a just culture workshop or other full-day session. At this point, an outside expert on the subject is the most credible instructor: someone who can truly do justice to the topic and motivate attendees into action. Inviting a speaker from outside the hospital also validates the message that the hospital’s patient safety staff has usually been trying to communicate for a long time before the expert is brought in to conduct the session.
Next, develop an orientation training program on patient safety and just culture for all new directors, managers, and supervisors coming on board, as well as for those who have been with the organization for some time. Include the topic of accountability as part of the program and encourage participants to work with human resources before utilizing a disciplinary process or taking any action. Develop a similar program to be presented annually as a reminder of the organization’s commitment to fairness and accountability.
In most cases, once senior leadership and managers have been brought on board, formal education sessions for the staff may not be necessary, depending on how comfortable mid-level managers feel about communicating to their staff what they have learned and ways in which patient safety can be improved. In fact, in many instances, educating the rest of the staff can really be incorporated into the routine and practical operations of the institution.
However, if the organization feels that formal staff training is necessary, there is no need to start from scratch. There are several programs already out there online that can be adapted to meet the organization’s needs. For more information, www.justculture.org is a great place to start looking into how education can be improved.
Revise policies, procedures, and protocols (particularly those relating to expectations for behavior) and continue efforts through routine orientation, during regularly scheduled competency training, and at unit-specific education programs whenever possible. Any policies that do not promote a just culture should be eliminated, specifically those relating to punishment for errors. Policies that will need to be revised include the incident reporting policy, sentinel event policy, disclosure policy, patient complaint/grievance process, job descriptions, codes of conduct, medical staff bylaws, and rules and regulations.
Any document that addresses the consequences for behavior and the management of adverse events will need to be revised to reconcile professional accountability and the need to create a safe environment to report medical errors. In other words, the staff need to know that if an event occurred because of a system failure or flaw, then the organization accepts responsibility and accountability, and the individual will not be punished for something that was out of his or her control.
Leadership will need to understand that the reasons for clinical outcomes and events should not be prejudged, nor should those involved. Any rush to blame individuals is to be avoided. Rather, there should be an attempt to understand the circumstances and context for the actions and decision-making at the time the event occurred. The main focus of this analysis is on system failures, with any and all subsequent analyses and proceedings conducted with fairness, within legislative and legal frameworks, and in accordance with established hospital policy and/or bylaws. The rights of all individuals, both employees and patients, are protected, and policies and procedures should reflect language that addresses:
Leadership’s commitment to and support of the purpose of quality improvement
Leadership appropriately protecting any and all quality improvement information from legal, regulatory, or other proceedings
The organization’s intolerance of intentionally unsafe actions, reckless actions, disregard for the welfare of patients or staff, or other willful misconduct and misbehavior
In health care we excel in defining projects and tackling them with zeal, yet the end result, particularly in safety-based projects, is that most do not achieve the desired outcomes. Instead, projects suffer from inadequate design, and we harvest, at best, modest results. Five years after the IOM report “To Err Is Human,” there is general consensus that we have not accomplished our goal to appreciably decrease harm, and we have little solid evidence that the delivery of health care is safer and more reliable (Kohn, Corrigan, and Donaldson 2000; Leape and Berwick 2005). Other industries, those labeled “highly reliable,” take a more systematic approach and achieve greater success.
Highly reliable industries foster “mindfulness” in their workers. Mindfulness is defined by Roberts, Weick, and Sutcliffe as comprising five components: a constant concern about the possibility of failure even in the most successful endeavors; deference to expertise regardless of rank or status; an ability to adapt when the unexpected occurs (commitment to resilience); an ability to concentrate on a specific task while maintaining a sense of the bigger picture (sensitivity to operations); and an ability to alter and flatten hierarchy as best fits the situation (Weick and Sutcliffe 2001). Together, these characteristics appear to generate reliably dependable processes with minimal and manageable errors. Health care aspires to high reliability but has not, to date, clearly framed the steps necessary to achieve it. Our historical approach mimics early steps in other industries, as evidenced by a preoccupation with fancy technology and outcome-based initiatives, without the systematic effort to build the mindfulness necessary to make all other initiatives successful. As the science of patient safety deepens, health care’s path to mindfulness and high reliability is becoming clearer. This article’s goal is to relate three initiatives that are underway in many hospitals and health care systems, and to argue that together they comprise a cornerstone of any comprehensive patient safety plan. These three initiatives are critical and must be pursued with, and integrated into, all other operations. They are (1) the development of a Fair and Just Culture (Marx 2001), (2) leadership intelligently engaged in safety WalkRounds, using frontline provider insights to directly influence operational decisions (Frankel et al. 2003), and (3) systematic and reinforced training in teamwork and effective communication (Helmreich and Musson 2000; Gaba 2001; Cooper and Gaba 2002; Leonard, Graham, and Bonacum 2004; Baker et al. 2005).
The success of these pursuits is interdependent, and hospitals interested in transforming care must spend equal effort on them. That effort must be substantial and equal to what is currently spent on information technology and outcome-based initiatives (see Figure 1), such as IHI’s 100,000 Lives Campaign (Davis 2005), NQF’s Patient Safety Practices (Kizer 2001), and the Leapfrog initiatives (Milstein 2002). If pursued in this manner, outcome-based initiatives are likely to reach their goals more frequently and faster; failure to do so is likely to ensure that safe and effective care remains an elusive goal. The tools work synergistically, are reasonably simple in concept but less easily implemented, and are difficult to measure. Ultimately, they are essential for all other efforts. This article relates the components of Just Culture, Engaged Leadership, and Teamwork and Communication and suggests a framework for action in each, including specific tools.
FAIR AND JUST CULTURE: APPROPRIATE ACCOUNTABILITY
Define Fair and Just Culture
A Fair and Just Culture is one that learns and improves by openly identifying and examining its own weaknesses. Organizations with a Just Culture are as willing to expose areas of weakness as they are to display areas of excellence. Of critical importance is that caregivers feel that they are supported and safe when voicing concerns (Marx 2001). Individuals know, and are able to articulate, that they may speak safely on issues regarding their own actions or those in the environment around them. They feel safe and emotionally comfortable while busily occupied in a work environment, able and expected to perform at peak capacity, but able at any moment to admit weakness, concern, or inability, and able to seek assistance when concerned that the quality and safety of the care being delivered is threatened. These workers are comfortable monitoring others working with them, detecting excessive workload and redistributing the work when appropriate to maintain safety and reliability.
Each individual feels as accountable for maintaining this environment as they do for delivering outstanding care. They know that they are accountable for their actions, but will not be blamed for system faults in their work environment beyond their control. They are accountable for developing and maintaining an environment that feels psychologically safe. They will not be penalized for underreporting when it feels unsafe to voice concerns.
This is not utopian; it boils down to the comment, “I feel respected by everyone in each work interaction I have.” This state is achievable when outstanding leadership ensures that every employee clearly understands his or her own accountability, and models that accountability.
Accountability—being held to account—is based on a relationship between two or more parties in which the product of one party—individual or group—is evaluated by another party. This process can be contractually formalized or molded over time by social pressures and historical norms.
The components of accountability include the individual’s understanding that they are to perform an action, a clear expectation of what that action is, and the means by which they will be evaluated. Consider a surgeon performing an operation. She is accountable to other members of the “team,” to the hospital as a whole, to state licensing and accrediting bodies, and to the patient. She may have to account for the number of surgeries performed, or perhaps only for those surgeries that are problematic, or only those that go so badly awry that a patient is hurt. What becomes immediately apparent in this simple description of an operation is that accountability in health care encompasses multiple expectations about actions and the reporting of them; each group’s expectations differ based on social mores, regulation, law, and historical precedent. The tenets of a Fair and Just Culture should help organizations develop a framework for consistent accountability and begin to repair the current environment, in which accountability is poorly defined and individuals are unclear about what the rules are, or whether the rules are constantly changing.
Today, adding up the surgeon’s various accountabilities, she is accountable for increased risk, regardless of whether it was taken knowingly or not; for not following rules, regardless of whether doing so increased or decreased risk; and for outcomes based on the severity of the outcome, not the causative activity. In a Fair and Just Culture, the surgeon will be held accountable for knowingly and unnecessarily increasing risk; the severity of the outcome and the breaking of rules will be subject to that principle. To be absolutely clear, health care organizations, and occasionally individual providers, are ethically responsible, through insurance mechanisms and otherwise, for aiding and possibly compensating a harmed patient. However, from the perspective of systems improvement, learning and positive change are more likely to occur when compensation is uncoupled from the evaluation of an adverse event. A Fair and Just Culture can be cultivated in health care organizations regardless of whether this aspect of adverse events is fully reconciled; in fact, a Fair and Just environment is likely a viable mechanism for diminishing the sting of the current malpractice tort process. Open discussion and transparency are characteristics that lead to mediation and resolution, not litigation.
Industries Outside of Health Care
The environment described, while rare in health care, is embedded and evident in other industries we perceive as reliable and safe. In aviation, for example, insights about human behavior 45 years ago led to the science of human factors, which helped shape the industry through the adoption of standardization and simplification rules to produce greater reliability and safety. The importance of acknowledging employee concerns and hazards is evident. For over a quarter century, an error reporting system paid for by the federal government through the Federal Aviation Administration and managed by NASA has been extensively used (McGreevy and Ames Research Center 2001). It has evolved into open reporting systems administered within specific airlines. Pilots have been trained for the past 30 years to understand and admit their fallibility, and the industry they work in promotes regular discussion of individual failings. Pilots are regularly evaluated for both their technical skill and their ability to promote effective teamwork. The application of human factors is uniformly manifest (Gaba 2001). The result is an extraordinary safety record.
Relationship to Teamwork and Leadership Involvement
In contrast, as surgeons and anesthesiologists walk into hospital operating theatres, they do so with the underlying expectation, based on training and habit, that everyone in the room is “expertly” trained and will manage their specific job without error. No real briefing of the team consistently occurs before each procedure between surgeon, anesthesiologist, nurse, and technician (albeit per JCAHO requirements they may now stop to ensure the correct side of the procedure—an act that is a fraction of the full briefing that should occur). The operating room team’s optimal functionality depends on the open discussion of teamwork and team expectations, and that is greatly dependent on how the hospital culture promotes such discussions. It is quite possible to envision strategically, and then produce structurally, an environment where each individual’s personal concerns about a particular surgical case can be voiced, and where concerns are raised as they arise, in real time, to the best advantage of the patient. How our hospitals strategically approach accountability, followed by the structures put into place to make the strategy manifest, will greatly affect whether care providers will speak up in that operating room. This will in great part determine the speed and efficacy of surfacing a problem, which affects the reliability of operating room care. The opportunities for improved care are endless, through improved communication and other systematic improvements directed by the knowledge gained from voiced concerns. What would this look like in real life? A perinatal unit provides a good example.
Clinical Example: Brigham and Women’s Hospital (BWH) Perinatal Unit
BWH in Boston delivers about 8,600 babies each year, and a significant percentage of those patients are delivered by private practice obstetricians, individuals with excellent reputations. A pregnant woman chooses an obstetrician to care for her (presuming she has the insurance to do so), and over the course of the pregnancy develops a bond with that physician. The obstetrician is duty bound—and accountable—to deliver the best care possible to the couple, and shepherds the pregnant woman over 9 months with the one goal of a healthy child and mother. The obstetrician may be part of a group, but if the patient is asked, she is likely to identify whom she thinks of as “her” obstetrician.
When the expectant mother enters the hospital, she expects expert decisions to be made about her labor by her skilled obstetrician, and because many of the obstetricians at the BWH deliver hundreds of babies each year in an environment where excellence is the norm, she is quite likely to achieve her desired outcome. But obstetricians are human and fallible. What happens when obstetricians misstep, when they become fixated on a particular diagnosis they have made and/or ignore new information that is clinically relevant? When they become fatigued, preoccupied, or are slightly less than expert in a given situation? The unique bond between physician and patient actually undermines the ability of other physicians or providers to even know that a poor decision has been made and to intervene. In the current environment on most obstetrical units today, only some percentage of the nurses would feel comfortable speaking up with their concerns if they perceived a problem with the patient’s care.
The BWH has instituted twice-daily “board” rounds where each patient is discussed jointly with the group of physicians and nurses covering the obstetric service at that point in time. There are always a fair number of providers present, with physicians representing both the teaching service and private staff. Through the board rounds, these clinicians have an opportunity to hear from their equals about the care being delivered—in real time. While it is quite likely the majority of their thinking will be precisely on target, there is now an opportunity for input and reconsideration of the care plan from additional experts. This added perspective is perceived as valuable, not meddling, and is now accepted as the norm. Teamwork, team coordination, and collaboration have been artfully developed by Dr. David Acker, BWH’s Chief of Obstetrics, and Margaret Hickey, R.N., Nurse Manager for Labor and Delivery, through these twice-daily board rounds. Nurses can speak their minds without fear of repercussions and actively advocate for the patients. So can residents-in-training and the more experienced senior staff. The rounds are not just an opportunity for teaching; they are, following the example of their two designers, manifest teamwork in action, based on the concepts of transparency engendered by a Fair and Just Culture; secondarily, and of equal import, they promote cross-professional and cross-specialty teaching.
A patient care system is obligated to collect productive investigative data that can be analyzed and acted upon to improve patient safety. This process is not possible unless members of the organization remain vigilant and mindful and maintain continuous surveillance. Similarly, people within the organization must believe that they are obligated to report errors. However, medical institutions cannot afford a blame-free culture: some errors do warrant disciplinary action. Finding a balance between the extremes of punishment and blamelessness is the goal of developing a just culture.
JUST CULTURE: CONCEPT AND PHILOSOPHY
A just culture balances the need for an open and honest reporting environment with the goal of a quality learning environment and culture. While the organization has a duty and responsibility to employees (and ultimately to patients), all employees are held responsible for the quality of their choices. Just culture requires a change in focus from errors and outcomes to system design and the management of the behavioral choices of all employees.2
Consider the following situations:
• Two nurses select the (same) wrong vial of intravenous medication from the dispensing system. One nurse administers the drug, causing cardiac arrest. The other nurse realizes the switch when drawing the solution from the vial into the syringe at the bedside. How do we approach the nurses and investigate the situation?
• The attending physician tells a resident physician to obtain a specific blood test. The resident forgets. Fearing the wrath of the supervising physician, the resident reports that the result is normal. How do we deal with this breach?
• A surgical team does not perform a surgical time out on the grounds that no adverse events have occurred in the past. How do we handle this violation?
• The night nurse supervisor reports to a medical director that the lead respiratory therapist was in the hospital at 4:00 am with alcohol on his breath. At a later date, the physician confronts the employee who vehemently denies alcohol abuse. Should the matter be dropped?
In only one of these scenarios does an adverse event occur, yet a just culture, with its insistence on a value-based culture and shared accountability, demands that all of these situations be addressed. However, individual practitioners should not be held accountable for mistakes made in a system they cannot control.3
In the first example, further investigation showed that the 2 vials of entirely different medications looked alike in shape, size, color, and print. This accident waiting to happen did happen to the first nurse and her patient. Human error was involved, but this nurse should be consoled and supported rather than punished.
The resident physician falsified patient data, which cannot be condoned and must be addressed. Honest disclosure without fear of retribution is an important characteristic of a just culture.
The surgical team cannot function outside of the value-based principles designed by the organization. Although this surgical team has never been involved in an adverse event, one may occur in the future.
As for the respiratory therapist, in a just culture we are concerned for the safety of our patients and we are concerned for and care about each other. Further nonpunitive investigation is necessary.
These examples address an aspect of just culture that goes beyond ensuring that employees feel free to report errors. Highly reliable organizations and industries foster mindfulness in their workers. Weick and Sutcliffe4 describe mindfulness in terms of 5 components:
A constant concern about the possibility of failure
Deference to expertise regardless of rank or status
Ability to adapt when the unexpected occurs
Ability to concentrate on a task while having a sense of the big picture
Ability to alter and flatten the hierarchy to fit a specific situation
Mindfulness throughout an organization considers, but moves beyond, events and occurrences. Everyone in the organization is continually learning, adjusting, and redesigning systems for safety and managing behavioral choices.
INDUSTRIES OUTSIDE OF HEALTHCARE
While the concepts of developing a just culture and supporting team function may be new to healthcare—spurred by publication of the IOM report “To Err Is Human” in 2000—the just culture environment has been embedded in other industries for many years.5 The industries of aviation, train transportation, and nuclear power have been accepted as highly reliable and safe.6 For aviation, frequently compared to healthcare, these principles and their foundation span 45 years.7 Nonetheless, within these just culture industries are examples of errors, failures, and accidents that are insightful and address human behavior in complex systems.8
Metrolink Train Collision, Chatsworth, 2008
The Chatsworth train collision occurred on the afternoon of Friday, September 12, 2008, at the beginning of the evening commute in a high-density travel corridor.9 This mass casualty accident brought a massive emergency response by the city and county of Los Angeles, taxing resources to the breaking point. Twenty-five people died, and many survivors were hospitalized for extended periods. The Metrolink train company was exposed to more than $200 million in liability judgments.
Investigations revealed that the train engineer had failed to obey a signal prohibiting entry into a single-track segment (designated as such because of oncoming train traffic). The Metrolink passenger train and a freight train were headed toward one another, both moving at 40 miles per hour. The engineer of the freight train engaged his air brake 2 seconds before impact, while the engineer of the passenger train did not engage his brake at all. Further investigation revealed that the engineer of the commuter train had a habit of text messaging while operating the train and had been warned about this policy violation. Nevertheless, his cell phone records (delivered under subpoena) showed 2 text messages sent shortly before impact. A spokeswoman for the train company admitted the strong likelihood of operator error, was chastised by the firm as a result, and resigned.
U.S. Airways Flight 1549, 2009
Flight 1549, an Airbus A-320, departed LaGuardia Airport in New York City bound for Seattle-Tacoma International Airport with a stopover at Charlotte-Douglas International Airport. The aircraft carried 150 passengers and a flight crew of 5. Three minutes into the flight, while still on the initial climb out of New York, the airplane encountered a flock of Canada geese. A multiple bird strike with a total loss of power in both engines is a highly improbable occurrence, but this event is exactly what occurred.10 Within seconds, the crew determined that the plane would be unable to reach an airfield; the captain turned south along the Hudson River in a glide mode and ditched the aircraft in the river near the U.S.S. Intrepid Museum within 3 minutes of losing power. All 155 persons aboard were rescued while the aircraft was still sinking, and only minor injuries resulted. The accident investigation revealed crew management at its finest, with the pilot communicating and the copilot reading aloud the procedures to ditch the aircraft. Captain Sullenberger and his crew were widely acclaimed for their performance under pressure. A former Air Force pilot with a graduate degree in human factors from Purdue University, the captain asserted that his highest duty and obligation has always been safety. A pilot colleague commented, “If an unlikely tragedy like this had to occur, I can think of no other pilot I would pick to handle it other than Sully!”
Chernobyl Nuclear Power Plant, 1986
The Chernobyl accident in Ukraine is the only accident in the history of nuclear power that resulted in fatalities from radiation.11 An explosion and fires released the contents of the reactor core into the atmosphere, and radioactive material moved downwind. The 2 plant operators died immediately; another 28 people died of acute radiation poisoning. The disaster provides insight into the serious consequences of ignoring safety issues. Investigation after the event revealed a flawed reactor design compounded by inadequately trained personnel operating the facility. Twenty years later, the principal public health issue identified has been an increased incidence of thyroid cancer.
Applications in Healthcare
These anecdotes and exhaustive investigational material from other industries indicate several points that apply to the healthcare environment.
As in aviation, aspects of medical care (anesthesiology, surgery, emergency medicine, intensive care medicine) are event driven and dynamic, complex and tightly coupled, and uncertain and risky.8
Written checklists help prevent crises.
Established written procedures are vital in crises.
Training in decision making and crew resource management is valuable.
Systematic drills and practice using simulation technology address the ability to handle crisis situations.
In a landmark publication, Reason presented a detailed analysis of human error.12 Reason introduced his text by referring to the 1928 studies of Spearman but asserted that the decade prior to the publication of his book was characterized by public concern about the terrible cost of human error. He lists the Tenerife runway collision in 1977, the Three Mile Island crisis in 1979, the Bhopal methyl isocyanate disaster in 1984, the Challenger and Chernobyl explosions of 1986, the capsize of the Herald of Free Enterprise and the King’s Cross tube station fire in 1987, and the Piper Alpha oil platform explosion in 1988 as causes for a collective impetus to address error. Furthermore, this decade made clear that the nature and scale of such tragedies could impact wide geographic areas and generations of humans.
We’ve all been there: something goes wrong, a patient is harmed, and we, as medical directors, managers, and administrators, are forced to judge the behavioral choices of another human being. Most of the time, we conduct this complex leadership function guided by little more than vague policies, personal beliefs, and intuition. Frequently, we are frustrated by the fact that many other providers have made the same mistake or behavioral choice, with no adverse outcome to the patient, and the behavior was overlooked. Quite understandably, the staff is frustrated by what appears to be inconsistent, irrational decision making by leadership. The “just culture” concept teaches us to shift our attention from retrospective judgment of others, focused on the severity of the outcome, to real-time evaluation of behavioral choices in a rational and organized manner.
At Fairview Health Services, a large integrated delivery system in Minnesota, we identified addressing our culture as the primary opportunity to improve patient safety in 2001. We focused on two key areas of cultural concern: the leadership culture that sets the tone and judges the behavior of others, and the culture at the point of care, or team culture. In 2003, we worked with the Minnesota Alliance for Patient Safety (MAPS), a multi-stakeholder group founded by the Minnesota Hospital Association, the Minnesota Department of Health, and the Minnesota Medical Association, to establish a state-wide initiative to create a culture of justice and accountability. This effort includes hospitals, the professional boards, and the department of health.
Establishing a just culture within an organization requires action on three fronts: building awareness, implementing policies that support just culture, and building just culture principles into the practices and processes of daily work. Based on our experience over the past 6 years, let me give you examples of how you might do this.
Building awareness is the first step in any movement. To raise awareness we did two things.
First, with the assistance of David Marx, JD, president of Outcome Engineering, we conducted a survey of staff, medical leaders, managers, and administrators asking them various questions about how they thought the organization would respond to a given behavior by a clinician (e.g., bringing unauthorized equipment into the operating room [OR] for use in a surgery) if that behavior resulted in harm. We then asked the same question, except this time the behavior resulted in no harm. The survey results were clear. Members of the organization had no clear sense of how people would be judged, or how they should be judged when their behavioral choice was the wrong choice. And respondents consistently judged people more harshly if the behavior resulted in harm (Figure). The survey results were a wake-up call for the organization’s leaders.
Our second step to raise awareness was education. First, a small group of 10 key clinical and operational leaders attended a day-long session with David Marx to evaluate the just culture concepts and learn how we should proceed inside our organization and as a state. Following this, 60 Minnesota health care leaders attended a 2-day summit sponsored by MAPS, which included the professional boards and the department of health, to deepen understanding of just culture and to better understand the perspective of the professional boards and public agencies. The leaders who attended enthusiastically embraced the just culture concept, finding that it provides practical and useful principles and tools anyone can use.
We then conducted a “big bang” educational session for all operational and clinical leaders across the system. Our message: “anyone who finds himself/herself in the position of judging the behavioral choices of other human beings” should attend the session. Three hundred fifty people were educated in an 8-hour training session with David Marx. The education included an overview of the concepts and instruction in the use of a set of algorithms that guide people through the process of classifying behavioral choices as “error,” “at-risk behavior,” or “reckless behavior.” Participants also practiced applying the algorithms to real-life scenarios. In hindsight, conducting this mass education was very effective. It caused the organizational perspective on justice and accountability to shift almost overnight. We did not conduct education sessions for front-line staff on just culture; instead, we have woven the expectations for staff behavior, along with the concepts of error, at-risk behavior, and reckless behavior, into orientation and unit education sessions.
The behaviors we can expect:
Human error—inadvertent action; inadvertently doing other than what should have been done; slip, lapse, mistake.
At-risk behavior—behavior that increases risk where risk is not recognized, or is mistakenly believed to be justified.
Reckless behavior—behavioral choice to consciously disregard a substantial and unjustifiable risk.
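The three definitions above function as a decision algorithm: the same questions are asked of every event, and the answers, not the severity of the outcome, determine the category. As a rough illustration only (this is not Outcome Engineering's actual algorithm; the questions, function names, and response mapping are hypothetical simplifications), the logic might be sketched as:

```python
# Hypothetical sketch of just-culture behavior classification.
# NOT Outcome Engineering's proprietary algorithm; the three yes/no
# questions and the response mapping are illustrative assumptions only.

def classify(deliberate_choice: bool,
             risk_recognized: bool,
             believed_justified: bool) -> str:
    """Classify a behavioral choice as human error, at-risk, or reckless."""
    if not deliberate_choice:
        # Inadvertent action: slip, lapse, mistake.
        return "human error"
    if not risk_recognized or believed_justified:
        # Risk not seen, or mistakenly believed to be justified.
        return "at-risk"
    # Conscious disregard of a substantial and unjustifiable risk.
    return "reckless"

# The matching managerial response is independent of the outcome.
RESPONSE = {
    "human error": "console, and redesign the system",
    "at-risk": "coach, and remove incentives for the risky behavior",
    "reckless": "disciplinary action, regardless of outcome",
}

# A nurse who skips a name-band check believing it justified:
print(classify(True, False, True))   # at-risk
# The same act where the policy is known, doable, and followed by peers:
print(classify(True, True, False))   # reckless
```

Note that identical conduct can land in different categories depending on the answers to the context questions, which is why the investigative questions matter more than the act itself.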
Implementing Policies that Support Just Culture
This might better be termed, “eliminate the policies that don’t allow you to incorporate just culture.” Policies that require punishment for errors, for example, won’t work. Sentinel event investigation policies that say, “We will only look at systems and not human behavior” won’t work. Ideally, the organizational policies related to employee behavior expectations, consequences for behavior, and event investigation would incorporate the language of just culture. Job descriptions, medical staff bylaws, and codes of conduct should incorporate the principles. This will take time, so start by removing the policies that are barriers to just culture and work incrementally to build the philosophy in as you go. Our organization is still in the process of incorporating just culture principles into policies, but we have eliminated the policy barriers to using the principles. For example, if you have policies that authorize punishment (e.g., written reprimand or dismissal) after a certain number of errors, or that predicate punishment on the severity of the outcome, get rid of them.
Building Just Culture into Organizational Practices and Processes
Once the leadership group of the organization has grasped the concept and leaders buy in to the philosophy, you can begin to incorporate it into the work you do every day. I recommend not introducing just culture as a new initiative or it could become the “flavor of the month.” Instead, leaders should look at the challenges they face and ask, “How would I apply just culture principles to this situation?”
If your organization’s priority is reducing harm related to misidentification of patients, for example, how would you work with the staff to understand and categorize behavioral choices as “error,” “at-risk,” or “reckless”? How would you clarify what the organizational response will be to each type of behavior? If a person makes an error, he/she knew the right thing to do, intended to do the right thing, and followed the right process, but made a mistake (e.g., misread a label); he/she should be consoled, and we should figure out a system that will prevent future errors. If a person engages in at-risk behavior, he/she knows the right thing to do but does otherwise because he/she does not see the risk or feels that the benefit of the chosen behavior outweighs the risk (e.g., does not wake a patient to check a name band); then management must understand why people are engaging in this risky behavior. Leaders must ask hard questions like, “How prevalent is this behavior? Why are people doing this? How can we put systems in place that will encourage or force the correct behavior? How can we help people perceive the risk that exists so they will make the right behavioral choice?” Lastly, the organization and clinical leadership should identify which behaviors will be considered reckless and are, therefore, punishable. Reckless behavior is punishable regardless of the outcome of the behavior. Leaders must establish processes to know when someone is engaging in reckless behavior and be willing to punish those who engage in it. A given behavior may be considered “at-risk” in one situation or organization and “reckless” in another.
Consider this scenario. In hospital “A,” a nurse, not wanting to disturb a sleeping patient, does not check a patient’s name band and administers an IV antibiotic to the wrong patient, who was allergic to that drug. The patient has an anaphylactic reaction and ends up in the ICU on a respirator. How do we judge this nurse’s behavioral choice not to check the name band before administering the medication? Do we punish her? Some organizations would punish the nurse (i.e., retrain, reprimand, or dismiss) because she violated the patient identification policy. A just culture would want to know:
Was the nurse aware of the policy to check name bands?
Was it possible to check the name band?
Do all the nurses on the unit check name bands prior to administering medications?
Why didn’t the nurse check the name band? Did she mistakenly believe it was better not to? Why?
The error in this scenario is administering the medication to the wrong patient. We determined the nurse’s behavior to be “at-risk” (and not “reckless”) because the nurse violated the policy for what she believed to be a good reason—allowing the patient to sleep. It turns out that customer satisfaction scores had recently been reviewed at a staff meeting, and sleep interruption was identified as the number one concern of patients. In addition, the other nurses on the unit agreed that they, too, had often not awakened patients to check name bands.
Now consider another scenario. In hospital “B,” a patient checks in. A name band is applied, and the patient is told that all staff will be asking patients to spell their names and give birth dates before providing care or treatment. The patient notes that all care providers and transport personnel follow the procedure. Now, let’s say a nurse does exactly the same thing as the nurse in the first scenario. She enters the room, observes the patient sleeping, and decides not to wake the patient to check the name band. A just culture would classify the nurse’s behavior as “reckless.” The policy was known, the policy was doable, and others were following the policy.
Within Fairview, we have incorporated just culture into our performance improvement initiatives, such as hand washing and patient identification. We identify what types of errors are made, what types of at-risk behaviors we see, and whether or not anyone is engaging in reckless behavior. As we make improvements in the process, we make sure we design it to prevent error, make risk apparent, and discourage at-risk behavior. We also clarify what behavior will be considered reckless. Currently, we are incorporating just culture principles into team training.
Just culture principles will help you change your organizational culture. In 2001, an accident occurred in our interventional MRI room when a piece of equipment flew across the room and attached itself to the outside of the MRI scanner while a patient was in the tunnel. The event investigation that followed focused on system solutions and staff behavior. The department established safe processes and expectations for staff training and behavior. All staff members are themselves screened for MRI safety, participate in MRI safety training, follow check-in procedures, and wear pocketless scrubs to minimize the chance of carrying something into the room in a pocket. Six years later, in 2007, a physician entered the room wearing scrubs with pockets, disregarding the prompt from colleagues to stop. Administration was notified. The conversation that ensued among operational and medical leaders focused on categorizing the behavior as error, at-risk, or reckless and, from that, determining whether the physician should be consoled, coached, or punished. Because Fairview had implemented clear policies and behavior expectations, and others were able to follow them, the behavior was found to be reckless. The physician apologized for her behavior and was warned that future behavior of this type would impact her clinical privileges. Just culture principles and tools provide a useful and necessary construct to aid organizations in dealing with difficult cultural issues, particularly to determine when the generally appropriate focus on systems needs to give way to a focus on individual accountability.
One of the simplest methods for reducing burnout is taking time off. Two easy and effective ways to promote this, while removing the stigma often associated with taking time off, are rewarding employees with paid vacation as a form of recognition and/or simply incentivizing vacation use.
These actions result in reduced employee burnout and higher levels of employee engagement.
What is Culture, and How do we Change it?
Managing a company culture is complicated, and the larger and more complex the company, the harder it becomes to manage. I have had several business owners ask me how, as an HR professional, I would go about changing their culture. As much as we would all love a culture change checklist, there are many interdependent factors to consider when trying to improve a company’s culture. As Forbes points out, a company’s culture is comprised of “an interlocking set of goals, roles, processes, values, communications practices, attitudes, and assumptions.” The Harvard Business Review describes culture as living in “the collective hearts and habits of people and their shared perception of ‘how things are done around here’.” While true, these definitions are too abstract for practical use. So, companies focus on the related areas that they can measurably change. The more tangible areas of ergonomics, compensation philosophy, diversity and inclusion, training and development, health and wellness, and internal processes all affect an organization’s culture. And they are the typical go-to programs for HR departments looking to change the company for the better. These areas, however, should not be their main focus. In this article, I explore and invite you to discuss the real drivers of culture change.
Traditional Culture Change Tactics Address the Symptoms, Not the Disease
Traditionally, HR Departments have focused on a variety of initiatives ranging in size and scope from team building activities to full-blown health and wellness programs. These initiatives, if managed correctly, usually increase employee engagement for at least the short term. They do not, however, have a meaningful, lasting impact on a company’s culture. Even when these initiatives are grassroots and have employee buy-in, they do not address the underlying problems with a company’s culture. Think of culture change as a triage situation. You have to stop the bleeding before you can move on to physical therapy. Employee burnout, job insecurity, poor company communication and unethical or ineffective leadership should all be addressed before focusing on other initiatives. According to Arianna Huffington, founder of the Huffington Post, companies must end employee burnout, and stop looking at burnout as part of the normal cost of doing business. She also points out that a company’s culture functions as its immune system, and, when culture is compromised, the business is susceptible to a negative public image.
Reflect on your company’s past attempts at culture change and ask yourself:
What programs have you implemented?
Were any of those programs purposefully designed to address the real, underlying issues that hold company cultures back?
Was the participation and impact lower than you expected?
Culture Change Starts at the Top and Must be Continuously Managed
Leadership sets the tone for culture through storytelling, conversations, and role modeling. Role modeling in particular is critical to establishing and maintaining a positive culture. The values that support a company’s culture must be modeled daily by leadership at the highest levels. Leaders must also share their visions and be able to effectively persuade people to embrace change. A Harvard Business Review case study on culture change revealed that once the vision and values are clear, managers must enforce and reinforce the changes through their decisions, strategies, systems, and operating procedures. Management must further support culture change through learning and development and through hiring the right people who exemplify the kind of company the organization wants to become. This is a never-ending effort to achieve and maintain the desired outcomes. Are you setting the right tone for your team?
Are your line managers aligning their management approach with company values?
Is your leadership team leading by example?
Do you consider burnout to be a cost of doing business, or are you ready to combat it?
What is burnout and what causes it?
Much scholarly research in the medical community has been conducted on employee burnout and its impact on factors that affect company culture. Social psychologist Christina Maslach defined burnout as a psychological syndrome involving emotional exhaustion, depersonalization (cynicism), and a diminished sense of personal accomplishment. Employees may feel as though there is a discrepancy between their level of effort and the rewards they receive for their efforts. This can happen when employees do not receive the level of recognition or gratitude they feel they deserve based on their accomplishments. Professor Barry A. Farber of Columbia University identifies three main categories of employee burnout: worn-out, under-challenged, and frenetic (frantic).
How can Companies Prevent or Reduce Employee Burnout?
Mina Westman and Dov Eden of Tel Aviv University found through their study that vacation (time away from work) alleviates the symptoms of employee burnout by temporarily removing employees from the causes of burnout. According to the Harvard Business Review, employees also generally have a more positive outlook and feel more energized upon returning to work from a vacation. Taking vacations has also been shown to significantly increase an employee’s likelihood of getting a raise. By providing employees with ample vacation time, employers may reduce or eliminate both the causes, and ultimately the symptoms, of employee burnout.