Three-fourths of the mistakes a man makes are made because he does not really know the things he thinks he knows.
James Bryce
Traditionally, an unexpected adverse event was equated with an error. In turn, an error was equated with incompetence or even negligence. Consequently, punishing the guilty was considered the only way to improve the safety of patients. We felt that we could solve human error problems by telling people to be more careful, by reprimanding the miscreants, or by issuing a new rule or procedure.
This is ‘The Bad Apple Theory’, where you believe your system is basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere. In fact, this “name, blame, and shame” approach has a toxic effect. Not only does it not improve safety, it also continues to push the issue of medical errors into secrecy.
The new perspective

The new discipline of Patient Safety acknowledges that risk is inherent in medicine and error is inherent in the human condition. As Dr Lewis Thomas said eloquently, “We are built to make mistakes, coded for error.” We now understand that a human error problem is actually an organizational problem. Finding a ‘human error’ by any other name, or by any other human, is only the beginning of your journey, not a convenient conclusion.
The new view recognizes that systems involve inherent trade-offs between safety and other pressures, such as time. For example, in an understaffed hospital, a rushed doctor may be forced to take shortcuts that jeopardize the patient’s safety, not because he is careless, but because he has many other patients to see. The major contribution of the patient safety movement has been to propagate the insight that medical error is the result of “bad systems,” not “bad apples”.
Errors can be reduced by redesigning systems and improving processes so that caregivers can produce better results. While the discipline of Patient Safety can learn a lot from other high hazard industries, such as aviation and nuclear power, the uniqueness of health care must not be lost. Health care is more unpredictable, complex, and nonlinear than the most complex nuclear power plants.
Machines respond in a predictable way to a set of commands and processes; patients don’t—their response to medications and clinical interventions is far more variable and unpredictable. Machines don’t have families, feelings, language barriers, or psychosocial issues; patients do. While it is vitally important for us to learn techniques and lessons from other industries, the health care industry must produce leaders and champions from within the clinical community to face up to this challenge and devise solutions unique to the clinical environment.
Humans as heroes

While humans can cause problems, they are the solution as well. After all, only humans can recognize errors, and prevent and correct them. We need to balance two views of human ability and experience: one that uses technology, design, standardization, and simplicity to reduce human fallibility, and another that stresses human adaptability, foresight, and resilience as a shield against errors. Systems and processes are important, but in the end people make the difference. We need to think not in terms of humans as hazards, but rather in terms of humans as heroes.
In reality, what’s amazing is that in spite of the chaos, constraints, and limitations under which hospitals in India function, doctors and nurses deliver safe care to their patients the vast majority of the time. This is on account of the hard work, individual vigilance, resourcefulness, and problem-solving ability that the medical staff brings to work every single day.
Sadly, the cardinal virtues and abilities of clinical staff are being squandered on fixing administrative and organizational inefficiencies, rather than being put at the service of patients. Clinical staff maintain safety by adapting and working around these inefficiencies. If we truly want safer healthcare, frontline staff may have to complain more and demand action on these inefficiencies, on behalf of themselves and their patients.
Hospitals are complex adaptive systems, which means they do not respond in predictable ways to rules and policies. It also means that efforts to improve safety must combine rules and standards with messier activities that respect the importance of culture, innovation, and iterative learning in the clinical setting. A variety of strategies have been employed to create safer systems, including:
* Building in redundancies
* Using checklists
* Improving teamwork
* Learning from past mistakes
As Dr Abha Agarwal describes in her textbook, Patient Safety: A Case-Based Comprehensive Guide (2014), the following are the three basic propositions we need to keep in mind when designing systems to keep patients safe:
#1. “The soil, not the seed”

We have learned that the formula for errors is not bad people, but good people plus bad systems. Even apparently egregious errors such as wrong-site and wrong-patient surgeries happen not because of incompetent surgeons, but because of unreliable processes of patient identification and surgical site marking.
Medication errors happen not because of inattentive nurses but because of a needlessly complicated multistep system of medication management, from prescribing to dispensing to administration. The Patient Safety discipline proposes that the fertile ground for medical errors is the “soil” of the healthcare delivery system; and not the clinician, who is only the “seed”.
#2. From “I” to “we”

The second underpinning of the Patient Safety discipline is that safer care is a function of good teams, not good individuals acting alone. This is because the technological sophistication of the last century has introduced unprecedented complexity and fragmentation in health care; today, there is usually a small army of professionals who need to work together to deliver care to the patient, especially in a hospital setting. However, hospitals can be chaotic because of frequent interruptions, emergencies, and shift changes, and this leads to discontinuity in care. Traditionally, physicians have been taught to take ownership of the patient’s medical care, and they still think of themselves as being in charge, as the “captain of the ship.” In contrast, the discipline of Patient Safety rejects the notion of “I” in favor of “we.” It proposes that the only possible way to deliver safe and efficient care in such a complex, fragmented system is for the various professionals to work together as a coordinated team.
#3. “Just culture”

The Just Culture model proposes that since the human condition cannot be changed, the only hope for safer care lies in a relentless focus on improving systems of care. While it’s important not to blame needlessly, this must be balanced with the need for accountability, because no hospital can offer a “blame-free” system if it tolerates acts which show a reckless disregard for patient safety.
A just culture provides a framework of shared accountability: healthcare institutions are responsible for providing systems and an environment that are optimally designed for safe care, while the staff is responsible for their personal choices of behavior and for reporting system vulnerabilities. The just culture model distinguishes between three types of errors:
* The first type is “human error”: inadvertently doing something other than what should have been done, such as a slip or a lapse. This is now considered an inevitable part of human fallibility and should be managed by designing systems that are more error-proof and error-tolerant.
* The second is “at-risk behavior” where staff uses workarounds to bypass established safety processes in order to get their work done efficiently. Such behavior should lead to coaching of the staff, so they are aware of how their actions endanger safety, in addition to improving the system to fix the broken process.
* The third is “reckless behavior,” in which an individual consciously disregards a substantial risk. For example, a surgeon who refuses to sign the operative site or to participate in the time-out process would be considered reckless, and should be penalized.
We need to have zero tolerance for reckless behavior, and fortunately, such instances are rare. Most errors fall into the category of human error or at-risk behavior.
Some insights

Dr Berwick, a pioneer in the patient safety research movement, had these pearls of insight to share after a lifetime spent studying patient safety:
* The problem is not errors, but harm. If we believe our battle is against errors, we will lose. Errors are inevitable; they will always be there. The focus should be on ‘How can we keep patients from being hurt?’ and less on ‘How can we keep errors from happening?’
* We used to believe that rules create safety. The truth is more nuanced, and involves both making the rules and knowing how and when to break them in order to create safety. Patient safety is a continually emerging property of a complex adaptive system, which means that the rules should be more like instructions for driving a car, allowing the driver to adapt to current circumstances, rather than a point-by-point recipe for baking a cake.
* Reporting is valuable to track problems and progress, but to learn from errors, we need to use stories. Reporting only for measurement causes us to lose the necessary context to make sense of the numbers. The question ‘How many?’ isn’t powerful enough. The question should be ‘What happened?’ It is impossible to have safety where transparency is not assured.
* Every technology (even technology used for improving safety) has hazards. Building technology for safety is crucial, but it must be supported by conversation, a human mechanism for getting control back. Technology without collective mindfulness will make things worse, not better.
* Healthcare differs a lot from other high-hazard industries. While there is much to learn from other industries, there are crucial differences between healthcare and other fields. The simpleminded adoption of safety practices from other industries will not work – we need to adapt these solutions to the unique problems healthcare throws up.
* The focus has been on preventing harm, which means that what’s important is what happens before the injury. The reality is that what happens after the injury is equally important. Part of our safety culture has to focus on the healing side, and we have to heal both the injured person and the person who caused the injury. Patient safety research is catalyzing a more holistic view of patient safety, recognizing that the implementation of “safer systems” by itself will not create safe patient care if the medical staff is overextended, poorly trained, or under-supervised. People are part of the reason why errors occur, but they hold the key to the solution as well, because all medical care is provided by people!