Health Care Errors and Patient Safety
About This Book

The detection, reporting, measurement, and minimization of medical errors and harms is now a core requirement in clinical organizations throughout developed societies. This book focuses on this major new area in health care. It explores the nature of medical error, its incidence in different health care settings, and strategies for minimizing errors and their harmful consequences to patients. Written by leading authorities, it discusses the practical issues involved in reducing errors in health care – for the clinician, the health policy adviser, and professionals in health ethics and law.


Information

Publisher: BMJ Books
Year: 2011
ISBN: 9781444360318
Edition: 1
CHAPTER 1
Health care mistakes, violations and patient safety
Brian Hurwitz, Aziz Sheikh
Error is not something that is fallen into momentarily; it is omnipresent.
David Bates, 1996 [1]
Safety of health care today is at the nexus of empirical, ethical, legal and policy considerations worldwide. Concerns about safety originate in the growing realisation, which this volume charts, that health care provision is an industry that frequently, and avoidably, harms vulnerable people. Health Care Errors and Patient Safety focuses on medical mistakes and violations, and on what can be learnt from their intensive scrutiny in order to enhance patient safety [2,3].
Over the past two decades, aberrantly provided health care has become a major area of scientific investigation, public discussion and health policy formation. Enquiries into medical mistakes – their contexts, causes, consequences and costs – have widened in scope and deepened in conceptual grasp of error and violation [4–8]. Agencies and reporting mechanisms have been established to collect data on medical mishaps and safety incidents, to extract and promulgate lessons that can be learnt from them [9–11]. As a consequence, many more health care events and processes than in the past are today classified as mistakes. But as this book makes clear, many of these data suffer from biased numerators (from under-reporting of errors overall and a tendency to report the most dangerous or injurious incidents) and absent denominators (from lack of reliable information on how frequently relevant health care procedures are undertaken), which limits knowledge of error rates and curtails development of policies to improve health care safety [12,13].
Box 1.1: The underside of progress
To invent the sailing ship or steamer is to invent the shipwreck. To invent the train is to invent the rail accident of derailment. To invent the family automobile is to produce the pile-up on the highway.
Paul Virilio, 2007 [100]
Box 1.2: The relevance of Murphy’s law according to Reason
‘Murphy’s law says that if it’s possible to do something wrong, people will.’ That’s why at least 50 patients worldwide have died such a horrible death from intrathecal vincristine. These deaths were certainly preventable, and design safeguards such as the new spinal-only connector will help. But safeguards do have a way of biting back, partly because new equipment tends to add to the complexity, opacity, and unfamiliarity of a situation. [101]
The volume addresses the sparsely charted field of medical fallibility. In their planning and execution, all human activities can inadvertently be misconceived and mal-performed. The possibility of technological accidents, unintentional and unforeseeable occurrences that often (but not always) involve undesirable consequences, is woven into the design, operation and social relations of technology (Box 1.1). This is the underside of progress: in parallel with the exercise of skill and know-how, the possibility of error and violation is always present, and nowhere more so than in the provision of health care [14] (Box 1.2). In this chapter we sketch the origins and development of interest in medical error, violation and patient safety, and trace the growing multidisciplinary recognition of fallibility in health care services.
Errors and mistakes
There is something bittersweet about human errors. Inadvertent and harmful they may be, but once recognised and understood utility can be rescued from them [15,16]. Much human learning is still undertaken by ‘trial and error’, through seeing and reviewing – kinaesthetically sensing – mismatches between intended and accomplished actions, processes captured in part by the saying practice makes perfect. But learning from health care errors is a far less individually centred activity than such an adage might imply.* In health care, learning from mistakes involves sharing and discussing them with patients and colleagues; it requires their accurate reconstruction and description, a taxonomy that can usefully categorise errors, a disciplined vocabulary and analytical framework that helps health care errors to be understood and communicated, and a medical culture capable of facing its own fallibility (see Chapter 14). Subjecting erroneous thought processes, procedures and techniques to review of this sort takes mental effort and moral bravery; it also demands health services structures that foster and support these processes [12] (see also Chapters 6 and 7).
At a time when the possibility of error pervades the health care enterprise, it has become apparent that the vast majority of errors, whether errors of health care planning or of execution [17], do not cause any harm directly. What is the significance of ‘silent’ errors, or ‘near misses’ as they are more commonly known, which in the past frequently went undocumented, if not entirely unnoticed? Since errors appear to be a universal feature of health care processes, should they be viewed as (undesirable) norms? Should culpability be attached to harmless medical mistakes and, where errors are not injurious, can liability still be imposed [18,19]? If errors are forms of unintentional deviation from collectively agreed standards, should health care violations – intentional divergences from agreed procedures – fall within investigative frameworks designed to understand health care errors? These are just some of the questions explored in later chapters.
Violations
Deliberate deviations from rules or codes are known as violations and their variety and causes are considered from different perspectives in Chapters 2, 3 and 7. These health care actions are not described in manuals or rules of best practice but generally represent attempts to compensate for overcomplex, undependable systems – ‘workarounds’ that aim to achieve improvements [20]. Like errors, violations do not necessarily portend harm or a disregard of safety. The psychologist and student of error, James Reason, defines violations as ‘deliberate – but not necessarily reprehensible – deviations from practices deemed necessary (by designers, managers and regulatory agencies) to maintain the safe operation of a potentially hazardous system’ [21] – ‘deviations from safe operating practices, procedures, standards, or rules’ [17]. (Key terms in the study of health care safety are defined in Appendix 1.1, p.17.)
Violations are engendered in specific circumstances: erroneous regulations or needlessly difficult operating procedures, for example, may lead operatives to take short cuts, or to ignore or deliberately bypass explicit rules, perhaps because of tiredness (which often lurks behind short cuts), because of time pressures, or because the rules and procedures (appear to) lack rationale [22]. Violations differ from errors in stemming from deliberated choices, conscious decisions that seem to offer the transgressor some sort of benefit. However, the choices involved in health care violations are not usually made entirely freely; they are often engendered by significant operating constraints and care system faults. For example, a doctor may decide to allow a relative to translate and interpret for an adult patient, a common violation of guidelines on employing interpreters [23]. But if, at the time of the proposed consultation, no alternative is really available, the violation should be attributed, at least in part, to an organisational failing or system error: the lack of accessible interpreters in the health care system when they are needed. In relation to the clinician, the decision can be seen as a violation of good practice guidelines, exercised with the aim of enabling the patient to consult there and then. In fact, such a decision may directly jeopardise the clinical outcome and predispose to subsequent error, because the patient may fail to disclose important information precisely because they are embarrassed to discuss health matters in front of a relative. This in turn may lead to delayed or faulty diagnosis; and the fact that the doctor has sanctioned a relative to interpret may lead the health service authorities to underestimate the need to recruit more professional interpreters, thereby perpetuating the very system fault that engendered the violation in the first place.
As this example indicates, violations usually feature a rationale, the belief – sometimes mistaken – that transgression of a rule or regulation offers economy of effort without significantly threatening worse health outcomes. Because they involve conscious deliberated decisions, violations are generally believed to be avoidable by acts of will. However, we have seen that some violations may arise as compromises between best practice and what seems practicable in the circumstances. Custom and culturally reinforced mindsets influence decisions about whether or not to breach a regulation (‘this is how I was taught to undertake this task, which was not according to a new rule set’). On the other hand, violations can stem from an attitude of recklessness on the part of a health carer who understands but chooses to ignore substantial risks that may be involved in transgressing a regulation.
Reason notes that industrial violations are more likely to be made by men than by women, and that their frequency declines with age [21]. Based on the type of transgression involved, violations are classified as routine, exceptional or criminal acts. Routine violations involve everyday breaches of rules, typically cutting corners, such as not always washing hands between examining patients. The consequences vary with the health care setting: in intensive care or infectious diseases units such a breach may threaten life, whereas in general practice it may be less dangerous. As Merry and McCall Smith point out [22, pp. 108–9], routine health care violations are legion, in both number and variety, and include, for example:
providing patients with less information than certain regulatory bodies have prescribed for the purpose of obtaining informed consent; failing to check the results of clinical investigations (such as blood tests) in a timely manner; taking medical histories from patients in open ward situations which fail to provide adequate privacy; filling in labels incompletely; completing case notes inadequately …
Although violations are generally held to be avoidable, if they become a matter of habit or ‘second nature’ within a health care grouping, such transgressions can become embedded in stereotypic actions undertaken without much conscious thought, which then require special effort and training to prevent.
Exceptional violations occur in exceptional or extreme situations; for example, a sudden, unheralded clinical emergency that appears to necessitate breach of usual procedures. However, ‘benevolent transgressions’ are often subject to post hoc review and require explicit justification.
Criminal violations involve transgressions undertaken for deliberately harmful purposes, such as to defraud a patient or the health care system or, more rarely and bizarrely, to sabotage care and to injure patients [22] (see Chapter 3).
Other classifications of health care violations are possible. Reason, for example, divides them into ‘routine’, ‘optimising’ (those undertaken to further personal rather than task-related goals) and ‘necessary or situational’ violations (those that appear to offer the only course available by which to get a job done in the circumstances; see also Chapter 2) [17].
Learning from errors and violations
Errors, understood as unintentional divergences from desirable goals or standards, have long been viewed as sentinel phenomena. ‘Errors show us the way to truth’ wrote the 16th century German astronomer, Johannes Kepler, when discussing observational errors and defects in instrumentation [24]. ‘By far the most instructive part of a [military] campaign is to know why we fail’ wrote George Scovell, a 19th century code-breaker in the Duke of Wellington’s army during the Peninsular War [25, p. 47]. Mistakes, when recognised, need not only to be corrected but corrected for. Errors in thought or investigational procedure may lead to the construction of erroneous mental maps or models that embody faulty conclusions [26]; once identified, estimated, measured and taken into account, these may lead to improved understandings [27]. Although generally identified retrospectively, the investigation of health care errors and violations can bring to light important misunderstandings about a situation and shortcomings of procedure which in turn, when adjusted for, may lead to enhancements in patient safety [28].
Erring and moral judgment
In most walks of life error remains bound up with errancy, diverging from prescribed or recognised pathways – wandering fallibly off track. Where health care errors are in the frame, moral judgments keep close company [29] (Box 1.3). Those who err are generally characterised negatively, whether in psychological, attitudinal, character, knowledge-based or skills terms, because in this thought schema it is believed they should (and could) have done otherwise. Within such a schema, negative human traits operate not only to diminish the moral worthiness of the erring person, but also to help explain, at least in part, how a mistake may have come about: for example, by flawed reasoning, inattentiveness, absent mindedness, poor planning, poor memory, ignorance, arrogance, lack of insight, impatience, overambitiousness, hurriedness, lack of perspective, overconfidence, inability to listen, tiredness, laziness or clumsiness.
Box 1.3: The Mistake by James Fenton
With the mistake your life goes in reverse.
Now you can see exactly what you did
Wrong yesterday and wrong the day before
And each mistake leads back to something worse
And every nuance of your hypocrisy
Towards yourself, and every excuse
Stands solidly on the perspective lines
And there is perfect visibility.
What an enlightenment. The colonnade
Rolls past on either side. You needn’t move.
The statues of your errors brush your sleeve.
You watch the tale turn back – and you’re dismayed.
And this dismay at this, this big mistake
Is made worse by the sight of all those who
Knew all along where these mistakes would lead –
Those frozen friends who watched the crisis break.
Why didn’t they say? Oh, but they did indeed –
Said with a murmur when the time was wrong
Or by a mild refusal to assent
Or told you plainly but you would not heed.
Yes, you can hear them now. It hurts. It’s worse
Than any sneer from any enemy.
Take this dismay. Lay claim to this mistake.
Look straight along the lines of this reverse.
The Mistake by James Fenton from Out of Danger (©James Fenton 1993) is reproduced by permission of PFD (www.pfd.co.uk) on behalf of James Fenton.
However, the modern perspective on errors conceptualises them as essentially, not merely definitionally, unintentional and therefore unavoidable by an act of will (and by use of foresight) on the part of the person who errs (see Chapter 6). On this account, judgmentalism towards someone who errs is an inappropriate and primitive attitude and response to a genuine error. Yet there remains a tension, as Judith André has noted, between lack of intention and avoidability: ‘Mistakes are inevitable. On the other hand they are to be avoided; nothing counts as a mistake unless in some sense we could have done otherwise’ [30]. It is ‘avoidability in some sense’ that grounds the moral disapprobation which modern students of error believe most usefully directs attention away from those who err towards identifying and improving poor design and ‘latent errors’ in health care systems and operations.
Box 1.4: James Reason’s parapraxia
One day in the late 1970s, James Reason was making tea, and the cat was clamouring to be fed. He efficiently opened the tin of cat food – and put it in his teapot. The two components got mixed up. Both the teapot and the cat’s feeding dish afforded the same opportunity – putting stuff in. As a cognitive psychologist, Reason suddenly realised a new research topic was literally under his nose. In tracing the causes of absent-minded incidents, Reason began an exploration of human error. Three decades later, he has become a leading expert on error and one of the recognised architects of the tools used to improve patient safety.
R. Lertzman [103]
It is over a century since investigators began formally enquiring into human error non-moralistically. Sigmund Freud’s Psychopathology of Everyday Life discussed everyday slips of the tongue and pen, misreadings, aberrant actions, forgetting names and muddling up of memories, which together he called ‘parapraxes’. Freud believed their cognitive basis lay in intra-psychic conflicts with the unconscious; once repression fails, otherwise secret desires, ambitions, fantasies and fears erupt into waking life as perturbations of thought and action – slips, transpositions, substitutions and muddlements. This work was published in 1901 in the journal, ...

Table of contents

  1. Cover
  2. Dedication
  3. Epigraph
  4. Title
  5. Copyright
  6. List of contributors
  7. Foreword
  8. CHAPTER 1: Health care mistakes, violations and patient safety
  9. PART 1: Understanding patient safety
  10. PART 2: Threats to patient safety
  11. PART 3: Responses to health care errors and violations