How to Implement Evidence-Based Healthcare

About This Book

British Medical Association Book Award Winner – President's Award of the Year 2018. From the author of the bestselling introduction to evidence-based medicine, this brand new title makes sense of the complex and confusing landscape of implementation science, the role of research impact, and how to avoid research waste. How to Implement Evidence-Based Healthcare clearly and succinctly demystifies the implementation process, and explains how to apply evidence-based healthcare successfully in practice to ensure safe and effective care. Written in an engaging and practical style, it includes frameworks, tools and techniques for successful implementation and behavioural change, as well as in-depth coverage and analysis of key themes and topics, with a focus on:

  • Groups and teams
  • Organisations
  • Patients
  • Technology
  • Policy
  • Networks and systems


How to Implement Evidence-Based Healthcare is essential reading for students, clinicians and researchers focused on evidence-based medicine and healthcare, implementation science, applied healthcare research, and those working in public health, public policy, and management.

Information

Author: Trisha Greenhalgh
Year: 2017
ISBN: 9781119238539

Chapter 1
Introduction

1.1 The story of this book

Let me start with a warning: this book is not going to give you a cookbook answer to the question of how to implement evidence‐based healthcare (EBHC). My (more modest) aim is threefold:
  1. To introduce you to different ways of thinking about the evidence, people, organisations, technologies and so on (read the chapter headings) that are relevant to the challenge of implementing EBHC.
  2. To persuade you that implementing EBHC is not an exact science and can never be undertaken in a formulaic, algorithmic way. Rather – and notwithstanding all the things that are known to help or hinder the process – it will always require contextual judgement, rules of thumb, instinct and perhaps a lucky alignment of circumstances.
  3. To promote interest in the social sciences (e.g. sociology, social psychology, anthropology) and humanities (e.g. philosophy, literature/storytelling, design) as the intellectual basis for many of the approaches described in this book.
This book was a long time in gestation. The idea first came to Anna Donald and me in the late 1990s. At the time, we were both working in roles that involved helping people and organisations implement evidence – and it was proving a lot harder than the textbooks of the time implied. That was the decade in which evidence‐based medicine (EBM), which later expanded beyond the exclusive realm of doctors to EBHC (to include the activities of other health professionals, managers and lay people), was depicted as a straightforward sequence of asking a clinical question, searching the literature for relevant research articles, critically appraising those articles and implementing the findings. The last task in the sequence was depicted as something that could be ticked off from a checklist.
Anna and I penned an outline for the book (it looked very different then – because most of the research into knowledge translation and implementation cited here had not yet been done). But, tragically, Anna became ill before we got much further and died a few years later, with our magnum opus barely started. Whilst the detail of what is described here is my own work, there is still a sense in which it is Anna’s work too. Even in those early days, before terms like ‘implementation science’, ‘research utilisation’, ‘knowledge translation’ and ‘evidence‐into‐action’ became part of our vocabulary, Anna recognised that we would never be able to produce a set of evidence implementation checklists in the same way as she and I once drew up a set of critical appraisal checklists for our students.
It has taken me nearly 20 years to produce this book, partly because when Anna died, I lost a dear friend as well as a formidable intellectual sparring partner – but also because the question ‘How do you implement EBHC?’ is a good deal too broad for a single book. And yet, one book to scope the field and run a narrative through its many dimensions was exactly what was needed. I have long been convinced that whilst there are definite advantages to asking dozens of different authors, each with different views on the subject, to cover different aspects of this complex and contested field (Sharon Straus and her team did just that, and the book they edited is worth reading [1]), the EBHC community (nay, network of communities) also needs a single‐author textbook whose goal is to achieve some degree of coherence across the disparate topics.
EBM and EBHC have come a long way since the 1990s. The ‘campaign for real EBM’, which I helped establish in 2014, has called for a broadening of EBM’s parameters to include the use of social science methodologies to study the nuances of clinical practice, policymaking and the patient experience – as well as considering the political dimension of conflicts of interest in research funding and industry sponsorship of trials [2]. It is, perhaps, a reflection of the broadening of the EBM/EBHC agenda that implementation science has been established as a separate interdisciplinary field of inquiry (with much internal contestation), with its own suite of journals, research funding panels and conference circuit [3].
One important development in EBHC in recent years is the growing emphasis on value for money in the research process and an emerging evidence base on how little impact research so often has on practice and policy. This overlaps with the expectation on universities (in the United Kingdom at least, via the Research Excellence Framework) to demonstrate that the research they undertake has impact beyond publishing papers in journals read only by other academics. I have reviewed the literature on research impact elsewhere [4].
In 2014, Sir Iain Chalmers led a series in the Lancet that highlighted different aspects of research waste, including waste in the allocation of research funds (too often, we study questions people don’t want answered and fail to study the ones they do) [5]; waste in the conduct of research (studies are underpowered, use the wrong primary endpoints and/or the wrong measurements and so on) [6]; and waste when the findings of research prove ‘unusable’ in practice (because the findings are not presented in ways that could be applied by practitioners or policymakers) [7]. Most recently, John Ioannidis has written a masterly review on ‘Why Most Clinical Research Is Not Useful’ [8]. I look at this last paper in detail in Section 9.1. The bottom line is clear: there is a huge gap between evidence and its implementation – and it’s not easily explained.
The final impetus for me finishing this book was taking up a new job at the University of Oxford in 2015. My new job description included leading (along with Kamal Mahtani) the module ‘Knowledge Into Action’. This was part of the popular and well‐regarded MSc in Evidence‐Based Health Care run by Carl Heneghan and his team from the Centre for Evidence‐Based Medicine. The students on the Knowledge Into Action course were asking for a textbook. Some (the less experienced ones) were looking for checklists and formulae – but many who had worked at the interface between evidence and practice for years knew that the field was not predictable enough to be solved by such things. These more enlightened students wanted a way to get their heads round why implementing EBHC is not an exact science.
In sum, this book looks two ways. Looking retrospectively, it is dedicated to the memory of Anna Donald, who helped inspire it. And looking prospectively, it is dedicated to those who study the implementation of EBHC with a view to improving outcomes for patients. It also seeks to make a contribution to increasing value and reducing waste in research by increasing the proportion of good research that has a worthwhile impact on patients (the sick) and on citizens (including those of us who pay taxes and who may become sick).

1.2 There is no tooth fairy …

This section started life as a blog on the website of the Centre for Evidence‐Based Health Care at the University of Oxford. I wrote it to set the scene for the Knowledge Into Action MSc module that Kamal Mahtani and I were running in 2016. Our group of students had already completed modules on critical appraisal, randomised controlled trials and other highly rigorous methodological approaches. They perhaps anticipated that ‘rigorous methodology’ would get them through the implementation stage too. To get my excuses in before the course began, I penned this blog entry:
  • Tools and resources for critical appraisal of research evidence are widely available and extremely useful. Whatever the topic and whatever the study design used to research it, there is probably a checklist to guide you step by step through assessing its validity and relevance.
  • The implementation challenge is different. Let me break this news to you gently: there is no tooth fairy. Nor is there any formal framework or model or checklist of things to do (or questions to ask) that will take you systematically through everything you need to do to ‘implement’ a particular piece of evidence in a particular setting.
  • There are certainly tools available [see Appendices], and you should try to become familiar with them. They will prompt you to adapt your evidence to suit a local context, identify local ‘barriers’ and ‘facilitators’ to knowledge use, select and tailor your interventions, and monitor and evaluate your progress. All these aspects of implementation are indeed important.
  • But here’s the rub: despite their value, knowledge‐to‐action tools cannot be applied mechanistically in the same way as the CONSORT checklist [2] can be applied to a paper describing a randomised controlled trial. This is not because the tools are in some way flawed (in which case, the solution would be to refine the tools, just as people have refined the CONSORT checklist over the years). It is because implementation is infinitely more complex (and hence unpredictable) than a research study in which confounding variables have been (or should have been) controlled or corrected for.
  • Implementing research evidence is not just a matter of following procedural steps. You will probably relate to that statement if you’ve ever tried it, just as you may know as a parent that raising a child is not just a matter of reading and applying the child‐rearing manual, or as a tennis player that winning a match cannot be achieved merely by knowing the rules of tennis and studying detailed statistics on your opponent’s performance in previous games. All these are examples of complex practices that require skill and situational judgement (which comes from experience) as well as evidence on ‘what works’.
  • So‐called ‘implementation science’ is, in reality, not a science at all – nor is it an art. It is a science‐informed practice. And just as with child‐rearing and tennis‐playing, you get better at it by doing two things in addition to learning about ‘what works’: doing it, and sharing stories about doing it with others who are also doing it. By reflecting carefully on your own practice and by discussing real case examples shared by others, you will acquire not just the abstract knowledge about ‘what works’ but also the practical wisdom that will help you make contextual judgements about what is likely to work (or at least, what might be tried out to see if it works) in this situation for these people in this organisation with these constraints.
  • There is a philosophical point here. Much healthcare research is oriented to producing statistical generalisations based on one population sample to predict what will happen in a comparable sample. In such cases, there is usually a single correct interpretation of the findings. In contrast, implementation science is at least partly about using unique case examples as a window to wider truths through the enrichment of understanding (what philosophers of science call ‘naturalistic generalisation’). In such cases, multiple interpretations of a case are possible and there may be no such thing as the ‘correct’ answer (recall the example of raising a child above).
  • In the Knowledge Into Action modul...

Table of contents

  1. Cover
  2. Title Page
  3. Table of Contents
  4. Foreword
  5. Acknowledgements
  6. Chapter 1: Introduction
  7. Chapter 2: Evidence
  8. Chapter 3: People
  9. Chapter 4: Groups and teams
  10. Chapter 5: Organisations
  11. Chapter 6: Citizens
  12. Chapter 7: Patients
  13. Chapter 8: Technology
  14. Chapter 9: Policy
  15. Chapter 10: Networks
  16. Chapter 11: Systems
  17. Appendix A: Frameworks, tools and techniques
  18. Appendix B: Psychological domains and constructs relevant to the implementation of EBHC
  19. Index
  20. End User License Agreement