Achieving Evidence-Informed Policy and Practice in Education
eBook - ePub

  1. 192 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About This Book

The potential for research evidence to improve educational policy and practice is immense. Yet internationally, the use of research by teachers and governments is currently sporadic rather than systematic. In response, this book brings together seven chapters that encompass a range of research projects and ideas in relation to evidence-informed policy and practice (EIPP) in education. These projects and ideas all share a single overarching purpose: providing insight into how EIPP in education can be achieved.
Underpinning each chapter is the notion that the world is complex. If we are to introduce change into it in any meaningful way, we therefore have to understand and respond to this complexity. This means we cannot simply assume that, because it seems rational or common sense for teachers and policy-makers to use research to help improve their decision making or acts of praxis, they will do so.
Correspondingly, the book represents a holistic journey of discovery and experimentation: of an engagement with the work of thinkers and authors from Eco to Flyvbjerg, via Habermas, Foucault and Aristotle; of ideas ranging from phronesis to trust and social relations; and with diverse research methodologies, including social network analysis and decision tree predictive modelling.
The result is both descriptive and prescriptive: as well as outlining the research and its findings, practical suggestions and strategies for achieving evidence use both in educational policy and practice are provided throughout.

Achieving Evidence-Informed Policy and Practice in Education by Chris Brown is available in PDF and ePUB formats, in the category Education & Education Policy.

Information

Year: 2017
ISBN: 9781787436732

Chapter 1

Introduction

This book brings together seven articles that encompass a range of research projects and ideas in relation to evidence-informed policy and practice (EIPP) in education. These projects and ideas all share a single overarching purpose: providing insight into how EIPP in education can be achieved. The reason for my focus on EIPP is clear-cut: although the notion of using evidence to aid policy or practice is not without controversy or debate (e.g. see Hammersley, 2013), I firmly believe that educationalists engaging with evidence is socially beneficial. This is because policy decisions or teaching practices grounded in an understanding of what is or could be effective, or even simply an understanding of what is currently known, are likely to be more successful than those based on experience or intuition alone. The book itself is premised on a number of fundamental ideas with regard to what EIPP is and how it can be established. Because these ideas are not always explicitly stated within its chapters, I explore them here in the introduction in order to provide context for what is to come.
The first set of ideas underpinning my work concerns the nature of EIPP. For the purposes of this book I define the policy and practice elements of EIPP separately. My definition of evidence-informed practice (EIPr) is adapted from England’s Department for Education (2014), which suggests that EIPr represents: ‘a combination of practitioner expertise and knowledge of the best external research [and/or] evaluation-based evidence’. More specifically in relation to this definition, I consider ‘external research’ to mean high-quality qualitative or quantitative research that has been peer reviewed and published by academic researchers. I have altered the DfE’s definition to include the phrase ‘[and/or]’ because in some areas research findings are relatively sparse, and because some research simply provides new ways to understand the world rather than any concrete calls to action. Nonetheless, as I detail below, in both cases such research can and should be used to improve decision making. At other times, evidence can provide more concrete suggestions for how to improve teaching and learning; here the phrase ‘evaluation-based evidence’ is considered to comprise meta-analyses or syntheses, such as those produced by Hattie (2011) or the Sutton Trust-EEF’s Teaching and Learning Toolkit (Sutton Trust-EEF, 2013), which indicate the effectiveness of types of intervention (such as homework or feedback). Evaluation-based evidence also comprises the evaluation of specific named interventions (such as ‘Philosophy for Children’), often through the use of randomized controlled trials (e.g. see Slavin, 2008 or the Education Endowment Foundation). Such research can be usefully employed both by itself and in conjunction with other forms of research.
The use of the term ‘combination’ within the DfE’s definition, meanwhile, also highlights an evolution in thinking about research-informed teaching practice: a move from the idea that teaching can be based on research evidence (e.g. see Biesta, 2007; Saunders, 2015) to the realization that it is more realistic, relevant and effective to consider a situation in which teaching practice is informed by research evidence. In other words, the coining of the phrase evidence-informed practice represents a change of emphasis towards considering how teachers can employ research alongside other forms of evidence, such as their tacit expertise, in order to make effective pedagogic decisions in specific situations.
It is also worth highlighting here the substantiated benefits associated with EIPr. These include correlational evidence that, where research and evidence are used effectively as part of high-quality initial teacher education and continuing professional development, with a focus on addressing improvement priorities, they make a positive difference in terms of teacher, school and system performance (Mincu, 2014; Cordingley, 2013; Godfrey, 2014, 2016). CUREE (2010), meanwhile, lists a range of positive teacher outcomes that emerge from EIPr, including both improvements in pedagogic knowledge and skills and greater teacher confidence. Furthermore, the experience of ‘research-engaged’ schools that take a strategic and concerted approach in this area appears to be positive, with studies suggesting that research engagement can shift school behaviours from a superficial ‘hints and tips’ model of improvement to a learning culture in which staff work together to understand what appears to work, when and why (Godfrey, 2016; Greany, 2015; Handscomb & MacBeath, 2003). In addition, Godfrey (2016) notes that schools that have made a commitment to practitioner research report increased numbers of applications for teaching posts, high teacher work satisfaction and increased staff retention.

Evidence-Informed Policy

When considering evidence-informed policy-making (EIPo) I draw on the definition of Davies (2004, p. 5), who defines it as:
An approach that helps people make well informed decisions about policies, programmes and projects by putting the best available evidence from research at the heart of policy development and implementation.
Here the notion of ‘best available’ evidence is regarded as synonymous with the notions of external research and ‘evaluation-based evidence’ detailed in the definition of EIPr above. The pursuit of evidence-informed policy is based on the premise that policy outcomes will be improved if decision making is aided by knowledge that is both of high quality and pertinent to the issue at hand. This premise is explicated through the work of advocates such as Oakley, who argues that evidence-informed approaches ensure that ‘those who intervene in other people’s lives do so with the utmost benefit and least harm’ (2000, p. 3); this is also described by Alton-Lee (2012) as the ‘first do no harm’ principle. Failing to employ available evidence can also lead to situations where public money is wasted and members of society are not offered treatments or interventions at points in their lives where doing so might provide most benefit (e.g. Scott, Knapp, Henderson, & Maughan, 2001 and Lee et al.’s 2012 analysis for the Washington State Institute for Public Policy).1 Oxman, Lavis, Lewin, and Fretheim (2009) summarize the benefits of being evidence-informed by suggesting that evidence use increases the probability of policy being more effective, more equitable and better value for money.

How Should Policy-Makers and Teachers Engage with Research?

How EIPP materializes will be a function of how teachers and policy-makers are expected to act following any engagement with research (Dimmock, 2016; See, Gorard, & Siddiqui, 2016). In my experience, the goals of teachers in using research are typically one of the following:
(1) To aid the design of new bespoke strategies for teaching and learning (or indeed approaches to school management) that are to be employed as part of their and/or their school’s teaching and learning (or management) activity in order to tackle specific identified problems. As Coldwell et al. (2017, p. viii) note, ‘for teachers, evidence-informed teaching usually meant drawing on research evidence to integrate and trial in their own practice’. One example is a school I worked with (discussed in Chapter 5) that used research to design a ‘mistake typology’: informed by Dweck’s (2006) work on growth mindsets, this typology was designed to help teachers and pupils recognize various types of mistakes and how different mistakes could be used as the basis to improve how pupils learn and approach their work.
(2) To provide ideas for how to improve aspects of their day-to-day practice by drawing on approaches that research has shown to be effective. For instance, research can provide clues for how to respond to pupils during lessons in order to maintain their resilience or grit (Duckworth, 2016).
(3) To expand, clarify and deepen concepts, including the concepts teachers use to understand students, curriculum and pedagogical practice (Cain, 2015, for instance, provides a case of teachers examining the notion of ‘gifted and talented’ pupils, the way in which such pupils might be identified and the nature of a suitable curriculum for such a group). While this third goal does occur, it is less common: Coldwell et al. (2017), for example, suggest that in their study of schools, teachers’ use of research evidence was prompted by a need to solve a practical problem.
(4) Finally, teachers and schools may also seek out specific programmes or guidelines, shown by research to be effective, which set out how to engage in various aspects of teaching or specific approaches to improve learning (again, typically to tackle identified problems): for example, programmes which suggest how to begin each lesson in order to minimize disruption or poor behaviour, or specific schemas for providing feedback.
The goals of policy-makers may be considered similar, although often their intention is to develop the directives or guidelines that will be used by teachers or that affect the governance or operation of schools. Drawing on Stokes’ (1997) research typology, this implies that the research teachers and policy-makers value most will have elements of practical application. Although these goals seem relatively clear-cut, we still need to consider how research is actually employed within policy and practice.
There are numerous studies and commentaries that have examined the ways in which research evidence can affect policy and practice (e.g. Biesta, 2007; Cain, 2015; Cooper & Levin, 2010; Edwards, Sebba, & Rickinson, 2007; Hammersley, 1997; Nutley, Walter, & Davies, 2007), including the seminal work of the late Carol Weiss (e.g. 1979, 1980, 1982). Here, however, I illustrate the key issues involved by engaging with recent work undertaken by Penuel and colleagues (2017), which broadly encapsulates those issues. The study undertaken by Penuel et al. (2017) involves the development of a survey to capture a broad range of potential uses of research evidence, in order to gain a baseline assessment of school leaders’ use of research. Adopting categories first identified by Weiss and Bucuvalas (1980), Penuel et al. (2017) use their survey to examine instrumental, conceptual and symbolic uses of educational research by school and school system leaders. They explain the first of these use-types — instrumental use — in the following way: ‘when policy-makers encourage education leaders to use research to inform their decision making, they implicitly invoke a theory of action in which evidence from research findings directly shape decisions related to policy or practice’ (Penuel et al., 2017, p. 2). In other words, instrumental use is the use of research ‘in the service of a particular decision’ (ibid.). Penuel et al. then define conceptual use as occurring ‘when research changes the way that a person views a problem or the possible solution spaces for a problem’. Symbolic use, meanwhile, occurs when research evidence is used to validate a preference for a particular decision or to justify a decision already made (ibid.).
What is clear from these definitions is that the difference between instrumental and conceptual use rests on how educators use research to make decisions and take action as a result. Specifically, instrumental use is thought to involve a direct translation from research to practice: that is, with instrumental use, research evidence is seen as pointing towards a solution to a problem of practice, with this solution or strategy subsequently being accepted and/or implemented. Typically, this type of use is thought to go hand in glove with notions of ‘evaluation-based evidence’, since proponents of instrumental use typically believe that, through the use of randomized controlled trials or systematic reviews, evaluative research can provide concrete calls to action through the provision of research-informed guidelines or interventions that can be implemented with fidelity (Fixsen, 2017). In other words, an instrumental decision is one of ‘this is what we will do and how’. Conceptual use, meanwhile, is regarded as more indirect, in that it points to situations in which research evidence guides or informs thinking in relation to a given problem or its solution. With conceptual use, therefore, research evidence is not regarded as the sole source of information upon which educators base their decisions (the decision made thus being ‘these are the kinds of things we will do’). Returning to instrumental research use: its definition, albeit implicitly, appears to imply action in strict adherence to what the research says should be done, thus ruling out any other forms of knowledge coming into play (since these would result in the action following use of the research being customized, rather than teachers acting with fidelity in relation to the research). What is more, in theory at least, the more concrete the direction research can provide, the more instrumental its use can be.
A key question, therefore, must be how realistic the scenario represented by the instrumental use of research actually is.2
Even if we consider only the more instrumental goals teachers may have for using research (e.g. goals one and four of those listed above), a variety of sources would seem to imply that the answer is ‘not very’. Notwithstanding the fact that a given evidence base is often not concrete enough to provide a definitive course of action in relation to a problem of practice (not every intervention has been evaluated, and not all meta-analyses go into depth about how the intervention in question operates: research on how to encourage relationship building amongst children with autism is an example of the former, research on homework an example of the latter), teachers simply do not seem to employ research in this way. For instance, Coldwell et al. (2017, p. ix) suggest that there is ‘limited evidence from [their] study of teachers directly importing research findings to change their practice. Rather, research more typically informed their thinking and led — at least in th...

Table of contents

  1. Cover
  2. Title Page
  3. Chapter 1 Introduction
  4. Part I: Evidence-Informed Practice
  5. Part II: Evidence-Informed Policy
  6. Index