Chapter 1
Introduction
This book brings together seven articles that encompass a range of research projects and ideas in relation to evidence-informed policy and practice (EIPP) in education. These projects and ideas all share a single overarching purpose: providing insight into how EIPP in education can be achieved. The reason for my focus on EIPP is clear-cut: although it is true that the notion of using evidence to aid policy or practice is not without controversy or debate (e.g. see Hammersley, 2013), I firmly believe that educationalists engaging with evidence is socially beneficial. This is because policy decisions or teaching practice grounded in an understanding of what is or could be effective, or even simply an understanding of what is currently known, are likely to be more successful than those based on experience or intuition alone. The book itself is premised on a number of fundamental ideas with regard to what EIPP is and how it can be established. Because these ideas are not always explicitly stated within its chapters, I explore them here in the introduction in order to provide context for what is to come.
The first set of ideas underpinning my work concerns the nature of EIPP. For the purposes of this book I define the policy and practice elements of EIPP separately. My definition of evidence-informed practice (EIPr) is adapted from England’s Department for Education (2014), which suggests that EIPr represents: ‘a combination of practitioner expertise and knowledge of the best external research [and/or] evaluation-based evidence’. More specifically in relation to this definition, I consider the notion of ‘external research’ to mean high-quality qualitative or quantitative research that has been peer reviewed and published by academic researchers. I have altered the DfE’s definition to include the phrase ‘[and/or]’ because in some areas research findings are relatively sparse, and because some research simply provides new ways to understand the world rather than any concrete calls to action. Nonetheless, as I detail below, in both cases such research can and should be used to improve decision making. At other times evidence can provide more concrete suggestions for how to improve teaching and learning; here the phrase ‘evaluation-based evidence’ is considered to comprise meta-analyses or syntheses, such as those produced by Hattie (2011) or the Sutton Trust-EEF’s Teaching and Learning Toolkit (Sutton Trust-EEF, 2013), which indicate the effectiveness of types of intervention (such as homework or feedback). Evaluation-based evidence also comprises the evaluation of specific named interventions (such as ‘Philosophy for Children’), often through the use of randomized controlled trials (e.g. see Slavin, 2008 or the Education Endowment Foundation). This research can be usefully employed both by itself and in conjunction with other forms of research.
The use of the term ‘combination’ within the DfE’s definition, meanwhile, also highlights an evolution in thinking about research-informed teaching practice, representing a move from the idea that teaching can be based on research evidence (e.g. see Biesta, 2007; Saunders, 2015) to the realization that it is more realistic, relevant and effective to consider a situation where teaching practice is informed by research evidence. In other words, the coining of the phrase evidence-informed practice represents a change of emphasis: to consider how teachers can employ research alongside other forms of evidence, such as their tacit expertise, in order to make effective pedagogic decisions in specific situations.
It is also worth highlighting here the substantiated benefits associated with EIPr. These include correlational evidence that, where research and evidence are used effectively as part of high-quality initial teacher education and continuing professional development, with a focus on addressing improvement priorities, they make a positive difference in terms of teacher, school and system performance (Mincu, 2014; Cordingley, 2013; Godfrey, 2014, 2016). CUREE (2010), meanwhile, lists a range of positive teacher outcomes that emerge from EIPr, including both improvements in pedagogic knowledge and skills and greater teacher confidence. Furthermore, the experience of ‘research-engaged’ schools that take a strategic and concerted approach in this area appears to be positive, with studies suggesting that research engagement can shift school behaviours from a superficial ‘hints and tips’ model of improvement to a learning culture in which staff work together to understand what appears to work, when and why (Godfrey, 2016; Greany, 2015; Handscomb & MacBeath, 2003). In addition, Godfrey (2016) notes that schools that have made a commitment to practitioner research report increased numbers of applications for teaching posts, high teacher work satisfaction and increased staff retention.
Evidence-Informed Policy
When considering evidence-informed policy-making (EIPo) I draw on the definition of Davies (2004, p. 5), who defines it as:
An approach that helps people make well informed decisions about policies, programmes and projects by putting the best available evidence from research at the heart of policy development and implementation.
Here the notion of ‘best available’ evidence is regarded as synonymous with the notions of external research and ‘evaluation-based evidence’ detailed in the definition of EIPr above. The pursuit of evidence-informed policy is based on the premise that policy outcomes will be improved if decision making is aided by knowledge that is both high in quality and pertinent to the issue in hand. This premise is explicated through the work of advocates such as Oakley, who argues that evidence-informed approaches ensure that ‘those who intervene in other people’s lives do so with the utmost benefit and least harm’ (2000, p. 3); this is also described by Alton-Lee (2012) as the ‘first do no harm’ principle. Failing to employ available evidence can also lead to situations where public money is wasted and members of society are not offered treatments or interventions at points in their lives where doing so might provide most benefit (e.g. Scott, Knapp, Henderson, & Maughan, 2001 and Lee et al.’s 2012 analysis for the Washington State Institute for Public Policy).1 Oxman, Lavis, Lewin, and Fretheim (2009) summarize the benefits of being evidence-informed by suggesting that evidence use increases the probability of policy being more effective, more equitable and better value for money.
How Should Policy-Makers and Teachers Engage with Research?
How EIPP materializes will be a function of how teachers and policy-makers are expected to act following any engagement with research (Dimmock, 2016; See, Gorard, & Siddiqui, 2016). In my experience the goals of teachers in using research are typically one of the following: (1) to aid the design of new bespoke strategies for teaching and learning (or indeed approaches to school management) that are to be employed as part of their and/or their school’s teaching and learning (or management) activity in order to tackle specific identified problems. As Coldwell et al. (2017, p. viii) note, ‘for teachers, evidence-informed teaching usually meant drawing on research evidence to integrate and trial in their own practice’. One example is a school I worked with in Chapter 5, which used research to design a ‘mistake typology’: informed by Dweck’s (2006) work on growth mindsets, this typology was designed to help teachers and pupils recognize various types of mistakes and how different mistakes could be used as the basis to improve how pupils learn and approach their work; (2) a second goal is that teachers use research to provide ideas for how to improve aspects of their day-to-day practice by drawing on approaches that research suggests are effective. For instance, research can provide clues for how to respond to pupils during lessons in order to maintain their resilience or grit (Duckworth, 2016); (3) teachers can also seek to use research to expand, clarify and deepen concepts, including the concepts they use to understand students, curriculum and pedagogical practice (Cain, 2015, for instance, provides a case of teachers examining the notion of ‘gifted and talented’ pupils, the way in which such pupils might be identified, and the nature of a suitable curriculum for such a group). While this third goal does happen, it is less common: Coldwell et al.
(2017), for example, suggest that in their study of schools, teachers’ use of research evidence was prompted by a need to solve a practical problem; finally (4) teachers and schools may also seek out specific programmes or guidelines, shown by research to be effective, which set out how to engage in various aspects of teaching or specific approaches to improve learning (again, typically to tackle identified problems). Examples include programmes which suggest how to begin each lesson in order to minimize disruption or poor behaviour, or specific schemas for providing feedback. The goals of policy-makers may be considered similar, although often their intention is to develop the directives or guidelines that will be used by teachers or that affect the governance or operation of schools. Drawing on Stokes’ (1997) research typology, this implies that the research teachers and policy-makers value most will have elements of practical application. Although these goals seem relatively clear-cut, we still need to consider how research is actually employed within policy and practice.
There are numerous studies and commentaries that have examined the ways in which research evidence can affect policy and practice (e.g. Biesta, 2007; Cain, 2015; Cooper & Levin, 2010; Edwards, Sebba, & Rickinson, 2007; Hammersley, 1997; Nutley, Walter, & Davies, 2007), including the seminal work of the late Carol Weiss (e.g. 1979, 1980, 1982). Here, however, I illustrate the key issues involved by engaging with recent work undertaken by Penuel and colleagues (2017), which broadly encapsulates those issues. The particular study undertaken by Penuel et al. (2017) involves the development of a survey to capture a broad range of potential uses of research evidence in order to gain a baseline assessment of school leaders’ use of research. Adopting categories first identified by Weiss and Bucuvalas (1980), Penuel et al. (2017) use their survey to examine instrumental, conceptual and symbolic uses of educational research by school and school system leaders. They explain the first of these use-types, instrumental use, in the following way: ‘when policy-makers encourage education leaders to use research to inform their decision making, they implicitly invoke a theory of action in which evidence from research findings directly shape decisions related to policy or practice’ (Penuel et al., 2017, p. 2). In other words, instrumental use is the use of research ‘in the service of a particular decision’ (ibid.). Penuel et al. then define conceptual use as occurring ‘when research changes the way that a person views a problem or the possible solution spaces for a problem’. Symbolic use, meanwhile, occurs when research evidence is used to validate a preference for a particular decision or to justify a decision already made (ibid.).
What is clear in examining these definitions is that the difference between instrumental and conceptual use is premised on how educators use research to make decisions and so take action as a result. Specifically, instrumental use is thought to involve a direct translation from research to practice: that is, with instrumental use research evidence is seen as pointing towards a solution in relation to a problem of practice, with this solution or strategy subsequently being accepted and/or implemented. Typically, this type of use is thought to go hand in glove with notions of ‘evaluation-based evidence’, since proponents of instrumental use typically believe that, through the use of randomized controlled trials or systematic reviews, evaluative research can provide concrete calls to action through the provision of research-informed guidelines or interventions that can be implemented with fidelity (Fixsen, 2017). In other words, an instrumental decision is one of ‘this is what we will do and how’. Conceptual use, meanwhile, is regarded as more indirect, in that it points to situations in which research evidence guides or informs thinking in relation to a given problem or its possible solutions. With conceptual use, therefore, research evidence is not regarded as the sole source of information upon which educators base their decisions (the decision made thus being ‘these are the kinds of things we will do’). The definition of instrumental research use, albeit implicitly, appears to imply action in strict adherence with what the research says should be done, thus ruling out any other forms of knowledge coming into play (since this would result in the action following use of the research being customized, rather than teachers acting with fidelity in relation to the research). What’s more, in theory at least, the more concrete the direction research can provide, the more instrumental its use can be.
A key question, therefore, is how realistic the scenario represented by the instrumental use of research actually is.2
Even if we consider just the more instrumental goals teachers may have for using research (e.g. goals one and four of those listed above), a variety of sources would seem to imply that the answer is ‘not very’. Notwithstanding the fact that a given evidence base is often not concrete enough to provide a definitive course of action in relation to a problem of practice (not every intervention has been evaluated, and not all meta-analyses go into depth about how the intervention in question operates: research on how to encourage relationship building amongst children with autism is one example of the former, and research on homework an example of the latter), teachers simply do not seem to employ research in this way. For instance, Coldwell et al. (2017, p. ix) suggest that there is ‘limited evidence from [their] study of teachers directly importing research findings to change their practice. Rather, research more typically informed their thinking and led – at least in th...