Article

A Theory-Led Evaluation of a Scalable Intervention to Promote Evidence-Based, Research-Informed Practice in Schools to Address Attainment Gaps

Faculty of Education, University of Cambridge, Cambridge CB2 1TN, UK
*
Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(5), 353; https://doi.org/10.3390/educsci12050353
Submission received: 2 April 2022 / Revised: 25 April 2022 / Accepted: 3 May 2022 / Published: 18 May 2022

Abstract

Evidence-based practice is a salient solution that has been presented to address the persistent educational attainment gap linked to economic disadvantage. However, most schools do not engage with research, and we know little about facilitating school-led research use at scale. Linking different approaches to studying educational effectiveness, equity and change, and drawing on cultural-historical activity theory, this study develops intermediate theory about the mechanisms influencing institutions’ success in using research. In the context of the Opportunity Area Programme, supporting place-based interventions in the UK’s most socio-economically disadvantaged regions, we conducted a theory-led evaluation of the Evidence-Based Practice Fund (EBPF), aimed at supporting school-led research engagement to improve learning outcomes. We analysed the documentation of 83 EBPF projects, 8 focus groups, and a school survey. We demonstrate that enabling schools to address locally identified needs motivates research engagement, but formulating these needs as a stimulus for change requires scaffolding. Schools were keen to use research to address those challenges but needed it re-translated for their contexts. Low-key school-to-school support was found to be adequate. Leadership support and collaboration were significant but require relational expertise and professional learning to be effective. This study demonstrates that the use of research by schools at scale is possible and can transform a school’s agency in developing their own practice to improve equity.

1. Introduction: Using Effectiveness Research to Close the Attainment Gap

A key education challenge in many countries, particularly pertinent in England, is a consistent and persistent gap in children and young people’s educational attainment, as linked to economic disadvantage [1,2,3]. This manifests later in life through reduced opportunities, including progression to higher education [4], and worse labour market outcomes [5] for those from the most deprived backgrounds. Pupils with special educational needs are another particularly educationally disadvantaged group and these two factors are often interlinked [6]. Educational inequality is also associated with low social mobility; thus, reducing educational inequality is widely seen as a key policy lever for improving social mobility [7]. In England, while improvements have been made, over recent years a significant slowing down in the closure of the disadvantage gap has been observed [8].
One of the key policy solutions in the UK, and many other countries, for addressing the educational gaps identified above has emphasised utilising educational research findings to develop new effective practices in schools [9,10,11]. This reflects growth in ‘evidence-based practice’ across the educational sector in the UK, particularly in England. This growth has emerged despite concerns about the positioning of ‘for-policy’ educational research [12] as overly pragmatic and focused too extensively on answering only policy-relevant questions. While recognizing the limits of claims made through and by the evidence derived from educational research [13], evidence-based practice has become de rigueur in the educational sector. Work by bodies such as the Education Endowment Foundation (EEF) as the designated What Works Centre for education in England has championed both the development of robust evidence around educational practice, mostly through the use of randomised controlled trials in education [14], and the use of that evidence by practitioners in their own respective educational contexts for the purposes of improving their practice, improving student outcomes, and ultimately narrowing educational inequality.
One of the larger government-funded initiatives to address the attainment and social mobility gaps identified above is the UK Opportunity Area programme (OA). Opportunity Areas are parts of the country with the lowest levels of social mobility, where a range of predominantly education-focused initiatives have been put in place since 2017 to improve social mobility. Overlaying these developments, in these areas and more widely, is the COVID-19 pandemic, which has significantly exacerbated educational inequalities [15,16,17].
The use of evidence by practitioners is one element of the approach championed by some of the Opportunity Areas in their programming around social mobility. Amongst other approaches, this has taken the form of an Evidence-Based Practice Fund, which provides the specific context for this study and whose description is outlined below.
The idea of evidence-informed teaching is connected to the idea of ‘research-engaged schools’, whereby schools can be considered “as more or less ‘research-engaged’ depending on the extent to which they support and undertake evidence-informed practice, specifically teaching” [18]. Research points to consistent features of schools characterised by highly research-informed practice [18,19]. However, it also shows that most schools and teachers do not fall into this category [18,20]. How schools can move along this continuum is less clear. Schools found to be research-engaged have typically been directly supported by academic researchers [20], further highlighting the challenge of supporting research use in schools at scale. Getting research findings into practice remains challenging and we know little about how practitioners can use research in their practice [21,22].

Paper Aims

Within this context, the aim of this paper is to engage with the substantive issues around evidence-based practice and research use in educational settings and illustrate this with the example of the Evidence-Based Practice Fund. In doing so, specifically, we ask, how can evidence from research on effective pedagogic approaches contribute to teaching practice at scale to improve educational equity in a way that acknowledges schools’ and teachers’ place-based educational goals and contextual factors and enables teachers’ professional agency to lead the improvement of their practice?
In what follows, we offer a review of the relevant literature. We first discuss the wider context of evidence-based and research-informed practice and its implications for the work of schools in the English context, especially as it relates to government policy. We then focus on the existing literature that has engaged with the way schools understand and use evidence-based and research-informed practice, looking at both theoretical and empirical work. We build our argument that this is a contested space, where theory plays an important role but needs to be understood within the context of the ‘end-users’ of the evidence or research, that is, the schools and practitioners themselves. We find that the literature often provides examples of so-called best practice, leaving a research gap around the use of evidence and research by ordinary schools, including those facing substantial disadvantage. Given this gap, we argue there is a need for a focus on typical teachers and schools, particularly those located in high deprivation locations, and on their own practice development through emerging engagement with such research. This is precisely what the research reported in this paper looks to offer.

2. Situating the Paper Aims in the Evidence and Research Use Context

How (evidence from) research can contribute to the development and warranting of good practice in schools [23] has been much discussed among educational researchers for decades, both in the UK [10,24,25,26,27] and internationally [9,11,28,29]. Nevertheless, reviews of research on school effectiveness have identified a persistent challenge in the limited application of this research in educational practice and, in particular, support for disadvantaged learners [3,10]. Our position is that this represents an important critique of the field, which informs our interpretation of the understandings and uses of effectiveness (and other types of) research and evidence in schools and which also guides our paper’s research design, focusing it on the ways in which practitioners apply evidence to support disadvantaged learners in particular.

The ‘What Works’ Approach to Evidence Building and Use

The development of research on ‘what works’ and evidence-based education has specifically focused on research translation to overcome this challenge, with only limited insight [30] into the school conditions that would facilitate a meaningful engagement with that evidence. This is despite efforts by the What Works movement, chiefly represented in England by the EEF, to build tools and instruments for schools [31] and their leaders to use in the implementation of evidence [32] into their own practice.
However, Farley-Ripple and colleagues [28] argue in their conceptual piece that these efforts have been based on a simplistic assumption of how research impacts on practice to improve equity: research identifies effective practices which are simply implemented by practitioners, leading to improved outcomes. This assumption is visible in earlier UK policy documents [33]. We suggest that a nuancing of these assumptions has more recently taken place in the UK, highlighting the need for considering teachers’ perspectives and agency [34], and a range of research approaches to produce practice-relevant knowledge [35]. However, a significant knowledge gap remains in relation to understanding and conceptualising the use of effectiveness research in schools [22,36,37] (cf. Burn et al. 2021; Hofmann & Ilie, 2021). This is further highlighted by a recent extensive study [38] looking specifically at the Opportunity Area programme generally, and at a specific set of schools who are seen as lead promoters of research and evidence use (so-called Research Schools), which finds a range of structural and organisational barriers to the work of these particular schools yet pays less attention to how ‘regular’ schools are able to engage with their work.
This nuancing notwithstanding, there have been significant critical academic debates about the ‘what works’ approach and its assumptions about research, and the research-practice relationship. Key criticisms have expressed concern about the epistemological limitations of RCTs, the narrowing of the curriculum to easily measurable outcomes, and the restriction of research approaches to explore only educational processes and outcomes immediately relevant to policy [39,40]. These approaches have been critical of research on educational effectiveness and the ‘what works’ approach, as well as of the notion of educational improvement as implementation of externally defined interventions, highlighting instead the agency of teachers in designing and effecting change.
A full engagement with these methodological debates is beyond the scope of this paper. Instead, we seek to build bridges across the methodological debates in our own work, which ranges from longitudinal research in educational equity analysing large-scale education datasets [41] to randomised field trials [42] to theoretically oriented ethnographic and discourse analytic research on educational interventions [43]. We consider effectiveness research and RCTs as one of several relevant sources of evidence in addressing equity gaps in education. Despite significant efforts and investment, however, how such research can be used by schools at scale to improve equitable outcomes remains unclear. In the remainder of this section, and to further outline our perspective in this paper, we highlight three essential points emerging from the literature around the use of effectiveness research and evidence in and by schools.

3. Framing the Use of Effectiveness Research in and by Schools

3.1. All Types of Research Can Be Used in Ways That Do, or Do Not, Support Teacher Learning and Agentic Action for Change: Understanding How Practice Changes Is Key

Despite their intention to build robust evidence, reliance on ‘what works’ approaches in relation to the implementation of this evidence may overlook teachers’ professional learning and agency in effecting change and may insufficiently acknowledge the power of contextual factors. Ruthven [23] and Sammons [44] analysed the 1990s UK government’s use of Educational Effectiveness Research to improve students’ learning outcomes and reduce the attainment gap in schools. They both show how effectiveness research can be implemented in schools in a mechanistic way, which may inhibit professional learning and distort pedagogic change. Similarly, discussing more recent policy efforts to improve equity through effectiveness research and the ‘what works’ approach, Burnett and Coldwell [40] caution that RCTs, focused on tightly structured interventions, as a key form of research for educational improvement, can narrow the nature of educational development and innovation. Like ourselves, none of these authors argue from a position of opposition to trials in educational research, but they all call for theorising the role of professional agency and learning in schools’ research use.
At the same time, a reliance on professional and institutional agency to design and effect change in schools, as often advocated in opposition to effectiveness research [39], may not always fully acknowledge how much institutions rely on stability [45,46]. Research suggests that expert teachers (including those in research-engaged schools) continually assess, and learn about, their students’ (mis)conceptions, identify suitable data/interventions/innovations and then change their practice accordingly [19,47]. However, an increasing body of evidence on the implementation of research-informed pedagogic interventions, and on teacher professional learning and agency, suggests that this is not true for all teachers [18,20,42,48,49,50]. This evidence spans a variety of contexts, both very similar to the focus of this paper and further afield [49,50], and rightly focuses on the perspectives of practitioners working within educational settings, who sometimes find the implementation of evidence-based and research-informed practice challenging.
We do not see this at all as a failure of individual teachers to enact new practices to support all learners. Nor is it simply a case of practical hindrances (such as time, workload, or assessment pressures), though these are certainly a factor [20]. Instead, we argue that a range of sociocultural and institutional norms drawing on much wider social, cultural, and historical patterns shape classroom practice, as well as the underlying assumptions about teaching and learning and what teachers—and students—perceive as possible [30,43,51]. These norms work to sustain existing forms of practice and provide continuity and stability. Such stabilisation helps teachers cope with complex realities in challenging institutional circumstances. However, it also constrains teachers’ agency through limiting what is seen as possible [41,45,52]. We argue, therefore, that more research needs to address the mechanisms of change in professional practice.

3.2. What Does Successful ‘Agentic and Purposeful’ Research Use by Schools and Teachers Look Like: Need for Diversity of ‘Outcomes’, at Different Levels

Both research on research-engaged schools and research on school effectiveness and improvement highlight that research use by schools and teachers should support local goals [22,53]. However, while local problems of practice and development needs can be highly visible to practitioners, they can also be latent and difficult for ‘insiders’ to identify, articulate, and examine [54]. This makes identifying research to address those local challenges a difficult task. Moreover, an emphasis on highly controlled research methods may sometimes obscure the fact that taking research into practice is not a one-off event which either works or does not work. Instead, it is a process, where the mechanisms of change are as important as the change potentially effected locally.
To understand if professional change efforts are worthwhile, practitioners need to identify what that new practice is trying to accomplish [55,56]. Any attempt to change local practice based on engagement with research (findings) necessitates an identification and articulation of desired outcomes for that practice. Far from such outcomes being a singular well-bounded quantity which can be defined and captured at one time point at the end of an educational ‘intervention’, efforts to change educational practice have many different kinds of ‘outcomes’, whose understanding requires different kinds of methods. In other words, schools and practitioners need to identify the learner outcomes which they seek to achieve, the practice change outcomes through which the learning outcomes may be achieved, and the school-level outcomes (e.g., around culture change) which may come about by virtue of the engagement with the research and evidence base. To us, this highlights parallels to the multi-level nature of educational effectiveness research (both conceptually and methodologically) and raises the point that practitioners’ understandings of how these learning outcomes may come about (at whichever level) is important—whether that is about how the learner outcomes develop (and the learning theories that underpin that [57]), how teachers learn within these contexts [58], or how schools make sense of and support evidence-based and research-informed practice [38].
We further need to consider the different temporal and institutional distances between interventions and their (various) outcomes. Well-defined and relatively short-term outcomes may lend themselves better to more specific research-informed interventions and their local evaluation through measurement. More temporally and/or institutionally distant outcomes, such as change in institutional culture around equity, are less well suited for experimental evidence and evaluation and require expansion of the methodological repertoire. Open-ended change efforts may aim to enable teachers to plan for and enact change in their professional practice over an extended duration in ways whose ‘outcomes’ cannot be pre-defined [59].
While such open-ended change efforts are important, Edwards [45] points out that schools are often not in the position to fully transform their practice: school social practices are difficult to change due to high stakes accountability systems and challenges with social order in many vulnerable schools. These make stability a priority for both students and teachers. Studies demonstrating significant changes in school practice and outcomes have typically involved significant external research support [3,18]. Our interest lies in developing an understanding of how schools working in challenging locations, without such intensive support, may be able to use evidence to learn to engage in alternative pedagogic practices to improve equity in ways that matter to them in their context. In this paper, we draw on our learnings from an evaluation of a place-based policy initiative in England which offered schools in areas of high socioeconomic deprivation the opportunity to apply for funding to develop and self-evaluate new evidence-based pedagogic practices. The programme aimed to improve equity in relation to an aspect of learning/development each school itself deemed as a key motive for their local setting, as part of a broader social mobility-raising programme implemented in areas with the lowest levels of social mobility in England.

3.3. Need for Theory

If we want research to be useful to practice, we need theory that can explain how change happens [37]. This involves a theory of change for how a specific research-informed, evidence-based intervention is expected to bring about change. We use ‘research-informed’ and ‘evidence-based’ interchangeably here, despite the fact that the former generally focuses on the application of strictly defined knowledge (through the process of research) to a broad setting, while the latter generally outlines the application of a given knowledge base (‘the evidence’) to a specific context [60,61].
Our focus is on an intermediate theory of the mechanisms and contextual factors that influence an institution’s success in using research for their purposes [29,53]. To be applicable across settings, such intermediate theory should be informed by fundamental social theory about how institutional practice changes. In this paper, therefore, we offer a conceptual framework intended to help us think about how teachers in schools serving socioeconomically disadvantaged communities can use evidence and research findings to work towards closing the attainment gap/educational equity. As such, our work has significant conceptual and educational alignment with the recent work of Burn and colleagues [22] who studied the role of school-based research champions in negotiating research into practice in nine schools that were part of the Oxford Education Deanery partnership, and the highly comprehensive work of Gu and colleagues [38]. The novelty of our study is its focus on a large number of schools working independently of us, researchers, all located in areas of high socio-economic deprivation and low social mobility, yet who were given the professional agency to identify local goals of consequence and select the evidence and research that might help them address those.

3.4. Concluding Ideas

Three key features arise across the bodies of research on research-engaged schools and educational effectiveness: the importance of senior leadership in supporting and facilitating teacher learning and research use; a context of within- and cross-school collaboration; and local self-evaluation of practice and improvement efforts to establish whether what worked elsewhere for certain goals also works ‘here’ [10,18,20,22,62]. Research also points to the importance of context for facilitating or impeding improvement, particularly around school-level factors [3,47,53]. We aim to re-locate these findings within a theoretical framework which can inform an understanding of how institutional practice changes, and inform an investigation of the factors supporting or hindering schools’ own emergent research and evidence use to address equity across settings.
Beyond the theoretical focus, our research looks to provide actionable insights into how schools make use of evidence, the mechanisms by which they choose the learners, practices, or outcomes of interest, and the barriers and enablers that, respectively, hinder or support the implementation of research and evidence into practice. We argue that these are important insights, especially as they are derived from work with schools in relatively socially disadvantaged areas, and as current public health circumstances and their economic impact will have long-standing effects on the socio-economic contexts of learners and schools. Understanding how schools already facing these hurdles put research and evidence to use for the purpose of improving learner outcomes is, therefore, of critical practical significance.

4. Theoretical Framework for Understanding Change

Given we have identified a need for theory in understanding the implementation of research and evidence-based practice in schools, we now focus on the theoretical perspective that underpins our own research in this paper. Specifically, we draw on cultural-historical activity theory, otherwise known as CHAT [45,59]. Three key features of CHAT’s understanding of change are central to our analysis here: the role of local practitioners’ problems of practice as motives for change; relational expertise; the tool-mediated nature of change.

4.1. Local Practice and Goals

CHAT considers the historically accumulated local practice in a particular professional and institutional setting as integral to participants’ actions [45]. As well as the wider sociocultural norms about classroom teaching and learning shaping perceptions of possible action, CHAT attends to practitioners’ collective and individual local goals as key motives for action [22]. It is not simply about whether an evidence-based intervention could work in a specific setting beyond the original study but if, how, why, and for what purposes it could be made to work in a new setting by the practitioners and participants of that setting.

4.2. Collaboration and Relational Agency

Research on teacher professional learning consistently highlights the importance of collaboration [29]. CHAT argues that while ‘collaboration’ is typically treated as something inherently positive and inevitably useful in professional learning and school improvement, it is also a vague concept that does not alone capture what might be useful about it [55,63]. CHAT highlights “the collaborative construction of relationships and professional norms that can improve teaching and learning” [62]. Edwards [64] refers to this as ‘relational expertise’: confident engagement with the knowledge that underpins one’s own practice and engagement with others’ expertise. Relational expertise enables practitioners’ relational agency: the ability to work productively with others to expand understandings of and solutions to a problem of practice.

4.3. Double Stimulation

As well as practitioners’ motives and capacity to work together, Vygotsky’s notion of double stimulation as a conceptual mechanism of transformative agency is central to CHAT [65]. The first stimulus involves educators identifying and articulating their local problems and new possibilities. A second stimulus refers to the mediating tools (e.g., evidence) which actors can develop or adapt to analyse and work on their identified problems [59]. In CHAT research, double stimulation is commonly evoked via intensive, longitudinal researcher-supported developmental interventions [54]. We seek to identify whether these mechanisms might also help identify ways of facilitating such transformative agency at scale.
In what follows, we first offer a more detailed description of the context of our study, specifically around the Opportunity Area programme, and then describe our research design and underlying data collection and data analysis methods. We then present our results, concluding our paper with a discussion of our key findings in relation to the current theoretical and empirical literature in the field.

5. Context

The OA Programme and the Evidence-Based Practice Fund

This study took place in the wider context of the Opportunity Area programme, a UK Government programme launched in 2017 to tackle entrenched regional disadvantage and low social mobility among young people through place-based educational interventions. The 12 OAs across the country represent the Local Authority Districts with the lowest levels of social mobility. Three OAs are found in the East of England, representing some of the most socioeconomically deprived areas nationally [66], and form the focus of this study: Ipswich, Norwich, and Fenland and East Cambridgeshire.
The OA programme’s aim is to recognise the unique issues the different areas within the UK face and the need for local involvement in decision-making about solutions. A key part of the programme’s strategy is to improve education by addressing deep-rooted issues that affect young people’s outcomes. The programme focuses on pupils who are most likely to encounter barriers to social mobility, including those facing socio-economic deprivation and children with additional educational needs, and promotes a place-based, tailored approach to the activity which may result in increased social mobility over time [67].
Within the OA programme, we studied the Evidence-Based Practice Fund (EBPF) whose specific aim was to encourage schools in high deprivation areas to engage with current research in order to improve outcomes for young people, in particular, disadvantaged pupils, by embedding research-informed practice into schools.
The EBPF operated across all three OAs in the East of England region, providing grants primarily to schools: schools which wished to implement new research-informed interventions could submit applications stating an identified improvement need, the approach they would take to address that need, and how they would evaluate that approach’s impact. Four EEF Research Schools—otherwise normal schools tasked with supporting the use of evidence to improve teaching practice through working with schools in their area [68]—were involved in supporting the fund. Across the three OA regions in which it operated, EBPF engaged just over 100 schools, though not all were successful in undertaking a project under the EBPF remit.

6. Research Design

6.1. Theory-Based Evaluation Design

The present study draws on a theory-based evaluation approach (TBE) [69]. TBEs are designed to answer why and how programmes work by examining the processes and factors involved in changes brought about by development programmes. TBEs are not intended to quantify impact but aim at understanding the transformational relations between programmes and their outcomes, and the contextual factors that influence those, in order to define the theory of change that underlies a programme. To do so, we draw on explicit theory that explains change, to “unpick contextual factors that might have causal potency, and identify other things going on that could also influence outcomes and impacts” [69] (p. 21). TBEs consist of two distinct, linked parts: a conceptual part outlining/developing the theory of change guiding the evaluation (which we have outlined in the section above on the theoretical framework supporting this work), and an empirical part collecting data to examine the programme’s workings: how it produced (or failed to produce) the desired changes. As such, our evaluation compares what happens, as identified by the empirical study, with pre-existing theory of how change happens to generate new conceptual insights.

6.2. Data Collection and Analysis Methods

The study draws on several data sources and associated data collection and analysis methods. We outline each of these in turn, describing our approach and discussing the strengths and limitations of each.

6.2.1. Document Analysis

First, we conducted a document analysis of school-generated EBPF documents. The aim of the document analysis was to understand the range of projects supported by EBPF, the outcomes schools and practitioners had chosen, their rationale for the deployment of evidence in supporting achieving these outcomes, and the evaluation approaches they proposed to implement (and, where applicable, succeeded in implementing). These documents included applications, interim reports, and, where available, final reports by participating schools. Documentation for a total of 83 EBPF-funded projects was available for analysis, across all three OAs supporting EBPF.
Using projects as the unit of analysis, a coding framework was constructed and applied systematically across all available documentation and in relation to each respective project, with a small number of schools running more than one project. The coding framework consisted of a set of fixed categories to facilitate the systematic exploration of all relevant documents in relation to the implementation of EBPF-supported projects, specifically, the outcomes chosen, the interventions implemented, and the evaluation designs planned (and implemented). Some categories reflected practical considerations around the projects’ implementation and respective evaluation. Other categories in the coding framework focused on the use of evidence by schools, as they had identified in their initial EBPF application documents and other subsequent reports. Further categories covered broader aspects of implications and conclusions about the overall experience of each project from participating schools and staff. Each project was coded across all these categories. The resulting data was then synthesized by aggregating across codes, so that a comprehensive picture of all projects could be derived. Taking this approach ensured that all EBPF-supported projects, regardless of the amount of documentation they each held, were included in our analysis. We found the coding framework to be flexible enough to capture very different projects, and given that the framework was derived based on the original aims of the EBPF project, there was high internal consistency and validity in relation to the project aims and intentions.
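To illustrate the shape of this analysis, a minimal sketch of how coded projects can be aggregated follows; the category labels, codes, and project records are hypothetical and do not reproduce the actual EBPF coding framework.

from collections import Counter

# Hypothetical coding categories; the real framework's labels are not reproduced here.
CATEGORIES = ["outcome_focus", "intervention_type", "evaluation_design", "evidence_source"]

# Each EBPF-funded project is one record, coded against every category.
coded_projects = [
    {"project_id": "P01", "outcome_focus": "literacy", "intervention_type": "bespoke",
     "evaluation_design": "before_after", "evidence_source": "EEF toolkit"},
    {"project_id": "P02", "outcome_focus": "wellbeing", "intervention_type": "bought_in",
     "evaluation_design": "comparison_group", "evidence_source": "research summary"},
]

def aggregate(projects, categories):
    """Count how many projects carry each code, per category."""
    return {cat: Counter(p[cat] for p in projects if cat in p) for cat in categories}

for category, counts in aggregate(coded_projects, CATEGORIES).items():
    print(category, dict(counts))

Aggregating in this way yields the kind of whole-programme overview reported in the Findings section, such as the share of projects targeting particular outcomes or using particular evaluation designs.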

6.2.2. A Repeated Interview Study

Secondly, we ran a repeated interview study, exploring in interviews the perspectives of the Research School staff engaged in supporting the EBPF, and, in focus groups, the perspectives of teachers in participating schools and their engagement with research and their EBPF projects. This repeated design consisted of two rounds of interviews conducted with the three Research Schools that were supporting EBPF across the three OA areas of focus, and two focus groups with teachers from participating schools. A total of 7 individuals across the three Research Schools took part in the repeated interviews, that is, each person was interviewed twice. Some of these interviews took place as a group (per Research School) and some were individual in nature. A total of 21 EBPF participants took part in the repeated focus groups, but due to scheduling difficulties it was not possible for all EBPF participants to take part in both focus groups.
A discussion protocol was derived for each of the interviews and focus groups. These focused on professional learning, teacher engagement, use of evidence, evaluation methodology, and school leadership (with the second round focused on identifying changes), all in relation to the implementation of projects via EBPF.
The interviews were transcribed and coded for key themes using simple thematic analysis, allowing both for themes immediately related to the topics covered by the interview protocols and for new themes outlined by participants to emerge.
While this approach ensured that participants had the freedom to offer new perspectives—which was essential to our design—and to reflect on how their experiences had changed by offering answers at two points in time during the EBPF timeline, the self-selected nature of participant recruitment meant that we have likely heard from participants who were more closely involved and had the requisite resources (including time) to participate in the focus groups. Combined, however, with responses from all three Research Schools involved, and the document analysis above, we argue that we are still in a position to derive accurate insights about EBPF across all three OA areas.

6.2.3. A Participant Survey

Thirdly, and to complement the interview study and document analysis outlined above, we carried out a brief survey. The aim of the survey was to elicit EBPF participants’ perspectives on their respective EBPF projects, and to understand the relationship between these perspectives and broader approaches to professional learning in an evidence-building context. The content of the survey was predominantly factual in nature. Questions first asked respondents to outline how they had engaged in their school’s EBPF project (whether as a lead or participant, independently or with colleagues), whether they had made use of the existing training offer around EBPF, and, if they had attended it, their perceptions of it. Further questions asked about the level of support received during their EBPF participation, from their own school and the Research Schools engaged in EBPF, and asked participants to offer their assessment of the value of EBPF participation. A final section of the survey asked about evidence use in general, including familiarity with the EEF toolkit, and how this featured in their day-to-day work. Because of the nature of the survey, and the requirement to limit its duration for practical purposes, the only validated survey scale used was a teacher learning scale, first derived by Vermunt et al. (2019) and used with permission. Although the scale is validated, the low response rate (as outlined in what follows) meant that its internal consistency in our sample (Cronbach’s alpha = 0.53) was very low; therefore, the scale has not been used in the Findings section to avoid the reporting of invalid and internally inconsistent results.
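For reference, the internal-consistency statistic reported above is conventionally computed, for a scale of k items, as

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the total scale score. Values in the region of 0.5, as obtained here, indicate that the items did not cohere well enough in this small sample to be reported as a single scale.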
Participating schools in two of the three OAs responded to the survey, with one OA area unable to distribute the survey to the schools given concerns about burdening teachers. In the two areas that responded, the response rate was low, at 25% and 30% of the participating schools, respectively. This amounted to a very small absolute number of responses, 11 and 7, respectively.
One of the reasons for this low response rate is the COVID-19 pandemic interrupting the survey. We deemed it would have been unethical to issue survey participation reminders given the difficult circumstances around schools moving to online provision in England at that time. As a result, we do not attribute a high degree of confidence to findings that emerged from the survey and we have exercised care in interpreting its results, corroborating findings with other evidence from the two above methods of data collection and analysis.
Following on from the sample size limitations, we provide descriptive analyses of the participants’ answers where relevant. We do not undertake any inferential analysis and are at all times mindful that the survey findings do not represent the perspectives of all EBPF participants.

6.3. Strengths and Limitations of the Research Design

Despite the rich set of data above, we have not observed teachers’ practice nor have we ourselves measured students’ learning outcomes in the participating schools. We rely on the schools’ own reports, which are of variable levels of breadth and depth. This could be considered a limitation of our study. Instead, we argue for a different perspective. Participating school reports provide evidence of the schools’ own capabilities in articulating their goals, processes, and outcomes of attempts to draw on evidence. They are a key piece of evidence of the schools’ own understandings of what and how they have been able to achieve through drawing on research. Therefore, they are a key aspect of the phenomenon itself in which we are interested.
Further, we have outlined how the sample size for the survey was compromised by factors outside of our control (the pandemic) and as a result the particularly pertinent teacher learning scale was not included in the analysis. This is of course regrettable, but on balance, we prioritized the ethical aspect of not burdening potential respondents at a time of substantial professional disruption over the sample size, deeming the full coverage of the document analysis across all EBPF projects a sufficient counterbalance.
Moreover, coupling this approach with the focus on all schools participating in EBPF by virtue of the document analysis, we posit that our sample of schools and, therefore, our findings below, are more scalable than they would be from otherwise more strictly researcher-led research. There remains, of course, an important element of self-selection into EBPF in the first place, but given its cross-regional implementation, the schools participating here are diverse and include those working in deprived circumstances who normally cannot engage. All research navigates a trade-off between internal and external validity. Our current study prioritises external validity and the understanding of what can be achieved by schools through engagement with research and evidence in areas of substantial disadvantage.

7. Findings

Our study does not seek to quantify the impact of the EBPF on students’ learning outcomes. Nonetheless, nearly two-thirds (63%) of respondents to our small survey reported that the implementation of an EBPF-supported project had a positive impact for their pupils, against a backdrop of high school participation in the programme. We first consider the evidence of how schools chose topics and designed their EBPF-supported projects and their evaluation, before attending to the change mechanisms that facilitated or hindered these processes.

7.1. Developing and Evaluating Research-Informed Interventions

7.1.1. Articulating Desired Outcomes

Schools explicitly drew on their locally identified needs to select the groups of students to work with and the topics they wanted to work on via EBPF. Most projects targeted disadvantaged students, with different sub-groups identified by different schools (64% of schools used a targeted approach, of which 36% targeted pupils eligible for Pupil Premium, 32% those with low prior attainment, 8% those with behavioural/educational/emotional needs, 24% a combination). The projects also targeted a range of topics: nearly half of the projects reviewed focused on English/Literacy (45%), with the remaining programmes split between student behaviour, social and emotional needs and mental health issues (11%), mathematics/numeracy (6%), and teacher professional development (13%). In the focus group, teachers articulated a range of diverse reasons for their choice of project topics, but all reasons occurred against a backdrop of each school’s own circumstances. This was mirrored in the document analysis: the majority of projects explicitly or implicitly engaged with a distinct learning need in the school.
“What has really interested me is the wide spread of projects, some have been really niche and really actually tapped into something that has clearly been an issue within the school they’ve struggled to address, and the EBPF has given them the opportunity to say, ‘Actually, let’s give that a try then, because we haven’t got the funding to do that, normally, but this gives us that funding,’ and the projects have kind of gone from things like that to ones that do seem to be a little bit more, ‘We really want to do this in our school’.”.
(Region1_ResearchSchool)
This may have been a result of the structure of the application form itself, which invited the identification of a problem, but the Research Schools supporting the programme noted on several occasions a deeper level of engagement with contextually-specific issues.
“[Schools] been able to apply that evidence to their context. So, rather than picking up a generic resource and just chucking it at the problem and hoping it will put out the fire, it’s allowed them to kind of say, ‘Okay, but what really is at the root of it for us?’”.
(Region1_ResearchSchool)
Our analysis revealed that schools had based their proposed projects upon careful analysis of existing school data, and on reflection on their schools’ and pupils’ learning needs. The second set of Research School interviews suggested that this was a growing trend, with an increasing number of schools moving away from quick fix solutions towards more rigorous development and evaluation of evidence-based approaches. Nonetheless, the findings suggest that schools found articulating (and finding ways of evaluating) desired outcomes difficult. Across all three OAs, the reviewed project applications showed that identified needs and desired outcomes were broad and (often overly) ambitious, despite the fact that in the interviews several schools recognized, in principle, that “…there must be a clear and specific focus and a highly refined research question” (Region2_ResearchSchool) to provide better evidence of improvement.
Schools particularly struggled with defining what improvement might mean in areas where no standardised measures of learning/progress existed. Where such measures did exist, schools typically defined their desired outcomes as evidence of progress on these measures. Particularly in projects concerning student well-being, social and emotional learning and/or mental health, operationalising intended outcomes and identifying evidence was difficult, since in “the projects linked to mental health and wellbeing, evidence was slightly harder to find” while there is “a number of reading projects in East Cams and Fenland, and --- therefore that made it potentially easier for schools to run a project that was actually bespoke to them as opposed to just picking up a kind of predeveloped package that ticks boxes” (Region1_ResearchSchool). The distinction between data, outcomes, and evidence was, therefore, sometimes blurred in project reporting and teachers’ responses.

7.1.2. Identifying Research and Evidence to Address Identified Local Needs

The responding schools showed universal awareness of the materials by the EEF, with 71% of schools indicating having used the EEF toolkit directly to help them decide upon a topic. The most-often cited use was checking evidence relating to an issue relevant to the school, specifically in the form of relevant interventions or programmes. The Research School interviews corroborated this.
This suggests that identifying suitable research-informed interventions to address schools’ self-identified local needs was challenging, as schools often did not directly engage with evidence beyond the EEF toolkit. This was in contrast to Research Schools seeing the buying in of ‘ready-made’ projects (indicative of a more mechanistic use of evidence) as less preferable than schools forging their own intervention based on available evidence. Research School staff suggested buying in ready-made projects was potentially linked with less creativity, and that having to think through their own projects forced schools to be more imaginative:
“…the schools that bid for more money were generally buying in an intervention rather than devising their own […] having such a significant sum of money on the table actually led to schools using less imagination rather than more.” .
(Anonymised_ResearchSchool)

7.1.3. Evaluating Interventions

Despite drawing on ‘proven’ research-informed insights and interventions via the EEF toolkit, schools found it very important that their development efforts were evaluated locally and shown to work in their setting.
“I’ve been surprised by how many schools actually want to embrace a really rigorous form of impact evaluation, in terms of what it’s done for their pupils, because the application process, it doesn’t force them to use a particular evaluation methodology, I’ve been surprised by, once schools get a sense of what is possible on that front, they actually become quite ambitious.”.
(Region2_ResearchSchool)
Nearly two-thirds of the projects used different versions of a before-and-after evaluation design, including a baseline measurement point, with a substantial number using well-defined comparison groups. A fifth of projects employed a form of quasi-experimental comparison-group design. Other common evaluation designs included continuous monitoring of pupil outcomes, mixed quantitative and qualitative designs, and action research. Therefore, while the type of evidence to be generated through these evaluation approaches varied (from causal to narrative), most were able to generate some evidence around the implementation, and to some extent the impact, of the specific intervention being deployed.
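To make the logic of the most common of these designs concrete, the brief sketch below compares pre/post gains for a hypothetical intervention group and comparison group; all scores and variable names are invented for illustration and do not come from the EBPF projects.

from statistics import mean

# Hypothetical pupil records: baseline ("pre") and follow-up ("post") scores on a local measure.
intervention = [{"pre": 12, "post": 18}, {"pre": 9, "post": 15}, {"pre": 14, "post": 17}]
comparison = [{"pre": 11, "post": 13}, {"pre": 10, "post": 12}, {"pre": 13, "post": 15}]

def mean_gain(group):
    """Average improvement from baseline to follow-up for one group of pupils."""
    return mean(p["post"] - p["pre"] for p in group)

gain_intervention = mean_gain(intervention)
gain_comparison = mean_gain(comparison)
print(f"Intervention gain: {gain_intervention:.1f}")
print(f"Comparison gain: {gain_comparison:.1f}")
print(f"Difference in gains: {gain_intervention - gain_comparison:.1f}")

A simple before-and-after design uses only the first of these calculations; adding a comparison group, as a fifth of the projects did, supports the stronger (though still not causal, absent randomisation or matching) difference-in-gains comparison in the final line.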
Despite these careful plans, schools often struggled to get projects and their evaluations off the ground and within given schedules, finding the implementation harder than they had anticipated. However, the participating teachers, head teachers, and other school staff, including in Research Schools, found the experiences rich, enlightening, and enhancing of their professional expertise.
In the second part of this section, we consider evidence for enabling mechanisms which supported schools in their efforts to develop and evaluate research-informed interventions to improve the learning of disadvantaged pupils. We note that while time and staff turnover posed significant practical challenges in many schools, the Research Schools highlighted that good anticipatory contingency planning often mitigated these.

7.2. Enabling Mechanisms

Research School support, even at low intensity, was important for defining outcomes and designing evaluations. In terms of identifying and defining desired outcomes, the Research Schools highlighted the importance of schools being very clear on what they wanted to implement and aligning their projects with their school improvement plans. Quite low volumes of support from the Research Schools could help: a single twilight training session, for example, was seen to make a significant difference to schools developing greater clarity and specificity regarding their desired outcomes.
“Helping them unpick as well, making it easier for them, because they do put in a lot of layers to it and you feel like going, ‘No, it won’t work’, and they feel like going, ‘Oh, is that okay?’ I say, ‘Yeah, just test the spelling age; just test their reading age; don’t put in three or four different things’.”.
(Region3_ResearchSchool)
Similarly, early targeted support with developing evaluation designs by the Research Schools was beneficial.
“When I ran the implementation session and I talked about process and outcome evaluation, and after that schools that hadn’t necessarily been flagged to me for needing individual support emailed me just to say, ‘Ooh, I’ve realised I haven’t factored that in, can you give me some further advice on what we might do to track that?’”.
(Region1_ResearchSchool)
Translated research materials were a good resource but required further re-translation to be useable for teachers. The Research Schools played a key role in mediating schools’ engagement with research evidence. When moving beyond the EEF toolkit, rather than reading and analysing research papers and school data themselves, teachers sought summaries of research paper findings from the Research Schools to help them decide what was important in the research:
“I do find that teachers do not want copies of research papers; what they want is a synopsis of the research papers or me to tell them why they’re important. They want to perhaps have the copy but not to read it through and analyse it themselves; they want you to sort of say, ‘These are the important bits: use this in your evidence base’.”.
(Region3_ResearchSchool)
This suggests that despite the EEF materials having undergone significant research translation to make research accessible to teachers and schools, further re-translation is required by typical schools.
Senior leadership team (SLT) support was important but required learning. Two-thirds of the reviewed projects had head teachers, deputy head teachers, assistant head teachers or executive heads/trust executives as project leads. A further fifth of projects were led by a subject or level (e.g., early years) lead, with the remainder led by a school’s Special Educational Needs Co-ordinator (generally also a member of a school’s leadership team). This suggests a high level of senior leadership support for the projects. Conversely, some Research School staff suggested that some members of senior leadership teams were not very supportive of change in practice, viewing staff’s creative ideas as “too much”; these SLTs were also perceived by the Research School staff as not always being ambitious in terms of what their EBPF projects could deliver.
The second set of interviews revealed that the engagement with EBPF’s guidance on implementation and evaluation had led some school SLTs to alter their entire approach to leading whole-school change, and that the Research Schools were also learning from this. This was seen as the real value-added of the EBPF programme as a whole: such guidance was seen as lacking from the most common school headship training, which was seen as more concerned with dealing with and managing resistance than with exploring ways to bring staff together around evidence-based thinking. This pointed to a further important dimension relating to SLT support: while often considered as something school leaders ‘choose’ to do, SLT support for research-informed improvement in itself requires learning. One of the Research Schools specifically mentioned how a positive impact of EBPF was the change in leadership practices (allowing more freedom, more experimentation), regardless of the specific outcomes of any given projects:
“The biggest success of the Evidence Based Practice Fund, it’s been a change in thinking and attitude [in school leadership teams] as much as it is about the individual projects themselves. […] it’s proving the worth of evidence-informed research and practice, whether the individual project has borne fruit or not.”.
(Region2_ResearchSchool)
Teacher collaboration within and across schools could be a resource but required relational expertise and a culture shift. The participant survey suggested a high incidence of collaboration: 88% of respondents indicated that they had taken part in their EBPF project alongside at least one other colleague and 78% of respondents were positive about the level of support received from other (non-SLT) colleagues in their school during the implementation of their EBPF project. One head teacher noted that regular engagement within their school and across institutions helped the implementation:
“The constant—not constant—but regular meetings on what’s worked, what hasn’t pretty much half termly throughout the year of the project and still ongoing. Regular meetings where we would talk about what we’d done and where we think it’s been successful and what we might need to do differently and where the next step is. So those opportunities to come together regularly and reflect on what we’ve been doing, I think was really important.”.
(Region1_FocusGroup)
This was reflected in a teacher focus group. Here, teachers reported attending conferences and training workshops, linking to new staff in their school, and having their respective SLT take an active role in supporting projects, all as a direct result of implementing an EBPF project. However, only 50% of survey respondents indicated that their EBPF participation had made them feel part of a community of practice.
We also noted a very small number of instances of cross-school collaboration. Such projects generally displayed a higher level of engagement with issues of implementation, most likely suggesting that the collaboration had been considered at the development stage and the practical implications of engaging, for instance, with other schools in a respective city, were being actively addressed. The Research School interviews suggested that the EBPF funding structures did not always support cross-school collaboration as much as they could: schools were expected to carry out different projects, not collaborate to generate aggregate evidence. This is despite the range of participant responses suggesting that supporting more cross-school collaboration could also support the scalability and sustainability of impact. Some schools had created learning hubs and generated resources for long-term use. They saw these as a “big part of the success of the project, and […] something that will be continued now that the project has been completed” (teacher respondent). However, the findings also suggested that this resource could not be used automatically. While something that had been conducted in another school was sometimes identified as a source of information, this was not automatically helpful. One interviewee stated that schools gained most benefit from support that was carefully tailored to their own context:
“The moment you try and get people to do something because it works in another school, it’s unlikely to work as well. I think schools need to have ownership and they need to make the decision, ‘…because this is our structure, this is our issue’. I think that was a real strength.”
Developing new research-informed, evidence-based practices was achievable, but particularly so for teachers who were already familiar with the research process. A key enabler of successful EBPF implementation, and of reporting positive outcomes from EBPF participation, was the existing level of knowledge and understanding among school staff. Research School interviewees noted that the best projects, with the most effectively recorded impacts, were those where school staff were well versed in knowledge about the research process, while schools without this familiarity found it challenging.
“The danger with all of this beautiful research evidence summarised for schools is that it’s there but barriers to engaging with it are too much for some schools, and they’re the schools we most want to do it.”
(Region2_ResearchSchool)
This highlights our final point.

Schools Need Opportunities to Learn about Research and Evidence Engagement and Evaluation, and This Requires Time

Survey participants were asked about their overall perceptions of EBPF. Three quarters of respondents indicated that their participation in EBPF was worthwhile, with similar patterns by region. Similarly, 88% of respondents saw their projects as a valuable learning experience, and 75% as providing them with the opportunity to do something that they would otherwise not have been able to do. This, in itself, was seen as highly valuable. Schools reported treating the first year as ‘practice’, with one suggesting that a teacher “trials some of this, and trials some of this, so that when we get into September she can hit the ground running.” The comments in one of the final reports illustrated this:
“[EBPF] has given us the opportunity to upskill all our staff which in turn enables the progress and raised attainment of our pupils. We have been able to enhance our curriculum resources, strengthen our partnership with parents and have a positive impact on more children reading for pleasure.”
This was described as a source of agency within an often-constrained system:
“The good thing that I saw is getting staff engaged in research, seeing staff think, ‘Oh, I can have some ownership in what I do’, because we are a profession that are very much done unto but this is a way to say, ‘Well, I have got some say on how I deliver you or how I can do this, and I can evidence it because I’ve seen this’, so it’s nice to say, ‘Well, if I can do it in this bit of my practice perhaps do this as well?’”.
(Region3_ResearchSchool)

8. Discussion and Conclusions

8.1. Revisiting Paper Aims

Despite the long-standing desire of researchers and policy makers alike to utilise research in schools to help improve all students’ learning outcomes, and despite efforts in research translation and evidence building to facilitate this, most typical schools do not use research [3,18]. We know a great deal about what effective research-engaged schools are like, but there is far less insight into how schools could become research-engaged to facilitate equity without intensive and consistent researcher support, and how this could work at scale [21,22,28].
Our study addresses this knowledge gap by empirically examining how schools operating in deprived areas of England use research and evidence within the context of a programme (the Evidence-Based Practice Fund) designed to support schools in using evidence to improve their effectiveness and learner outcomes, itself part of a large educational programme aimed at raising social mobility, the Opportunity Area programme.
In this final section, we offer a theoretical reappraisal of our findings to contribute to a more widely applicable theory of change about schools’ agentic development of research use to support local, place-based needs and equity. We also connect our findings with existing insights from the wider literature, including recent work around the Opportunity Area programme.

8.2. Framing the Findings within the Existing Body of Evidence

Considering the findings within the activity-theoretical framework [45,59], we propose the following insights from this study. The findings suggested that local institutional learning and development needs and problems of practice are a key motive for schools and teachers to engage with research, and that practitioners find it important to evaluate locally whether research-informed interventions work in their setting. This is consistent with the findings of Burn and colleagues [22], who concluded in their study of school-based research champions that a good understanding of the local aspects of each educational setting was a key facilitator of these practitioners’ work and of the uptake of research in schools.
At the same time, schools found identifying and defining desirable outcomes challenging and often required some (albeit low-key) support from the Research Schools. This demonstrates that the first stimulus for change is not a given, even when it is felt by practitioners. Scaffolds are needed to help schools and teachers articulate and define outcomes, especially to avoid narrowing school-led research-informed interventions solely to areas where standardised measures exist, regardless of local need. However, the study also suggests that even a low volume of support from Research Schools can help schools define the first stimulus for change. In their much wider study of Research Schools, Gu and colleagues [38] similarly found that the role of Research Schools is varied and that what they provide takes a variety of forms. Our findings suggest that the critical mass required for this provision to be effective can actually be quite small, if it is targeted and aligned with the understanding of local contexts outlined above.
Schools also found it challenging to identify suitable research to achieve their desired outcomes. Despite significant efforts in the UK in research translation and dissemination to practitioners, those resources still necessitated further re-translation by the Research Schools to provide a second stimulus for developing research-informed interventions to address a local need. An alternative to such bespoke re-translation was utilising ready-made, off-the-shelf interventions, but the Research Schools deemed this less imaginative: a restrictive solution to a local need rather than an opportunity for agentic professional and institutional learning.
Further, the findings suggest the need for a third stimulus: when such change is to be implemented at scale, school- and practitioner-led local evaluations are seen by practitioners as a central part of trialling research-informed interventions, to be sure they work ‘here’. This closes the loop back to the first stimulus (the local need which motivated the process in the first place), enabling research-informed practice to be employed at scale without the part usually undertaken by academics. However, developing realistic local evaluations requires collaboration, which is rarely systemically supported. Even when such systems of support are put in place, the uptake of that guidance varies substantially between schools, and the manner of uptake can also differ [32].
Therefore, our findings suggest that relational agency [64], that is, working productively with other teachers and school leaders within and across institutions, can be a significant resource for developing research-informed practice at scale. Indeed, in a study also using a CHAT theoretical perspective to examine biology teachers’ collaborative professional inquiry [70], teachers’ relational agency was deemed to have been enhanced precisely because of the active and agentic nature of the professional learning opportunity, which was similar in principle (even if not in the detail of implementation) to the EBPF projects.
While this may be the case, we find that, alongside relational agency, productive engagement with EBPF also requires relational expertise [64]: teachers and school leaders do not simply ‘choose’ to collaborate and support colleagues but need to learn to engage with the knowledge of others in order to work productively together and expand their understandings of, and solutions to, their local problems. When given space to practise this, it can also lead to whole-school learning and, thus, a culture shift. Ample research, from the foundational educational effectiveness work of Hargreaves [71] to recent studies of school cultures that promote evidence use (e.g., Williams et al. [72], looking at evidence-based practice to support children with autism in Australian schools), highlights the complexity of school culture and school culture change, particularly in relation to evidence-based practice, but also identifies productive mechanisms by which teachers may develop the collaborative turn that our findings suggest supports better evidence use.
Finally, the findings suggest that developing research-informed interventions to address local needs is a significant undertaking, not simply an addition or tweak to existing practice. This presents a paradox: teachers and school leaders need to be well versed in knowledge about the research process to undertake and support research-informed practice, even though research-informed practice is often expected to be a tool for, rather than an outcome of, professional learning and change. The role of school leaders in this process has been particularly highlighted [19,30,73], even as other evidence [38] points to the fact that school leaders may experience and perceive evidence-based, research-informed professional change more readily, or at least be in a better position to articulate it.

8.3. Conclusions

Using a multi-modal, theory-based evaluation design, our study explored the mechanisms of implementation around a range of school-based projects supported by the Evidence-Based Practice Fund in three Opportunity Areas in England. Its findings point to a series of enablers of, and barriers to, the uptake of research-informed and evidence-based practice in schools for the purpose of improving learner outcomes.
Our findings suggest that barriers exist around the understanding of these concepts and of what EBPF projects may look like, as well as around highly ambitious but not always feasible plans to understand the local impact of new practice. Despite this, even small amounts of support from peer institutions or other organisations can facilitate productive engagement with a project. This can result in teachers and school leaders finding participation worthwhile, both from the perspective of improving learner outcomes and in terms of a change in school culture that can both support, and in turn be supported by, better practice and, later, better learner outcomes.
Ultimately, we find that shifting local norms to enable such new practices to be developed may well require that the trialling of these practices itself be made the initial outcome, without expecting research engagement to lead immediately to changes in students’ learning. Our findings offer some evidence that doing so may enable the upskilling of staff and create opportunities for whole-school learning, and may thereby contribute to transformative practice at scale.
Our study therefore contributes to the development of a theory of change regarding the mechanisms that may be involved when typical schools serving disadvantaged populations make use of research to develop new practices that address local needs and problems of practice in relation to the learning of all students.

Author Contributions

Conceptualization, R.H. and S.I.; methodology, R.H. and S.I.; formal analysis, R.H. and S.I.; investigation, R.H. and S.I.; data curation, R.H. and S.I.; writing—original draft preparation, R.H.; writing—review and editing, R.H. and S.I.; funding acquisition, R.H. and S.I. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Suffolk County Council, grant number IOA 3.4, under the Opportunity Area Programme funded by the Department for Education, UK. The APC was funded by the University of Cambridge.

Data Availability Statement

Data cannot be made available for ethical reasons relating to identifiable information being present in all transcripts and survey responses.

Acknowledgments

We would like to acknowledge the support of the Opportunity Area leads from the three counties where the Evidence Based Practice Fund was implemented as part of the Opportunity Area programme, as well as Suffolk County Council for funding the research. We would like to thank Sara Curran and Gabrielle Arenge for their support with elements of the data collection and analysis.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. EEF. Closing the Attainment Gap. 2018. Available online: https://educationendowmentfoundation.org.uk/public/files/Annual_Reports/EEF_Attainment_Gap_Report_2018_-_print.pdf (accessed on 3 March 2022).
2. Stewart, K.; Waldfogel, J. Closing Gaps Early. The Sutton Trust. 27 September 2017. Available online: https://www.suttontrust.com/ourresearch/closing-gaps-early-parenting-policy-childcare/ (accessed on 25 February 2022).
3. Kyriakides, L.; Charalambous, E.; Creemers, B.P.M.; Antoniou, P.; Devine, D.; Papastylianou, D.; Fahie, D. Using the dynamic approach to school improvement to promote quality and equity in education: A European study. Educ. Assess. Eval. Account. 2019, 31, 121–149.
4. Chowdry, H.; Crawford, C.; Dearden, L.; Goodman, A.; Vignoles, A. Widening participation in higher education: Analysis using linked administrative data. J. R. Stat. Soc. Ser. A (Stat. Soc.) 2013, 176, 431–457.
5. Carneiro, P.; Cattan, S.; Dearden, L.; van der Erve, L.; Krutikova, S.; Macmillan, L. The Long Shadow of Deprivation: Differences in Opportunities across England. Technical Report. Available online: https://www.gov.uk/government/publications/the-long-shadow-of-deprivation-differences-in-opportunities (accessed on 18 January 2022).
6. Strand, S.; Lindorff, A. Ethnic Disproportionality in the Identification of Special Educational Needs (SEN) in England: Extent, Causes and Consequences. Available online: http://www.education.ox.ac.uk/wp-content/uploads/2018/12/Combined-Report_2018-12-20.pdf (accessed on 18 January 2022).
7. Crawford, C.; Macmillan, L.; Vignoles, A. When and why do initially high-achieving poor children fall behind? Oxf. Rev. Educ. 2017, 43, 88–108.
8. Hutchinson, J.; Bonetti, S.; Crenna-Jennings, W.; Akhal, A. Education in England: Annual Report 2019. Education Policy Institute. Available online: https://epi.org.uk/wp-content/uploads/2019/07/EPI-Annual-Report-2019.pdf (accessed on 18 January 2022).
9. Slavin, R.E. How evidence-based reform will transform research and practice in education. Educ. Psychol. 2020, 55, 21–31.
10. Reynolds, D.; Sammons, P.; De Fraine, B.; Van Damme, J.; Townsend, T.; Teddlie, C.; Stringfield, S. Educational effectiveness research (EER): A state-of-the-art review. Sch. Eff. Sch. Improv. 2014, 25, 197–230.
11. Bauer, J.; Prenzel, M.; Renkl, A. Evidenzbasierte Praxis–im Lehrerberuf?! Einführung in den Thementeil. Unterrichtswissenschaft 2015, 43, 188–192.
12. Ball, S.J. Policy Sociology and Critical Social Research: A personal review of recent education policy and policy research. Br. Educ. Res. J. 1997, 23, 257–274.
13. Oancea, A. The aims and claims of educational research. In Education, Ethics and Experience; Hand, M., Davies, R., Eds.; Routledge: Abingdon, UK, 2016; pp. 109–122.
14. Styles, B.; Torgerson, C. Randomised controlled trials (RCTs) in education research–methodological debates, questions, challenges. Educ. Res. 2018, 60, 255–264.
15. House of Commons Education Committee. Is the Catch-up Programme Fit for Purpose? UK House of Commons. Available online: https://committees.parliament.uk/publications/9251/documents/160043/default/ (accessed on 3 March 2022).
16. Maldonado, J.E.; De Witte, K. The effect of school closures on standardised student test outcomes. Br. Educ. Res. J. 2022, 48, 49–94.
17. Hofmann, R.; Arenge, G.; Dickens, S.; Marfan, J.; Ryan, M.; Tiong, N.D.; Radia, B.; Janik Blaskova, L. The COVID-19 learning crisis as a challenge and an opportunity for schools: An evidence review and conceptual synthesis of research-based tools for sustainable change. CEPS J. 2021, 11, 39–66.
18. Coldwell, M.; Greany, T.; Higgins, S.; Brown, C.; Maxwell, B.; Stiell, B.; Stoll, L.; Willis, B.; Burns, H. Evidence-Informed Teaching: An Evaluation of Progress in England. Research Report; Department for Education: London, UK, 2017.
19. Godfrey, D. Leadership of schools as research-led organisations in the English educational environment: Cultivating a research-engaged school culture. Educ. Manag. Adm. Leadersh. 2016, 44, 301–321.
20. Dimmock, C. Leading research-informed practice in schools. In An Eco-System for Research-Engaged Schools; Godfrey, D., Brown, C., Eds.; Routledge: Abingdon, UK, 2019.
21. Cain, T.; Brindley, S.; Brown, C.; Jones, G.; Riga, F. Bounded decision-making, teachers’ reflection and organisational learning: How research can inform teachers and teaching. Br. Educ. Res. J. 2019, 45, 1072–1087.
22. Burn, K.; Conway, R.; Edwards, A.; Harries, E. The role of school-based research champions in a school–university partnership. Br. Educ. Res. J. 2021, 47, 616–633.
23. Ruthven, K. Improving the development and warranting of good practice in teaching. Camb. J. Educ. 2005, 35, 407–426.
24. Hargreaves, D.H. Teaching as a research-based profession: Possibilities and prospects (The Teacher Training Agency Lecture 1996). In Educational Research and Evidence-Based Practice; Hammersley, M., Ed.; SAGE Publications Ltd.: New York, NY, USA, 1996.
25. McIntyre, D. Bridging the gap between research and practice. Camb. J. Educ. 2005, 35, 357–382.
26. Goldacre, B. Building Evidence into Education. Department for Education. Available online: http://media.education.gov.uk/assets/files/pdf/b/ben%20goldacre%20paper.pdf (accessed on 3 March 2022).
27. Biesta, G.; Aldridge, D. The contested relationships between educational research, theory and practice: Introduction to a special section. Br. Educ. Res. J. 2021, 47, 1447–1450.
28. Farley-Ripple, E.; May, H.; Karpyn, A.; Tilley, K.; McDonough, K. Rethinking connections between research and practice in education: A conceptual framework. Educ. Res. 2018, 47, 235–245.
29. Hofmann, R. Leading professional change through research(ing): Conceptual tools for professional practice and research. In Transformative Doctoral Research Practices for Professionals; Brill Sense: Leiden, The Netherlands, 2016; pp. 141–154.
30. Brown, C.; Greany, T. The Evidence-Informed School System in England: Where Should School Leaders Be Focusing Their Efforts? Leadersh. Policy Sch. 2017, 17, 115–137.
31. Kime, S.; Coe, R. The Evidence-Based Teacher: Identifying, Understanding and Using Research in Schools; Routledge: London, UK, 2021.
32. Higgins, S.; Katsipataki, M.; Aguilera, A.B.V.; Dobson, E.; Gascoine, L.; Rajab, T.; Reardon, J.; Stafford, J.; Uwimpuhwe, G. The Teaching and Learning Toolkit: Communicating research evidence to inform decision-making for policy and practice in education. Rev. Educ. 2022, 10, e3327.
33. Haynes, L.; Service, O.; Goldacre, B.; Torgerson, D. Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials. UK Cabinet Office. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/62529/TLA-1906126.pdf (accessed on 3 March 2022).
34. EEF Blog. Available online: https://educationendowmentfoundation.org.uk/news/eef-blog-teacher-choices-trials-our-new-approach-to-researching-questions (accessed on 9 March 2022).
35. What Works Network. The Rise of Experimental Government: Cross-Government Trial Advice Panel Update Report. UK What Works Network and Economic and Social Research Council. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/753468/RiseExperimentalGovernment_Cross-GovTrialAdvicePanelUpdateReport.pdf (accessed on 9 March 2022).
36. Hofmann, R.; Ilie, S. Developing and evaluating a research-based scalable implementation toolkit for dialogic teaching in disadvantaged schools [Individual paper]. In Proceedings of the American Educational Research Association (AERA) Conference, Virtual Meeting, Washington, DC, USA, 8–12 April 2021.
37. Creemers, B.P.; Kyriakides, L. Using educational effectiveness research to improve the quality of teaching practice. In The Routledge International Handbook of Teacher and School Development; Day, C., Ed.; Routledge: London, UK, 2012; pp. 415–425.
38. Gu, Q.; Seymour, K.; Rea, S.; Knight, R.; Ahn, M.; Sammons, P.; Kameshware, K.K.; Hodgen, J. The Research Schools Programme in Opportunity Areas: Investigating the Impact of Research Schools in Promoting Better Outcomes in Schools; Education Endowment Foundation: London, UK, 2021.
39. Biesta, G. Improving education through research? From effectiveness, causality and technology to purpose, complexity and culture. Policy Futures Educ. 2015, 14, 194–210.
40. Burnett, C.; Coldwell, M. Randomised controlled trials and the interventionisation of education. Oxf. Rev. Educ. 2021, 47, 423–438.
41. Ilie, S.; Rose, P.; Vignoles, A. Understanding higher education access: Inequalities and early learning in low and lower-middle-income countries. Br. Educ. Res. J. 2021, 47, 1237–1258.
42. Ruthven, K.; Mercer, N.; Taber, K.S.; Guardia, P.; Hofmann, R.; Ilie, S.; Luthman, S.; Riga, F. A research-informed dialogic-teaching approach to early secondary school mathematics and science: The pedagogical design and field trial of the epiSTEMe intervention. Res. Pap. Educ. 2017, 32, 18–40.
43. Hofmann, R.; Ruthven, K. Operational, interpersonal, discussional and ideational dimensions of classroom norms for dialogic practice in school mathematics. Br. Educ. Res. J. 2018, 44, 496–514.
44. Sammons, P. Zero tolerance of failure and New Labour approaches to school improvement in England. Oxf. Rev. Educ. 2008, 34, 651–664.
45. Edwards, A. Activity theory and small-scale interventions in schools. J. Educ. Chang. 2008, 9, 375–378.
46. Rainio, A.P.; Hofmann, R. Teacher professional dialogues during a school intervention: From stabilization to possibility discourse through reflexive noticing. J. Learn. Sci. 2021, 30, 707–746.
47. Darling-Hammond, L. Research on Teaching and Teacher Education and Its Influences on Policy and Practice. Educ. Res. 2016, 45, 83–91.
48. Osborne, J.; Simon, S.; Christodoulou, A.; Howell-Richardson, C.; Richardson, K. Learning to argue: A study of four schools and their attempt to develop the use of argumentation as a common instructional practice and its impact on students. J. Res. Sci. Teach. 2013, 50, 315–347.
49. Horn, I.S.; Kane, B.D. Opportunities for Professional Learning in Mathematics Teacher Workgroup Conversations: Relationships to Instructional Expertise. J. Learn. Sci. 2015, 24, 373–418.
50. Vanlommel, K.; Schildkamp, K. How do teachers make sense of data in the context of high-stakes decision making? Am. Educ. Res. J. 2019, 56, 792–821.
51. Michaels, S.; O’Connor, C.; Resnick, L.B. Deliberative Discourse Idealized and Realized: Accountable Talk in the Classroom and in Civic Life. Stud. Philos. Educ. 2008, 27, 283–297.
52. Engeström, Y. From Stabilization Knowledge to Possibility Knowledge in Organizational Learning. Manag. Learn. 2007, 38, 271–275.
53. Joyce, K.E.; Cartwright, N. Bridging the Gap Between Research and Practice: Predicting What Will Work Locally. Am. Educ. Res. J. 2020, 57, 1045–1082.
54. Engeström, Y.; Engeström, R.; Suntio, A. Can a school community learn to master its own future? In Learning for Life in the 21st Century: Sociocultural Perspectives on the Future of Education; Wells, G., Claxton, G., Eds.; Blackwell: London, UK, 2002; pp. 211–224.
55. Hofmann, R.; Vermunt, J.D. Professional learning, organisational change and clinical leadership development outcomes. Med. Educ. 2021, 55, 252–265.
56. Opfer, V.D.; Pedder, D. Conceptualizing Teacher Professional Learning. Rev. Educ. Res. 2011, 81, 376–407.
57. Aubrey, K.; Riley, A. Understanding and Using Educational Theories; Sage Publications Incorporated: New York, NY, USA, 2019.
58. Voeten, M.J.; Bolhuis, S. Teachers’ conceptions of student learning and own learning. Teach. Teach. 2004, 10, 77–98.
59. Engeström, Y.; Sannino, A. Studies of expansive learning: Foundations, findings and future challenges. Educ. Res. Rev. 2010, 5, 1–24.
60. Peavey, E.; Wyst, K.V. Evidence-Based Design and Research-Informed Design: What’s the Difference? Conceptual Definitions and Comparative Analysis. HERD Health Environ. Res. Des. J. 2017, 10, 143–156.
61. Stichler, J.F. Research, research-informed design, evidence-based design: What is the difference and does it matter? HERD Health Environ. Res. Des. J. 2016, 10, 7–12.
62. Kyriakides, L.; Creemers, B.; Antoniou, P.; Demetriou, D. A synthesis of studies searching for school factors: Implications for theory and research. Br. Educ. Res. J. 2010, 36, 807–830.
63. Ellis, V.; Gower, C.; Frederick, K.; Childs, A. Formative interventions and practice-development: A methodological perspective on teacher rounds. Int. J. Educ. Res. 2015, 73, 44–52.
64. Edwards, A. Being an Expert Professional Practitioner: The Relational Turn in Expertise; Springer: Dordrecht, The Netherlands, 2010.
65. Sannino, A. The principle of double stimulation: A path to volitional action. Learn. Cult. Soc. Interact. 2015, 6, 1–15.
66. Easton, C.; Harland, J.; McCrone, T.; Sims, D.; Smith, R. Implementation of Opportunity Areas: An Independent Evaluation. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/747975/2018-09-04_OA-process-eval_FINAL.pdf (accessed on 24 March 2022).
67. Department for Education. Building the Foundations for Change: A Selection of Case Studies. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/753695/DFE_-_Opportunity_Areas-One_Year_On_.PDF (accessed on 24 March 2022).
68. EEF. The Research School Network. Available online: https://educationendowmentfoundation.org.uk/support-for-schools/research-schools-network (accessed on 24 March 2022).
69. Stern, E. Impact Evaluation: A Guide for Commissioners and Managers; BOND: London, UK, 2015.
70. McNicholl, J. Relational agency and teacher development: A CHAT analysis of a collaborative professional inquiry project with biology teachers. Eur. J. Teach. Educ. 2013, 36, 218–232.
71. Hargreaves, D.H. School Culture, School Effectiveness and School Improvement. Sch. Eff. Sch. Improv. 1995, 6, 23–46.
72. Williams, N.J.; Frederick, L.; Ching, A.; Mandell, D.; Kang-Yi, C.; Locke, J. Embedding school cultures and climates that promote evidence-based practice implementation for youth with autism: A qualitative study. Autism 2021, 25, 982–994.
73. Konstantinou, I. Establishing a centre for evidence-informed practice within a school: Lessons from the Research and Evidence Centre. Rev. Educ. 2022, 10, e3324.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
