Effective Interventions and Strategies for Pupils with SEND

Using Evidence-Based Methods for Maximum Impact

About This Book

Effective Interventions and Strategies for Pupils with SEND offers practical, tried-and-tested strategies for supporting and championing pupils with special educational needs and disabilities. Each strategy has been researched, trialled and reviewed, with the results presented accessibly and the concerns of real teachers a key focus of the discussion.

With each chapter written by an experienced and innovative teacher working with children with SEND, this book covers a wide range of strategies for supporting these pupils. The strategies include:

  • Using a 'daily run' to improve concentration and behaviour
  • Creating SEN champions and more effective teaching assistants
  • Embedding anxiety-reducing strategies in the classroom.

Written for teachers by teachers, Effective Interventions and Strategies for Pupils with SEND is an indispensable resource for SENCOs, teachers and other staff working with children with special educational needs who want to provide the best possible learning experiences.


Information

Editors: Gill Richards and Jane Starbuck
Publisher: Routledge
Year: 2019
ISBN: 9780429516467
Edition: 1
Pages: 120
Language: English

Chapter 1

The impact of introducing robust monitoring of interventions

Katherine Chubb

Context

My school is a smaller-than-average primary school with a much lower-than-average proportion of pupils known to be eligible for free school meals. The proportion of pupils labelled as having Special Educational Needs and Disabilities (SEND) is below average, as is the proportion of pupils with an Education, Health and Care (EHC) plan. We run a number of Wave 2 and Wave 3 Literacy and Numeracy interventions, but as we have only 11 pupils on our SEND register, it is hard to obtain reliable data on the effectiveness of the intervention programmes.
As a SEN Co-ordinator (SENCO), I have a key part to play in evaluating the impact of our provision on pupil progress. Indeed, the Special Educational Needs and Disability Code of Practice: 0–25 Years (2015) sets out that effective provision management can be used strategically to help a ‘school to develop the use of interventions that are effective and to remove those that are less so’ (DfE 2015: 6.77). I believed the systems in place in our school at the time were not rigorous enough and I wanted to investigate if there was a better system for monitoring and evaluating the impact of interventions which would enable me to link the outcomes of provision to performance management. This is a key step in moving from a system of provision mapping to provision management, as recommended by Gross (2015).
Through this research project I wanted to find out whether the quality of interventions could be improved by targeted training for those delivering them (Wearmouth 2016; Gross 2015). As our school systems stood, responsibilities overlapped for professional development and for monitoring and evaluating the impact of interventions. Teaching Assistants (TAs) were at the time being observed once a year by a Higher Level Teaching Assistant (HLTA). The focus for these observations was decided upon by the Senior Leadership Team (SLT) and based on targets from the School Development Plan, rather than on what research suggests – the quality features of an intervention session. The feedback given to those delivering interventions was not based around 'fidelity to the programme', which research by Gross (2015: 130) found to be highly important in terms of programme impact. Lack of school funding now means that there are fewer opportunities for accessing high quality training, but I would argue that a better understanding of expected intervention outcomes, within a system linked to performance management, could enable better targeting of funding for training where gaps in teachers' and TAs' knowledge and skills have been identified.

Strategy

I narrowed my research to find out what impact introducing a more rigorous system for monitoring intervention sessions would have on the quality of the interventions and the progress made by the children involved in them. The system was to be based upon 'fidelity' to each intervention programme – the degree to which delivery matched the protocols intended by the programme authors. Feedback and training given would be linked to what research – such as that from The EEF Guidance Report (2017) on 'Improving Literacy in Key Stage 2' – suggests are the key quality features of an intervention session. The EEF Guidance Report (2015) on 'Making Best Use of Teaching Assistants' states that:
At present there are only a handful of programmes in the UK for which there is secure evidence of effectiveness and if a school uses unproven interventions then we must ensure they include common elements of effective interventions … [we must] ensure there is fidelity to the programme and do not depart from suggested delivery protocols (EEF 2015: 24)
I identified four intervention programmes run in school where concerns had been raised by teachers and TAs as to their effectiveness, impact on progress and how closely we followed delivery protocols. The interventions included 'Number Box' (one child), 'Power of 2' (five children), 'Nessy Learning Programme' (five children) and 'Hornet Literacy Primer' (one child). These particular interventions do not appear on evidence rating system databases such as the online 'Evidence for Impact' (E4I) database (IEE n.d.), which highlights whether there is secure evidence of the effectiveness of an intervention. It is also harder to collect reliable data on them, due to the small number of children taking part in each programme in our school.
With the consent of those delivering the interventions, I carried out two observations of each intervention programme delivered by TAs in school. The observed intervention sessions lasted between 5 and 20 minutes. After the initial observation, I provided targeted feedback on how the quality of the session could be improved and carried out a follow-up observation seven weeks later. Before any observations took place, I produced check sheets that could be used as a basis for discussion around the 'fidelity to the programme' with the adults delivering the interventions. These check sheets contained:
  • a summary of the delivery protocol for each intervention, with details on the aims of each programme
  • the suggested target group, and
  • further detail on the recommended length, frequency and assessment method.
I then researched the quality features of an effective intervention. I combined recommendations from the EEF Guidance Report (2015) with the format for an observation framework suggested by Gross (2015) and Wearmouth (2016). My observation framework (see Table 1.1) contained 22 quality features with prompts for the observer to help identify whether that feature was present in the intervention session being observed.

Collecting evidence to measure impact

Evidence collection

I chose to carry out observations of the four TAs delivering intervention sessions using the observation framework rather than questionnaires, because I aimed to ‘bring certain practices and behaviours to light’ and resolve any practical problems in the delivery of the intervention (Burton and Bartlett 2005: 114). This method also gave me quantitative data on the quality features of an intervention session observed before and after feedback was given to the adult delivering the programme. I was then able to analyse the results to see what impact the new system of monitoring had achieved.
Informed consent to be observed and interviewed as part of my research (which was additional to normal school practice) was gained from the TAs delivering the four identified interventions, and they were told they could opt out at any time (BERA 2018). I was aware that my research was taking place in a climate where there were concerns among TAs that their hours were at risk of being reduced. I made it clear to those who agreed to be involved that my research project was not in any way linked to this. I also made them aware that the observations were not to identify any individual’s ability to deliver an intervention, but to investigate the effectiveness of the current systems in place for monitoring interventions. Children involved were identified as Child A (Number Box intervention), Child B (Nessy Learning Programme), Child C (Hornet Literacy Word Primer) and Child D (Power of 2 Maths Intervention). I also ensured that TAs’ interview answers were anonymised (they were identified as TA1, TA2, TA3, TA4) and all research data was kept secure.
Table 1.1 Observation framework
The framework lists each quality feature alongside two columns to be completed during the observation: 'Observed in session' and 'Evidence'. The quality features are:
  • Are the right pupils targeted and are there clear entry and exit criteria for the intervention?
  • Is the location appropriate?
  • Is the frequency of the intervention as specified?
  • Is the session length as specified in the programme?
  • Does the session content match that specified in the programme?
  • Are resources pre-prepared and well managed?
  • Do the adults know what the learning objectives are?
  • Does the pupil know what the learning objectives are?
  • Is the session planned and adjusted on the basis of assessment?
  • Do pupils help identify their own learning targets and assess their own progress?
  • Are key instructions and learning points given concisely and clearly and repeated as necessary?
  • Is behaviour well managed, and does the adult promote interaction between the pupils in the group and with the adult?
  • Does the adult promote independence and help the pupil to recognise successful strategies and apply them in other situations?
  • Does the adult create a secure and supportive environment where there is safety to ‘have a go’ and make mistakes?
  • Does the adult challenge the pupil and expect the most from them?
  • Is the session active, lively, multisensory and well paced?
  • Does the class teacher review the intervention jointly with the TA?
  • Do the class teacher and TA have time to meet to review the pupil’s progress?
  • Is the pupil’s progress carefully tracked?
  • Has there been good training for the person delivering the curriculum?
  • Are there opportunities for pupils to apply their learning and have it reinforced in class?
  • Is parental involvement secured as specified in the programme?
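For schools that prefer to keep observation records electronically, the framework translates naturally into a simple checklist. The Python sketch below is not part of the book's materials; it is a minimal illustration of that idea, and the names QualityFeature, session_record and count_observed are hypothetical. Each quality feature is recorded as observed or not, with a free-text evidence note, and the number of features observed in a session can then be counted for the kind of before-and-after comparison described later in the chapter.

```python
from dataclasses import dataclass

@dataclass
class QualityFeature:
    # One row of the Table 1.1 framework: the prompt, whether the feature
    # was seen in the observed session, and any supporting evidence noted.
    prompt: str
    observed: bool = False
    evidence: str = ""

# Two example rows only; a full record would hold all 22 prompts.
session_record = [
    QualityFeature("Is the session length as specified in the programme?"),
    QualityFeature("Is the pupil's progress carefully tracked?"),
]

def count_observed(record):
    """Number of quality features marked as observed in a session."""
    return sum(1 for feature in record if feature.observed)

# During the observation, mark features as they are seen, then tally.
session_record[0].observed = True
session_record[0].evidence = "Session ran for the length specified in the programme"
print(f"{count_observed(session_record)} of {len(session_record)} features observed")
```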
I then carried out an initial observation of TA1. I endeavoured to ensure that she felt at ease during the observation in order to minimise the impact my presence would have. From this observation, I identified a number of key features of an effective intervention that could not be observed within a session and would instead require interviews with the TAs involved after each observation. I developed an interview schedule to collect evidence to identify whether the remaining quality features were in place for each particular intervention programme. Questions were designed to elicit, for example, whether children had the opportunity to apply what they had been learning in the intervention to other contexts, and whether parents were informed about their child’s involvement and progress in each intervention. In the second set of interviews, I did not stick as rigidly to the interview schedule. I took more of an unstructured approach so that interviewees were able to ‘talk around the topic in their own way’ and I could ask additional questions ‘where a particular issue had not been covered’ (Burton and Bartlett 2005: 95) to get more qualitative data.
Having carried out the initial observations and interviews, I provided feedback to each TA, referring to the relevant intervention check sheet. We then addressed any practical problems that had resulted in some of the intervention’s quality features not being seen. Some TAs were not aware of individual pupils’ attainment data, so I collected this from the teachers who had overall responsibility for each child’s progress. This enabled me to measure the impact that introducing a more rigorous system of monitoring had on the progress of the children involved in the interventions.

Impact

One of the main findings from my observations and interviews can be seen in Figure 1.1. It shows that the number of quality features observed in each intervention session increased following the introduction of this more rigorous monitoring system. The average increase in the number of quality features observed was six, with the largest increase being nine.
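For readers who want to reproduce this kind of before-and-after comparison, the short Python sketch below shows one way of tabulating the counts and calculating the average and largest increases. The counts themselves are placeholders invented purely for illustration, attached to generic labels rather than the named programmes, and chosen only so that the increases average six with a maximum of nine, in line with the summary above; the actual per-intervention figures are those presented in Figure 1.1.

```python
# Hypothetical before/after counts of quality features observed (out of the
# 22 in Table 1.1) for four interventions labelled A-D; the real per-programme
# figures are those reported in Figure 1.1.
observed_features = {
    "Intervention A": {"before": 10, "after": 14},  # +4
    "Intervention B": {"before": 11, "after": 16},  # +5
    "Intervention C": {"before": 9,  "after": 15},  # +6
    "Intervention D": {"before": 8,  "after": 17},  # +9
}

increases = {
    name: counts["after"] - counts["before"]
    for name, counts in observed_features.items()
}

average_increase = sum(increases.values()) / len(increases)
largest_increase = max(increases.values())

for name, gain in increases.items():
    print(f"{name}: +{gain} quality features after feedback")
print(f"Average increase: {average_increase:.0f}")  # 6
print(f"Largest increase: {largest_increase}")      # 9
```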
The impact that the new system of monitoring had on the progress of individuals involved in these interventions was mixed and while drawing any reliable conclusions was difficult due to the low number of children and the sho...

Table of contents

  1. Cover
  2. Half Title
  3. Series Page
  4. Title Page
  5. Copyright Page
  6. Table of Contents
  7. Acknowledgements
  8. List of contributors
  9. Introduction
  10. 1. The impact of introducing robust monitoring of interventions
  11. 2. Does embedding anxiety-reducing strategies in the classroom improve behavioural and educational outcomes for children in Year 6?
  12. 3. A marked improvement for pupils with special educational needs and disabilities
  13. 4. Increasing the effectiveness of teaching assistant support
  14. 5. Becoming SEN champions
  15. 6. How does a daily run affect the concentration, attention and behaviour of children, especially those with ADHD and/or behavioural difficulties?
  16. 7. Moving up to ‘big school’
  17. 8. The impact of introducing ‘key adults’ to support children with challenging behaviour
  18. 9. Improving the school’s universal provision in accordance with changes outlined in the new special educational needs and disability code of practice
  19. 10. The impact of the Special Educational Needs and Disabilities Code of Practice: 0–25 Years on relationships between SENCOs, parents and colleagues
  20. Index