Visualising Intervention – Action Plans

How can we show that non-teacher intervention is working in school?

Note: this article is based on working practice during 2005

Part of the Visualising Education Blog Series

The action plan is a simple idea and works well in helping children and young people structure their learning week.  A typical action plan will include the following:

  • Student name
  • Student class or year group
  • Intervention worker's name
  • Date of completion
  • Comment box
  • Student signature
  • Intervention worker signature

However, this approach does not allow for quantitative analysis or visualisation of the results, which would make progress real for the student and usable in impact statistics.  The following action plan was designed to capture the student's achievements in numeric form, see Figure 1 below.

Figure 1) The action plan

This action plan contains two reviews, week one and week two.  It was purposefully designed like this to allow the student to see the difference between the two weeks.  This approach also reduced our printing and paper costs, as the action plan could be printed on both sides of the A4 paper, allowing four weeks of action planning to be recorded on one sheet.

The scores were transferred into a spreadsheet which had formulas and functions to automatically generate graphs.  These graphs worked well with the students, as they allowed them to physically see any changes between the action planning meetings.  The graphs were used during parent meetings, student progress meetings and external stakeholder meetings, which had a significant positive impact on stakeholders' engagement.
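For readers who want to reproduce that step without a spreadsheet, here is a minimal sketch in Python; the score categories and values below are invented for illustration only, as the original sheet used its own formulas and functions.

```python
# A hedged sketch of turning two weeks of action-plan scores into a graph.
# Categories and scores are hypothetical examples, not real student data.
import matplotlib.pyplot as plt

categories = ["Attendance", "Homework", "Equipment", "Behaviour"]
week_one = [2, 1, 3, 2]   # review scores from the first week (1 = poor, 5 = excellent)
week_two = [3, 3, 4, 2]   # review scores from the second week

x = range(len(categories))
width = 0.4

plt.bar([i - width / 2 for i in x], week_one, width, label="Week 1")
plt.bar([i + width / 2 for i in x], week_two, width, label="Week 2")
plt.xticks(list(x), categories)
plt.ylabel("Score (1-5)")
plt.title("Action plan review: week 1 vs week 2")
plt.legend()
plt.savefig("action_plan_progress.png")  # printed and shared at meetings
```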

At the end of each intervention period (every half term) a copy of the graph would be sent home to the parents, and to the student's form tutor and year manager, promoting transparency and home-school communication.  For many students on the intervention team caseload (BESD and SEN Statements) this approach improved students' organisation skills, such as bringing pens, homework and the school planner.  A point I make in my book: if most of the school population do not bring their school planner on a daily basis, why fight with some of the most difficult students in school to achieve above and beyond what the average school student is doing?  This type of intervention is about finding ways of making these students feel successful, so focus on the pen first!  Hard mentoring can be used to much greater effect with students who are already successful within school.

Initially, the action plan was completed directly on the computer, but the students did not like it; they preferred the convenience of the paper copy and the much more comfortable chairs!  However, I suspect that the advent of Android tablets and touchscreen computing would make this process far more practical today.  We missed a trick with this data, as we only used it to produce the progression graph.  We could have used it to show the percentage increase in areas such as attendance, homework and bringing equipment across the cohort.
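As a rough sketch of that missed analysis, cohort-level percentage change could have been calculated along these lines; the baseline and follow-up figures below are invented purely for illustration.

```python
# Hypothetical cohort counts of students meeting each target area.
baseline = {"attendance": 41, "homework": 30, "equipment": 25}   # start of the intervention period
follow_up = {"attendance": 52, "homework": 44, "equipment": 39}  # end of the intervention period

for area, before in baseline.items():
    after = follow_up[area]
    change = (after - before) / before * 100
    print(f"{area}: {change:+.1f}% change across the cohort")
```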

In conclusion, the action plan served a useful function in tracking, monitoring and promoting student organisation and engagement within the system, but offered little influence outside the intervention meeting.  Any change within the learning environment was related to the student-intervention worker professional relationship, making this a qualitative measure that is hard to capture quantitatively.  We overcame this barrier by developing a credit system; see Visualising Intervention – Intervention Credits.

This blog is based on extracts from this book: A Practical Guide To Inclusion: A Manual For Implementation and Delivery


Part of the Visualising Education Blog Series

Visualising Intervention – Intervention Credits

How can we show that non-teacher intervention is working in school?

Note: this article is based on working practice during 2005

Part of the Visualising Education Blog Series

As a team providing personalised intervention for more than eighty students across key stage three (age 11-13yrs), we faced the same problem as most researchers who explore this sector: quantitative evidence of student improvement.  Qualitatively, the intervention was making a difference that was observable within the learning experience.  However, this left the department (learning support) and its staff open to the usual comments regarding performance and ability.

Then my brother (Tim Campbell) showed me a kid's pocket (cheque) book which had a different activity on each page.  The idea was that the child would request the activity and, when the activity took place, hand over the page or cheque.  Then came the epiphany: I quickly realised that this idea could be used to provide the quantitative evidence demonstrating that the intervention staff were making a difference to the children and young people across the school.

Figure 1) The mentoring cheque


How Did We Use Them

The intervention worker (mentor, learning support worker or teaching assistant) was responsible for making sure that the student had access to their (intervention) credits.  This was fairly easy, as each target was printed in batches of five: we could fit five cheques on one side of an A4 sheet, which allowed two chequebooks to be created on each print run.

The student received their personalised chequebook, which contained about nine cheques, or three cheques for each of the student's intervention targets taken from their intervention plan (IBP, IEP, SEN Statement etc.).  This was a deliberate approach, as it allowed the school/department to compare students' success against pre-defined targets, which introduced data transparency.

The number of cheques for a given target depended on how successful the student was at achieving it.  For example, a target which the student found hard to achieve would have three to five cheques, but we also left in cheques which the student could easily achieve, to maintain a continuing feeling of success.
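A minimal sketch of how a personalised chequebook could be assembled from a student's targets is shown below; the target names and cheque counts are hypothetical, and in practice the cheques were simply printed five to an A4 side.

```python
# A sketch of building a chequebook: more copies for hard targets,
# a couple of easy wins kept in to maintain the feeling of success.
def build_chequebook(student, targets):
    """targets maps each intervention-plan target to its number of cheques."""
    return [
        {"student": student, "target": target, "cheque_no": n + 1}
        for target, copies in targets.items()
        for n in range(copies)
    ]

book = build_chequebook(
    "Student A",
    {
        "Arrive to lesson on time": 4,   # harder target: three to five cheques
        "Bring pen and planner": 3,
        "Ask for help when stuck": 2,    # easier target kept in for quick wins
    },
)
print(len(book), "cheques in this book")  # 9, roughly one chequebook
```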

Given the nature of the caseload students (BESD or SEN Statemented), it was decided that it was the student's responsibility to get the cheque signed, not the teacher's or teaching assistant's.  To promote this idea and increase engagement with the chequebook, we introduced prizes.  It was quickly realised that the prizes had to be well defined, as students would use any gaps to get more credits or more rewards.  To overcome this, a standardised list was created with the required credit points and the times at which each item could be accessed (break, dinner, after school).  The standardised list also included items like pens and pencils for only a few credits, which worked well for students who had poor organisational skills.
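The standardised list could be expressed roughly as follows; the credit costs and access times shown here are illustrative assumptions (only the 200-credit cinema trip mentioned later in this post comes from our actual scheme).

```python
# A hypothetical version of the standardised reward list: each item has a
# fixed credit cost and the times at which it can be claimed.
REWARDS = {
    "pen":         {"credits": 2,   "when": ["break", "dinner", "after school"]},
    "pencil":      {"credits": 1,   "when": ["break", "dinner", "after school"]},
    "cinema trip": {"credits": 200, "when": ["after school"]},
}

def can_claim(item, student_credits, time_of_day):
    """Check a claim against the standardised list, closing the 'gaps'."""
    reward = REWARDS[item]
    return student_credits >= reward["credits"] and time_of_day in reward["when"]

print(can_claim("pen", 3, "break"))                    # True
print(can_claim("cinema trip", 150, "after school"))   # False, not enough credits
```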

Based on previous observations and experience, it was discovered that systems which mirrored the behavioural system led to very low engagement levels.  As a result, the team went to great lengths to separate this credit scheme from the behavioural system.  When the credit scheme was presented at whole-school staff training, teachers were directed not to request or remind students about their chequebooks, to be honest and fair about the student's performance during the lesson and, most importantly, not to treat the credit scheme as a punishment.

The Bank (data collection)

When the credits were returned by the student, their name, target, subject and date were entered into a spreadsheet with formulas and functions allowing it to automatically update the student's scores and the overall success graph.  This allowed the student to experience instant gratification.  At the end of each day a new vertical bar chart, listing all the targets and their frequency, was printed and displayed on the wall immediately outside the intervention team office and the staffroom.  This had a major positive impact on the students' motivation levels.  Some students could see that they were having a positive experience within the school and contributing to the school as a whole; others used it as a competition between friends.  For a few, it acted as a catalyst for a change in behaviour.
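A rough Python equivalent of the bank might look like this; the column order, file names and example row are assumptions, as the original was a spreadsheet with its own formulas and functions.

```python
# A minimal sketch of "the bank": each returned cheque becomes a row in a log,
# and the daily vertical bar chart of target frequency is rebuilt from that log.
from collections import Counter
import csv
import matplotlib.pyplot as plt

def record_credit(path, student, target, subject, date):
    """Append one returned cheque to the running log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([student, target, subject, date])

def daily_chart(path, out_png="target_frequency.png"):
    """Rebuild the bar chart of how often each target has been achieved."""
    with open(path, newline="") as f:
        counts = Counter(row[1] for row in csv.reader(f) if row)
    plt.bar(list(counts.keys()), list(counts.values()))
    plt.ylabel("Times achieved")
    plt.title("Intervention targets achieved to date")
    plt.xticks(rotation=45, ha="right")
    plt.tight_layout()
    plt.savefig(out_png)  # printed and displayed outside the office and staffroom

record_credit("bank.csv", "Student A", "Bring pen and planner", "Maths", "2005-03-14")
daily_chart("bank.csv")
```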

This approach to quantitative data gathering allowed the inclusion department to highlight how much impact the provided intervention was having within each department and across the school.  More importantly, it allowed the teaching staff to visualise whole-school intervention based on students' targets.  Recording the data also allowed the intervention team to track and change targets when a student was continually successful.

How Successful Was It

In my considered opinion, this approach to tracking and assessing intervention success was the most efficient and effective approach that we tried over the five years I was at the school.  To highlight this: of the eighty students on the intervention caseload, we took 28 (35%) of them to the cinema, a reward worth 200 credits.  This means that each of those students had achieved their target(s) 200 times, or roughly 6.6 targets per day, one each lesson plus a spare.  Multiply this by 28 students and the intervention produced 5,600 instances of success over that half-term period.
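For transparency, the arithmetic behind those figures can be checked as follows, assuming roughly thirty school days in the half term (the only assumption not stated in the post).

```python
credits_for_cinema = 200   # credits needed for the cinema trip
students_on_trip = 28      # 35% of the eighty-student caseload
school_days = 30           # assumption: roughly six five-day weeks in the half term

per_day = credits_for_cinema / school_days
total_successes = credits_for_cinema * students_on_trip

print(f"about {per_day:.1f} targets achieved per day")    # ~6.6-6.7, one per lesson plus a spare
print(f"{total_successes} instances of success in total") # 5600
```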

Now read this post: Visualising Intervention – Action Plans.

This blog is based on extracts from this book: A Practical Guide To Inclusion: A Manual For Implementation and Delivery


Part of the Visualising Education Blog Series