Visualising Intervention – Intervention Credits

How can we show that non-teacher intervention is working in school?

Note: this article is based on working practice during 2005

Part of the Visualising Education Blog Series

As a team providing personalised intervention for more than eighty students across Key Stage Three (ages 11-13), we faced the same problem as most researchers who explore this sector: a lack of quantitative evidence of student improvement.  Qualitatively, the intervention was making a difference which was observable within the learning experience.  However, this left the department (learning support) and its staff open to the usual comments regarding performance and ability.

Then my brother (Tim Campbell) showed me a kid’s pocket (cheque) book which had a different activity on each page.  The idea was that the child would request the activity, and when the activity took place they handed over the page, or cheque.  This was the epiphany: I quickly realised that this idea could be used to provide the quantitative evidence demonstrating that the intervention staff were making a difference to the children and young people across the school.

Figure 1) The mentoring cheque


How Did We Use Them

The intervention worker (mentor, learning support worker or teaching assistant) was responsible for making sure that the student had access to their (intervention) credits.  This was fairly easy, as each target was printed in batches of five: five cheques fit on one side of an A4 sheet, which allowed two chequebooks to be created in each print run.

The student received their personalised chequebook, which contained about nine cheques: three cheques for each of the student’s intervention targets, taken from their intervention plan (IBP, IEP, SEN Statement etc.).  This was a deliberate approach, as it allowed the school/department to compare students’ success against pre-defined targets, which introduced data transparency.

The number of cheques for a given target depended on how successful the student was at achieving it.  For example, a target which the student found hard to achieve would have three to five cheques, but we also left in cheques which the student could easily achieve, to ensure a continuing feeling of success.

Given the nature of the caseload students (BESD or SEN Statemented), it was decided that it was the student’s responsibility to get the cheque signed, not the teacher’s or teaching assistant’s.  To promote this idea and increase engagement with the chequebook, we introduced prizes.  It was quickly realised that the prizes had to be well defined, as students would use any gaps to get more credits or more rewards.  To overcome this, a standardised list was created with the required credit points and the times at which each item could be accessed (break, dinner, after school).  The standardised list also included items like pens and pencils for only a few credits, which worked well for students with poor organisational skills.
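The standardised list described above is essentially a lookup table: each reward has a credit cost and a set of permitted redemption times. A minimal sketch of that idea in Python (the items, costs and times are illustrative, not the school's actual list):

```python
# A standardised reward list: each item has a credit cost and the times at
# which it can be redeemed. All entries here are hypothetical examples.
rewards = {
    "pencil": {"credits": 2, "when": {"break", "dinner", "after school"}},
    "pen": {"credits": 3, "when": {"break", "dinner", "after school"}},
    "cinema trip": {"credits": 200, "when": {"after school"}},
}

def can_redeem(item, balance, time_of_day):
    """Check whether a student has enough credits for an item
    and is asking at a permitted time."""
    entry = rewards[item]
    return balance >= entry["credits"] and time_of_day in entry["when"]

print(can_redeem("pencil", 5, "break"))               # True
print(can_redeem("cinema trip", 150, "after school")) # False: not enough credits
```

Pinning both the cost and the redemption times in one place closes the "gaps" the post mentions: there is nothing left to negotiate at the point of redemption.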

Based on previous observations and experience, we found that systems which mirrored the behavioural system led to very low engagement levels.   As a result, the team went to great lengths to separate this credit scheme from the behavioural system.  When the credit scheme was presented at whole-school staff training, teachers were directed not to request or remind students about their chequebooks, to be honest and fair about each student’s performance during the lesson and, most importantly, not to treat the credit scheme as a punishment.

The Bank (data collection)

When the credits were returned by the student, their name, target, subject and date were entered in a spreadsheet whose formulas and functions automatically updated the student’s scores and the overall success graph.   This allowed the student to experience instant gratification.  At the end of each day a new vertical bar chart, listing all the targets and their frequency, was printed and displayed on the wall immediately outside the intervention team office and the staffroom.  This had a major positive impact on the students’ motivation levels.  Some students could see that they were having a positive experience within the school and that they were contributing to the school as a whole; others used it as a competition between friends.  For a few, it acted as a catalyst for a change in behaviour.
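The bank's tallying logic is simple enough to sketch in a few lines. This is a hypothetical reconstruction, not the original spreadsheet: each returned cheque becomes one record, per-student totals drive the scores, and per-target frequencies are the data behind the daily bar chart (names, targets and dates below are invented):

```python
from collections import Counter
from datetime import date

# Each returned cheque becomes one record: (student, target, subject, date).
# All example records here are hypothetical.
records = [
    ("Alex", "Stays on task for 10 minutes", "Maths", date(2005, 3, 1)),
    ("Alex", "Asks for help appropriately", "English", date(2005, 3, 1)),
    ("Sam", "Stays on task for 10 minutes", "Science", date(2005, 3, 2)),
]

# Per-student credit totals (the "scores" the spreadsheet updated).
scores = Counter(student for student, _, _, _ in records)

# Per-target frequencies (the data behind the daily bar chart).
target_freq = Counter(target for _, target, _, _ in records)

print(scores)       # Counter({'Alex': 2, 'Sam': 1})
print(target_freq)  # Counter({'Stays on task for 10 minutes': 2, ...})
```

Because both tallies derive from the same record list, entering one cheque updates a student's score and the whole-school chart in a single step, which is what made the instant feedback possible.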

This approach to quantitative data gathering allowed the inclusion department to highlight how much impact the provided intervention was having within each department and across the school.  More importantly, it allowed the teaching staff to visualise whole-school intervention based on students’ targets.  Recording the data also allowed the intervention team to track targets and change them when the student was continually successful.

How Successful Was It

In my considered opinion, this approach to tracking and assessing intervention success was the most efficient and effective approach that we tried over the five years I was at the school.  To highlight this: of the eighty students on the intervention caseload, we took 28 (35%) of them to the cinema, a reward worth 200 credits.  This means that each student had to achieve their target(s) 200 times, or about 6.6 targets per day: one each lesson plus a spare.  When you multiply this by 28 students, the intervention produced 5,600 instances of success over that half-term period.

Now read this post:  Visualising Intervention – Action Plan.

This blog is based on extracts from this book: A Practical Guide To Inclusion: A Manual For Implementation and Delivery
