What CSAIF funded: Team Up were awarded £180,000 to expand their reach so that they could help 3,000 children, supported by 20 university societies, who will collectively give 36,000 hours of social action each year. Read the full impact evaluation.

About the evaluation

Level on Standards: Level 1 - they can explain what they do and why it matters, logically, coherently and convincingly.

Evaluator: N/A (Team Up conducted their evaluation in-house)

Aim: The aim of the evaluation was to assess the effectiveness of Team Up’s tutoring programme on pupil grades.

Key findings:

  • Grades for Team Up pupils increased by 1.96 sublevels over one academic year. (This compares favourably with broadly comparable national data for KS3; however, no comparable data was available for KS4, in which a significant proportion of Team Up pupils are.)
  • 80% of pupils indicated they considerably enjoyed the tutoring programme (average score 5.5 out of 7), and 79% of pupils felt the tutoring was helpful (5.43 out of 7).
  • 36% of tutor feedback forms were returned; of these, 95% rated pupils as making good to excellent progress.

Methodology: Primarily a pre-post design, using administrative data on pupils’ sublevel progress from 19 schools. Also some post-session surveys, focus groups and session observations.

Why is this a Level 1 Evaluation?

Measuring progress against sublevels does represent pre-post data, and this data does show a positive change in outcome (1.96 sublevels). However, because nearly all active pupils make some positive progress against sublevels regardless of tutoring, reaching Level 2 would require additional data indicating that 1.96 sublevels is greater than it would likely have been without Team Up.

The report does provide comparison data, survey data on whether pupils found the tutoring helpful, and some qualitative quotes, but at least one of these sources would need to be more robust to qualify for Level 2.

About the evidence journey

Progress: Though this evaluation has not progressed Team Up to a higher Level on the Standards of Evidence, it has put in place processes to generate a very large amount of quantitative data, which will become a powerful source of evidence once paired with the right survey, qualitative or comparison data.

Lessons learned: Team Up have learned a huge amount about how better to monitor and evaluate the impact of the programme. This has led to significant changes to their programme design, including more effective:

  • quantitative testing, through the development of a Team Up baseline and progress test
  • qualitative testing, through distance-travelled questionnaires
  • weekly progress monitoring, through standardised Assessment for Learning tests at the end of each session
  • attribution of impact, through control data

Next steps: Team Up's next steps will be to collect and analyse this data over the 2016-2017 programmes in order to report more rigorous evidence of their impact.