
What CSAIF funded: d2 Digital was awarded £132,248 (including £10,000 for evaluation) to build and trial a peer mentoring function within an existing digital platform that supports behaviour change in people aiming to be alcohol free or to maintain controlled alcohol use. The platform sends clients regular text messages of encouragement to stick to their recovery plans and goals. Depending on their response to the text message, the client receives either a personalised motivational reply or a call from a volunteer, peer mentor or professional.
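The reply-routing step described above can be sketched in a few lines. This is a hypothetical illustration only: the keyword list, function name and routing labels are assumptions, not d2 Digital's actual implementation.

```python
# Illustrative sketch of routing a client's SMS reply:
# either an automated motivational message, or escalation
# to a call from a volunteer, peer mentor or professional.
# Keywords and labels are assumptions for illustration.

STRUGGLING_KEYWORDS = {"struggling", "relapse", "craving", "help", "drank"}

def triage_reply(reply_text: str) -> str:
    """Decide how the platform responds to a client's text reply."""
    words = set(reply_text.lower().split())
    if words & STRUGGLING_KEYWORDS:
        return "escalate_to_call"
    return "personalised_motivational_reply"
```

A real system would of course use richer signals than keyword matching (for example, the client's history or a trained classifier), but the two-way routing structure is the point here.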

The grant allowed the team to support 240 service users across East Lancashire and West Kent by mobilising a core team of volunteers, peer mentors and staff who managed the platform. Forty peer mentors were recruited and trained on the system during the grant period. View the full impact evaluation.

About the evaluation

Level on Standards: Level 2 - they have captured data that shows positive change, but cannot confirm that they caused it.

Evaluator: The RSA

Aim: To demonstrate that an SMS Digital Behaviour Change System (Evie) can remotely support an individual's potential for achieving longer-term change and help reduce re-referrals to structured services; and to examine the learning from implementation so as to provide commissioners and service providers with a guide to the successful implementation of any digital intervention.

Key findings:

  1. Re-presentation rates are positive (on average only 1 per cent) and the qualitative data from the focus group is encouraging.
  2. The impact on other outcomes is however still unknown due to methodological challenges.

Methodology:

  • Quasi-pre-post re-presentation data
  • Focus group data (not pre-post)
  • Evidence from a previous evaluation which uses a comparison group (instead of pre-post data)

Why is it a Level 2 Evaluation?

Practical challenges in the evaluation mean that the Level 2 validation depends on being read in combination with the earlier, more robust evaluation of a similar version of the programme (one without social action). In addition, D2D has been very transparent about the changes to the evaluation and their causes, has made the best of the data that could still be collected, and has drawn links between the qualitative and the limited quantitative data.

About the evidence journey

Progress: The original evaluation plan went through a number of changes from the outset, with a revised evaluation plan signed off in the middle of the project. This was primarily because the original quantitative evaluation methods proved infeasible as the project progressed.

Lessons learned:

  • It is important that all parties commit at the beginning of the project to the required tasks. For example, if questionnaires and semi-structured interviews are part of the evaluation, it needs to be agreed who will conduct them and all of the work around them, and this must not be left until it is too late.
  • It is important not to be too ambitious about what can be evaluated, both in the numbers recruited and in the approach taken; for example, realistically assessing the possibility of recruiting a control/comparison group.

Next steps:

  • Before a new technology or way of working is implemented, develop, with all involved parties, a clear Theory of Change identifying the ultimate goal and the steps needed to achieve it. These steps need to be easily monitored and evaluated along the way, with the opportunity to change the process or procedure if necessary.
  • Where individuals and staff members used the system, they reported that they received a valuable service and an enhancement to their overall experience. However, the most valuable lesson learnt was that change management needs to happen across all stakeholders, from commissioners to service managers, keyworkers and volunteers; this will be taken forward in any further roll-out of the technology.