
What works in training? Six lessons for developing professionals

The UK training industry is big business. The private market is worth around £3 billion, with some 12,300 providers. But how much of it does any good? I’m sure you’ve had that sinking feeling on leaving your PowerPoint course or the Manage-Your-Difficult-Boss session: a sense that all that fresh knowledge will fizzle away as soon as you step back into the office.

Those bad feelings are backed up by research. Being talked at by gurus in one-off training days does little good. But trainers - and their captive pupils - can take heart from the available evidence, which points to what might work best. Below are six key lessons for training professionals, based on systematic reviews of research and rapid evidence assessments:

1. Don’t just talk at people

As any undergraduate who has dozed off during a lecture will tell you, the didactic ‘chalk and talk’ model rarely works. You need much deeper interactions - time to really inquire and grapple with the issues.

2. Beware standalone classroom training

You need to embed learning in everyday practice. The evidence on classroom-based training in isolation from the real world shows it’s not very effective. Yes, your knowledge may improve a little from a one-off course. But not by much, and it will not build lasting skills or change behaviour. A systematic review showed that training needs to be integrated into routine practice. Take the example of medics: teaching should be part of hospital ward rounds or case discussions, not lost in the classroom.

3. Make it collaborative

Trainees need to work together. They must join groups and develop their skills together as peers. Evidence from studies of school teachers suggests bringing in outside experts and encouraging peer support. Don’t go it alone.

4. Make sure you do follow-up

It needs to be an ongoing relationship, not just one-off training days. As the name implies, it should be Continuous Professional Development. There’s no one-size-fits-all model for this follow-up, but there must be a “rhythm” to activities, through multiple instances of ongoing support, according to all the research studies in a review of teacher education.

5. Build a sense of purpose

It doesn’t matter whether your trainees have signed up as volunteers or been conscripted by their bosses and the HR department. What matters is a positive learning culture. Achieving a shared sense of purpose is important for success, according to a review of evidence by the Teacher Development Trust (PDF p.5). Don’t assume that the sense of purpose is ready-made. You may have to go out and build it.

6. Think about using simulation-based learning

It’s still early days evidence-wise, but simulation-based training may have some advantages over more traditional classroom methods. A systematic review of nursing education found good results in six of the twelve included studies. Simulation training - acting out real-life scenarios - achieved gains in a raft of areas, such as better knowledge, improved critical thinking, and greater work satisfaction and confidence. The evidence isn’t totally persuasive, but it looks promising.

But a word of warning about all this evidence: there are some big gaps. It’s interesting to find, for instance, that there’s still little evidence about online or virtual professional learning. That’s not to say that it’s failing; the evidence simply isn’t there. Educational companies like Pearson should be applauded for looking at whether their products - including online ones - can show evidence of educational benefits, not just their share price on the FTSE 100.

It’s not just the gaps that are a problem. Where there is evidence, little of it is strong. A research review for the College of Policing said that “the majority of the research evidence identified in this review is inconclusive. There is a lack of robust evidence on the different training approaches outlined.” (PDF p.5). Only in health is the research half-decent, according to the College of Policing.

Should we care about the evidence behind training? Well, yes we should, if we want to make the most of our time on training. We’re too busy to spend hours online or whole days away on courses that fail to change us (unless you hate your job, then you might love the chance to escape). Be a smart consumer – and ask for evidence behind the trainers’ claims.

Also, for us, there’s some self-interest in knowing about the evidence. We’re running our own training at the Alliance for Useful Evidence - like our charity leader masterclasses with bodies such as ACEVO, ACOSVO and Evaluation Support Scotland, as well as for Grade 7 Policy Professionals in Whitehall and the Welsh Government, and for local authority chief executives with SOLACE.

The masterclasses are popular - but we must ask whether they work. It’s great to read the glowing responses in our feedback forms. But we teach about biases in questionnaires - can those evaluation scores be trusted? We need to be ruthless with ourselves and think about fresh ways of engaging audiences, as Geoff Mulgan, CEO of Nesta, has argued in a recent paper on what makes meaningful meetings.

The best we can do is to check feedback against the wider research literature. It’s a clear case of practising what we preach: learn from the best available evidence when designing our training. This blog has set out some of the research I have found most useful in doing that design work (it took two years to set up the masterclass programme). If there are any systematic reviews that I have missed, do let me know.

Selection of useful evidence:

Evidence Reviews: What Works in Training, Behaviour Change and Implementing Guidance? College of Policing, 2015.

Developing Great Teaching: Lessons from the International Reviews into Effective Professional Development. Teacher Development Trust, 2015.

Coomarasamy, A. and Khan, K. S. (2004). What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. British Medical Journal, 329.

Cant, R. P. and Cooper, S. J. (2010). Simulation-based learning in nurse education: systematic review. Journal of Advanced Nursing, 66(1), 3-15.

Goodman, J. S. and O’Brien (2012). Teaching and learning using evidence-based principles. In D. M. Rousseau (Ed.), The Handbook of Evidence-Based Management: Companies, Classrooms, and Research (pp. 309-336). New York: Oxford University Press.

Note that this blog first appeared on the Alliance for Useful Evidence website


Jonathan Breckon


Director, Alliance for Useful Evidence

Jonathan was the Director of the Alliance for Useful Evidence from 2012 to 2021.