Can we mitigate gender bias in innovation funding?

Anyone who’s sifted through applications knows how easy it is to feel paralysed by the responsibility of awarding grants for public sector innovation.

No one wants to be responsible for getting between that idea and the people who really need it.

We know from research into academic grant funding that even people who believe they are making unbiased choices can be unconsciously influenced by irrelevant factors, like applicant gender. Harvard Business School’s Professor Iris Bohnet has written about the traditional interview process and how it can easily weed out the best rather than helping you find them.

Organisations such as Applied are trying to counter this in the hiring process by providing platforms that help to devise more inclusive job descriptions, introduce bias-free scoring systems and eliminate gendered language from applications.

As we thought about it, we kept returning to the same question: is selecting a job candidate like selecting an innovation project?

We’re looking for exciting people with ideas we’ve never heard before - how do you write a person specification for that?

Of course, there are differences; in innovation grant-giving, the field is very open, the limits of the competition are deliberately and rightly undefined, and we’re much more interested in teams (and cohorts of teams) than individuals.

There are a lot of factors at play when making funding decisions, but a systematic process should be a help rather than a hindrance. We thought that the Innovation Skills Framework might make a good person (or perhaps team) specification.

From a small amount of data from the selection process for our programme, Innovate to Save, we could see that majority-female interview teams made it into the cohort at a much lower rate than majority-male teams. So we started to think about how Professor Bohnet’s suggestions might apply in a grant-making, rather than hiring, process.


What did we do?

When it comes to hiring, there are six specific recommendations for designing interview processes which encourage systematic and deliberative decision-making. We considered these, and tried to apply them to our own selection process.

Of the six recommended actions, we adopted four as part of our selection process. Those were:

  • Asking every project the same structured questions
  • Having one panel member at a time question each project directly
  • Scoring answers before moving on to the next question, so scores don’t influence each other
  • Focusing on the skills teams will need as part of the programme


What happened?

Both of our programme selection processes were run with small numbers. When comparing two such small groups, where other factors - including panel membership, the interviewees and the projects themselves - also changed, it’s not possible to reasonably attribute any changes in grant-making to this bias mitigation project.

That said, there were improved patterns; the proportion of majority female teams funded this time was much closer to the proportion of majority male teams being funded.

Transforming best practice recommendations into real-world actions is always interesting; we found that it was necessary to make compromises in designing the process, and that running the experimental process led to some interesting challenges for the selection panel when compared to a more traditional interview approach.

Asking every project exactly the same questions is scrupulously fair - but in practice, it meant the person asking was unable to probe for subtleties, so projects’ answers were perhaps not as rich as in previous rounds.

One recommendation - taking notes on answers, scoring and logging behavioural characteristics all at the same time - proved simply impractical within the time we had for each interview, so it was dropped from our process after a practice run ahead of interview day.

Making such fundamental changes to the design of our selection process increased the amount of administration and organising necessary to make the process run smoothly. Most projects told us that they enjoyed the experience, but one or two were much less comfortable with the situation.

Despite the challenges thrown up by the recommendations we implemented, we are very proud to have purposefully designed a selection process that seeks to be as inclusive as possible. We intend to continue to take positive steps in this area, and hope that our experiences might help others improve their own practice.


Angharad Dalton

Programme Manager, Y Lab

Angharad is the Programme Manager for Innovate to Save

Rob Callaghan

Rob Callaghan is on secondment as Research Associate at Y Lab, the Public Services Innovation Lab for Wales.