Promoting experimentation in government – learning from Canada’s experience

Back in 2015, the Canadian Prime Minister publicly released, for the first time, his instructions to all his ministers. Among these instructions, known as mandate letters, one was particularly relevant to experimentation. It was addressed to the President of the Treasury Board of Canada, and it stated:

"You should work with your colleagues to ensure that they are devoting a fixed percentage of program funds to experimenting with new approaches to existing problems and measuring the impact of their programs."

The mandate was further clarified by a subsequent directive on experimentation produced by the Treasury Board of Canada Secretariat (TBS) and Privy Council Office (PCO), which makes an explicit link between experimentation and more effective policy making.

A few weeks back we were lucky enough to compare notes with Myra Latendresse-Drapeau and Dan Monafu, who are part of the team within TBS that is in charge of turning the mandate to experiment into a reality across government.

The visit was extremely timely for us as we are working to develop a framework for government experimentation in the UAE.

We were inspired by the Finnish Government’s Office of Experimentation (one of the exhibitors at last year’s Edge of Government), as well as by the Danish Design Center event on policy experimentation by design.

As usual, nothing beats learning from practitioners and Dan and Myra were extremely generous in sharing their insights and lessons 'from the trenches'.

Here are some of the elements that particularly stood out for us from our conversation:

1. Translating the experimentation intent into practice: Traditionally, government work is not often the place for organic, incremental learning from practice. Rather, government innovators are typically in the situation of someone who is given the command of a plane that has already taken off and needs to find a landing strip (with little guidance to go by).

Under those circumstances, translating the intent of a new policy (in this case, on experimentation) requires ongoing (re)discovery (what was the original rationale?), reframing (how does the political frame translate into the civil servants’ frame?) and validation across different parts of the bureaucracy.

Implementation is not a straight line and requires careful stewardship. Here are a couple of examples that stood out for us:

  • Budget allocation for experimentation: How does the original intent of allocating a fixed percentage of funds to experimentation translate into practice? This is not exclusively a question of the availability of finances, but perhaps first and foremost one of capabilities. As usual, 'government' is not a monolith and some departments are more ready than others to adopt a new approach like experimentation (we faced a similar situation in the UAE when a percentage of the national budget was allocated to innovation). We were very impressed, for example, by the experimentation framework and guidance tools that were shared with us by Jason Pearman from NRCan’s Innovation Hub. So how can one match the original intent of fostering experimentation across the board with the reality of different speeds and skills for implementation? Here is where the role of 'translators' like Dan and Myra is crucial. In their case, the government landed on a flexible interpretation of the 'fixed budget allocation' mandate, so that the outlier departments are not slowed down while those with less capacity are given the space to gradually internalise the experimentation mantra.
  • Upstream vs. downstream reporting on experimentation progress: Dan and Myra shared another familiar implementation conundrum when they talked about the challenge of coming up with a meaningful framework for departments to report on their progress on experimentation. We faced a similar situation with our innovation KPIs: the tension between upstream reporting (indicators that are produced to 'feed the bureaucratic beast') and downstream data (which is first and foremost useful for those working in the trenches). Their preliminary analysis of the first round of reporting around the experimentation commitment (through a mechanism called Departmental Plans, which each department and agency in the federal system must produce yearly) showed that approximately two-thirds of departments referenced experimentation, although many of those have not yet identified a fixed percentage for experimentation, and just over a quarter (26 per cent) provided specific examples of experimentation. This is a very encouraging result for year one, and one that provides a much needed baseline. However, it needs to be put into context to manage the expectations of those who assume implementation will be a fairly linear affair. And how does one translate this 'upstream' indicator into useful guidance for 'downstream' work? By their own account, there remain different understandings of the relationship between experimentation and innovation, which shows the need to define terms and provide concrete examples.

2. A straightforward, operational definition of what an experiment is: precisely because of the above complexities, coming up with a straightforward, non-intimidating definition of what constitutes an experiment is paramount in socialising the concept. The definition that the government team came up with has the refreshing simplicity of Astro Teller’s three principles of a good experiment. An activity qualifies as an experiment if:

  • It lends itself to rigorous measurement – in other words, it must have a comparator
  • It allows integration of the learnings into the next iteration of a project/activity
  • It is situated in the context of a shorter programme cycle, which means you have the opportunity to act on your learning and evaluate as you go.

Stated more broadly, experimentation to them refers to activities that seek to explore, test and compare the effects and impacts (i.e. what works) of policies, interventions and approaches in order to inform evidence-based decision making.

To get at what works, it is important to continually incorporate the unique insights and evidence generated by exploring (determining what works through diverse sources of information, experiences and perspectives) and by experimenting (determining what works by comparing interventions through experimental design).

What stood out from this definition is the strong emphasis on organisational learning (experimentation as a search for new value), as well as the understanding that an experimental approach calls for more systemic changes - to the planning process, but also, for instance, to budgeting (with outcome-based budgeting being more conducive to creating the space for exploring new approaches).

3. What support for experimenters? Dan and Myra described a three-tier approach to supporting experimenters: a community of practice at the working level; an online portal with practical guidance and tools; and, intriguingly, a “table on experimentation” comprising senior management executives at the Assistant Deputy Minister level from about 10 different departments.

The “table” will be meeting regularly to review progress and is an important instrument for socialising the concept of experimentation across government, as well as a venue to facilitate the above-mentioned ongoing process of translation from political intent to the realities of the bureaucracy.

The Finnish government set up a “godparents of experimentation” group with the same purpose. These are interesting models for us to take inspiration from as we reflect on the best approach for making bottom-up and top-down experimentation meet in the UAE context.

As more governments embrace the concept of experimentation, learning from the early adopters and developing a better appreciation of the systemic implications of an experimental approach to policy making will become increasingly important.

We are extremely grateful to Dan and Myra for an opportunity to learn from their experience and we look forward to continuing the dialogue at IGL 2017.

Giulio Quaggiotto and Myra Latendresse-Drapeau will be speaking at IGL2017 on Tuesday 13 June in a panel discussion on Experimental Government. Get your tickets now or follow the hashtag #IGL2017.

Authors

Shatha Alhashmi

Shatha works for the UAE's Prime Minister’s Office as a Director in the Mohammed Bin Rashid Centre for Government Innovation.

Giulio Quaggiotto

Senior Programme Manager in Innovation Skills

Giulio was a Senior Programme Manager in the Innovation Skills team, responsible for advising international development and public sector organisations on the implementation of their i…
