“It’s too soon to do an evaluation.”
I heard this a lot when I joined Innovate UK in 2013 to establish an in-house economics function that would, in time, put us on the map of evaluating public investment in science, technology, and innovation.
But is it ever too soon to start finding out whether your programmes are being delivered in the right way and whether public money is being well spent?
At the time, the state of the art was modest. I attended an international conference that year on the evaluation of public support for science, technology and innovation. I was surprised that evaluations often lacked a control group against which impact could be compared, and relied on unverified data, often from small samples of companies.
But we’ve seen huge leaps in the sophistication and quality of evaluation since then.
I love the challenge of understanding impact
Before moving into innovation, I’d been at the Department for Work and Pensions, doing appraisal and evaluation of employment programmes. We had the luxury of millions of data points, every benefits transaction ever made. This allowed precise analysis of the effect of any mooted change in policy.
We had much less data at Innovate UK, and there were a huge number of unknowns about the businesses we supported. Incomplete and unobserved data really did mean we were looking for faint signals in the noise.
I love the challenge of understanding the true impacts of what we do, trying new techniques, working across organisations to expand our capabilities, and looking to improve the quality and usefulness of our evaluations. We’ve come a long way.
We introduced control groups, comparing changes in performance in supported businesses with those in similar businesses we hadn’t funded. And we started designing evaluations into programme lifecycles, so we knew upfront what data we needed and how we would get it, and how we would track progress over time.
It’s never too soon to start an evaluation.
Getting great results you can trust
Innovation is incredibly important, and never more so than now. It is the key driver of productivity growth, which in turn determines living standards for all of us. It also provides solutions to global challenges such as climate change, all whilst growing the economy and supporting high-skilled jobs.
But to make sure we’re making the most of our investments, we need evaluations to understand our impact.
A good example was our recent evaluation of the Small Business Research Initiative (SBRI), a programme which helps public sector organisations solve challenges by connecting them with innovative businesses. We found the benefits to businesses awarded SBRI funding were up to an impressive 4 times the level of public sector investment.
Our evaluation of the Biomedical Catalyst – a partnership between Innovate UK and the Medical Research Council to support the most innovative life sciences opportunities – showed up to £5 of private sector investment for every pound invested by the public sector. We also looked beyond the headlines, showing the extent of technological progress achieved, investment leveraged, and intellectual property developed.
Pushing the boundaries on evaluation
Our evaluation of ICURe, a programme supporting the commercialisation of university research, used real-world valuations of spin-out companies formed through the programme to understand the economic value created from our investment.
These valuations not only showed a strong return on public investment but also clear evidence that ICURe spin-outs were far more successful in commercial terms than the average. Nearly half secured equity funding, raising 5 times more than teams not accepted onto the programme.
I always try to push the boundaries a little in a new evaluation. But it’s important to get the core right. We’ve gradually earned a reputation for robust, insightful evaluation that informs our approach and makes the case for what we do here at Innovate UK.
Can you have a single figure for everything?
Searching for a single figure that captures the impact of a programme is a constant challenge. It can be immensely powerful to find a clear message that cuts through the complexity and sets out what we deliver with taxpayers’ money.
We have also sought to summarise the impact of Innovate UK as a whole, but that is where the challenges really mount.
One approach is through economic modelling.
A few years ago, we commissioned an analysis of all companies that had received Innovate UK grants. We matched them to a group of similar companies and tracked them over several years.
Companies supported by us grew their turnover on average by £5 million to £10 million, and employment by 30 to 40 employees, after 2 to 4 years, compared to similar companies not supported by Innovate UK. We’re now working on an updated approach with partners at the Innovation Caucus.
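The matched-comparison approach described above can be sketched very roughly in code. This is a minimal illustration with made-up figures, not Innovate UK’s actual model or data:

```python
# Minimal sketch of a matched-comparison (difference-in-differences style)
# impact estimate. All firms and figures here are illustrative only.

def estimate_impact(treated, controls):
    """Average outcome growth of supported firms minus that of matched controls."""
    def avg_growth(firms):
        return sum(f["after"] - f["before"] for f in firms) / len(firms)
    return avg_growth(treated) - avg_growth(controls)

# Hypothetical turnover (£m) before support and a few years after
supported = [{"before": 2.0, "after": 9.5}, {"before": 3.0, "after": 11.0}]
matched   = [{"before": 2.1, "after": 4.0}, {"before": 2.9, "after": 5.2}]

# The difference between the two groups' average growth is the estimated impact
print(estimate_impact(supported, matched))
```

In practice this kind of analysis uses statistical matching on firm characteristics and regression-based estimation, but the core idea is the same: the counterfactual comes from the control group’s growth, not from assuming supported firms would otherwise have stood still.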
The vital information is in the detail
Studies like these are difficult. The data is messy and comes with a significant lag, so it can only tell you what happened a few years ago. They also don’t tell us how we impact growth, or how to better design or deliver our programmes. That’s where the power of programme-level evaluation comes in.
And that’s the real trade-off.
You can have a neat number that fits in a bullet point but hides all the detail underneath. Or you can have nuanced evaluations with a strong narrative around impact and value.
We’ll continue to pursue the magic number that sums it all up. But it will be the detail in the in-depth programme evaluations that provides the vital information we can use to improve our delivery, shape our processes, and understand where we’re adding the most value.
It just won’t always be easy to make a headline out of it.