This morning my son asked me the meaning of the word ‘efficiency’—he thought it was something to do with ‘importance’. In a way, he was right. Efficiency is all about working out what is most important and prioritising it. It’s about not wasting time on the things that don’t matter.
Efficiency is key to delivering services and to making the most of available resources. How to have more impact without needing more money, or, increasingly, how to have the same impact with less money, is on many voluntary sector professionals’ minds. It was also a core theme of our annual conference, NPC Ignites.
Another key theme across the conference talks was that innovation is an important route to greater efficiency, whether through digital opportunities or simply by thinking about things differently. But as one of our speakers, Baroness Barbara Young, reminded us, there are plenty of fantastic innovations already in existence, many of which aren’t used when they could be. Doing good better is as much about spreading and scaling up existing innovations as it is about finding new approaches.
The same applies to evaluation. My colleagues and I are constantly encouraging and supporting organisations to make sensible use of existing evidence (rather than collect data they don’t need, or set up potentially inefficient services), as well as to collect and analyse data efficiently.
At NPC Ignites, I chaired our panel on impact measurement with Jane Lewis, innovation and improvement specialist; Sarah Mistry from Bond; and Pippa Knott from the Centre for Youth Impact. We talked about more innovative and efficient practice at each stage of the impact cycle.
Jane emphasised the importance of paying attention to implementation when using research to inform practice. An intervention may work brilliantly in one context, but really good data is needed on the population it worked with, their needs, and the context it worked in, to work out if and how it could be adapted or scaled. Stand-alone evaluation data may otherwise be a waste of time.
But even if we’ve got good evidence and we know how to interpret it, how easy is it for more people to access this knowledge? The Centre for Youth Impact are focused on building good infrastructure to help cascade knowledge and learning to all parts of the youth sector. We don’t want everyone having to seek everything out on their own, or to risk duplication of effort.
And once you have your evidence-based intervention, how do you best approach looking at your own impact? This is where a shared approach to outcomes and potential measures comes into its own, and is something Bond have worked on for the international development sector. Shared measurement is the efficient evaluator’s dream. It is not the easiest thing to implement: it needs initial time, investment and commitment, and the will to overcome political barriers. But the value of people putting their heads together to decide what should be measured and how, and then making this information available to everyone, is well worth the initial hurdles.
We think efficient evaluation is the way forward, and we’ll be looking for good examples in our upcoming research on the best innovations in impact measurement.