
Traps on the evaluation journey for commissioners & providers

By Anne Kazimirski 20 May 2015

This blog was first published by Open Forum Events in advance of Anne speaking at the Managing Public Sector Outcomes: Making a Real Difference conference.

Evaluation is hard—there’s no denying it. It’s hard to know how to approach it, it’s hard to find time for it, and it’s hard to get agreement from all stakeholders on what data to collect. At the same time, it’s easy to waste time, it’s easy to collect data on the wrong thing, and it’s easy to turn evaluation into a tick-box exercise.

Being aware of the traps can help stop you falling into them—here are my top three to avoid:

1. Forcing squares into circles

As a provider, have you ever had to twist the stated purpose of the services you provide to fit the criteria of a funding stream? Or as a commissioner, have you ever had to ask for data that you know has little significance, just to satisfy administrative requirements?

This is unfortunately more common than you may think. Take for example a domestic violence service for women fleeing a violent situation. You’d expect this service to report on their service users’ feelings of safety, their mental health and their confidence levels. I know one that was instead asked to report on reduction in smoking!

Of course rules and remit definitions are necessary to manage commissioning and funding, but my plea is: please don’t let it get this far. Whatever your role in the chain, please challenge what doesn’t make sense as often as you can.

2. Collecting arbitrary data

It is tempting to cut corners in evaluation, borrowing questions used on another programme, or improvising with a few that seem to vaguely fit the bill. Many organisations jump to this stage without questioning what they’re trying to achieve, or what they’re going to do with the data once they have it. Thinking through what your expected outcomes are, and how your activities are supposed to achieve them (ie, developing a theory of change) is a pretty essential step towards working out what data you should collect and why.

3. Working alone

I have yet to come across a unique organisation trying to achieve something that no one else has tried to achieve. The way it works may be new, but the outcomes aren’t. So I can guarantee that someone out there has already attempted to measure the same outcomes as you (though I can’t guarantee they managed it). It’s not easy, but it’s always worth trying to uncover other people’s measurement approaches or questionnaires, such as those on the Inspiring Impact programme’s Impact Hub. Better still, if you are a provider and can set competition aside for a minute, consider working with your peers to agree on a shared approach. If you are a funder or commissioner, you’re in a prime position to bring similar organisations together.

Evaluation is a journey; you’ll take twists and turns along the way. But watch out for the traps and you’ll stand a far better chance of measuring your impact in a way that allows you to become more effective.
