
What does open mean?

By Marina Svistak 22 April 2013

A couple of weeks ago I listened to a presentation on iTunes U by a venture capitalist from Silicon Valley. He gave several reasons why he decides whether or not to invest in a start-up, one of which particularly stuck with me. Paraphrasing slightly, the message was that he does not invest unless the company is able to tell when it has failed and what it has learnt from that failure. But how does one learn? Well, first and foremost, you need access to data.

Fast-forwarding a few weeks, I found myself searching Twitter for the latest initiatives in 'open data' (something I'm personally interested in) to see how access to data can give us knowledge and improve our decision-making, enabling us to learn from our failures and replicate our successes. I came across all sorts of different projects on politics and the economy, science and culture, and even down-to-earth practicalities like the location of the nearest public toilets. But I found little from the charity sector. Where do we stand on sharing data?

There are two sides to every coin: in this case, financial and impact data. The former is more clear-cut, driven by the requirements of the Charity Commission and supported by initiatives in the space such as Open Charities and Charity Financials. In many ways this makes data analysis easier and encourages smaller charities to take part and benefit from benchmarking. The process of calculating your turnover and expenditure is also quite straightforward. So, what about impact data? Are we ready to share this kind of data for it to be dissected, analysed and studied under a microscope?

A few charities already have, but it is not happening on a big scale. In 2007, WRVS, which uses volunteers to provide practical help to older people, was struggling to work out whether it was making a difference to the lives of its beneficiaries. However, as Lynne Berry, the then chief executive and a strong advocate of impact measurement, said: 'Impact evaluation helped us think about what we were doing that doesn't achieve our mission'. This resulted in the reallocation of resources and in cutting certain types of services to focus on those with the biggest impact potential. But WRVS is one of only a few great examples of an organisation that has fully embraced data collection.

This is partly to do with the way data is collected, which can make it questionable, subjective and prone to weaknesses. These include asking leading questions in surveys, using sample sizes that are too small, and mismatching outcomes with indicators. NPC is a passionate proponent of standardised and rigorous methodologies, but we also know how hard it is for some charities to implement such processes.

So what to do? Change happens in small incremental steps. Before we can start sharing impact data, we need to improve our ways of collecting it. And this change begins at home.

In May, NPC is running a conference together with Third Sector on Measuring & Evaluating Outcomes in Practice. The objective is to share knowledge and practical tips to help charities become better at measuring their impact. Only through having an honest and open discussion about impact data will we be able to make that important step forward.