28 October 2011
Working out whether organisations, policies or programmes are actually delivering anything of value is surely part of any sensible approach to improving our society. That goes for government schemes, local government investments and the work of the voluntary and social enterprise sector.
In the jargon, this is the world of ‘impact measurement’, ‘outcome focus’ and ‘evaluation’. If you can improve the way that money flows through the system from funders of all kinds to the places and organisations that use it best, then everyone is better off. That kind of approach is what New Philanthropy Capital, which I now head, is focused on: working all through the value chain, from the charitable trusts, grant-making bodies, commissioners and philanthropists at the ‘top’ down to the charities delivering services to beneficiaries. And small gains everywhere on this agenda will pay massive dividends.
But sometimes we have to be careful with evaluation. I was reminded of that over the last few weeks, as people have been talking about the creation, successes and failures of Sure Start, much of the reflection prompted by a new book by one of its driving forces, Naomi Eisenstadt. There are lots of lessons to be learned from its inception and progress, and the debates that still rage remain important ones. Was its aim primarily to help mothers bond with their very young children, or to help them go out to work? Or was it, rather differently, to allow the joining up of services in new public institutions? Was it right to let Sure Start become a service for all, rather than one that excluded the middle classes who really did not need it? And was it ever the same once local government got its hands on it and the ring fence began to go?
Either way, as speakers at a recent Institute for Government discussion made clear, while in 1997 there was very little around on pre-school or early years provision, by 2010 every party had committed itself – to at least some degree – to continuing Sure Start. Not bad really – even if the cuts remain a big threat.
But in 2005, when I was a special adviser at the Education Department, we received the first of the big evaluations of Sure Start. On the face of it, the evaluation was not at all bad – plenty of successes for a scheme that had not been going that long and was still very heterogeneous. But its headline seemed to suggest that some of the children most in need were being failed by Sure Start, and this was leapt upon by parts of the press – and even by some within government who had always felt the programme was not targeted enough.
In fact, the research did not say that such children and their families had had negative experiences of Sure Start; it was more that they had not actually been using it. But that did not prevent a pretty sticky period for this high-profile and expensive programme.
But luckily ministers held their nerve. They did not junk the programme. They did learn from the evaluation, and improved the scheme in a number of ways. Controversially – and relevant to some versions of the crude localism around today – this included recognising that some of the services provided, often in very much parent-led centres, appeared to be of little value in terms of outcomes, so new guidance pushed harder towards the activities that the evaluation showed paid off. Crucially, there was more focus on outreach to those we really wanted to use Sure Start. And sure enough, the next evaluation, a few years later, showed a lot more success.
So what are the lessons? First, you do have to evaluate – and there is a worry that this has slipped down the priority list since the election. Localism, community empowerment, or even the use of the voluntary sector in the Big Society is no excuse for not knowing what you are doing, not trying to measure whether it works, and not improving it on the basis of what you find.
And then you have to be grown up when you get the results. Don’t hide them. Don’t ignore them. Act on them – but don’t panic if they are not everything you hoped for.
Above all, do not assume you will get a 100% success rate. One very good recent evaluation showed that the Education Maintenance Allowance (EMA) was highly cost-effective in encouraging 16-year-olds from poorer families to stay on at school. Of course, a lot of the spending was deadweight (i.e. it paid for staying-on that would have happened anyway), but very few policies can target only genuinely additional activity. The real question is how cost-effectively the programme reaches the people you really want: if, say, nine out of ten recipients would have stayed on regardless, then each additional stayer costs ten times the headline cost per recipient, and it is that figure you should judge. EMA did well on that test, but those deadweight figures were still used to justify ending the scheme.
Public policy, and all organisations hoping to create change, need evaluation – and that evaluation must be heeded. But evaluation also requires a mature response if we are to advance sensibly.
This blog was originally published on The MJ, the online management journal for local authority business, and is re-posted here with permission.