Most of us want to know that the projects we work on produce the outcomes we set out to achieve, and that they do so cost-effectively. This is why evaluation is so important. But I think most of us also recognise that evaluating the kind of work many of us do, especially large social programmes, is extremely difficult. So I am pleased to see the recent publication of the quite intensive evaluation of the Troubled Families programme.
An ambitious evaluation
Before we get into the results, it is worth noting that this is an ambitious evaluation in itself. There have been some impressive efforts to link administrative data sets from the Ministry of Justice, the Department for Work and Pensions, and the Department for Education (although there still appear to be problems with getting the Department of Health and Social Care to play ball) in order to really understand the impact of the programme. This is a shining example to other evaluations, and proof that the public sector can do great things when it chooses not to hide behind technical issues or problems of privacy.
There are some positive and encouraging findings in the evaluation, and pleasingly it has been given much less of a positive spin than David Cameron’s completely implausible 2015 claim that Troubled Families had turned around the lives of 99% of the families targeted.
But a lack of the answers we want
But for all its good features it is also an example of many of the issues that bedevil evaluations of this sort of programme.
First, in trying to address a complex set of issues facing ‘troubled families’, the scheme is itself pretty complex. So, when it does work – and the evidence seems strongest for a small reduction in the number of children in care (2.5% in the comparator group; 1.7% in the programme group) – we do not really know what it was that made this happen. What made the difference? And was it one thing, a combination of things, or everything? Currently we don’t know.
Second, is it really value for money? Aside from the question just mentioned, of whether we need the whole expensive package (£920m over 2015/20, the evaluation report says) or only elements of it, there is also the question of whether the outcomes achieved were proportional to the money spent.
If you throw a lot of cash or resource at a problem, then on the whole you will get some movement in the direction you want. The real issue is always: could you have achieved the same outcome for less outlay of resources?
This evaluation does not really tell us that. It tries to get at this through some cost-benefit analysis, to see whether the scheme is worth more to society than it costs.
But the fact that doing something is likely to be better than doing nothing does not tell us that in the Spending Review the Treasury should pump more money into this programme.
So what should we do?
Where, then, do we go with schemes like this? While there are some positive and encouraging findings in the evaluation, the accompanying statement, which claims the Government was ‘absolutely right to have invested so much in this approach’, rather oversells the conclusions.
Trying to help families or communities enmeshed in complex, multi-faceted problems is a positive move, but designing some top-down, under-piloted system as the way to solve them is unlikely to be the answer. What we need to do is learn from the Troubled Families experiment.
We should learn that presenting people and families experiencing complex, overlapping problems with a plethora of different, un-joined-up interventions is a bad way forward; that linking up administrative data is crucial to addressing many social issues; and that extra capacity and funding are invariably needed to make some of these ‘good’ things happen. We should also look at some of the qualitative work done in this evaluation to try to understand what worked best and make it better – for instance, by digging into the finding that key workers felt they had major impacts on improving parenting skills.
In particular, we must not preach that we have found a Holy Grail scheme that solves everything, but should be content that we are making progress when we focus on how to help families overwhelmed by their needs and unable to access help.
This blog originally appeared in the Municipal Journal.