
Troubled evaluations

By Dan Corry 10 November 2016

Evidence isn’t ‘nice’, but it’s necessary

At NPC, we spend a lot of our time telling charities and philanthropic funders to take care when putting together programmes and allocating resources. Think hard about what you are doing through a theory of change, we say. Look and see what works elsewhere. Give it a go on a small scale to start with. Do a really good evaluation of its effectiveness before you scale up. Choose sensible metrics.

There is sometimes resistance to this ‘common sense’ advice in a charity sector that deep-down believes you do good because you ‘feel it’ in your heart, and you know you have done good when you see the smile of the person you are working with. The rest is often seen as mumbo-jumbo from bean counters like us!

We need government to lead by example, but it hasn’t

You can imagine, then, how much harder it is to preach this message when the government pushes upon us all a programme that it did not think through fully, did not pilot and did not listen to constructive criticism about, and then set metrics and incentives around it that were almost guaranteed not to work. I’m talking, of course, about the Troubled Families Programme (TFP).

Even when the evaluation by respected economists at the National Institute of Economic and Social Research (NIESR) concluded that ‘we were unable to find consistent evidence that the programme had any significant or systematic effect’, the government’s response so far has been simply to deny it. In this case, it was not just a small charity not quite ‘getting it’ due to lack of capacity or expertise, but the government shelling out hundreds of millions of pounds to little avail.

You don’t make a difference through wishful thinking

The model in question has a lot going for it. TFP was an attempt to focus on families and at last help them holistically. A welcome part of this was setting up incentives that get public sector agencies and other bodies, including non-profits, to work together, and with the family, rather than each working in its own silo, an approach that causes chaos for families and wastes public money.

Of course the TFP had always attracted criticism, but the promise it offered meant these concerns were overlooked. Originally, Tony Blair was presented with evidence suggesting that if risk factor analysis were used, it would reveal a relatively small number of families that, it could be argued, cause a great proportion of costs to the state in terms of crime, health and so on. From then on the temptation to identify them and find a programme to ‘cure’ them became irresistible, both to Blair and then to David Cameron, under whom the TFP was born. The search for success at all costs became the enemy of the scheme.

So what went wrong?

You can argue that equating these poor and disadvantaged families with those causing anti-social behaviour and crime was always a mistake. But it got worse when it was decided that a payment by results (PbR) approach would be used. PbR is fine in principle, but as the National Audit Office pointed out, it is vital that payments reward the right behaviour; otherwise we get distortions and game-playing.

With TFP, the metrics were about as wrong as could be. To receive payment, the local authority had only to hit a few of a set of targets, with little evidence of how strongly these were linked to the ultimate outcomes desired. Naturally, it went for the easiest ones and made sure it delivered on them. And the government decided that if these targets were hit, the family was deemed to have been ‘turned round’.

The craziness became apparent when the government started claiming ridiculously high success percentages, and associated cost savings, which could not be true for a client group with such complex needs. This was pointed out at the time. It should not have been a surprise when the real evaluation spelled it out.

And what should we take away from this?

We must not conclude that schemes like this are doomed. But we must insist that government, local authorities and other commissioners listen to evidence, get experts in (however maligned they may be post-Brexit) to think about getting PbR frameworks right, and ensure they are rigorous in evaluation, including publishing and acting on independent evidence.

Many things in social policy are difficult and, as even TFP critic Jonathan Portes has pointed out, not everything can be expected to work. But it seems to me that government standards in the TFP came in below those of the much less well-funded and well-staffed charity sector. That cannot be right.

A version of this article was originally written for the MJ.
