First, I would like to thank Ian Bell, Rebecca Endean, and Nicki Webb and team for all they have done in setting up the Justice Data Lab and running it. Secondly, I would like to thank the Oak Foundation—represented by Louise Montgomery here—for supporting NPC in trying to open up government databases to help charities assess their impact. Without their support we would not have a Justice Data Lab, and we would not be here. Thirdly, I would like to salute all the organisations that have used the Justice Data Lab to date for the leadership they have shown. In the points that follow, their importance should become clear.
A revolution in evaluating programmes
But to the main point: it is too soon to claim much about the impact of the Justice Data Lab. In this first year, a few organisations—the brave and the organised—have dipped their toes in the water. One or two—such as the Prisoner Education Trust—have embraced it as a means of assessing their programmes. But most organisations have sat on the sidelines.
But I predict that in five to ten years' time, when we look back, we will judge the Justice Data Lab to have caused a revolution in evaluating the impact of programmes and services in key social policy areas: not just offending, but employment, health, education, and drug treatment.
This revolution will come about because of four things the Justice Data Lab provides:
1. Feedback loops
First, it creates a feedback loop that hitherto has not existed. Before the lab, if you wanted to know how many of the people you worked with six or seven years ago reoffended within a year of your programme, you would have had to spend an awful lot of money to get incomplete data that would not answer the question. Today you can get the answer in a few weeks at virtually no cost to your organisation. That feedback loop gives managers and funders new information to inform decisions to expand, continue, modify, or drop programmes. Over time, this new feedback loop will help managers and funders become more impact-focussed.

[Footnote 1: Deviations from the speech were made by David at the event; this paper is therefore not a verbatim record of everything that was presented.]
2. Examination of the mechanisms of change
Secondly, the Justice Data Lab allows researchers, evaluators, and providers of services to examine the mechanisms of change—what is it that helps reduce recidivism?—rather than worry about collecting data. I expect and hope the data lab will stimulate demand for good qualitative research on factors that promote desistance.
3. Library of evaluated interventions
Thirdly, over time, the Justice Data Lab reports will create a library of studies to compare and contrast what seems to make a difference, when, where, and for whom. To date we have just over 60 studies. While these are interesting, they are not sufficient to identify patterns. But if after years of use we have hundreds or thousands of reports, then we may start to see something.

4. Government support for impact measurement

Fourthly, the Justice Data Lab is leading the way on how government can help charities assess their own impact while also adding to the evidence base of what works and what does not in addressing some of our most pressing social problems. We at NPC are in discussions with several government departments and the Cabinet Office about how this model can be adapted in other social areas. You can learn all about our work on our new webpage www.npcdatalabs.org.
Together, these changes will help charities and public and private sector organisations improve what they do, and give funders better information on the impact of their resources. For some organisations this will be good, while for others it may be bad, but for the sector as a whole, and for offenders and the general public, the net effect is good.
Conditions needed for the revolution
But the Justice Data Lab will only lead to improvements in practice if three conditions prevail:
1. Leadership and Courage
First, we must get beyond the leadership and courage of the early adopters and get those on the sidelines to use the service once, and then twice, three times, and so on until this becomes routine. I realise this is a big ask. Most of us suffer from illusory superiority—the belief that we outperform our peers. Seeking data that may challenge that belief is not natural. I have two concerns on this front: first, that all the early adopters have already used the Justice Data Lab, and we are on the edge as to whether others will follow. Secondly, that organisations that have used it and are disappointed by the results (perhaps the estimated impact was small, statistically insignificant, or even negative) will become disillusioned and not use it again. I have had a couple of conversations where that sentiment was expressed.
2. Use of information
This leads to the second condition—that organisations actually use the information. I have not talked with enough users to know, but at the minute I suspect the results are used in a 'beauty parade' framework: do we look good or do we look bad? If that is true, and that is how it stays, then in my mind the Justice Data Lab will have failed. The results need to become integrated into how organisations assess, design, and manage their programmes.
3. Mature discussion on data analysis
This is linked to the third condition. As a sector, we need mature discussions of what the data mean and how to interpret and use them. Users and funders need to examine the results with the cool head of an objective analyst, not use them to back up a predetermined agenda or decision.
I am optimistic that these conditions will be met, but I don’t think it will be a quick process. There are a number of changes that have to happen for these to be met, some of which cannot be rushed.
There are some methodological issues that the Ministry of Justice need to address—which I know they are looking at—such as linking to other data sets to improve the matching process. All stakeholders need to get well versed in understanding and using concepts such as statistical significance, sampling, comparator groups, effect sizes, and so on, so we can have those mature discussions.
Funders need to be asking for the results in grant applications—not to use as a crude funding/not-funding threshold, but as one piece of information among many in deciding what to fund, and also to provide an incentive for organisations to use the Justice Data Lab.
Finally, the sector must make a cultural shift so that publishing internal and external evaluations, be they positive or negative, becomes the new norm. At a conference last week, Danny Kruger of Only Connect made that commitment, which I think is terrific.

The final thought I want to leave you with is this: to maximise the good that you do, you should be as passionate about contributing to the public body of knowledge of what works and what does not as you are about delivering the best services that you can.