Standards of evidence: helpful or a hindrance?
9 October 2014
If I want to buy a new washing machine or TV, the first thing I do is check out Which? magazine for a review of the best buys. I don’t think of looking anywhere else. I trust what they say. Their reviews make it easy to compare one product with another and find one that meets my requirements. It is a shame there isn’t something like this for commissioners and policymakers in children’s services.
The explosion of research on what works over recent years needs some type of honest broker to navigate the increasingly complex landscape. This could in effect act as the Which? magazine for policymakers and practitioners. My area, children’s services, is no exception. There have, for example, been over 150 systematic reviews of what works in the area of the health and well-being of 0-5 year olds in the last five years alone. Is it reasonable to expect a commissioner or manager to have the time or expertise to digest and make sense of such a mountain of evidence?
A quick look online would suggest that help is at hand. There are online databases listing what works in different areas. But a deeper look shows that there are in fact over 30. Granted, they are not all interested in the same policy area but there is much overlap. Four, for example, focus on youth offending. To make matters worse, a programme that makes it onto one database might not make it onto another. So, not only are there multiple databases but also different criteria for deciding which programmes get onto the list and which don’t.
To give an example, Life Skills Training, a programme designed to equip young people with the skills to adopt healthy behaviours, features on five of the widely used databases. Only three out of all 475 programmes reviewed on these databases feature on all five. One of the main reasons for these discrepancies is widely different views on what constitutes an acceptable standard of evidence. What counts as ‘good enough’ research for one is not sufficient for another.
There is no sign of the number of databases diminishing. With the creation of six new What Works centres, the opposite may happen. With too many lists to choose from, commissioners and policymakers are more likely to simply switch off from the evidence and rely instead on instinct and anecdote, and decision-making will suffer as a result.
It is curious that Which? magazine has established itself as the ‘go to place’ for consumers for information on a wide array of products and services, and yet among a much smaller community (policymakers and practitioners) there is no consensus about a reliable and trusted source.
We can speculate on the reasons: the Consumers’ Association, which produces Which?, is well established; it has a smart business model, generating sufficient income to disseminate information in a highly accessible fashion; and it provides what people want. All of which could be useful lessons for the new generation of ‘what works?’ honest brokers.
Perhaps a more subtle difference is that Which? magazine presents all the information about every product and allows the consumer to decide, whereas the What Works lists try to categorise interventions into what works and what doesn’t and offer a prescription. In the same way that Which? has, in effect, educated consumers about what to consider when making a purchase, so our honest brokers could educate research consumers about what to look for in effective interventions.
The five databases: Social Programs That Work, Blueprints for Healthy Youth Development, National Registry of Evidence-based Programs and Practices, Office of Juvenile Justice and Delinquency Prevention (OJJDP), and Office of Justice Programs, U.S. Department of Justice.
Louise will be speaking at our upcoming High Impact conference about raising the standards of evidence.