One thing is abundantly clear: What Works Centres have become a powerful brand. The five centres in operation already span health, justice, education, local economies, and early years interventions. Two others – the Centre for Ageing Better and the What Works Centre for Wellbeing – are under development, and there are also centres in Scotland and Wales with associate member status.
Speaking at a What Works Centres event hosted by the Institute for Government last week, Oliver Letwin, Minister for Government Policy, expressed his hope for a further increase in the number of centres. Sure enough, just a day later, a panel commissioned by the Coalition Government to explore opportunities for integrated service delivery recommended the creation of a “What Works Centre for Service Transformation”.
But what will it take to ensure this growing network delivers on its potential?
Bringing the front line on board
Success will partly depend on centres making “evidence useful for people at the coalface”, as one panellist at the event put it. Many frontline services are over-stretched and under-resourced. Practitioners rarely have the time to trawl websites or review academic studies. Nor do they have the funding for external training courses.
The centres have developed some very effective outreach strategies:
One approach is the roll-out of online toolkits – dashboards that rate interventions against criteria such as cost, the quality and availability of evidence, and known impact. The Education Endowment Foundation (EEF) has a Teaching and Learning Toolkit, and the What Works Centre for Crime Reduction will launch its own in January. These toolkits not only make evidence more readily accessible but also help users appreciate the complexity of the available evidence – highlighting where recorded outcomes are mixed.
Yet simply making evidence accessible won’t stimulate demand for it or change behaviours. Centres operating in areas such as policing and early years intervention are frequently working in environments that lack a strong “culture of enquiry”, as Shirley Pearce (College of Policing) noted at the event. Encouraging practitioners to act on evidence and abandon long-established working practices will take time and careful negotiation.
For the centres, one of the most effective ways of driving change appears to be a happy by-product of their efforts to fill gaps in our existing evidence base. A staggering 4,500 schools in England, for instance, are taking part in evaluations commissioned by the EEF (nearly all of them randomised controlled trials, or RCTs). As Kevan Collins (EEF) observed, this brings “conversations about evidence” directly into the classroom and has rapidly changed attitudes to the use of RCTs in the UK education system.
Yet such system-wide engagement is not an option for all centres. The EEF benefits from a £125 million endowment. Others such as the What Works Centre for Local Economic Growth have far fewer resources.
Improving the quality of policymaking
Reaching practitioners is only part of the battle. Helping put evidence at the heart of spending decisions is the other. As our report on the use of evidence in policymaking highlights, centres face a range of challenges – on both the supply and demand side – in ensuring that policymakers and commissioners pay more attention to evidence. Two that stood out at the event were:
i. Evidence generation
To be of use to decision makers, What Works Centres need access to good quality evidence on the cost effectiveness of different interventions. In a report released by the What Works Network at the event, Oliver Letwin and Danny Alexander called for “policy-makers to help the centres find out what works by robustly evaluating the impact of their policies”.
Despite some good practice, evaluation in government is generally under-resourced, commissioning is incoherent, and the evaluation evidence that is produced is often not used – or even seen – by decision makers (see NAO report, 2013). Whitehall needs to professionalise its evaluation function. As we argue in our International Delivery report, evaluation should be a recognised civil service profession, overseen by a head of profession who sets professional standards, co-ordinates training, and monitors evaluation activities across government.
Aside from Oliver Letwin and Danny Alexander, the What Works Network needs to generate far more interest from government ministers. In the short term, holding ministers and departments to account is the most realistic way to incentivise demand for the evidence that centres have to offer. Centres themselves must remain independent and challenge the evidence base behind policy announcements. But they cannot do this alone.
Scrutiny and audit bodies should also do more. This isn’t about advocating “for rule by experts”, as Peter Riddell recently told the BBC, but rather forcing policymakers “to explain why they have taken a decision and measure themselves against real evidence”.
We need to see many more initiatives like the House of Commons Education Committee’s recent “Evidence Check” exercise—where the Department for Education was asked to outline its use of evidence in nine policy areas. The department’s responses have since been posted online as the Committee crowdsources opinion on the quality of the evidence used and whether compelling “contrasting evidence” has been overlooked.
The growing network of What Works Centres offers a valuable resource to decision makers at a time when budgets are tight. But it is a resource that needs much greater support.