Deep impact? How government departments measured their impact under the Coalition

What impact have government departments had in the real world under the Coalition? A new Whitehall Monitor report analyses government departments’ ‘impact indicators’, part of the Coalition’s cross-government system of performance measurement. Emily Andrews, Gavin Freeguard and Robyn Munro summarise their findings.

The British public thinks politicians should prioritise ‘fulfilling the promises they make before getting elected’ – but believes they currently don’t.

Polling conducted by Populus for the Institute for Government found that the British public values politicians fulfilling their election pledges, but believes parties currently prioritise party political goals (like getting re-elected, scoring political points and making big announcements). It also found that nearly two-thirds of the public said they would be more willing to vote for a party that showed how it could implement its policies. One way to show promises are being fulfilled would be to use a form of performance measurement (demonstrating what government was actually doing) or even performance management – using data to hold a department to account and drive further improvement.

Departmental Business Plans, introduced in 2010, are the latest cross-government way of measuring performance.

The Coalition introduced ‘Business Plans’ in 2010. These focus on actions or administrative steps departments could take – ‘outputs’ within their control – as part of a ‘Structural Reform Plan’. This was a reaction to Labour’s ‘Public Service Agreements’, which had set targets for the effects the government wanted to have in the real world – ‘outcomes’ or impact.

Each department’s Business Plan includes a series of ‘impact indicators’ which aim to measure the effect of their reforms.

Impact indicators ‘are designed to help the public to judge whether [departments’] policies and reforms are having the effect they want’. They include things like how many people are on particular types of benefit (in the Department for Work and Pensions), reoffending rates (Ministry of Justice), the proportion of trains running on time (Department for Transport) and net migration to the UK (Home Office). According to the Number 10 Transparency website, the Department for Education (DfE) has the most indicators, with 28. The Ministry of Defence (MoD) has the fewest (six). There are 207 indicators in total across government, and very few have specific targets set against them. So to work out what’s happened to the indicators, we looked at the scores for each in 2010 and at the time of writing:

  • Green means it’s moved in the right direction (for example, more people moving off Jobseeker’s Allowance)
  • Red means it’s moved in the wrong direction (such as more trains running late)
  • Amber means it’s stayed the same or the picture is mixed
  • Dark grey means the data is unavailable or incomparable.
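
To make the scoring rule concrete, here is a minimal Python sketch of how each indicator could be rated. The function, its tolerance parameter and the example values are illustrative assumptions for this post, not the report’s actual methodology or data.

```python
# Hedged sketch of the red/amber/green/grey rule described above.
# All names and values here are illustrative assumptions.

def rate_indicator(value_2010, value_now, higher_is_better, tolerance=0.0):
    """Classify an impact indicator's movement between 2010 and now."""
    if value_2010 is None or value_now is None:
        return "grey"       # data unavailable or incomparable
    change = value_now - value_2010
    if abs(change) <= tolerance:
        return "amber"      # stayed the same, or a mixed picture
    improved = change > 0 if higher_is_better else change < 0
    return "green" if improved else "red"

# Made-up examples: train punctuality is 'higher is better';
# reoffending rates are 'lower is better'.
print(rate_indicator(0.90, 0.87, higher_is_better=True))    # red
print(rate_indicator(26.0, 25.0, higher_is_better=False))   # green
print(rate_indicator(None, 25.0, higher_is_better=False))   # grey
```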

Overall, more than half of all impact indicators have moved in the right direction compared to 2010.

All of the Department of Energy and Climate Change’s (DECC’s) impact indicators – which range from lowering the number of households in fuel poverty to increasing the proportion of UK energy supply from low-carbon sources – have headed in the right direction compared to 2010. The indicators for the Department for Culture, Media and Sport (DCMS) also scored well (only the proportion of children playing competitive sport declined). However, fewer than half of the indicators improved in seven departments: DfE, the Department for Business, Innovation and Skills (BIS), the Foreign and Commonwealth Office (FCO), the Home Office, the Treasury (HMT), the Ministry of Justice (MoJ) and the MoD. At the MoD and MoJ, more than 40% of indicators had headed in the wrong direction since 2010 – including a falling percentage of deployable personnel and falling service personnel satisfaction at the MoD, and rising reoffending rates at the MoJ. Some departments have not published comparable data for a significant percentage of their indicators: 50% in the case of the Department for International Development (DfID) and DfE.

However, scores, data and explanations for what the indicators mean can be difficult to find…

The Government intended that members of the public should be able to access the data from a central website. This would enable them to use the data themselves, understand what government reforms were achieving and hold departments to account. However, using the Number 10 Transparency website, we found that in many cases this was difficult, if not impossible. Then, when we tried tracking down all of the indicator data ourselves, we found a huge variation in the quality, usability and accessibility of the data.

…and it isn’t clear that the impact indicators are being used by the public, by departments or by the centre of government.

The Government intended that the public, departments and the centre of government would be able to use the impact indicators to judge progress. But when we asked government departments how they measured outcomes in 2013, almost half said they used something other than just the Business Plans (two said they used something different altogether). The availability and accessibility of the data make it difficult for the public to use the indicators as intended. No army of armchair auditors appears to have enlisted. The plans also appear to have lost any political link with the centre of government. The formation of a new government after May 2015 – by whatever party or parties – is likely to prompt some discussion about the future of performance measurement. We offer five tentative conclusions:

  • Performance management needs to be re-invigorated. An incoming government must set clear priorities on what outcomes it wants to see. But any new government should also be sensible about the transition of power: there is no point dismantling a system just because it was inherited from a predecessor, only to have to reconstruct it later.
  • Performance management can be used to drive cross-departmental collaboration. Any further reductions in the size of Whitehall departments should drive new, cross-departmental ways of working.
  • The link to politics, and the public, is vital. Delivery of political promises could bring electoral benefits, and expresses politics in a language that matters to people.
  • We need better quality data – and it should be public. While the needs of different users may vary, there are clear benefits – both practical and in terms of perception – to transparency.
  • It should be a performance management, not merely a performance measurement, regime – and there is more to it than ‘just’ the data. Data is much more powerful if used to drive improvement. That includes being able to assess properly what the data means – with comparison against benchmarks, baselines and counterfactuals. And the data can only ever take us so far: as a saying often attributed to Einstein puts it, ‘Not everything that is countable counts, and not everything that counts is countable.’