Measuring performance

The early election left departments unsure about their priorities. It also delayed attempts to improve how government measures its own performance; the new Single Departmental Plans are an improvement but could still be more focused. While the British public have become more dissatisfied with how the Government is running the country since the election, international rankings suggest the UK compares relatively well to others.

There are a number of ways of thinking about government ‘performance’.

First, we can use our own objective measures, as this report does in relation to Whitehall and our Performance Tracker does in relation to public services.

Second, we can use government’s own measures – at present, Single Departmental Plans. The original set from February 2016 did not prioritise enough or make performance indicators sufficiently accessible. They were not (on the whole) updated to take account of Brexit. Although the early election in June 2017 delayed a significant revision, the Government has now published revised plans. These show greater focus, more consistency and include some performance indicators. Nonetheless, they could still be improved further.

Third, we can (theoretically) measure what the public thinks. Although polling on this is difficult – political views are hard to separate from objective assessments of performance – Ipsos MORI polling shows that dissatisfaction with government has increased since the 2017 election.

Finally, we can see how the UK civil service performs compared to others internationally. The pilot International Civil Service Effectiveness (InCiSE) index placed the UK fourth out of 31 countries, behind Canada, New Zealand and Australia. But it could still improve on a number of measures, especially digital services and integrity.

The UK has tried various ways of measuring performance across government

Chart: selected cross-government performance initiatives

Governments need a sense of what objectives they are trying to achieve, a plan for how and with what resources they are going to achieve them, and indicators telling them how well they are doing in meeting (or not meeting) them.

Since 1979, UK governments have tried a variety of cross-government performance regimes. In recent decades, the UK has been seen as a world leader, from Labour’s Public Service Agreements (1998) and Prime Minister’s Delivery Unit (2001), widely emulated internationally,[1] to the Coalition’s Departmental Business Plans (2010). The latter were a promising step forward in making government transparent and accountable to the public, even if they ran out of steam in the second half of the parliament and the data they contained were not as consistent, accessible or usable as they might have been.[2] The National Audit Office has argued that, while a number of these systems have included good elements, the UK should aim to set up an enduring performance system that could survive changes in administration.[3]

Following the 2015 general election, the civil service set to work on Single Departmental Plans (SDPs). According to the civil service chief executive, John Manzoni, they were designed to bring together inputs (aligned with the Spending Review) and outputs, and ‘prioritise effectively based on a clear understanding of how our resources can best be deployed. There [would] be no room for “nice to haves”’.[4]

Unfortunately, for all the hard work of civil servants in getting SDPs finished, ministers did not prioritise enough. When the SDPs were first published in February 2016, seven departments had more than 60 ‘priorities’ (including Theresa May’s Home Office). Even worse, many of these were so unspecific that it would have been impossible to understand whether the department in question had achieved them or not; how, for example, would we know whether the Foreign Office had successfully ‘[stood] up to Russian aggression whilst engaging and working with Russia where necessary’?

The initial plans were met with scepticism, but also with the hope that they could be improved:

  • We concluded they gave ‘no sense of ministerial priorities’ and were of ‘no use either to civil servants trying to implement the Government’s agenda or to the public trying to hold them to account’.[5]
  • The Public Administration and Constitutional Affairs Committee (PACAC) described the published versions of the plans as ‘not sufficient for accountability purposes. They contain too little detail on either spending or performance’,[6] although it hoped the framework would, with improvements, ‘last for the long term’.[7]
  • The National Audit Office found the SDPs ‘do not provide all the public accountability the Government said they would’ and ‘do not meet the Government’s stated aim to be “the most transparent government ever”’, although it hoped ‘the considerable time and energy’ the civil service put into them and into learning from past lessons would not be wasted.[8]
  • The Public Accounts Committee (PAC) welcomed the SDPs as ‘an important step forward’ but acknowledged ‘their effectiveness has not yet been tested’ and that further development was necessary to improve ‘the information that Parliament and the public can access to understand government’s plans and to see how it is performing’.[9]

In its response to the PAC, the Government said it would publish refreshed plans, alongside more details of how the planning process worked and links to performance indicators, in June 2017.[10]

The early election delayed updates to the Single Departmental Plans...

Chart: publication of Single Departmental Plans

Unfortunately, the planned June 2017 updates did not happen; a general election took place instead. The Conservative manifesto was the second longest since 1945 (after 2015), giving the Government the challenge of prioritising new promises as well as those in the existing SDPs.[11]

The December 2017 update was much needed. Before that, only one department, HM Revenue and Customs, had updated its plan since the election in June 2017. Only eight others had made any revisions since the June 2016 referendum. There were no published plans for the three departments – the Department for Exiting the European Union (DExEU), the Department for International Trade (DIT) and the Department for Business, Energy and Industrial Strategy (BEIS) – which were created in July 2016 when Theresa May became Prime Minister. The Home Office plan still noted that changes to free movement and EU workers’ access to benefits were ‘being pursued as part of the EU renegotiation’.[12]

The Government intended SDPs to effectively become the performance objectives for permanent secretaries, replacing previous Permanent Secretary Objectives;[13] these have not been updated since the 2015–16 objectives were published in February 2016.

...the new Single Departmental Plans are much improved but could still be more focused

Chart: priorities identified in Single Departmental Plans

The SDPs published in December 2017 were a significant improvement on their predecessors, with a greater sense of prioritisation, a more consistent and useful format, and some performance indicators.[14] There were also some cross-government objectives, like ‘Get[ting] the best Brexit deal for Britain’ and ‘Tackl[ing] the injustices that hold people back’. The 939 priorities we counted in February 2016 had fallen to 842 in December 2017, even though an extra department had been added. Some departments are much more focused: the Department for Education, for example, has top-level objectives to ‘close the word gap’ and ‘close the attainment gap’.

But there are more improvements that should be made. There are still too many priorities: the Department for Digital, Culture, Media and Sport (DCMS) has more than 90. Many of the priorities remain unspecific – the Foreign Office’s ‘Russian aggression’ priority survives. Some of the data used to judge performance against the priorities could be of better quality – the Home Office admits there are problems with crime data, but uses them anyway. And the SDPs should link priorities with both spending and workforce strategy (does the Government have the right people in place with the requisite skills and experience to achieve its objectives?). These changes would make the SDPs even more useful, helping departments to use their resources more effectively and increasing their accountability to Parliament and to the public.

Public dissatisfaction with Theresa May’s Government is higher since the early election

Chart: Ipsos MORI poll, ‘Are you satisfied or dissatisfied with the way the Government is running the country?’

Measuring public views of government effectiveness can be difficult. There are few regular polls asking the public explicitly what they think about this, and respondents may be answering based on political views rather than an assessment of administrative effectiveness.[15]

One of the closest is Ipsos MORI’s Political Monitor series, which has asked the public regularly since 1977 (and now roughly monthly), ‘Are you satisfied or dissatisfied with the way the Government is running the country?’

Since January 2010, the highest recorded level of dissatisfaction with government came in July 2016, a fortnight after the referendum on leaving the EU and a few days before Theresa May became Prime Minister. Satisfaction then improved; the highest satisfaction score since January 2010 came in April 2017, a few days after May announced her intention to seek an early election, and the second highest came the following month.

However, dissatisfaction rose and satisfaction fell sharply a few days before the general election on 8 June. The three highest scores for dissatisfaction with how Theresa May’s Government was running the country (64% in July, 60% in September, 59% in November) came in the three polls following the election.

The British public might not have studied the workings of Whitehall and Westminster in detail; they may not have paid attention to SDPs or international comparisons; but their view of British government since the election is more negative than it was before.

However, the UK still has one of the most effective civil services internationally

Chart: country rankings in the International Civil Service Effectiveness index 2017 – top five overall highlighted

The pilot International Civil Service Effectiveness (InCiSE) index was a collaboration between the Institute for Government and the Blavatnik School of Government at Oxford University, with support from the UK civil service and funding from the Open Society Foundations.[16] It pulls together a wealth of existing data to assess civil service performance, examining both ‘core functions’ (things that the civil service delivers, like policymaking, and fiscal and financial management) and ‘attributes’ (characteristics that can affect how those things are delivered, for example integrity and openness).

The pilot index, published in July 2017, ranked the UK fourth out of 31 civil services. Canada came top, followed by New Zealand and Australia, with Finland rounding out the top five. Estonia (seventh overall) came top when results were adjusted for gross domestic product.

Chart: scores of top five countries in the International Civil Service Effectiveness (InCiSE) index

The UK performed particularly well on core functions, ranking second overall despite falling below the average score on digital services.

It came seventh on attributes: despite having the highest score on openness, it scored lower on integrity and capabilities.

Despite its relatively good performance, there are still areas where the UK civil service could learn from other countries. There will also always be room for improvement even where the UK performs well. For example, although the UK scores well on openness, more could be done on the quality of data publication and responses to requests for information, as this report demonstrates.[17]