How we define departments (throughout)

Where possible, we group bodies into ‘departmental groups’ according to where ministerial responsibility lies, even when these are reported under a separate ‘departmental’ heading in the original data. For instance, we group Ofsted with DfE and not as a separate department.

We then make the following distinction within each departmental group:

  • Department: the core department and other bodies within the department that are line-managed within a structure that flows from the departmental leadership (for example, HM Prison and Probation Service within MoJ, the Education and Skills Funding Agency within DfE).
  • Other organisations: other bodies employing civil servants, such as executive agencies and non-ministerial departments, for which ministers in the department have responsibility (e.g. Ofsted in DfE, DVLA in DfT), but which are not part of the department’s line-management structure.

Drawing this distinction is not always possible, and in some analyses we do not attempt it. Which definition we use depends on the analysis:

  • We apply our definition of ‘department’ in our analysis of staff numbers, grade, age, gender, ethnicity, disability, professions/specialisms and ministerial correspondence.
  • We use the wider ‘departmental group’ in our analysis of location.
  • We use the department as defined by the data producer on engagement, pay, major projects, Freedom of Information, spend over £25,000 and organograms.

In our analysis of government funding to public bodies (page 65), we exclude any bodies we consider to be part of the department, such as the Education and Skills Funding Agency.



Departments and other organisations

AGO – Attorney General's Office
  Other organisations: Crown Prosecution Service; Crown Prosecution Service Inspectorate; National Fraud Authority; Revenue and Customs Prosecution Office; Serious Fraud Office; Treasury Solicitor

BEIS – Department for Business, Energy and Industrial Strategy
  Other organisations: Advisory, Conciliation and Arbitration Service; Companies House; Competition and Markets Authority; HM Land Registry; Insolvency Service; Intellectual Property Office; Met Office; Office of Gas and Electricity Markets (Ofgem); Ordnance Survey; UK Space Agency

Cabinet Office (excluding agencies)
  Department: Office of the Parliamentary Counsel
  Other organisations: Buying Solutions; Central Office of Information; Charity Commission; Crown Commercial Service; Government Procurement Service; National School of Government; UK Statistics Authority

DCLG – Department for Communities and Local Government
  Other organisations: Fire Service College; Planning Inspectorate; Queen Elizabeth II Conference Centre

DCMS – Department for Digital, Culture, Media and Sport
  Other organisations: Royal Parks; National Archives

Defra – Department for Environment, Food and Rural Affairs
  Other organisations: Animal and Plant Health Agency; Animal Health; Animal and Veterinary Laboratories Agency; Centre for Environment, Fisheries and Aquaculture Science; Food and Environment Research Agency; Government Decontamination Services; Marine Fisheries Agency; Ofwat; Rural Payments Agency; Veterinary Laboratories Agency; Veterinary Medicines Directorate

DExEU – Department for Exiting the European Union

DfE – Department for Education
  Department: Education and Skills Funding Agency; Education Funding Agency; National College; National College for Teaching and Leadership; Standards and Testing Agency; Teaching Agency
  Other organisations: Office of Qualifications and Examinations Regulation; Ofsted; Skills Funding Agency

DfID – Department for International Development

DfT – Department for Transport
  Other organisations: Driver and Vehicle Licensing Agency; Driver and Vehicle Standards Agency; Driving Standards Agency; Government Car and Despatch Agency; Highways Agency; Maritime and Coastguard Agency; Office of Rail Regulation; Vehicle and Operator Services Agency; Vehicle Certification Agency

DH – Department of Health (excluding agencies)
  Other organisations: Food Standards Agency; Meat Hygiene Service; Medicines and Healthcare Products Regulatory Agency; National Healthcare Purchasing and Supplies; NHS Business Services Authority; Public Health England

DIT – Department for International Trade
  Other organisations: Export Credits Guarantee Department/UK Export Finance (from Q3 2016)

DWP – Department for Work and Pensions
  Department: Child Maintenance and Enforcement Commission; DWP Corporate and Shared Services; Jobcentre Plus; Pension, Disability and Carers Service
  Other organisations: The Health and Safety Executive; The Rent Service

FCO – Foreign and Commonwealth Office (excluding agencies)
  Other organisations: Security and Intelligence Services; Wilton Park Executive Agency; Foreign and Commonwealth Office Services

HMRC – HM Revenue and Customs
  Other organisations: Valuation Office

HMT – HM Treasury
  Other organisations: Asset Protection Agency; Debt Management Office; Government Actuary's Department; Government Internal Audit Agency; National Savings and Investments; Office for Budget Responsibility; Office for Government Commerce; OGC Buying Solutions; Royal Mint

Home Office (excluding agencies)
  Department: UK Border Agency
  Other organisations: Criminal Records Bureau; Her Majesty's Passport Office; Identity and Passport Service; National Fraud Authority; National Crime Agency

MoD – Ministry of Defence
  Department: Defence Equipment and Support
  Other organisations: Defence Science and Technology Laboratory; Defence Support Group; UK Hydrographic Office; Meteorological Office

MoJ – Ministry of Justice (excluding agencies)
  Department: HM Courts and Tribunals Service; HM Courts Service; Legal Aid Agency; National Offender Management Service; Scotland Office (including Office of the Advocate General for Scotland); The Office of the Public Guardian; Tribunals Service; Wales Office
  Other organisations: National Archives; UK Supreme Court; Criminal Injuries Compensation Authority

NIO – Northern Ireland Office

Reshuffle analysis

We consider a minister to have changed role if they move department, move rank (e.g. from parliamentary under-secretary of state to minister of state), or the policy areas their role covers substantially change or increase. Because our analysis tends to take place as a reshuffle is unfolding, ministerial responsibilities might occasionally change without us recording a minister as having changed role.

Workforce analysis

Numbers may not be exact, as the Office for National Statistics (ONS) reports staff numbers in any given category to the nearest 10. Where the ONS notes that a number is less than five, we have treated it as three (for example, in our analysis of age).
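Our treatment of rounded and suppressed values can be sketched as follows; the function and figures are our illustration, not ONS data:

```python
# Sketch of handling ONS rounding: staff numbers are reported to the
# nearest 10, and cells below five are suppressed (shown here as '<5'),
# which we treat as three when summing.
def to_number(cell):
    return 3 if cell == "<5" else int(cell)

total = sum(to_number(cell) for cell in ["120", "40", "<5"])
print(total)  # 163
```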

The ONS also reports as ‘senior civil service’ certain roles – such as health professionals, military personnel and senior diplomats – which the Cabinet Office does not consider to be part of the actual senior civil service. This is why we refer to ‘senior civil service and equivalent’ in our analysis of ONS data.

While we use full-time equivalent in our analysis of staff numbers (see below), only headcount figures are available for most characteristics of the workforce (such as grade, ethnicity and disability status). There are some exceptions, such as specialisms.

Staff numbers

For staff numbers, we use table 9 from the ONS’s quarterly Public Sector Employment series, which contains staff numbers (full-time equivalent, FTE) for all public organisations that employ civil servants. FTE counts part-time staff according to the time they work (e.g. a person working two days a week counts as 0.4); this is more accurate than headcount, which does not distinguish between full-time and part-time employees.
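As a minimal illustration of the difference between the two measures (figures invented, assuming a five-day working week):

```python
# Sketch: full-time equivalent (FTE) vs headcount.
days_per_week = [5, 5, 2, 3]  # two full-timers, two part-timers

headcount = len(days_per_week)                 # every person counts as 1
fte = sum(days / 5 for days in days_per_week)  # part-timers count fractionally

print(headcount)  # 4
print(fte)        # 3.0 (1 + 1 + 0.4 + 0.6)
```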

Our calculated rates of change in each period for each department are adjusted for reclassifications of staff between bodies; reclassifications are usually noted by the ONS in footnotes to the data tables. The figures shown for each department in our ‘change from baseline’ charts take a geometric average of per-period change rates over all periods from 2010 Q3 (our Spending Review baseline) to the latest period.

In our analysis of the Department for Exiting the European Union (DExEU), we have used the ONS’s estimate of total headcount, which includes all members of staff on loan from other departments. This means that some employees will be counted twice (under DExEU and under their home department).
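The change-from-baseline calculation can be sketched as follows; the quarterly figures are invented for illustration, and reclassification adjustments are omitted:

```python
from math import prod

# Hypothetical quarterly FTE figures, starting from the 2010 Q3 baseline.
fte = [1000, 980, 970, 985, 960]

# Per-period growth factors (e.g. 980/1000 = 0.98).
factors = [b / a for a, b in zip(fte, fte[1:])]

# Geometric average growth factor per period...
avg_factor = prod(factors) ** (1 / len(factors))

# ...compounded over all periods gives the change from baseline.
change_from_baseline = avg_factor ** len(factors) - 1
print(round(change_from_baseline, 3))  # -0.04, i.e. a 4% fall since the baseline
```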


We have grouped the 27 different civil service professions (and ‘unknown’ and ‘not reported’) used by the ONS into four overarching categories as follows:


Profession – IfG category

Commercial – Cross-departmental specialisms
Communications – Cross-departmental specialisms
Corporate Finance – Cross-departmental specialisms (included in the Finance sub-category)
Digital, Data and Technology – Cross-departmental specialisms
Economics – Cross-departmental specialisms (included in the Analytics sub-category)
Finance – Cross-departmental specialisms (included in the Finance sub-category)
Human Resources – Cross-departmental specialisms
Inspector of Education and Training – Departmental specialisms
Intelligence Analysis – Departmental specialisms
Internal Audit – Cross-departmental specialisms
Knowledge and Information Management – Cross-departmental specialisms
Legal – Cross-departmental specialisms
Medicine – Departmental specialisms
Operational Delivery – Operational delivery
Operational Research – Cross-departmental specialisms (included in the Analytics sub-category)
Planning – Departmental specialisms (included in the Planning sub-category)
Planning Inspectors – Departmental specialisms (included in the Planning sub-category)
Policy – Cross-departmental specialisms
Project Delivery – Cross-departmental specialisms
Property – Cross-departmental specialisms
Psychology – Departmental specialisms
Science and Engineering – Departmental specialisms
Security – Departmental specialisms
Social Research – Cross-departmental specialisms (included in the Analytics sub-category)
Statistics – Cross-departmental specialisms (included in the Analytics sub-category)
Tax – Departmental specialisms
Veterinarian – Departmental specialisms

Please note: this is a change from our previous classifications, used most recently in Whitehall Monitor 2017.
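For analysis, the grouping above amounts to a lookup from profession to category; a minimal Python sketch, abridged to a few rows of the table (mapping 'unknown' and 'not reported' to an Unknown bucket is our assumption):

```python
from collections import Counter

# Abridged profession-to-category lookup, taken from the table above.
CATEGORY = {
    "Operational Delivery": "Operational delivery",
    "Policy": "Cross-departmental specialisms",
    "Legal": "Cross-departmental specialisms",
    "Statistics": "Cross-departmental specialisms",  # Analytics sub-category
    "Tax": "Departmental specialisms",
    "Medicine": "Departmental specialisms",
}

def categorise(professions):
    """Roll a list of profession labels up to IfG category counts;
    anything not in the table (e.g. 'unknown') falls into Unknown."""
    return Counter(CATEGORY.get(p, "Unknown") for p in professions)

print(categorise(["Policy", "Tax", "Legal", "Operational Delivery"]))
```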

Financial transparency

We have ranked each government department according to how transparently it accounts for movements in spending plans.

For each financial year we compared the original spending plan, as published in Spending Review 2010, Spending Review 2013, and Spending Review 2015, with every reissue of a plan for that financial year (in annual Budget documents and the department’s Annual Report and Accounts), and noted whether the spending plan had changed and whether this change was explained. We looked for explanations in the annual Budget documentation, in the Government’s Public Expenditure Statistical Analyses (PESA), in departmental Annual Reports and Accounts, and in Explanatory Memoranda to Main and Supplementary Estimates.

We graded each department according to:

  • whether an explanation was given for a change
  • whether each movement was fully or partially explained
  • where the explanation appeared and how easy it was to access the documentation.

We then ranked the departments based on their average ranking across the financial years falling within each Spending Review (2011/12 to 2014/15 for Spending Review 2010; 2015/16 for Spending Review 2013; and 2016/17 for Spending Review 2015). The average ranking across all Spending Review periods was used to calculate the overall ranking.
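The two-stage averaging can be sketched as follows; the department names and per-year ranks are invented for illustration:

```python
# Hypothetical per-year transparency ranks, grouped by Spending Review period.
ranks = {
    "Dept A": {"SR2010": [1, 2, 1, 2], "SR2013": [1], "SR2015": [3]},
    "Dept B": {"SR2010": [3, 3, 2, 1], "SR2013": [2], "SR2015": [1]},
}

def overall_score(by_period):
    # Average rank within each Spending Review period...
    period_avgs = [sum(r) / len(r) for r in by_period.values()]
    # ...then average across periods for the overall ranking (lower is better).
    return sum(period_avgs) / len(period_avgs)

for dept, by_period in ranks.items():
    print(dept, round(overall_score(by_period), 2))
```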

Managing public spending

For each department we calculated total Resource Departmental Expenditure Limit (RDEL) minus depreciation, using 2016/17 data in HM Treasury’s Online System for Central Accounting and Reporting (OSCAR). This gave us each department’s total spending figure. Individual spending lines for each department were then ranked from largest to smallest and expressed as a percentage of the total RDEL figure.

Each department’s spending lines were categorised as direct management, sponsorship of public bodies, system and grant funding, and markets and contracting. For each department we categorised approximately 85%–100% of total RDEL spending. Negative spending lines (i.e. income, such as at the Home Office for UK Visas and Immigration) have not been included. In certain areas we used departmental Annual Reports and Accounts to supplement our understanding of spending.

The net result of this process was a percentage breakdown into four component parts of each department’s total RDEL. This percentage breakdown forms the underlying basis of the heat map on page 62.
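A minimal sketch of this percentage breakdown; the spending lines, figures and category assignments here are invented, not OSCAR data:

```python
# Hypothetical spending lines (£m, RDEL minus depreciation) with our four
# management categories; uncategorised and income (negative) lines are
# left out of the shares, so categorised lines cover less than 100%.
lines = [
    ("Prisons and probation", 2400, "direct management"),
    ("Grant to arm's-length body", 900, "sponsorship of public bodies"),
    ("Grants to local authorities", 1500, "system and grant funding"),
    ("Outsourced services", 700, "markets and contracting"),
    ("Other spending", 500, None),  # not categorised
]

total_rdel = sum(amount for _, amount, _ in lines)  # the 100% figure

shares = {}
for _, amount, category in lines:
    if category is None or amount < 0:
        continue
    shares[category] = shares.get(category, 0.0) + 100 * amount / total_rdel

print(shares)  # categorised lines cover about 92% of total RDEL here
```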

Each spending line in the OSCAR data is categorised as either ‘Programme’ or ‘Administration’ spending, which gave us the data for the breakdown of directly managed spending on page 63.

Public bodies by department

For figure 4.4, we have taken data from both the Cabinet Office’s Public Bodies 2017 report and the list of government organisations on GOV.UK (as at 4 October 2017). Where there are differences between the two data sources, we have made a case-by-case decision regarding which source best reflects the current status of public bodies.

Responsiveness ranking

Our composite ranking collates timeliness in responding to:

  • Freedom of Information requests, Q4 2016 to Q3 2017. The score represents whether replies were made ‘in time’ – that is, inside the statutory 20-day limit or with a permitted ‘reasonable extension’.
  • Written parliamentary questions, 2016–17 session. We combine the percentage of ‘named day’ questions answered by the specified day with the percentage of ordinary written questions answered within five sitting days.
  • Ministerial correspondence, 2016. We use the percentage answered within target – departments themselves set the target time, up to a maximum of 20 days.

The area of each bubble is determined by total volume, and its position on the y-axis is the percentage answered on time for that metric. Departments are then ordered by their overall rank for timeliness, calculated from their rank on each individual metric.
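One way to compute such a composite ordering is to rank departments on each metric and then order them by the sum of those ranks; the percentages below are invented, and the exact rank-combination rule is our assumption:

```python
# Hypothetical percentage answered on time for each metric.
on_time = {
    "Dept A": {"FoI": 92, "WPQ": 80, "Correspondence": 95},
    "Dept B": {"FoI": 88, "WPQ": 90, "Correspondence": 85},
    "Dept C": {"FoI": 75, "WPQ": 85, "Correspondence": 90},
}

metrics = ["FoI", "WPQ", "Correspondence"]

# Rank departments on each metric (1 = most timely)...
rank_sums = {dept: 0 for dept in on_time}
for m in metrics:
    ordered = sorted(on_time, key=lambda d: on_time[d][m], reverse=True)
    for position, dept in enumerate(ordered, start=1):
        rank_sums[dept] += position

# ...then order departments by the sum of their per-metric ranks.
overall = sorted(on_time, key=lambda d: rank_sums[d])
print(overall)  # ['Dept A', 'Dept B', 'Dept C']
```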

Hospitality releases

We are extremely grateful to Transparency International UK (TI-UK) for compiling all quarterly releases published by government departments about the gifts and hospitality received and travel and meetings conducted by ministers, special advisers and senior officials. Our analysis has looked only at the ministerial releases. Having been provided with a list of publication dates by TI-UK, we sought to fill in any gaps by looking at GOV.UK and through correspondence with the Government Digital Service.

We understand that departments are supposed to publish a quarter in arrears. We have given a few days’ grace in our calculations, and so our analysis may be slightly more generous than the reality.

Spending over £25,000

We searched for £25,000 spend data on GOV.UK and for releases covering the period November 2010 to November 2017, in line with David Cameron’s initial instruction to government departments and our own publication schedule.

Where we could not find a file, or could not find a publication date, we corresponded with the Government Digital Service; we also used the page history function on GOV.UK. If a release still could not be located, we marked the file as ‘Not published’ (and if its date could not be found, ‘Date unknown’).

Treasury guidance says the releases should be published by the end of the following month (e.g. the September 2016 file should have been published by 31 October 2016). We extended the limit for ‘On time’ releases to 70 days from the first day of the month to which the data refers, to allow for weekends and public holidays and to give a few days’ grace. The guidance is clear that each monthly release should be published separately, but some departments have published in bulk. We have generously counted those months that were in time as ‘On time’, and others as ‘Late’.
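The 70-day cut-off can be sketched with standard date arithmetic; the classification function is our illustration, not Treasury guidance:

```python
from datetime import date, timedelta

def on_time(data_month_start: date, published: date, grace_days: int = 70) -> bool:
    """Classify a monthly spend release as 'On time' if it appeared within
    70 days of the first day of the month the data covers."""
    return published <= data_month_start + timedelta(days=grace_days)

# September 2016 data: guidance implies publication by 31 October 2016;
# the 70-day rule allows up to 10 November 2016.
print(on_time(date(2016, 9, 1), date(2016, 10, 31)))  # True
print(on_time(date(2016, 9, 1), date(2016, 11, 15)))  # False
```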


Organograms

We searched for organogram data on GOV.UK and recorded whether or not we could find the file for each six-month period. We corresponded with the Government Digital Service if we could not find a file. Guidance says departments should publish their 31 March organograms by 6 June, and their 30 September organograms by 6 December. Our final data was collected on 8 January 2018; since departments were due to have published their September 2017 organograms by 6 December, this is slightly generous. Some organograms may have been published since, though they would still be late.

Single Departmental Plans

A good Single Departmental Plan (SDP) would have a short list of specific priorities, with a list of actions the department plans to undertake to achieve them.

The SDPs published in December 2017 have headline objectives, with sub-objectives underneath and actions beneath those. Where an objective constituted a single specific priority, we counted it and ignored the sub-objectives and actions below it. Where it did not, we counted the number of specific (or non-specific) priorities among those sub-objectives and actions. A priority was deemed specific if it was possible to assess whether or not it had been achieved.

For publication dates, we have relied on the ‘full page history’ function found on pages published on GOV.UK.

International Civil Service Effectiveness (InCiSE) index

InCiSE has published a full technical report with details of the methodology used, available on the Institute for Government website.