How we define departments (throughout)
Where possible, we group bodies into ‘departmental groups’ according to where ministerial responsibility lies, even when these are reported under a separate ‘departmental’ heading in the original data. For instance, we group Ofsted with DfE and not as a separate department.
We then make the following distinction within each departmental group:
- department – the core department and other bodies within the department that are line-managed within a structure that flows from the departmental leadership (for example, HM Prison and Probation Service within MoJ, the Education and Skills Funding Agency within DfE)
- other organisations – other bodies employing civil servants, like executive agencies and non-ministerial departments, for which ministers in the department have responsibility (for example, Ofsted in DfE and DVLA in DfT) but which are not part of the department’s line management structure.
Making this distinction is not always possible, and in some analyses we do not attempt it. The definitions we use are as follows:
- We apply our definition of ‘department’ in our analysis of staff numbers, grade, age, gender, ethnicity, disability, professions/specialisms, Freedom of Information and ministerial correspondence.
- We use the wider ‘departmental group’ in our analysis of location.
- We use the department as defined by the data producer on engagement, pay, major projects, Freedom of Information, spend over £25,000 and organograms.
In our analysis of government funding to public bodies (page 68, Figure 4.4), we exclude any bodies we consider to be part of the department, such as the Education and Skills Funding Agency.
| Abbreviation | Department | Other organisations in departmental group |
| --- | --- | --- |
| AGO | Attorney General’s Office | Crown Prosecution Service; Crown Prosecution Service Inspectorate; National Fraud Authority; Revenue and Customs Prosecution Office; Serious Fraud Office; Treasury Solicitor |
| BEIS | Department for Business, Energy and Industrial Strategy | Advisory, Conciliation and Arbitration Service; Companies House; Competition and Markets Authority; HM Land Registry; Insolvency Service; Intellectual Property Office; Met Office; Office of Gas and Electricity Markets (Ofgem); Ordnance Survey; UK Space Agency |
| CO | Cabinet Office (excluding agencies); Office of the Parliamentary Counsel | Buying Solutions; Central Office of Information; Charity Commission; Crown Commercial Service; Government Procurement Service; National School of Government; UK Statistics Authority |
| DCMS | Department for Digital, Culture, Media and Sport | National Archives; Royal Parks |
| Defra | Department for Environment, Food and Rural Affairs | Animal and Plant Health Agency; Animal and Veterinary Laboratories Agency; Animal Health; Centre for Environment, Fisheries and Aquaculture Science; Food and Environment Research Agency; Government Decontamination Services; Marine Fisheries Agency; Ofwat; Rural Payments Agency; Veterinary Laboratories Agency; Veterinary Medicines Directorate |
| DExEU | Department for Exiting the European Union | |
| DfE | Department for Education | Education and Skills Funding Agency; Education Funding Agency; National College; National College for Teaching and Leadership; Office of Qualifications and Examinations Regulation; Ofsted; Skills Funding Agency; Standards and Testing Agency; Teaching Agency |
| DfID | Department for International Development | |
| DfT | Department for Transport | Driver and Vehicle Licensing Agency; Driver and Vehicle Standards Agency; Driving Standards Agency; Government Car and Despatch Agency; Highways Agency; Maritime and Coastguard Agency; Office of Rail Regulation; Vehicle and Operator Services Agency; Vehicle Certification Agency |
| DHSC | Department of Health and Social Care (excluding agencies) | Food Standards Agency; Meat Hygiene Service; Medicines and Healthcare Products Regulatory Agency; National Healthcare Purchasing and Supplies; NHS Business Services Authority; Public Health England |
| DIT | Department for International Trade | Export Credits Guarantee Department/UK Export Finance (from Q3 2016) |
| DWP | Department for Work and Pensions | Child Maintenance and Enforcement Commission; DWP Corporate and Shared Services; Jobcentre Plus; Pensions & Disability Carers Services; The Health and Safety Executive; The Rent Service |
| FCO | Foreign and Commonwealth Office (excluding agencies) | Foreign and Commonwealth Office Services; Security and Intelligence Services; Wilton Park Executive Agency |
| HMRC | HM Revenue and Customs | Valuation Office |
| HMT | HM Treasury | Asset Protection Agency; Debt Management Office; Government Actuary’s Department; Government Internal Audit Agency; National Savings and Investments; Office for Budget Responsibility; Office for Government Commerce; OGC Buying Solutions; Royal Mint |
| HO | Home Office (excluding agencies) | Criminal Records Bureau; Her Majesty’s Passport Office; Identity and Passport Service; National Fraud Authority; National Crime Agency; UK Border Agency |
| MHCLG | Ministry of Housing, Communities and Local Government | Fire Service College; Planning Inspectorate; Queen Elizabeth II Conference Centre |
| MoD | Ministry of Defence | Defence Equipment and Support; Defence Science and Technology Laboratory; Defence Support Group; UK Hydrographic Office; Meteorological Office |
| MoJ | Ministry of Justice (excluding agencies) | Criminal Injuries Compensation Authority; HM Courts and Tribunals Service; HM Courts Service; Legal Aid Agency; National Archives; National Offender Management Service; Scotland Office (including Office of the Advocate General for Scotland); The Office of the Public Guardian; Tribunals Service; UK Supreme Court; Wales Office |
| NIO | Northern Ireland Office | |
Reshuffle analysis, pages 17–19
We consider a minister to have changed role if they move department, move rank (e.g. from Parliamentary Under-Secretary of State to Minister of State) or the policy areas their role covers substantially change or increase. Because our analysis tends to take place as a reshuffle is unfolding, ministerial responsibilities might occasionally change without us recording a minister as having changed role.
Workforce analysis, pages 25–48
Numbers may not be exact, as the Office for National Statistics reports staff numbers in any given category to the nearest 10. The ONS notes where numbers are fewer than five; we have treated these values as three (for example, in our analysis of age).
The ONS also reports as ‘Senior Civil Service’ certain roles – such as health professionals, military personnel and senior diplomats – which the Cabinet Office does not consider to be part of the actual senior civil service. This is why we refer to ‘senior civil service and equivalent’ in our analysis of ONS data.
Staff numbers, pages 26–31
For staff numbers, we use table 9 from the Office for National Statistics’ quarterly Public Sector Employment series, which contains staff numbers (full-time equivalent, FTE) in all public organisations that employ civil servants. FTE counts part-time staff according to the time they work (e.g. a person working two days a week as 0.4); this is more accurate than headcount, which does not distinguish between full-time and part-time employees.
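The FTE conversion can be sketched as follows. This is a minimal illustration, assuming a five-day full-time week (the divisor is our assumption, implied by the two-days-a-week example, not stated by the ONS):

```python
# Minimal sketch of full-time equivalent (FTE) counting.
# Assumes a five-day full-time week (our assumption for illustration).
def fte(days_per_week: float, full_time_days: float = 5) -> float:
    """Count a member of staff by the fraction of a full-time week they work."""
    return days_per_week / full_time_days

# A person working two days a week counts as 0.4 FTE,
# whereas a headcount measure would count them as 1.
part_timer = fte(2)  # 0.4
```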
Our calculated rates of change in each period for each department are adjusted for reclassifications of staff between bodies. Reclassifications are usually noted by the ONS in footnotes to the data tables. The figures shown for each department in our ‘change from baseline’ charts take a geometric average of per-period change rates from the baseline (for most departments, Q2 2016) to the latest period. In our analysis of the Department for Exiting the European Union, we have used the ONS’s estimate of total headcount, which includes all members of staff on loan from other departments. This means that some employees will be counted twice (under DExEU and their home department).
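The geometric averaging described above can be sketched as follows. The figures and the function name are illustrative, not ONS data:

```python
# Sketch: the average per-period change rate implied by a quarterly FTE
# series, taken as a geometric mean rather than an arithmetic mean of
# the individual quarter-on-quarter rates.
def geometric_average_rate(series):
    """Return r such that first * (1 + r) ** periods == last."""
    periods = len(series) - 1
    return (series[-1] / series[0]) ** (1 / periods) - 1

# Illustrative quarterly FTE figures, baseline (e.g. Q2 2016) first.
fte_by_quarter = [1000, 1020, 1010, 1040]
rate = geometric_average_rate(fte_by_quarter)
```

Compounding the returned rate over the number of periods reproduces the total change from baseline exactly, which an arithmetic average of quarterly rates would not.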
Professions/specialisms, pages 38–39
We have grouped the 27 different civil service professions reported by the ONS into four overarching categories as follows:
| Profession | Category |
| --- | --- |
| Corporate Finance | Cross-departmental specialisms (included in the Finance sub-category) |
| Digital, Data and Technology | Cross-departmental specialisms |
| Economics | Cross-departmental specialisms (included in the Analytics sub-category) |
| Finance | Cross-departmental specialisms (included in the Finance sub-category) |
| Human Resources | Cross-departmental specialisms |
| Inspector of Education and Training | Departmental specialisms |
| Intelligence Analysis | Departmental specialisms |
| Internal Audit | Cross-departmental specialisms |
| Knowledge and Information Management | Cross-departmental specialisms |
| Operational Delivery | Operational delivery |
| Operational Research | Cross-departmental specialisms (included in the Analytics sub-category) |
| Planning | Cross-departmental specialisms (included in the Planning sub-category) |
| Planning Inspectors | Cross-departmental specialisms (included in the Planning sub-category) |
| Project Delivery | Cross-departmental specialisms |
| Science and Engineering | Departmental specialisms |
| Social Research | Cross-departmental specialisms (included in the Analytics sub-category) |
| Statistics | Cross-departmental specialisms (included in the Analytics sub-category) |
Please note: this is a change from our previous classifications, used most recently in Whitehall Monitor 2017.
Financial transparency, page 61
We have ranked each government department according to how transparently it accounts for movements in spending plans.
For each financial year we compared the original spending plan, as published in Spending Review 2010, Spending Review 2013 or Spending Review 2015, with every reissue of a plan for that financial year in the Treasury’s annual Public Expenditure Statistical Analyses (PESA) publication, noting whether the spending plan had changed and whether this change was explained in the PESA report.
We graded each department according to whether an explanation was given for changes and whether the explanation was full or partial, while also taking into account the size of the changes.
For each department in each financial year, we then calculated how many penalty points – awarded for not explaining changes – they had received as a percentage of total possible penalty points. The overall ranking in the table is based on the average across all financial years.
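The scoring described above can be sketched as follows, with hypothetical penalty-point figures (the function name and numbers are ours, for illustration only):

```python
# Sketch: penalty points received as a share of the maximum possible in
# each financial year, averaged across years to give the ranking score.
def transparency_score(points_by_year, max_points_by_year):
    """Mean share of possible penalty points across financial years."""
    shares = [p / m for p, m in zip(points_by_year, max_points_by_year)]
    return sum(shares) / len(shares)

# e.g. three financial years with 2, 0 and 4 points out of a possible 8 each
score = transparency_score([2, 0, 4], [8, 8, 8])  # 0.25
```

Expressing each year as a share before averaging means years with different maxima are weighted equally; a lower score indicates more transparent accounting for changes.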
Managing public spending, pages 64–67
For each department we calculated the total amount of Resource Departmental Expenditure Limit (RDEL) minus depreciation, using 2017/18 data in HMT’s Online System for Central Accounting and Reporting (OSCAR). This provided us with a 100% departmental spending figure. Individual spending lines for each department were then ranked from largest to smallest and calculated as a percentage of the total RDEL figure.
Each department’s spending lines were categorised as direct management, sponsorship of public bodies, system and grant funding, or contracting. For each department we categorised approximately 80% to 100% of total RDEL spending. Negative spending lines (i.e. income, such as at the Home Office for UK Visas and Immigration) have not been categorised. In certain areas we used departmental Annual Reports and Accounts to supplement our understanding of spending.
The net result of this process was a percentage breakdown into four component parts of each department’s total RDEL. This percentage breakdown forms the underlying basis of the heat map on page 64.
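The ranking-and-share step can be sketched as follows. The spending lines and values are illustrative, not OSCAR data:

```python
# Sketch: rank a department's RDEL-minus-depreciation spending lines from
# largest to smallest and express each as a share of the department total.
def rdel_shares(spending_lines):
    """Return (line, share-of-total) pairs, largest line first."""
    total = sum(spending_lines.values())
    ranked = sorted(spending_lines.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, value / total) for name, value in ranked]

# Illustrative figures in £m
lines = {"grant funding": 600, "direct management": 300, "contracting": 100}
shares = rdel_shares(lines)  # [('grant funding', 0.6), ...]
```

In practice, as noted above, negative (income) lines are excluded before this calculation rather than passed through it.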
Each spending line in the OSCAR data is categorised as either ‘programme’ or ‘administration’ spending, providing us with the data to provide the breakdown of directly managed spending on page 66.
Size of Freedom of Information teams, page 113
We submitted FoI requests to each government department asking for “the number of staff (full-time equivalent) in the team responsible for Freedom of Information requests” for 2010 and 2018 and any intervening years if the £600 cost limit was not exceeded.
Only three of the 22 departments – DfE, CO and HMRC – held this information for the whole period. A further 12 departments provided figures for 2018, while seven – the Scotland Office, DfID, DHSC, MHCLG, DfT, MoD and DWP – were unable to provide staff numbers even for 2018. The Scotland Office and DfID told us that all staff are responsible for responding to FoI requests, and that as a result they could not provide figures. DfT told us that they “have no business need to record this type of information”, but shared an earlier FoI response containing figures from 2005 to 2012.
DHSC told us that “where the number of individuals is fewer than or equal to five, we are unable to disclose the exact number of cases under section 40(2) of the FOIA, which relates to personal information of third parties.” MoJ responded to our initial request by seeking clarification, but then failed to provide any further information.
Finally, DWP, MHCLG and MoD all refused our request on cost grounds, but the latter two departments did suggest that a narrower request (i.e. just staff numbers for 2018) might be successful.
Hospitality releases, pages 119–120
We are extremely grateful to Transparency International UK for originally compiling all quarterly hospitality releases published by government departments for ministers, special advisers and senior officials. Its website, https://openaccess.transparency.org.uk/, allows users to search the data. Our analysis has looked only at the ministerial releases. Having been provided with a list of publication dates by Transparency International UK, we sought to fill in any gaps by searching GOV.UK and through correspondence with the Cabinet Office.
We understand that departments are supposed to publish a quarter in arrears. We have given a few days’ grace in our calculations, and so our analysis may be slightly more generous than the reality.
Spending over £25,000, pages 119–120
We searched for £25,000 spend data on GOV.UK and data.gov.uk for releases covering the period November 2010 to October 2018 in line with David Cameron’s initial instruction to government departments and our own publication schedule.
Where we could not find a file, we corresponded with the Cabinet Office; we did the same if we could not find a publication date, and also used the history function on data.gov.uk. If a release still could not be located, we marked the file as ‘not published’ (and if its date could not be found, ‘date unknown’).
Treasury guidance says the releases should be published by the end of the following month (e.g. the September 2016 file should have been published by 31 October 2016). We extended the limit for ‘on time’ releases to 70 days from the first day of the month to which the data refers, to allow for weekends and public holidays and to give a few days’ grace. The guidance is clear that each monthly release should be published separately, but some departments have published in bulk; in those cases we have generously counted the monthly releases that fell within the time limit as ‘on time’, and the rest as ‘late’.
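The 70-day rule above amounts to simple date arithmetic, sketched here with illustrative dates:

```python
# Sketch: a monthly release is 'on time' if published within 70 days of
# the first day of the month the data covers.
from datetime import date, timedelta

def on_time(data_month_start: date, published: date) -> bool:
    """True if the release falls within the 70-day grace window."""
    return published <= data_month_start + timedelta(days=70)

# For September 2016 data, 1 September + 70 days gives a 10 November deadline,
# slightly later than the 31 October deadline in the Treasury guidance.
```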
Organograms, pages 118–120
We searched for organogram data on GOV.UK and data.gov.uk and recorded whether or not we could find the file for each six-month period. We corresponded with the Cabinet Office if we could not find a file. Guidance says departments should publish their 31 March organograms by 6 June, and the 30 September versions by 6 December. Our final data was collected on 12 December 2018, as departments were due to have published their organograms for September 2018 by 6 December. Some organograms may therefore have been published in the meantime, though they would still be late.
Single Departmental Plans, pages 130–134
We defined a performance measure as any dataset, figure or other indicator included under the ‘Our Performance’ section of each SDP objective. Since SDPs use a broad range of measures, it was not possible to set quantitative boundaries for our different red/amber/green (RAG) categories. Instead we made qualitative judgements, which took into consideration the aspect of performance being measured and any relevant supplementary information found under the respective objective. Where necessary we examined longer-term patterns in a dataset to see whether changes during the period under analysis exceeded typical fluctuations.
To judge how sufficiently a measure captured performance, we subjected it to two tests: is the department responsible for changes in the measure, and is the measure connected to the objective? We assigned an RAG rating for each test, where green represented closely related, amber represented somewhat related, and red represented largely unrelated. For a measure to be considered usable for performance analysis, it had to receive one amber and one green rating, or two green ratings.
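The usability rule above – two green ratings, or one green and one amber – can be expressed as a simple check (the function and parameter names are ours, for illustration):

```python
# Sketch: a measure is usable for performance analysis only if its two
# RAG ratings (responsibility and relevance) are green+green or green+amber.
def usable(responsibility_rating: str, relevance_rating: str) -> bool:
    """True if the pair of ratings meets the usability threshold."""
    ratings = sorted([responsibility_rating, relevance_rating])
    return ratings in (["amber", "green"], ["green", "green"])
```

Equivalently: at least one green and no red. A single red rating on either test rules the measure out, however strong the other rating.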
For our objectives performance RAG rating (Figure 9.3), there were several occasions where the average of all of an objective’s measures fell between two RAG colours. In these instances we assessed the individual measures and gave more weight to those that had been scored as more relevant and/or displayed greater rates of change in performance.
Unless otherwise stated, references to SDPs concern the most recent versions. For most departments, these were last updated in May 2018, although some have received minor updates since then. For publication dates, we have relied on the ‘full page history’ function found on pages published on GOV.UK.