Interview: Deputy National Statistician on the future of census data
Census 2021 was the most successful campaign in UK census history to date. 97% of households across England and Wales submitted their responses, exceeding both the 94% pre-census target and the response rate of the previous census in 2011.
It was also the first time that the Office for National Statistics (ONS), the government department responsible for planning and running the census, delivered a digital-first census, with over 22 million people submitting their answers online.
Data collected through the census is essential to inform policy and to support vital infrastructure, planning and services. It identifies which areas should be prioritised and where resources should be allocated at national, regional and local levels. However, despite the rich picture that the census provides of people across England and Wales, it only happens once every 10 years - a long time in terms of population change.
Alison Pritchard, Deputy National Statistician and Director General Data Capability at the ONS, believes that the future of the census lies in being able to provide administrative data and population statistics more frequently: “We’re just working on how frequent that can be but it could be quarterly or even monthly. We’re working out what the art of the possible is,” she said.
Earlier this year, the ONS announced its plans to adopt a new dynamic population model (DPM) that would allow the department to estimate and produce population figures more regularly at national and local authority level. The DPM will use a statistical modelling approach to leverage a range of data sources such as administrative and survey data, and it builds on previous ONS research to develop admin-based population estimates.
“We are going to be consulting users during the first half of next year on the research we have done,” Pritchard told Government Transformation magazine. “In late 2023, we will make a recommendation to government on what is needed for us to continue to realise our ambitions for more frequent, timely and inclusive population and social statistics.”
Challenges ahead include ensuring that the right methodology is applied to administrative data when gathering information directly from citizens, as well as overcoming obstacles around data sharing - something Pritchard said the ONS is collaborating on with other government agencies: “It’s part and parcel of what we’re working through together.”
Digital twins for better decision-making
To ensure effective data-sharing in the public sector, Pritchard maintains that data needs to be articulated around “three Ls”, namely Linked, Local and Longitudinal.
Linked data requires integrated datasets able to offer a holistic view of an issue: “If you want to tackle something like climate change or jobs, you can’t just use one or two datasets: you need a combined view,” Pritchard shared at the Government Data Summit.
Local data provides an accurate picture and granular detail of the differences existing between localities in the country and is essential for the delivery of the Levelling Up agenda. The ONS uses local data regularly for initiatives such as the personal inflation index, which shows people how inflation is affecting them in their particular circumstances.
Pritchard added: “When you start getting local data, you start seeing really interesting pictures of areas of deprivation right alongside areas of high wealth in the same or adjacent households.”
The last L refers to longitudinal data: a kind of dataset that Pritchard said still needs to be mainstreamed. Under the common approach to policy evaluation, data is only assessed once a policy has been developed, and success or failure outcomes are only known years after the policy was implemented.
Instead, longitudinal data enables a digital twin - a virtual model able to test policy before it is implemented - ensuring better decision-making and a more efficient allocation of resources.
“Having the right longitudinal data available to do the modelling and test policy ideas against a virtual model that allows us to safely work out what we think the response to a particular intervention is: that’s what government should be doing.”
‘Design for tomorrow, build for today’
For Pritchard, a common trap that government falls into when working with data is short-termism. Instead, she is adamant about working on the foundational elements supporting the use of data, including the “plumbing” of data architecture, risk management of data assets and technical architecture.
“Let’s design for tomorrow and build for today,” Pritchard says. “Get that right and we will see many benefits downstream.”
In this area, working towards the same usable standards will be crucial - first internally within government, then across public sector organisations in the UK and, lastly, internationally.
In August, the UK adhered to the Special Data Dissemination Standard Plus model, the International Monetary Fund’s highest dissemination standard for economic and financial data. Pritchard said that this initiative will allow the ONS and the wider government to work to international data standards.
“We’re working towards standards that, even if they are not usable today, may be in one or five years’ time,” she added. “That will ensure that we have the right model to work with data internationally as well.”
Ultimately, for government to succeed in its use of data, it will need to continuously build and sustain citizens’ trust in how it collects and uses that data.
Pritchard said that the main reason the public gives for trusting ONS data is their belief that the ONS has no vested interest in the way data is used. Conversely, the top reason the public gives for mistrusting data is the belief that it is being used for purposes other than those it was collected for.