Five lessons on successful data interoperability
Data interoperability underpins the quest for a more data-driven government. When data flows freely, things become much simpler: new insights are uncovered, leading to better outcomes for citizens and businesses. But how can government departments operationalise their data and transcend barriers to sharing - both technical and cultural?
Five leaders from the Office for National Statistics (ONS), DWP Digital, Employment and Social Development Canada (ESDC), HM Treasury and CGI share their most valuable lessons on interoperability.
With no unified data architecture, government departments are forced to take an ad hoc, siloed approach to data linkage that is both inefficient and limiting, says Fiona James, Chief Data Officer at the Office for National Statistics (ONS). She believes greater collaboration across government is the key to making data interoperable and unlocking its full potential.
The ONS draws better insights from the data it acquires using its Reference Data Management Framework (RDMF), a set of data linkage products developed by the ONS that enables faster, more consistent linkage of datasets and makes it easy to enrich their analytical potential with reference data. James says the ONS is currently working to make this available to the rest of government, with the intention that RDMF becomes the recognised ‘standard’ for cross-government data integration.
“Having a consistent linkage approach will significantly boost analytical potential across government. As a Privacy Enhancing Technology, it also provides a really reliable way of de-identifying personal or sensitive information, whilst retaining that analytical benefit,” James says.
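A minimal sketch of the kind of de-identified linkage James describes, assuming a keyed-hash (HMAC) approach; the key, identifier field and normalisation rules here are invented for illustration and are not details of the RDMF itself:

```python
import hashlib
import hmac

# Illustrative sketch only: one common privacy-enhancing technique replaces a
# direct identifier with a keyed, deterministic token, so the same person
# links consistently across datasets without the raw identifier being shared.
# Key handling and field names below are invented for this example.

SECRET_KEY = b"shared-linkage-key"  # in practice, managed and rotated securely

def linkage_token(identifier: str) -> str:
    """Derive a stable pseudonymous token from a sensitive identifier."""
    normalised = identifier.replace(" ", "").upper()  # tolerate formatting noise
    return hmac.new(SECRET_KEY, normalised.encode(), hashlib.sha256).hexdigest()

# Two datasets holding the same person produce the same token, so their
# records can be joined on the token rather than on the identifier itself.
record_a = {"nino": "QQ 12 34 56 C", "benefit": "attendance allowance"}
record_b = {"nino": "qq123456c", "region": "North East"}

token_a = linkage_token(record_a["nino"])
token_b = linkage_token(record_b["nino"])
assert token_a == token_b  # consistent linkage despite formatting differences
```

Because the token is deterministic, analysts can join de-identified datasets at scale; because it is keyed, the raw identifier cannot be recovered from the token alone.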
The challenge is explaining the power of this new approach and securing buy-in, particularly from non-experts, policymakers, ministers and permanent secretaries - especially amid security considerations around the handling of sensitive data, James explains. “There's a high level of support for RDMF in principle, but we're dialling up our validation and assurance elements to be really transparent around how data is de-identified, what the quality standards are for the linkage and how using the RDMF compares to using existing linkage approaches.”
Full buy-in is only possible when people understand how data is being used, James adds. “We need to buy in to the greater good for public and society’s benefit around data: what is it and why is it needed? Only then can it genuinely begin to feed into decision-making and produce new analysis.”
The ONS framework has already been used successfully across a number of different areas. The Maritime and Coastguard Agency used the geographic index to uncover new insights related to victims of drowning in the UK, which led to free swimming lessons being introduced in areas with a higher level of deprivation.
Outcomes such as these offer a glimpse into what the future of government decision-making could look like if a broader, cross-cutting approach to data integration is implemented more widely, James says. “From climate change to health inequality, we’ll be able to answer questions we currently don't have answers to because we're not joining things together in a way that enables us to do that. My only ask is that government breaks away from this siloed way of operating with data.”
Simplifying the path to delivery
Data interoperability requires a common understanding about who needs what and how that information can be shared. In the context of government - where each department has separate aims, policies and technology stacks that require data to be collected and used differently - this requires new ways of thinking, says Jaime Reid, Director Consulting Services in the UK Government Sector at CGI.
She believes organisations should focus on simplifying the path to delivery: “If you start on the ground moving up, not only is it really slow, but there's a copious amount of information and detailed processes to sift through. It’s almost impossible to make headway because it becomes too hard and too complex to take meaningful action.”
By starting at a higher level, and looking at what kind of data is needed by different departments and why, the opportunities, commonalities and consistencies will start to emerge more easily, Reid explains.
Everybody likes a picture that fits on one page. By simplifying the process and uniting around shared enterprise goals as opposed to individual ones, “you can create a picture that people understand, can get excited by and can begin to form an action plan around,” Reid explains.
CGI worked with the National Crime Agency to accelerate data sharing and improve the circulation of intelligence internationally. CGI designed, developed and deployed a system that automatically downloads and processes INTERPOL data; by agreeing joint outcomes for data-sharing up front, officers can now spend more time on valuable data-driven decision-making.
“To design a solution that benefits everyone and balances divergent needs, people must have a shared goal that they recognise and sign up to; an agreed destination! To be successful, they must be willing to work together, explore alternative ideas and value different perspectives. I believe that this is how progress is made and value is realised,” Reid said.
Building systems that work together
DWP Digital is seeking to provide a more agile and seamless user experience founded around the re-use and sharing of data. “This requires a joined-up organisation where teams are not operating in silos and where processes and people are considered before building a service,” says Jacqui Leggetter, Head of Data Integration at DWP Digital.
Leggetter is responsible for enabling DWP Digital service transformation based on adoption of its strategic reference architecture, which looks at making better use of the data available to provide more holistic and seamless services for citizens. “Instead of siloing customers, we're trying to open up and give them an opportunity to come in through multiple channels.” For example, pensioners may have caring responsibilities or need additional support with attendance allowance, which currently involves different contact methods, filling out separate online claim forms, and often providing the same information more than once.
“We’ve started to build reusable components that underpin our citizen services, capturing the common data items like addresses and relationships once, then making them available to service lines across DWP through an API. This allows us to create services where citizens can easily interact with multiple products in a consistent way that enables a richer self-service experience.”
Working in this real-time data sharing world means there needs to be a high level of trust and confidence that these common technical components will be able to support cross-DWP services and wider data sharing across government. Leggetter says this requires close cooperation between business functions, infrastructure, and software engineering experts to monitor, identify and resolve issues quickly. “DWP Digital is currently processing 600 million API calls every month. This means that 600 million times every month we're sharing real-time data in a meaningful way across DWP, and across wider government services.”
Leggetter says the biggest challenge is transforming the mindset of those that have spent years finding ways to keep data locked down. “It is important to ensure that people feel empowered and have some autonomy in transforming their own service line,” she explains.
“Allowing teams the freedom to discover and re-use the data available and be part of the wider data-sharing ecosystem - not having someone do it for them - is crucial in encouraging people to make the best use of the data DWP holds.”
Another valuable lesson has been upskilling non-technical staff, Leggetter notes. “There's a balance between finding the fastest route to doing something and having engineering excellence; sometimes those two concepts can create tension. We’ve seen the benefits of upskilling business analysts, service designers and delivery managers, who also need to know how to identify where they can accelerate delivery through reuse.”
Problems caused by legacy IT systems are still a major barrier to data interoperability, Leggetter adds. “How to drive good use of data that's held in legacy systems is probably one of the biggest challenges. We've tried to tackle some of that at DWP Digital by building our legacy bridge. This has enabled us to do a bit of data transformation that enables our new services to communicate with our legacy services, bridging old and new world technology and keeping everything synchronised during the transition period.”
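A hedged sketch of what a legacy-bridge transformation can look like, assuming the old system holds fixed-width records and the new one uses a structured schema; the layout and field names are invented for illustration and are not DWP's actual format:

```python
from datetime import datetime

# Hypothetical illustration of a legacy-bridge adapter: translate records
# between a fixed-width legacy layout and a modern structured schema so the
# old and new systems stay synchronised during transition.

def from_legacy(legacy_row: str) -> dict:
    """Parse a fixed-width legacy record into the new service's schema."""
    return {
        "citizen_id": legacy_row[0:8].strip(),
        "surname": legacy_row[8:28].strip().title(),
        "date_of_birth": datetime.strptime(legacy_row[28:36], "%Y%m%d").date().isoformat(),
    }

def to_legacy(record: dict) -> str:
    """Serialise a new-format record back to the legacy layout, keeping it in sync."""
    dob = record["date_of_birth"].replace("-", "")
    return record["citizen_id"].ljust(8) + record["surname"].upper().ljust(20) + dob

# Round-tripping a record keeps both worlds consistent during the transition:
legacy_row = "AB123456" + "SMITH".ljust(20) + "19540302"
record = from_legacy(legacy_row)
assert to_legacy(record) == legacy_row
```

The round-trip property is the essence of the bridge: updates made in either system can be translated losslessly into the other until the legacy side is retired.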
More than technology
Ima Okonny, Chief Data Officer at Employment and Social Development Canada (ESDC), believes the key to becoming interoperable is taking a “human-centric” approach to data, which aligns the organisation and respects the circumstances under which data may have been collected from citizens.
This is driving ESDC’s current transformation agenda, Okonny says. “We’re looking at how we can build better systems that centre around the people we need to serve, as opposed to the technology itself.”
“Part of the challenge is understanding the context that you're working in - as opposed to trying to fit the business into a very rigid structure from a purely data perspective,” Okonny explains. “Whatever we build needs to be centred around creating the best possible outcome for citizens. This will help our systems and how we think about interoperability, as opposed to building things in a silo.”
ESDC is the Government of Canada department responsible for developing and managing social programs and services. With roughly 40,000 employees spanning the entire country, embedding a data-first culture is the key to enabling better data-sharing within the organisation, but this is an ongoing challenge.
Having a consistent data infrastructure, creating a cohesive story around data and implementing data literacy programmes are some of the key ways to ensure the organisation is “moving in the same direction,” Okonny says.
“Data is an art and a science; it is not just a technical exercise, it is an exercise in culture. This means bringing as many people as possible along on the journey, and then executing and scaling as you go,” she says.
Pushing the boundaries
John Kelly, Chief Data Officer at HM Treasury, believes it's up to DDaT leaders to push the boundaries for what’s possible in data interoperability or risk more missed opportunities.
“We've got the Central Digital and Data Office (CDDO) working to build a data catalogue and some standards around how governments can share what they've got. But doing so in a way that involves the majority of the departments just takes some time to finesse and get right.”
While that work is starting, Kelly says: “I've seen opportunities that have come and gone, that would have been better informed if departments could have shared data more quickly, or policies that we'd better understand the effectiveness of if we were sharing data to help describe that policy landscape.”
He believes the onus is on senior leaders to question old practices and push the boundaries to drive progress. “Now, importantly, there are appropriate controls in place that can be seen as gateways or barriers to sharing information in an interactive way. But I think we need to test ourselves to challenge ourselves sometimes, on some of those.”
Covid was an exceptional situation: a call to action that set the bar for how significant something needs to be before government moves quickly on a data-sharing issue, Kelly explains.
“Now we need to test ourselves and ask how less significant that situation needs to be before we're prepared to share and move quite quickly. We've already proven that we can do it. We need to push ourselves and think about whether we are doing it quickly enough in most cases.”