
How AI can help local authorities prioritise food inspections

Written by Staff Writer | Aug 17, 2022

The Food Standards Agency (FSA), the government department responsible for food safety and hygiene in England, Wales and Northern Ireland, is testing an artificial intelligence (AI) tool to help local authorities prioritise food business inspections and clear the inspections backlog created by the pandemic.

Jesse Williams, Head of the Food Hygiene Rating Scheme (FHRS) at the FSA, shares with Government Transformation how the tool was developed - and why it is relevant for other government agencies looking to implement AI.

Dealing with the backlog from the pandemic

The FHRS informs the public about the hygiene standards of businesses that handle food, such as restaurants, takeaways, hotels and supermarkets.

Businesses are rated on a scale from 0 to 5, which provides a snapshot of the hygiene standards found at the time of inspection. Although displaying the popular green rating sticker is not compulsory in England - it is in Wales and Northern Ireland - all ratings can be found online. The scheme is run by the FSA in collaboration with local authorities, which are responsible for carrying out the on-site inspections.

During the first year of the pandemic, the number of businesses awaiting an FHRS inspection grew steadily, explains Jesse Williams, who heads the FHRS at the FSA. Food teams were diverted to Covid-19-related casework and officers were unable to make on-site visits, adding pressure to the system.

Furthermore, local authorities were receiving a greater number of registrations from new businesses, while many existing businesses changed how they operated, meaning they could potentially require new inspections.

At the beginning of 2022, the FSA began consultations with a group of local authorities across the country to investigate ways to solve this problem. Driving change through data and digital technology is at the core of the FSA's strategy, so the department started to explore how an algorithmic tool could leverage the huge amounts of data that the FSA and local authorities hold on food business hygiene.

“We wanted to investigate whether there was an opportunity to optimise data and technology to help local authorities to risk-assess and prioritise their backlogs,” Williams says. “This would be done by providing each new business with a predicted food hygiene rating and indicator of hygiene compliance before they had had their first inspection.” 

This data, he adds, could then be combined with the local authority’s existing prioritisation processes and officers’ existing knowledge about their businesses. 

The FSA’s Strategic Surveillance Team and Regulatory Compliance Division entered a partnership with technology supplier Cognizant to start developing the tool in collaboration with a group of local authorities. 

Among other tasks, this collaboration involved working with business and technical subject matter experts within and outside of the FSA to understand and define the use-case scope, approach and expected benefits and risks of the tool. 

It also meant collating and preparing the relevant data sources for predictive modelling based on subject matter experts' feedback, ensuring data quality standards were met, and liaising with local authorities to test the model with users and analyse the results.

How the tool works

The tool consists of a machine learning (ML) model integrated into a web application that can be accessed by the local authority officers responsible for carrying out FHRS inspections.

The FHRS AI model was developed using LightGBM, an ML framework, and trained on data from three sources: the FSA's own FHRS data, publicly available data from the 2011 Census, and open data from HERE Technologies.
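
The article does not detail the data pipeline, but a minimal sketch of how those three sources might be collated into a single modelling table could look like the following. All file names, column names and join keys are assumptions for illustration, not the FSA's actual schema.

```python
# A minimal sketch of collating the three data sources named above into one
# modelling table. File names, column names and join keys are illustrative
# assumptions; the FSA's actual schema is not public.
import pandas as pd

fhrs = pd.read_csv("fhrs_establishments.csv")   # FSA's FHRS inspection data (hypothetical extract)
census = pd.read_csv("census_2011_areas.csv")   # 2011 Census area statistics (hypothetical extract)
here_poi = pd.read_csv("here_places.csv")       # HERE Technologies open place data (hypothetical extract)

# Join area-level census context and place attributes onto each establishment
df = (
    fhrs
    .merge(census, on="area_code", how="left")
    .merge(here_poi, on="establishment_id", how="left")
)

# Basic data-quality checks before any modelling
assert df["establishment_id"].is_unique, "duplicate establishments after joins"
print(df.isna().mean().sort_values(ascending=False).head())  # missingness by column

df = df.set_index("establishment_id")
```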

“Using this data, the model is trained to predict the food hygiene rating of an establishment awaiting its first inspection, as well as predicting whether the establishment is compliant or not,” Williams shares. 

“The model is integrated into a web application that presents local authority users with information on establishments within their remit.”
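
Based on that description, a hedged sketch of the two prediction tasks - the 0 to 5 rating and the binary compliance flag - might look as follows, continuing from the combined table in the previous sketch. Treating the rating as a six-class classification problem is an assumption; the article does not specify how the target is framed.

```python
# A sketch of the two LightGBM models described: one predicting the 0-5
# rating and one predicting compliance. Column names continue the
# assumptions from the previous sketch; categorical columns are assumed
# to use pandas' "category" dtype, which LightGBM handles natively.
import lightgbm as lgb
from sklearn.model_selection import train_test_split

X = df.drop(columns=["rating", "compliant"])
X_train, X_test, y_train, y_test = train_test_split(
    X, df[["rating", "compliant"]], test_size=0.2, random_state=42
)

# Task 1: predict the food hygiene rating (framed here as six classes, 0-5)
rating_model = lgb.LGBMClassifier(objective="multiclass")
rating_model.fit(X_train, y_train["rating"])

# Task 2: predict whether the establishment is broadly compliant
compliance_model = lgb.LGBMClassifier(objective="binary")
compliance_model.fit(X_train, y_train["compliant"])

predicted_rating = rating_model.predict(X_test)
compliance_probability = compliance_model.predict_proba(X_test)[:, 1]  # P(compliant)
```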

The information that local authorities can access through the web application includes a table with the name and type of each establishment, alongside its predicted compliance and predicted rating. This data can also be visualised on an interactive map and downloaded as a CSV file.

Williams says that the model's rating and compliance predictions, combined with the knowledge of the subject matter experts using the system, can help local authorities deploy their resources more effectively, focusing on businesses that show traits of a higher risk of non-compliance - and which are therefore likely to receive a lower food hygiene rating.
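
Continuing the same sketch, one plausible way those outputs could feed prioritisation is to assemble the table the web application displays, rank it by predicted non-compliance risk and offer it as a CSV download. The threshold and ordering below are illustrative choices, and - as the FSA's guidance stresses later in this piece - such a ranking would only ever be one input among several.

```python
# Assemble the per-establishment table described above and rank it so the
# highest predicted risk of non-compliance surfaces first. The 0.5 threshold
# and the ordering are illustrative choices, not the FSA's published method.
results = pd.DataFrame({
    "establishment_type": X_test["business_type"],  # hypothetical column name
    "predicted_rating": predicted_rating,
    "predicted_compliant": compliance_probability >= 0.5,
    "non_compliance_risk": 1 - compliance_probability,
})  # indexed by establishment_id, standing in for the name shown in the app

priority = results.sort_values("non_compliance_risk", ascending=False)
priority.to_csv("fhrs_predictions.csv")  # the CSV download option
```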

“Not using machine learning, and therefore not having this predictive capability, may reduce the ability of local authorities to make the best possible use of data and digital to tackle their backlogs or to operate more efficiently,” he adds. 

“In the long-term, this unrealised benefit may threaten the integrity of the food hygiene rating scheme or reduce consumer protection because a backlog may remain, or local authority resources may be used less efficiently.”

The development work took place over 10-week sprints, and the tool is currently in a testing phase in which it will be shared with a limited number of local authorities to gather feedback and inform further iterations if the FSA decides to operationalise it. Although the original development was carried out with Cognizant, the next phase has been taken over by Kainos and Faculty.

How the FSA is mitigating the tool’s bias risks

As with any AI system built on ML algorithms trained on data, there is a risk of bias creeping into the tool. In the case of the FHRS algorithm, there are three potential sources of bias, Williams explains. The first is potential bias in the model itself, which could, for example, consistently give establishments of a certain type much lower and less accurate predictions.

A second source of bias could originate from inspectors, whose perception of an establishment could be influenced by the tool's predicted rating, and by whether the system has classified it as compliant, before they carry out a full inspection.

Lastly, Williams says that with the use of AI and ML there is a chance of “decision automation bias or automation distrust bias” occurring: “Essentially, this refers to a user being over- or under-reliant on the system, leading to a degradation of human reasoning,” he says.

To mitigate these bias risks, the FHRS team integrated explainability and fairness tooling during the exploration and development phases of the model. These tools will also be integrated and monitored post-alpha testing to detect and mitigate potential biases from the system once it is fully operational.
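
The article does not name the tooling involved. As an illustration only, SHAP is one widely used explainability library for LightGBM models, and a simple per-group accuracy comparison is one basic fairness check of the kind the team describes.

```python
# Illustrative explainability and fairness checks; the FSA's actual tooling
# is not named in the article.
import shap

# Explainability: per-feature contributions to each compliance prediction
explainer = shap.TreeExplainer(compliance_model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)

# One basic fairness check: does accuracy vary by establishment type?
# ("business_type" continues the hypothetical column name used earlier.)
check = pd.DataFrame({
    "business_type": X_test["business_type"],
    "correct": compliance_model.predict(X_test) == y_test["compliant"],
})
print(check.groupby("business_type")["correct"].mean())
```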

The team also continuously engages in ‘reflect, act and justify sessions’ with business and technical subject matter experts throughout the delivery of the project, as well as using impact assessments to identify, assess and manage potential risks.

In addition to these measures, the FSA has developed usage guidance for local authorities that specifically outlines how the tool is expected to be used. 

“This document also clearly states how the service should not be used, for example, the model outcome must not be the only indicator used when prioritising businesses for inspection,” Williams concludes.

Williams adds that the aim of the tool is to sit alongside, rather than replace, the way that local authorities prioritise inspections. And although the solution is specific to the FHRS, he says the approach could be replicated in other government departments with similar use cases and relevant data sources.