The Alan Turing Institute proposes new framework for automated analytics

The Alan Turing Institute has proposed a new framework for the use of automated analytics in security and law enforcement. 

The framework is designed to assess the risk of privacy intrusion arising from the use of automated analytics. It is based on new research by the Alan Turing Institute’s Centre for Emerging Technology and Security (CETaS), which finds that automated analytics is crucial to enabling national security and law enforcement agencies to operate effectively, but also raises privacy concerns, since protecting the nation can involve a degree of monitoring and surveillance.

As automated methods are increasingly deployed to process the data collected through the use of such powers, the report seeks to understand the additional privacy considerations that could arise from this automated processing, and how any intrusion can be kept to a minimum.

The research was based on interviews and focus groups with stakeholders across the UK government, national security and law enforcement, and legal experts outside government.

The resulting framework aims to introduce a common language and taxonomy to assist stakeholders in assessing the potential impact of relevant privacy considerations. It is not intended to replace any existing authorisation or compliance processes, but rather to provide an additional layer of assurance.

It focuses on six key factors relevant to proportionality judgements: datasets; results; human inspection; tool design; data management; and timeliness and resources.

Promoting transparency and public trust

Dr Marion Oswald, lawyer and Senior Research Associate at The Alan Turing Institute, said: “We need to better understand, map and monitor the risk of multiple, connected, automated systems feeding into each other over an extended period. We hope this framework will be adopted by people across the national security and law enforcement communities, such as analysts, investigators, legal advisers, oversight bodies and judicial commissioners.” 

Ardi Janjeva, lead author and Research Associate at CETaS, said: “Big data analytics and automated systems are becoming much more widespread in society. This means that changing expectations of privacy need to be understood in a more rigorous way, to promote transparency and public trust. That’s why we need more public perception surveys of intrusion from automated analytics in different scenarios.”
