New facial recognition technology guidelines for Met Police

The independent panel that advises City Hall on the ethics of policing has set out new guidelines on how current and new facial recognition technology should be used by the Met Police in the capital.

The Metropolitan Police has carried out 10 trials using facial recognition technology across London as part of efforts to incorporate the latest technologies into day-to-day policing. Facial recognition software is designed to check people passing a camera in a public place against images on police databases.

Following an extensive review of the Met’s use of this software, the independent Ethics Panel has today published a comprehensive final report which recommends that live facial recognition software should only be deployed by police if the five conditions below can be met:

  • The overall benefits to public safety must be great enough to outweigh any potential public distrust in the technology
  • It can be evidenced that using current and new facial recognition technology will not generate gender or racial bias in policing operations
  • Each deployment must be assessed and authorised to ensure that it is both necessary and proportionate for a specific policing purpose
  • Operators are trained to understand the risks associated with use of the software and understand they are accountable
  • Both the Met and the Mayor’s Office for Policing and Crime develop strict guidelines to ensure that deployments balance the benefits of this technology with the potential intrusion on the public

This builds on the initial recommendations the panel made in July last year, which the Met has already incorporated into its use of current facial recognition technology and which will set guidelines for the deployment of any new facial recognition technology. The force has published information about the trials on its website, informed Londoners about what the software is attempting to achieve, and allowed the panel to observe and comment on subsequent trials.

The panel has also set out a framework to support the police when trialling new technology. The framework is designed to address ethical concerns about how new technology will be used by the police and to make sure it serves to protect the public from risk and harm. The framework consists of 14 questions covering engagement, diversity and inclusivity that the Met must consider before proceeding with any technological trial.

The Ethics Panel’s research was informed by an examination of Londoners’ views on the police’s use of live facial recognition technology. More than 57 per cent felt police use of facial recognition software was acceptable, but this figure increased dramatically to around 83 per cent when respondents were asked whether they supported using the technology to search for serious offenders.

Although half of the respondents thought the use of this software would make them feel safer, more than a third of people raised concerns about the impact on their privacy.

In addition to the Ethics Panel’s own research, the Met Police is carrying out two independent technical evaluations into its use of facial recognition software. The panel recommends that the Met does not conduct any further trials until the police have fully reviewed the results of the independent evaluations and are confident they can meet the conditions set out in the final report.

The panel concluded that while ‘there are important ethical issues to be addressed, these do not amount to reasons not to use LFR at all’, suggesting that ‘the Met should proceed with caution and ensure that robust internal governance arrangements are in place that will provide sound justifications for every deployment.’

Deputy Mayor for Policing and Crime, Sophie Linden, said:

“I welcome this extensive report into the potential implications of facial recognition software, and the recommendation that this technology should only be deployed by the police after five conditions are met – including strict new guidelines.

“We will continue to work closely with the Met and ensure the panel’s recommendations are addressed before further deployment.”

Dr Suzanne Shale, who chairs the London Policing Ethics Panel, said:

“Our report takes a comprehensive look at the potential risks associated with the Met’s use of live facial recognition technology. Given how much of an impact digital technology can have on the public’s trust in the police, ensuring that the use of this software does not compromise this relationship is absolutely vital.

“To reduce the risks associated with using facial recognition software, our report suggests five steps that should be taken to make sure the relationship between the police and the public is not compromised. We will be keeping a close eye on how the use of this technology progresses to ensure it remains the subject of ethical scrutiny.”

Read the full report here: https://www.london.gov.uk/press-releases/mayoral/future-framework-for-facial-recognition-software