Following a review of the Met Police’s use of the technology, the panel recommends that live facial recognition software should only be deployed if five conditions can be met.
The independent London Policing Ethics Panel (LPEP), set up by the Mayor of London to provide ethical advice on policing issues that may impact on public confidence, has set out new guidelines on how facial recognition technology should be used by the Metropolitan Police (Met) in the UK capital.
Facial recognition software is designed to check people passing a camera in a public place against images on police databases. The guidelines build on the initial recommendations that the panel made in July last year, which the Met has already incorporated into its use of facial recognition technology.
The Ethics Panel’s research was informed by an examination of Londoners’ views on the police’s use of live facial recognition technology. More than 57 per cent felt police use of facial recognition software was acceptable, but this figure increased dramatically to around 83 per cent when respondents were asked whether they supported using the technology to search for serious offenders.
Although half of the respondents thought the use of this software would make them feel safer, more than a third of people raised concerns about the impact on their privacy.
The Met has carried out 10 trials using facial recognition technology across London as part of efforts to incorporate the latest technologies into day-to-day policing.
Following an extensive review of the Met’s use of this software, the independent Ethics Panel has today published a comprehensive final report recommending that live facial recognition software should only be deployed by police if five conditions set out in the report can be met.
The panel has published information about the trials on its website and informed Londoners about what the software is attempting to achieve.
The panel has also set out a framework to support the police when trialling new technology. The framework is designed to address any ethical concerns about how new technology will be used by the police, and to ensure that it serves to protect the public from risk and harm.
The framework consists of 14 questions covering engagement, diversity and inclusivity that the Met must consider before proceeding with any technological trial.
In addition to the Ethics Panel’s own research, the Met Police is carrying out two independent technical evaluations into its use of facial recognition software. The panel recommends that the Met does not conduct any further trials until the police have fully reviewed the results of the independent evaluations and are confident they can meet the conditions set out in the final report.
The panel concluded that while “there are important ethical issues to be addressed, these do not amount to reasons not to use LFR at all”, suggesting that “the Met should proceed with caution and ensure that robust internal governance arrangements are in place that will provide sound justifications for every deployment”.
“Our report takes a comprehensive look at the potential risks associated with the Met’s use of live facial recognition technology. Given how much of an impact digital technology can have on the public’s trust in the police, ensuring that the use of this software does not compromise this relationship is absolutely vital,” said Dr Suzanne Shale, who chairs the London Policing Ethics Panel.
She added: “To reduce the risks associated with using facial recognition software, our report suggests five steps that should be taken to make sure the relationship between the police and the public is not compromised. We will be keeping a close eye on how the use of this technology progresses to ensure it remains the subject of ethical scrutiny.”