Ethics panel sets out future guidelines for using facial recognition software

Following a review of the Met Police’s use of the technology, the panel recommends that live facial recognition software should only be deployed if five conditions can be met.

Software checks people passing a camera in a public place against an image database

The independent London Policing Ethics Panel (LPEP), set up by the Mayor of London to provide ethical advice on policing issues that may impact on public confidence, has set out new guidelines on how facial recognition technology should be used by the Metropolitan Police (Met) in the UK capital.

 

Facial recognition software is designed to check people passing a camera in a public place against images on police databases. The guidelines build on the initial recommendations that the panel made in July last year, which the Met has already incorporated into its use of facial recognition technology.
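In broad terms, systems of this kind convert each captured face into a numerical "embedding" and compare it against embeddings enrolled on a watchlist, flagging anything that scores above a similarity threshold. The minimal Python sketch below illustrates that general idea only; the vectors, names and threshold are invented placeholders and do not describe the Met's actual system.

    import numpy as np

    # Illustrative sketch only: the embeddings, names and threshold are
    # invented placeholders, not details of any operational system.

    MATCH_THRESHOLD = 0.8  # hypothetical similarity cut-off

    def cosine_similarity(a, b):
        """Similarity between two face-embedding vectors (1.0 = same direction)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_against_watchlist(probe, watchlist):
        """Return watchlist identities whose similarity to the probe meets the threshold."""
        return [name for name, enrolled in watchlist.items()
                if cosine_similarity(probe, enrolled) >= MATCH_THRESHOLD]

    # Toy vectors standing in for embeddings produced by a face-recognition model.
    rng = np.random.default_rng(0)
    watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
    probe = watchlist["person_a"] + rng.normal(scale=0.05, size=128)  # noisy capture of person_a

    print(check_against_watchlist(probe, watchlist))  # expected: ['person_a']

Where that threshold is set determines how often the software produces false matches, which underlies the accuracy, bias and proportionality questions the panel examines.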

 

Londoners’ views

 

The Ethics Panel’s research was informed by an examination of Londoners’ views on the police’s use of live facial recognition (LFR) technology. More than 57 per cent of respondents felt that police use of facial recognition software was acceptable, and this figure rose sharply to around 83 per cent when they were asked whether they supported using the technology to search for serious offenders.

 

Although half of the respondents thought the use of this software would make them feel safer, more than a third raised concerns about the impact on their privacy.

 

The Met has carried out 10 trials using facial recognition technology across London as part of efforts to incorporate the latest technologies into day-to-day policing.


Following an extensive review of the Met’s use of this software, the independent Ethics Panel has today published a comprehensive final report which recommends that live facial recognition software should only be deployed by police if the five conditions below can be met:

  1. The overall benefits to public safety must be great enough to outweigh any potential public distrust in the technology;
  2. It can be evidenced that using the technology will not generate gender or racial bias in policing operations;
  3. Each deployment must be assessed and authorised to ensure that it is both necessary and proportionate for a specific policing purpose;
  4. Operators are trained to understand the risks associated with use of the software and to recognise that they are accountable;
  5. Both the Met and the Mayor’s Office for Policing and Crime develop strict guidelines to ensure that deployments balance the benefits of this technology with the potential intrusion on the public.

The panel has published information about the trials on its website and informed Londoners about what the software is attempting to achieve.

 

The panel has also set out a framework to support the police when trialling new technology. The framework is designed to address ethical concerns about how new technology will be used by the police and to make sure it serves to protect the public from risk and harm.

 

The framework consists of 14 questions covering engagement, diversity and inclusivity that the Met must consider before proceeding with any technological trial.

 

In addition to the Ethics Panel’s own research, the Met Police is carrying out two independent technical evaluations into its use of facial recognition software. The panel recommends that the Met does not conduct any further trials until the police have fully reviewed the results of the independent evaluations and are confident they can meet the conditions set out in the final report.

 

Proceeding with caution

 

The panel concluded that while “there are important ethical issues to be addressed, these do not amount to reasons not to use LFR at all”, suggesting that “the Met should proceed with caution and ensure that robust internal governance arrangements are in place that will provide sound justifications for every deployment”.

 

“Our report takes a comprehensive look at the potential risks associated with the Met’s use of live facial recognition technology. Given how much of an impact digital technology can have on the public’s trust in the police, ensuring that the use of this software does not compromise this relationship is absolutely vital,” said Dr Suzanne Shale, who chairs the London Policing Ethics Panel.

 

She added: “To reduce the risks associated with using facial recognition software, our report suggests five steps that should be taken to make sure the relationship between the police and the public is not compromised. We will be keeping a close eye on how the use of this technology progresses to ensure it remains the subject of ethical scrutiny.”

 
