Liberty report examines police's use of algorithms to predict crime

Policing by Machine collates the results of 90 Freedom of Information requests sent to every force in the UK

Fourteen police forces have used, or are planning to use, algorithms which map future crime

At least 14 UK police forces have used or intend to use discriminatory computer programs to predict where crime will be committed and by whom, according to research carried out by the non-profit membership association Liberty.

The Policing by Machine report collates the results of 90 Freedom of Information requests sent to every force in the UK – “laying bare the full extent of biased ‘predictive policing’ for the first time" – and how it threatens everyone’s rights and freedoms, said Liberty.

Predictive algorithms

It reveals that 14 forces are using, have previously used or are planning to use algorithms which ‘map’ future crime or predict who will commit or be a victim of crime, all based on biased police data.

The report exposes:

  • police algorithms entrenching pre-existing discrimination, directing officers to patrol areas which are already disproportionately over-policed;
  • predictive policing programs which assess a person’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence, based on offensive profiling;
  • a severe lack of transparency, with the public given very little information about how predictive algorithms reach their decisions – even the police do not fully understand how the machines come to their conclusions;
  • the significant risk of ‘automation bias’ – a human decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct.

“Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy,” said Hannah Couchman, advocacy and policy officer for Liberty.

She added: “Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”

The report highlights a number of areas, including predictive policing algorithms that analyse troves of historical police data. Liberty claims this data presents a misleading picture of crime because of biased policing practices.

The computer programs are not neutral, and some are capable of learning, becoming more autonomous in their predictions and entrenching pre-existing inequalities while disguised as cost-effective innovation.

It also examines predictive mapping programs, which use police data about past crimes to identify “hot spots” of high risk on a map. Police officers are then directed to patrol these areas. Meanwhile, individual risk assessment programs predict how people will behave, including whether they are likely to commit – or even be victims of – certain crimes.
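
The report does not publish any force’s mapping code, but a minimal, hypothetical sketch helps show the feedback loop Liberty describes: if grid cells on a map are ranked purely by previously recorded incidents, the areas already patrolled most heavily keep coming out on top. The grid size, coordinates and `hot_spots` function below are invented for illustration and stand in for no real system.

```python
# Hypothetical sketch of a hot-spot mapping program (not any force's actual software).
# It ranks map grid cells purely by the number of previously recorded incidents.
from collections import Counter

def hot_spots(incidents, cell_size=0.01, top_n=5):
    """incidents: list of (latitude, longitude) tuples for past recorded crimes.
    Returns the top_n grid cells ranked by historical incident count."""
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size)) for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Areas that were policed most heavily in the past generate the most recorded
# incidents, so they are ranked highest again -- the feedback loop Liberty warns of.
past_incidents = [(51.507, -0.128)] * 40 + [(51.515, -0.141)] * 5
for cell, count in hot_spots(past_incidents):
    print(f"patrol cell {cell}: {count} recorded incidents")
```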

Machine learning

Durham Constabulary has used a program called Harm Assessment Risk Tool (HART) since 2016. It uses machine learning to assess the likelihood of a person committing an offence, but is designed to overestimate the risk.

HART bases its prediction on 34 pieces of data, including personal characteristics such as age, gender and postcode, which could encourage dangerous profiling. It has also considered factors such as “cramped houses” and “jobs with high turnover” when deciding the probability of a person committing crime.
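
Neither the article nor the report reproduces HART’s code, so the sketch below is a hypothetical illustration only: a generic classifier trained on invented data, with features standing in for age, gender and postcode, and a decision threshold set below 0.5 to mimic a tool “designed to overestimate the risk”. The feature encoding, training data, model choice and threshold are all assumptions, not HART’s actual design.

```python
# Hypothetical illustration of a risk-assessment tool (NOT HART itself).
# Features stand in for personal characteristics such as age, gender and postcode.
from sklearn.ensemble import RandomForestClassifier

# Invented training data: [age, gender_code, postcode_area_code] -> later offence (1/0)
X_train = [[19, 0, 3], [45, 1, 1], [23, 0, 3], [52, 1, 2], [31, 0, 3], [60, 1, 1]]
y_train = [1, 0, 1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def risk_band(person, high_threshold=0.3):
    """A threshold below 0.5 deliberately widens the 'high risk' band,
    mirroring the report's point about tools built to overestimate risk."""
    probability = model.predict_proba([person])[0][1]
    return ("high" if probability >= high_threshold else "moderate/low", probability)

# A score driven largely by age and postcode area -- the profiling concern above.
print(risk_band([22, 0, 3]))
```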

Avon and Somerset Police’s risk assessment program even predicts the likelihood of a person perpetrating or suffering serious domestic violence or violent sexual offences.

The report makes a number of recommendations, including:

  • Police forces in the UK must end their use of predictive mapping programs and individual risk assessment programs.
  • At the very least, police forces in the UK should fully disclose information about their use of predictive policing programs. Where decision-making is informed by predictive policing programs or algorithms, this information needs to be communicated to those directly impacted by their use, and the public at large, in a transparent and accessible way.
  • Investment in digital solutions for policing should focus on developing programs that actively reduce biased approaches to policing. A human rights impact assessment should be developed in relation to new digital solutions, which should be rights-respecting by default and design.

Read the full report here.
