Policing by Machine collates the results of 90 Freedom of Information requests sent to every force in the UK
At least 14 UK police forces have used or intend to use discriminatory computer programs to predict where crime will be committed and by whom, according to research carried out by the non-profit membership association Liberty.
The Policing by Machine report collates the results of 90 Freedom of Information requests sent to every force in the UK – “laying bare the full extent of biased ‘predictive policing’ for the first time” – and shows how it threatens everyone’s rights and freedoms, said Liberty.
It reveals that 14 forces are using, have previously used or are planning to use algorithms which ‘map’ future crime or predict who will commit or be a victim of crime, using biased police data.
“Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy,” said Hannah Couchman, advocacy and policy officer for Liberty.
She added: “Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”
The report highlights a number of areas of concern, including predictive policing algorithms that analyse troves of historical police data – data that, Liberty claims, presents a misleading picture of crime because of biased policing practices.
The computer programs are not neutral, and some are capable of learning, becoming more autonomous in their predictions and entrenching pre-existing inequalities while disguised as cost-effective innovation.
It also examines predictive mapping programs, which use police data about past crimes to identify “hot spots” of high risk on a map. Police officers are then directed to patrol these areas. Meanwhile, individual risk assessment programs predict how people will behave, including whether they are likely to commit – or even be victims of – certain crimes.
Durham Constabulary has used a program called Harm Assessment Risk Tool (HART) since 2016. It uses machine learning to assess the likelihood of a person committing an offence, but is designed to overestimate the risk.
HART bases its prediction on 34 pieces of data, including personal characteristics such as age, gender and postcode, which could encourage dangerous profiling. It has also considered factors such as “cramped houses” and “jobs with high turnover” when assessing the probability of a person committing crime.
Avon and Somerset Police’s risk assessment program even predicts the likelihood of a person perpetrating or suffering serious domestic violence or violent sexual offences.
The report makes a number of recommendations.
Read the full report here.