
‘Policing by Machine’ report

by Mark Rowe

Some UK police forces are ‘Policing by Machine’, says a human rights campaign group. Fourteen forces have used, or intend to use, computer programmes to predict where crime will be committed and by whom, according to Liberty.

Their new report, “Policing by Machine”, collates the results of 90 Freedom of Information requests sent to every force in the UK and examines ‘predictive policing’ – and how, according to the campaigners, it threatens everyone’s rights and freedoms. Algorithms ‘map’ future crime, or predict who will commit or be a victim of crime, using police data. The campaign group complains of a lack of transparency and of the risk of ‘automation bias’.
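
Neither Liberty nor the forces publish the programmes themselves, but the kind of predictive ‘mapping’ the report describes can be pictured with a toy sketch: tally historical incidents per map cell and flag the busiest cells as predicted hotspots. Everything in the snippet below – the data, the grid, the function name – is hypothetical, for illustration only.

```python
# Illustrative only: a toy hotspot predictor, not any force's actual
# system. It tallies historical incidents per map-grid cell and flags
# the busiest cells as the "predicted" hotspots for the next period.
from collections import Counter

# Hypothetical records: the (x, y) grid cell of each logged incident.
recorded_incidents = [(2, 3), (2, 3), (2, 4), (7, 1), (2, 3), (5, 5), (2, 4)]

def predict_hotspots(incidents, top_n=2):
    """Return the top_n grid cells ranked by historical incident count."""
    return [cell for cell, _ in Counter(incidents).most_common(top_n)]

print(predict_hotspots(recorded_incidents))  # [(2, 3), (2, 4)]
```

Even in this caricature, the ‘prediction’ is nothing more than a restatement of where incidents were previously recorded – which is why the neutrality of the input data matters so much to the campaigners.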

The report deplores a ‘dystopian future’ of ‘pre-criminality’, in which people are placed on a database because they are assumed to be vulnerable to gang membership or extremism. The report says: “The idea of ‘pre-criminality’, based on prejudicial generalisations, is already prevalent in the way we are policed. In order to resist predictive policing, we need to resist the dangerous practice of categorising people and subjecting them to additional surveillance and monitoring because we think we know how they are going to act.”

Hannah Couchman, advocacy and policy officer at Liberty, said: “Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy. Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”

The report complains that the software relies on historical police data – data that, it is claimed, presents a misleading picture of crime because of biased policing practices. And as with any other software that processes data faster and at greater scale than any human could, the public – and the police – do not know how the programmes arrive at a decision. This means they are not adequately overseen, and the public cannot hold them to account, the report says.
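
The report gives no code, but the feedback loop it alleges is easy to sketch. In this hypothetical simulation, two areas generate the same true level of crime; area A merely starts with more recorded incidents, and a naive predictor keeps directing patrols – and therefore new records – back to it. All figures are invented.

```python
# Illustrative only: a toy simulation of the feedback loop the report
# alleges. Both areas generate the same true level of crime; area "A"
# merely starts with more *recorded* incidents than area "B".
recorded = {"A": 60, "B": 40}  # hypothetical historical records
TRUE_RATE = 10                 # identical underlying crime in both areas

for week in range(5):
    # The predictor directs extra patrols to the highest-count area...
    hotspot = max(recorded, key=recorded.get)
    # ...and only a patrolled area gets its crime recorded, so the
    # hotspot's count grows while the other area's stands still.
    recorded[hotspot] += TRUE_RATE

print(recorded)  # {'A': 110, 'B': 40}
```

The gap between the two areas widens every week even though the underlying crime is identical – the kind of self-reinforcing ‘misleading picture’ the report objects to.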

The report calls on police forces to end their use of predictive mapping and individual risk assessment programmes, or at the least to disclose their use. Liberty also calls for a ‘human rights impact assessment’ to be developed for new digital solutions. For the full 83-page report, visit the Liberty website.

Picture by Mark Rowe, pair of shoes wrapped around blue lamp outside Holborn police station, central London, late 2018 (since removed).
