Automated facial recognition technology is used by UK police forces without a clear legal basis, oversight or governmental strategy, despite its potential to infringe on civil liberties and rights, according to a report on facial recognition by the watchdog body Big Brother Watch (BBW).
It warns of a ‘silent erosion of human rights’. Its report, Face Off, on UK police use of facial recognition software linked to CCTV says: “It is highly questionable whether the use of automated facial recognition with public surveillance cameras, scanning and biometrically analysing every passer-by’s face, and enabling authorities to identify and track citizens without their knowledge, is compatible with fundamental human rights – in particular, the rights to a private life and to freedom of expression. The necessity of such biometric surveillance is highly questionable, and inherently indiscriminate scanning appears to be plainly disproportionate. As it stands, the risk that automated facial recognition is fundamentally incompatible with people’s rights under the Human Rights Act 1998 is yet to be considered.”
Leicestershire, South Wales and the Met police forces have used the software, including at the Champions League final in Cardiff in 2017 and outside an arms trade exhibition that drew protests. BBW warns that the tech ‘poses an unprecedented threat to citizens’ privacy and civil liberties, and could fundamentally undermine the rights we enjoy in public spaces’. The 50-page report also covers facial recognition in use in other countries, such as Germany, the United States, Russia and China.
Tony Porter, the Surveillance Camera Commissioner (SCC), welcomed the report as adding value to a much-needed debate on a matter of growing public interest. He said: “The effective regulation of use of face identification technology (commonly referred to as Automated Face Recognition or AFR) by the police is a priority of the National Surveillance Camera Strategy and a matter which I have been addressing as a priority for some time now, engaging with the National Police Chiefs Council, the Home Office, fellow regulators and Ministers alike.”
He made the point that police (like indeed anyone else) have to abide by the Surveillance Camera Code of Practice which the SCC regulates under the Protection of Freedoms Act 2012. “That is not to say that I consider existing or indeed anticipated legislation as being wholly sufficient in these matters. I do not.” He said he thought that the police are genuinely doing their best with AFR.
The BBW report also pointed to low rates of matches to suspects, and to cases of ‘false positives’, where the software finds a match but the person is innocent. South Wales Police have commented that ‘false positives will continue to be a common problem for the foreseeable future’.
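Why false positives are so persistent comes down to base-rate arithmetic: when a large crowd is scanned and genuine watch-list subjects in it are rare, even a fairly accurate matcher can return more innocent matches than real ones. The sketch below illustrates this with assumed figures chosen purely for illustration, not statistics from the report.

```python
# Illustrative base-rate arithmetic for crowd scanning.
# All numbers below are assumptions for illustration only.

def match_outcomes(crowd_size, watchlist_in_crowd,
                   true_positive_rate, false_positive_rate):
    """Return expected (true matches, false matches) from scanning a crowd."""
    true_matches = watchlist_in_crowd * true_positive_rate
    innocent_scanned = crowd_size - watchlist_in_crowd
    false_matches = innocent_scanned * false_positive_rate
    return true_matches, false_matches

# Assume 100,000 faces scanned, 10 watch-list subjects present,
# a 90% hit rate and a 0.1% per-face false-positive rate.
tp, fp = match_outcomes(100_000, 10, 0.90, 0.001)
print(tp, fp)  # roughly 9 true matches against roughly 100 false ones
```

Under these assumed figures, most alerts point at innocent passers-by, which is the disproportionality concern the report raises about indiscriminate scanning.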
South Wales Police are to use their facial recognition at the Biggest Weekend in Swansea, a music festival. The force reports that it has three watch-lists: Organised Crime Groups (OCGs) linked to such festivals; those wanted on warrant; and those wanted for outstanding offences in the area. Deputy Chief Constable Richard Lewis said: “This is a huge event for Swansea with over 60,000 people descending on the area to attend the event or soak up the local atmosphere. The deployment of our AFR vans is part of our overall approach to policing the event, keeping people safe and helping us to ensure everyone has an enjoyable crime free day. The use of the vans is proportionate to the nature of the event and the watch lists have been constructed specifically to address issues at the event and in the locality.”
Paul Wiles, the Biometrics Commissioner, wrote last year about police use of facial recognition technology at the Notting Hill Carnival in west London. Visit https://www.gov.uk/government/news/metropolitan-polices-use-of-facial-recognition-technology-at-the-notting-hill-carnival-2017.
Picture by Mark Rowe; Thameside graffiti, west London.