Biometrics

Facial recognition judgment

by Mark Rowe

Civil liberties campaigners have hailed what they term a ground-breaking legal challenge against police use of facial recognition software in public, which may have scanned an estimated 500,000 people in and around Cardiff during trials since 2017.

The Court of Appeal agreed with submissions made by campaign group Liberty on behalf of Cardiff resident Ed Bridges, 37, and found South Wales Police’s use of facial recognition technology in breach of privacy rights, data protection and equality laws.

Bridges brought the case under human rights law, the Data Protection Act (DPA) and the Equality Act 2010. The appeal succeeded on three grounds and failed on two. The first success was that, despite the DPA, the code of practice and South Wales Police’s own policies, the police had ‘no clear guidance’ on where to use facial recognition or who to put on a watch-list.

The court ‘held that this was too broad a discretion to afford to the police officers’. The second success concerned the police’s data protection impact assessment under the DPA, which fell short; and the third was under equality law, because the police did not take ‘reasonable steps’ to be sure that the software was not biased in terms of skin colour or gender. The court did add that it did not have ‘clear evidence’ that the software had any bias.

South Wales Deputy Chief Constable Jeremy Vaughan, who is the national policing lead for facial recognition, said there was ‘nothing in the Court of Appeal judgment that fundamentally undermines the use of facial recognition to protect the public’.

Comments

The UK data protection regulator, the ICO, said the judgment was ‘a useful step’ towards ‘a clear legal framework’.

Liberty lawyer Megan Goulding said: “This judgment is a major victory in the fight against discriminatory and oppressive facial recognition. The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination.

“It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”

South Wales Police Chief Constable Matt Jukes said: “I am confident this is a judgment that we can work with. Our priority remains protecting the public, and that goes hand-in-hand with a commitment to ensuring they can see we are using new technology in ways that are responsible and fair.

“After our approach to these issues was initially endorsed by the Divisional Court and now that the Court of Appeal has given further scrutiny on specific points, we will give their findings serious attention. The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require this attention. Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.”

As for the Equality Act part of the appeal, Jukes said that in 2019 his force commissioned academic analysis of this question ‘and although the current pandemic has disrupted its progress, this work has restarted’.

South Wales Police and Crime Commissioner Alun Michael said: “I believe that the court process has made it clear that the use of facial recognition technology by South Wales Police is legitimate and has set out what we need to do to ensure we meet our legal obligations. This is in addition to the thorough system of scrutiny and challenge that I have already put in place.”

Speaking on Radio 4, the Surveillance Camera Commissioner, Tony Porter, called on the Home Office to act quickly, as the surveillance camera code of practice is out of date, dating from before the emergence of such technology as drones, AI and automatic facial recognition (AFR). He said that the Home Office had been ‘asleep on watch’. For his full response, visit the Commissioner’s website.

The judgment was handed down remotely. For the full details visit the Court of Appeal website.

Background

As the court judgment set out, the AFR ‘works by extracting faces captured in a live feed from a camera and automatically comparing them to faces on a watch-list. If no match is detected, the software will automatically delete the facial image captured from the live feed. If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, will review the images’.
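To make that description concrete, below is a minimal Python sketch of the match-or-delete loop the court outlined. It is illustrative only: the 128-dimensional embeddings, the cosine-similarity measure, the 0.6 threshold and all names are assumptions made for the sketch, not details of the system South Wales Police actually deployed.

    import numpy as np

    # Hypothetical operator-set alert threshold; real deployments tune this.
    SIMILARITY_THRESHOLD = 0.6

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two face-embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def process_frame(frame_faces, watch_list):
        # Compare each face captured in one frame of the live feed against
        # the watch-list. Non-matches are simply discarded (the automatic
        # deletion the judgment describes); matches are flagged for an
        # officer to review.
        alerts = []
        for face in frame_faces:
            best_id, best_score = None, 0.0
            for person_id, reference in watch_list.items():
                score = cosine_similarity(face, reference)
                if score > best_score:
                    best_id, best_score = person_id, score
            if best_score >= SIMILARITY_THRESHOLD:
                alerts.append((best_id, best_score))  # alert: officer reviews next
            # else: the embedding goes out of scope here, i.e. is deleted
        return alerts

    # Demonstration with random stand-in embeddings; a real system would
    # derive embeddings from camera frames with a face-recognition model.
    rng = np.random.default_rng(0)
    watch_list = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
    frame_faces = [watch_list["person_A"] + 0.05 * rng.normal(size=128),
                   rng.normal(size=128)]
    print(process_frame(frame_faces, watch_list))

Even in a toy version like this, the alert threshold and the contents of the watch-list are exactly the kind of discretionary choices for which the judgment found the guidance too broad.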
