In what could be his last talk as Surveillance Camera Commissioner (SCC) before his near seven years in the post ends, Tony Porter has said that automatic facial recognition (AFR) – as in the Court of Appeal case in August, of Bridges versus South Wales Police – will and must be used, but lawfully.
He called for the making of standards, policy and training around police deployment of AFR, because otherwise, ‘how do we know they are any good?’ – or that they are not infringing on someone’s liberty. He said: “I have said before and I will say again; I believe there needs to be a review of overt surveillance. I am a big fan of surveillance, but not so much a fan of bad surveillance.” Whether automatic number plate recognition, or body-worn video cameras live streaming, overt surveillance is now as powerful as covert surveillance, he said: “Whether it is as intrusive is another matter, and the Government needs to listen to this.”
Earlier, Porter, a former senior policeman before he became the first SCC in 2014 (pictured), said that he had drafted guidance for AFR use by police that was with his legal team (that represented him at the Court of Appeal) before going to forces shortly. He went back to when he first became aware of such facial recognition – ‘on a huge LED screen in Piccadilly’, in central London in 2017. With typical frankness he admitted that at the time, when asked about the legal issues by national newspapers, ‘I didn’t know; it was very complicated … I didn’t feel fit to answer, so I didn’t; I actually stayed silent, and I am rather glad I did.’
Porter recalled being an ‘interested party’ (with others) in the Bridges court case, which, as Porter touched on, involved Cardiff man Ed Bridges, who walked past a police van using facial recognition and felt that his biometrics had been processed unlawfully; he took up the case through the civil rights campaign group Liberty. After a first court ruling that the processing was lawful, the Court of Appeal in August (as featured in the September 2020 print edition of Professional Security magazine) ruled; and Tony Porter went through the court’s findings – that it was not in accordance with the law; that the police had not complied with their ‘equalities duty’ as a public body; and that (as Porter put it) the police force’s DPIA (data protection impact assessment, as required for such data processing) was ‘a little bit wonky’.
The laws and rules that applied, as the appeal court found and Tony Porter repeated to the Global MSC annual conference – usually in autumn in Bristol, this year done online – were the SCC’s code of practice, RIPA (the Regulation of Investigatory Powers Act, for surveillance by police), besides data protection law (the GDPR), human rights, equality law, and health and safety; which Tony Porter summed up as ‘a basket of confusion’.
As for whether the actual facial recognition used in South Wales was lawful, Tony Porter said that in the right circumstances it could be used, lawfully – ‘but of course if you used it in the same circumstances as Bridges, it would be unlawful’. As Porter added, police needed to focus on who was in the ‘watch-list’, and where they were looking for those on the list. Also to be looked at was algorithm bias (such as, was the biometric product biased against people by skin colour).
While Porter’s guidance is for police specifically – and he pointed out that police have to comply with a different part of the Data Protection Act than the part commerce has to – he also said that others working with the police on AFR, such as local government, retail or rail stations, had to comply with the Protection of Freedoms Act 2012, which set up the Surveillance Camera Commissioner’s Office. Porter spelt out that commerce’s compliant use of facial recognition would not be the same as compliance with GDPR, but ‘much broader’.
As for the police, he called for a ‘national procurement strategy’ for such technology, to make sure there was no bias in its use, and that it was according to standards, adding that ‘there needs to be ethical oversight’. He urged retail and other private users of facial recognition to likewise consider the ethics of such use.
As a deployment would require up to a dozen assessments, he said that (unnamed) related regulators (such as the Biometrics Commissioner) could do much more to make it easier for users. The biometric products would also have to be cyber-secure, without ‘back doors’ that could be hacked; here he mentioned the SCC’s ‘secure by default’ accreditation for products, as launched last year.
Government needs to update the SCC code of practice, he said; he felt (as he stated in the aftermath of the Bridges judgement in August) that the Home Office let the police down ‘to an extent’. The police need ‘solid ground to work upon, if they are pushing the boundaries of the law … because I believe this technology will and must be used’, to protect people and save lives.
Porter also called for protocols around who is the police’s authorising officer in use of such tech; if the one authorising is the one deploying, ‘that is akin to marking your own homework’. While he admitted that would mean ‘more bureaucracy’, he added that people would be reassured by the scrutiny – that highly technical kit was being deployed because it was necessary and proportionate (watchwords of Tony Porter’s time as Commissioner, of video generally).
The event earlier heard the tech side of AI (artificial intelligence) from manufacturers Pelco and Genetec.
More in the December 2020 print edition of Professional Security magazine.