
Mike Gillespie

Demand from data

Mike Gillespie of Advent IM gives his opinion on the use of facial recognition by police forces and an altogether different view of its efficacy and social implications. With so much at stake, and a growing understanding of privacy among the general public post-GDPR, the debate on this topic is growing and the implications are far-ranging.

You can’t go to a security expo these days without seeing Automatic Facial Recognition (AFR) touted as the must-have feature in just about every surveillance camera solution. Suppliers are very quick to tell you how this wonderful technology will enhance all manner of security operations. However, they are all too often not so quick to articulate how that technology will be used with safeguards: safeguards for the privacy of individuals, a privacy that is not just an ethical or philosophical desire, but one that is enshrined in numerous laws, from the Human Rights Act, to the Protection of Freedoms Act, to the new GDPR and Data Protection Act. Of even more concern is the look of confusion on sales reps’ faces when asked how the resultant data is managed, safely and securely, with appropriate retention and deletion protocols to ensure data captured relating to ‘innocents’ is discarded in a timely manner.

The UK is recognised as one of the most surveilled nations in Europe, with an estimated half a million cameras deployed in London alone. Recent trials by the Met Police at the Notting Hill Carnival and on Remembrance Sunday, and by South Wales Police at a major football match, have done little to improve public confidence in the adequacy or proportionality of the use of AFR by police forces. Alongside this, there have been years of subtle messaging (Neuro-Linguistically Programmed, almost) from both state and commerce that has sought to convince us that our privacy is something we must give up in order to be adequately protected, or to live in a consumer environment.

The constant idea that, when it comes to security, if you have nothing to hide you have nothing to fear has become almost a mantra: that security can only be achieved if we give up our fundamental rights and freedoms as human beings.

One thing that I have seen, however, with the arrival of GDPR, is a sudden groundswell in demand from people to have their data protected (albeit with some of those demands based on misunderstanding or marketing by myth). Misguided or not, the cat is out of the bag and people are starting to ask questions. It is quite possible that this mindset could lead to a reassertion of wider rights to privacy, and that this will make itself known to the surveillance state.

Key considerations

Lack of transparency – The public never gets to see the Data Protection Impact Assessments which are mandated as part of a rollout of any significant public space surveillance, and Privacy Notices are also not easily obtainable, resulting in an overall lack of transparency. I must applaud Tony Porter, the Surveillance Camera Commissioner, and the man tasked with upholding and enforcing the Protection of Freedoms Act for his work in the area of transparency. In particular this year, as part of the Surveillance Camera Day, calling for public sector operators of public space surveillance control rooms to open their doors to the public in the interests of transparency. And why not? After all, if they have nothing to hide…

Tip of the iceberg – with AFR being so widely available it is now being adopted by both public and private sectors, and is undoubtedly going to be a standard feature in city centres, shopping centres and commercial districts in the next couple of years.

Proportionality – is it proportionate to surveil thousands of people using this invasive technology in the hope of detecting one or two wanted people? More importantly, what makes it ethically acceptable to use AFR to monitor people with known mental illnesses at a major event, even when those people have broken no laws and have no criminal records?

Data Retention – given the numerous times that local authorities and police forces have been shown to hold onto information for far longer than is necessary, let alone legally acceptable, can we trust these same organisations with our biometrics?

At the SCC Question Time event in 2018, BBW (Big Brother Watch) challenged the proportionality of the police holding a database of millions of people’s facial images captured using this and other surveillance technologies. Millions of INNOCENT people with no criminal history and no evidence that they are intending to commit criminal activity.
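The retention and deletion protocols called for above can be sketched in a few lines. This is a minimal, hypothetical illustration only: the record layout, the 24-hour window for non-matches and the `purge` helper are all my assumptions, not any force’s actual policy or system.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention window for captures of 'innocents' (non-matches) -
# a policy choice for illustration, not a legal standard.
RETENTION_FOR_NON_MATCHES = timedelta(hours=24)

def purge(records, now):
    """Keep watchlist matches; discard non-match captures older than the window."""
    kept = []
    for rec in records:
        if rec["matched"]:
            # Matches would follow a separate, documented retention policy.
            kept.append(rec)
        elif now - rec["captured_at"] <= RETENTION_FOR_NON_MATCHES:
            # Non-match still inside the retention window.
            kept.append(rec)
        # else: the capture of an 'innocent' is discarded in a timely manner.
    return kept

now = datetime(2019, 6, 1, 12, 0, tzinfo=timezone.utc)
records = [
    {"matched": False, "captured_at": now - timedelta(hours=30)},  # purged
    {"matched": False, "captured_at": now - timedelta(hours=2)},   # retained
    {"matched": True,  "captured_at": now - timedelta(days=10)},   # retained
]
print(len(purge(records, now)))  # 2 records remain
```

The point is not the code but the discipline it encodes: a stated window, enforced automatically, rather than data quietly accumulating until someone asks why it is still there.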

For technology like this to be rolled out en masse in a trustworthy manner, organisations are going to have to get much better at transparency. There can be no argument whatsoever against publishing the outcomes of a DPIA, having a proper and unambiguous privacy policy, and being honest with the public about the proposed retention policy. Technically, video surveillance has too often been shown to have a shockingly woeful lack of security. Lessons haven’t always been learnt, with the same lack of security apparent in some Body Worn Video rollouts. Manufacturers of surveillance systems are only just getting to grips with the type of security controls needed to protect these systems from malicious external attacks and data exfiltration, helped by the SCC Secure by Design, Secure by Default standard launched this year.

AFR, contrary to the myth, isn’t a magic amulet. It won’t intuitively identify an individual as a bad person. It works off a watch list of images of people known to the police. If the police already have an image of someone who is wanted, then it is also reasonable to assume that person will take measures to obfuscate their identity. If so, we now also have to accept that it’s OK for the police to detain anyone obscuring their identity through clothing and prosthetics, whether a wanted criminal or not. And if that makes you uncomfortable, then apparently you have something to hide…
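The watch-list mechanism described above can be made concrete with a toy sketch. Real AFR systems compare high-dimensional face templates; here the three-element vectors, the names and the 0.95 threshold are invented stand-ins, chosen only to show that the system answers “how similar is this face to a listed face?”, never “is this a bad person?”.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical 'face templates' standing in for real biometric embeddings.
WATCHLIST = {
    "suspect_a": [0.9, 0.1, 0.2],
    "suspect_b": [0.1, 0.8, 0.4],
}

def match(probe, watchlist, threshold=0.95):
    """Return (name, score) of the best hit at or above threshold, else None."""
    best_name, best_score = None, -1.0
    for name, template in watchlist.items():
        score = cosine(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

print(match([0.88, 0.12, 0.21], WATCHLIST))  # close to suspect_a: a hit
print(match([0.5, 0.5, 0.5], WATCHLIST))     # an uninvolved passer-by: no hit
```

Note what the threshold does: set it lower and passers-by start matching listed faces, which is exactly the false-positive problem the trials above ran into; set it higher and anyone obscuring their features slips past, inviting the detain-the-covered-face response the article warns about.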

Should we trust the state to protect our privacy and look after our information adequately? The National Audit Office has reported that there were 9,000 personal information breaches across 17 government departments in one year alone (only a handful of which were apparently deemed important enough to report to the Information Commissioner). Poor protection, poor retention and poor deletion practices are endemic. So, do I trust the state to look after my data captured as part of the rollout of AFR? No, I am afraid not.

To continue to rollout widespread mass surveillance that’s designed to collect citizen biometric data but not designed to protect it would, surely, be to court disaster.

See also the Surveillance Camera Commissioner’s blog: https://videosurveillance.blog.gov.uk/2019/05/10/minimum-standard-for-the-manufacture-of-surveillance-cameras-to-be-launched/.

More from Big Brother Watch; visit the BBW website.

Photo by Mark Rowe; shutters, Exeter.