The civil liberties lobby group Big Brother Watch claims that Britain is facing an ‘epidemic’ in the use of facial recognition technology in areas that are privately owned but used by the public. Its research follows a report in the Financial Times last week that the 67-acre site around King’s Cross station in London is using the technology to monitor people in the area. The Information Commissioner, Elizabeth Denham, says she is ‘deeply concerned’ about these developments. So how worried should we be?

Facial recognition cameras are the latest form of biometric technology used to identify individuals. Until recently, fingerprints and DNA were the only two widely used measures of each human being’s uniqueness. But our faces are equally distinctive, and FR technology is able, using photographs, to convert the particular construction of each face into a mathematical form that acts as a sort of personal identity number and can be stored on a database. Live facial recognition (LFR) uses adapted CCTV cameras to do the job while we are on the move, almost certainly without our being aware of it.
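The matching step described above can be sketched in a few lines of code. This is purely illustrative: real systems derive the ‘identity number’ (an embedding vector) from a neural network trained on millions of faces, and the names, vectors and threshold below are invented for the example.

```python
import math

# Illustrative only: each "face" here is a hand-made list of numbers
# standing in for the embedding a real system would compute from a photo.
database = {
    "person_a": [0.12, 0.80, 0.45],
    "person_b": [0.90, 0.10, 0.33],
}

def distance(a, b):
    # Euclidean distance between two embeddings
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, db, threshold=0.25):
    # Return the closest stored identity, or None if no one is close enough
    best_id, best_d = None, float("inf")
    for person, emb in db.items():
        d = distance(probe, emb)
        if d < best_d:
            best_id, best_d = person, d
    return best_id if best_d <= threshold else None

print(identify([0.10, 0.82, 0.44], database))  # close to person_a
print(identify([0.50, 0.50, 0.50], database))  # near no stored face: None
```

The threshold governs the trade-off at the heart of the accuracy debate below: set it loosely and the system flags the wrong people; set it tightly and it misses the people it is looking for.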

The Big Brother Watch report says that use of LFR cameras is now spreading fast across privately owned sites used by the public around Britain. It cites the Meadowhall shopping centre in Sheffield, the Millennium Point conference centre in Birmingham, the Trafford Centre in Manchester (until it was pressured to stop last year), casinos and betting shops, and even Liverpool’s World Museum, where it claims visiting children were monitored by such cameras at an exhibition on China last year.

The FT’s report about King’s Cross drew a response from Argent, the developers who own the site, that the technology was being used ‘to secure public safety’ but the Mayor of London, Sadiq Khan, has written to them referring to the ‘serious and widespread concern’ about the use of the technology and demanding further explanation.

Several police forces, including London’s Metropolitan Police, have been engaged in trials of the technology to see whether or not it would be of use in fighting crime.

The problems associated with LFR are several. In the first place, there is the issue of accuracy. In its current state of development the technology is known to have flaws, not least that its rate of inaccuracy increases as the skin tones of those it is photographing become darker, to the point where, with very dark-skinned people, it can confuse men and women. On these grounds alone, San Francisco recently banned the use of the technology.

Rates of inaccuracy are also much higher with LFR than with images recorded when the subject is static. A study by Essex University earlier this summer of the trials being conducted by the Metropolitan Police concluded that identifications made through LFR were wrong 80% of the time. Clearly, if the police or any other body with access to the results of LFR were to rely on them in pursuing individuals, the implications for false identification and miscarriage of justice would be considerable.
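The arithmetic behind such figures can be sketched with purely hypothetical numbers (none of the values below come from the Essex study): when genuine targets are rare in a crowd, even a system with seemingly modest error rates produces alerts that are mostly wrong.

```python
# Hypothetical numbers, invented for illustration: a watch-list camera
# scans a crowd in which genuine watch-list members are rare.
crowd_size = 10_000
watchlist_in_crowd = 5          # true targets actually present
true_positive_rate = 0.80       # assumed chance a target is flagged
false_positive_rate = 0.002     # assumed chance an innocent is flagged

true_alerts = watchlist_in_crowd * true_positive_rate
false_alerts = (crowd_size - watchlist_in_crowd) * false_positive_rate
share_wrong = false_alerts / (true_alerts + false_alerts)

print(f"{share_wrong:.0%} of alerts point at the wrong person")
```

With these assumed rates, roughly five out of every six alerts are false, simply because innocent passers-by vastly outnumber genuine targets.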

Secondly, there is the issue of privacy. LFR cameras are being used without the permission of the public whose images are being caught and recorded on a database. The Information Commissioner’s Office put out a statement that succinctly expresses the problem. ‘Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.’

And finally, what most concerns those activists raising the alarm about LFR is the virtual absence of any legislation to regulate its use. Other biometric data, such as fingerprints and DNA, is regulated, and there is a UK biometrics commissioner, Professor Paul Wiles, to make sure the law is observed. But facial recognition technology does not come within the law by which he operates. He evidently believes it should, but has so far been unable to persuade the government to act. Indeed, he claims that in his three years as commissioner he has been able to secure only one meeting with a minister.

The King’s Cross case shows the need for legal regulation of the use of LFR, he believes. He said: ‘I have no idea what they’re trying to do at King’s Cross. There’s no point in having facial-matching technology unless you are matching it against some kind of database – now what is that database? It’s alarming whether they have constructed their own database or got it from somewhere else. There is a police database which I very much hope they don’t have access to.’

To some people, however, the use of LFR will seem a perfectly sensible development of the CCTV technology we already employ on a massive scale throughout the country. Such technology has already proved invaluable in tracking down criminals and LFR will do the same. The innocent have nothing to fear, they argue, and point out that there was a huge outcry from civil liberty organisations when CCTV cameras were first introduced but we have now got used to them and no one seems to bat an eyelid any more. So it will be with LFR.

But that’s not how campaigners see it now. Silkie Carlo, the director of Big Brother Watch, said: ‘Facial recognition is the perfect tool of oppression and the widespread use we’ve found indicates we’re facing a privacy emergency.’

Her talk of LFR as a ‘perfect tool of oppression’ will remind many of what is going on in China, where its use is more advanced than anywhere else in the world. There it is being deployed, in conjunction with a system of social credits that punishes anti-social behaviour, to force people to conform with what the government wants or risk becoming internal exiles, unable to get work, housing or other basics of life.

No one is suggesting that LFR is sending a democracy like Britain’s along that authoritarian path. But is the expansion of its own more limited use in Britain a cause for alarm or not? And if it is, what should we do about it?

What’s your view? Let us know.
