Live Facial Recognition in London ‘Absolutely Fair,’ Says Met Police Official

A committee heard that deployment decisions differ across forces, and that South Wales Police do not plan to use LFR at protests.
A van being used by the Metropolitan Police as part of their facial recognition operation in central London, on May 6, 2023. Will Edwards/AFP via Getty Images
Lily Zhou

A Metropolitan Police official defended the proportionality of deploying live facial recognition (LFR) in London, saying it’s “absolutely fair.”

Giving evidence to the House of Lords Justice and Home Affairs Committee on Tuesday, Lindsey Chiswick, director of intelligence at the Met, said she believes the balance between security and privacy is right.

“I think that balance is right, I am quite happy for a fleeting instantaneous glimpse of my face, which is immediately destroyed if I’m not on the watch list. I’m content for that to take place. Others may feel different. The majority of the public in the surveys that there have been over the past three years are supportive though,” she said.

According to Ms. Chiswick, if the system doesn’t find a match against a watch list, an image would be “immediately and automatically destroyed” and officers in the back of the van would only see a pixellated image.

She also said there are “a lot of wanted people in London,” and that surveys suggest between 60 and 80 percent of the public are supportive of police using the tool.

Ms. Chiswick was asked about the proportionality of scanning the public at scale after another witness questioned the Met’s LFR deployments.

Karen Yeung, interdisciplinary professorial fellow in law, ethics, and informatics, and an AI adviser at various international bodies including the Council of Europe, said the Met scanned 144,000 faces in 2020, calling it “a prima facie violation of the privacy of 144,000 people in the public settings in London, for which they made eight arrests.

“None of which were for serious crimes. Many of them were for small drug offences and shoplifting. So there is a divergence between the claims that they only put pictures of those wanted for serious crimes on the watch list. And the fact that in the Oxford Circus deployment alone, there were over 9,700 images on that watch list. I'd quite like to know how each of those 9,700 images were justified as lawful, necessary, and proportionate to put those faces on the list,” she said.

Ms. Chiswick defended the inclusion of shoplifters on the watch list during the Oxford Street deployment, saying shops are “desperate” to catch them.

A match by the system does not automatically lead to an arrest. Police officers, who are trained ahead of each deployment, are required to make their own judgement as usual following a match, the committee was told.

The Met and South Wales Police are the first two forces to trial LFR technologies. While the deployments have resulted in arrests of suspects, they have also been mired in controversy and debate over privacy, public consent, and system inaccuracies and bias.

Impact on Protesters

In 2019, Met Police officers trialling live facial recognition fined a man who didn’t want to show his face to the cameras. According to local outlet Romford Recorder, the man covered his face using his jumper but was then stopped by officers who decided he was acting suspiciously.

The Court of Appeal in 2020 ruled that the use of facial recognition in South Wales breached privacy and that the force had not done all it reasonably could to ensure the software it used didn’t have a racial or gender bias.

Witnesses told peers on Tuesday that accuracy has improved over the years. Ms. Chiswick said the Met has had two false alerts during 19 deployments. Mark Travis, temporary deputy chief constable and senior responsible officer for facial recognition technology at South Wales Police, said the force had “zero errors” after scanning more than 800,000 faces.

The committee was told that a force puts together a watch list ahead of each deployment by selecting from different categories, ranging from murderers, rapists, and locally listed shoplifters to missing people, vulnerable people, and others, uploads the list 24 hours beforehand, and deletes it after the deployment.

A mobile police facial recognition facility outside a shopping centre in London on Feb. 11, 2020. Kelvin Chan/AP Photo

Baroness Chakrabarti questioned South Wales Police’s loan of their LFR kits to the Northamptonshire Police during the Formula 1 Aramco British Grand Prix 2023.

“The chief constable there said he was looking for protesters, and of the 790 names on his watch list, only 234 were wanted for arrest, which leaves 556 on our watch list who were not wanted for arrest for any crimes,” she said, adding that she’s “concerned” over the inclusion of potential protesters.

In response, Mr. Travis told the committee that decisions are made by individual forces based on local circumstances.

He said South Wales Police have not used and do not plan to use LFR at protests because there are “far better technologies for gathering intelligence in relation to protest,” although he wouldn’t rule it out.

Ms. Chiswick said she doesn’t believe a hostile state can pursue dissidents by infiltrating the system.

“The systems are closed systems, so it’s not linked to other policing networks,” she said.

According to Ms. Chiswick, even if foreign dissidents end up on a watch list, only authorised personnel can access the list during the time it is in the system.

She didn’t rule out the possibility that hostile states could gain access to a watch list “if they really wanted to,” but said she doesn’t see why they would want to.

In August, the government said it was looking to buy facial recognition technologies that the Home Office could potentially deploy in the following 12 to 18 months, including LFR and retrospective facial recognition, which allows officers to search recorded images for matches against photos of suspects.

However, the Home Office said it was not looking for capabilities beyond identification, such as iris detection, lie detection, or analysis of how someone walks, which are known to be used in China’s mass-surveillance programme.

EU lawmakers last week agreed on a landmark artificial intelligence framework, which includes stringent restrictions for law enforcement and governments.
