Senior UK Peer Calls for Limits on Facial Recognition Technology

Facial-recognition technology is operated at Argus Solutions in Sydney, Australia, on Aug. 11, 2005. Ian Waldie/Getty Images
Patricia Devlin

A senior peer has called for legislation to “drastically limit” the use of facial recognition technology (FRT) after chilling claims it is being used by stalkers and predators.

Baroness Jenny Jones told The Epoch Times that the government must put in place a “structure for scrutiny” over public and private use of citizens’ biometric data.

Last week, a hard-hitting report by privacy group Big Brother Watch revealed how the growth of facial recognition surveillance is getting “out of control” in the UK.

As well as highlighting the rise of FRT in schools, the civil liberties group said facial surveillance is being used without the public’s knowledge in shops and by police on the street.

The Biometric Britain report (pdf) also revealed the sinister use of mass facial search engine sites which could be used by anyone to target women and girls.

One of those—PimEyes—allows users to upload a photograph of a person, even someone they have photographed in the street, and find any other images published of them online.

The website uses facial recognition to identify relevant photographs from a database of at least 900 million images.

Anybody with an internet connection can use the face search engine—and there are no checks to ensure that users are only searching for themselves or people who have consented to their images being used.

The company claims that its terms restrict searches to users’ own faces, although its enforcement of this is limited to a box that the user must tick before searching.

However, Big Brother Watch says the tool’s “total absence of safeguards” means it could not only be secretly used by potential employers and university admissions officers but also “domestic abusers or stalkers.”

Pornographic Content

The Epoch Times analysed the search facility—owned by a company whose headquarters are located in the U.S. tax haven of Belize—using a photograph of one of its reporters' faces.

The website’s results were frighteningly accurate.

Of the 293 results returned in just under two seconds, 292 were correct matches.

Some results included pictures of the reporter in the background of other people's photographs, images she was previously unaware of.

Two of the results linked to photographs, originally uploaded to a personal Facebook page, that are currently being used on false profiles on a foreign dating website without the owner's knowledge.

PimEyes says it does not scrape photographs from social media sites, but only from those “publicly available” on the internet, for example, blogs and news sites.

Both images came with the warning of “explicit content”—despite only showing the female reporter’s face.

More worryingly, another “explicit content” result returned an image—not belonging to the individual—from a pornographic website.

Big Brother Watch detailed in its most recent report how journalists for the New York Times previously carried out a similar analysis, discovering that searches of a number of women's authentic images returned links to false pornographic material.

The newspaper article, published in 2020, states, “For the women, the incorrect photos often came from pornography sites, which was unsettling in the suggestion that it could be them.”

Big Brother Watch states that the FRT search engine could cause “potential negative outcomes” for individuals and could “put people at risk of harm.”

“The considerable evidence above suggesting that these tools are being used to identify and track women who appear, either consensually or non-consensually, in sexually explicit content online, is deeply chilling,” its report said.

“Online facial recognition is putting women at serious risk. Widespread and unfettered use of these tools is already leading to a new era of technology-powered sexual harassment, threats, and stalking.”

The group has urged the Information Commissioner’s Office (ICO) “to urgently step in to safeguard UK residents from such abuses.”

Along with safeguarding, privacy, and safety issues, Big Brother Watch says FRT search sites have “no lawful basis under data protection law.”

A cellphone is held with the logo of facial recognition company Clearview AI Inc. on the screen in front of the business webpage. (T. Schneider/Shutterstock)

Watchdog Fine

In 2022, the ICO fined another similar search database—available to private customers including the police—£7.5 million.

The ICO investigation found it had collected more than 20 billion images of people’s faces and data from publicly available information on the internet and social media platforms to create a global online database.

People were not informed that their images were being collected or used in this way.

The company provides a service that allows customers, including the police, to upload an image of a person to the company’s app, which is then checked for a match against all the images in the database.

The app then provides a list of images that have similar characteristics to the photo provided by the customer, with links to the websites from which those images came.

The ICO said in its ruling, “Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge.”

Although Clearview AI Inc no longer offers its services to UK organisations, the ICO said the company “has customers in other countries, so the company is still using personal data of UK residents.”

In November, Big Brother Watch filed a lengthy complaint to the ICO over PimEyes’ use of UK data.

The advocacy group also claimed in its complaint to the UK data and privacy watchdog that the site “facilitates stalking.”

PimEyes’ Chief Executive Giorgi Gobronidze denied the claims, saying it poses fewer risks related to stalking than social media or other search engines.

Gobronidze told the BBC that because PimEyes only searches images posted publicly anyone misusing it “gets only the information available on the open internet.”

Earlier this year, the ICO informed Big Brother Watch that no action was being taken against the search engine.

The Epoch Times has contacted PimEyes for comment.

Green Party member Jenny Jones, Baroness Jones of Moulsecoomb, poses for a portrait at the "Stop HS2" camp at Euston Station in London on Jan. 31, 2021. (Hollie Adams/Getty Images)

‘No Safeguards or Oversight’

Speaking on Thursday, Jones said it was time for the government to legislate on the use of facial recognition technology.

“Currently, FRT is used in a variety of ways, but none of them has safeguards, nor oversight,” she told The Epoch Times.

“The police have no limits on its use, even with the system’s flaws, including difficulty in distinguishing the features of darker-skinned people.

“Most people would understand the use of such technology for finding terrorists or criminals, but be shocked at the extent it’s used in ordinary circumstances, when in shops or just walking on the street.”

Jones added, “It’s time for legislation to drastically limit the use of FRT and put in place a structure for scrutiny.”

In April, the Metropolitan Police welcomed a research report that found there were minimal discrepancies for race and sex when facial recognition technology is used in certain settings.

The research, carried out by the National Physical Laboratory, was commissioned by the Met and South Wales Police in late 2021 following fierce public debate about police use of the technology.

South Wales Police had paused its use of the technology amid concerns over discrimination but said it would resume in the wake of the report.

Human rights groups Liberty, Big Brother Watch, and Amnesty said the technology is oppressive and has no place in a democracy.

In September, 14 campaign groups, including Black Lives Matter UK, wrote to then-new Met Commissioner Sir Mark Rowley demanding an end to police use of the system.

Madeleine Stone, legal and policy officer from Big Brother Watch, said at the time: “Live facial recognition is suspicionless mass surveillance that turns us into walking ID cards, subjecting innocent people to biometric police identity checks. This Orwellian technology may be used in China and Russia but has no place in British policing.”

False identifications made during the Met’s use of the technology include a 14-year-old black schoolboy in uniform and a French exchange student who had only been in the country for a few days.

In response to the concerns, the Met’s Director of Intelligence Lindsey Chiswick said: “We have listened to these voices. This research means we better understand the performance of our algorithm.

“We understand how we can operate to ensure the performance across race and gender is equal.”

PA Media contributed to this report.
Patricia Devlin
Author
Patricia is an award-winning journalist based in Ireland. She specializes in investigations and giving victims of crime, abuse, and corruption a voice.