A dossier presented to judges by a face search company has revealed how images of UK residents can be used to find out their relationship status, whether they have children, and even whether they smoke or drink alcohol.
The disclosure was made by lawyers acting on behalf of Clearview AI in a court challenge over a £7.5 million fine imposed on the firm by Britain’s privacy watchdog last year.
The United States-based facial recognition company—which provides services to overseas law enforcement—successfully overturned the Information Commissioner’s Office (ICO) sanction on Wednesday.
Judges said the ICO action—which ruled the face-matching firm violated UK law by harvesting millions of online images without user consent—was unlawful.
The ICO said it is now “carefully” considering its “next steps,” with a spokesperson not ruling out the possibility of launching an appeal against the ruling.
Face Match
Clearview AI offers its clients a system that works like a search engine for faces: police can upload a photo, and the technology finds matches in a database that judges said this week was growing by 75 million images a day. The company then provides links to where the matching images appear online.
According to court documents, an evidence bundle of more than 1,600 pages was presented to First-tier Tribunal judges by Clearview AI’s lawyers in its ICO challenge.
It revealed that, at last count, the firm held over 20 billion photos taken from the public internet, and that its service—tested by the U.S. National Institute of Standards and Technology—had an accuracy rate of 99 percent.
Also included were examples of “successful searches” made on the Clearview system.
According to the court documents, images scraped and stored by the company can be used to identify a person’s name and relationship status, including “whether they have a partner and who that may be.”
An image and its data can also be used to identify whether the person is a parent, their “associates,” the place the photo was taken, and where the person in the image is “based/lives/is currently located.”
The documents also revealed that photos can show which social media platforms an individual uses, whether the individual “smokes/drinks alcohol,” their “pastimes,” whether they can drive, and whether they have ever been arrested.
The judges noted that Clearview’s service enabled clients to “go beyond” the normal governmental databases as its database will “include images of people who have not come to the attention of the authorities in such a way as to result in an image of them being on the authorities’ databases.”
Intrusive
The court also heard how the facial recognition company had previously been trialed in the UK by the Metropolitan Police, the Ministry of Defence, and the National Crime Agency. Judges said these organisations carried out more than 700 searches on Clearview during this time.
However, after settling a case brought by the American Civil Liberties Union of Illinois in 2020, the company pulled out of the UK and restricted clients to law enforcement or national security bodies and their contractors.
In its sanction against the company last year, the ICO said that even though Clearview no longer offers its services to UK organisations, it has customers in other countries and is therefore still using the personal data of UK residents.
“People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement,” Information Commissioner John Edwards said at the time.
“Working with colleagues around the world helped us take this action and protect people from such intrusive activity.”
On top of the £7.5 million fine, the ICO also ordered the company to remove all data and images belonging to UK residents.
However, in its judgement on Wednesday, the First-tier Tribunal ruled: “We have concluded that the Information Commissioner did not have jurisdiction to issue the Enforcement Notice and the Monetary Penalty Notice to Clearview AI Inc because although the processing undertaken by CV was related to the monitoring of data subjects’ behaviour in the United Kingdom, the processing is beyond the material scope of the GDPR and is not relevant processing for the purposes of Article 3 UK GDPR.”
Jack Mulcaire, Clearview AI’s lawyer, said the firm was “pleased” with the judgement.
Fines
In a statement issued to The Epoch Times on Friday, an ICO spokesperson said the watchdog is considering its “next steps.”
“It is important to note that this judgment does not remove the ICO’s ability to act against companies based internationally who process data of people in the UK, particularly businesses scraping data of people in the UK, and instead covers a specific exemption around foreign law enforcement,” the spokesperson added.
Clearview AI has been sued and fined multiple times in Europe under the European Union’s GDPR, with fines levied in France, Italy, and Greece.
In Sweden, the local police authority was fined more than $300,000 for its illegal use of Clearview AI products in 2021.
However, Clearview AI has avoided complying with some of these orders.
Despite being fined €20 million for GDPR breaches in France in October 2022, the company has so far refused to pay and was found in breach of that order as of May 2023.
Privacy regulators are concerned about data harvested en masse from the internet.
Earlier this year, data protection agencies around the world issued a joint statement warning companies that scrape information from the public internet that the practice could violate privacy laws.
Clearview also does not offer an opt-out for anyone outside the United States to have their images removed from its database.
Its privacy policy page states, “Currently, only those who are a resident of one of the following states may submit a consumer request for access, opt-out, and/or delete.”
Those states include California, Colorado, Connecticut, Illinois, and Virginia.
Individuals outside those states have, so far, no explicit recourse to have their images, likenesses, and other data removed from the company’s data set.