2 Ontario Police Forces Roll Out Facial Recognition Tech, Sparking Privacy Concerns

The crest of the York Regional Police on the headquarters building in Aurora, Ontario, on Sept. 15, 2020. Shutterstock
Jennifer Cowan

Two police services in the Greater Toronto Area are now using facial recognition technology as part of their investigations, sparking concerns from legal observers who say the investigative tool could come at the expense of privacy rights.

Peel Regional Police Service and York Regional Police Service announced the adoption of the software late last month, saying the move comes after consultations with the province’s information and privacy commissioner. The police services also held public consultations in March and April.

The two police forces, which are partnering in the initiative, began using software from Idemia on May 27. Idemia is a French multinational technology company specializing in facial recognition and biometric identification products and software.

The software will automate certain components of their existing mugshot databases, police said in a release.

“The new system will scan and compare against lawfully-collected digital evidence currently stored in our databases,” Peel Regional Police Deputy Chief Nick Milinovich said in a statement. “This new technology will not only support our criminal investigations greatly, but it will enable us to run mugshot searches faster with less human error.”

York Regional Police Chief Jim MacSween, in a separate statement, said partnering with Peel will enable the forces to work closely together to identify criminals who “don’t limit their activity to a single jurisdiction.”

On its website, Peel police describes facial recognition as a technology that compares human faces from a digital image or video frame against a database of faces.

Privacy Concerns

The implementation of facial recognition technology by the two forces has raised concerns from the Criminal Lawyers’ Association (CLA).

CLA Vice-President Michelle Johal said the use of facial recognition software has the potential to put at risk the Charter right to be free from unreasonable search and seizure.

“It is important to keep in mind that individuals whose criminal booking images (mugshots) and prints captured by police agencies under the authority of the Identification of Criminals Act, are not always convicted of any criminal offence,” Ms. Johal said in a statement provided to The Epoch Times.

“Anyone charged with a criminal offence, including shoplifting, is required to provide booking images and prints. Minor charges are regularly withdrawn, and these images are still retained on file unless an application is sent to the respective agencies for destruction.”

Pictures and prints are not automatically destroyed when charges are withdrawn, Ms. Johal said, adding that it’s “not uncommon” for police to deny destruction requests depending on the seriousness of the charge, even if the charge is eventually withdrawn.

“Some police agencies take many months, or historically over a year to process routine requests for fingerprint destruction,” she added.

York Regional Police media relations officer Const. Kevin Nebrija disagrees that the technology infringes on citizens’ rights. He told The Epoch Times that the use of facial recognition software changes nothing from a privacy standpoint. Pictures used in the system must be collected by investigators either through witness cooperation or via a warrant, he said.

“The only change is investigators will no longer have to search for potential matches in the system manually,” Const. Nebrija said via email. “Images lawfully obtained during criminal investigations are only compared to already existing mugshots on our database and are not searched through or against any social media or any other private or public network.”

Live video streams, whether public or private, will not be used to make arrests, he said, adding it’s a common misconception that cameras must be set up to assist with facial recognition.

If an incident were to occur at a shopping mall, security camera images could be used to compare to mugshots in the police database, but they could not be used independently “as grounds for arrest, search or seizure or to lay charges,” Const. Nebrija said.

“Matches are only able to be used as an investigative lead and any criminal investigation requires supporting evidence before an individual can be arrested and/or charged,” he said.

“Every match will have to be validated by a human investigator, who has received specialized training.”

Facial Recognition Guidance

Ontario’s Information and Privacy Commissioner (IPC) published guidance on police use of facial recognition and mugshot databases in January.

The report noted that “the lawfulness” of police facial recognition use has yet to be addressed by the courts, adding that there are no “clear or comprehensive” laws to govern police use of facial recognition technology in Ontario.

The guidelines offer law enforcement direction on using the software, although the IPC said the report is not “an endorsement” of the technology itself.

Some of the key considerations include “a carefully considered, incremental, transparent, and accountable approach to using facial recognition” and setting standards for image quality by requiring good lighting and a minimum pixel density. The guidance also recommends evaluating the system for bias and providing extensive operator training.

Early Facial Recognition Use

Peel and York are not the first forces to use facial recognition in the GTA. The Toronto Police Service used the technology from the company Clearview AI in several criminal investigations between October 2019 and February 2020, before the force’s police chief halted the practice.

Clearview AI was reprimanded by then-federal privacy commissioner Daniel Therrien following his 2021 report saying the New York-based firm’s facial recognition technology violated federal and provincial laws governing personal information.

The report said the company’s scraping of billions of images of people from across the internet represented mass surveillance and was a clear violation of Canadians’ privacy rights.

The investigation by Mr. Therrien and privacy-protection authorities for Alberta, British Columbia, and Quebec found Clearview AI’s technology allowed law enforcement and commercial organizations to match photographs of unknown people against the company’s database of more than three billion images for investigation purposes.

Mr. Therrien announced in 2020 that Clearview AI would stop offering its facial recognition services in Canada in response to the investigation.

The move included the indefinite suspension of Clearview AI’s contract with the RCMP, its last remaining client in Canada.

The RCMP said it ceased its use of Clearview AI in July 2020 and worked closely with the office of the privacy commissioner during its investigation. The federal police force also said it “took steps” in March 2020 to “develop an internal directive” on facial recognition use.

“The directive stated that facial recognition technology will only be used in exigent circumstances for victim identification in child sexual exploitation investigations, or in circumstances where threat to life or grievous bodily harm may be imminent,” the RCMP said in a June 2021 press release.

The Canadian Press contributed to this report.