A coalition pushing for better regulation of facial recognition and other biometric surveillance technologies says proposed federal privacy legislation is in “dire need of significant amendments.”
In an open letter on Nov. 1 to Industry Minister François-Philippe Champagne, the Right2YourFace Coalition warns the use of facial recognition technology threatens human rights, equity principles, and fundamental freedoms, including the right to privacy.
Facial recognition tools can allow an image of a person’s face to be matched against a database of photos with the aim of identifying the individual.
The coalition says the technology can produce biased or flawed results, creating a risk of false identifications.
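For readers unfamiliar with how such matching works, the toy sketch below illustrates the general idea in Python: a captured face is reduced to a numeric "embedding" and compared against enrolled embeddings, with a similarity threshold deciding whether to declare a match. The embeddings here are random stand-ins rather than output from a real face-recognition model, and the names (identify, MATCH_THRESHOLD) are hypothetical; the point is only that model errors and threshold choices are where false identifications can arise.

```python
# Illustrative sketch only: toy face matching against a database of enrolled
# embeddings. Embeddings are random stand-ins, not real face-model output.
from __future__ import annotations

import numpy as np

MATCH_THRESHOLD = 0.6  # hypothetical cut-off; real systems tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray, database: dict[str, np.ndarray]) -> str | None:
    """Return the best-matching enrolled identity above the threshold, or None."""
    best_name, best_score = None, -1.0
    for name, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    # If the model is inaccurate or the threshold is set too low, a lookalike
    # can be falsely "identified" -- the kind of error the coalition warns about.
    return best_name if best_score >= MATCH_THRESHOLD else None


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    database = {f"person_{i}": rng.normal(size=128) for i in range(5)}
    probe = database["person_2"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
    print(identify(probe, database))  # expected: "person_2"
```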
The letter is signed by representatives of the Canadian Civil Liberties Association, the International Civil Liberties Monitoring Group, the Privacy and Access Council of Canada, and several others.
The coalition says Bill C-27, now before Parliament, fails to address the harms posed by facial recognition tools as businesses and government agencies adopt artificial intelligence systems at an increasingly rapid pace.
“AI is neither artificial, nor is it intelligent, and its use is largely unregulated,” Sharon Polsky of the Privacy and Access Council told a news conference on Nov. 1.
“Research has repeatedly confirmed that facial recognition results are unreliable, yet hundreds of millions of dollars are spent every year by Canadian municipalities, airports, retailers, schools, and of course, law enforcement, that embed more and more AI and facial recognition systems for safer communities.”
The Liberals introduced privacy legislation last year to give Canadians more control over how their personal data is used by commercial entities. The bill would also impose fines for non-compliant organizations and introduce new rules for the use of artificial intelligence.
The coalition is concerned about a lack of discussion on the bill’s “troubling implications for facial recognition,” said Daniel Konikoff, interim director of the privacy, technology, and surveillance program at the Canadian Civil Liberties Association.
The letter says the privacy section of the bill must include special provisions for sensitive information and explicitly provide enhanced protection of biometric details, such as face data, fingerprints, and vocal patterns, which can involve particular risks of racial and gender bias.
The coalition fears a provision of the bill allowing the collection of information for legitimate business purposes without the user's knowledge or consent is too broad, favouring profit over privacy.
The government says the artificial intelligence elements of the bill are aimed at protecting Canadians by ensuring high-impact AI systems are developed and deployed in a way that identifies, assesses, and lessens the risks of harm and bias.
The bill would also establish an AI and data commissioner to monitor company compliance, order third-party audits, and share information with other regulators.
The coalition is concerned the proposed legislation includes no definition of what qualifies as high-impact, leaving what they consider a crucial step to regulations. A definition of high-impact systems that includes facial recognition technology and other biometric identification tools must be included in the bill itself, the letter says.
Last month, Mr. Champagne wrote the House of Commons committee on industry and technology, which is studying the bill, to say the government was prepared to work with MPs to develop amendments to define classes of high-impact systems.
Among the classes Mr. Champagne suggested for inclusion is the use of AI to process biometric information to identify someone without their consent, or to assess an individual's behaviour or state of mind.
The coalition hasn’t seen “what those actual tangible amendments” would look like, Mr. Konikoff said.
“The devil’s in the details, but we don’t have any details here.”
In October last year, the Commons committee on information, privacy, and ethics called for a moratorium on the use of facial recognition tools by federal police and Canadian businesses, unless there is court authorization or input from the privacy watchdog.
The committee also urged the government to develop a regulatory framework covering uses, prohibitions, oversight, and privacy protections for the technology.