OTTAWA—The federal privacy watchdog is warning Canadians about the growing threat of surveillance capitalism, the collection and commercial exploitation of personal information by large corporations.
In his annual report tabled Thursday in Parliament, privacy commissioner Daniel Therrien said state surveillance—a major concern after the 9/11 terrorist attacks—has been reined in somewhat in recent years.
Meanwhile, personal data has emerged as a highly valuable asset, and no one has leveraged it better than the tech giants behind web searches and social media accounts, he said.
“Today, the privacy conversation is dominated by the growing power of tech giants like Facebook and Google, which seem to know more about us than we know about ourselves,” the report said. “Terms like surveillance capitalism and the surveillance economy have become part of the dialogue.”
The risks of surveillance capitalism were on full display in the Cambridge Analytica scandal, Therrien said. His office's findings against Facebook in that case are now the subject of proceedings in Federal Court because the commissioner did not have the power to order the company to comply with his recommendations.
In addition, the law did not allow the commissioner to levy financial penalties to dissuade this kind of corporate behaviour.
Therrien, in his last year as privacy commissioner, is encouraging the federal government to make several improvements to planned legislation on private-sector data-handling practices when it is reintroduced in coming weeks.
Artificial intelligence, the newest frontier of surveillance capitalism, has immense promise in addressing some of today’s most pressing issues, but must be implemented in ways that respect privacy, equality and other human rights, Therrien cautioned.
“Our office’s investigation of Clearview AI’s use of facial recognition technology was an example of where commercial AI deployment fell significantly short of privacy laws.”
The commissioner found Clearview AI violated the private-sector privacy law by creating a databank of billions of images scraped from the internet without consent to fuel its commercial facial recognition software.
Digital technologies like AI, which rely on the collection and analysis of personal data, are at the heart of the fourth industrial revolution and are key to socio-economic development, the report said. “However, they pose major risks to rights and values.”
To draw value from data, the law should accommodate new, unforeseen, but responsible uses of information for the public good, the report added. But that additional flexibility should come within a rights-based framework, given how frequently data practices have violated human rights.
Therrien highlighted another trend: the increase in public-private partnerships and the use of corporate expertise to assist state organizations, such as the RCMP's use of Clearview AI's facial recognition technology.
Privacy issues arising from public-private partnerships were also evident in a number of government-led pandemic initiatives involving digital technologies in the last year, the report added. “These issues underscored the need for more consistency across the public and private sector laws.”