The use of facial-recognition technology in Canada needs to be governed by a legal framework, say Canada’s privacy commissioners.
While the technology can be a significant tool in the hands of law enforcement to protect national security and help find missing people, it involves highly sensitive biometric information, raising concerns for privacy and human rights.
The Information and Privacy Commissioner for British Columbia, Michael McEvoy, said the technology offers significant safety benefits when correctly used. Still, the collection of facial images is something everyone should be concerned about, McEvoy said.
“It can also be exceptionally intrusive because it involves the collection and processing of highly sensitive personal information,” McEvoy said. “Biometric facial data is unique to each individual and unlikely to vary significantly over periods of time. In a very real sense, it is at the core of our individual identity.”
Canada currently has a patchwork of laws governing a burgeoning technology sector in which cameras watch many aspects of people’s lives. The privacy commissioners want a framework that addresses the different uses of, and risks posed by, the technology.
The uncertainty created by the absence of a framework is exacerbated by a lack of court jurisprudence on the issue, the commissioners noted in a joint statement released on May 2.
“When taken as a whole, the current legal framework risks encouraging fragmented approaches to [facial recognition] use that would take years to resolve before the courts,” the statement said.
The commissioners want clear and explicit definitions of how police can use the technology, strict policies on its use, independent oversight of police use, and privacy protections built into the law to mitigate risks to individuals, including measures to ensure the accuracy of information and limits on how long images can be retained in police data banks.
Mike Larsen, president of the BC Freedom of Information and Privacy Association, said companies “harvest images massively and indiscriminately in order to create a data set. Images of interest are then compared to this data set.”
“Everyone has good reason to be concerned about this because everyone is impacted,” Larsen said.
“There is a long history in Canada of legislators and policymakers ignoring the privacy and human rights implications of emerging surveillance technologies until some kind of scandal forces them to take action. The commissioners are raising legitimate concerns about a serious gap in our current legal framework.”
Larsen thinks the work should start with limiting use.
“Our law and regulations should start from the assumption that indiscriminate, pervasive mass biometric surveillance is presumed to be unlawful and then identify specific and reasonable circumstances in which facial-recognition technology can be employed,” he said.
Larsen said peaceful assemblies and protests should be no-go zones for facial-recognition surveillance, along with public spaces such as streets, transit spaces and schools.
“We’re concerned about any technology or practice that subjects children to surveillance, particularly in the context of schools,” Larsen said. “Lining up for recess or for the school bus should not imply joining a police lineup — even a digital one. The ‘school-to-prison pipeline’ is a real phenomenon, and we know that clustering more policing and surveillance technology around schools and school-aged children can lead to disproportionate criminalization.”
He said that people’s pictures should not be scraped from their social media pages.
“We need law reform and regulations on this specific issue. With Clearview AI, we saw a U.S.-based facial-recognition company selling a service to Canadian police organizations,” Larsen said. “The company disputed the jurisdiction of the commissioners, and some police argued that they were not responsible for vetting the service provider’s compliance with our privacy law. That’s a mess.”
Last year, Canadian commissioners found Clearview AI had scraped images of faces and associated data from publicly accessible online sources, including social media, and stored them in its database.
“What Clearview does is mass surveillance, and it is illegal,” Privacy Commissioner of Canada Daniel Therrien said at the time.
The company has challenged that finding.
The guidance and the joint statement follow a public consultation, launched in June 2021, that sought feedback on a draft version of the guidance and on a future legal and policy framework to govern police use of the technology.
The consultation followed an investigation into Clearview AI that found the private-sector platform was engaged in mass surveillance. A separate investigation found the RCMP’s use of Clearview unlawful because it relied on the illegal collection and use of facial images.