A good start: ETHI facial recognition technology report draws on Citizen Lab research and recommendations – but has room for improvement

In early October 2022, the House of Commons Standing Committee on Access to Information, Privacy and Ethics (“ETHI”) published the final report of its study on the “Use and Impact of Facial Recognition Technology”: Facial Recognition Technology and the Growing Power of Artificial Intelligence. The report concluded what previous Citizen Lab research has shown: that “Canada’s current regulatory framework does not adequately regulate FRT [facial recognition technology] and AI [artificial intelligence]. Without a proper framework, FRT and other AI tools could do irreparable harm to some individuals.” The report includes nineteen recommendations for the federal government to address this issue.

Many of ETHI’s key findings and recommendations are consistent with research findings and recommendations from previous Citizen Lab reports and submissions on algorithmic policing technologies and similar government systems. These include, for example, To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada and Bots at the Gate: A Human Rights Analysis of Automated Decision-Making in Canada’s Immigration and Refugee System—both published in association with the International Human Rights Program (IHRP) at the University of Toronto’s Faculty of Law—and a joint submission with the Women’s Legal Education and Action Fund (LEAF) to the Toronto Police Services Board’s public consultation on its proposed Policy on the Use of Artificial Intelligence Technologies.

Citizen Lab research fellow Cynthia Khoo was invited to appear before the Standing Committee to discuss the Citizen Lab’s research and law reform recommendations related to algorithmic policing and facial recognition technologies, and her testimony is also reflected in the final report (her full statement is available here).

This blog post highlights some of the key findings and recommendations from the ETHI report, along with some concerns:

Moratorium on (federal) police use of facial recognition technologies

Recommendation 18 of the report advises that the Canadian government “impose a federal moratorium on the use of facial recognition technology by [federal] police services and Canadian industry, unless it is implemented following consultation with the Office of the Attorney General or with judicial authorization.” This echoes the Citizen Lab’s highest-priority recommendation in the To Surveil and Predict report, which calls for a moratorium on law enforcement use of algorithmic policing technologies until and unless they meet the requirements of necessity, proportionality, and reliability, as well as its further recommendation that police use of algorithmic policing technologies be subject to prior judicial authorization.

Unfortunately, the moratorium recommended by ETHI is narrow: it is restricted to facial recognition technologies (in fairness, the main focus of the study) and, specifically, to federal police services only. Our research, and that of others such as investigative journalists, has shown that use of and interest in facial recognition and other algorithmic policing technologies is widespread among provincial, regional, and municipal police services. Any national moratorium that does not include these levels of law enforcement is likely to have limited effect in practice, even before considering whether cooperation between the RCMP and local police forces would make circumvention trivial. While it is understandable that ETHI’s recommendations to the federal government are aimed primarily at federal police, we hope that provincial and local governments will see fit to adopt this recommendation themselves and enact similar moratoria.

Recognizing the problem of public-private surveillance partnerships

What is perhaps even more significant about the recommended moratorium is its inclusion of “Canadian industry” – one of the few instances in which the report draws a line on the use of facial recognition technologies by private sector companies in addition to government agencies. This inclusion, together with Recommendation 1 (which would require every government institution subject to the Privacy Act “to ensure that the practices of third parties from whom it receives personal data are lawful”), recognizes concerns about “public-private surveillance partnerships”: arrangements in which private companies such as technology vendors, social media companies, or data brokers systematically and voluntarily share or sell personal data to law enforcement agencies. Such arrangements may result in an invisible erosion of constitutional privacy rights by allowing law enforcement to circumvent section 8 protections under the Canadian Charter of Rights and Freedoms, outsourcing warrantless digital surveillance to private companies that are not subject to the same constitutional obligations when collecting personal data.

The Citizen Lab and IHRP’s To Surveil and Predict report discusses the problems with such data-sharing arrangements, which were brought to ETHI’s attention alongside similar concerns raised by many other technology, human rights, and civil liberties experts and advocates who appeared before the committee. While the ETHI report details these concerns in a dedicated subsection, “Procurement and Public-Private Partnerships,” its recommendations focus on enhancing transparency rather than on mitigating or preventing the harms of these partnerships. We hope this is just the beginning of more robust regulation on this front.

Increasing public transparency and accountability of algorithmic policing technologies

The ETHI report includes several recommendations to increase public transparency around police use of facial recognition and other algorithmic policing technologies. While this is encouraging, given that the Citizen Lab has strongly recommended greater transparency in the use of these technologies, we are concerned that many of the transparency recommendations are qualified as being “subject to national security concerns,” including:

  • that the Canadian government “create a public AI registry listing all algorithmic tools used by companies operating in Canada” (Recommendation 5);
  • that the government “[r]equire government institutions that acquire facial recognition technology or other algorithmic tools, including free trial versions, to make that acquisition public” (Recommendation 6); and
  • that the government “ensure full and transparent disclosure of any racial, age or other unconscious bias that may exist in facial recognition technology used by the government” once such bias is identified (Recommendation 9).

Setting aside the national security caveat, these transparency requirements are promising in their breadth – covering not just facial recognition technology but also any “algorithmic tool” used by an entity operating in Canada and “other algorithmic tools” procured by the government. This also aligns with the To Surveil and Predict report’s call for a moratorium on, and regulation of, all algorithmic policing technologies, not just facial recognition. However, law enforcement and national security agencies have been exceptionally opaque in matters related to national security, to the extent that federal courts have repeatedly found them in violation of their duty of disclosure. Each national security exception therefore looks like a disappointing loophole that undercuts otherwise significant public transparency reforms.

Recognizing the human rights impact on marginalized communities

Getting to the heart of the matter, the committee acknowledged throughout its report the serious human rights impact and harms of facial recognition technology, particularly for historically marginalized communities. ETHI’s recommendations include several on the right to privacy, including critical prohibitions on activities such as using facial recognition and other algorithmic technologies for mass surveillance (Recommendation 11) and capturing images of individuals from public spaces to “build facial recognition technology databases or artificial intelligence algorithms” (Recommendation 17).

In particular, capturing images from public spaces would undermine the right to privacy through anonymity in public. Such an invasion of privacy is disproportionately dangerous for those who have been historically and systemically marginalized, such as women and gender minorities; Black, Indigenous, and other racialized people; members of the LGBTQ+ community; people with disabilities; and people living in poverty. This, coupled with the further far-reaching implications of facial recognition and algorithmic policing technologies for the right to equality and for social justice, makes it all the more important that the Canadian government follow ETHI’s recommendation that public sector use of facial recognition include consultation with marginalized groups, alongside “immediate and prior public communication and public comment” (Recommendation 10).

Given ETHI’s findings and recommendations, we hope this report is just the beginning of meaningful efforts by the Canadian government to stop and prevent the unconstitutional and harmful use of facial recognition and other algorithmic policing technologies across the country.

Read the ETHI report here:

Read the Citizen Lab and IHRP reports here:

