By Brenda McPhail
Facial recognition technology (FRT) carries the risk of annihilating our right to anonymity in public and quasi-public spaces. It sounds alarmist. It sounds hyperbolic. But it’s neither. It’s simply an observation grounded in the promises made by makers of FRT tools themselves. NEC Corporation’s NeoFace Watch technology promises the ability to “process multiple camera feeds extracting and matching thousands of faces per minute.”[1] Clearview AI’s controversial (and, in Canada, illegal[2]) facial recognition software runs against a database of over 30 billion images scraped from the internet.[3]
To understand the dangers, it’s essential to understand how facial recognition technologies work. FRT is a type of biometric (that is, body-based) technology that uses artificial intelligence (AI) algorithms and other computational tools to identify individuals through their facial features. FRT functions by extracting biometric information based on key facial characteristics and comparing live biometric templates against templates stored in databases. Or, more simply, it uses our faces in a technologically enabled matching process to figure out who we are. Notably, a number of studies indicate that some FRT tools are less accurate on faces that are neither white nor male, leaving everyone who is neither at greater risk of misidentification.[4]
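To make that matching process concrete, here is a minimal sketch in Python of the one-to-many (“1:N”) comparison at the heart of FRT identification. It is a hypothetical illustration under simplified assumptions, not any vendor’s actual code: real systems derive templates from deep neural networks, and the function names, the toy gallery, and the 0.6 threshold below are all invented for the example.

```python
import math

# Illustrative only: in a real system, "templates" are embedding vectors
# produced by a deep face-encoder model; here they are just lists of floats.
Template = list[float]

def cosine_similarity(a: Template, b: Template) -> float:
    """Score how alike two face templates are (higher = more similar)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(probe: Template, gallery: dict[str, Template],
             threshold: float = 0.6) -> str | None:
    """One-to-many ("1:N") search: compare a live probe template against
    every stored template; report the best identity scoring above the
    threshold, or None if no stored face matches confidently enough.
    The 0.6 threshold is a made-up operator setting, not a standard."""
    best_name, best_score = None, threshold
    for name, stored in gallery.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy usage with hypothetical three-dimensional templates:
gallery = {"person_a": [0.9, 0.1, 0.2], "person_b": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.25], gallery))  # -> "person_a"
```

The design choice worth noticing is the threshold: it trades false matches against missed matches, and if match scores run systematically lower or higher for some groups of faces, that trade-off lands unevenly – one mechanism behind the demographic accuracy gaps documented in the studies cited above.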
There are different ways this technology may be used. The most extreme version – live facial recognition in the streets of our communities – is not, to the best of our knowledge, currently used by Canadian police, although we know it has been tested at Toronto’s Pearson airport.[5] But FRT that compares so-called “lawfully collected” images against mugshot databases is increasingly used by police forces across Canada, largely without notice, meaningful consultation, or effective public oversight or accountability.
And of course, it’s not just police or national security forces who want to use it. Facial recognition is emerging in a variety of ways in the private sector, with documented uses ranging from live scanning for alleged shoplifters in the image feed from Canadian Tire security cameras[6] – a story that hit the news when an Indigenous man was wrongfully identified – to checking student identity for online exams, to potentially paying for groceries with a face scan connected to a payment card.[7]
The use of FRT is a human rights issue that goes well beyond privacy concerns. Privacy is an enabling right – think of it as a gateway. Once the privacy gates are thrown open, once we lose control over information about ourselves (particularly something such as our face, which is so fundamental and integral to who we are), the use of that information has impacts on other democratic, Charter-protected rights, most particularly freedom of expression, association, and equality rights. When we’re watched, and known, we may be less likely to speak up on controversial issues. We may be less likely to gather to protest and stand up for causes we believe in. When we’re watched, and known, all the discriminatory impacts of systemic racism, sexism, ableism and socio-economic exclusion built into social systems, particularly security systems, may be exacerbated. FRT makes the surveillant gaze – so often disproportionately directed at those who are racialized or marginalized – more effective, and shifts it from “we saw you” to “we know who you are.”
If residents of Canada become unable to move about their communities as just a face in the crowd, that fundamentally changes the nature of the society in which we live. In a rights-respecting democracy, we expect freedom from routine, indiscriminate observation – never mind identification – by the state, an expectation vindicated by rulings at the Supreme Court of Canada.[8] Facial recognition has the potential to disrupt, if not eliminate, that expectation. So too would the presumption of innocence, a core democratic principle, be eroded if FRT were used indiscriminately in public spaces. And lest we think this a remote possibility, something that could only happen in an authoritarian state, our Five Eyes ally, the UK, is actively experimenting with live FRT.[9]
The potentially wide application of FRT, the extensive range of actors who want to use it, and its ability to be implemented secretly using existing security cameras make it imperative to have the necessary public conversations about whether there are uses of FRT that are acceptable in our society. If there are, which ones are we willing to allow, and how should they be regulated to mitigate the risks? The discussion has begun with the recent study and report by the Parliamentary Standing Committee on Access to Information, Privacy and Ethics (ETHI), where the Canadian Civil Liberties Association (CCLA), the International Civil Liberties Monitoring Group (ICLMG) and others made detailed recommendations. The report’s 19 recommendations reflect some of our concerns, including a call to implement a federal moratorium on using FRT until a regulatory framework concerning uses, prohibitions, oversight and accountability mechanisms, and privacy protections is democratically debated and put in place.[10]
That’s the correct course of action, given what is at stake. In February 2023, the government issued its response to the report, which failed to address the severity of the challenges posed by FRT and artificial intelligence. Civil society is rallying to fill that gap. A coalition of groups and individuals from across Canada, led by the CCLA and ICLMG among others, has come together under the banner of the “Right 2 Your Face Coalition,” with the goal of crafting impactful advocacy on regulating this dangerous technology and ensuring that a wide range of public-interest perspectives are integrated and promoted before decision-makers. In an open letter, the new coalition highlighted several key concerns with the government’s response: it ignores the calls for a federal moratorium on the use of FRT, it fails to assume a leadership role in responsible tech policy, and it relies heavily on the proposed Bill C-27 (the Digital Charter Implementation Act, 2022) as a catch-all solution, despite that Bill’s failure to adequately protect individuals’ privacy rights or to rein in artificial intelligence tools.[11]
Canada needs a rights-based approach to crafting new federal and provincial cross-sector laws for biometric protections. It also needs to update existing laws, including the Canadian Human Rights Act and the Privacy Act, to appropriately govern – and, in cases of mass surveillance, prohibit – FRT use. There are many examples globally, including a Canadian one in Quebec, where protective biometric legislation has recently been enacted or is under consideration and can provide a template.[12] To get it right, the process must begin with proactive consultation with the communities most likely to be disproportionately impacted by the technology.
There is a policy window to act, but it’s closing rapidly as FRT gains ground, often quietly and covertly, across the country. People in Canada deserve the freedom to go about their days unidentified. As the ETHI Committee rightly notes in its report: “Without an appropriate [legislative] framework, FRT and other AI tools could cause irreparable harm to some individuals.”[13] The risks are obvious. The rights engaged are multiple. The time for a social and political response is now.
Dr. Brenda McPhail does research and advocacy at the junction of privacy and technology, and is the Director of Executive Education for the Master of Public Policy in Digital Society program at McMaster University. See right2yourface.ca.
Footnotes
- 1. NEC, NeoFace Watch: Face Recognition: How It Works, 2023: https://www.nec.com/en/global/solutions/biometrics/face/neofacewatch.html.
- 2. A joint investigation of Clearview AI, Inc. by the Privacy Commissioner of Canada and the Commissioners of BC, Alberta and Quebec made this finding. See PIPEDA Findings #2021-001: https://priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2021/pipeda-2021-001/.
- 3. Clearview AI, Law Enforcement, 2023: https://www.clearview.ai/law-enforcement.
- 4. The literature in this area is extensive. Two important pieces are: Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of the 1st Conference on Fairness, Accountability and Transparency, PMLR 81:77-91, 2018; and Patrick Grother, Mei Ngan, and Kayee Hanaoka, Face Recognition Vendor Test Part 3: Demographic Effects, NIST, 2019. The technology, however, continues to evolve, and it is important to recognise that if the technology becomes more accurate, only one small problem is solved and others remain.
- 5. Tom Cardoso and Colin Freeze, “Ottawa tested facial recognition on millions of travellers at Toronto’s Pearson airport in 2016,” The Globe and Mail, July 19, 2021: https://www.theglobeandmail.com/canada/article-ottawa-tested-facial-recognition-on-millions-of-travellers-at-torontos/.
- 6. Austin Grabish, “First Nations man wants apology after being flagged as shoplifter, asked to leave Canadian Tire store,” CBC News, Oct. 19, 2022: https://www.cbc.ca/news/canada/manitoba/first-nation-apology-store-accused-1.6620457.
- 7. For a brief round-up of facial recognition uses in Canada, see: Brenda McPhail, Facial recognition explained: How is FRT used in Canada?, CCLA, Dec. 6, 2022: https://ccla.org/privacy/facial-recognition-explained-how-is-frt-used-in-canada/.
- 8. As the Court said in R. v. Jarvis, 2019 SCC 10, [2019] 1 S.C.R. 488: “Privacy, as ordinarily understood, is not an all-or-nothing concept, and being in a public or semi-public space does not automatically negate all expectations of privacy with respect to observation or recording.”
- 9. Metropolitan Police, Facial Recognition, 2023: https://www.met.police.uk/advice/advice-and-information/fr/facial-recognition.
- 10. Standing Committee on Access to Information, Privacy and Ethics, Facial Recognition Technology and the Growing Power of Artificial Intelligence: Report, October 2022: https://www.ourcommons.ca/Content/Committee/441/ETHI/Reports/RP11948475/ethirp06/ethirp06-e.pdf [ETHI].
- 11. Right 2 Your Face Coalition, Joint Letter of Concern: In reply to the ETHI Committee’s FRT & AI Report and the Government’s Response, June 21, 2023: https://right2yourface.ca/open-letter/.
- 12. See the Act Respecting the Protection of Personal Information in the Private Sector, CQLR P-39.1, recently amended by the Privacy Legislation Modernization Act (Law 25) in Quebec. Other examples include the Illinois Biometric Information Privacy Act, 740 ILCS 14; the Biometric Identifier Information Law, City of New York Administrative Code, Title 22, Chapter 12; and Washington State’s H.B. 1493, An Act Relating to Biometric Identifiers (adding a new chapter to Title 19 RCW).
- 13. ETHI, p. 64.