Party leaders urged to vote against the Artificial Intelligence and Data Act

March 14, 2023

TO:
The Honourable Pierre Poilievre, P.C., M.P., Leader of the Opposition
Yves-François Blanchet, M.P., Bloc Québécois Leader
Jagmeet Singh, M.P., NDP Leader
Elizabeth May, M.P., Green Party Parliamentary Leader

RE: Letter to party leaders urging vote against AIDA at second reading

Dear Party Leaders, 

We, the undersigned organizations and individuals, write to express our concerns regarding the Artificial Intelligence and Data Act (AIDA), part of Bill C-27: An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts (Digital Charter Implementation Act 2022).

As illustrated by the Standing Committee on Access to Information, Privacy, and Ethics (ETHI) report on Facial Recognition Technology and the Growing Power of Artificial Intelligence, regulation is urgently needed to safeguard against devastating impacts on the privacy and human rights of Canadians. The application of AI systems has a highly significant and potentially negative impact in sensitive areas, most notably healthcare, employment, immigration, border security, and education.

However, we do not believe that AIDA, in its current iteration, will address these challenges. While regulation is urgent, AIDA offers an inadequate response and would cause more harm than good. It is important that we get it right. 

The Speaker of the House of Commons has already ruled that AIDA is sufficiently different from the rest of C-27 to deserve its own vote. We believe it should not only be studied separately, but re-thought completely. As a result, we are urging your parties to vote against AIDA at second reading, allowing the rest of C-27 to move forward. A few key reasons:

  • The clear absence of public consultations has made it difficult for civil society groups, researchers, and historically marginalized communities to contribute meaningfully to the legislation.1 
  • Many important pieces of the Act are left to regulation, and will be decided on only after it is passed. This will result in less scrutiny and transparency.2 
  • The proposed oversight is arbitrary and the enforcement mechanism is fragile.3
  • The Act fails to apply to government institutions, including national security agencies. This opens the door to abuses by law enforcement agencies, such as the Royal Canadian Mounted Police’s (RCMP) unlawful use of Clearview AI’s facial recognition technology. While we recognize that it is common for separate legislation to regulate the private and public sectors, we believe this should be reconsidered in light of the blurring boundaries between the two sectors when it comes to AI.4 
  • The Act does not address the significant human rights implications of algorithmic systems.5 

While aspects of these problems could be addressed during committee study, we would be concerned about three issues:

  • That, as part of a larger study of C-27, AIDA will not receive the degree of scrutiny it requires.
  • That the committee cannot engage in the level of public consultation necessary to address the flaws in this bill, consultation the government should have undertaken before tabling the legislation.
  • That key amendments necessary to address the flaws in the Act would be deemed to go beyond what is possible at committee, for example, applying the Act to government institutions or establishing an adequate, independent oversight body.

Sincerely, 

Organizations:
International Civil Liberties Monitoring Group
OpenMedia
Canadian Civil Liberties Association
Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic
Tech Reset Canada
Digital Public
Ligue des droits et libertés
Public Interest Advocacy Centre 

Individuals:
Christelle Tessono, Tech Policy Researcher
Yuan Stevens, Legal and Policy Advisor
Renée Sieber, Associate Professor at McGill University
Ana Brandusescu, Artificial Intelligence Governance Researcher
Blair Attard-Frost, PhD Candidate at University of Toronto Faculty of Information
Thomas Linder, OpenNorth
Maurice Jones, Concordia University
Fenwick McKelvey, Communication Studies, Concordia University
Luke Stark, Western University


  1. Wylie, Bianca. “ISED’s Bill C-27 + AIDA. Part 1: Tech, Human Rights, and the Year 2000.” Medium, October 9, 2022; and Tessono, Christelle, Yuan Stevens, Momin M. Malik, Sonja Solomun, Supriya Dwivedi & Sam Andrey. AI Oversight, Accountability and Protecting Human Rights: Comments on Canada’s Proposed Artificial Intelligence and Data Act. Cybersecure Policy Exchange, November 2022.
  2. Scassa, Teresa. “Statutory Madlibs – Canada’s Artificial Intelligence and Data Act.” Teresa Scassa – Blog, July 20, 2022.
  3. “Not Fit For Purpose: Canada Deserves Much Better.” Center for Digital Rights, October 28, 2022.
  4. “Police Use of Facial Recognition Technology in Canada and the Way Forward.” Office of the Privacy Commissioner of Canada, June 10, 2021.
  5. As discussed in the Cybersecure Policy Exchange and Center for Digital Rights reports.
