Take action!

ICLMG statement: Canada Must Oppose Genocide in Gaza and Defend Free Expression at Home

As United Nations officials and many legal experts have warned: there is a genocide underway in Gaza. A “humanitarian pause” was not enough. As of December 6, Israel has killed at least 21,731 Gazans, including 8,697 children, and shows no intention of stopping until Gaza is “erased.”

Genocide is never justified, including in the “fight against terrorism.”

The UN Genocide Convention – which Canada has ratified – stipulates that “states that have the capacity to influence others have a duty to employ all means reasonably available to them to prevent genocide.” Therefore, Canada has the obligation to not only call for a permanent and immediate ceasefire, but to immediately halt all arm sales, transfers and military aid to Israel.

Furthermore, the International Civil Liberties Monitoring Group (ICLMG) is deeply alarmed by the overwhelming number of reported incidents of Islamophobia and antisemitism; by the fact that people are being investigated, suspended and/or fired by their employers for expressing solidarity with the Palestinian people and/or calling for a ceasefire; and by the growing criminalization of dissent.

Support for the human rights and lives of Palestinians must not be conflated with support for hate or terrorism. Already, governments internationally have moved to criminalize or outright ban protests and restrict speech in support of Palestinian lives and against the decades-long Israeli occupation and the ongoing genocide. In Canada, the arrest of eleven people for engaging in a political protest against the head of a charity that supports IDF soldiers raises serious concerns about the policing of political speech.

It is necessary that you publicly, clearly, and without delay reassert the importance of upholding freedom of expression and the right to dissent – in Canada and elsewhere.

These actions and sentiments are disturbingly similar to those we saw in the aftermath of September 11, 2001, and in response to the protests against the so-called “War on Terror” that followed. Governments – including the Canadian government – used a climate of fear and division to justify limits on freedom of expression and assembly, to drastically increase surveillance, and to undermine the civil liberties of vast swaths of the population, particularly Muslims and Arabs.

We thus urge Canadian officials to defend the freedom of expression and other human rights of all people in Canada and to support human rights, respect for international law, and justice globally.

– The International Civil Liberties Monitoring Group


Click below to send this statement to the Prime Minister, the Ministers of Foreign Affairs and Public Safety, as well as your MP. And please share widely. Thank you!

TAKE ACTION

On #GivingTuesday 2023, please help us continue to fight for civil liberties!

It’s Giving Tuesday again! This year, we hope that, in the context of this generosity movement, you’ll want to support ICLMG’s work of defending civil liberties from the negative impact of national security and anti-terrorism laws and activities, including surveillance, racial and political profiling, and complicity in torture abroad.

We do not receive any support from governments and rely on people like you to continue our work.

Say no more – I’ll support the ICLMG!

DONATE

This year, we were able to secure important wins!

  • The National Security and Intelligence Review Agency is conducting a review of CRA audits of Muslim charities – a key recommendation of our 2021 report.
  • We sent an urgent letter to the Minister of Immigration, urging him to stop the deportation of Dr. Ezzat Gouda, who faces a death sentence in Egypt following a politically motivated and unfair trial held in absentia. We were relieved to learn that he was not deported.
  • We were able to obtain, alongside several partners, important amendments to Bill C-20, which aims to create a long overdue independent review body for the CBSA.
  • We were part of efforts to secure a crucial exemption to Bill C-41 – allowing humanitarian aid to populations in need in areas controlled by groups considered terrorist by the Canadian government; for example, in Afghanistan.
  • We received the Muslim Association of Canada’s “Friend of the Community” award for our work against Islamophobia and fighting for justice.

Canadians are still facing many challenges and we need your support to fight for our rights!

  • We see more and more threats to free speech and dissent, as well as instances of Islamophobia and anti-Palestinian racism, in Canada and abroad – including the wrongful conflation of calls for a ceasefire in Gaza and for the protection of Palestinian rights and lives with support for terrorism.
  • There are still at least 25 Canadians, most of them children, arbitrarily detained in conditions akin to torture in North East Syria.
  • Intelligence agencies are pushing for more surveillance powers.
  • Proposed new federal rules won’t rein in dangerous AI systems.

I’ll donate to protect civil liberties!

DONATE

Thank you so much for your support!

Xan & Tim

Canada’s Legislation on Facial Recognition Tech is Dangerous Say Civil Society Groups and Scholars

Right2YourFace | Direction Informatique

On November 1st, 2023, the Right2YourFace Coalition – a group of prominent civil society organizations and scholars – sent the letter below to the Minister of Public Safety, the Minister of Innovation, Science and Industry and other affected parties stating that the new proposed government legislation for privacy and AI falls short and will be dangerous for Canadians.

Please take action to protect people in Canada from facial recognition technology!

TAKE ACTION

Joint letter on Bill C-27’s impact on oversight of facial recognition technology

Dear Ministers,

As Bill C-27 comes to study by the Standing Committee on Industry and Technology (INDU), the Right2YourFace Coalition expresses our deep concerns with what Bill C-27 means for oversight of facial recognition technology (FRT) in Canada.

FRT is a type of biometric recognition technology that uses artificial intelligence (AI) algorithms and other computational tools to ostensibly identify individuals based on their facial features. Researchers have found these tools to be among the most invasive technologies in widespread use. Biometric data, such as our faces, are inherently sensitive types of information. As mentioned in our Joint Letter of Concern regarding the government’s response to the ETHI Report on Facial Recognition Technology and the Growing Power of Artificial Intelligence, the use of FRT threatens human rights, equity principles, and fundamental freedoms including the right to privacy, freedom of association, freedom of assembly, and the right to non-discrimination. AI systems are being adopted at an increasingly rapid pace, and Canada needs meaningful legislation to prevent the harms that FRT poses. As it stands, Bill C-27 is not that legislation – it is not fit for purpose and is in dire need of significant amendments.

Bill C-27 comprises three parts, and our concerns lie primarily with two of them: the Consumer Privacy Protection Act (CPPA) and the Artificial Intelligence and Data Act (AIDA). The CPPA creates the rules for data collection, use, and privacy that flow into implementations covered by AIDA. While implementations like FRT are the target of AIDA, the datasets FRT systems rely on must be collected and used under the terms of the CPPA. Consequently, we submit that both the CPPA and AIDA require amendments to fully protect vulnerable biometric information.

We have identified five core issues with the Bill, including elements of both the CPPA and AIDA, that require immediate attention in order to avoid significant harm. They are:

  1. The CPPA does not flag biometric information as sensitive information, and it does not define “sensitive information” at all. This omission leaves some of our most valuable and vulnerable information—including the faces to which we must have a right—without adequate protections;
  2. The CPPA’s “legitimate business purposes” exemption is too broad and will not protect consumers from private entities wishing to use FRT;
  3. AIDA does not define “high-impact systems”. Leaving this crucial concept to be defined later in regulations leaves Canadians without a meaningful basis from which to assess the impact of the Act, and FRT must be included in any definition;
  4. AIDA does not apply to government institutions, including national security agencies that use AI for surveillance, and exempts private sector AI technology developed for use by those agencies – creating an unprecedented power imbalance; and
  5. AIDA focuses on the concept of individual harm, which excludes the impacts of FRT on communities at large.

Biometric information is sensitive information and must be defined as such

Not all data are built the same, and they should not be treated the same. Biometric information is a particularly sensitive form of information that goes to the core of an individual’s identity. It includes, but is not limited to, face data, fingerprints, and vocal patterns, and carries with it particular risk for racial and gender bias. Biometric information must be considered sensitive information and afforded relevant protections. While the CPPA mentions sensitive information in reference to the personal information of minors, the text of the Act neither defines nor protects it. This leaves some of our most valuable and vulnerable identifiable information without adequate protection. The CPPA should include special provisions for sensitive information, and its definition should explicitly provide for enhanced protection of biometric data – understanding that the safest biometric data is biometric data that does not exist.

“Legitimate business purposes” requires a better definition to protect against abuse

The CPPA states in provision 12(2) that purposes which “represent legitimate business needs of the organization” are appropriate purposes for collecting user information without the user’s knowledge or consent. It is not difficult to see businesses framing their use of FRT as serving legitimate business purposes like loss prevention – a practice already occurring in the private sector despite having been found to violate Canadian privacy law. Disturbingly, the CPPA’s legitimate business purpose loophole tilts the scales in favour of business over personal privacy, suggesting that individuals’ privacy rights are less important than profit.

It should be demonstrably clear that a person’s rights and freedoms must be adequately balanced against commercial interests. Provision 5 of the CPPA states that the purpose of the Act is to establish “rules to govern the protection of personal information.” Businesses should thus not be given free rein to decide that their use of FRT—and the risks to privacy that come with the use and collection of that highly sensitive data—qualifies as legitimate, and that biometric data may be collected without an individual knowing or consenting.

What is a high-impact system?

AIDA imposes additional measures on “high-impact systems”, requiring those who administer them to “assess and mitigate risks of harm or biased output.” Given that FRT and its associated AI systems have the ability to identify individuals using the above-mentioned biometric information, FRT must be considered high-impact. Yet, the Act offers no definition of what qualifies as high-impact, instead leaving this crucial step to the regulations.

The risk-based analysis associated with high-impact systems suggested by the wording in AIDA takes us down the wrong path. Would a grocery store’s coupon-distribution system be considered high-impact and thus require assessment and mitigation of risks of harm or biased output? What if that system were using FRT? What may seem to be a low-impact system of coupon distribution may in fact be incorporating and collecting biometric data. Given the risks and harms that FRT poses for human rights and fundamental freedoms, FRT’s impacts are both high and dangerous.

A rights-based analysis must accompany risks-based calculations. A definition of high-impact systems that includes FRT and other biometric identification technologies must be included in the bill itself.

National Security is absent from the Bill but must be addressed within it

Section 3(2) of AIDA states that the Act does not apply to a “product, service, or activity under the direction or control of” government institutions including the Department of National Defence (DND), the Canadian Security Intelligence Service (CSIS), the Communications Security Establishment (CSE), or “any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.” In plain language, private sector technology developed to be used by any of these institutions is exempt from AIDA’s reach. Given FRT’s connection to broader surveillance and AI-driven systems, excluding the DND, CSIS, and CSE—three pillars of Canada’s surveillance infrastructure—from AIDA leaves room for gross violations of privacy in the name of state security.

Further, provision 3(2)(d) gives regulators the ability to exclude whichever department or agency they please any time after AIDA has passed. This runs counter to the notion of accountable government and creates the risk that other organizations and departments using or wanting to use FRT may escape meaningful regulation and public consultation.

Collective – not just individual – harm must be considered

While protection of individuals’ data is central to AIDA, Parliament must remember that AI in general and FRT in particular are built on collective data and may pose collective harms to society. FRT systems are consistently less accurate for racialized individuals, children, elders, members of the LGBTQ+ community, and disabled people – which is in direct conflict with C-27’s intent to restrict biased outputs. This makes the inclusion of collective harm in C-27 all the more necessary.

Final Remarks

The above-outlined issues are by no means exhaustive but are crucial problems that leave Bill C-27 unequipped to protect individuals and communities from the risks of FRT. While we agree that Canada’s privacy protections need to meet the needs of an ever-evolving digital landscape, legislative and policy changes cannot be made at the cost of fundamental human rights or meaningful privacy protections. Parliament must meaningfully address these glaring issues. Together, we can work toward a digital landscape that prioritizes privacy, dignity, and human rights over profit.

Sincerely,

Canadian Civil Liberties Association

Privacy and Access Council of Canada

Ligue des droits et libertés

International Civil Liberties Monitoring Group

Criminalization and Punishment Education Project

The Dais at Toronto Metropolitan University

Digital Public

Tech Reset Canada

BC Freedom of Information and Privacy Association

See the full list of signatories here.

Watch the press conference here.

Since you’re here…

… we have a small favour to ask. Here at ICLMG, we are working very hard to protect and promote human rights and civil liberties in the context of the so-called “war on terror” in Canada. We do not receive any financial support from any federal, provincial or municipal governments or political parties. You can become our patron on Patreon and get rewards in exchange for your support. You can give as little as $1/month (that’s only $12/year!) and you can unsubscribe at any time. Any donations will go a long way to support our work. You can also make a one-time donation or donate monthly via PayPal by clicking on the button below. On the fence about giving? Check out our Achievements and Gains since we were created in 2002. Thank you for your generosity!
DONATE
