On December 5th, 2024, ICLMG’s National Coordinator Tim McSorley testified at the House of Commons Standing Committee on Justice and Human Rights for its study of Bill C-63, the Online Harms Act. You can watch his short testimony above, watch the full panel here, or read his remarks here.
As part of this study, we also submitted a brief to the committee detailing our concerns as well as our recommendations to address them:
In 2021, the federal government published a proposal for online harms regulations. The International Civil Liberties Monitoring Group (ICLMG) joined many other organizations and experts in opposing significant parts of that proposal.[1] The government responded by engaging in further consultation, resulting in the introduction of Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, in March 2024.
Bill C-63 responds in many ways to the critiques that ICLMG and others leveled regarding the first proposal:
- While still covering seven different categories of harms,[2] the bill proposes stricter content moderation rules for the sharing of “content that sexually victimizes a child or revictimizes a survivor” and “intimate content communicated without consent,” rather than applying a single approach to all seven harms.
- There is no explicit requirement that platforms monitor all content in order to identify and remove harmful posts.
- The main focus is on the regulation of platforms, in the form of obligations to create and follow online safety plans, and not on policing all users.
- Except for content that sexually victimizes a child, there is no requirement for mandatory reporting of content or users to the RCMP or CSIS.
- There are no proposals to create new CSIS warrant powers.
- There are greater rules around platform accountability, transparency and reporting.
However, there remain serious areas of concern:
- Part 1 of the Act:
- The harm of “content that incites violent extremism or terrorism” is overly broad and vague, and encompasses activities that are not defined in law, creating a serious risk of excessive censorship. Moreover, given the inclusion of the separate harm of “content that incites violence,” it is redundant and unnecessary.
- The definition of “content that incites violence” is also overly broad, allowing for the possibility of content advocating for protest and civil disobedience to be made inaccessible on social media platforms.
- While the Act does not explicitly require platforms to proactively monitor content, it does not prohibit such monitoring either.
- Lack of clarity in the definition of what is considered a regulated service could lead to platforms being required to monitor, and likely “break”, encryption tools that protect online privacy.
- Platforms would be required to preserve certain data relating to posts alleged to incite violence, or to incite violent extremism or terrorism, for one year; this is likely intended to ensure the data remains available should law enforcement receive judicial authorization to request it. However, the current wording leaves the breadth of the requirement uncertain and in need of clarification and narrowing.
- While the Act lays out transparency requirements for online platforms, it fails to include algorithmic transparency in regard to how content is recommended.
- Part 2 of the Act:
- This section amends Canada’s existing hate crime offences and creates a new stand-alone hate crime offence, and is only tangentially related to Part 1. It has raised serious concerns among human rights and civil liberties advocates regarding the breadth of the offences and the associated penalties. As it does not touch explicitly on counter-terrorism concerns, it falls outside of ICLMG’s mandate, so we will limit our comments. However, this does not mean there is no significant need for amendments, or for Parts 2 and 3 to be split from the bill and considered separately. [The brief was sent before the government’s announcement that it would split the bill, as we had advocated.]
Read the full brief here for detailed proposals to address some of our specific concerns.
Footnotes
[1] International Civil Liberties Monitoring Group, “Submission to the federal government’s consultation on its proposed approach to address harmful content online.” 25 September 2021. Online: https://iclmg.ca/wp-content/uploads/2021/10/Online-Harms-Submission-ICLMG.pdf
[2] (a) intimate content communicated without consent; (b) content that sexually victimizes a child or revictimizes a survivor; (c) content that induces a child to harm themselves; (d) content used to bully a child; (e) content that foments hatred; (f) content that incites violence; and (g) content that incites violent extremism or terrorism.