CIPPIC, Citizen Lab, and Internet Archive Post Online Harms Submissions

Submissions opposing Canada’s online harms proposal are becoming a deluge, with CIPPIC, Citizen Lab, and the Internet Archive now weighing in.

It started with just me standing up for digital rights. Then, one by one, others joined the growing call for Canada to either greatly alter or scrap its online harms proposal altogether. After I published my submission, the Internet Society Canada Chapter, the Independent Press Gallery, Michael Geist, and Open Media all added their voices to the pushback.

What started as a trickle of submissions is now quickly becoming a deluge. CIPPIC (the Canadian Internet Policy and Public Interest Clinic) has also published its submission, calling on the Canadian government to reconsider the online harms proposal. From their submission (PDF):

the government’s proposed legislation to regulate online harms seriously undermines claims that Canada is a leader in human rights. By raising the spectre of content filtering and website blocking, the current proposal threatens fundamental freedoms and the survival of a free and open internet in Canada and beyond. In an effort to combat hate speech and other ills, the proposed law threatens the free expression and privacy rights of the very equality-seeking communities that it seeks to protect.

The online harms proposal combines some of the worst elements of other laws around the world. This is why CIPPIC believes that the Department of Canadian Heritage needs to overhaul its current approach to addressing the problems caused by unlawful online content. We are seriously concerned about numerous elements of the proposed law — such as the lack of adequate transparency requirements, the loosened requirements for the Canadian Security Intelligence Service (CSIS) to obtain basic subscriber information, the various jurisdictional issues raised by the law, and whether an administrative body like the Digital Recourse Council should be able to determine what speech is legal under Canadian law.

The feedback we provide is focused on other key areas of concern. First, we focus on the need for increased clarity regarding which services or platforms are covered by the law. Second, we explain why the proposed 24-hour blocking requirement needs to be scrapped. Third, we demonstrate why the proposed proactive monitoring requirements need to be reined in. Finally, we advocate against the general requirement to identify and funnel profiling information to law enforcement about people’s online activity, in view of the chilling effects this will have on people’s online behaviour.

Canada is well-positioned to maintain its role as a global human rights leader and advocate for maintaining an internet that is open and free to all. A first step to preserving our role as a leader in this space involves an overhaul of this proposed law so that it is consistent with our democratic values.

Joining this increasingly large chorus of opposition is Citizen Lab, the Toronto-based research lab at the University of Toronto that earlier unearthed an Apple OS vulnerability exploited by NSO Group. From their submission (PDF):

We have reviewed the consultation materials, including the “Technical Paper” and the “Discussion Guide”, associated with the government’s proposal to address what it has referred to as “online harms”. We provide the following comments in response to that consultation process, divided into the following sections:

A. This Consultation Is Inadequate;
B. The Proposed Regime Will Not Achieve Its Intended Goals;
C. The Scope of the Proposal Is Overbroad and Incoherent;
D. Automated Enforcement Exacerbates Pre-Existing Problems;
E. Unidirectional Takedown Incentive Will Likely Be Inequitable and Unconstitutional;
F. Surveillance and Mandatory Reporting Requirements Are Dangerous and Chilling;
G. New CSIS Powers Are Unjustified and Inappropriately Included in this Consultation; and
H. Conclusion: Rewrite the Proposal from the Ground Up.

Technology-facilitated violence, abuse, and harassment is a real problem. Whether the violence, abuse, and harassment is based on gender (collectively, “TFGBV”), race, sexual orientation, other characteristics protected in Canadian equality law, or—more often than not—an intersecting combination of multiple characteristics, it plagues members of historically marginalized groups, who are routinely silenced and driven off the Internet as a result. This issue is serious and pressing, and it deserves and requires urgent and sustained attention from governments, technology companies, scholars, and civil society at every level.

In the same vein, thoughtless legislative measures to address these same issues for reasons of political expediency, or with insufficient care, thoughtfulness, intersectional and equitable considerations, and while lacking understanding of the practical and sociotechnical implications of such measures when implemented, do a profound disservice to the issue—as well as to targets, victims, and survivors, and to those historically marginalized groups whom online abuse, including NCDII and hate speech, most devastates.

In this respect, the proposals advanced by the government fail to account for the scholarship, concerns, and experiences of underrepresented, historically marginalized, and vulnerable individuals and communities. These are, of course, the very people who face the vast majority of technology-facilitated abuse, harassment, and violence—including women; Black, Indigenous, or otherwise racialized individuals; LGBTIQ+ individuals; individuals with disabilities; members of religious, linguistic and ethnic minority communities; immigrants and refugees; survivors of sexual violence, racist violence, and hate crimes; and sex workers—as well as individuals whose identities overlap multiple intersections among those groups.

Research—including research produced by the Citizen Lab—has consistently demonstrated that Internet filtering and content monitoring technologies often result in the disproportionate censorship and surveillance of historically marginalized individuals and communities. The technical interventions proposed by the Canadian government in the context of this consultation are emblematic of such an approach. The consultation materials advance an aggressive, algorithmic, and punitive regime for content removal it proposes, without any substantive equality considerations or clear safeguards against abuse of process. They also demonstrate the government’s willingness to enlist and empower law enforcement and intelligence agencies to intervene on these issues—whether or not the victim or survivor has consented to such intervention.

Adding their voice to the mix is Internet Archive Canada. They published their submission online as well. From their submission (PDF):

While this proposal appears centered around large social media platforms, we have deep concerns about it, including its potential for broad application to libraries and small and not-for-profit organizations like ours. We believe that libraries and others like us have a role to play in creating and sustaining a better internet, with more digital public spaces and more access to good and trustworthy information online. Unfortunately, imposing newly burdensome and potentially overbroad regulatory regimes—even with the best of intentions—is likely to make the costs of participation in certain digital spaces too high for all but the largest commercial actors. The result will be further entrenchment of the largest foreign corporations in positions of dominance online. Should the government proceed with this proposal, it should carefully consider the extent to which it will make it even more difficult for truly Canadian spaces to survive and thrive online, leaving us with a worse information ecosystem overall.

As we understand it, the government’s proposal would impose substantial costs, financial and otherwise, on any entity which is deemed to fall within the definition of an Online Communication Service. The definition could change by regulation at any time. This would make it a risky proposition to participate in online life in any way close to the definition of an OCS; with a change in definition, or even in interpretation, substantial investments of time, energy, and other resources could evaporate. And for those clearly within the concept of an OCS—whatever that is deemed to be—the costs of automated systems, the technical and human resources required to implement twenty-four hour takedowns, and all the actual and possible associated requirements, would be extraordinarily high. How could these be met by small libraries, not-for-profits, or startups? How could any but the largest multinational corporations play a part in shaping the online world? Would that situation truly address the problems at hand?

It is also important to consider the broader global context. If new and different rules are to be adopted in jurisdictions around the world, the costs of complying with each of them will multiply. This is, one must assume, why provisions like Article 19.17 of the CUSMA have been proposed and agreed to by Canada and many others. Will others ignore treaty obligations and promulgate conflicting rules? Will Canada’s adoption of unique, costly, and open-ended regulations—with potential application to broad swaths of actors and online speech—improve Canada’s internet, or make it a hinterland?

Opposition to the online harms proposal is snowballing into a major movement. While the Canadian government is widely expected to ignore submissions like these, it will be increasingly difficult to claim that people support this initiative as-is. The evidence is mounting that organizations and people across the country oppose the direction the government has taken on this.

(Last two submissions, hat tip: Michael Geist)

