CARL, Canadian Civil Liberties Association, and CCRC Publish Their Online Harms Submissions

The deluge of submissions continues with the Canadian Association of Research Libraries (CARL), the Canadian Civil Liberties Association, and the CCRC.

The list of organizations speaking out about Canada’s online harms proposal is growing. The voices started off with just me, but opposition to this proposal has grown into a movement. The voices that have already joined include the Internet Society Canada Chapter, the Independent Press Gallery of Canada, Michael Geist, Open Media, CIPPIC, Citizen Lab, and Internet Archive Canada.

Adding its voice to the mix is the Canadian Association of Research Libraries (CARL). In their submission (PDF), they outline a number of concerns, including the inability to comply with such a law:

Our comments on the monopolistic tendencies in Big Tech directly relate to the significant penalties for non-compliance that have been outlined in the Discussion Paper. As with the GDPR, deep pockets and vast resources are required to comply with the complicated and onerous requirements in the proposed online harm legislation. Research libraries appreciate that the online services that we offer appear to fall outside of the proposed legislation, but we also feel that it is important to ensure that organizations that represent the public interest like Wikipedia, the Internet Archive, Project Gutenberg and others are also exempted. These organizations would likely not have the resources to comply, are not actively promoting online harm, and include much content that can be used to combat the spread of misinformation. Forcing them to comply with these requirements may actually force them to stop operations in Canada, further cementing the dominance and control that big tech has over the contents on the internet. As noted by Doctorow, new internet regulations like the General Data Protection Regulation (GDPR) have “done more to enshrine Big Tech’s dominance than the decades of lax antitrust enforcement that preceded them. This will have grave consequences for privacy, free expression and safety.”

CARL is concerned that the proposed approach may result in the significant over-removal of content. Without any measures to compel platforms to mitigate such overreach, this loss of content will harm the public historical record as well as small, independent content producers that depend on these platforms.
In comments that we submitted to the government that relate to the right to be forgotten (RTBF), we note that, any such right must:

  • Aim to balance an individual’s right to privacy with others’ freedom of expression.
  • Protect from the over-removal of content.
  • Respect the integrity of the historical record.

These three principles are also very relevant in this context. Over-removal in a RTBF regime or in an online harm regime as described will impact individual freedom of expression rights, increase the spectre of censorship and damage the historical record. This final point is of paramount importance to libraries. Information on the Internet may have future value for both the public and for researchers and we believe that an expert assessment of the impact of the removal on the historical record should form part of every decision to remove information from the internet.

Canadian libraries are also concerned that the proposal requires the use of algorithmic filters and AI driven tools to facilitate the removal of content. These problems are exacerbated by the 24-hour removal timelines and massive penalties for companies that fail to remove banned content. This will all-but guarantee that the system will lead to the mass removal of content. In addition, with no penalties in place for companies that over-remove content, there will be no incentive to restore content that was removed erroneously.

The first concern certainly echoes my own about an inability to comply with the new regulations. The proposal envisions a one-size-fits-all approach. The idea of penalizing us with a $10 million fine for being unable to remove something within 24 hours is absolute insanity. What if I was working long hours and was only able to respond two days later? That’s instant bankruptcy for me. The concerns about an inability to comply with the regulation, and the request for an exemption, are understandable because the proposal would easily be an existential threat to their operations as well.

For me, simply wanting that exemption is a band-aid solution to a much deeper problem with this proposal. What really needs to happen is for this proposal to be stopped dead in its tracks. Failing that, there needs to be a whitelist approach that doesn’t effectively shut down almost every Canadian operation across the country. The requirements are impossible to comply with, and the fines only contemplate the existence of multi-billion dollar enterprises on the Internet. Nothing in the proposal acknowledges that smaller operations actually exist, let alone suggests they were even considered.

Next up is the Canadian Civil Liberties Association (CCLA). In their submission (PDF), they call the proposal excessive:

The CCLA has several concerns about the government’s proposed approach to “online harms” as well as concerns about the way this consultation is being undertaken. Considering the timing of this consultation process (discussed briefly below), our submissions on the substantive concerns about the proposal are set out in brief and are not exhaustive. This should not be misinterpreted to suggest that CCLA has little to say about the proposal. To the contrary, we believe a policy issue of this level of importance, and a proposal with such novel elements, must be subject to more rigorous scrutiny. We welcome the chance to be part of more meaningful discussions in the future; we will strongly resist attempts to push through legislation on this issue in the absence of truly inclusive and substantive consultations with Canadians.

The proposal is a radical policy change, in our view. It is excessive in scope, effect and purpose. CCLA’s substantive concerns about the proposal include the following:

1) The scope of the proposal problematically attempts to deal with a variety of different “online harms” and not solely unlawful content. This amounts to significant regulation of the ways in which Canadians communicate. The proposal also fails to appreciate how different the content categories are and the possibility that they may need to be addressed using different policy tools.

2) The proposal merges communications policy/regulation with public safety, national security and law enforcement concerns in a way that is quite troubling. Mandatory reporting by online communications service providers (OCSPs), as tentatively defined in the proposal, give rise to significant questions about the use of artificial intelligence and over-reporting, as well as state surveillance and the role of large platforms in its facilitation. The law enforcement proposals would also leave a great deal of detail to be decided by regulation, leading to concerns about political interference and the absence of meaningful democratic debate.

3) The proposal includes 24-hour takedown requirements for platforms for a wide variety of content and fails to consider the significant risk to lawful expression posed by this requirement. There are few meaningful due process protections built into this scheme.

4) The proposal includes a power to seek website blocking orders. Although this is touted in this context as a means of making the internet safer, site blocking presents a real threat to an open and safe internet. Clear and meaningful safeguards are required if such a power is deemed necessary in extraordinary circumstances.

A lot of this mirrors previous concerns about the proposal: the 24-hour window for takedowns, the sweeping of both unlawful content and content that is simply considered “harmful” into the same category, and the significant concern about incorporating web censorship. These are among the themes we’ve seen in previous submissions, and they are nicely echoed here.

Finally, there is the CCRC (Canadian Coalition for the Rights of Children). In their submission, they write:

The CCRC appreciates respect for the agency of young users in the complaint-based approach to removal of harmful content. Experience shows that digital education and informing young people about their rights and how to exercise them in the digital world is one of the best forms of prevention.

The proposed provisions in the discussion paper would benefit from review and revision from the perspective of young users with regard to: clarity in the definitions of the five types of harmful content; inclusion of youth-friendly language; inclusion of a youth-friendly complaint process

The CCRC recommends more attention to ensuring that all young people have access to rights-based educational resources and support as they exercise their rights and responsibilities in an on-line world.

So, not exactly an opposing piece, but they are calling for more educational resources for children in an online world. It’s probably one of the more distinctive submissions we’ve seen so far.

We’ll continue to track these submissions and note what they are saying moving forward.

(Hat tip: Michael Geist)

Drew Wilson on Twitter: @icecube85 and Facebook.
