Canadian Government Readies Its Next Internet Crackdown With Online Harms Paper

The Canadian government is opening its next front in its crackdown on the Internet: potential “online harms” legislation.

The Canadian government is already putting the Internet on notice. Some are warning that the next prong in Canada’s war on the Internet could put the country on the same level as third world totalitarian regimes. This comes by way of the long-promised so-called “online harms” legislation. A paper released by the Canadian government would demand, among other things, that allegedly “harmful” content be removed within 24 hours.

Long time observers, including us, have noted that this online harms bill has long been promised. It was embedded in the debates surrounding Bill C-10 which, as you no doubt know, would put user generated content under heavy regulation and demote independent Canadian content in favour of legacy corporations operating in this country. Luckily, that free speech killing legislation got held up in the Canadian senate and is destined to die on the order paper should an election be called.

The Canadian government has released a technical paper and, in the process, billed that release as a “consultation” on the process. The details of what the Canadian government is planning are quite shocking. For instance, there is the 24 hour window to remove content that can be flagged by literally anyone:

11. [A] The Act should provide that an OCSP must address all content that is flagged by any person in Canada as harmful content, expeditiously after the content has been flagged.

a. [B] The Act should provide that for part [A], “expeditiously” is to be defined as twenty-four (24) hours from the content being flagged, or such other period of time as may be prescribed by the Governor in Council through regulations.
b. The Act should provide that in respect of part [A], “address” signifies that the OCSP must respond to the affected person stating that the content either a) does not meet the definition of harmful content, or b) does meet the definition of harmful content and has been made inaccessible to persons in Canada. In the latter situation, the OCSP must also make the content inaccessible in Canada within the timeframe required by part [B], and assess that content with respect to its obligations under parts [E] and [F].
c. The Act should provide that in prescribing a new timeframe as provided for in part [B], the Governor in Council may prescribe through regulations different timelines for different types or subtypes of harmful content. The Act should provide that the new timeframes could be either extended or shortened from the timeframe provided in part [B].

It really should be obvious from the outset that this is designed to be abused. If anyone can flag content, and the law compels the service to address every flag within 24 hours, then anyone can effectively shut down any service they want for any reason. The law requires, at minimum, a response within that 24 hour window. All an individual has to do is spam complaints and force staff to respond to them until it is no longer viable for the online service to remain online.
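
To get a sense of the scale involved, here is a quick back-of-envelope sketch in Python. Every number in it is invented purely for illustration; the point is how quickly even a modest amount of flag-spam translates into an impossible staffing requirement.

```python
# Back-of-envelope sketch of the complaint-spam problem. All numbers are
# invented for illustration; what matters is the shape of the math.
flags_per_day = 5_000      # what one motivated person with a script could generate
minutes_per_response = 10  # reading the content, writing the required response

staff_hours_per_day = flags_per_day * minutes_per_response / 60

# Every flag must be addressed within 24 hours, so this is a daily obligation.
print(f"{staff_hours_per_day:.0f} staff-hours per day "
      f"(~{staff_hours_per_day / 8:.0f} full-time moderators)")
# -> 833 staff-hours per day (~104 full-time moderators)
```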

Obviously, the paper only gets worse from there:

12. [C] The Act should provide that an OCSP must institute internal procedural safeguards providing users of the service in Canada with the following, as may be prescribed through regulations by the Digital Safety Commissioner, with the approval of the Governor in Council:

a. accessible and easy-to-use flagging mechanisms for harmful content;
b. notice of the OCSP’s content moderation decision within twenty-four (24) hours of the content being flagged, unless the timeframe is changed by the Governor in Council;
c. the accessible and easy-to-use opportunity to make representations, and compel an OCSP to promptly review and reconsider its decision; and
d. notice of the OCSP’s decision upon reconsideration, which must be provided without delay, including a notice of the recourse available to the Digital Recourse Council of Canada.

This raises a whole host of questions from a web administrator’s perspective. The million dollar question is, “Where do I even begin to set something like that up?” The above quotation does nothing to explain that, so you aren’t going to get an answer here. Those more technically inclined than me might come up with an idea for how to build such a system. Maybe it is a form complete with URL fields, radio buttons for the different categories, and so on. The problem is that, no matter what system you set up, it’s not up to you to determine whether it counts as “easy-to-use”. That determination may or may not fall to the Canadian government. Even then, it’s possible that someone might complain that it wasn’t “easy-to-use”. What happens then?
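
For what it’s worth, here is a minimal sketch of what the receiving end of such a flagging mechanism might look like, assuming a simple JSON endpoint built with Flask. The route, field names, and content categories are all my own guesses; the paper pins down none of this, which is rather the point.

```python
# A minimal sketch of what a "flagging mechanism" endpoint might look like.
# Everything here (the route, field names, categories, in-memory storage) is
# my own guesswork; the technical paper specifies none of it.
from datetime import datetime, timedelta, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical categories loosely mirroring the five types of harmful content
# the paper keeps referring to.
CATEGORIES = {"hate_speech", "terrorism", "incitement_to_violence",
              "child_exploitation", "non_consensual_images"}

flags = []  # stand-in for a real database


@app.post("/flag")
def flag_content():
    data = request.get_json(force=True)
    url, category = data.get("url"), data.get("category")
    if not url or category not in CATEGORIES:
        return jsonify(error="a url and a valid category are required"), 400
    flagged_at = datetime.now(timezone.utc)
    flags.append({
        "url": url,
        "category": category,
        "flagged_at": flagged_at.isoformat(),
        # The paper's 24-hour clock would start ticking here.
        "respond_by": (flagged_at + timedelta(hours=24)).isoformat(),
    })
    return jsonify(status="received", respond_by=flags[-1]["respond_by"]), 202
```

Whether a regulator would deem something like this “accessible and easy-to-use” is anyone’s guess.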

What’s more is the 24 hour threshold. At that point, you are practically required to hire staff just to handle flags. That adds an often hefty price to the cost of doing business purely for the sake of regulatory compliance. If you are a website administrator making just enough money to hire a second person, what would you rather do: hire someone to handle government paperwork, or hire someone to improve your online presence and increase the reach of your site? From a business perspective, it’s all about expansion, so if you don’t have to do the former, the latter is the obvious choice. You want to increase that cash flow and grow your audience.

The paper also adds transparency requirements:

13. The Act should provide that an OCSP must publish clear content-moderation guidelines, applicable to the five (5) types of harmful content, as may be prescribed through regulations by the Digital Safety Commissioner.

This, of course, doesn’t sound that bad on the surface. The problem is that it adds a further barrier to entry for new web owners.

It is so often the case that when someone creates a website, they do so on the basis of “winging it”. Essentially, it is often an experiment in learning on the fly. New web administrators are trying to learn how to set up something basic. They ask questions like, “How do I set up e-mail with my domain?”, “What are the steps to set up my Content Management System of choice (often WordPress)?”, “How do I log in to my administration panel?”, “How do I FTP files to my site?”, “Which plug-ins are appropriate for my site?”, “What is HTTPS and how do I set that up?”, “How do I properly set up my database?”, and so on. There is a dizzying array of questions new administrators have to sort out that normal users frequently don’t even think about.
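
Even the simplest of those questions involves real technical legwork. As one small example, here is the sort of basic sanity check a new administrator might run while sorting out the HTTPS question, with “example.com” standing in for their own domain:

```python
# The sort of sanity check a brand new administrator might run while figuring
# out HTTPS: connect to your own domain and print when the TLS certificate
# expires. "example.com" is a placeholder, of course.
import socket
import ssl


def cert_expiry(hostname: str, port: int = 443) -> str:
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return cert["notAfter"]  # e.g. 'Jun  1 12:00:00 2025 GMT'


print(cert_expiry("example.com"))
```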

Probably about the last thing on a new administrator’s mind on day one is, “How do I set up clear guidelines on harmful content, and how do I make sure they are compliant with the Digital Safety Commissioner?” If you think new administrators are going to have that going through their minds on day one of a website’s launch, you are worlds away from where these administrators actually are. The closest you could reasonably get to that kind of thinking is an administrator who is setting up a web forum and trying to plan out different moderator roles. Even then, that has less to do with complying with any government standard and more to do with organizing potential staff roles.

So, what this sets up is the very real possibility that a new administrator suddenly receives a bizarre notification a couple of months down the road saying that posted content is “harmful”, and that administrator very likely won’t know what to do with it. It could very easily relate to some user making an off-handed, obviously unserious remark about harming someone or something that another user then flagged as terrorism. Most administrators, including new administrators, are not lawyers, nor do many of them have much knowledge of the law. So, they could react in any number of ways, including:

  1. The administrator freaks out at the word “terrorism” and shuts the whole site down. (I would almost put money on that being the most common bad reaction)
  2. The administrator thinks that it is a scam message and ignores it.
  3. The administrator inadvertently violates privacy laws by handing over everything they know about the user, including the IP address.
  4. The administrator suddenly feels they are in way over their head with this website ownership business and actively considers a different line of work.
  5. The administrator gets defensive, hires a lawyer, and spends thousands trying to sort the whole thing out. In the end, he or she runs out of money and is forced to effectively declare bankruptcy, shutting the site down in the process.

The best practical reaction you are going to get is that people who are already familiar with legal matters, observers who specialize in this topic, and news junkies will set something up to be within compliance. That just leaves the remaining 95% (yes, a number I made up, but probably not all that inaccurate either) of websites, ones specializing in things like pottery, anime, video games, gardening, rap music, outdoor living, and who knows what else, blissfully unaware that the law changed right out from under them.

If you think that this might be a case of just reading too much into things, the next section should pretty much clear that up and then some:

14. The Act should provide that an OCSP must generate and provide reports on a scheduled basis to the Digital Safety Commissioner on Canada-specific data about:

a. the volume and type of harmful content on their OCS;
b. the volume and type of content that was accessible to persons in Canada in violation of their community guidelines;
c. the volume and type of content moderated;
d. resources and personnel allocated to their content moderation activities;
e. their content moderation procedures, practices, rules, systems and activities, including automated decisions and community guidelines;
f. how they monetize harmful content;
g. when relevant, their responses to the activation of the Incident Response Protocol;
h. when relevant, information on their (a) notifications to the Royal Canadian Mounted Police (RCMP) or (b) reporting to law enforcement as provided for in part [E], including:
I. the volume and type of these reports or notifications;
II. anonymized and disaggregated information about the kinds of demographics implicated; and
III. the amount and kind of data and information that was preserved, as prescribed through regulations by the Digital Safety Commissioner.

Two words for any web administrator, big or small, operating in Canada: “you’re screwed”. No algorithm, metric, or method is ever going to fully satisfy 14.(a). This is clearly written by someone who has no clue how the Internet operates and no technical background whatsoever. It is basically asking administrators to wave a magic wand and magically produce the impossible. “Harmful content” is 100% subjective, and the Canadian government wants to quantify a subjective matter. The rest of the section is just icing on the cake for how totally screwed Canadian web services are.
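
To make the problem with 14.(a) concrete, here is a deliberately crude toy in Python. Nothing about it resembles a production moderation system, but the structural flaw is identical for any automated approach: the “volume” you end up reporting depends entirely on arbitrary tuning choices, not on any objective quantity sitting in the data.

```python
# A toy illustration of why "the volume of harmful content" is not a real,
# reportable metric. The "classifier" below is deliberately crude, but the
# structural problem holds for any automated system: the number reported is
# a function of arbitrary choices (keyword list, threshold), not the data.
posts = [
    "that movie absolutely killed me",                # slang
    "I will destroy you in the rematch tonight",      # gaming trash talk
    "my bomb of a cake collapsed in the oven again",  # baking mishap
    "lovely day for gardening",
]

KEYWORDS = {"kill", "destroy", "bomb"}  # arbitrary choice #1


def score(post: str) -> int:
    return sum(any(k in word for k in KEYWORDS) for word in post.lower().split())


for threshold in (1, 2):  # arbitrary choice #2
    volume = sum(score(p) >= threshold for p in posts)
    print(f"threshold={threshold}: reported 'harmful content volume' = {volume}")
# threshold=1 reports 3 "harmful" posts; threshold=2 reports 0. None of the
# four posts is actually harmful.
```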

Michael Geist comments further on this:

The government envisions pro-active monitoring and reporting requirements that could have significant implications. For example, it calls for pro-active content monitoring of the five harms, granting the Digital Safety Commissioner the power to assess whether the AI tools used are sufficient. Moreover, the OCSs would face mandatory reporting requirements of users to law enforcement, leading to the prospect of an AI identifying what it thinks is content caught by the law and generating a report to the RCMP. This represents a huge increase in private enforcement and the possibility of Canadians garnering police records over posts that a machine thought was captured by the law.

In order to enforce these rules, the public could file complaints with the Digital Safety Commissioner. The new commissioner would be empowered to hold hearings on any issue, including non-compliance or anything that the Commissioner believes is in the public interest. The Digital Safety Commissioner would have broad powers to order the OCSs “to do any act or thing, or refrain from doing anything necessary to ensure compliance with any obligations imposed on the OCSP by or under the Act within the time specified in the order.” Moreover, the Commissioner would also be able to conduct inspections of companies at any time:

“The Act should provide that the Digital Safety Commissioner may conduct inspections of OCSPs at any time, on either a routine or ad hoc basis, further to complaints, evidence of non-compliance, or at the Digital Safety Commissioner’s own discretion, for the OCSP’s compliance with the Act, regulations, decisions and orders related to a regulated OCS.”

In fact, the inspection power extends to anyone, not just OCSs, if there are reasonable grounds that there may be information related to software, algorithms, or anything else relevant to an investigation.

The proposed legislation includes administrative and monetary penalties for non-compliance, including failure to block or remove content. These penalties can run as high as three percent of global revenue or $10 million. If there is a failure to abide by a compliance agreement, the AMPs can run to $25 million or five percent of global revenues. The AMPs would be referred to the new privacy tribunal for review. Given that liability for non-compliance could run into the millions, companies will err on the side of taking down content even if there are doubts that it qualifies as harmful.
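
It is worth pausing on those penalty figures. Here is a trivial bit of arithmetic on how the caps scale, assuming (my assumption, not confirmed wording from the paper) that “three percent of global revenue or $10 million” means whichever is greater:

```python
# Quick arithmetic on those AMP ceilings. "Whichever is greater" is how such
# caps are usually drafted, but the paper's exact wording may differ, so
# treat this as illustrative only.
def amp_ceiling(global_revenue: float, pct: float, floor: float) -> float:
    return max(global_revenue * pct, floor)


for revenue in (2_000_000, 500_000_000, 10_000_000_000):
    basic = amp_ceiling(revenue, 0.03, 10_000_000)   # 3% or $10 million
    breach = amp_ceiling(revenue, 0.05, 25_000_000)  # 5% or $25 million
    print(f"revenue ${revenue:>14,}: non-compliance cap ${basic:>14,.0f}, "
          f"agreement-breach cap ${breach:>14,.0f}")
# Note how a small operation with $2 million in revenue still faces a
# $10 million ceiling, five times its entire revenue.
```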

Geist concludes with this:

Far from constituting a made-in-Canada approach, the government has patched together some of the worst from around the world: 24 hour takedown requirements that will afford little in the way of due process and will lead to over-broad content removals on even questionable claims, website blocking of Internet platforms that won’t abide by its content takedown requirements, a regulatory super-structure with massive penalties and inspection powers, hearings that may take place in secret in some instances, and regulatory charges that may result in less choice for consumers as services block the Canadian market. Meanwhile, core principles such as the Charter of Rights and Freedoms or net neutrality do not receive a single mention.

The government says it is taking comments until September 25th, but given the framing of the documents, it is clear that this is little more than a notification of the regulatory plans, not a genuine effort to craft solutions based on public feedback. For a government that was elected with a strong grounding in consultation and freedom of expression, the reversal in approach could hardly be more obvious.

From what we’ve read so far in this paper, this is about more than just stopping “harmful content”. This is a paper spelling out how the Canadian government intends to destroy the Canadian Internet. When the impossible is being asked, larger players will simply block Canada for obvious business reasons. Anyone left in Canada will eventually be forced to shut down entirely, whether out of worry about this legislation or under the threat of multi-million dollar fines. There really is no alternative here: either you shut yourself down or the government shuts you down. That will be the choice should this paper become law.

Drew Wilson on Twitter: @icecube85 and Facebook.
