Article 13 Slammed By the United Nations as a Free Speech Threat

The United Nations has condemned the European copyright proposal known as Article 13, blasting it as a threat to free speech.

We could be just two days away from a crucial vote on the now infamous Article 13 proposal. The proposed law, dubbed the "upload filter" or the "censorship machine," has been blasted by many as a threat to free speech. It would mandate that websites accepting user-generated content install filtering software that blocks all allegedly infringing material. The problem is that no technology exists that can reliably recognize exceptions such as fair use. Additionally, it would raise the barrier to entry so much that many feel only currently existing platforms could sustain their existence under the new rules.

So, it may come as little surprise that Article 13 is being widely condemned in many circles. Digital rights activists are mobilizing to stop the censorship machine. They are currently encouraging European citizens to contact their lawmakers and tell them not to support the proposed law.

Meanwhile, innovators, including the founders of the Internet, have submitted a joint letter condemning the censorship machine, saying that it will shut down the open nature of the Internet as we know it. Others are going so far as to call the censorship machine a "carpet bombing" of the entire free Internet.

Now, as the days before the vote wind down, the chorus calling to stop the censorship machine is only growing louder. BoingBoing's Cory Doctorow is pointing to a report prepared by the United Nations that slams Article 13. The report says that Article 13 is a direct threat to human rights and free speech. Doctorow writes the following about the report:

David Kaye […] is the UN’s Special Rapporteur on freedom of expression; he just released a detailed report on the catastrophic free speech implications of Article 13, the EU’s proposed copyright rule that would make sites filter everything their users post to check for copyright violations.

Kaye’s report points out the grave deficiencies with the plan: that it throws fair dealing (the right to reproduce copyrighted works for parody, commentary, criticism, etc) under the bus, because computers can’t tell whether you’re reproducing a work to comment on it or to just make it available; that it leaves users who get improperly censored out in the cold, with no judicial review of the machines’ orders to block their speech; and that it tilts the internet to favour the (mostly US-based) giant internet companies, while imposing an undue burden on EU competitors who are just getting started.

The document itself (PDF) is highly detailed in its findings. Here is what the United Nations said about concerns with free speech:

Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating “best efforts” and taking “effective and proportionate measures.” Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave considerable leeway for interpretation.

The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be “provided by law.” Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

As for redress mechanisms, the report suggests that they are so insufficient that they may violate human rights:

The Art 13(7) proposal for content sharing providers to establish a “complaint and redress mechanism” does not sufficiently address these concerns. The designation of such mechanisms as the main avenue to address users’ complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content – particularly in the context of fair use and other fact-sensitive exceptions to copyright – may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.

The report further comments on the impacts the proposed law could have on small businesses and non-profit organizations:

I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an “online content sharing provider” under Article 2(5) is based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)’s criteria for “effective and proportionate” measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.

In conclusion, the United Nations called on the voting committee to address the concerns highlighted above:

I urge Your Excellency and your Member State Governments to ensure that any measure the EU adopts to modernize its copyright laws addresses these concerns and is consistent with Article 19 of the ICCPR and related human rights standards.

At this point, it is unclear what impact the United Nations can have on such a vote. Still, the report does highlight many of the major concerns others have already raised. If anything, it shows that the more people become aware of these proposed laws, the more condemnation those laws receive.

Drew Wilson on Twitter: @icecube85 and Google+.
