While the upload filter has been discussed at length, more skeptical readers might ask for a full legal analysis. Thanks to EDRI, they now have one.
We’ve been following the debate surrounding Europe’s Article 13, often dubbed the “upload filter” or the “censorship machine,” for some time. While it has received a lot of attention, some of the more skeptical readers might suspect the fears are overblown. After all, how often have you seen anyone actually cite the proposed legislation directly?
That is where European Digital Rights (EDRI) comes in. Seemingly pulling a Drew Wilson, the digital rights organization has published a paragraph-by-paragraph analysis of the legislation in question. They call it a re-deconstruction of the legislation, and you can read the full analysis here (PDF).
Here are a few examples of their analysis:
1. Online content sharing service providers referred to in paragraph -1a shall, in cooperation with rightholders, take appropriate and proportionate measures to ensure the functioning of licensing agreements where concluded with rightholders for the use of their works or other subject-matter on those services.
This is the first element of the upload filter. What is meant here is impossible to achieve without the provision of identification “hashes” (or other fingerprinting data) of copyrighted content, in line with Google’s ContentID. The “appropriate and proportionate” wording has no meaning, as neither party (the rightsholder and the service provider) would be expected to agree to measures which they did not consider to be appropriate or proportionate. There is no clarity about for whom or to what the measures are meant to be appropriate or proportionate. It certainly seems highly unlikely that third parties that are not parties to the contract (the users) would be covered by this wording.
In the absence of licensing agreements with rightsholders online content sharing service providers shall take, in cooperation with rightholders, appropriate and proportionate measures leading to the non-availability of copyright or related-right infringing works or other subject-matter on those services, while non-infringing works and other subject matter shall remain available.
This is the second element of the upload filter, which is imposed on virtually all internet services. Providers that do not have a licensing agreement cannot meet this obligation without implementing upload filters, while those who do have licenses must implement filters to monitor usage of the licensed content. The reference to “appropriate and proportionate” has no particular meaning in relation to how private companies manage their services. The provider has no obligation (as made clear by the “terms and conditions” reference above) to host any content, so the final words (“shall remain available”) of the paragraph have no legal meaning.
In order to ensure the functioning of any licensing agreement, online content sharing service providers should take appropriate and proportionate measures to ensure the protection of works or other subject-matter uploaded by their users, such as implementing effective technologies.
Although the Parliament’s amendments have the effect of removing all meaning from this sentence, this is clearly meant to refer to upload filtering.
This obligation should also apply when the information society service providers are eligible for the liability exemption provided in Article 14 of Directive 2000/31/EC.
Upload filters should also be implemented by hosting providers not otherwise regulated by this legislation.
So, there’s a handful of examples of their analysis. Obviously, the document is much more in-depth than that. Either way, it ultimately removes any doubt as to whether Article 13 is an upload filter. So, if anyone asks for source material, this document clearly serves that function.