Europe Passes Law to Remove Terrorist Content Within 1 Hour of a Filed Complaint

Europe has passed a controversial anti-terrorism law that would require platforms to remove content within 1 hour of a complaint being filed.

Europe’s copyright directive, with its controversial Articles 13/17 and 11, has dominated a lot of the headlines around here. While it was certainly newsworthy, it isn’t the only piece of legislation drawing controversy. Another comes in the form of Europe’s Terrorist Content Regulation.

One of the most contentious elements of the law is that it would require platforms to delete material within 1 hour of receiving a complaint. That, of course, raises a lot of questions about freedom of expression because it leaves virtually no wiggle room to question the complaint or prevent false positives. According to Forbes, the penalty for systematically failing to comply with removal orders is up to 4% of annual global turnover. From the report:

the European Union now looks set to go even further, with a proposal approved on Wednesday in the European Parliament that will force social media companies to remove terrorist related content within an hour or face substantial fines. EU lawmakers have become the latest to tackle the long-overdue regulation of social media, passing a proposal “to tackle the misuse of internet hosting services for terrorist purposes.”

The risk for social media is that “companies that systematically and persistently fail to abide by the law may be sanctioned with up to 4% of their global turnover.” Their let-off is that “they will not be generally obliged to monitor the information they transmit or store, nor have to actively seek facts indicating illegal activity.”

European Digital Rights (EDRi) weighed in on the law, saying that some of the worst parts have been removed:

Although it has been questioned whether this additional piece of law is necessary to combat the dissemination of terrorist content online, the European Union (EU) institutions are determined to make sure it sees the light of day. The Regulation defines what “terrorist content” is and what the take-down process should look like. Fortunately, Members of the European Parliament (MEPs) have decided to include some necessary safeguards to protect fundamental rights against overbroad and disproportionate censorship measures. The adopted text follows suggestions from other EP committees (IMCO and CULT), the EU’s Fundamental Rights Agency, and UN Special Rapporteurs.

European Digital Rights (EDRi) and Access Now welcome the improvements to the initial European Commission (EC) proposal on this file. Nevertheless, we doubt the proposal’s objectives will be achieved, and point out that no meaningful evidence has yet been presented on the need for a new European counter-terrorism instrument. Across Europe, the inflation of counter-terror policies has had a disproportionate impact on journalists, artists, human rights defenders and innocent groups at risk of racism.

The next step in the process is trilogue negotiations between the European Commission, the European Parliament and Member States. Negotiations are expected to start in September / October 2019.

Throughout the debate, one aspect that was particularly upsetting for digital rights advocates is the fact that earlier drafts required the use of the all-too-familiar upload filter. Supposedly, platforms would need to install upload filters to block terrorist content. Naturally, such technology doesn’t really exist in any meaningful or effective manner (a crude sketch of why appears after the quote below), but the draft law was demanding it nevertheless. Towards the end of the debate, this requirement was removed, and digital rights advocates are calling this a victory. From Julia Reda:

We managed to push back on upload filters, which are included in the Commission proposal in Article 6. The text adopted by the LIBE Committee makes explicit that the state authorities who can order platforms to remove material they consider terrorist content cannot impose obligations on web hosts to monitor uploads, nor to use automated tools for that matter.

Instead, the text calls for “specific measures” that hosts can take in order to protect their services (see Article 6). These measures can range from increasing human resources to protect their service to exchanging best practices (see Recital 16). But regardless of the measure chosen, hosts must pay particular attention to users’ fundamental rights. This clarification is a major victory, considering that the introduction of upload filters seems to be the main objective of the European Commission proposal.
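To make this concrete, here is a minimal, purely illustrative sketch of what the simplest kind of “upload filter” (exact hash matching) amounts to. Every name and the blocklist itself are hypothetical, and real deployments (perceptual hashing, machine learning classifiers) are far more complex and still error-prone:

    import hashlib

    def sha256_hex(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Hypothetical blocklist of digests of previously flagged files.
    flagged_file = b"bytes of a previously flagged upload"
    KNOWN_BAD_HASHES = {sha256_hex(flagged_file)}

    def is_blocked(upload: bytes) -> bool:
        """Block an upload only if it matches a flagged file byte-for-byte."""
        return sha256_hex(upload) in KNOWN_BAD_HASHES

    print(is_blocked(flagged_file))         # True: an exact re-upload is caught
    print(is_blocked(flagged_file + b"!"))  # False: one changed byte evades the filter

A single changed byte defeats exact matching, and anything fuzzier reintroduces exactly the false positives that a one-hour deadline leaves no room to contest.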

While the contentious one-hour requirement managed to stay, this debate isn’t a complete loss for digital rights. The outlook for free speech still looks grim, but given what has been happening in Europe over the last few months, this outcome is actually considered a success.

With the legislation heading towards the trilogue stage, there is still room for debate on this one. As EDRi pointed out, those negotiations are expected to take place in September and October of this year. While it may be difficult to remove the one-hour requirement, there may be an opportunity to do so, especially in light of the upcoming European elections.

Drew Wilson on Twitter: @icecube85 and Facebook.
