
Is ‘the Internet censorship machine’ really happening? An update on the new Copyright Directive (part 2)

21 November 2018

In our previous blog, we discussed article 11 and the new rights it confers on publishers of press publications. In this one, we will take a close look at article 13 and the obligations it imposes on online platforms to monitor and filter content posted by their users, in order to prevent or reduce copyright infringement. Are upload filters really going to be mandatory, and is this going to wreck the Internet as we know it?

What obligations does article 13 impose and on whom?

In short, article 13 obliges online platforms on which large amounts of content are shared by users (think of YouTube, Facebook, etc.) to cooperate with rightholders to ensure that copyright-infringing material is not available on their platforms.

For this purpose, online platforms are required to implement appropriate, proportionate and effective measures or technologies that can recognize protected content and determine whether the use of that content is covered by an agreement with a rightholder. Although the Directive itself does not explicitly use the words ‘filter’ or ‘block’, the obligation to monitor and recognize protected content and to prevent the availability of infringing works certainly implies a filter.

Certain service providers are exempted, such as non-profit online encyclopedias (like Wikipedia), internet access providers and cloud services that allow users, including businesses, to upload content for their own use.

Why are article 13 and the filtering obligations so controversial?

The ‘upload filters’ have received much public criticism. Fearing the consequences for an open and free Internet, for creativity, and for start-ups, many Internet professionals argued against the obligations under the Directive. The most striking outcry was a letter to the President of the European Parliament, written by 70 technology and legal experts, including Vint Cerf (Internet pioneer), Tim Berners-Lee (inventor of the World Wide Web) and Jimmy Wales (co-founder of Wikipedia). They criticized the burden for small platforms to comply, the uncertainty about which platforms would actually fall under the Directive, and the change to the liability model of the Electronic Commerce Directive. According to the critics, the overall impact on the open and free Internet would be disastrous.

Can filtering ever be good enough?

Filtering content on the Internet accurately is technically very difficult to achieve. There is always a significant risk of filtering too much or too little content. Broken down, what is asked of online platforms is to (1) store all copyrighted works and all rightholders in the world, together with all licenses ever granted anywhere in the world, in a global database, (2) automatically monitor all the content that is being uploaded by users and check whether it is present in that copyright database, taking into account exceptions to copyright (like parodies), and (3) automatically block the content if there is no license or exception. This is an enormous challenge, and even beginning to undertake it may require resources that only the very largest platforms have at their disposal.
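To make the scale of that task concrete, here is a deliberately naive sketch in Python of the pipeline such an obligation implies. All names are hypothetical, and the fingerprinting and exception-recognition steps are stand-ins; real content-recognition systems (audio fingerprinting, perceptual hashing, global license databases) are vastly more complex.

```python
# Hypothetical, deliberately simplified sketch of the filtering pipeline
# implied by article 13. Every name here is illustrative, not a real system.
import hashlib
from dataclasses import dataclass


@dataclass
class Work:
    work_id: str
    rightholder: str


# (1) A global database of every protected work and every license ever granted.
FINGERPRINT_DB = {}      # fingerprint -> Work
LICENSES = set()         # (platform_id, work_id) pairs


def fingerprint(upload: bytes) -> str:
    # Stand-in for content recognition. An exact hash only matches bit-identical
    # copies; a real filter must also match re-encoded, cropped or remixed works,
    # which is exactly where over- and under-blocking creep in.
    return hashlib.sha256(upload).hexdigest()


def covered_by_exception(upload: bytes, work: Work) -> bool:
    # Stand-in for recognising parody, quotation, education, etc.
    # This is the judgement call that today still requires a human expert.
    return False


def handle_upload(platform_id: str, upload: bytes) -> str:
    # (2) Monitor every upload and check it against the copyright database.
    match = FINGERPRINT_DB.get(fingerprint(upload))
    if match is None:
        return "publish"    # no protected work recognised
    if (platform_id, match.work_id) in LICENSES:
        return "publish"    # use is covered by a license agreement
    if covered_by_exception(upload, match):
        return "publish"    # parody, quotation, education...
    return "block"          # (3) block everything else
```

Even in this toy version, all the hard problems are hidden inside the two stand-in functions: recognising a work in transformed form, and deciding whether a copyright exception applies.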

And even then, all of the vast resources of YouTube were apparently not enough to prevent a video of a purring cat from being blocked, because the filter assumed that it contained copyrighted works of EMI Music. If upload filters already block the purring of a cat, what would happen to, for example, covers of popular songs?

The end of copyright exceptions?

Furthermore, copyright laws provide for exceptions and exemptions, which are complex and require human expertise to apply properly. (If and when this can be automated completely and accurately, the copyright lawyer could be the next profession made obsolete by robots and AI.)

For example, recognising a parody is very difficult; this is still a typical task for humans. According to the Court of Justice of the European Union (CJEU), the essential characteristics of parody are ‘to evoke an existing work, while being noticeably different from it, and to constitute an expression of humour or mockery’. Furthermore, there should be a fair balance between the rights of the author and the freedom of expression of the parody-maker. Although algorithms are getting more advanced every day, it is fair to say that it is currently impossible to correctly recognise a parody (or other copyright exceptions) with an upload filter.

Other copyright exceptions are also in danger when upload filters are used. Take, for example, the exception allowing the use of copyrighted material for educational purposes. In 2015, a video of a Harvard law lecture was taken down by YouTube’s Content ID filter (known for its many flaws and a prime example of why upload filters currently fail). The professor used snippets of Jimi Hendrix covers, causing the Content ID filter to automatically block the video. Especially ironic, since it was a lecture on copyright!

When in doubt, block?

Because online platforms will naturally wish to minimise any risk of infringement and claims by rightholders, it is reasonable to expect that filters will block content in cases of doubt, even in cases like those mentioned above, where a human expert would have been certain of the content’s legality but the filtering machine lacked the sophistication to apply the relevant copyright exception. This could lead to private censorship, especially considering how important online platforms have become for sharing ideas, art, and culture. Although the Directive obliges service providers to put complaint and redress mechanisms in place, it is doubtful whether these will prove sufficient.
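The incentive problem can be illustrated with a minimal, purely hypothetical decision rule: the platform alone chooses the confidence threshold above which an upload is blocked, and since the platform bears the liability risk while the uploader bears the cost of over-blocking, a risk-averse platform will set that threshold low. The value below is invented for illustration only.

```python
# Hypothetical illustration of the 'when in doubt, block' incentive.
# The threshold value is invented; the point is who gets to choose it.
BLOCK_THRESHOLD = 0.3  # a risk-averse platform sets this low


def decide(match_confidence: float) -> str:
    """match_confidence: the filter's estimate (0..1) that an upload infringes."""
    return "block" if match_confidence >= BLOCK_THRESHOLD else "publish"


# A parody the filter is only 40% sure about is still blocked, even though
# a human expert might immediately recognise it as lawful.
print(decide(0.4))  # -> block
```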

A paradigm shift from the existing liability regime for intermediaries

The obligation to actively filter content (even when combined with redress mechanisms) drastically changes the classic role of online platforms on the Internet. The hosting exemption under the Electronic Commerce Directive (2000/31/EC) provided the exact opposite: it shielded service providers from liability as long as they had no knowledge of the illegality of the content stored by their users. They were only obliged to remove or disable access to such content once they were notified of this illegality. This exemption stressed the passive role of online platforms. The new Copyright Directive, however, requires online platforms to actively monitor all content that is uploaded.

The amendments by the EP: what has been changed?

After initially rejecting the Directive in order to reopen the debate, the EP has now amended and approved the draft text. What are the most important amendments, and can they sufficiently alleviate the concerns?

First of all, small and micro enterprises are now exempted from the Directive. These are businesses with fewer than 50 employees and an annual turnover or annual balance sheet total below €10 million. It is still quite unclear what actions should be taken by, for example, medium-sized companies. Could the requirement that measures be ‘proportionate’ be interpreted so that smaller providers with fewer resources are forgiven for having less accurate filters, with more under-blocking, over-blocking, or both? Or even no filter at all, if the company sufficiently substantiates that it is not capable of producing a filter accurate enough to be called ‘proportionate’?

In fact, another way to look at the requirement of proportionality is that an obligation to ‘proportionately filter’ requires by law something which is impossible to achieve in reality. After all, the criticism of this proposal essentially revolves around the very problem that content-recognition techniques and filters are insufficiently capable of identifying and blocking content correctly, with due regard for the intricacies of copyright and the applicable exemptions. That arguably makes filtering by definition not a proportionate but a disproportionate measure (at least for now).

A further amendment is that the Directive now requires human intervention and responses without undue delay in the mandatory redress mechanisms. This may somewhat increase the quality of this remedy, but it is not to be expected that copyright experts will be appointed to assess each complaint. It also significantly increases the financial and administrative burden on companies, especially smaller ones, while it is not certain that it will actually protect the rights of content uploaders.

In addition, the following sentence was added to article 13: ‘Cooperation between online content service providers and right holders [i.e. content filters, ed.] shall not lead to preventing the availability of non-infringing works or other protected subject matter, including those covered by an exception or limitation to copyright.’ The European Parliament states in its press release that the measures taken by online platforms must be designed in a way that prevents this. As we observed earlier, this appears to require providers to do the impossible: to design and implement filters that are actually accurate and correctly apply the limitations and exceptions to copyright. As such, this amendment could be described as a mere ‘cosmetic change’, as Julia Reda (Member of the European Parliament for the Pirate Party and Vice-President of the Greens/EFA group) put it.

Is the proposal now final and when will it become effective?

The vote in the European Parliament is an important milestone in the legislative process, but not the final step. The proposal now moves into the ‘trilogue’ negotiations between the EU Council, Parliament and Commission, which will need to reconcile their respective versions into a final text. When exactly this process will be concluded is impossible to say at this point, but (early) 2019 appears a possibility. From that moment, the standard implementation period of two years will apply for the EU member states to transpose the Directive into their national laws.

Regardless of your opinion on the matter, whether you are an online platform, a rightholder, or a concerned citizen, it will be important to follow the legislative process closely. We will keep you posted as new developments occur.

If you have any questions about how your organisation may be affected, please do not hesitate to contact us.