A couple of ways the EPP can go against its own line on copyright with Article 13

For those watching the copyright debate in the European Parliament it is no mystery that the European People’s Party is the key power shaping the future of the Digital Single Market in this area. The largest Parliamentary group, whose representatives hold crucial positions on the dossier, has adopted a group line on copyright. With both the LIBE and JURI Committees debating their compromises under EPP rapporteurs, what could possibly go wrong?

The hard line and the blurred line

The Parliamentarians affiliated with the EPP have not presented a unified line in the reform debate, especially when it comes to content filtering (Article 13 of the proposal). Their positions across the various committees have ranged from hardliners such as Angelika Niebler and supporters of closing the value gap like Axel Voss, the current rapporteur at JURI, through the balanced position of Therese Comodini, Voss’ predecessor, to rapporteur Michał Boni’s decent draft report at LIBE and Róża Thun’s proposal to delete the article tabled at IMCO.

In these circumstances the EPP’s attempt to create common ground is understandable – it is a way to preserve group unity. On the other hand, the exercise can only prove effective if it shaves off the extreme positions on both ends: making the EC proposal even more troublesome for platforms and users, and deleting the article altogether.

Positions of the political groups in JURI with respect to selected elements of the DSM directive proposal [Source].

The EPP group line adopted in July 2017 tries to reconcile the perceived need to close the value gap with some arguments protecting fundamental rights. The vision for the EPP’s ideal Article 13 is to ensure that platforms enter into licensing agreements with rightholders, securing better revenue for the latter.

Harming e-commerce, taking it easy on the filtering?

Like the governments of France, Portugal and Spain, the EPP is determined to change the interpretation of the safe harbour that shields hosting providers and online platforms from liability for infringements committed by their users. In their words:

Platforms should engage in ‘communication to the public’ in the copyright sense when they are actively and directly involved in allowing users to upload works, in making works available and in promoting works to the public. They should not fall under the safe harbour clause of Art. 14 of the e-Commerce Directive and are therefore not exempt from copyright liability.

Creating this “exception from the exception” should not mean a general monitoring obligation, as far as the EPP is concerned. This matters because Article 15 of the E-Commerce Directive prohibits imposing general monitoring obligations on platforms and other hosting providers. On the best possible reading, the EPP’s stance could mean a “no” to content filtering (which in practice requires general monitoring). According to the EPP, Article 13 should also respect freedom of expression and ensure that users can effectively appeal decisions to remove their content.

At face value this looks like decent common ground for a notice-and-takedown procedure: content that infringes copyright is removed after it is published, and users can complain about wrongful removals. But we need to remember that EPP Parliamentarians have been quite vocal, through their amendments across all committees, about how filtering could help creators. Many like the proposal that the European Commission put on the table, which describes a textbook example of general monitoring of content. So, sadly, there are still ways in which EPP votes could uphold this concept, against the group’s own line.

1. Sticking to the idea that the platforms’ role is to prevent infringements before they actually happen

The EC proposal stipulates that Information Society Service Providers (ISSPs, or platforms) need to take measures to prevent infringing content from becoming available. In practice, such prevention can only be enforced by a mechanism that “knows” what users attempt to upload before that specific content becomes visible to other people on a platform.

This does not point to notice-and-takedown at all. Notice-and-takedown works post factum – as any such measure should – allowing content that has already been uploaded to be taken down while letting the user argue that the use was legitimate. Preventive measures, by contrast, operate at the moment of upload. They lead straight to content recognition technology, even though the article never mentions it explicitly.
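To make the distinction concrete, here is a minimal sketch of the two models. All names (looks_infringing, handle_notice, and so on) are hypothetical illustrations, not any platform’s real API:

```python
# A minimal sketch of the two moderation models discussed above.
# All names are hypothetical illustrations, not any platform's real API.

published = {}  # upload_id -> content that is visible to other users


def looks_infringing(content: str) -> bool:
    # Stand-in for a content recognition system (e.g. fingerprint matching).
    return "fingerprint_match" in content


def upload_with_prevention(upload_id: str, content: str) -> str:
    """Preventive model: every upload is inspected BEFORE it becomes visible.
    The check necessarily runs on all content, infringing or not."""
    if looks_infringing(content):
        return "blocked"  # never becomes available to the public
    published[upload_id] = content
    return "published"


def upload_then_takedown(upload_id: str, content: str) -> str:
    """Notice-and-takedown model: content goes live immediately; it is
    only examined if a rightholder files a specific notice about it."""
    published[upload_id] = content
    return "published"


def handle_notice(upload_id: str) -> str:
    """Post-factum removal triggered by a complaint, leaving room
    for the uploader to contest the takedown."""
    return "taken down" if published.pop(upload_id, None) else "not found"
```

The asymmetry is the whole point: in the first model the recognition step sits in front of every single upload, while in the second it is triggered only by a concrete complaint about a concrete piece of content.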

2. Keeping the reference to content recognition technologies

Content recognition technology is necessarily based on general monitoring. Whether monitoring is “general” does not depend on the intentions of the party applying it: the fact that an operator takes no interest in the nature or substance of legal uploads is irrelevant. Looking for illegal material implies that even an algorithm “interested” only in targeting infringing content has to take a peek at all the content there is. And that is enough to call this procedure general monitoring.
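A toy illustration of why, assuming a hypothetical fingerprint-matching filter: even an algorithm that only ever acts on matches still has to inspect every upload it receives.

```python
# Toy illustration: a filter that only acts on matches still inspects everything.
# The fingerprint check is a hypothetical stand-in for real recognition technology.

BLOCKLIST = {"fp_rightholder_a", "fp_rightholder_b"}  # reference fingerprints


def fingerprint(upload: dict) -> str:
    # Stand-in for a real audio/video fingerprinting routine.
    return upload["fingerprint"]


def filter_uploads(uploads: list) -> tuple:
    inspected = blocked = 0
    for upload in uploads:
        inspected += 1  # every upload is examined...
        if fingerprint(upload) in BLOCKLIST:
            blocked += 1  # ...even though only a few are acted upon
    return inspected, blocked


uploads = [
    {"fingerprint": "fp_rightholder_a"},  # infringing
    {"fingerprint": "cat_video"},         # perfectly legal
    {"fingerprint": "holiday_photos"},    # perfectly legal
]
print(filter_uploads(uploads))  # -> (3, 1): all 3 inspected, only 1 blocked
```

The ratio of inspected to blocked items is what matters here: the two legal uploads were monitored just as thoroughly as the infringing one.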

Talk about how to reward creators, and not about technology

The whole attempt to tackle the value gap through the measures envisioned in the proposed Article 13 is really about the share rightholders want of the added value that platforms bring to participation in culture, knowledge, or entertainment. This added value comes both from optimizing access to content similar to what we already like and from allowing everybody to have an audience, however small or niche it might be.

This goal can be achieved through better contractual negotiations by rightholders, or by getting more users to pay for access to quality material. Or by some sort of econometric model that would calculate how much of that added value should flow back to creators and artists.

Any designation of technology as the go-to solution for fairer revenue distribution will inevitably lead to algorithms combing through billions of uploads to extract a few cents per infringing one. This gain for rightholders will come at the price of a censorship machine ready to turn its Sauron’s eye onto any other type of content, including whatever is politically or commercially inconvenient.
