CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work

Will the CJEU strike down Article 17?

On Tuesday, November 10, the Court of Justice of the European Union (CJEU) heard case C-401/19. This case is a request by the Polish government to annul the filtering obligation contained in Article 17 of the Copyright in the Digital Single Market (DSM) Directive on the grounds that it will lead to censorship and will limit the freedom of expression and the freedom to receive and impart information guaranteed in Article 11 of the EU Charter of Fundamental Rights (Charter).

The defendants in this case are the European Parliament and the Council of the European Union. In addition, the European Commission and the governments of France and Spain intervened in the case on the side of the defendants. Advocate General Saugmandsgaard Øe was also present at the hearing.

Even for astute followers of the discussions around the implementation of Article 17, the hearing contained a number of surprises. While several Member States have been soldiering on with their national implementation proposals with little regard for the fundamental rights implications of Article 17, the hearing showed that the Court is taking Poland’s complaint very seriously and that the compliance of the contested provisions of Article 17 with the Charter is far from evident. Frequent reference was made during the hearing to the recent opinion of Advocate General Saugmandsgaard Øe in the YouTube and Cyando cases, which is highly critical of extensive obligations on platforms to police the copyright infringements of their users.

On the face of it, the case is about Poland’s request to annul Articles 17(4)(b) and (c) of the DSM directive. Poland argued its case, which essentially rests on the observation that while not explicitly mandating them, Article 17(4)(b) and (c) effectively require platforms to implement upload filters because there are no other effective means to comply with the obligations contained therein. Poland argues that this will lead to censorship and will limit the freedom of information of the users of online platforms.

According to Poland, the key problem with the directive is the move away from active participation of rightholders (as initiators of removal requests in the context of notice and takedown procedures) and instead handing the responsibility of removing infringing uploads over to platforms, which will have to develop private enforcement systems to avoid liability for copyright infringement. Because platforms face no comparable risk when they limit user rights by blocking access to legal content, this creates strong incentives for over-blocking. This in turn will result in censorship and the violation of the fundamental rights to freedom of expression and information under the Charter. Consequently, the problematic parts of Article 17 should be annulled by the Court.

All other parties intervening in the case objected to this line of argument and stated that in their view Article 17 does not violate any fundamental rights. However, they presented strikingly contradictory interpretations of what Article 17 actually requires of platforms. There are two distinct lines of argument: The Commission, the Council and the European Parliament argued that Article 17 contains enough internal safeguards to prevent users’ fundamental rights from being unduly limited. On the other hand, France and Spain argued that some limitations of fundamental freedoms are justified by the objective that Article 17 seeks to achieve.

Taming the upload filters: Pre-flagging vs. match and flag

How to limit the damage filters can do

One of the most important elements of any implementation of Article 17 will be how platforms can reconcile the use of automated content filtering with the requirement not to prevent the availability of legitimate uploads. While most implementation proposals that we have seen so far are silent on this crucial question, both the German discussion proposal and the Commission’s consultation proposal contain specific mechanisms that are intended to ensure that automated content filters do not block legitimate uploads, and that uploads are subject to human review if they are not obviously/likely infringing. 

In order to achieve this objective, the German discussion draft published in June relies on the idea of “pre-flagging”: users would be allowed to flag uploads containing third party works as legitimate. Platforms would then be prevented from automatically blocking pre-flagged uploads unless they determine that the flag is incorrect because the upload is “obviously infringing”. 

By contrast, the Commission’s implementation guidance consultation proposes a “match-and-flag” mechanism: if upload filters detect the presence of a third party work in an upload and the use is not deemed to be “likely infringing”, then the uploader is notified and given the ability to state that the use is legitimate. If the user flags the upload as legitimate, the platform will have to initiate a human review of the upload, which remains available online from the moment of upload until the review has been concluded. This type of mechanism was first suggested by a group of copyright academics in October of last year. It is also at the core of the proposal that we presented during the last meeting of the stakeholder dialogue.

Both approaches provide a mechanism that limits the application of fully automated upload filters (while implicitly acknowledging the fact that many platforms will deploy upload filters). In the Commission’s proposal, filters are limited to making a pre-selection (“is the upload likely infringing?”); in the German proposal, they can only operate on unflagged content and filter out “obviously incorrect” pre-flags.
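The difference between the two mechanisms can be illustrated with a short decision-flow sketch. This is a hedged illustration, not an implementation from either proposal: the class, function names, the single stand-in infringement predicate and the return values are all our own assumptions.

```python
# Illustrative sketch of the two mechanisms' decision flows.
# All names and the infringement heuristic are assumptions for
# illustration, not taken from the German draft or the Commission text.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Upload:
    matched_work: Optional[str] = None      # set by the ACR filter if a third party work is detected
    user_flagged_legitimate: bool = False   # the user's legitimacy flag

def is_manifestly_infringing(upload: Upload) -> bool:
    # Stand-in heuristic: treat full-length matches as infringing.
    return upload.matched_work == "full-length match"

def german_pre_flagging(upload: Upload) -> str:
    """German draft: users pre-flag uploads as legitimate before publishing;
    the filter may only override a flag that is obviously incorrect."""
    if upload.matched_work is None:
        return "publish"
    if upload.user_flagged_legitimate:
        # Pre-flagged uploads stay up unless the flag is obviously wrong.
        return "block" if is_manifestly_infringing(upload) else "publish"
    return "block"  # unflagged matches may be filtered automatically

def commission_match_and_flag(upload: Upload) -> str:
    """Commission consultation: the filter only pre-selects; matches that are
    not likely infringing trigger a notification instead of a block."""
    if upload.matched_work is None:
        return "publish"
    if is_manifestly_infringing(upload):
        return "block"
    # Uploader is notified; a legitimacy flag keeps the upload online
    # until a human review has been concluded.
    if upload.user_flagged_legitimate:
        return "publish pending human review"
    return "publish"
```

Note how the user's flag enters the flow at different points: before any filtering in the German model, and only after a filter match in the Commission's model.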

Convergence on “match-and-flag”?

Both approaches have been criticised by rightholders, who claim that they undermine the “original objective of the directive” without providing alternative proposals on how automated filtering can be reconciled with the requirement not to block legitimate uploads. In addition, the German discussion proposal has also been criticised by platforms such as Google and Facebook. The platforms argue that giving users the ability to pre-flag every single upload would likely lead to substantial numbers of unnecessary (where the content in question is already licensed) or unjustified (users making excessive use of the pre-flagging tool) pre-flags, which would make such a system impractical to operate at scale.

Article 17 guidance: Don’t shoot the messenger / ne pas tirer sur le messager!

Strengthening user rights is part of Article 17

Yesterday Politico published the response submitted by the French Government to the Commission’s stakeholder consultation on the Article 17 implementation guidance. For anyone who has followed the French position in the debate about Article 17, the response (see here for an English-language “courtesy translation”) will not be very surprising. The French reaction rejects the majority of what the Commission is proposing and accuses the Commission of reversing the progress that (according to the French government) has been made with the adoption of Article 17 in early 2019. This position is largely aligned with the grievances expressed by a coalition of rightholders’ organisations in their recent letter to Commissioner Breton, and it is consistent with the way France is implementing Article 17 in its national law (next week the Assemblée nationale will vote on a law that will authorise the French Government to implement the provisions of the DSM directive by decree).

Unfortunately, it is also based on a flawed understanding of the issue at hand. Since the French authorities note in their reaction that they “will carefully consider stakeholders’ responses and are open to continued discussions so that future guidance may play their concrete technical role with regards to the correct application of article 17”, it is necessary to clear up some of the misunderstandings here.

Misrepresenting Article 17

The constant references, on the side of rightholders and from the French government, to the “original objective”, “goal” or “spirit” of the directive try to mask the fact that the actual text of the directive is not as clear-cut as rightholders and the French government would like it to be. The fact is that, in between its introduction (as Article 13) in 2016 and its adoption (as Article 17) in 2019, the provisions dealing with the “Use of protected content by online content-sharing service providers” have undergone substantial changes.

What was proposed as a relatively simple intervention to strengthen the position of rightholders vis-à-vis content sharing platforms by changing the liability position of said platforms, has – following intense criticism from academia, civil society and internet users – been transformed into a convoluted legal regime that serves a number of conflicting objectives. Article 17 now includes strong language that establishes new user rights and provides meaningful safeguards for preserving these rights. These additions to the article were essential in securing its adoption by the EU legislator. As a result, in the final version of the article, the original objective of strengthening the position of rightholders is just one of a number of objectives. 

By now it seems clear that in their fierce determination to get the new copyright directive adopted, many of the supporters of the original proposal have failed to notice that the final legislative compromise had morphed into an article that is fundamentally different from what was originally proposed. In supporting the final compromise of Article 17, rightholders and the French government have in fact supported provisions that go against their long established positions.

Civil Society letter on the Article 17 implementation guidance

Guidance must aim to protect users' rights!

Earlier today 27 (update 17-09-2020: 32) civil society organisations sent a joint letter to Commissioner Breton summarising our responses to the Article 17 guidance consultation that closed last week. In addition to organisations participating in the stakeholder dialogue, the letter has also received support from a broad coalition of digital and human rights organisations from across Europe. 

The letter expresses concerns that the proposed Article 17 guidance endorses the use of automated content blocking by online services even though it is clear that this will lead to the violation of fundamental rights. It also warns that implementations of Article 17 based on the proposed guidance will violate established principles of EU law.

In this context the letter highlights the need for meaningful safeguards for legitimate uses of content uploaded to online platforms, and stresses the need for a robust redress mechanism for users. Summarising the consultation responses submitted by the various signatories, the letter highlights the importance of ensuring that uploads that are not manifestly infringing remain online until a human review has taken place. The letter further stresses the importance of involving users’ organisations when setting the technical parameters that are used to determine if an upload is manifestly infringing or not.

The letter further highlights the need for full transparency of (automated) content removals and the ability for users (and user organisations on their behalf) to take actions against the abuse of the measures introduced by Article 17 of the DSM directive.

Finally, the letter also expresses support for the Commission’s clarification that Article 17 constitutes a “lex specialis” to the provisions of the InfoSoc Directive, which provides Member States with maximum flexibility to include user rights preserving authorisation mechanisms in their national legislation.

You can read the full letter including the list of signatories here.

A step in the right direction: Our response to the targeted consultation on the Article 17 guidance

Thanks for listening!

Yesterday we submitted our response to the European Commission’s targeted consultation on the Article 17 guidance. As we have explained previously, with this consultation the Commission was seeking feedback on its initial ideas for the Article 17 implementation guidance, which the Commission intends to publish before the end of the year. The document is intended to provide Member States with guidance on how to balance the conflicting requirements of Article 17 (preventing copyright infringements while ensuring that legal uses are not affected) when implementing it in their national legislation.

As we said in our initial analysis, we were very happy to note a clear commitment by the Commission to maintain the delicate legislative balance of Article 17, one that reflects many of the constructive contributions made by stakeholders across the spectrum during the dialogues. In general, we consider the Commission’s proposal a step in the right direction, and this is reflected in our response to the consultation. Unsurprisingly, organisations representing rightholders have reacted very differently to the proposal and have already started a campaign to pressure the Commission into abandoning its approach.

Commission consultation on Article 17 guidance: User rights must be protected at upload

Legitimate uses must be considered at all times

At the end of July the Commission published a long awaited “targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the CDSM Directive”. With this consultation the Commission makes good on its (pre-COVID) promise to “share initial views on the content of the Article 17 guidance” with the participants of the stakeholder dialogue. Nestled in-between 18 questions, the consultation document provides a detailed outline of what the Commission’s guidance could look like once it is finalised.

While we had been rather sceptical after the end of the six meetings of the stakeholder dialogue, we are pleased to see that the initial views shared by the Commission express a genuine attempt to find a balance between the protection of user rights and the interests of creators and other rightholders, one that reflects the complex balance of the provisions introduced by Article 17 after a long legislative fight.

In the remainder of this post we will take a first, high level, look at the Commission’s proposal for the Article 17 guidance, what it would mean for national implementations and how it would affect user rights.

Two welcome clarifications

With the consultation document the Commission takes a clear position on two issues that were central to the discussions in the stakeholder dialogue and that have important implications for national implementation of Article 17.

The first one concerns the nature of the right at the core of Article 17. Is Article 17 a mere clarification of the existing right of communication to the public, as rightholders have argued, or is it a special or sui generis right, as academics and civil society groups have argued? In the consultation document the Commission makes it clear that it considers Article 17 to be a special right (“lex specialis”) to the right of communication to the public, as defined in Article 3 of the 2001 InfoSoc Directive, and to the limited liability regime for hosting providers of the E-commerce Directive.

What sounds like a fairly technical discussion has wide ranging consequences for Member States implementing the Directive. As explained by João Quintais and Martin Husovec, now that it is clear that Article 17 is not a mere clarification of existing law, Member States have considerably more freedom in deciding how online platforms can obtain authorisation for making available the works uploaded by their users. This should mean that they are not constrained by the InfoSoc Directive. Therefore, mechanisms like the remunerated “de-minimis” exception proposed by the German Ministry of Justice that would legalise the use of short snippets of existing works are permitted and covered by the concept of “authorisation” introduced by Article 17.

How filters fail (to meet the requirements of the DSM directive)

Three common failure modes of upload filters

Article 17 of the DSM directive establishes that Online Content Sharing Service Providers (OCSSPs) are liable for copyright infringing uploads by their users unless they either obtain a license for the use of such content, or take a number of measures designed to prevent the availability of such content on their platforms. While the directive never explicitly talks about filters or automated content recognition (ACR) systems, it is assumed by all sides of the debate that, in order to meet this obligation, platforms have little choice but to implement ACR-based filtering systems that will scan all user uploads and block or remove uploads that contain works that have been flagged by their rightholders.

This de-facto requirement to implement upload filters is – by far – the most controversial aspect of the entire copyright directive and it continues to dominate the discussions about the implementation of Article 17 into national legislation.

In this context, it is important to remember that the use of such filters is not new and that their functioning can already be observed in practice. What is new, however, is the de-facto requirement for OCSSPs to implement filters as well as a number of requirements that OCSSPs need to meet to ensure that any measures (including filters) implemented by them are not infringing on the rights of users. This includes the requirement that any such measures “shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or limitation”.

In other words, one of the most important contributions of the DSM directive is that, for the first time, it establishes conditions that need to be met by automated upload filters.

As we have argued many times before, these conditions present a very high hurdle for any technological solution to clear. The fact that upload filters are incapable of determining whether a particular use of a copyrighted work is infringing has been established beyond any doubt. But the failure to assess context is not the only way that filters based on automated content recognition fail to meet the requirements established by the directive. In total there are at least three distinct ways in which filters fail.

In the remainder of this post we will discuss these three failure modes based on examples collected by Techdirt in the course of a single week: removals caused by incorrect rights information, removals caused by the inability to recognise legitimate uses, and removals caused by the inability to accurately identify works.

Incorrect rights information

Incorrect rights information is probably the most common and best documented cause for the unjustified removal (or demonetisation) of works on YouTube.

ACR systems execute actions specified by whoever is recognised as the owner of a work. For the purposes of these systems, the owner of a work is whoever claims to be the owner and, unless there are conflicting ownership claims, there is no way to verify such claims, as there are no authoritative databases of ownership rights. As a result it is possible to claim ownership of public domain works (which no-one owns), of works that have been freely or widely licensed by their owners, or of any copyrighted work that has not already been claimed by someone else.
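The structural problem described above can be reduced to a few lines of code. The sketch below is purely illustrative (the class, method names and example fingerprints are our own assumptions, not how any real ACR system is implemented), but it shows why a first-come, unverified claims registry inevitably produces unjustified removals:

```python
# Illustrative sketch of an unverified ownership-claims registry.
# All names are assumptions for illustration only.
class ClaimRegistry:
    def __init__(self):
        self.claims = {}  # work fingerprint -> claimed owner

    def claim(self, fingerprint: str, claimant: str) -> None:
        # There is no authoritative rights database to check against,
        # so absent a conflicting claim the first claimant simply wins.
        self.claims.setdefault(fingerprint, claimant)

    def action_for(self, fingerprint: str) -> str:
        # Any matching upload is actioned on the claimant's behalf,
        # even if the underlying work is public domain or freely licensed.
        return "remove" if fingerprint in self.claims else "allow"

registry = ClaimRegistry()
# A label claims a public domain recording that no-one owns:
registry.claim("beethoven-symphony-5", "SomeLabel")
print(registry.action_for("beethoven-symphony-5"))  # prints "remove"
```

Nothing in this flow ever checks whether "SomeLabel" actually holds any rights; the removal is triggered by the claim alone.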

Video recording from last week's COMMUNIA Salon

Last Monday, on the first anniversary of the entry into force of the Copyright in the DSM Directive, we held the first virtual edition of our COMMUNIA Salon. During the event we presented an overview of the implementation status in the EU member states, zoomed in (sic!) on various legislative proposals to implement Articles 15 and 17 of the Directive and discussed ways to challenge parts of the Directive via the legal system. If you missed the event you can watch a recording of the presentations and the subsequent discussion here:

The event was kicked off by Teresa Nobre, who launched the COMMUNIA implementation tracker and presented an overview of the implementation in the EU member states. While France has already implemented the press publishers' right, the majority of EU member states are still in various stages of consultation. One year after the entry into force of the Directive, only Belgium, Croatia, France, Germany, Hungary and the Netherlands have put forward legislative proposals.

New policy paper on fundamental rights as a limit to copyright during emergencies

Adjusting essential uses to new modes of living

Today, Communia released a policy paper on fundamental rights as a limit to copyright during emergencies. This policy paper has been prepared in the context of the COVID-19 pandemic, which has caused a massive disruption of the normal organization of society in many EU countries. 

In our paper we argue that, in order to transpose education, research and other public interest activities from public locations to private homes during government-imposed lockdowns, we need to be able to rely on the understanding that fundamental rights can, in exceptional situations, function as an external limit to our national copyright systems.

The main conclusions of our paper are the following:

First Conclusion

The educational and research exceptions and limitations provided for in Article 5(3)(a) of the InfoSoc Directive and in Articles 6(2)(b) and 9(b) of the Database Directive, and the public lending exception provided for in Article 6(1) of the EU Rental and Lending Rights Directive are mandatory for Member States, due to the fundamental rights that they internalize, namely those enshrined in Articles 11(1), 13 and 14(1) of the EU Charter of Fundamental Rights.

Article 17 Stakeholder dialogue: COMMUNIA input paper

Our principles for the Commission guidelines

Given the ongoing health emergency, the European Commission’s stakeholder dialogue on the implementation of Article 17 of the Copyright Directive is currently suspended. The 7th meeting of the stakeholder dialogue, which was originally scheduled for Monday of this week and where the Commission was expected to “share initial views on the content of the guidelines”, has been cancelled, and it is currently unclear how and when the stakeholder dialogue will be resumed. In the meantime, the European Commission is continuing its work on the guidelines.

While we are waiting for news on the future of the stakeholder dialogue, we have produced an input paper for the Commission (pdf) that summarises our observations from the stakeholder dialogue so far and formulates a number of principles that the Commission should take into account when drafting its guidelines. In line with our initial principles, the input paper focuses on licensing, transparency and procedural safeguards for users’ rights. The paper builds on the model that we had presented during the last meeting of the stakeholder dialogue.

Specifically, we are asking the Commission to include the following in the Article 17 implementation guidelines: 

  • Recommend that national lawmakers fully explore all legal mechanisms (including collective licensing with extended effect, mandatory collective management schemes and other non-voluntary licensing schemes) for granting OCSSPs authorisation to make available on their platforms copyright-protected works and other subject matter uploaded by their users.
  • Require that all ownership claims made in the context of the measures required by Article 17 must be made fully transparent to allow public scrutiny and prevent unjustified removals or blocking by rightholders claiming ownership of works that they do not own.
  • Require that OCSSPs publish statistical information on the number of removal/blocking actions as well as the number of complaints and the resolution of complaints arising as the result of such actions.
  • Require that in cases other than obvious infringement, and in order to prevent automated measures from affecting lawful uses, users must have the ability to override all automated actions before the blocking/removal takes effect.
  • Require that in case of obvious (“prima facie”) infringement uploaded content can be automatically blocked/removed under the condition that uploaders have the ability to easily and effectively challenge such blocks/removals.
  • Require that users must be able to rely on all existing exceptions as grounds for challenging removal/blocking actions and must be able to dispute the ownership claims on which an action is based.
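The interplay between the fourth and fifth points can be condensed into a single decision rule. The sketch below is a hedged illustration of the logic we are asking for; the function name, parameters and return values are our own and appear nowhere in the input paper:

```python
# Illustrative decision rule for the requested safeguards:
# only prima facie ("obvious") infringement may be blocked automatically,
# and even then a challenge must remain available; in all other cases the
# uploader's override prevents blocking before it takes effect.
def handle_filter_match(obviously_infringing: bool, user_overrides: bool) -> str:
    if obviously_infringing:
        # Automatic blocking is allowed, but the uploader must be able
        # to easily and effectively challenge it afterwards.
        return "blocked (challenge available)"
    if user_overrides:
        # Non-obvious cases: the override takes precedence over the filter.
        return "published"
    return "blocked (challenge available)"
```

Under this rule the filter alone never has the final word on a non-obvious case where the user has objected.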
