On Tuesday, November 10, the Court of Justice of the European Union (CJEU) heard case C-401/19. This case is a request by the Polish government to annul the filtering obligation contained in Article 17 of the Copyright in the Digital Single Market (DSM) Directive on the grounds that it will lead to censorship and will limit the freedom of expression and the freedom to receive and impart information guaranteed in Article 11 of the EU Charter of Fundamental Rights (Charter).
The defendants in this case are the European Parliament and the Council of the European Union. In addition, the European Commission and the governments of France and Spain intervened in the case on the side of the defendants. Advocate General Saugmandsgaard Øe was also present at the hearing.
Even for astute followers of the discussions around the implementation of Article 17, the hearing contained a number of surprises. While several Member States have been soldiering on with their national implementation proposals with little regard for the fundamental rights implications of Article 17, the hearing showed that the Court is taking Poland’s complaint very seriously and that the compliance of the contested provisions of Article 17 with the Charter is far from evident. Frequent reference was made during the hearing to the recent opinion of Advocate General Saugmandsgaard Øe in the YouTube and Cyando cases, which is highly critical of extensive obligations on platforms to police the copyright infringements of their users.
On the face of it, the case is about Poland’s request to annul Articles 17(4)(b) and (c) of the DSM directive. Poland’s case essentially rests on the observation that, while Articles 17(4)(b) and (c) do not explicitly mandate them, they effectively require platforms to implement upload filters because there are no other effective means of complying with the obligations contained therein. Poland argues that this will lead to censorship and will limit the freedom of information of the users of online platforms.
According to Poland, the key problem with the directive is the move away from active participation of rightholders (as initiators of removal requests in the context of notice and takedown procedures) and instead handing the responsibility of removing infringing uploads over to platforms, which will have to develop private enforcement systems to avoid liability for copyright infringement. Because platforms face no comparable risk when they limit user rights by blocking access to legal content, this creates strong incentives for over-blocking. This in turn will result in censorship and violation of the fundamental rights to freedom of expression and information under the Charter. Consequently, the problematic parts of Article 17 should be annulled by the Court.
All other parties intervening in the case objected to this line of argument and stated that in their view Article 17 does not violate any fundamental rights. However, they presented strikingly contradictory interpretations of what Article 17 actually requires of platforms. There are two distinct lines of argument: the Commission, the Council and the European Parliament argued that Article 17 contains enough internal safeguards to prevent users’ fundamental rights from being unduly limited. France and Spain, on the other hand, argued that some limitations of fundamental freedoms are justified by the objective that Article 17 seeks to achieve.
One of the most important elements of any implementation of Article 17 will be how platforms can reconcile the use of automated content filtering with the requirement not to prevent the availability of legitimate uploads. While most implementation proposals that we have seen so far are silent on this crucial question, both the German discussion proposal and the Commission’s consultation proposal contain specific mechanisms that are intended to ensure that automated content filters do not block legitimate uploads, and that uploads are subject to human review if they are not obviously/likely infringing.
In order to achieve this objective, the German discussion draft published in June relies on the idea of “pre-flagging”: users would be allowed to flag uploads containing third party works as legitimate. Platforms would then be prevented from automatically blocking pre-flagged uploads unless they determine that the flag is incorrect because the upload is “obviously infringing”.
By contrast, the Commission’s implementation guidance consultation proposes a “match-and-flag” mechanism: if upload filters detect the presence of a third party work in an upload and the use is not deemed to be “likely infringing”, then the uploader is notified and given the ability to state that the use is legitimate. If the user flags the upload as legitimate, the platform will have to initiate a human review of the upload, which remains available from the moment of upload until the review has been concluded. This type of mechanism was first suggested by a group of copyright academics in October of last year. It is also at the core of the proposal that we had presented during the last meeting of the stakeholder dialogue.
Both approaches provide a mechanism that limits the application of fully automated upload filters (while implicitly acknowledging the fact that many platforms will deploy upload filters). In the Commission’s proposal, filters are limited to making a pre-selection (“is the upload likely infringing?”); in the German proposal, they can only operate on unflagged content and filter out “obviously incorrect” pre-flags.
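The difference between the two mechanisms can be summarised as two decision flows. The following is a purely illustrative sketch: all class, function and field names are our own invention rather than anything drawn from the German draft or the Commission’s consultation, and the handling of uncontested matches under the Commission’s model (marked below) is an assumption, since the consultation text leaves it open.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    matches_reference: bool    # an upload filter detected a third-party work
    likely_infringing: bool    # the filter's assessment of that match
    pre_flagged: bool = False  # uploader flagged the use as legitimate at upload time

def german_preflag(upload: Upload, flag_obviously_incorrect: bool = False) -> str:
    """Sketch of the German 'pre-flagging' flow: automated blocking is
    off-limits for pre-flagged uploads unless the flag is obviously incorrect."""
    if upload.pre_flagged and not flag_obviously_incorrect:
        return "online"
    if upload.matches_reference and upload.likely_infringing:
        return "blocked"
    return "online"

def commission_match_and_flag(upload: Upload, uploader_contests: bool) -> str:
    """Sketch of the Commission's 'match-and-flag' flow: filters only
    pre-select; a contested match goes to human review, with the upload
    staying available until the review has been concluded."""
    if not upload.matches_reference:
        return "online"
    if upload.likely_infringing:
        return "blocked"  # the filter may act on likely infringing matches
    # Match found but not likely infringing: the uploader is notified
    # and may state that the use is legitimate.
    if uploader_contests:
        return "online pending human review"
    return "blocked"  # assumption: uncontested matches are actioned
```

Under this sketch, the same upload (a detected match that the filter deems likely infringing) stays online in the German model if it was pre-flagged, but is blocked outright in the Commission’s model, which illustrates where the two proposals draw the line for fully automated action.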
Convergence on “match-and-flag”?
Both approaches have been criticised by rightholders, who claim that they undermine the “original objective of the directive” without offering alternative proposals for how automated filtering can be reconciled with the requirement not to block legitimate uploads. In addition, the German discussion proposal has also been criticised by platforms such as Google and Facebook. The platforms argue that giving users the ability to pre-flag every single upload would likely lead to substantial numbers of unnecessary pre-flags (where the content in question is already licensed) or unjustified pre-flags (users making excessive use of the pre-flagging tool), which would make such a system impractical to operate at scale.
Back in January of this year, we noted how both the Netherlands and France (at that point the only Member States that had presented proposals to implement Article 17) had proposed selective implementations of Article 17 that ignored crucial user rights safeguards. A lot has happened since January, but yesterday both Member States took further steps in their national implementations. And this time the two Member States are moving in opposite directions:
While the Dutch government has reacted to criticism from civil society and members of Parliament by fixing some of the most obvious shortcomings of its implementation law, the second chamber of the French Parliament has adopted a law that gives the French government the power to implement Article 17 (and the rest of the provisions of the DSM directive) however it sees fit.
Netherlands: a course correction
Back in July of 2019, the Netherlands was the first country to propose an implementation law for the DSM directive. Somewhat surprisingly (the Netherlands had been one of the most vocal opponents of Article 17 in the Council), the proposed implementation law did not make any effort to protect user rights and omitted most of the user rights safeguards contained in the final version of Article 17. After the proposal was sent to Parliament in June this year, together with Bits of Freedom and others we pointed out these shortcomings to the members of the legal affairs committee. Yesterday, in response to questions from members of the legal affairs committee, the government conceded that its original implementation proposal was incomplete and added the missing user rights safeguards to the proposal for an implementation law.
Yesterday Politico published the response submitted by the French Government to the Commission’s stakeholder consultation on the Article 17 implementation guidance. For anyone who has followed the French position in the debate about Article 17, the response (see here for an English-language “courtesy translation”) will not be very surprising. The French reaction rejects and questions the majority of what the Commission is proposing, and accuses the Commission of reversing the progress that (according to the French government) was made with the adoption of Article 17 in early 2019. This position is largely aligned with the grievances expressed by a coalition of rightholders’ organisations in their recent letter to Commissioner Breton, and it is consistent with the way France is implementing Article 17 in its national law (next week the Assemblée nationale will vote on a law that will authorise the French Government to implement the provisions of the DSM directive by decree).
Unfortunately, it is also based on a flawed understanding of the issue at hand. Since the French authorities note in their reaction that they “will carefully consider stakeholders’ responses and are open to continued discussions so that future guidance may play their concrete technical role with regards to the correct application of article 17”, it is necessary to clear up some of the misunderstandings here.
Misrepresenting Article 17
The constant references, on the side of rightholders and from the French government, to the “original objective”, “goal” or “spirit” of the directive try to mask the fact that the actual text of the directive is not as clear-cut as rightholders and the French government would like it to be. The fact is that, between its introduction (as Article 13) in 2016 and its adoption (as Article 17) in 2019, the provisions dealing with the “Use of protected content by online content-sharing service providers” have undergone substantial changes.
What was proposed as a relatively simple intervention to strengthen the position of rightholders vis-à-vis content sharing platforms by changing the liability position of said platforms, has – following intense criticism from academia, civil society and internet users – been transformed into a convoluted legal regime that serves a number of conflicting objectives. Article 17 now includes strong language that establishes new user rights and provides meaningful safeguards for preserving these rights. These additions to the article were essential in securing its adoption by the EU legislator. As a result, in the final version of the article, the original objective of strengthening the position of rightholders is just one of a number of objectives.
By now it seems clear that in their fierce determination to get the new copyright directive adopted, many of the supporters of the original proposal have failed to notice that the final legislative compromise had morphed into an article that is fundamentally different from what was originally proposed. In supporting the final compromise of Article 17, rightholders and the French government have in fact supported provisions that go against their long established positions.
Earlier today 27 (update 17-09-2020: 32) civil society organisations sent a joint letter to Commissioner Breton summarising our responses to the Article 17 guidance consultation that closed last week. In addition to organisations participating in the stakeholder dialogue, the letter has also received support from a broad coalition of digital and human rights organisations from across Europe.
The letter expresses concerns that the proposed Article 17 guidance endorses the use of automated content blocking by online services even though it is clear that this will lead to the violation of fundamental rights. It also warns that implementations of Article 17 based on the proposed guidance will violate established principles of EU law.
In this context the letter highlights the need for meaningful safeguards for legitimate uses of content uploaded to online platforms, and stresses the need for a robust redress mechanism for users. Summarising the consultation responses submitted by the various signatories, the letter highlights the importance of ensuring that uploads that are not manifestly infringing must remain online until a human review has taken place. The letter further stresses the importance of involving users’ organisations when setting technical parameters that are used to determine if an upload is manifestly infringing or not.
The letter further highlights the need for full transparency of (automated) content removals and the ability for users (and user organisations on their behalf) to take actions against the abuse of the measures introduced by Article 17 of the DSM directive.
Finally, the letter also expresses support for the Commission’s clarification that Article 17 constitutes a “lex specialis” to the provisions of the InfoSoc Directive, an interpretation that provides Member States with maximum flexibility to include user rights preserving authorisation mechanisms in their national legislation.
You can read the full letter including the list of signatories here.
Yesterday we submitted our response to the European Commission’s targeted consultation on the Article 17 guidance. As we have explained previously, with this consultation the Commission was seeking feedback on its initial ideas for the Article 17 implementation guidance, which the Commission intends to publish before the end of the year. The document is intended to provide Member States with guidance on how to balance the conflicting requirements of Article 17 (preventing copyright infringements while ensuring that legal uses are not affected) when implementing it in their national legislation.
As we said in our initial analysis, we were very happy to note a clear commitment by the Commission to maintain the delicate legislative balance of Article 17, one that reflects many of the constructive contributions made by stakeholders across the spectrum during the dialogues. In general, we consider the Commission’s proposal a step in the right direction, and this is reflected in our response to the consultation. Unsurprisingly, organisations representing rightholders have reacted very differently to the proposal and have already started a campaign to convince the Commission to abandon its approach.
At the end of July the Commission published a long-awaited “targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the CDSM Directive”. With this consultation the Commission makes good on its (pre-covid) promise to “share initial views on the content of the Article 17 guidance” with the participants of the stakeholder dialogue. Nestled in between its 18 questions, the consultation document provides a detailed outline of what the Commission’s guidance could look like once it is finalised.
While we were rather sceptical after the end of the six meetings of the stakeholder dialogue, we are pleased to see that the initial views shared by the Commission represent a genuine attempt to strike a balance between the protection of user rights and the interests of creators and other rightholders, one that reflects the complex balance of the provisions introduced by Article 17 after a long legislative fight.
In the remainder of this post we will take a first, high level, look at the Commission’s proposal for the Article 17 guidance, what it would mean for national implementations and how it would affect user rights.
Two welcome clarifications
With the consultation document the Commission takes a clear position on two issues that were central to the discussions in the stakeholder dialogue and that have important implications for national implementation of Article 17.
The first one concerns the nature of the right at the core of Article 17. Is Article 17 a mere clarification of the existing right of communication to the public, as rightholders have argued, or is it a special or sui generis right, as academics and civil society groups have argued? In the consultation document the Commission makes it clear that it considers Article 17 to be a special right (“lex specialis”) to the right of communication to the public, as defined in Article 3 of the 2001 InfoSoc Directive, and the limited liability regime for hosting providers of the E-commerce Directive.
What sounds like a fairly technical discussion has wide ranging consequences for Member States implementing the Directive. As explained by João Quintais and Martin Husovec, now that it is clear that Article 17 is not a mere clarification of existing law, Member States have considerably more freedom in deciding how online platforms can obtain authorisation for making available the works uploaded by their users. This should mean that they are not constrained by the InfoSoc Directive. Therefore, mechanisms like the remunerated “de-minimis” exception proposed by the German Ministry of Justice that would legalise the use of short snippets of existing works are permitted and covered by the concept of “authorisation” introduced by Article 17.
Last week on Thursday we held the third virtual edition of our COMMUNIA Salon. This edition focussed on the recent German proposal to implement Article 17 of the DSM Directive and included contributions by John Hendrik Weitzmann (Wikimedia Deutschland), Julia Reda (Gesellschaft für Freiheitsrechte), Martin Husovec (London School of Economics) and Paul Keller (COMMUNIA). If you missed the event you can watch a recording of the presentations and the subsequent discussion here:
Last week, Germany’s Ministry of Justice unveiled its proposal to implement Article 17 of the new Copyright Directive. In this post, we will look into the draft implementation in more detail, to understand how this proposal aims to protect user rights by:
- Making it easier for platforms to comply with the “best efforts” obligation to obtain authorization to publish their users’ uploads;
- Introducing a new exception covering minor uses of copyrighted content, which works as a fallback mechanism in the absence of authorization;
- Allowing users to override blocking/removal actions, by pre-flagging lawful uses;
- Allowing lawful content to stay up until human review and pausing the liability of platforms until a decision has been made;
- Sanctioning abusive behaviour by platforms, rightholders and users.
Complying with the “best efforts” obligation to obtain authorization
Under Article 17, platforms are deemed to carry out a copyright-restricted act when they give public access to copyrighted content uploaded by their users and, as a consequence, they must make “best efforts” to obtain an authorization to perform such acts. That authorization can hypothetically be granted through various means:
- directly by the copyright owners via individual licensing agreements (as mentioned in Article 17(1), second paragraph), or
- by collective management organizations via collective license agreements, or
- by operation of law, if the national lawmakers decide e.g. to turn this exclusive right into an exception or limitation to copyright subject to compensation.
The implementation proposals that we have seen so far in other countries have limited themselves to the traditional individual licensing mechanism. This is problematic because individual licenses alone cannot cover the countless protected materials in existence, and user rights are at greater risk if platforms have to block content at upload than if they obtain authorization to host that content.
Germany had stated, when the Directive was approved, that it would explore further legal mechanisms (e.g. exceptions and limitations and collective licenses) to grant those permissions to platforms. The draft text now published delivers on those promises and introduces some welcome innovations.
The proposed text starts by saying that platforms need to make “alle Anstrengungen” (“every effort”) to acquire those rights by contract. According to the explanatory memorandum, the wording “every effort” shall not, however, be interpreted as meaning anything other than “best efforts”. In fact, this obligation is considered fulfilled when the platform accepts a licensing offer made by a rightholder or when licenses are available through a domestic collective management organization (§4/1). Such contractual offers or collective licenses must apply to works typically uploaded to the platform, comprise a representative repertoire, cover the territory of Germany, and enable use on appropriate conditions (§4/2).
A new de minimis exception that applies to the acts of platforms and noncommercial users
When, despite making the above-mentioned efforts, a platform has not been able to obtain an authorization, the draft text provides a fallback mechanism: it partially turns the new exclusive right into a remunerated exception, which covers minor uses of copyrighted content (§6 and §7/2).
On Thursday, the 2nd of July, we will be organising the next edition of our virtual COMMUNIA Salon to discuss the new German implementation proposal for Article 17 of the DSM directive. For this event we will be joined by Julia Reda (Project lead control © at the GFF and former MEP), John Hendrik Weitzmann (General Counsel at Wikimedia Deutschland) and Dr. Martin Husovec (Assistant Professor, Department of Law, London School of Economics).
As we have written in our initial reaction, the German proposal is the first serious attempt by a member state to implement Article 17 of the directive in a way that preserves the precarious balance between the rights of users and creators. Where previous implementation proposals have limited themselves to (selectively) transposing the provisions of the directive, the German Ministry of Justice has presented a proposal that adds a number of interesting (and potentially controversial) additional provisions, which seem to be designed to strengthen the position of both users and individual creators. These include the addition of a remunerated de-minimis exception intended to safeguard common types of so-called “user generated content”, the ability for uploaders to “pre-flag” legitimate uses of protected works in their uploads, and the addition of a direct remuneration right intended to ensure that individual creators benefit from the new legal regime.
With this proposal the German government presents an alternative vision for how Article 17 could work in practice, which could serve as a model for other member states when implementing the directive. During our Salon we will hear first reactions from civil society stakeholders and analyse the legal underpinnings of the more innovative elements of the proposal, such as the proposed de-minimis exception. The presentations will be followed by an informal question and answer session.
The Salon is open to everyone and will be held on Zoom. Join us on Thursday, the 2nd of July, at 15:30 CET, by registering here. Registered participants will receive login information ahead of the event.