German government draft on Article 17: Two steps forward, one step back


At the beginning of February, the German Government (the “Bundeskabinett”) adopted the proposal for the implementation of the copyright directive, which opens the path for discussions in Parliament. While the proposal adopted earlier this month has undergone significant changes since the first versions emerged in January 2020 (the provisions dealing with the new neighbouring rights for press publishers and most of the exceptions) and in June 2020 (the provisions dealing with Article 17, Out-of-Commerce Works and the protection of the Public Domain), it remains one of the most ambitious implementation efforts that we have seen so far.

This post will take a closer look at the provisions implementing Article 17 of the Directive into German law. As we noted in our analysis of the original discussion draft from June last year, the German legislator is proposing to implement these provisions into a new act that is separate from the copyright act: the “Act on the Copyright Liability of Online Sharing Content Service Providers”. By now the proposal for this act has undergone two public revisions. After the first discussion draft in June, the Ministry of Justice published a revised draft (“Referentenentwurf“) in October of last year. This revised version maintained the core user rights safeguards underpinning the original proposal (a new remunerated exception for minor use and the ability for uploaders to flag uploads as legitimate) while making a few changes to the way the pre-flagging mechanism works.

A step back for user rights

Like the original discussion draft, the Referentenentwurf drew massive criticism from rightsholders, directed primarily at the user rights safeguards contained in the proposal. Unfortunately, the Ministry of Justice has now caved in to some of these demands, and as a result the user rights safeguards have been further cut back in the final proposal adopted by the government (“Regierungsentwurf“).

The most substantial change in the Regierungsentwurf concerns the legal mechanism for ensuring that user rights are protected in line with the requirements of Article 17(7). The original proposal relied on a new exception that legalised minor uses of copyrighted works (any use consisting of less than 20 seconds of audio or video, 1,000 characters of text, or images smaller than 250 kB) and would have prevented platforms from blocking such uses.
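
To make the thresholds concrete, the now-deleted minor-use exception can be sketched as a simple check. This is purely illustrative: the class and function names are invented, and the draft act itself contains no such code, only the numeric limits quoted above.

```python
# Hypothetical sketch of the "minor use" thresholds from the original
# German proposal: under 20 seconds of audio/video, under 1,000
# characters of text, images smaller than 250 kB. All names here are
# illustrative, not taken from the draft act.

from dataclasses import dataclass

@dataclass
class Upload:
    media_type: str              # "audio", "video", "text" or "image"
    duration_seconds: float = 0.0
    character_count: int = 0
    size_bytes: int = 0

def is_minor_use(upload: Upload) -> bool:
    """Return True if the upload stays under the de-minimis limits."""
    if upload.media_type in ("audio", "video"):
        return upload.duration_seconds < 20
    if upload.media_type == "text":
        return upload.character_count < 1000
    if upload.media_type == "image":
        return upload.size_bytes < 250 * 1024
    return False
```

Under the original proposal, an upload passing this check could not have been blocked automatically; the Regierungsentwurf replaces this bright-line rule with the narrower presumption discussed below.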

In the updated proposal this exception is gone and has been replaced by a more limited construction of “uses presumably authorised by law” which cannot be blocked automatically. For a use to be “presumably authorised by law” it needs to fulfil the following cumulative criteria:


Finnish Article 17 implementation proposal prohibits the use of automated upload filters


On Monday, the Finnish Ministry of Education and Culture held a public hearing on the implementation of Article 17 of the Copyright Directive. As part of this meeting, the Ministry outlined its proposal for a user rights-preserving “blocking procedure” that substantially deviates from all other implementation proposals that we have seen so far. 

The procedure presents a radical departure from the approach underpinning other user rights-preserving implementation proposals (such as the Austrian and German proposals) and the Commission’s proposed (and much delayed) Article 17 implementation guidance. Instead of limiting the use of automated filters to a subset of uploads where there is a high likelihood that the use is infringing, the Finnish proposal does away with automated blocking of user uploads entirely, but not with automated detection of potential infringements.

The Finnish proposal relies on mandatory use of content recognition technology by platforms and the rapid notification of rightsholders of uploads that match works for which rightsholders have provided them with reference information. However, platforms are only required to disable access to uploaded content after rightsholders have provided them with a properly justified request to block a particular upload:

While this approach bans automated filtering of user uploads, it still relies heavily on automated content recognition technology. The proposed “blocking procedure” requires all platforms covered by Article 17 to have technology in place that can match uploads to reference information provided by rightsholders, so that rightsholders can be directly notified when matching content is uploaded. Notifications sent to rightsholders also include the justifications that uploaders have provided at the time of upload as to why they consider a use of third-party content to be legitimate.
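
The sequence described above (match, notify with the uploader’s justification, block only on a properly justified request) can be summarised in a short sketch. This is one reading of the Finnish proposal with invented names, not an implementation of it.

```python
# Illustrative sketch of the Finnish "blocking procedure" as described
# above. All class, function and status names are hypothetical.

class Upload:
    def __init__(self, content, justification=None):
        self.content = content
        self.justification = justification
        self.status = "published"        # uploads go live immediately

def handle_upload(upload, reference_works, notifications):
    """A match against rightsholder reference material only triggers a
    notification; the upload itself is never blocked automatically."""
    if upload.content in reference_works:
        # The rightsholder receives the match together with any
        # justification the uploader gave for the use.
        notifications.append((upload.content, upload.justification))
    return upload.status

def handle_blocking_request(upload, properly_justified):
    """Access is disabled only after a properly justified request
    from the rightsholder."""
    if properly_justified:
        upload.status = "blocked"
    return upload.status
```

The design choice worth noting is that the automated step (matching) is decoupled from the consequential step (blocking), which always requires a human-initiated, justified request.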


Austrian Article 17 proposal: The high road towards implementation?


So far we have seen two different approaches to implementing Article 17 into national copyright legislation. On the one hand, we have countries like France, the Netherlands, or Croatia that have presented implementation proposals sticking as closely as possible to the language and the structure of Article 17 while implementing its provisions within the structure of their existing copyright acts. In doing so, these implementations essentially kick the can down the road with regards to figuring out how the conflicting requirements to filter (17(4)) and to ensure that legal uploads are not filtered out (17(7)) can be reconciled. In the end, none of these implementation proposals offer a convincing mechanism for ensuring that creators get remunerated and that users’ rights are not violated.

On the other hand, we have the German approach, which proposes to implement Article 17 via a separate “copyright-service-provider law” (“Urheberrechts-Diensteanbieter-Gesetz”) that departs substantially from the language of the directive in an attempt to capture its structure and effet utile.

The German implementation proposal focuses on using the room for legislative discretion left by the directive to give practical meaning to the abstract requirements to protect user rights contained in the directive. It also adds measures aimed at ensuring that individual creators directly benefit from the new rules. As a result, the German implementation proposal is much closer to the legislative compromise struck by Article 17 than any of the more literal implementation proposals that have emerged so far.

Over the past few months, the German implementation proposal has come under intense pressure from exclusive rightsholders and some platforms, who argue that the proposed approach does not adequately reflect the provisions of Article 17. In addition, rightsholders have claimed that it violates national and international copyright law in multiple ways. A central argument of the opponents of the German implementation proposal is the claim that it strays too far from the text of the directive.

Given this background, it is interesting to see the first Austrian implementation proposal (that was circulated to stakeholders for feedback earlier this week) take a middle road between the two existing approaches. The Austrian implementation proposal does integrate the provisions from Article 17 directly into the text of the existing Austrian copyright act, thus deviating from the structure of Article 17, but mostly stays very close to the text of the directive. At the same time, it takes up key elements first introduced in the German approach: The non-waivable direct remuneration right for authors and performers, the protection of minor uses from automated filters, the ability for users to flag uploads as legitimate, and the ability for users’ organizations to act against platforms that engage in structural over-blocking. The result is a proposal that (similar to the German one) focuses on strengthening the position of creators and users, instead of leaving it up to platforms and large corporate rightsholders to set their own rules.

The Austrian proposal in more detail

Similar to the German proposal, the Austrian proposal introduces a direct remuneration right for authors and performers that will ensure that, independent of existing contractual arrangements with publishers and other intermediaries, creators will be remunerated for the use of their works on platforms. As in the German proposal, this direct remuneration right can only be exercised via collective management organizations, which means that it will primarily benefit creators in sectors with existing collective management structures.

In the German discussion, this direct remuneration right has been strongly criticized both by platforms (who, for obvious reasons, would prefer not to pay) and by intermediary rightsholders, who prefer to control how much (or rather how little) of their licensing revenue should go to the actual creators. Given that the need to make sure that creators benefit from the use of their works on platforms was the main argument for getting Article 17 in the first place, the fact that rightsholders are now trying to undermine the proposed direct remuneration right is more than a little hypocritical.

CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work


On Tuesday, November 10, the Court of Justice of the European Union (CJEU) heard case C-401/19. This case is a request by the Polish government to annul the filtering obligation contained in Article 17 of the Copyright in the Digital Single Market (DSM) Directive on the grounds that it will lead to censorship and will limit the freedom of expression and the freedom to receive and impart information guaranteed in Article 11 of the EU Charter of Fundamental Rights (Charter).

The defendants in this case are the European Parliament and the Council of the European Union. In addition, the European Commission and the governments of France and Spain intervened in the case on the side of the defendants. Advocate General Saugmandsgaard Øe was also present at the hearing.

Even for astute followers of the discussions around the implementation of Article 17, the hearing contained a number of surprises. While several Member States have been soldiering on with their national implementation proposals with little regard for the fundamental rights implications of Article 17, the hearing showed that the Court is taking Poland’s complaint very seriously and that the compliance of the contested provisions of Article 17 with the Charter is far from evident. Frequent reference was made during the hearing to the recent opinion of Advocate General Saugmandsgaard Øe in the YouTube and Cyando cases, which is highly critical of extensive obligations on platforms to police the copyright infringements of their users.

On the face of it, the case is about Poland’s request to annul Articles 17(4)(b) and (c) of the DSM directive. Poland argued its case, which essentially rests on the observation that while not explicitly mandating them, Article 17(4)(b) and (c) effectively require platforms to implement upload filters because there are no other effective means to comply with the obligations contained therein. Poland argues that this will lead to censorship and will limit the freedom of information of the users of online platforms.

According to Poland, the key problem with the directive is the move away from active participation of rightholders (as initiators of removal requests in the context of notice and takedown procedures) and instead handing the responsibility of removing infringing uploads over to platforms who will have to develop private enforcement systems to avoid liability for copyright infringement. Because they are not facing any comparable risk when they limit user rights by blocking access to legal content, this creates strong incentives for over-blocking. This in turn will result in censorship and violation of the fundamental rights to freedom of expression and information under the Charter. Consequently, the problematic parts of Article 17 should be annulled by the Court.

All other parties intervening in the case objected to this line of argument and stated that in their view Article 17 does not violate any fundamental rights. However, they presented strikingly contradictory interpretations of what Article 17 actually requires of platforms. There are two distinct lines of argument: The Commission, the Council and the European Parliament argued that Article 17 contains enough internal safeguards to prevent users’ fundamental rights from being unduly limited. On the other hand, France and Spain argued that some limitations of fundamental freedoms are justified by the objective that Article 17 seeks to achieve.

Taming the upload filters: Pre-flagging vs. match and flag


One of the most important elements of any implementation of Article 17 will be how platforms can reconcile the use of automated content filtering with the requirement not to prevent the availability of legitimate uploads. While most implementation proposals that we have seen so far are silent on this crucial question, both the German discussion proposal and the Commission’s consultation proposal contain specific mechanisms that are intended to ensure that automated content filters do not block legitimate uploads, and that uploads are subject to human review if they are not obviously/likely infringing. 

In order to achieve this objective, the German discussion draft published in June relies on the idea of “pre-flagging”: users would be allowed to flag uploads containing third party works as legitimate. Platforms would then be prevented from automatically blocking pre-flagged uploads unless they determine that the flag is incorrect because the upload is “obviously infringing”. 

By contrast, the Commission’s implementation guidance consultation proposes a “match-and-flag” mechanism: if upload filters detect the presence of a third party work in an upload and the use is not deemed to be “likely infringing”, then the uploader is notified and given the ability to state that the use is legitimate. If the user flags the upload as legitimate, the platform will have to initiate a human review of the upload, which remains available from the moment of upload until the review has been concluded. This type of mechanism was first suggested by a group of copyright academics in October of last year. It is also at the core of the proposal that we had presented during the last meeting of the stakeholder dialogue.

Both approaches provide a mechanism that limits the application of fully automated upload filters (while implicitly acknowledging the fact that many platforms will deploy upload filters). In the Commission’s proposal, filters are limited to making a pre-selection (“is the upload likely infringing?”); in the German proposal, they can only operate on unflagged content and filter out “obviously incorrect” pre-flags.
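
The difference between the two mechanisms can be made explicit in a schematic comparison. Both functions below are simplifications with invented names and statuses; in particular, the outcome for a matched, unflagged upload under the Commission’s proposal is our assumption, as the consultation text quoted above only describes the flagged path.

```python
# Schematic comparison of the two mechanisms described above.
# All keys and return values are hypothetical labels, not terms
# from either proposal.

def german_pre_flagging(upload: dict) -> str:
    """Filters may only act on unflagged uploads, and on pre-flags
    that are obviously incorrect."""
    if upload.get("pre_flagged"):
        if upload.get("obviously_infringing"):
            return "block"           # the pre-flag is obviously incorrect
        return "publish"             # pre-flag shields the upload
    # Unflagged content: filters operate normally.
    return "block" if upload.get("matches_reference") else "publish"

def commission_match_and_flag(upload: dict) -> str:
    """Filters only pre-select; a user flag on a matched, not-likely-
    infringing upload forces human review while it stays available."""
    if not upload.get("matches_reference"):
        return "publish"
    if upload.get("likely_infringing"):
        return "block"               # filter pre-selection
    # Match found but not likely infringing: the uploader is notified
    # and can state that the use is legitimate.
    if upload.get("user_flagged_legitimate"):
        return "publish_pending_human_review"
    return "block"                   # assumed outcome when no flag is raised
```

The key structural difference is the timing of the user’s intervention: before upload (pre-flagging) versus after a filter match (match-and-flag).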

Convergence on “match-and-flag”?

Both approaches have been criticised by rightholders, who claim that they undermine the “original objective of the directive” without providing alternative proposals on how automated filtering can be reconciled with the requirement not to block legitimate uploads. In addition, the German discussion proposal has also been criticised by platforms such as Google and Facebook. The platforms argue that giving users the ability to pre-flag every single upload would be impractical and would likely lead to substantial numbers of unnecessary (where the content in question is already licensed) or unjustified (users making excessive use of the pre-flagging tool) pre-flags, which would make such a system impractical to operate at scale.

Commission consultation on Article 17 guidance: User rights must be protected at upload


At the end of July the Commission published a long-awaited “targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the CDSM Directive“. With this consultation the Commission makes good on its (pre-COVID) promise to “share initial views on the content of the Article 17 guidance” with the participants of the stakeholder dialogue. Nestled in between 18 questions, the consultation document provides a detailed outline of what the Commission’s guidance could look like once it is finalised.

While we were rather sceptical after the end of the six meetings of the stakeholder dialogue, we are pleased to see that the initial views shared by the Commission express a genuine attempt to find a balance between the protection of user rights and the interests of creators and other rightholders, which reflects the complex balance of the provisions introduced by Article 17 after a long legislative fight.

In the remainder of this post we will take a first, high-level look at the Commission’s proposal for the Article 17 guidance, what it would mean for national implementations and how it would affect user rights.

Two welcome clarifications

With the consultation document the Commission takes a clear position on two issues that were central to the discussions in the stakeholder dialogue and that have important implications for national implementation of Article 17.

The first one concerns the nature of the right at the core of Article 17. Is Article 17 a mere clarification of the existing right of communication to the public, as rightholders have argued, or is it a special or sui generis right, as academics and civil society groups have argued? In the consultation document the Commission makes it clear that it considers Article 17 to be a special right (“lex specialis”) in relation to the right of communication to the public, as defined in Article 3 of the 2001 InfoSoc Directive, and to the limited liability regime for hosting providers of the E-Commerce Directive.

What sounds like a fairly technical discussion has wide-ranging consequences for Member States implementing the Directive. As explained by João Quintais and Martin Husovec, now that it is clear that Article 17 is not a mere clarification of existing law, Member States have considerably more freedom in deciding how online platforms can obtain authorisation for making available the works uploaded by their users. This should mean that they are not constrained by the InfoSoc Directive. Therefore, mechanisms like the remunerated “de-minimis” exception proposed by the German Ministry of Justice that would legalise the use of short snippets of existing works are permitted and covered by the concept of “authorisation” introduced by Article 17.

How Filters fail (to meet the requirements of the DSM directive)


Article 17 of the DSM directive establishes that Online Content Sharing Service Providers (OCSSPs) are liable for copyright infringing uploads by their users unless they either obtain a license for the use of such content, or take a number of measures designed to prevent the availability of such content on their platforms. While the directive never explicitly talks about filters or automated content recognition (ACR) systems, it is assumed by all sides of the debate that, in order to meet this obligation, platforms have little choice but to implement ACR-based filtering systems that will scan all user uploads and block or remove uploads that contain works that have been flagged by their rightholders.

This de-facto requirement to implement upload filters is – by far – the most controversial aspect of the entire copyright directive and it continues to dominate the discussions about the implementation of Article 17 into national legislation.

In this context, it is important to remember that the use of such filters is not new and that their functioning can already be observed in practice. What is new, however, is the de-facto requirement for OCSSPs to implement filters as well as a number of requirements that OCSSPs need to meet to ensure that any measures (including filters) implemented by them are not infringing on the rights of users. This includes the requirement that any such measures “shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or limitation“.

In other words, one of the most important contributions of the DSM directive is that, for the first time, it establishes conditions that need to be met by automated upload filters.

As we have argued many times before, these conditions present a very high hurdle for any technological solution to clear. The fact that upload filters are incapable of determining if a particular use of a copyrighted work is infringing or not has been established beyond any doubt. But that does not mean that the failure to assess context is the only way that filters based on automated content recognition fail to meet the requirements established by the directive. In total, there are at least three distinct ways in which filters fail.

In the remainder of this post we will discuss these three failure modes based on examples collected by Techdirt in the course of a single week: removals caused by incorrect rights information, removals caused by the inability to recognise legitimate uses, and removals caused by the inability to accurately identify works.

Incorrect rights information

Incorrect rights information is probably the most common and best documented cause for the unjustified removal (or demonetisation) of works on YouTube.

ACR systems execute actions specified by whoever is recognised as the owner of a work. For the purposes of these systems, the owner of a work is whoever claims to be the owner and, unless there are conflicting ownership claims, there is no way to check the accuracy of such claims, as there are no authoritative databases of ownership rights. As a result, it is possible to claim ownership of public domain works (which no one owns), of works that have been freely or widely licensed by their owners, or of any copyrighted work that has not already been claimed by someone else.
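
The first-come-first-served nature of these claims can be illustrated with a toy model. This is not how any particular ACR system (such as Content ID) is actually implemented; it simply makes the structural problem explicit: without an authoritative rights database, the registry trusts whoever claims a work first, and only a conflicting claim triggers any scrutiny at all.

```python
# Toy model of the ownership-claim problem described above.
# All names are invented for illustration.

class ClaimRegistry:
    def __init__(self):
        self._owners = {}    # work fingerprint -> claimed owner

    def claim(self, work_id: str, claimant: str) -> str:
        """Accept any claim on an unclaimed work, including public
        domain or freely licensed works, because the registry has no
        way to verify ownership on its own."""
        if work_id in self._owners and self._owners[work_id] != claimant:
            return "dispute"         # only conflicting claims surface
        self._owners[work_id] = claimant
        return "accepted"

    def owner_of(self, work_id: str):
        return self._owners.get(work_id)
```

Any enforcement action (blocking, demonetisation) then flows from whatever this registry records, which is why incorrect rights information propagates so easily into unjustified removals.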

A better way to implement Article 17? New German proposal to avoid overblocking


Back in April 2019, on the occasion of the final vote on the DSM Directive in the Council, the German Federal Government issued a statement announcing that it intended to implement Article 17 with a focus on “preventing ‘upload filters’ wherever possible, ensuring freedom of expression and safeguarding user rights”. While the German Government has yet to produce an implementation proposal for Article 17, we may now have gotten a first insight into what such an implementation might look like. Late last month, the Kölner Forschungsstelle für Medienrecht at the University of Applied Sciences in Cologne published a step-by-step concept for implementing Article 17 of the DSM Directive (“Stufenkonzept gegen Overblocking durch Uploadfilter“).

The paper, authored by Prof. Dr. Rolf Schwartmann and Prof. Dr. Christian-Henner Hentsch, consists of an implementation proposal in the form of concrete legislative language. The objective of the authors seems to be to stay as close as possible to the objectives formulated in the German statement to the Council. What makes this proposal remarkable is that it is the first proposal (although not an official one) for implementing Article 17 of the new Copyright Directive that does not consist of a more or less literal transposition of Article 17 into national law (as is the case in the French, Dutch and Belgian legislative proposals). In order to achieve the stated objective of preventing over-blocking by upload filters, the concept proposes a combination of Article 17 with Article 12 of the DSM Directive (which gives Member States the option to introduce Extended Collective Licenses).

The implementation proposal contains modifications of three different acts: the Copyright Act (Urheberrechtsgesetz – UrhG), the Tele Media Act (Telemediengesetz – TMG) and the Collective Management Organisations Act (Verwertungsgesellschaftengesetz – VGG). Specifically the authors propose the following modifications: 

In the Copyright Act, they propose to add a new section to the article (§ 19a UrhG) that defines the act of communication to the public. The purpose of this addition is to include acts of giving the public access to copyright-protected, user-uploaded works by Online Content Sharing Service Providers (OCSSPs) in the definition of the existing making available to the public right. This establishes that, in principle, OCSSPs need authorisation from rightholders for such acts. The added section also includes the definition of OCSSPs, which is a literal transposition of the definition used in the DSM directive.

The second addition to the Copyright Act is a new exception covering uses for the purpose of caricature, parody or pastiche by users of OCSSPs (§ 51a UrhG). Notably, this exception only applies in the context of sharing works via OCSSPs (which is highly unusual, as copyright exceptions are usually not limited to specific services) and is conditional on remuneration to rightholders via collective management organisations.

Research Librarians: New TDM exception can be undermined by technical blocking from publishers


Today the Association of European Research Libraries (LIBER) issued a warning that the new Text and Data Mining (TDM) exceptions contained in the DSM directive can easily be undermined by technical blocking from publishers. LIBER has come to this conclusion based on the results of a survey on content blocking, carried out by LIBER’s Copyright & Legal Matters Working Group and the UK Libraries and Archives Copyright Alliance (LACA). The submissions to the survey so far confirm fears that Technical Protection Measures (TPMs) can be abused by rightholders to limit the usefulness of the exceptions contained in the directive.

Article 3 of the directive allows reproductions in the context of text and data mining for the purposes of scientific research made by research organisations and cultural heritage institutions. While the article allows rightholders to take measures to ensure the security and integrity of the networks, it does not allow them to prevent researchers from exercising their rights under the exception. It also requires rightholders to remove any TPMs that prevent researchers from exercising their rights but does not specify how quickly this has to happen. 

The survey conducted by LIBER shows that in practice it is difficult for researchers to get rightholders to remove TPMs or other access and use restrictions on content that research institutions have licensed. And in cases where such limitations are eventually removed, the process is often long and resource-intensive. All of this not only frustrates the text and data mining efforts of individual researchers, but can also have negative consequences for entire institutions.

The submissions shared with us by libraries and researchers from a mix of countries (some with TDM exceptions already and some without) show that:

  • Researchers are blocked from accessing many types of content. Journal articles were the most common type of content mentioned (44%) but eBooks, websites, databases and newspapers were also cited.
  • Content blocking takes, on average, nearly a month to resolve. Respondents reported that it took between 24 hours and two and a half months to resolve the content blocking issue, with the mean time being 24 days. A fifth of survey respondents said they were only partially able to resolve the issue and 11% said it was never resolved.
  • Sanctions impact whole communities, not just individual researchers. Actions taken by publishers included 1) suspension of campus-wide access to paid-for electronic subscriptions, 2) threats to cut off access to content unless TDM was stopped, 3) technically limiting downloads to one document only, 4) a request for additional payments and 5) the introduction of captcha technology to frustrate data mining.

The findings reported by LIBER are extremely relevant in the context of the ongoing implementations of the DSM directive in the EU Member States. As we have noted in our implementation guide for Articles 3 and 4 of the directive, Member States should include in their national laws provisions requiring rightholders to remove TPMs and resolve lock-outs within a maximum of 72 hours of being reported.

In order to better make the case for why such a requirement must be included in national implementations, LIBER is continuing to collect evidence of abusive practices. If you work for an academic research institution that has run into similar issues in the past, you can still contribute your experience to LIBER:

“If you or your organisation have ever been blocked from accessing a publisher’s servers for reasons you believe are related to data mining, fill out the survey. The survey can be answered anonymously and will remain open indefinitely. You can also send information on this issue to liberlegalworkinggroup@gmail.com”.

Article 17 stakeholder dialogue (day 6): Hitting a brick wall


This week saw the sixth meeting of the EU stakeholder dialogue on the implementation of Article 17 of the new copyright directive. This meeting was the first one where the question of how to reconcile the protection of users’ rights with automated filters based on technology that cannot assess the legality of the use of copyrighted works was explicitly on the agenda. From the perspective of the users’ organisations participating in the stakeholder dialogue, this is the main question that the stakeholder dialogue (and the guidelines that the Commission will have to issue afterwards) needs to address.

Unfortunately, Monday’s meeting did not result in a productive discussion about how to protect users’ rights. Proposals made by COMMUNIA and Studio71 on how to strike a balance between the rights of users and the interests of creators and other rightholders were largely ignored by a coalition of major rightholders from both the music and the audio-visual sectors. Working in concert, the representatives of the Hollywood studios (MPA), film producers (FIAPF), commercial television channels (ACT), major record labels (IFPI) and music publishers (ICMP) disputed that there is a tension between protecting users’ rights and automated blocking, restated their conviction that Article 17 is only about empowering them versus the platforms, and suggested that users should simply trust that rightholders will not block free speech or other legitimate uses. In doing so, they made it clear that they want their interests to prevail at all costs, that users should not be trusted and that for them user rights are something that should exist at their discretion.

This outcome leaves the European Commission in the difficult position of having to make sense of the input gathered throughout the previous six meetings and to outline a way forward. Fortunately, it seemed that the Commission is not willing to succumb to the unconstructive behaviour exhibited by rightholders and will take seriously its task of finding a balance between users’ rights and the interests of different types of rightholders.

A proposal for protecting users’ rights

So what would such a balance look like, and what is at stake? One of the key insights that emerged from the previous rounds of the stakeholder dialogue is that even the most advanced content recognition technology is incapable of understanding the context in which works are used. This means that technology alone cannot make the determination if a use is lawful or not. Article 17 requires platforms to take measures to prevent the availability of content that rightholders want to keep off the sharing platforms and, at the same time, to ensure that legitimate uses (such as quotations or parodies) are not affected. This means that no matter how good it is at recognising content, ACR alone cannot meet the requirements of the directive.