The German Model to Protect User Rights when implementing Article 17

Finding balance with exceptions, pre-flagging and abuse sanctioning

Last week, Germany’s Ministry of Justice unveiled its proposal to implement Article 17 of the new Copyright Directive. In this post, we will look into the draft implementation in more detail, to understand how this proposal aims to protect user rights by:

  • Making it easier for platforms to comply with the “best efforts” obligation to obtain authorization to publish their users’ uploads;
  • Introducing a new exception covering minor uses of copyrighted content, which works as a fallback mechanism in the absence of authorization;
  • Allowing users to override blocking/removal actions, by pre-flagging lawful uses;
  • Allowing lawful content to stay up until human review and pausing the liability of platforms until a decision has been made;
  • Sanctioning abusive behaviour by platforms, rightholders and users.

Complying with the “best efforts” obligation to obtain authorization

Under Article 17, platforms are deemed to carry out a copyright-restricted act when they give public access to copyrighted content uploaded by their users and, as a consequence, they must make “best efforts” to obtain an authorization to perform such acts. That authorization can hypothetically be granted through various means:

  • directly by the copyright owners via individual licensing agreements (as mentioned in Article 17(1) second para.,) or
  • by collective management organizations via collective license agreements, or
  • by operation of law, if the national lawmakers decide e.g. to turn this exclusive right into an exception or limitation to copyright subject to compensation.

The implementation proposals that we have seen so far in other countries have limited themselves to the traditional individual licensing mechanism. This is problematic: individual licenses alone cannot cover the countless protected materials in existence, and user rights are at greater risk if platforms have to block content at upload than if they obtain authorization to host that content.

Germany had stated, when the Directive was approved, that it would explore further legal mechanisms (e.g. exceptions and limitations and collective licenses) to grant those permissions to platforms. The draft text now published delivers on those promises and introduces some welcome innovations.

The proposed text starts by saying that platforms need to make “alle Anstrengungen” (“every effort”) to acquire those rights by contract. According to the explanatory memorandum, the wording “every effort” should not be interpreted as meaning anything other than “best efforts”. In fact, this obligation is considered fulfilled when the platform accepts a licensing offer made by a rightholder or when licenses are available through a domestic collective management organization (§4/1). Such contractual offers or collective licenses must apply to works typically uploaded to the platform, comprise a representative repertoire, cover the territory of Germany, and enable the use on appropriate conditions (§4/2).

A new de minimis exception that applies to the acts of platforms and noncommercial users

When, despite making the above-mentioned efforts, a platform is unable to obtain an authorization, the draft text provides a fallback mechanism: it partially turns the new exclusive right into a remunerated exception, which covers minor uses of copyrighted content (§6 and §7/2).

The exception is conditional on the noncommercial purpose of the use (no definition of “noncommercial” is provided) and on quantity limits: up to 20 seconds of a film or motion picture, up to 20 seconds of an individual audio track, up to 1,000 characters of an individual text, and one individual photograph or one individual graphic with a data volume of up to 250 kB.
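The quantity limits can be pictured as a simple threshold check. The following Python sketch is purely illustrative (the function name, data structure and units are our assumptions, not part of the draft): it encodes the draft’s limits and its noncommercial-purpose condition.

```python
# Illustrative sketch of the draft's de minimis limits (§6). The names and
# structure are assumptions for explanation, not taken from the draft text.
DE_MINIMIS_LIMITS = {
    "film": 20,        # seconds of a film or motion picture
    "audio": 20,       # seconds of an individual audio track
    "text": 1000,      # characters of an individual text
    "image": 250_000,  # bytes (250 kB) for one photograph or graphic
}

def within_de_minimis(kind: str, amount: float, noncommercial: bool) -> bool:
    """Return True if a use stays within the draft's quantity limits.

    'amount' is seconds for film/audio, characters for text, and bytes
    for a single image. The exception only covers noncommercial uses.
    """
    if not noncommercial:
        return False
    limit = DE_MINIMIS_LIMITS.get(kind)
    return limit is not None and amount <= limit
```

For example, a 15-second music excerpt in a noncommercial upload would fall within the limits, while a 25-second excerpt, or any commercial use, would not.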

The exception aims to cover the copyright-restricted acts of platforms (although its “noncommercial purpose” condition seems contradictory in this respect) when giving the public access to their users’ uploads, and extends to the acts of users when uploading that content to the platform, provided that the user is not acting commercially or is not generating substantial income (§9/1).

It should be noted that this de minimis exception is without prejudice to the other exceptions, namely those justified by freedom of expression and other fundamental rights. Indeed, §6/2 clarifies that the exception serves as a fallback mechanism only if those uses are not already covered either by (1) a contractual authorization granted by rightholders (or CMOs) to platforms or by (2) other exceptions (namely, the existing quotation exception or the new exception for purposes of parody, caricature or pastiche).

Considering that the materials licensed to platforms will always be limited, being able to resort to an exception that applies to every copyrighted film, musical work, text, photograph and graphic is reassuring. In addition, the fact that the new exception offers users an opportunity to use protected content, up to a certain extent, without having to worry whether the use qualifies e.g. as a quotation or a parody, is also encouraging.

Admittedly, the de minimis exception can also have the negative effect of holding users back from making lawful uses under other exceptions that do not offer the same legal certainty. If there is not enough case law confirming that their uses are lawful under other exceptions (there are only a few CJEU decisions dealing with exceptions to copyright, and affirmative decisions are also lacking at the national level), users will think twice before crossing the quantity limits of the new exception, to avoid the risk of being liable for copyright infringement.

The mandatory exception for parody, pastiche and caricature

In addition to creating a remunerated exception for minor uses of works, Germany also creates an unremunerated exception covering uses “for the purpose of caricature, parody and pastiche” (§51a of the German Copyright Act). However, it fails to clarify that the existing quotation exception (§51 of the German Copyright Act) covers criticism and review.

We recall that Article 17(7) second para. makes the following copyright exceptions mandatory: 

(i) quotation, criticism, review; 

(ii) use for the purpose of caricature, parody or pastiche.

The existing quotation exception covers uses “for the purpose of quotation”, which could be interpreted as including criticism and review, but considering that the German courts have interpreted this exception quite restrictively, it would be important for users to have those uses expressly mentioned in the text.

Pre-flagging: the user’s tool to keep lawful content online 

Another welcome innovation of the German text is a mechanism aimed at solving one of the most difficult aspects of Article 17: reconciling the obligation to use best efforts to ensure the unavailability (at upload) of infringing content with the obligation not to prevent the availability (at upload) of lawful content.

Although Article 17 does not explicitly mention the use of filter technology, it seems difficult to comply with the above-mentioned obligations without some reliance on automated content recognition (ACR) technology, given the volume of content shared through the platforms. That is why Article 17 has been challenged before the CJEU by Poland, and that is why Member States implementing this controversial provision must make every effort to, if not eliminate, at least reduce reliance on ACR. Otherwise, free speech and other uses made under exceptions that are based on fundamental rights will be massively blocked.

Indeed, as seen during the stakeholder dialogue, ACR technology is incapable of understanding the context in which works are used and thus cannot determine if a use qualifies as a quotation or a parody. It can verify the extent to which the work is used (and that is what makes the new de minimis exception so appealing), but it cannot verify the conditions of use of the remaining exceptions. This means that no matter how good it is at recognising content, ACR alone cannot be relied upon to comply with both the obligation to use best efforts to block unauthorized content and the obligation not to block lawful content. 

That is the reason why, during the stakeholder dialogue, we presented a proposal (largely based on a statement issued by more than 50 copyright academics) suggesting that, unless a use was considered to be obviously infringing, users should be given the possibility to declare that the use is legitimate and prevent the blocking/removal of their uploads.

The German draft proposes a similar mechanism, which operates like this:

  • The rightholder has the right to request the blocking or removal of its work from the platform (§10 and §11).
  • However, the user also has the right (at upload) to pre-flag uses that should be considered authorized (because e.g. they are in the public domain, they are permitted under exceptions or are permitted by contracts, namely CC licenses) (§8/1).
  • If the user does not pre-flag the use as lawful, and provided that the rightholder has requested the platform to block/remove its work and has supplied the platform with the needed information, the platform has to block/remove the upload (§10 and §11).
  • If the user pre-flags the use as lawful but the pre-flagging is obviously incorrect, the platform has to block/remove the upload (§8/2, §10, §11 and §12). 
  • If the user pre-flags the use as lawful and the pre-flagging is not obviously incorrect, the platform cannot block/remove the upload. In that case, the platform informs the rightholder that the work has been uploaded/not been removed. The rightholder might challenge or not the pre-flagging, but the content has to stay up until a human decision has been made (§8/2, §10, §11 and §12).
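The decision flow above can be sketched as a small decision function. This is our illustrative reading of §§8–12 (the function name, parameters and return labels are assumptions, not draft wording), under the simplifying assumption that the rightholder has already supplied the information needed for blocking.

```python
def platform_action(blocking_requested: bool, pre_flagged: bool,
                    obviously_incorrect: bool) -> str:
    """Illustrative sketch of the draft's pre-flagging flow (§§8, 10-12).

    Assumes the rightholder has supplied the information needed for
    blocking; return values are explanatory labels, not draft wording.
    """
    if not blocking_requested:
        # No blocking/removal request from the rightholder: nothing to do.
        return "stays up"
    if not pre_flagged:
        # No pre-flag from the user: the platform must block/remove.
        return "block/remove"
    if obviously_incorrect:
        # A pre-flag that is obviously incorrect does not protect the upload.
        return "block/remove despite flag"
    # Pre-flag not obviously incorrect: the content stays up, the
    # rightholder is notified and may trigger human review (§§14-15).
    return "stays up pending human review"
```

The key point the sketch makes visible is that only the last branch keeps contested content online, and it does so until a human decision is made.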

Verifying when a pre-flagging is obviously incorrect

To avoid users abusing the pre-flagging system, the draft law requires platforms to check whether a pre-flagging is obviously incorrect (§12). The proposal sets a machine-verifiable rule, but leaves the door open to the application of other criteria. It is not clear whether those criteria include an analysis of the context or whether, for the purposes of qualifying a pre-flagging as obviously incorrect, only machine-verifiable rules can be applied.

In our interpretation, this step is not intended to replace a possible complaint procedure, where human intervention is required to verify the context and decide whether the use is lawful. Furthermore, it needs to take place at upload. This means that it is intended to be an automatic process and, thus, can only apply machine-verifiable rules. Therefore, the situations where a pre-flagging can be considered obviously incorrect should be very limited (because machines cannot understand context).

According to the machine-verifiable rule provided in the draft text, a pre-flagging may be considered obviously incorrect when “the content uploaded by the user corresponds to at least 90 percent of the information provided by the rightholder”, provided that the content is not an individual image.
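As a rough illustration, the rule reduces to a threshold test on the match ratio, with a carve-out for single images. Everything in this sketch (the name, the signature, and reading the 90 percent as a ratio of the rightholder’s reference material) is our assumption about how such a check would be automated:

```python
def pre_flag_obviously_incorrect(match_ratio: float,
                                 is_single_image: bool) -> bool:
    """Illustrative reading of the draft's machine-verifiable rule (§12).

    'match_ratio' is the share of the rightholder's reference material
    found in the upload (0.0-1.0). A pre-flag MAY be treated as obviously
    incorrect at >= 90% match, unless the upload is a single image.
    """
    return match_ratio >= 0.9 and not is_single_image
```

Note how crude this test is: a 95% match of a song used inside a longer commentary video would trip the rule just as a verbatim re-upload would, which is exactly the weakness discussed next.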

In our opinion, this is not enough to protect user rights. As we argued in our input paper to the stakeholder dialogue, the following types of matches can never be assumed to be obviously infringing: (1) exact or partial matches where the work or parts of the work constitute only a part of the upload, and (2) matches of indivisible works (e.g. photographs), short works (e.g. short poems) and very short excerpts (e.g. memes, reaction gifs) of any works, even when they correspond to 100% of the upload.

Let’s not forget that, on the one hand, the new de minimis exception only covers noncommercial uses and, on the other, parodies, quotations, etc. can be made for commercial purposes and can involve the use of entire works. Labelling those other exceptions as “mechanically non-verifiable” (see the heading of §5), without expressly making those exclusions to ensure that such lawful uses are not blocked by machines, is not satisfactory.

Lawful content stays up until human review and platform liability is paused

As explained above, if the user has pre-flagged the uploaded content as authorized and the pre-flagging is not obviously incorrect, the platform cannot block/remove the content. It only has to inform the rightholder, who may or may not challenge the pre-flagging. In any case, the content has to stay up until a decision has been made (§8/2, §10 and §11).

The rightholder may challenge the upload through an internal complaint procedure provided by the platform (§§14 and 15). These complaints must be decided by a human within one week. The parties may also resort to an alternative dispute resolution mechanism (§§17 and 18) or appeal to a court (§13/4). The same avenues apply if the content is blocked/removed and the user decides to challenge the blocking/removal.

The platform is not liable for copyright infringement if the user has pre-flagged the use and such pre-flagging is not obviously incorrect, until the complaint procedure has been concluded or the period for deciding on the complaint has expired (§16). 

If the user does not pre-flag the content, platforms should (considering their obligation not to block lawful uses) still keep the content online if they believe the use is lawful. This is particularly the case with regard to uses under the de minimis exception, where platforms can mechanically verify the quantity limits and, in many cases, also the noncommercial purpose of the use. According to the explanatory memorandum, this exception should be automatically enforced even when the user has not pre-flagged it.

Unfortunately (and this is a point for improvement in the draft), platforms are not exempted from liability if the user does not pre-flag the content, even with regard to uses that fall under the de minimis exception. Therefore, they might be tempted not to enforce the exceptions without an action from the user. However, it should be noted that, if platforms repeatedly and erroneously block or remove authorized uses, consumer and user associations may claim injunctive relief against them (§19/4).

Sanctioning abusive behaviour from rightholders and users

In addition to protecting users from repeated blocking/removals of lawful content by platforms, the German text also provides for measures against abuse by rightholders. False ownership claims can lead (the platform has the right, but not the obligation) to the self-proclaimed rightholder being prevented from requesting the blocking/removal of content for a reasonable period of time (§19/1). In addition, the self-proclaimed rightholder is liable towards both the user and the platform (§19/2).

We believe that a similar safeguard should also exist for repeated removal requests of lawful content by rightholders, since those are also a matter of concern. It would only be fair, given that the platform is entitled to exclude a user from the possibility of pre-flagging for a reasonable period of time if the user repeatedly and falsely pre-flags uses as lawful (§19/3).
