On Wednesday the 21st of June we held a special lunch salon on Advocate General Saugmandsgaard Øe’s Opinion in Case C-401/19, the Polish request to annul Article 17 of the CDSM directive.
Moderated by Teresa Nobre, the salon started with Paul Keller (COMMUNIA/Open Future) assessing the opinion in the context of the Commission’s stakeholder dialogue and the ongoing national implementations (from min. 03:11 to min. 12:35 in the recording). While the opinion does not recommend Article 17’s annulment, it offers important clarifications on user rights safeguards against the automated, preventive content blocking systems adopted by sharing service providers.
Then Martin Husovec (London School of Economics) took a closer look at the overall strategy of the AG’s opinion while dwelling on its weaknesses and strengths (from min. 14:53 to min. 25:50 in the recording). He focused on what he described as “AG Øe’s re-interpretation of Article 17” and further analysed the safeguard mechanisms provided in the opinion.
The final presentation came from Julia Reda (GFF) (from min. 27:50 to min. 40:00 in the recording), who expressed her disappointment that AG Øe did not recommend rejecting Article 17. She went on to identify a number of inconsistencies in the parts of the opinion that attempt to reconcile the use of upload filters with the ban on general monitoring obligations.
The discussion was followed by a Q&A session with the participants (from min. 42:20 onwards).
Last week, Advocate General Saugmandsgaard Øe at the CJEU issued his opinion in Case C-401/19, the Polish request to annul Article 17 of the CDSM directive. According to his Opinion, the preventive measures to monitor and block users’ uploads envisioned by Article 17(4) constitute a limitation on the exercise of the right to freedom of expression and information of the users of sharing services, but such a limitation is compatible with Article 11 of the Charter of Fundamental Rights of the European Union, since all the conditions laid down in Article 52(1) of the Charter are satisfied.
In particular, the Advocate General found that the new liability regime established by Article 17(4) respects the proportionality requirement – despite entailing significant risks for freedom of expression – as it is accompanied by sufficient safeguards to minimise those risks:
sharing service providers are not authorised to preventively block all content which reproduces the copyright-protected content identified by rightholders, including lawful content (Article 17(7));
those providers are obliged to detect and block only content the unlawfulness of which seems manifest in the light of the ‘relevant and necessary’ information provided by the rightholders (Article 17(8));
additionally, and as a final safeguard for situations where, despite the obligation in Article 17(7), those providers nevertheless block such legitimate content mistakenly, users have at their disposal a complaint and redress mechanism as well as out-of-court mechanisms (Article 17(9)).
On Monday the 7th of June – the day of the implementation deadline for the DSM Directive – we held a special COMMUNIA Salon to mark the entry into force of the Directive, to assess the implementation progress being made in the 27 Member States and to discuss the Article 17 implementation guidance published by the Commission just before the entry into force of the Directive.
The Salon kicked off with Teresa Nobre and Dimitar Dimitrov presenting our ongoing work supporting user rights advocates across the EU Member States to ensure a user rights-friendly implementation of the Directive into national legislation (from 1:11 to 18:13 in the video recording). This was followed by the launch of the Eurovision DSM Contest, a new website which tracks the implementation progress for each Member State.
The second part of the event (from 20:00 in the recording onwards) consisted of a discussion on the Commission’s Article 17 guidance. Paul Keller opened the discussion by noting that, while the guidance establishes important user rights safeguards that put into question the implementation approach chosen by some Member States – most notably France and Denmark – it also contains a massive loophole by allowing rightholders to “earmark” their content as economically valuable.
Julia Reda (GFF) followed up by comparing the guidance to the recently adopted German implementation law. She argued that the German law seemed to be largely in line with the principles established by the guidance and highlighted that the German rules for certain types of “high-value” content are highly targeted and limited, making them much less problematic than the “earmarking” mechanism introduced by the guidance.
Today, on the last day before the implementation deadline for the Copyright in the Digital Single Market Directive, the European Commission has finally published its long overdue implementation guidance for Article 17 of the Directive. The guidance, which marks the end of a stakeholder process that started in October 2019, is supposed to provide Member States implementing the Directive with guidance on how to reconcile the contradicting objectives contained in Article 17. It comes at a time when only a handful of Member States have implemented the Directive into their national law.
In the final version of the guidance published today, the Commission requires Member States to include ex-ante safeguards for user rights in their national implementation legislation. In doing so, it provides support to the implementation approach taken by Germany (and discussed in Austria and Finland), while making it clear that Member States that have limited themselves to merely re-stating the provisions of the Directive (such as France, The Netherlands and Hungary) will need to include such additional safeguards (more on this below).
Unfortunately, and confirming the suspicions that we had expressed in our recent open letter, the final version of the guidance walks back the strong commitment to protect users’ fundamental rights that the Commission had shown earlier in the process. As a result of relentless pressure from the entertainment industry, the final version of the guidance contains an “earmarking” mechanism that is designed to allow rightholders to override safeguards against automated blocking of user uploads that are not manifestly infringing, by claiming that a use of their works “could cause significant economic harm”. This provision is ripe for abuse by rightholders and undermines the relatively strong principles for safeguarding users’ fundamental rights, which the guidelines require Member States to include in their national implementations (see our detailed description of how the “earmarking” provision undermines the principles of the guidance here).
The “earmarking” mechanism was added to the guidance in the last three months, in closed-door deliberations of the Commission and in reaction to massive pressure from rightholders. This back-room dealing stands in stark contrast to the transparent and balanced way in which the Commission handled the initial stages of the stakeholder process. After a series of public stakeholder dialogue meetings, the Commission had released a remarkably balanced consultation draft of the guidance in July of last year. The Commission then used the principles outlined in the draft to defend the legality of Article 17 before the CJEU, only to agree on a final version that substantially undermines these principles, behind closed doors and without further consultation of the stakeholders involved in the process.
This conduct abuses the stakeholder process that the Commission was legally required to hold as part of the hard-fought political compromise embodied in Article 17. Where the Commission initially lived up to its role as a neutral steward of the legislative compromise, it has abandoned this role unilaterally, changing the final result based on massive political pressure from rightholders.
In the final guidance, the Commission maintains that it is “not enough for the transposition and application of Article 17(7) to restore legitimate content ex post under Article 17(9), once it has been removed or disabled” and argues that only manifestly infringing content should be blocked automatically, but these “principles” are included in name only. By introducing the ability for rightholders to “earmark” any content that has the potential to “cause significant economic harm”, the guidance allows rightholders to easily override these principles, whenever they see fit, and to force platforms to automatically block user uploads even if they are not manifestly infringing.
In the remainder of this post we will examine these last minute changes to the guidance in more detail. Before we do that, we will briefly recall the key problem that the guidance was supposed to resolve and the principles that underpinned previous versions of the guidance.
Today the German Bundestag adopted the law implementing the provisions of the DSM directive into German law. The law still needs to be confirmed by the Bundesrat before it can be signed by the President, but it is not expected to undergo any more changes during that process. As we have explained here before, the German implementation law is one of the most ambitious implementations of the DSM directive, especially with regards to the way it implements the provisions of Article 17 of the directive. And while the discussions in the Bundestag have led to a number of changes to the text of the law, the key mechanism underpinning the government proposal for implementing Article 17 has emerged essentially unchanged. In addition, the discussion in the Bundestag has resulted in a number of substantial improvements in other parts of the law.
In order to achieve this, the German implementation relies on the concept of “uses presumably authorised by law”, which must not be blocked automatically. For an upload to qualify as “presumably authorised by law”, it needs to fulfil the following cumulative criteria:
The use must consist of less than 50% of the original protected work,
The use must combine the parts of the work with other content, and
The use must be minor (a non-commercial use of less than 15 seconds of audio or video, 160 characters of text or 125 kB of graphics) or, if it generates significant revenues or exceeds these thresholds, the user must flag it as being covered by an exception.
If these conditions are met, the use is considered to be “presumably authorised by law” and cannot be blocked automatically. Rightholders can still challenge the legality of such uses but platforms are required to keep the uploads online until those complaints have been reviewed by the platforms (there is an exception that allows “trusted rightholders” to request immediate removal if they consider the use evidently infringing and commercially harmful).
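The cumulative test described above can be sketched as a simple check. This is a minimal illustration only: the function and parameter names are hypothetical, and the actual legal assessment involves judgment well beyond these numeric thresholds.

```python
# Hypothetical sketch of the German "presumably authorised by law" test.
# Thresholds from the adopted law: less than 50% of the original work,
# combined with other content, and either "minor" (non-commercial use of
# less than 15 s of audio/video, 160 characters of text or 125 kB of
# graphics) or flagged by the user as covered by an exception.

MINOR_LIMITS = {"audio_s": 15, "video_s": 15, "text_chars": 160, "image_kb": 125}

def is_minor(kind: str, size: float, commercial: bool) -> bool:
    """A use is 'minor' only if it is non-commercial and below the threshold."""
    return not commercial and size < MINOR_LIMITS[kind]

def presumably_authorised(fraction_of_work: float,
                          combined_with_other_content: bool,
                          kind: str, size: float, commercial: bool,
                          user_flagged_exception: bool) -> bool:
    # Criterion 1: less than 50% of the original protected work.
    if fraction_of_work >= 0.5:
        return False
    # Criterion 2: the parts of the work are combined with other content.
    if not combined_with_other_content:
        return False
    # Criterion 3: the use is minor, or the user has flagged it as
    # covered by an exception.
    return is_minor(kind, size, commercial) or user_flagged_exception
```

Under this sketch, a non-commercial 10-second video quote mixed with other content would not be blocked automatically, while a commercial use above the thresholds would only qualify if the user flags it as covered by an exception.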
This mechanism had been the target of massive criticism from rightholders throughout the parliamentary debate and it is welcome to see that the Bundestag has had the courage to hold the line here. The version of the law adopted today makes one small concession to rightholders. It now specifies that the “presumably authorised by law” mechanism does not apply to “the use of cinematographic works or videos until the end of their first public broadcast, in particular during the simultaneous broadcast of sports events, provided that the rightholder requests this from the service provider”. This change addresses concerns expressed by sports associations who argued that allowing people to share 15-second clips of sports events during an event would ruin their business model. While this seems highly dubious, the exception is so narrow that its impact on legal uses will be fairly minimal.
While the Commission has been hinting at the imminent release of the guidance for a few months now, the timing indicates that the Commission precisely wanted to avoid giving the Advocate General time to study the guidance. This does not bode well for the fundamental rights safeguards the Commission is planning to present. Signals are mounting that the delays are the result of intense behind-the-scenes political wrangling aimed at undermining the user rights safeguards to be included in the guidance.
The letter further highlights the fact that, by issuing guidance that substantially diverges from the position taken before the CJEU, the Commission would indicate that it is ultimately lacking the political will to ensure that the required fundamental rights protections will be included in national implementations of the directive.
The initial presentation was followed by perspectives from Marco Pancini (YouTube), Xavier Blanc (AEPO-ARTIS) and Julia Reda (GFF), who highlighted different aspects of the legislative proposal. Speaking from the perspective of large pan-European platforms, Marco Pancini expressed concerns about the variation of legislative approaches in the Member States, with Germany marking one end of the spectrum. According to him, this will lead to fragmentation of the digital single market and create substantial compliance burdens for all types of platforms.
As always, the COMMUNIA Salon is open for everyone to attend and will be held on Zoom. Join us on Wednesday, the 17th of March, at 15:30 CET, by registering here. Registered participants will receive login information ahead of the event.
Just as the original discussion draft, the Referentenentwurf drew massive criticism from rightholders that was primarily directed at the user rights safeguards contained in the proposal. Unfortunately, the Ministry of Justice has now caved in to some of these demands and as a result the user rights safeguards have been further cut back in the final proposal adopted by the government (“Regierungsentwurf“).
The most substantial change in the Regierungsentwurf concerns the legal mechanism for ensuring that user rights are protected in line with the requirements of Article 17(7). The original proposal relied on a new exception that legalised minor uses of copyrighted works (any use consisting of less than 20 seconds of audio or video, 1,000 characters of text or images smaller than 250 kB) and would have prevented platforms from blocking such uses.
In the updated proposal this exception is gone and has been replaced by a more limited construction of “uses presumably authorised by law” which cannot be blocked automatically. For a use to be “presumably authorised by law” it needs to fulfil the following cumulative criteria: