Civil society groups have placed a lot of hope in the European Commission to limit the dangers to fundamental rights caused by upload filters through the Commission’s Article 17 guidance, which is supposed to help member states implement Article 17 of the DSM directive in a fundamental rights-preserving manner. But with less than two months to go before the implementation deadline, the guidance is still nowhere to be seen. In an open letter published today, twenty user rights organisations are therefore calling on the Commission not to undermine elements of the guidance that would protect users’ fundamental rights by limiting the use of automated upload filters to manifestly infringing content.
Late last week, the CJEU unexpectedly postponed the Advocate General opinion in the Polish case challenging the fundamental rights compliance of Article 17 of the DSM directive by almost three months. Knowing that the upcoming Commission guidance was discussed extensively at the CJEU hearing on the Polish case in November, the postponement could very well mean that the Advocate General wants to see the document before issuing an opinion.
While the Commission has been hinting at the imminent release of the guidance for a few months now, the timing suggests that the Commission wanted to avoid giving the Advocate General time to study the document before issuing an opinion. This does not bode well for the fundamental rights safeguards the Commission is planning to present. Signals are mounting that the delays are the result of intense behind-the-scenes political wrangling aimed at undermining the user rights safeguards to be included in the guidance.
That’s why, together with 20 other users’ rights organisations who have participated in the EU stakeholder dialogue on the implementation of Article 17, we have sent an open letter to the Commission, raising our concerns about the handling of the final phase of this process. The letter urges the Commission “not to weaken its guidance through open ended exception clauses that seem to benefit particular rightsholders at the expense of users’ fundamental rights” and stresses that “strong ex-ante fundamental rights protections are necessary to meet the obligation of result to protect users’ fundamental rights.”
The letter further highlights the fact that, by issuing guidance that substantially diverges from the position taken before the CJEU, the Commission would indicate that it is ultimately lacking the political will to ensure that the required fundamental rights protections will be included in national implementations of the directive.
Last week the French Ministry of Culture held a virtual event to present the second report on content recognition tools on digital sharing platforms commissioned by the Conseil Supérieur de la Propriété Littéraire et Artistique (High Council for literary and artistic property – CSPLA). The new CSPLA report, authored by Jean-Philippe Mochon (who also wrote the previous report on content recognition tools), focuses on “proposals for the implementation of Article 17 of the EU copyright directive”. The report consists of three parts:
The first part contains a “review of existing best practices” of the use of content recognition tools. Here, the authors argue that such tools “must be given their rightful place in the implementation of Article 17 of the Directive”. The second part of the report focuses on the “balance between the fundamental rights set out in Article 17”. The third and concluding part of the report contains a number of recommendations for implementing Article 17 in France (and beyond).
The central argument woven throughout the CSPLA report is that automated content recognition technologies already play an important role in managing copyright on digital sharing platforms, that the complaint and redress mechanism alone provides sufficient fundamental rights protection under Article 17, and that temporary restrictions on freedom of expression are an acceptable price for the goal of stronger protection of intellectual property rights. A more detailed critique of some of the core arguments contained in the middle part can be found in this post on the Kluwer Copyright Blog.
To mark the occasion of the publication of the CSPLA report, the French permanent representation in Brussels hosted a (virtual) event that was clearly intended to demonstrate additional support for the French position in the discussion about the implementation of Article 17.
For the presentation of the report the organizers had invited the authors of the report and three external speakers, representing the European co-legislators: MEP Axel Voss (the European Parliament’s rapporteur for the DSM directive), Marco Giorello (the head of the Copyright Unit of the European Commission) and Ricardo Castanheira (representing the Portuguese Council Presidency). If the organizers of the event had hoped that these speakers would express support for the French position in the Article 17 implementation discussion, then they must have been quite disappointed: None of the three respondents came forward with unqualified support for the French interpretation of Article 17. As expected, Marco Giorello made it clear once more that the Commission does not agree with the French position that there is no need for ex-ante user rights safeguards in national implementations of Article 17:
On Tuesday, November 10, the Court of Justice of the European Union (CJEU) heard case C-401/19. This case is a request by the Polish government to annul the filtering obligation contained in Article 17 of the Copyright in the Digital Single Market (DSM) Directive on the grounds that it will lead to censorship and will limit the freedom of expression and the freedom to receive and impart information guaranteed in Article 11 of the EU Charter of Fundamental Rights (Charter).
The defendants in this case are the European Parliament and the Council of the European Union. In addition, the European Commission and the governments of France and Spain intervened in the case on the side of the defendants. Advocate General Saugmandsgaard Øe was also present at the hearing.
Even for astute followers of the discussions around the implementation of Article 17, the hearing contained a number of surprises. While several Member States have been soldiering on with their national implementation proposals with little regard for the fundamental rights implications of Article 17, the hearing showed that the Court is taking Poland’s complaint very seriously and that the compliance of the contested provisions of Article 17 with the Charter is far from evident. Frequent reference was made during the hearing to the recent opinion of Advocate General Saugmandsgaard Øe in the YouTube and Cyando cases, which is highly critical of extensive obligations on platforms to police the copyright infringements of their users.
On the face of it, the case is about Poland’s request to annul Articles 17(4)(b) and (c) of the DSM directive. Poland’s case essentially rests on the observation that, while not explicitly mandating them, Article 17(4)(b) and (c) effectively require platforms to implement upload filters, because there are no other effective means of complying with the obligations contained therein. Poland argues that this will lead to censorship and will limit the freedom of information of the users of online platforms.
According to Poland, the key problem with the directive is the move away from active participation of rightholders (as initiators of removal requests in the context of notice and takedown procedures), handing the responsibility of removing infringing uploads over to platforms, which will have to develop private enforcement systems to avoid liability for copyright infringement. Because platforms face no comparable risk when they limit user rights by blocking access to legal content, this creates strong incentives for over-blocking. This in turn will result in censorship and violation of the fundamental rights to freedom of expression and information under the Charter. Consequently, the problematic parts of Article 17 should be annulled by the Court.
All other parties intervening in the case objected to this line of argument and stated that in their view Article 17 does not violate any fundamental rights. However, they presented strikingly contradictory interpretations of what Article 17 actually requires of platforms. There are two distinct lines of argument: The Commission, the Council and the European Parliament argued that Article 17 contains enough internal safeguards to prevent users’ fundamental rights from being unduly limited. On the other hand, France and Spain argued that some limitations of fundamental freedoms are justified by the objective that Article 17 seeks to achieve.
On Thursday the European Parliament voted 550-34 (with 25 abstentions) to approve the Directive on Open Data and Public Sector Information. The directive updates the rules controlling the re-use of public sector information held by public sector bodies of the Member States and also governs the re-use of documents held by public undertakings, such as water, energy, transport, and postal services. The recast directive is expanded to cover publicly funded research data. It states that charges related to the provision of PSI should in principle be limited to marginal costs related to the initial provision of the documents. And it also prioritises the identification and sharing of “high-value” datasets that should be available for free re-use via APIs.
The purpose of the refreshed directive is to promote the use of open data and stimulate innovation in products and services in the Digital Single Market. The directive says Member States should approach the re-use of PSI according to the principle of “open by design and by default.”
Communia has been active in the discussion on the legal framework for re-use of public sector information in the EU for many years, producing position papers in 2012, 2014, and 2018, and providing feedback to the recast proposal in July 2018. We’ve supported changes that would expand the scope of the directive, and pushed for increased legal clarity around aspects such as standard open licenses for PSI. The final Directive addresses some of our concerns, but after it is formally approved by the Council of the EU, it will be up to the Member States to implement the recast directive rules into their national laws. Transposition must be completed within two years.
Below we discuss a few pieces of the directive we’ve been following.
Today, after a 30-month long legislative procedure, the European Parliament voted on the Directive on Copyright in the Digital Single Market. Members of the Parliament approved the Directive, with 348 voting in favor and 274 voting against, and 36 abstaining.
The Directive is the most important European regulation of the digital sphere in the last several years. It will define the shape of copyright in Europe for years to come — and have spillover effects for regulation around the globe. We believe that the approved directive will not meet the goal of providing a modern framework that balances the interests of rightsholders and users, protects human rights and enables creativity and innovation to flourish. Instead, it is a biased regulation that supports one business sector, at the cost of European citizens.
In the last two and a half years, and especially since last June, we faced an extremely heated debate and an intense legislative process. During this time, together with a broad coalition of activists, experts and organisations, we attempted to remove (or improve) its most controversial parts. In the final weeks, we supported an effort to amend the directive during the plenary, in a last attempt to remove the most detrimental provision — Article 13. Unfortunately, the European Parliament rejected a motion to vote on amendments to the Directive, with 312 MEPs voting in favor, and 317 voting against. This motion would have opened the door to removing Article 13 while keeping the rest of the directive intact. It failed.
The Directive was therefore approved, with all the controversial elements that we have been criticising: content filters introduced by Article 13, new rights for publishers introduced by Article 11, and a mechanism for overriding copyright exceptions for education by private agreements introduced by Article 4(2).
European parliamentarians, together with the Commission and the governments of the Member States have given a strong signal of support to the entertainment industries and their incumbent players — at a dire cost to internet users and freedom of expression. We believe that it is an unbalanced approach that will have severe repercussions. These legal provisions will not only cost millions to small and medium sized European platforms, but most importantly put fundamental freedoms at risk and set dangerous precedents for user rights.
Earlier today, on the eve of tomorrow’s vote, we distributed 750 copies of a journal titled “Say YES to copyright and NO to Article 13” to the offices of the MEPs in Strasbourg. This is part of a last-ditch effort by civil society organisations to prevent MEPs from approving a new copyright directive that includes the disastrous Article 13. You can download our journal here (pdf) and we are re-publishing the text of the editorial below.
Say YES to copyright and NO to Article 13
Article 13 of the proposed Copyright Directive will put even more control over European culture and knowledge into the hands of online monopolies. As organisations representing digital creators and knowledge workers, we urge you to reject this provision that will replace the rule of law with proprietary algorithms controlled by big tech companies.
It is high time that Europe adapts its copyright framework to meet the needs of the digital age. The proposed directive contains many measures that are steps in the right direction, such as improving the negotiating position of authors and performers, better safeguarding the public domain, and allowing researchers and cultural heritage institutions to make better use of the opportunities created by the digital environment.
In spite of widespread opposition from academics, internet users and millions of concerned citizens, the directive still contains provisions that will force most internet platforms to filter all content uploaded by their users to remove any copyrighted works flagged by rightsholders. This will cost European companies and new startups millions, and what’s worse, it won’t work. The idea that technology can reliably differentiate between legitimate and unauthorised uses of copyrighted material has been credibly disputed by experts across the spectrum. Putting the regulation of speech and creative expression in the hands of private corporations lacks public support.
- Instead of taking the right step toward a Digital Single Market that works for all, a directive that includes Article 13 would sow even more legal uncertainties.
- Instead of empowering European creators, it will entrench the position of dominant platforms.
- Instead of balancing fundamental rights, it will weaken the law by shifting power towards algorithms and away from crucial users’ rights upholding freedom of expression.
We support the objective to ensure that creators are rewarded adequately for their creativity. Upload filters themselves will not achieve this objective. This directive needs to take the interests of all stakeholders into account, not only “big tech” and “big content”. Copyright should be a matter of social contract that upholds the public interest, not of secret algorithms controlled by private actors. We therefore ask you to reject the text of the directive as long as it includes Article 13.
Days before the final vote of the European Parliament on the copyright directive, the discussion about the directive seems entirely focussed on Article 13. A wide coalition of civil society groups, online creators, academics and citizens is calling for the removal of Article 13 from the directive. On the other side 270 organisations representing rightsholders are calling on MEPs to say “yes to copyright” and pass the directive in its current form (including Article 13).
Behind the facade of these well-known (and deeply entrenched) positions, something interesting is going on. If we start peeling back the arguments brought forward by both sides, the two camps are closer than they appear. When it comes to Article 13, there are two points that almost everyone agrees on:
(1) Nobody really wants to see the widespread use of upload filters and (2) Everybody agrees that there is a need to ensure that creators are fairly rewarded on the basis of licenses obtained by the online platforms.
This agreement is emerging as a result of several recent developments. On the side of the opponents of Article 13, the intense discussion of the previous weeks has resurfaced a widespread acknowledgement that, underneath the calls for deletion of Article 13, there is a real need for platforms to pay those creators who want to be paid for uses of their works. On the side of the proponents of Article 13, there seems to be an increasing realisation that an Article 13 that does require widespread use of upload filters may lack sufficient support within the EP (and certainly outside of it).
Upload Filters have become toxic
This second development represents a marked shift in the positioning of the supporters of Article 13. The most prominent example of this is a position paper of the German CDU (the same political party that rapporteur Axel Voss belongs to) in which the party promises (to an enraged German electorate) that Germany would implement Article 13 in such a way that there will be no need for upload filters (by requiring platforms to obtain blanket licenses). While the substance of this claim is way out of line with the actual text of Article 13 and the requirements of the rest of the EU framework, it does illustrate that even for the CDU, which was instrumental in pushing through the current text, upload filters have become too toxic to be associated with.
The final vote on copyright reform in the plenary session of the Parliament is scheduled for March 26. After more than 30 months of work on this topic our overall assessment remains unchanged: The proposed Directive is bad, and will not make the internet work for people. The final “compromise” text has done nothing to accommodate the concerns we and others have raised over the past 2+ years.
As long as Article 13 remains part of the package, the only sensible way forward is to make sure that the Directive is rejected by the European Parliament.
There is still time to act! Read along to find out what you can do in the last days before the vote.
Upload filters don’t (and can’t) respect users’ rights
Through the lens of copyright, Article 13 turns upside down how the web works. Instead of permitting users to upload content to platforms and absolving platforms of liability as long as they act quickly to remove infringing content once notified, Article 13 would require nearly all for-profit platforms that allow UGC to conclude licenses covering all user uploads. If they don’t obtain the licenses, then the only option will be to install upload filters and censor content in order to ensure that any unsanctioned content stays off their service. If the platforms don’t comply, they could be held liable for significant copyright infringement damages.
After yesterday’s agreement between the European Parliament, the Council and the European Commission on a compromise text, the EU copyright reform process has entered into its final phase. The good news is that after yesterday’s compromise the text cannot get any worse: it will either be adopted or it will be rejected. The bad news is that the text that was agreed on yesterday is **the worst version that we have seen yet**. After three days of negotiations, the negotiators have agreed on a text that would benefit big corporate rightsholders, Google and other dominant platforms at the expense of users, creators and the rest of the European internet economy.
To understand what happened during the negotiations, it is instructive to look at the differences between the final compromise and the text that had been agreed among the EU member states last week (which was the result of horse trading between the French and German governments).
A win for dominant platforms…
Yesterday’s compromise text is largely in line with the French-German deal. This includes a terrible version of Article 13 that will severely limit users’ ability to express themselves online. It will also further consolidate the power of dominant platforms, as smaller platforms will struggle to implement expensive filtering technology and to bear the increased costs of dealing with increased liability.
It also introduces an EU-wide neighbouring right for press publishers that will have very similar effects. It benefits dominant platforms who can afford compliance while creating additional costs and risks for smaller players. As a result, users will likely end up with less access to information and the diversity of information available online will likely suffer. Under these conditions it remains to be seen if rightsholders will indeed manage to extract more value from the large intermediaries.
…at the expense of users and creators
As if this were not bad enough, the negotiators have introduced last-minute changes to the text that further weaken provisions that were intended to protect the rights of users and individual creators. The French/German deal did not (at least not clearly) include a UGC exception for users of every online platform, but it used language that at least applied to user-generated content uploaded to the platforms covered by Article 13. The final compromise has adopted questionable language that may or may not provide meaningful protection for users of platforms covered by Article 13, depending on whether Member States are obliged to fully implement the existing quotation and parody exceptions provided in the InfoSoc Directive and make them applicable to user-generated content, which is not evident from the text.
Later today the negotiators of the Commission, the European Parliament and the Council will meet for the 4th trilogue meeting. After having dealt with less controversial parts of the proposal during the three preceding meetings, tonight will finally see a discussion about Article 13 of the proposed DSM directive.
Given that all three legislators bring similar versions of Article 13 to the table, we can expect that a final compromise text will include some version of the Article 13 upload filters. There is still a good chance that the negotiations will be inconclusive or that the eventual outcome of the trilogue negotiations will not be approved by either the Member States or the Parliament (which would mean that the directive fails and there will be no upload filtering requirement for the foreseeable future). But in the context of the ongoing trilogue, the deletion of Article 13 (which has been our position so far) is no longer an option.
This raises the question of how the damage that Article 13 will do to the internet ecosystem and freedom of expression can best be contained. Before we turn to that question, let’s take a quick look at the positions that are on the table:
EP position: general blocking of all unlicensed content
The provision adopted by the European Parliament can only be described as a total disaster. As the result of a misguided attempt to remove the mention of “measures” from the text of the article, the European Parliament adopted a version of Article 13 that makes platforms liable for copyright infringement for every single work uploaded by their users. This would include any photo, drawing or text uploaded by a user, regardless of whether these are old works, works created for the express purpose of being shared widely, or the latest blockbuster movie. Because platforms would be liable for all works uploaded by their users, they are practically forced to install filters that block everything that has not been licensed to them. In other words, the EP version of Article 13 would turn open platforms into platforms that distribute content licensed by the entertainment industry and nothing else.