Yesterday 51 leading European copyright scholars published a statement on “Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive”. The statement is intended as input into the ongoing stakeholder dialogue. It focuses on the interplay between user rights and the filtering obligations established by Article 17. From the Kluwer Copyright blog:
Against this backdrop, a group of European academics (including the author of this post) has drafted a document with recommendations on user freedoms and safeguards included in Article 17 of the DSM Directive – namely in its paragraphs (7) and (9), to be read in the context of the aforementioned stakeholder dialogues. The recommendations are divided into three sections: on promoting licensing and limiting preventive measures; on exceptions and limitations in Article 17 (user freedoms); and on minimizing the risks of broad filtering and over-blocking (user safeguards). Despite the controversial nature of the topic, the recommendation has already been endorsed by around 50 European academics in this area, which is a testament to its balanced approach. Our intention is that these recommendations and interpretative guidelines are taken into consideration by the Commission, stakeholders, and Member States in their discussions on national implementations of Article 17 DSM Directive.
Based on a close reading of Articles 17(7) and 17(9), the academics show that Article 17 requires online platforms (OCSSPs in the language of the directive) to prioritise protecting users’ rights over blocking content. This statement provides strong support for our positions in the stakeholder dialogue.
We have argued again and again that copyright reform is also a fundamental rights matter. That is why we co-signed an open letter to European decision-makers asking them to add human rights safeguards to Article 13 of the proposed Directive on Copyright in the Digital Single Market throughout the negotiation process.
The fundamental rights safeguards crucial for ensuring compliance of the new Directive with the Charter of Fundamental Rights are in accordance with our four principles for minimising harm to users, creators and the internet. The letter signed by 27 fundamental and digital rights organizations raises concerns about the current state of play for Article 13 and calls for:
Platforms control much of the information available on the internet, and they are empowered to rank and take down content at their discretion. These platforms shape “the internet” as we know it now. Internet platforms are able to make decisions about freedom of expression with no transparency or accountability, and the proposed Directive does not change that. In cases where content is blocked or taken down, it is critical that they properly justify their decisions; those decisions should be subject to proper redress mechanisms to ensure free speech and freedom of information. Besides providing an alternative dispute resolution, the EU could provide, for free, legal mechanisms across the EU to settle disputes between users, copyright holders and internet platforms.
The ongoing and never-ending EU copyright reform is often depicted as if the main issues were money, value distribution, and how to protect existing business models. These are of course valid concerns, but they shape the whole discussion in a very narrowly framed way. The most important question is how copyright reform will influence various groups of stakeholders, not only businesses that want to make money, but also the individuals and organisations that need access to information and content, and who also wish to create and share. And when it comes to science and knowledge sharing (not only in academia, but in innovation generally), the answer is very clear – the proposal presented by the European Commission will block the free circulation of knowledge.
Vox Scientia is a group of organisations (including Communia) and individual educators, researchers, librarians, cultural heritage professionals, and students who are standing up and aiming to be the ‘Voice of Knowledge’ – ‘Vox scientia’ – in this debate. The parties engaged believe in a world where all people are able to freely exchange ideas, create, learn, and contribute to the global knowledge commons. The aim of the initiative is to mobilize people to stand up against the dangerous and restrictive copyright solutions backed by the European Commission.
The concept of content filtering has been making quite a career. Not only did it land in the copyright directive proposal, but it has also been introduced into the draft of the Audiovisual Media Services Directive (AVMSD) that is currently making its way through the European Parliament. In the context of the AVMSD, filtering of uploads by video-sharing platforms would serve to block legal audiovisual content that could harm children. As important as protecting children may be, the CULT Committee has just voted against that idea. This was the right thing to do.
A seemingly quick solution to filter out whatever decision-makers don’t want users to see is a very dangerous tool in any context. It is an arbitrary approach to the flow of information online, and as such it can be used as a censorship machine. This “automatized conscience” would operate on very abstract definitions of content that could impair children’s “physical, mental or moral development” or constitute incitement to terrorism, violence and hatred. Humans often argue about what counts as such incitement, with many cases ending up in court. How could we trust algorithms with such a dispute?
Fortunately, 17 members of the CULT Committee understood that. Nine of them either do not see the danger or have an unwavering faith in the potency of technology to solve complex societal problems. Hopefully, the AVMSD debate helped the CULT Committee see both the danger and the pointlessness of content filtering, and it will make a similar decision in favour of better copyright. After all, in the context of copyright, putting the interest of rightsholders before the interest of the public is an even worse reason to employ algorithms as censors.
Catherine Stihler, Rapporteur of the Committee on the Internal Market and Consumer Protection (IMCO), has released her draft opinion on the proposed Copyright in the Digital Single Market Directive. In this opinion, Stihler rightly states that Article 13, which proposes content filtering mechanisms that would block some of users’ uploads, fails to achieve its purpose. She seeks to ensure that rightsholders and creators receive fair and balanced compensation for the exploitation of their work without negatively impacting the digital economy or the internet freedoms of consumers. Acting on this, Stihler tries to fix Article 13. However, we believe that the only appropriate response is to delete it altogether.
The filter must go
It is commendable that in her opinion MEP Stihler explicitly says that any measure to address the value gap cannot be enforced if it has a negative impact on fundamental rights and the internet freedoms of consumers. This is something the potential beneficiaries of the proposed article seem to ignore.
Explaining why the upload filter must be removed, MEP Stihler states that filtering machines are neither capable of nor suited to taking into account user rights such as exceptions and limitations. This is something all opponents of the upload filter, including COMMUNIA, have pointed out before. Therefore, in her amendments she rightly removes all references to ‘effective’ recognition technologies, which would make the Directive text more technology-neutral and future-proof.
Today we are publishing the fifth in a series of position papers dealing with the various parts of the European Commission’s proposal for a Directive on Copyright in the Digital Single Market (the previous papers dealt with the education exception, text and data mining, the press publisher’s right, and freedom of panorama). This paper deals with Article 13 of the Commission’s proposal, which introduces a filtering obligation on online platforms that allow users to upload content (such as Facebook, YouTube, Flickr and many other online services). The proposal fails to establish clear rules telling internet users how they can share and remix content legally. Instead it introduces a filtering requirement for online platforms that could serve as a censorship machine, violate users’ fundamental rights, and distort the existing legal framework. From our perspective, Article 13 and the related recitals should be deleted from the proposal. (You can download a PDF version of the position paper here.)
Position paper: Use of Protected Content by Information Society Service Providers
Article 13 of the European Commission’s proposal for a Directive on Copyright in the Digital Single Market attempts to address the disparity between the revenues that rightsholders and platforms generate from online uses of protected content. The proposed article does this by introducing an obligation for “information society service providers that store and provide to the public access to large amounts of works” to filter user uploads. It would also require these providers to set up licensing agreements with rightsholders.
These proposed measures, however, do not address the issue adequately; instead, they violate fundamental rights of users, contradict the E-Commerce Directive, and go against CJEU case law.
The measures proposed in the Commission’s proposal stem from an unbalanced vision of copyright as an issue between rightsholders and ‘infringers’. The proposal chooses to ignore limitations and exceptions to copyright, fundamental freedoms, and existing users’ practices. In addition, the proposal fails to establish clear rules with regard to how citizens can use protected works in transformative ways—such as remixes and other forms of so-called “user-generated content” (UGC). As a result, a system of this kind would greatly restrict the way Europeans create, share, and communicate online.