The concept of content filtering has been making quite a career. It not only landed in the copyright directive proposal; it has also been introduced into the draft of the Audiovisual Media Services Directive (AVMSD) that is currently making its way through the European Parliament. In the context of the AVMSD, filtering of uploads by video-sharing platforms would serve to block legal audiovisual content that could harm children. As important as protecting children may be, the CULT Committee has just voted against that idea. This was the right thing to do.
A seemingly quick solution that filters out whatever decision makers don’t want users to see is a very dangerous tool in any context. It is an arbitrary approach to the flow of information online, and as such it can be used as a censorship machine. This “automated conscience” would operate on very abstract definitions of content that could impair children’s “physical, mental or moral development” or constitute incitement to terrorism, violence and hatred. Humans often argue about what counts as such incitement, with many cases ending up in court. How could we trust algorithms to settle such disputes?
Fortunately, 17 members of the CULT Committee understood that. Nine of them either do not see the danger or have unwavering faith in the potency of technology to solve complex societal problems. Hopefully, the AVMSD debate helped the CULT Committee see both the danger and the pointlessness of content filtering, and it will take a similar decision for a better copyright. After all, in the context of copyright, putting the interests of rightsholders before the interest of the public is an even worse reason to employ algorithms as censors.
Catherine Stihler, Rapporteur of the Committee on the Internal Market and Consumer Protection (IMCO), has released her draft opinion on the proposed Copyright in the Digital Single Market Directive. In this opinion, Stihler rightly states that article 13, which proposes content filtering mechanisms that would block some of users’ uploads, fails to achieve its purpose. She seeks to ensure that rightsholders and creators receive fair and balanced compensation for the exploitation of their work without negatively impacting the digital economy or the internet freedoms of consumers. Acting on this, Stihler tries to fix article 13. However, we believe that the only appropriate response is to delete it altogether.
The filter must go
It is commendable that in her opinion MEP Stihler explicitly says that no attempt to address the value gap can be enforced if it has a negative impact on the fundamental rights and internet freedoms of consumers. This is something the potential beneficiaries of the proposed article seem to ignore.
Explaining why the upload filter must be removed, MEP Stihler states that filtering machines are neither capable of nor suited to taking into account user rights such as exceptions and limitations. This is something all opponents of the upload filter, including COMMUNIA, have pointed out before. Therefore, in her amendments she rightly removes all references to ‘effective’ recognition technologies, which makes the Directive text more technology-neutral and future-proof.
Today we are publishing the fifth in a series of position papers dealing with the various parts of the European Commission’s proposal for a Directive on Copyright in the Digital Single Market (the previous papers dealt with the education exception, text and data mining, the press publishers’ right, and freedom of panorama). This paper deals with article 13 of the Commission’s proposal, which introduces a filtering obligation for online platforms that allow users to upload content (such as Facebook, YouTube, Flickr, and many other online services). The proposal fails to establish clear rules that tell internet users how they can share and remix content legally. Instead, it introduces a filtering requirement for online platforms that can potentially serve as a censorship machine and will violate users’ fundamental rights and distort the existing legal framework. From our perspective, article 13 and the related recitals should be deleted from the proposal. (You can download a pdf version of the position paper here.)
Position paper: Use of Protected Content by Information Society Service Providers
Article 13 of the European Commission’s proposal for a Directive on Copyright in the Digital Single Market attempts to address the disparity in revenues generated for rightsholders and platforms from online uses of protected content. It would do so by introducing an obligation for “information society service providers that store and provide to the public access to large amounts of works” to filter user uploads. It would also require these providers to conclude licensing agreements with rightsholders.
These proposed measures, however, do not address the issue adequately; instead, they violate fundamental rights of users, contradict the E-Commerce Directive, and go against CJEU case law.
The measures proposed in the Commission’s proposal stem from an unbalanced vision of copyright as an issue between rightsholders and ‘infringers’. The proposal chooses to ignore limitations and exceptions to copyright, fundamental freedoms, and existing users’ practices. In addition, the proposal fails to establish clear rules with regard to how citizens can use protected works in transformative ways—such as remixes and other forms of so-called “user-generated content” (UGC). As a result, a system of this kind would greatly restrict the way Europeans create, share, and communicate online.