Today, the Court of Justice of the European Union issued its long-awaited judgement on the compliance of the new liability regime established by Article 17 of the DSM Directive with fundamental rights. Rejecting the Polish government's request to annul Article 17(4)(b) and the last half-sentence of subparagraph (c), the Court confirmed that Article 17 can be reconciled with the right to freedom of expression because it also provides ex-ante and ex-post safeguards for users' rights that limit its impact on the right to freedom of expression and information.
Article 17 contains sufficient safeguards to minimise the impact of upload filters on fundamental freedoms
According to the CJEU, the preventive measures to monitor and block users’ uploads envisioned by Article 17(4) constitute a limitation on the exercise of the right to freedom of expression and information of the users of online sharing services, but such a limitation is compatible with Article 11 of the Charter of Fundamental Rights of the European Union, since all the conditions laid down in Article 52(1) of the Charter are satisfied.
The Court held that the risks that the use of upload filters entails for the right to freedom of expression and information of users of sharing platforms have been sufficiently addressed by the EU legislator, which laid down sufficient safeguards in Article 17(7), (8) and (9) to protect those rights:
- online sharing providers have an obligation of result not to preventively block lawful content (Article 17(7));
- those providers are only obliged to detect and block content on the basis of relevant and necessary information provided by rightholders, and cannot be required to block content which, in order to be found unlawful, would require an independent assessment of the content by them (Article 17(8));
- additionally, and as a final safeguard for situations where, despite the obligation in Article 17(7), those providers nevertheless block legitimate content, users have at their disposal a complaint and redress mechanism as well as out-of-court mechanisms (Article 17(9)).
Upload filters can only be used if they are able to recognize, and not automatically block, lawful uploads
While Article 17(4) does not define the specific measures that online sharing platforms should adopt to monitor and block users’ uploads, the Court considered that the EU legislator has laid down a clear and precise limit in Article 17(7) and (9): measures which filter and block lawful content at upload are off limits.
In the Court’s view, the measures adopted by sharing services to comply with their obligations under Article 17(4) must “be strictly targeted in order to enable effective protection of copyright but without thereby affecting users who are lawfully using those providers’ services”. The Court says that this obligation imposed on sharing services to not affect the essence of the fundamental rights of users who share lawful content on their platforms clearly follows from Articles 17(7) and (9).
This implies that, if the online sharing platform adopts a filtering system, such a system must be able to recognize, and not automatically block, lawful content. The Court is adamant:
“(A) filtering system which might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications, would be incompatible with the right to freedom of expression and information, guaranteed in Article 11 of the Charter, and would not respect the fair balance between that right and the right to intellectual property.”
Unlike the Advocate General in his Opinion, the Court does not formulate any criteria for determining when content must be presumed to be lawful and the upload must be permitted. The Court only clarifies that this must not require an independent assessment by the platforms of the information provided by the rightholders or of any copyright exceptions. In other words, fully automated filters should only be allowed to block (manifestly) infringing uploads. In all other cases, matches must be flagged and reviewed by rightholders, who can then request removal. According to the Court, such removal requests must be “duly substantiated”.
Ex-post safeguards are additional to ex-ante protections of users' rights
With this ruling, the CJEU decided one of the most controversial questions in the debate surrounding the implementation of user rights safeguards: whether those rights need to be protected ex-ante, or if it is enough to consider them after the upload has been blocked.
Rejecting the interpretation put forward by the French and Spanish governments, according to which content could be systematically blocked ex-ante, provided that users could obtain its reinstatement ex-post, the Court ruled that:
“(T)he first and second subparagraphs of Article 17(9) of Directive 2019/790 introduce several procedural safeguards, which are additional to those provided for in Article 17(7) and (8) of that directive, and which protect the right to freedom of expression and information of users of online content-sharing services in cases where, notwithstanding the safeguards laid down in those latter provisions, the providers of those services nonetheless erroneously or unjustifiably block lawful content.” (emphasis added)
According to the Court, the complaint and redress mechanisms and out-of-court mechanisms are merely an additional safeguard for situations where service providers mistakenly or unjustifiably block legitimate content. Separately, and cumulatively, Article 17(7) requires service providers to not preventively and systematically block legitimate content. In other words, the obligation to protect users' rights after their uploads have been blocked (as mandated by Article 17(9)) does not replace the obligation to protect users when they are uploading lawful content (as mandated by Article 17(7)).
Most Member States will need to amend their implementation of Article 17
Today’s judgement brings a preliminary end to nearly three years of heated discussions on how to implement Article 17 — the most controversial element of the 2019 Copyright Directive, which sparked massive protests from internet users both online and offline — into national law. It confirms the position developed by COMMUNIA and other civil society organisations, as well as by a large group of academics, that Article 17 requires strong user rights safeguards that prevent upload filters from blocking uploads unless they are infringing. This reading of Article 17 has since been endorsed by the European Commission — both in its 2021 implementation guidance for Article 17 and in its intervention during the hearing of the CJEU case — and in the national implementations adopted by Austria and Germany.
The ruling will force Member States that have implemented Article 17 in a manner that does not prevent legal uploads from being blocked to reverse course. Spain and Italy — both of which have included provisions in their implementations requiring that disputed content remain unavailable until the resolution of a complaint — will need to bring their implementation laws into compliance with the standards set by the CJEU. Member States such as France and the Netherlands, which have merely restated the provisions of the directive in their national laws and have not included ex-ante measures to prevent lawful uploads from being blocked, will likely need to add further safeguards (alternatively, these implementations will have to be interpreted in line with the standards set by the CJEU). Those Member States that have — wisely — waited for today's judgement to provide them with guidance should follow the example set by Germany, whose implementation is the only existing one that seems to meet all the standards established by the CJEU in today's judgement.