08 February 2021

The Digital Services Act and freedom of expression: triumph or failure?

The Digital Services Act (‘DSA’) is part of the long-awaited legislative package aimed at providing a transparency and accountability framework for online platforms and at laying down additional duties for large providers with gatekeeping powers. There is surely a lot to unpack in this hefty proposal. This piece looks at the new obligations and regulatory powers introduced by the DSA and examines their potential to safeguard freedom of expression. It also uncovers some of the DSA’s controversial points which policymakers should be looking out for in the course of the legislative process.

Valentina Golunova & Juncal Montero Regules


If it ain’t broke, don’t fix it

There is little doubt that the rules on intermediary liability can have a crucial impact on the fundamental rights of Internet users, and freedom of expression bears the heaviest impact. A tight grip on service providers can prompt them to take down controversial materials pre-emptively rather than leave them intact at the risk of incurring liability, causing the so-called over-removal of user-generated content. The DSA clearly recognises this: in contrast to the e-Commerce Directive, which mentioned fundamental rights exactly zero times, the DSA abounds with such references.

One of the advantages of the DSA is that it does not encroach on what already works well. The DSA reaffirms the negligence-based model of liability for service providers: platforms cannot be held liable for user-generated content if they lack actual knowledge of it and, upon obtaining such knowledge, act expeditiously to remove it. Tilting this fragile balance could once again create incentives for service providers to be “better safe than sorry” and take disproportionate action against legitimate content. It was also a relief to see that the DSA’s scope is limited to illegal content, despite the original aspiration to cover harmful content as well. According to Věra Jourová, providers should be encouraged to control the distribution and display of harmful content rather than wipe it out completely. In light of the controversies provoked by the Online Harms White Paper, the final response to which was published on the same day as the DSA, this narrower approach makes perfect sense.

New rules on intermediary liability 

While the DSA upholds the general rules on intermediary liability, it adds a few twists aimed at fostering the fundamental rights of users of digital services. Article 6 DSA lays down the much-discussed ‘Good Samaritan’ clause, which shields providers from liability for good-faith efforts to remove illegal content proactively. But recital 18 DSA still maintains that the liability exemptions do not apply where a provider does not engage in “merely technical and automatic processing” of the content it stores but instead plays an active role in its dissemination or promotion. The CJEU is expected to clarify – yet again – what is meant by an active role in the pending case C-500/19 Puls 4 TV. If it rules that YouTube, by deploying a recommender system, no longer qualifies for the liability exemptions, the net effect of the ‘Good Samaritan’ protection will be questionable.

In addition, the DSA reaffirms the prohibition of general monitoring: a provision which has caused substantial controversy in recent years. In fact, the DSA goes beyond mere recognition of this prohibition and introduces provisions which give it concrete meaning. Drawing on the CJEU’s jurisprudence, Article 8 DSA sets out criteria for lawful injunctions against ISSPs. Interestingly enough, the type of injunction authorised by the CJEU in C-18/18 Glawischnig will likely contravene Article 8(2)(a) DSA, which requires that any order contain both exact URLs and any additional information necessary for identifying the content at hand. But one question remains: how can Article 8(2)(a) be reconciled with Article 5(4) DSA, which still lets Member States require intermediaries not only to terminate but also to prevent infringements? As content which might only appear in the future cannot be identified by means of a URL, it remains to be seen whether the DSA still allows an exception for copies of illegal content where the latter has been identified appropriately.

The DSA also stipulates an EU-wide notice and action mechanism, which is something researchers have called for. But, as rightly noted by Access Now, it does not set out different procedures depending on the type of content. Even though Article 2(g) provides a definition of ‘illegal content’, as emphasised by ARTICLE 19, the DSA still leaves it up to online platforms to decide whether the materials in question deserve protection or are to be discarded. Moreover, Article 17 DSA establishes an internal complaints mechanism whereby complaints are reviewed against both legal standards and the platforms’ terms and conditions. This way, platforms would examine their decisions against their own standards – an obviously problematic approach. More precise rules in this regard are therefore welcome.

Automated content moderation

The DSA also contains specific provisions on automated content moderation, which has become a vital point on the contemporary research agenda. Researchers have reported extensively on the ominous effects of content-filtering tools on the enjoyment of fundamental rights: most of them are inaccurate, biased, and can be easily circumvented. In light of the public outcry provoked by the proposed Terrorist Content Regulation, it is promising that the DSA neither encourages nor condemns the use of content-filtering technologies. Rather, it focuses on ensuring transparency. Under Article 23(1)(c), online platforms are obliged to disclose information on “automatic means for the purpose of content moderation”. Notably, SMEs are exempted from this duty: Article 13 DSA, which sets out general transparency requirements, simply asks for “the number and type of measures”. The rationale behind this policy choice is not clear. Was it assumed that service providers which do not qualify as online platforms are unlikely to resort to automated content moderation? If so, this is not true: while few of them develop such systems in-house, more and more smaller providers deploy ones devised by third-party firms. A more favourable approach would therefore be to introduce a horizontal rule on the transparency of automated content moderation equally applicable to all service providers.

One useful safeguard relating to the transparency of automated decision-making is also embedded in the provision on the notice and action regime. Under Article 14(6) DSA, service providers must inform rightholders about the use of automated means for processing their notices. A user whose content is subjected to restrictions by a provider is also entitled to such information under Article 15(2)(c) DSA. This provision also applies beyond a notice and action procedure, for example, when illegal content was detected or identified by means of automated tools. But both these provisions are silent on the scope of disclosure that providers are expected to ensure. Is it enough merely to acknowledge that automated decision-making was involved, or does the DSA allude to a more sophisticated procedure whereby a service provider would also need to give reasons for its deployment and account for its effects? Perhaps the DSA provisions on automated content moderation should be better aligned with Article 13 GDPR, which guarantees that the data subject is entitled to meaningful information about the “logic” of automated decision-making as well as its “significance and the envisaged consequences”.

Asymmetric obligations… 

Interestingly, the DSA introduces asymmetric due diligence obligations, assigning different duties to intermediaries depending on their size. Only very large online platforms (VLOPs) are subject to the full scope of the Regulation, with other types of intermediaries bearing progressively lighter obligations. In principle, this approach is to be welcomed. It responds to demands from both scholars and industry to design rules proportionate to the scale of intermediaries’ reach and to their technical and operational capabilities. A one-size-fits-all approach would create obstacles to the growth of SMEs and could eventually prevent them from entering the market, thus interfering with the fundamental freedom to conduct a business. By assigning more due diligence obligations to VLOPs, the draft DSA also recognises their vital role in shaping online speech.

However, this approach is questionable from the perspective of tackling illegal content. First, VLOPs are defined by reference to use by the EU population as a whole (platforms with at least 45 million active users in the EU), potentially overlooking platforms which are significant at the national level. Second, some small platforms are widely used by extremist movements and yet fall outside the scope of the most relevant obligations against illegal content under the DSA. This can undermine the DSA’s objective of “ensuring a safe, predictable and trusted online environment”: if illegal content migrates to non-VLOPs, it will escape some parts of the Regulation, including risk management, assessment, and closer scrutiny by the European Commission. Finally, the systemic risk assessment that VLOPs must carry out pursuant to Article 26 DSA stands out as a promising step towards comprehensive, organic, and proactive risk management. However, the fact that it is the platforms themselves who are responsible for such assessment, with little independent public oversight, calls this measure into question.

…and beyond

Other than establishing different obligations for different types of service providers, the DSA creates a new institutional architecture. Just like the due diligence obligations, regulatory powers are asymmetric, with the national Digital Services Coordinators of the Member States overseeing non-VLOPs (Articles 38 – 46 DSA) and the European Commission overseeing VLOPs (Articles 52 – 59 DSA). The Commission’s powers are extensive: under the current version of the DSA, it occupies a central place in platform governance. But such an architecture is likely to have a controversial impact on the protection of fundamental rights. The same institutional bodies that have a say on content moderation also decide on the content moderation procedures and mechanisms which impact fundamental rights, be it through advisory actions, monitoring, application, enforcement or the imposition of fines. The regime laid down in the DSA reflects neither scholarly demands for an Ombudsperson entrusted with fundamental rights protection in dispute settlement proceedings reviewing content moderation, nor the European Parliament’s proposal to establish a new EU entity with monitoring functions.

Way forward

The draft DSA is definitely a promising start to an ambitious legislative reform. It rests on the noble objective of safeguarding the fundamental rights of EU citizens and puts forward well-considered approaches to some acute issues. But the DSA does little to alter the hands-off approach to the regulation of online content. This issue looks even more urgent in the aftermath of the ‘Great De-Platforming’, which saw the accounts of the former US president suspended by a whole range of social media companies. The EU legislator might want to consider whether the DSA should clamp down on the voluntary moderation efforts of service providers and reaffirm that the regulation of online content is entrusted to competent public institutions. It is also crucial to keep in mind that the regulation of digital services does not only implicate freedom of expression, but also a plethora of other fundamental rights, including the right to data protection and the right to an effective remedy. This holistic approach should underpin the forthcoming discussions on the DSA.


Valentina Golunova, PhD researcher at the Faculty of Law of Maastricht University (the Netherlands)

Juncal Montero Regules, PhD fellow of Research Foundation Flanders (FWO) at the Faculty of Law, Hasselt University (Belgium)
