
Using Terms and Conditions to apply Fundamental Rights to Content Moderation

Published online by Cambridge University Press:  11 July 2023

João Pedro Quintais*
Affiliation:
Institute for Information Law, University of Amsterdam, Amsterdam, Netherlands
Naomi Appelman
Affiliation:
Institute for Information Law, University of Amsterdam, Amsterdam, Netherlands
Ronan Ó Fathaigh
Affiliation:
Institute for Information Law, University of Amsterdam, Amsterdam, Netherlands
*Corresponding author: João Pedro Quintais; Email: [email protected]

Abstract

Large online platforms provide an unprecedented means for exercising freedom of expression online and wield enormous power over public participation in the online democratic space. However, it is increasingly clear that their systems, where (automated) content moderation decisions are taken based on a platformʼs terms and conditions (T&Cs), are fundamentally broken. Content moderation systems have been said to undermine freedom of expression, especially where important public interest speech ends up suppressed, such as speech by minority and marginalized groups. Indeed, these content moderation systems have been criticized for their overly vague rules of operation, inconsistent enforcement, and an overdependence on automation. Therefore, in order to better protect freedom of expression online, international human rights bodies and civil society organizations have argued that platforms “should incorporate directly” principles of fundamental rights law into their T&Cs. Under EU law, and apart from a rule in the Terrorist Content Regulation, platforms had until recently no explicit obligation to incorporate fundamental rights into their T&Cs. However, an important provision in the Digital Services Act (DSA) will change this. Crucially, Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights. In this article, we critically examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires platforms to apply EU fundamental rights law and to what extent this may curb the power of Big Tech over online speech. We conclude that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards. If this is possible, Article 14 may fulfil its revolutionary potential.

Type: Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the German Law Journal

A. Introduction

As the European Court of Human Rights (ECtHR) has emphasized, online platforms, such as Meta/Facebook, Twitter and YouTube, provide an “unprecedented” means for exercising freedom of expression online. Footnote 1 International human rights bodies have recognized the “enormous power” platforms wield over participation in the online “democratic space.” Footnote 2 However, it is increasingly clear that the systems operated by platforms, where automated content moderation decisions are taken based on a platformʼs terms of service, are “fundamentally broken.” Footnote 3 Content moderation systems are said to “undermine freedom of expression,” Footnote 4 especially where important public interest speech ends up being suppressed, such as speech by minority and marginalized groups, Footnote 5 black activist groups, Footnote 6 environmental activist groups, Footnote 7 and other activists. Footnote 8 Indeed, the UN Special Rapporteur on freedom of expression has criticized these content moderation systems for their overly vague rules of operation, inconsistent enforcement, and overdependence on automation, which can lead to over-blocking and pre-publication censorship. Footnote 9 This criticism is combined with, and amplified by, the notion that Big Tech exercises too much power over our online public sphere. Therefore, to better protect free expression online, the UN Special Rapporteur and civil society have argued platforms “should incorporate directly” principles of fundamental rights law into their terms and conditions (T&Cs). Footnote 10

Crucially, as the legal relationship between platforms and users is conceptualized as a private one, with the T&Cs as the contractual basis, in principle, EU fundamental rights are not directly applicable to this relationship. The European Convention on Human Rights (ECHR) and the Charter of Fundamental Rights of the EU (Charter) primarily govern the vertical relationship between states and citizens, not the horizontal relationship between private parties. Footnote 11 Moreover, the contractual nature of T&Cs implies that the law conceptualizes the relationship between intermediary service providers and users as an equal one. Service providers, as private companies, have the right to offer their services on their own terms and, for their part, users are free to enter into this contract, negotiate other terms, or find an alternative provider. Footnote 12 From this perspective, the provider’s ability to set its own terms of service can be seen as a direct extension of its fundamental right to property and freedom of contract. However, this legal abstraction of two equal contractual parties does not hold in practice. Footnote 13 The reason is that large providers, such as major social media platforms, have significantly more bargaining power than their individual users when determining how their services are used. Users, for their part, are mostly in a dependent position and must accept the “house rules” of providers, which ultimately shape how individuals participate in the online exchange of ideas and information. This inequality is a feature of the contractual relationship between users and platform providers. It has long been identified by researchers and, to some extent, is increasingly recognized in certain areas of EU law. Footnote 14

Apart from a rule in the Terrorist Content Regulation Footnote 15 and until recently, EU law did not impose on intermediary services in general, and online platforms in particular, an obligation to incorporate fundamental rights into their T&Cs. But an important provision in the Digital Services Act (DSA) Footnote 16 aims to change this. Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that they must have “due regard” to the “fundamental rights” of users under the EU Charter. Footnote 17 In this article, we examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires online platforms to apply EU fundamental rights law and to what extent it may curb the power of Big Tech over online speech or merely serve to legitimize their content moderation practices concerning the protection of human rights. In this inquiry, we also explore the opportunities and limitations of a rule that pushes platforms towards enforcement of fundamental rights in the application of their T&Cs.

The focus of our analysis is on EU law, including its interpretation by the Court of Justice of the EU (CJEU). Because the Charter is, under Article 52(3), internally connected to the ECHR and its interpretation by the ECtHR, these are included in the analysis. Footnote 18 In addition, we are primarily concerned with the role of the largest Big Tech platforms, what in the DSA terminology are called “very large online platforms” (VLOPs) Footnote 19 —in essence, large-scale user-upload platforms such as YouTube, Meta’s Facebook and Instagram, TikTok or Twitter—and how their T&Cs shape their content moderation activities. Footnote 20 The article proceeds as follows. After this introduction, Part B provides the necessary context on the inherent tension regarding the desirable role and responsibility of platforms, the emergent landscape of platform regulation, and the lack of regulation on platforms’ T&Cs. Part C examines the mechanics of Article 14 DSA, both as regards information obligations, and application and enforcement of fundamental rights. Based on the previous analysis, Part D identifies three elements for critical discussion: Possible horizontal application of fundamental rights and how this relates to differences in application in EU Member States; compliance and enforcement of Article 14 DSA; and the potential limitations arising from overreliance on fundamental rights in this area. Part E concludes that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to take due regard of fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, along the lines suggested in our analysis.

B. The Emergence of EU Platform Regulation and Its Limited Regulation of Terms and Conditions

I. The Case for Platform Regulation and Its Perils

For the most part, EU law leaves a broad margin of discretion for platforms to set and enforce their content moderation policies through a combination of their T&Cs and human and algorithmic practices. Footnote 21 This freedom and the power it bestows on platforms have given rise to pointed criticism. Footnote 22 As a result, calls to rein in the power of Big Tech over online speech have increased over time from civil society, academia and even governments. Footnote 23 While in the first few years of their existence, platforms were seen as the champions of free speech vis-à-vis government intervention and censorship, their public image and perception has shifted significantly. Footnote 24 In this age of the “techlash,” Footnote 25 Big Tech platforms and their content moderation practices are increasingly viewed as a potential threat to free expression and a healthy public debate. Footnote 26 This criticism includes a range of concrete and often contradictory charges. Some critics chastise platforms for intervening too much in the content they host, leading to over-removal and suppression of lawful content, for example, from vulnerable groups or specific political orientations. Others criticize platforms for doing too little to address harmful or outright illegal conduct on their services, such as the spread of disinformation or hate speech. Footnote 27 In addition, because these platforms are private companies, their actions—including on content moderation—are primarily driven by commercial or business interests and values. Footnote 28 Such actions are rarely, and perhaps only accidentally, aligned with public values, even if most Big Tech platforms explicitly adopt the language of public values or even fundamental rights. Footnote 29

However, from a policy perspective, there is uncertainty on the best approach to tackle the power of Big Tech. On the one hand, there is a growing realization that the outsized role and influence of certain platforms on online speech has negative consequences for the polity and public interest. From this perspective, the regulatory push towards an increased role and responsibility for intermediaries is understandable. As the primary mediators of online content, intermediary service providers like platforms are in a better position to control the flow of content as compared to governments and law enforcement agencies. Increasingly, the EU has been relying on this position of control by transferring responsibility for the ex-ante and ex-post filtering, blocking and removal of specific types of illegal content directly to intermediaries. Footnote 30 Many of the EU and national level instruments we mention below denote features of this approach, including obligations to impose proactive preventive measures, such as filtering and notice-and-stay-down mechanisms; reactive removal obligations, such as notice-and-takedown mechanisms, some of which in time frames as short as 24 hours; injunctions against non-liable intermediaries for third-party unlawful content on their services, such as website or DNS blocking, account termination or content removal; and extensive information obligations on potential infringers, to name just a few. Footnote 31 In addition, it is also possible to identify at national level several recent laws, legislative proposals and national court judgments advancing rules and interpretations that range from “must-carry”-like obligations—imposing a duty on platforms to host certain types of legal content, sometimes from specific sources like politicians—to more nuanced rulings where in specific circumstances, certain platforms can be obligated to reinstate accounts and specific content. Footnote 32

On the other hand, there is a countervailing concern that the wrong type of regulatory intervention might cause more harm than good. Therefore, even if understandable, this “enhanced responsibility” turn for intermediaries has been subject to sharp criticism. From a constitutional legitimacy perspective, it has been argued that private entities should not be conducting government tasks. Footnote 33 Concretely, there are concerns about the effect such regulatory interventions may have inter alia on the right to freedom of expression of platform users. A crucial concern is that this new wave of platform-regulation rules results in a form of undesirable privatized enforcement that poses serious risks of over-removal of legal content, putting undue restrictions on people’s ability to express themselves online, which, as described by the ECtHR, can have a “chilling effect” on freedom of expression. Footnote 34 This claim was made at the very conception of this type of regulation in the 1990s and has gained force with the increased vertical EU regulation and national enforcement acts, as described below.

II. The Emergence of EU ‘Platform Regulation’ and the Concept of ‘Enhanced Responsibility’

The above concerns and criticisms have led to policy and legislative action to rein in the power of platforms through regulation of illegal and harmful content online. In Europe, this is occurring both at the EU and national levels. For most of the early 21st century, EU law on intermediaries was sparse, with no comprehensive harmonization of intermediary liability. The centerpiece of the legal framework was the 2000 e-Commerce Directive, which contains only conditional liability exemptions, or “safe harbors,” for certain types of intermediary services involving claims for damages: mere conduit or access, caching, and hosting. Footnote 35 The Directive further prohibits Member States from imposing general monitoring obligations on intermediary service providers. Footnote 36 Under this regime, intermediaries may still be required to take measures against the infringement of third-party rights, because it remains possible to subject intermediaries to injunctions, for instance as regards intellectual property rights, Footnote 37 and duties of care. Footnote 38

The interpretation of this constellation of provisions is complex and far from settled. Footnote 39 The most relevant CJEU case law for online platforms—as types of hosting service providers—refers to Article 14 e-Commerce Directive. The CJEU has applied Article 14 to a search engine’s advertising service (Google France/Louis Vuitton), an online sales platform (L’Oréal/eBay), social networking platforms (Netlog; Glawischnig-Piesczek) and video-sharing platforms (YouTube and Cyando). Footnote 40 In its case law, the CJEU has noted that safe harbors require a sufficient degree of “neutrality” from the intermediary. This approach has created a grey area for the qualification of certain online platforms as “neutral” or “passive” compared to “active” intermediaries for the purposes of the hosting safe harbor. Footnote 41 Generally, when a provider plays “an active role of such a kind as to give it knowledge of, or control over, the data stored,” it falls outside the scope of safe-harbor protection. Footnote 42 However, the determination of what constitutes such an active role in practice might depend on the type of platform at issue, the content it hosts, and several factors that qualify its conduct or knowledge vis-à-vis the illegal content it hosts.Footnote 43

A further aspect of the legal framework is controversial. Article 15 e-Commerce Directive—supported by Recital 47—requires that a distinction be made between prohibited “general” monitoring obligations and permitted obligations to monitor in “specific” cases. The Court has attempted to draw lines to clarify this distinction, including in L’Oréal/eBay, Scarlet Extended, Netlog, Eva Glawischnig-Piesczek, YouTube and Cyando, and Poland v. Commission and Council. Footnote 44 However, there remains significant legal uncertainty as to what would constitute specific monitoring for distinct types of illegal content in different scenarios.

Against this background, and alongside the development of CJEU case law interpreting specific subject-matter rules to extend the reach of harmonized EU law to online intermediaries, as in the context of intellectual property, there has been an increasing push towards additional regulation of online platforms. This push has been justified around a somewhat blurry concept of legal, societal, political and even moral “responsibility” of online platforms. Footnote 45 The potential result, as noted by Frosio and Husovec, could “represent a substantial shift in intermediary liability theory,” signaling a “move away from a well-established utilitarian approach toward a moral approach by rejecting negligence-based intermediary liability arrangements,” practically leading to a “broader move towards private enforcement online.” Footnote 46

At the policy level, this push is visible in Commission instruments dating back to 2016. Starting timidly in its 2016 Communication on “Online Platforms in the Digital Single Market,” the Commission brings up the need for platforms to act responsibly, but largely endorses self- and co-regulation as the most flexible and future-proof approaches to address the increasing role and importance of platforms in the online ecosystem. Footnote 47 The Commissionʼs approach becomes sharper in its “Tackling Illegal Content Online” policy agenda, first with a Communication in 2017 and then with a Recommendation in 2018. Footnote 48

In line with the conceptual shift noted above, the Communication is titled “Towards an Enhanced Responsibility for Online Platforms.” It provides a “set of guidelines and principles for online platforms to step up the fight against illegal content online in cooperation with national authorities, Member States and other relevant stakeholders.” Footnote 49 Its objectives are to improve a range of content moderation actions by platforms in the notice-and-action spectrum, transparency rules, protection of fundamental rights online, as well as to clarify the liability regime of platforms under the e-Commerce Directive. Footnote 50 Among the structural proposals by the Commission is the endorsement of voluntary proactive measures by platforms to detect and remove illegal content online as well as notice-and-stay-down mechanisms, the deployment of which by platforms is considered not to lead to the loss of the hosting liability exemption. Footnote 51 This is said to be particularly the case when proactive measures “are taken as part of the application of the terms of services of the online platform,” Footnote 52 presumably because this would reinforce their voluntary status. As regards T&Cs, the remaining references in the Communication amount to an endorsement of their use to impose information and transparency obligations on platforms regarding content moderation practices. Footnote 53

This approach sets the tone for the subsequent Recommendation, including some of its most problematic aspects from a freedom of expression perspective. In particular, the instrument offers specific recommendations relating to terrorist content, including demanding proactive measures. Footnote 54 For instance, it recommends that if hosting providers remove or disable access to content identified in government agencies’ referrals, they should do so “as a general rule, within one hour from the moment at which they received the referral.” Footnote 55 In this context, it is recommended that hosting service providers “expressly set out in their terms of service that they will not store terrorist content.” Footnote 56 In addition, as regards T&Cs, the Recommendation echoes the Communication’s call for ex ante transparency in such documents regarding platforms’ content moderation practices while retaining their freedom to “set and enforce their terms of service in accordance with Union law and the laws of the Member States.” Footnote 57

At this point, it should be noted that the policy agenda for tackling illegal content online covers different issues, such as “incitement to terrorism, illegal hate speech, child sexual abuse material, infringements of intellectual property rights and consumer protection.” Footnote 58 These topics are dealt with in a combination of binding and non-binding instruments, which reflect the policy shift towards the “enhanced responsibility” of platforms, characterized by changes to the liability framework incentivizing, and in some cases imposing, proactive measures, additional obligations on platforms, for example as regards transparency, and procedural ex post safeguards, such as complaint and redress mechanisms.

Among the noteworthy binding instruments that manifest this new approach are the 2021 Terrorist Content Regulation, the 2018 revision of the Audiovisual Media Services Directive (AVMSD), which extended rules and obligations to video-sharing platforms and social media services, Footnote 59 the new liability regime for online content-sharing platforms in Article 17 of the Copyright in the Digital Single Market Directive (CDSMD), Footnote 60 and, most relevant among them, the 2022 DSA. Footnote 61 By moving the “responsibility of intermediaries away from the area of liability and deeper into the realm of regulation,” the DSA becomes the exponent of EU lawʼs turn from intermediary liability towards “platform regulation.” Footnote 62 As for the non-binding instruments, which manifest a co- or self-regulatory approach, they include the EU Code of Conduct on countering illegal hate speech online, the Memorandum of Understanding on the sale of counterfeit goods on the internet, and the strengthened Code of Practice on Disinformation. Footnote 63

Finally, it should be noted that neither national legislators nor courts have been quiet in this regard. Notable enacted or proposed platform-regulation instruments include the German NetzDG as amended, Footnote 64 Austria’s “Anti-Hate Speech Law,” Footnote 65 and France’s “Avia Law” Footnote 66 —the first iteration of which saw its main provisions struck down by the French Constitutional Council. Footnote 67 It remains to be seen to what extent parts of these national initiatives will be preempted by the DSA. Still, from our perspective, the most relevant developments at national level on the intersection of platform regulation, T&Cs and fundamental rights are found in national court decisions, which we further examine below. Footnote 68

III. The Limited Rules on T&Cs and Fundamental Rights in EU Platform Regulation

Until the DSA, there was relatively little explicit regulation of platforms’ T&Cs, limited to scattered rules in instruments such as the AVMSD, the CDSMD and the Terrorist Content Regulation. Footnote 69 The consequence is that platforms enjoy significant discretion and power in determining their content moderation practices vis-à-vis users in their “house rules.”

The revised AVMSD imposes on “video-sharing platforms,” a type of online platform, Footnote 70 a set of “appropriate measures” to protect minors and the general public from certain types of harmful and illegal content. Footnote 71 These measures consist inter alia of including in the T&Cs of the platform services, and applying, certain requirements set out in the Directive. Footnote 72 The CDSMD establishes a new liability regime for “online content-sharing service providers,” a type of online platform hosting copyright-protected content. Footnote 73 The new rules impose direct liability on these providers for user-uploaded copyright-protected content they host, set aside the application of the hosting safe-harbor in the e-Commerce Directive, and mandate best efforts obligations on providers to deploy proactive and reactive preventive measures to detect and remove copyright-infringing content. Footnote 74 Among the safeguards recognized in this regime, these providers are required to inform their users in their T&Cs that they can use protected content under exceptions to copyright. Footnote 75 Crucially, this includes user-uploaded content that qualifies as “quotation, criticism, review” or “use for the purpose of caricature, parody or pastiche,” recognized by the CJEU as freedom of expression-based “user rights.” Footnote 76

Finally, the Terrorist Content Regulation lays down uniform rules to address the misuse of hosting services for the dissemination to the public of terrorist content online. Footnote 77 Building on the approach of the 2018 Recommendation on Tackling Illegal Content Online, this Regulation imposes on providers of hosting services a number of obligations regarding measures to address the dissemination of terrorist content online, including being subject to orders to remove or disable access to this type of content in all Member States. Footnote 78 This Regulation defines T&Cs as “all terms, conditions and clauses, irrespective of their name or form, which govern the contractual relationship between a hosting service provider and its users.” Footnote 79 Article 5(1) is the clear inspiration for Article 14 DSA. The provision states that hosting service providers exposed to terrorist content “shall, where applicable, include in [their T&Cs] and apply provisions to address the misuse of its services for the dissemination to the public of terrorist content.” Footnote 80 Furthermore, these obligations shall be carried out “in a diligent, proportionate and non-discriminatory manner, with due regard, in all circumstances, to the fundamental rights of the users and taking into account, in particular, the fundamental importance of the freedom of expression and information in an open and democratic society, with a view to avoiding the removal of material which is not terrorist content.” Footnote 81 Finally, T&Cs are also relevant in the context of safeguards, in particular as regards transparency obligations for hosting service providers. These providers must spell out clearly in their T&Cs their policy for addressing the dissemination of terrorist content, including an explanation of the functioning of specific measures, such as the use of automated tools. Footnote 82 To the best of our knowledge, this obligation is underexplored in theory and practice, with little research or evidence on how platforms have implemented it.

It is against this background, where the push towards additional platform regulation clashes with concerns over its effect on users’ freedom of expression, that any proposals to regulate platforms’ T&Cs must be considered. Article 14 DSA is meant to fill an important gap in addressing the wide margin of discretion enjoyed by intermediary service providers in their contractual relationship with users, which governs the human and automated moderation of illegal and harmful content online. The provision is particularly significant due to the horizontal nature of the DSA in EU law. The DSA lays down harmonized rules on the provision of intermediary services in the internal market and will apply to intermediary services provided to recipients of the service—including individual users—that have their place of establishment or are located in the EU. Footnote 83 As such, rules on T&Cs in the DSA will govern the contractual relationship between service providers and a “recipient of the service,” including individual users, Footnote 84 in the EU, directly regulating the conditions under which intermediaries can offer their services. When applied to Big Tech platforms, Article 14 DSA therefore has the potential to shape the provision of what some authors consider to be their core service: content moderation. Footnote 85

C. Mechanics of Article 14 DSA

The DSA was adopted on October 19, 2022 and published in the Official Journal of the EU, Footnote 86 and is a regulation that is divided into five chapters. Chapter II sets out the regime for the liability of intermediary service providers. This consists of a revised version of the liability exemptions—for services of “mere conduit,” “caching” and hosting—and the prohibition on general monitoring obligations previously contained in Articles 12 to 15 e-Commerce Directive, with some noteworthy additions in the form of a quasi “Good Samaritan” clause and rules on orders or injunctions. Footnote 87 Chapter III deals with due diligence obligations that are independent of the liability assessment made under the previous chapter, and which constitute a novelty in relation to the e-Commerce Directive. Footnote 88 It distinguishes between specific categories of providers, by setting out asymmetric obligations that apply in a tiered way to all providers of intermediary services, hosting providers, online platforms, very large online platforms (VLOPs) and very large online search engines (VLOSEs). VLOPs are online platforms that reach a number of average monthly active recipients of the service in the EU equal to or greater than 45 million—a number that represents about 10% of the EU population—and which are so designated pursuant to a specific procedure. Footnote 89 In practice, only the largest user-upload Big Tech platforms operating in the EU, like YouTube, Facebook, Instagram or TikTok, will likely qualify as VLOPs. Footnote 90 Hosting providers are a type of provider of intermediary services, online platforms a type of hosting provider, and VLOPs a type of online platform. The due diligence obligations are cumulative. Consequently, providers of intermediary services are subject to the fewest obligations and VLOPs and VLOSEs are subject to the most obligations. Crucially, all providers are subject to Article 14 DSA.

Article 14 is titled “Terms and conditions,” a concept that is defined as “all terms and conditions or clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.” Footnote 91 The rationale behind Article 14 is that while the freedom of contract of intermediaries is the general rule, “it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.” Footnote 92 As such, Article 14’s aim is to increase the transparency of T&Cs and bring their enforcement into direct relation with fundamental rights.

A crucial feature of Article 14 is that it not only applies to illegal content, like Chapter II, but also covers content contrary to the T&Cs of a provider of intermediary services. In doing so, and because it applies to all providers of intermediary services, the provision extends the obligations of Chapter III beyond illegal content. Consequently, the DSA covers a much broader scope of content moderation decisions than the e-Commerce Directive.

Article 14’s aims of transparency and enforcement are dealt with in two distinct sets of obligations. One set of obligations deals with information and transparency obligations. Footnote 93 The other deals with application and enforcement and, arguably, brings providers’ T&Cs within the scope of EU fundamental rights. Footnote 94 We discuss each set of obligations in turn, with the emphasis on the fundamental rights provision, which is central to our subsequent analysis.

I. Information and Transparency Obligations

Article 14(1) sets out a broad information obligation for providers of intermediary services regarding certain content moderation practices outlined in their T&Cs. It aims to ensure that the T&Cs are transparent and clear as to how, when and on what basis user-generated content can be restricted. The object of the obligation appears to be acts of content moderation by providers that impose “any restrictions” on recipients of the service. But it is unclear whether content moderation actions by the provider that do not strictly restrict what content users can post, such as ranking, recommending or demonetizing content, are within the scope of Article 14.

The second sentence of paragraph (1) explicitly refers to “content moderation,” a concept that covers activities undertaken by providers to detect, identify and address user-generated content that is either (i) “illegal content” Footnote 95 or (ii) incompatible with their T&Cs. Footnote 96 Further, the provision explicitly mentions “algorithmic decision-making,” raising the important question of what providing information on “any policies, procedures, measures and tools” might look like. Footnote 97 However, the exact scope of the paragraph remains unclear, as the phrasing in the first sentence of “any restrictions” appears wider than the definition of content moderation in the DSA, thereby likely broadening the provision’s scope.

In its last sentence, Article 14(1) sets out how this information should be conveyed. Echoing similar obligations in the General Data Protection Regulation (GDPR), Footnote 98 the T&Cs should be “clear.” However, where the GDPR refers to “clear and plain” language, Article 14(1) DSA goes one step further by requiring “intelligible, user friendly and unambiguous language,” which appears to result in a higher threshold obligation. Footnote 99 The same sentence further adds the requirement that the information at issue should be publicly available in an easily accessible and machine-readable format.

Building on this general clause, paragraphs (2), (3), (5) and (6) add more specific information and transparency obligations. First, inspired by an amendment proposed by the IMCO Committee, paragraph (2) requires that providers inform the recipients of the service of any significant change to the T&Cs. Footnote 100 The obligation is consequential because providers’ T&Cs change frequently, making it difficult for users to determine which obligations they are subject to over a period of time, a phenomenon that has been characterized and criticized as the “complexification and opacification” of platforms’ rules. Footnote 101 Second, paragraph (3) states that services (i) primarily directed at minors or (ii) predominantly used by them must explain their conditions and restrictions of use in a way that minors can understand. Footnote 102 This obligation reflects the fact that “the protection of minors is an important policy objective of the Union,” Footnote 103 and tracks other DSA provisions aiming to ensure additional protection to minors, such as those on transparency in recommender systems, negative effects on minors as a systemic risk, and mitigation measures. Footnote 104 Third, paragraphs (5) and (6) impose on VLOPs and VLOSEs additional obligations to: provide recipients of a service with a “concise, easily accessible and machine-readable summary” of their T&Cs, which must include “the available remedies and redress mechanisms, in clear and unambiguous language;” Footnote 105 and publish their T&Cs “in the official languages of all Member States in which they offer their services.” Footnote 106 These providers are encumbered with more onerous obligations due to their “special role and reach,” whereas smaller providers are spared so as to “avoid disproportionate burdens.” Footnote 107

Finally, some DSA provisions outside Article 14 can be viewed as complementing its transparency and information obligations. For instance, Article 27(1) sets out a somewhat similar, although less detailed, information obligation for all online platforms vis-à-vis recipients of the service to set out in plain and intelligible language the main parameters used in their recommender systems, as well as any options to modify or influence those parameters. In another illustration, Article 44(1)(b) states that the Commission shall support the development and implementation of voluntary standards set by relevant European and international standardization bodies inter alia for “templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto.” Footnote 108

II. Fundamental Rights Application and Enforcement

From a fundamental rights perspective, the exciting part of Article 14 is paragraph (4), which regulates the application and enforcement of T&Cs: Footnote 109

Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

The scope of the provision is the same as paragraph (1): It only applies to the enforcement of T&Cs that restrict user-generated content. The core obligation is directed at the providers to weigh the “rights and legitimate interests of all parties involved” in a “diligent, objective and proportionate” manner when applying their T&Cs. Special emphasis is placed on the fundamental rights of recipients of the service, with an explicit mention of their right to freedom of expression, including freedom and pluralism of the media.

Some guidance is provided in the recitals. Footnote 110 First, it is stated that the obligation applies not just to the application and enforcement of restrictions, but also to their design. Second, although Article 14(4) mentions the rights and interests of “all parties,” the recitals refer exclusively to the fundamental rights of the “recipients of the service.” Third, when complying with this obligation, providers should act not only in a “diligent, objective and proportionate manner” but also in a “non-arbitrary and non-discriminatory manner.” Fourth, it is emphasized that VLOPs should pay particular regard to “freedom of expression and of information, including media freedom and pluralism.” Fifth, a clear connection is made between the provision and the need for all providers to “pay due regard to relevant international standards for the protection of human rights, such as the United Nations Guiding Principles on Business and Human Rights.” Footnote 111 We return to these points below. Footnote 112

In addition, some provisions in the DSA directly or indirectly refer to Article 14(4) and the application and enforcement of T&Cs in light of fundamental rights. Footnote 113 One instance is found in the “risk assessment” regime for VLOPs. Footnote 114 Under this regime, VLOPs shall “diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.” Footnote 115 To do so, they must carry out yearly risk assessments, including vis-à-vis specific categories of systemic risks, such as the dissemination of illegal content through their services, and any actual or foreseeable negative effects for the exercise of fundamental rights. Footnote 116 Crucially, VLOPs must focus not only on illegal content but also on information that is legal but contributes to the systemic risks identified in the DSA, like disinformation. Footnote 117 When carrying out such assessments, VLOPs are required to take into account whether and how certain factors influence the specified systemic risks, including the appropriateness of the applicable T&Cs and their enforcement. Footnote 118 If such a risk is identified, VLOPs must then “put in place reasonable, proportionate and effective mitigation measures” tailored to the systemic risk identified, “with particular consideration to the impacts of such measures on fundamental rights.” In this case, such a measure would entail the adaptation of a VLOP’s T&Cs and their enforcement, in line with Article 14 and the remaining provisions on T&Cs. Footnote 119 Similar mitigation measures may be applicable in the context of the special regime for crisis protocols. Footnote 120

D. The Application and Enforcement of Fundamental Rights through Terms and Conditions in the Digital Services Act

Building on the analysis above, this section identifies three crucial issues related to the application and enforcement of fundamental rights through T&Cs under Article 14 DSA. First, we explore whether Article 14(4) leads to the horizontal application of fundamental rights in platforms’ T&Cs, considering ECtHR and CJEU case law, as well as some recent national judgments on the topic (D.I.). Second, we explore whether and how it is possible to operationalize the vague wording of Article 14 through other more concrete provisions in the DSA (D.II.). Third, we reflect on the limitations of applying fundamental rights in T&Cs for addressing the unequal contractual relationship between platforms and users, drawing from external critiques of the law to highlight the risks of overreliance on fundamental rights in this context (D.III.). Guiding our analysis is the overarching question of whether Article 14, and in particular applying EU fundamental rights to the enforcement of platforms’ T&Cs, constitutes a suitable mechanism to rebalance the unequal position between platforms and their users and effectively curb the power of Big Tech.

I. Indirect Horizontal Effect of Fundamental Rights through Terms and Conditions

The first crucial issue we examine refers to the implications of interpreting Article 14(4) as imposing an obligation on intermediary service providers—in particular online platforms—to apply and enforce restrictions on content under their T&Cs “with due regard” to the rights of all parties involved, including the fundamental rights of recipients of the service “as enshrined” in the EU Charter. Footnote 121 To answer this question, we first highlight relevant human rights standards under the ECHR and Charter (D.I.1), followed by a reference to select national decisions in Germany and the Netherlands (D.I.2), and conclude with a reflection on the potential horizontal effect of Article 14(4) DSA (D.I.3).

1. The ECHR and the Charter

The first point to note is that such a reading of Article 14 would be perfectly consistent with influential recommendations from international and European human rights bodies. As noted, under international human rights standards, the UN Special Rapporteur on freedom of expression has explicitly stated that platforms should “incorporate directly” principles of fundamental rights law into their T&Cs. Footnote 122 This is to ensure that content moderation is guided by the “same standards of legality, necessity and legitimacy that bind State regulation of expression.” Footnote 123 The Council of Europe’s Committee of Ministers and international human rights organizations have made similar recommendations. Footnote 124

Notably, the incorporation and application of fundamental rights is not envisaged as merely conducting box-ticking human rights impact assessments of a platform’s T&Cs. Rather, it would include the actual application of fundamental rights principles to content moderation decisions, such as the principle of proportionality, where any restriction of content should align with the “least restrictive means” principle and be “limited in scope and duration.” Footnote 125 As such, Article 14(4) could be read as translating such human rights recommendations into a legal obligation on platforms to apply fundamental rights law under their T&Cs not just through general obligations, like risk assessments, but also in concrete decisions.

For instance, Recital 47 DSA makes specific reference to human rights standards, namely the UN Guiding Principles on Business and Human Rights. Footnote 126 This framework includes a set of foundational and operational principles on corporate social responsibility to respect human rights that would be relevant in this context. For this purpose, business enterprises should inter alia have in place certain policies and processes, such as: Embed their responsibility to respect human rights in clear policy statements; carry out human rights due diligence processes and risk assessments for actual or potential adverse human rights impacts from their activities; deploy and monitor the effectiveness of mitigation measures to address the risks identified; put in place transparency reporting obligations regarding such assessments and measures; and provide for or cooperate in the remediation of adverse impacts through legitimate processes. Footnote 127 Importantly, the application and development of these principles and the “Protect, Respect and Remedy” Framework to online intermediaries—including Big Tech platforms—has been endorsed by the Council of Europe’s (CoE) Committee of Ministers, and recommended to member states and online platforms. Footnote 128 Interestingly, it is possible to find echoes of these principles not only in Article 14 DSA but also in the risk assessment and mitigation measures referring to it in the DSA.

First, these standard-setting instruments provide further helpful guidance on how Article 14(4) DSA could be operationalized by platforms. For example, the aforementioned CoE Committee of Ministers has issued detailed guidance for platforms on the implications of applying fundamental rights, including ensuring that all staff who are “engaged in content moderation” should be given “adequate” training on the applicable “international human rights standards, their relationship with the intermediaries’ terms of service and their internal standards, as well as on the action to be taken in case of conflict.” Footnote 129 This is a crucial point for ensuring platforms apply fundamental rights in content moderation decisions. While the wording of Article 14 might not suggest an obligation for platforms to ensure adequate training and an alignment between their public-facing T&Cs and internal policy for staff, it is difficult to see how Article 14 could function without these requirements.

Second, the question arises as to which EU fundamental rights law a platform would apply, and which specific principles would be applied. The fundamental right most readily engaged through restrictions on user content under a platform’s T&Cs would be the right to freedom of expression, guaranteed in Article 11 Charter. However, content moderation decisions may also affect different rights, such as where one user’s post implicates another user’s right to private life, guaranteed under Article 7 Charter, or another user’s right to protection of personal data, under Article 8 Charter, or even the right to non-discrimination under Article 21 Charter. And finally, it must be recognized that platforms themselves also enjoy fundamental rights under the Charter, including the right to freedom of expression, Footnote 130 the right to property (Article 17) and the freedom to conduct a business (Article 16). As discussed below, these rights and freedoms play a part in national case law. Footnote 131

As such, when a platform makes a content moderation decision about a user post it may have to engage in a complex fundamental rights balancing exercise. It will be required to have due regard to a user’s right to freedom of expression, but may also be required to balance that user’s freedom of expression with another user’s rights, or indeed, the platform’s own rights. However, the wording of the rights guaranteed in the EU Charter is broad and open-ended, and gives little concrete guidance to platforms. Therefore, in order to apply these rights, platforms must consider the relevant CJEU case law, as well as the considerable ECtHR case law on freedom of expression online. This is because in an important 2019 judgment, the CJEU expressly confirmed that Article 11 Charter should be given “the same meaning and the same scope” as Article 10 ECHR, “as interpreted by the case-law of the European Court of Human Rights.” Footnote 132

Although this is no easy task, the significant body of ECtHR case law within the remit of Article 14(4) DSA could be beneficial from the standpoint of fundamental rights protection. The reason is that while the CJEU has some case law on freedom of expression, the ECtHR has delivered many—and more detailed—judgments concerning online platforms and content posted by users, which would provide concrete guidance for the application of Article 14(4) DSA. For example, the ECtHR has delivered important judgments on relevant issues for content uploaded to platforms, with prominent examples including judgments on whether posts on Instagram as part of the #metoo movement, describing a public figure as a “rapist bastard,” were protected by freedom of expression; Footnote 133 whether posts on Facebook should be considered terrorist propaganda; Footnote 134 whether religious videos uploaded to YouTube constituted hate speech; Footnote 135 whether uploading Pussy Riot videos to YouTube constituted “extremist” content; Footnote 136 whether posting intimate images of a person to social media without consent violated the right to private life; Footnote 137 or whether posting hostile remarks about the police was extremist content. Footnote 138 Importantly, some of these judgments have involved not only freedom of expression, but also how to balance this right with the rights of others, such as a person’s right to private life under Article 8 ECHR.

In this regard, it is noteworthy that the ECtHR has held that when balancing rights under Article 8 and Article 10, both rights enjoy “equal protection” under the ECHR, and the outcome of the balancing exercise should not vary depending upon whether it is the person asserting their Article 10 rights or the person targeted asserting their Article 8 rights, because these rights deserve, “in principle, equal respect.” Footnote 139 As such, although the requirement in Article 14(4) DSA for platforms to balance the rights and interests of “all parties” is consistent with ECtHR case law, we are less certain about the supporting Recitals’ exclusive reference to the fundamental rights of the “recipients of the service.” That is to say, to consider that platforms should take into account only a sub-set of fundamental rights in their balancing exercise is questionable, because the fundamental rights of non-recipients of a service must also be taken into consideration as per ECtHR case law. Footnote 140 The upshot is likely to be an even more complex balancing exercise for intermediary service providers.

Of course, the judgments mentioned above involve the application of fundamental rights in the “vertical” relationship between individuals and the State. They are judgments on platform users initiating applications to the ECtHR over government interference with freedom of expression, such as a prosecution for incitement to hatred over a YouTube video. Crucially, however, there is no case law from the ECtHR or CJEU concerning a user initiating legal proceedings against a platform for removing content. Still, the important principles contained in the ECtHR judgments can be applied by platforms, particularly on what constitutes incitement to violence, incitement to hatred, terrorist propaganda, or extremist material, and the kind of weight to attach to certain political content, such as expression on “matters of public interest,” which enjoys the highest protection under Article 10 ECHR. Footnote 141

Thus, for example, platforms should ensure that content rules set out in their T&Cs satisfy the requirements of Article 10 ECHR, namely that restrictions are “accessible” to users and “foreseeable” as to their effects, and “formulated with sufficient precision” to enable users to “foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail and to regulate their conduct.” Footnote 142 In this regard, the UN Special Rapporteur on freedom of expression has stated that Facebook’s “coordinated inauthentic behavior” policies can have an “adverse effect” on freedom of expression due to their “lack of clarity,” and can have “adverse effects” on legitimate campaigning and activist activities online. Footnote 143 Notably, the ECtHR has also elaborated upon the principle that Article 10 incorporates important procedural safeguards beyond mere ex ante transparency and information obligations. These include that users should have “advance notification” before content is blocked or removed, know the grounds for content being blocked, and have a forum to challenge a restriction on expression. Footnote 144 Further, content moderation decisions should “pursue a legitimate aim” and be “necessary in a democratic society,” meaning that any measure taken must be “proportionate.” Footnote 145

Indeed, one platform, Meta, the owner of Facebook and Instagram, which both qualify as VLOPs under the DSA, is already attempting to put in place mechanisms to engage in content moderation through the application of international human rights law, similar to the application of ECtHR case law mentioned above. Footnote 146 In this regard, Meta’s Oversight Board’s decisions are particularly instructive. Footnote 147 For example, the Oversight Board recently examined Meta’s decision to remove a post by a Swedish user describing incidents of sexual violence against two minors, for violating Meta’s T&Cs on child sexual exploitation, abuse and nudity. Footnote 148 In overturning Meta’s decision to remove the post, the Oversight Board applied principles developed by UN human rights bodies under Article 19 of the International Covenant on Civil and Political Rights (ICCPR), Footnote 149 which guarantees the right to freedom of expression, including the framework of (a) legality; (b) legitimate aim; and (c) necessity and proportionality. The Board found that Meta failed to define key terms, such as “sexualization of a minor,” and concluded that the post was “not showing a minor in a ‘sexualised context,’” as the “broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.” Footnote 150 As such, removing content “discussing sex crimes against minors, an issue of public interest and a subject of public debate, does not constitute the least intrusive instrument.” Footnote 151

Thus, the Meta Oversight Board’s approach of applying international human rights standards is instructive for how Article 14(4) DSA could be applied to platforms’ content moderation decisions. Of course, one major limitation of Meta’s Oversight Board is that—at the time of writing—the Board “receives thousands of appeals per week,” but “only selects a small number of cases for review;” as such, the application of international fundamental rights is only occurring on appeal following Meta’s removal of content. Footnote 152 That is to say, the scope of competence of the Oversight Board only covers a limited portion of what under the DSA would qualify as restrictions imposed on the users of platforms’ services, such as in the context of broadly defined content moderation activities. As such, Article 14(4) DSA would obligate the application of fundamental rights during the initial content moderation decision.

What seems clear from the analysis so far is that certain international human rights standards may be partly met by the type of information and transparency obligations that Article 14 DSA imposes on platforms. Footnote 153 Still, the more substantive requirements stemming from these standards would necessitate a concrete implementation of Article 14(4) in the content moderation practices of platforms.

2. National Courts

National courts in EU member states are also seeking to apply fundamental rights in the relationship between online platforms and users. Dutch and German courts in particular have delivered several notable judgments on this issue in the context of “put-back requests,” in which users ask platforms to reinstate their content or accounts. Footnote 154 As Article 14(4) DSA targets the T&Cs, the provision will ultimately have its impact in civil litigation based on tort or contract law, both of which fall within national competence.

In the short span of little more than a year, several Dutch district courts issued at least six judgments on requests by public figures to reinstate content that had been removed by Facebook (Meta), LinkedIn or YouTube for violating their T&Cs. Footnote 155 Notably, in all six cases the claims were explicitly based on a direct or indirect application of the right to freedom of expression under Article 10 ECHR, an admissible cause of action under Dutch law. Footnote 156 The cases all concern violations of the providers’ T&Cs through Covid-19-related disinformation, and the courts recognize the importance of large social media platforms for online public debate and for people’s ability to express themselves as a factor in their reasoning. The judgments emphasized that in modern society these platforms are “of great importance to convey a message” Footnote 157 and are, “to a certain extent, entrusted with the care for public interests.” Footnote 158 Nevertheless, in all cases this reach and role of the large social media platforms were deemed insufficient grounds to require them to carry or reinstate content based on the right to freedom of expression.

Despite the novelty of the topic, it is possible to identify a shared approach in this case law, whereby the courts are cautious and reserved in their application of fundamental rights. A direct horizontal application of the right to freedom of expression is rejected and, in the indirect horizontal application, the fair balance between the different fundamental rights is only marginally reviewed, setting a high bar for a reinstatement request on these grounds. Footnote 159 This indirect horizontal effect is constructed via open norms governing the private contractual relationship between platform and users. Footnote 160 The fundamental rights of the user, such as freedom of expression, are weighed against the right to property of the platform, of which the T&Cs are a manifestation. Footnote 161 Subsequently, the Dutch courts rely on ECtHR case law to infer that there is only space for a marginal review of the fair balance between these conflicting fundamental rights. Footnote 162 Only if the restrictions in a platform’s T&Cs can be said to make “any effective exercise” of the right impossible, such that “the essence of the right has been destroyed,” are the courts required to grant a request for reinstatement. Footnote 163 Notably, this case law has been criticized as too conservative in its application of fundamental rights. Footnote 164

The German case law differs from the Dutch not only because of the particular rules of Dutch constitutional review but also, in large part, because of Germanyʼs unique constitutional tradition of indirect horizontal application (mittelbare Drittwirkung) of fundamental rights. Footnote 165 Analysis of earlier case law involving “put-back requests” on, inter alia, social media platforms like Facebook and Twitter has led Kettemann and Tiedeke to conclude that within the German context “the violation of the terms of service does not always suffice to justify a deletion of a statement if it is protected under the fundamental right to freedom of expression.” Footnote 166

Since 2019, two influential cases show how German courts have become less reluctant to impose rights-based obligations on platforms based on a fundamental rights analysis. Footnote 167 Both cases involved Facebook and the suspension of accounts. In the latter case, from July 2021, Germany’s Federal Court of Justice ordered Facebook to reinstate posts it had deleted from a user’s account based on its T&Cs on hate speech. Footnote 168 The Court held that, in an interest-based balancing of the conflicting basic rights, Facebook was required to inform the user in advance of the intended blocking of his account, to inform him of the reason, and to give him the opportunity to reply. Examining this case law, Hennemann and Heldt conclude that, notwithstanding the right of even the largest platforms to set contractual prohibitions that go beyond illegal content, the indirect horizontal application of constitutional rights imposes procedural obligations on these platforms when they apply and enforce their T&Cs to content that is otherwise lawful. Footnote 169

Based on this brief overview, at least two insights emerge that are relevant to our analysis. First, through the prism of the indirect effect of fundamental rights, both German and Dutch case law show how public values seep into, and can be integrated into, private relations between platforms and users. Footnote 170 This insight matters for the correct design, application and enforcement of fundamental rights because the judgments in question identify concrete factors to consider in this context, especially when the information or content at issue is legal but incompatible with a platformʼs T&Cs. Such factors include the size, role, market power and resources of the platform making the content moderation decision, as well as the type of user in question, whether a private individual, politician or other public figure, and the nature of their communication, for example, whether or not it is a political or public interest statement. Second, as argued by van der Donk, it appears clear that national differences in the legal conceptualization of this private relation between platforms and users greatly influence the possibility and chances of a successful appeal to fundamental rights. Footnote 171 This aspect matters when considering a provision like Article 14 DSA, which aims at EU-wide harmonization of this diverse playing field of application and enforcement of fundamental rights through T&Cs.

3. Horizontal Effect

In light of the analysis of the ECHR, Charter and national case law above, the question arises whether, and to what extent, Article 14(4) DSA implies the horizontal application of fundamental rights between platforms and recipients of the service, especially users. Indeed, Article 14(4) may result in a regulatory body or court having to review how a platform—a private company—applied fundamental rights affecting a user—a private individual. In this regard, a question raised by the enforceability of Article 14(4) is how a decision by a platform in which fundamental rights were applied pursuant to that provision would or should be reviewed by an independent body, a regulator or indeed a national court.

In answering this question, it must be noted that an important element of Article 10 ECHR and Article 11 of the Charter is the principle of positive obligations. Footnote 172 Under the right to freedom of expression, States not only have a negative obligation to “abstain” from interfering with free expression, but also a positive obligation to protect free expression, “even in the sphere of relations between individuals,” including between an individual and a “private company.” Footnote 173 Helpfully, the ECtHR has a line of case law on positive obligations in which it has reviewed interferences with free expression by private companies and given guidance on how any balancing of fundamental rights should be reviewed. Indeed, the 17-judge Grand Chamber of the Court has confirmed that States can have a positive obligation to protect freedom of expression “even against interference by private persons.” Footnote 174

In its landmark Remuszko v. Poland judgment, the ECtHR allowed a private individual to bring an application where a Polish media outlet had refused to publish a paid advertisement, with the Court reviewing the decision under Article 10 ECHR. Footnote 175 The Court rejected the Polish government’s argument that the ECtHR could not review the decision because it was a dispute “between private parties, whereas the rights and freedoms enshrined in the Convention were of a vertical nature, in that they concerned relations between the State and individuals.” Footnote 176 The Court held that “genuine, effective exercise of the freedom of expression does not depend merely on the State’s duty not to interfere,” and that a State’s positive obligations under Article 10 ECHR may require measures of protection “even in the sphere of relations between individuals.” Footnote 177 The Court then reviewed the domestic courts’ decisions upholding the media outlet’s refusal, and held that the domestic courts (a) “carefully weighed the applicant’s interests against the legitimate rights of the publishers, such as their own freedom of expression and economic freedom;” Footnote 178 (b) were “aware” of the “human rights issues” of the case and applied the ECtHR’s principles on freedom of expression; and (c) applied the “principle of proportionality.” As such, the ECtHR concluded that the analysis was “fully compatible with the Convention standards.” Footnote 179

The review framework applied by the ECtHR could be appropriate for weighing the rights of a user against those of a platform, and the ECtHR also applies a relatively light standard of review: if the reasoning of the domestic courts’ decisions concerning the limits of freedom of expression is “sufficient and consistent” with the criteria established by the Court’s case law, the Court would require “strong reasons” to “substitute its view for that of the domestic courts.” Footnote 180

Finally, in the context of the EU Charter, it should be mentioned that Advocate General (AG) Øe of the CJEU discussed the issue of platforms and positive obligations flowing from Article 11 of the Charter in Poland v. Parliament and Council. Footnote 181 Although the issue was not necessary to the case and was addressed merely as obiter dictum, AG Øe emphasized the “importance” of certain platforms, which have become “essential infrastructures,” for the exercise of freedom of online communication. Footnote 182 Of particular note, AG Øe acknowledged, citing the ECtHR’s case law, that it is an open question whether certain platforms are “required,” given their important role, to “respect the fundamental rights of users.” Footnote 183

In our view, the analysis in this section points clearly towards an acceptance of the indirect horizontal effect of human rights in the relationship between online platforms and their users. If that is the case, then in order for the application and enforcement, and arguably the design, of T&Cs to take due regard of fundamental rights, they must do so within the framework of the international and European standards discussed herein. This is especially true where the restrictions imposed by platforms relate to content that is otherwise lawful. Still, even with the rich case law of the ECtHR outlining some of the substantive and operational principles in this context—and national case law pioneering its application—more concrete guidance is needed on how exactly to make a framework provision like Article 14(4) operational in the horizontal relationship between platform and user. We now turn to this crucial issue.

II. Operationalization and Compliance: How Article 14(4) Interacts with the Rest of the DSA

The text of Article 14(4) DSA provides little guidance on the concrete obligations and rights it creates for, respectively, platforms and users. By placing the provision in the context of the DSA as a whole, and focusing on VLOPs, this section teases out the unanswered questions of how platforms can operationalize and comply with Article 14(4), as well as what claims users can base on it. Building on our explanation of the mechanics of Article 14(4) above (C.II), it is possible to identify several mechanisms within the DSA that could mandate how platforms should operationalize Article 14(4), and possibly offer users a way to ensure platforms apply this provision to their content moderation.

The DSA takes a mixed approach to the operationalization of Article 14(4). On the one hand, it contains a set of provisions that focus on individual procedural safeguards against platformsʼ content moderation decisions vis-a-vis their T&Cs (Articles 17, 20, 21, 23 and 53). On the other hand, it also attempts to regulate the application and enforcement of fundamental rights through T&Cs in a more holistic fashion for VLOPs, by treating it as a systemic risk subject to mitigation measures in Articles 34 and 35. We look at these two dimensions in turn.

1. Individual Procedural Safeguards

First, Article 17 DSA requires that providers of hosting services—including online platforms—offer a “clear and specific” statement of reasons to affected users for a range of content moderation restrictions, including when the decision is based on T&Cs. This statement must include a “reference to the contractual ground relied on and explanations as to why the information is considered incompatible with that ground.” Footnote 184 In light of Article 14(4), it can be argued that statements of reasons issued by platforms would need to include the platform’s application of fundamental rights principles, similar to the decisions issued by Meta’s Oversight Board. In this regard at least, the operationalization of Article 14(4) is carried out by reinforcing the information and transparency obligations of online platforms.

Second, Article 20(1) DSA requires platforms to establish internal complaint-handling systems, through which users can complain about a platform’s content moderation decisions, including those based on T&Cs. When Article 20(1) is read together with Article 14(4) DSA, it can be argued that platforms’ complaint-handling systems should issue decisions “having due regard” to the user’s fundamental rights, including freedom of expression. Moreover, Article 20(4) seems to imply that users can complain directly to the platform about the way in which their fundamental rights have been considered in a platform’s content moderation decision. This would mean that the platform must not only review whether the conduct or content was illegal or in violation of its T&Cs, but also review its own enforcement and application process and whether the userʼs fundamental rights have been weighed appropriately. Importantly, Article 21 DSA provides that users shall be entitled to select a certified out-of-court dispute settlement body “to resolve disputes” relating to platform decisions, “including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems” Footnote 185 under Article 20. Crucially, under Article 21(2), these dispute settlement bodies “shall not have the power to impose a binding settlement of the dispute.” In our reading, Article 21 suggests that a user can ask a certified settlement body to review not only a platform’s content moderation decision but also whether, in the process leading up to that decision, the platform considered the relevant fundamental rights pursuant to Article 14(4). The lack of detail in Article 21 on the establishment of these dispute settlement bodies, the standards they should employ and the applicable procedural safeguards has drawn severe criticism for creating a possible “race to the bottom” as well as legal uncertainty. Footnote 186 Depending on the implementation and possible further guidance, the combination of these provisions could provide for a concrete, albeit non-binding, application or consideration of fundamental rights in the enforcement of T&Cs by platforms in the context of both internal complaint-handling systems and out-of-court dispute settlement. Thus, especially for users, these provisions could offer some procedural paths to invoke fundamental rights against all online platforms as regards content moderation restrictions.

Third, Article 53 DSA provides that, inter alia, platform users—or any representative body, organisation or association—shall have a right to lodge a complaint against providers of intermediary services “alleging an infringement of this Regulation” with a Digital Services Coordinator (DSC), which is in turn required to “assess the complaint.” Footnote 187 Crucially, the broad wording of Article 53, and especially the wording of supporting Recital 118, implies that a user can lodge a complaint with the DSC on the grounds that a platform violated Article 14 by failing to adequately apply fundamental rights in a content moderation decision. Footnote 188 Notably, this right to lodge a complaint extends to the compliance of all intermediary services, whereas Articles 17 and 20 DSA apply only to hosting service providers and online platforms respectively, and by extension to VLOPs. While these provisions do indicate some minimum norms on how online platforms and VLOPs are supposed to operationalize and comply with Article 14, they offer nothing for the broader category of intermediary services covered by Article 53.

Read together, the provisions examined above offer some indication not only of what compliance with Article 14(4) means but also of the possibilities it creates for users. It is important to note that these are due diligence provisions, which are mostly procedural in nature. That is to say, they do not create a new substantive right or a right of action as a judicial remedy for recipients of the service, namely individual platform users. Footnote 189

Overall, the guidance provided in the DSA as regards the application and enforcement of these provisions in connection with Article 14 is limited, and significant uncertainty remains. It is possible to identify a number of challenges. First, at a minimum, Articles 17 and 20 mean that, in their content moderation enforcement, online platforms and VLOPs will have to offer clarity not only on the contractual or legal basis for a decision but also on whether the enforcement of these norms is in line with the applicable fundamental rights. Yet, given the enormous amount of content reviewed by platforms, the level of detail and nuance required for compliance in individual cases remains to be seen.

Second, and similarly, the lack of guidance in Article 14(4) DSA itself on how the different interests and rights involved should be weighed adds another layer of uncertainty as to whether the provision will, in practice, offer meaningful transparency and force platforms to actually weigh fundamental rights in individual cases.

Third, there is uncertainty surrounding the right to lodge a complaint created by Article 53 DSA. The DSC appears to have extensive discretionary power to decide whether or not to take up a complaint, which is particularly relevant because the provision does not create a separate right of action for the recipient of the service.

Fourth, as DSCs will handle complaints in accordance with national administrative law, we can expect divergences in this context similar to those observed for Data Protection Authorities under the GDPR across different Member States, especially as regards the capacity and policy focus of DSCs. Footnote 190

Fifth, a shared challenge for the several individual complaint procedures is the possibility of coordinated abuse. With regard to Articles 17 and 20 DSA, the potential for abuse is explicitly foreseen in Article 23 DSA, which we have mentioned above. Footnote 191 This provision obligates online platforms to suspend the processing of notices and complaints submitted by individuals or entities that “frequently submit notices or complaints that are manifestly unfounded.” Article 23(4) imposes an information obligation by requiring that the procedure leading up to such a suspension be set out clearly and in detail in platforms’ T&Cs. However, the focus on individual entities or persons frequently submitting manifestly unfounded notices calls into question to what extent Article 23 DSA is able to address coordinated abuse and harassment through these complaint mechanisms. For example, so-called “coordinated flagging,” where different users decide to simultaneously submit notices “with the purpose of having the impacted individual(s) banned or suspended,” Footnote 192 does not necessarily clear the high bar set by Article 23(3) if the people involved do not engage in this type of harassment “frequently” or when their notices are not “manifestly unfounded.” Similarly, there is no specific measure addressing this in the context of Article 53 DSA, which means DSCs have to fall back on national administrative law.

2. Systemic Obligations

The obligations discussed thus far tackle one important dimension of the operationalization of Article 14(4), namely procedural safeguards for individuals’ fundamental rights vis-a-vis restrictive content moderation decisions by platforms. To be sure, this type of provision is a necessary component of any serious suite of legal rules to ensure the concrete application and enforcement of fundamental rights via T&Cs. This is because general obligations on platforms, such as human rights impact assessments, would be insufficient on their own in this regard. Footnote 193 On the one hand, implementing such obligations with any degree of comprehensiveness and effectiveness would likely be disproportionate for small and medium-sized platforms, which lack the capacity and resources to meet them. On the other hand, because such obligations speak only to the structure of the system and not to its application in individual cases, they lack the concreteness required to be effective at the individual level.

However, on the flip side, a sole focus on individual decisions fails to take into account the effect of T&Cs enforcement at the systemic level. Such an outcome would be woefully inadequate from the perspective of the fundamental rights protection of users on the largest platforms, such as VLOPs. As Douek argues, the massive scale and speed of content moderation decisions on these platforms make content moderation “a project of mass speech administration,” and looking past a post-by-post evaluation of platform decision-making reveals “a complex and dynamic system that need[s] a more proactive and continuous form of governance than the vehicle of individual error correction allows.” Footnote 194

As previously discussed, the “risk assessment” regime for VLOPs in the DSA is meant to address this systemic level and to subject the T&Cs as a whole to a type of human rights assessment procedure. Footnote 195 The way in which the T&Cs are applied and enforced is explicitly referred to as one of the factors that should be assessed with regard to their “actual or foreseeable negative effects for the exercise of fundamental rights.” Footnote 196 In addition, adapting the T&Cs and their enforcement is one of the possible measures VLOPs can take to mitigate such systemic risks. Footnote 197 In our view, these assessment and mitigation obligations provide an opportunity to assess the wider context and framework of content moderation practices, beyond the remit of individual decisions. From this perspective, these obligations are consistent with some of the requirements stemming from the international human rights standards mentioned above, including the application to platforms of procedural principles of corporate social responsibility. Footnote 198 This would also be one possible approach to having “due regard” for fundamental rights in the “design” of T&Cs, as mentioned in Recital 47 DSA, as a necessary result of the implementation of risk mitigation measures to bring those terms in line with the requirements of Article 14(4) DSA.

But the approach embodied in the DSAʼs systemic risk provisions is not immune to criticism. Footnote 199 Barata, for instance, points to the vagueness of the risk assessment obligations, noting their uncertainty, the challenges associated with applying them in practice, and the excessive discretion they grant to the relevant authorities. Footnote 200 Laux, Wachter and Mittelstadt further warn against the peril of “audit capture,” where “VLOPs may leverage their market power against their new mandatory auditors and risk assessors.” Footnote 201 These valid concerns, if realized, would substantially weaken the potential positive effect of this risk assessment regime.

Finally, as hinted at in the preceding paragraphs, open questions remain with regard to the enforcement of Article 14(4) DSA. These range from the different national approaches to the interaction of tort law with fundamental rights, to the embedding of DSCs in national administrative law, and the coordination between supervisory authorities at the European level. One possibility is that, to further substantiate platforms’ obligations and clarify enforcement, the Commission and the European Board for Digital Services will encourage and facilitate the drawing up of codes of conduct pursuant to Article 45 DSA. This could offer more legal certainty as well as create the possibility of a tiered application of Article 14(4) DSA, under which only the VLOPs would take on more substantial obligations that would not be proportionate for all service providers. However, such a construction would rely on platforms’ voluntary cooperation.

III. Fundamental Rights in T&Cs: Limitations of Relying on Fundamental Rights

In the preceding sections, we analyzed from an internal legal perspective how Article 14 DSA aims to rebalance the unequal contractual relationship between users and platforms by injecting fundamental rights into the heart of that contract: the application and enforcement of T&Cs. However, questions about the efficacy, as well as the aim, of the provision itself should also be raised from an external perspective, so as to appreciate the scope and limitations of the fundamental rights framework in the context of content moderation.

Within the broader context of EU platform regulation, Footnote 202 the push for the protection of fundamental rights in the content moderation process can be seen as part of a “digital constitutionalism” response to the relative dominance of the big social media platforms. Celeste describes this as the “ideology that adapts the values of contemporary constitutionalism to the digital society” and as the “set of values and ideals that permeate, guide and inform” normative instruments regulating platforms. Footnote 203 Forcing platforms to engage directly with fundamental rights in the enforcement of their content moderation policies fits squarely within the broader movement of scholars arguing for rule-of-law-based responses to the particular challenges of platforms’ private ordering, as it normatively constrains their actions without relying on inappropriate state intervention. Footnote 204 Notwithstanding the potential importance of Article 14’s fundamental rights intervention, as well as the value of the broader rule of law response, three important limitations to applying the fundamental rights framework in this way must be noted from a systemic perspective. Appreciating these limitations can help us understand where the value of Article 14 is to be found and can help guard against inflated claims co-opting the provision.

The first limitation relates to the scope of—and the inherent risks of over-reliance on—the fundamental rights framework itself. There is a rich academic tradition, mainly in critical legal studies and feminist legal theory, that criticizes, among other things, the individualistic, apolitical and acontextual orientation of fundamental rights, which often makes them unsuitable for dealing with collective or structural injustices. Footnote 205 Building on this tradition, Griffin argues that this individual-oriented framework is unable to address important societal issues, such as those faced by marginalized communities on big social media platforms. Footnote 206 For example, an individual approach is arguably unsuitable to articulate the unequally distributed error margins of algorithmic systems of content moderation, Footnote 207 or the collective harm of perpetuating certain stereotypes. Additionally, an over-reliance on fundamental rights can eclipse the very existence of collective and systemic harms. Because they fall outside the frame of individualized harms or are not connected to the language of rights infringements, these systemic issues become difficult to articulate and easy to obscure. As such, an emphasis on human rights might eclipse many systemic harms in favor of an individualized and procedural weighing exercise. This danger is compounded by the depoliticizing effect for which human rights are at times criticized. Footnote 208 The universalistic language, the focus on individual cases, and the emphasis on procedure could serve to neutralize the political nature of content moderation norms. Indeed, considering that the DSA, and Article 14(4) in particular, extends the regulatory framework to also cover lawful content that is in violation of the T&Cs, an overly strong focus on human rights compliance can neutralize the very real political nature of these speech norms. Even though there are clear arguments for applying instruments designed to depoliticize these types of decisions rather than relying on a platform’s own discretion, the criticism remains that the human rights framework is not able to address certain harms and that its depoliticizing function can obfuscate both its own ideological background and the political discussions it addresses.

A second and related limitation builds on the critique that human rights are not well positioned to address structural inequalities. In particular, the advertisement-based business model that social media platforms rely upon is not easily questioned from this perspective. As is well established in the online speech governance literature, the broader commercial interests of platforms are a main driver of their content moderation policies. Footnote 209 These interests range from shielding the platform from liability and creating an inviting space for users, to attracting enough advertisers by offering access to a sufficient number of the right type of people and associated data points. As such, the political economy of social media platforms forms the primary logic for their content moderation practices and is one of the most important structural conditions that can produce inequality in these practices. The fundamental rights framework disregards how this political-economic dimension forms the structural background and, rather, legitimizes this dynamic by including the platforms’ own right to property and/or freedom to conduct a business, which are to be balanced against users’ rights. Footnote 210 In Part B we discussed how the unequal contractual relationship between platforms and users in the T&Cs forms one of the major challenges in regulating content moderation that the European fundamental rights framework was hoped to address. And yet, we see here how the structural conditions underlying this imbalance remain largely untouched by that framework, and are even given space to reproduce themselves by placing platforms’ commercial interests on the same level as users’ fundamental rights. Footnote 211

A third limitation is that, while the setup of Article 14 DSA can serve to constrain platforms in their content moderation practices and to reorient them towards their users’ fundamental rights, it does not address the background conditions of where the power to set and enforce online speech norms is located. Even though fundamental rights do contain proportionality and procedural requirements in their de facto enjoyment and protection, they do not center on questions of power as such. Who can influence the T&Cs themselves, as well as the content moderation response to a violation—removal, downranking, labelling, etcetera—including the design and deployment of algorithmic systems to enforce those T&Cs, remains largely uncontestable. This connects to the broader critiques of the depoliticizing effect of the fundamental rights discourse and of the distribution of power and involvement of different actors in online speech governance. Footnote 212

Looking at these limitations of the fundamental rights framework, one could, of course, argue that it is not fair to expect too much from a single provision such as Article 14 DSA and that, within its limited scope, the provision can still have a positive effect. Indeed, Article 14 is only one part of a larger regulatory framework aimed at ensuring a safer and fairer online environment within the overarching EU law goal of ensuring the functioning of the internal market. Footnote 213 Nevertheless, this critique does carry weight when combined with the other challenges described in the previous sections and, crucially, with the normative weight of the fundamental rights discourse. In arguing for fundamental rights-based regulation, Suzor already hints at this latter point when noting that it might be in platforms’ “long-term interests to adopt human rights processes that make their decision-making process more legitimate.” Footnote 214 Being able to claim that their content moderation practices align, in whatever way, with fundamental rights standards affords platforms a higher degree of legitimacy in how they govern online speech. The powerful legitimizing effect of the fundamental rights discourse can work to depoliticize content moderation decisions and to deflect attention from structural questions of power and political economy. This resonates with wider discussions on corporate social responsibility and “bluewashing,” given the legitimacy companies stand to gain by paying lip service to human rights without substantially changing their business practices. Footnote 215

E. Conclusion

This article examines the issue of enforceability of fundamental rights via T&Cs through the prism of Article 14(4) DSA. The DSA is the centerpiece of an expanding and complex puzzle of platform regulation at EU level. A major thread running through the EU’s policy approach is the attempt to rein in the power of Big Tech platforms by imposing on them “enhanced” responsibility for the content they host, thereby reducing the potential harms that their content moderation practices pose to free expression and a healthy public debate.

Until the DSA, only a few sectoral instruments placed even minimal restrictions on platforms’ significant discretion and power in determining their “house rules.” Article 14 DSA fills an important gap, as it addresses the wide margin of discretion enjoyed by intermediaries in their contractual relationship with users, which governs the moderation of illegal and harmful content online. The provision is particularly important due to the horizontal nature of the DSA in EU law. The rules on T&Cs in the DSA will govern the contractual relationship between intermediary service providers, including Big Tech platforms, and their users in the EU. Article 14 aims to increase the transparency of T&Cs and to bring their enforcement into direct relation with fundamental rights. It does so via two sets of obligations, the first focusing on information and transparency, and the second on application and enforcement. These obligations apply to all intermediaries and cover restrictions imposed in their—human and algorithmic—moderation of content that is illegal or merely incompatible with their T&Cs.

At the heart of our inquiry is the question of whether applying EU fundamental rights to the enforcement of platforms’ T&Cs is a suitable mechanism to rebalance the unequal position between platforms and their users, and to effectively curb the power of Big Tech. After explaining the mechanics of the provision and its systematic place within the DSA, our analysis focused on three crucial issues related to the application and enforcement of fundamental rights through T&Cs under Article 14 DSA. First, we explored whether Article 14(4) leads to the horizontal application of fundamental rights in platformsʼ T&Cs, considering ECtHR and CJEU case law, as well as select national judgments. Our analysis concludes that it is possible, in principle, to establish the indirect horizontal effect of human rights in the relationship between online platforms and their users. But in order for the application and enforcement, and arguably the design, of T&Cs to take due regard of fundamental rights, they must do so within the framework of the international and European standards. This is particularly relevant as regards restrictions on content that is lawful but incompatible with platforms’ T&Cs. However, Article 14(4) DSA remains in essence a framework provision. Thus, even relying on ECtHR and leading national case law for guidance on substantive and operational principles, more detail is needed on how to operationalize it in the horizontal relationship between platform and user.

This leads to our second crucial issue, namely whether and how it is possible to operationalize the wording of Article 14 through other, more concrete provisions in the DSA. In this respect, we identify several mechanisms within the DSA that could mandate how platforms should operationalize Article 14(4), and possibly offer users a way to ensure platforms apply this provision to their content moderation. The DSA takes a mixed approach to the operationalization of Article 14(4). On the one hand, it contains a set of provisions that focus on individual procedural safeguards against platforms’ content moderation decisions vis-a-vis their T&Cs (Articles 17, 20, 21, 23 and 53). At this level, the operationalization of Article 14(4) is carried out partly by reinforcing the information and transparency obligations of online platforms, and partly through procedural due diligence obligations to the benefit of users. Especially regarding the latter, there is significant uncertainty that may hamper the effective protection of users. On the other hand, the DSA also attempts to regulate the application and enforcement of fundamental rights through T&Cs in a more holistic fashion for VLOPs, by treating it as a systemic risk subject to mitigation measures in Articles 34 and 35. In our view, these obligations offer an opportunity to assess the wider context and framework of content moderation practices, beyond individual decisions. As such, they are consistent with some of the requirements stemming from international human rights standards, including the application to platforms of procedural principles of corporate social responsibility. They also provide an avenue to have “due regard” for fundamental rights in the “design” of T&Cs. However, the systemic risk regime is also open to criticism due to its vagueness, the broad discretion it offers to authorities, its susceptibility to “audit capture,” and concerns as to its effective enforcement at different EU and national levels. All these challenges combined may imperil the effective application of this regime, to the detriment of fundamental rights protection.

Finally, we reflected on the limitations of applying fundamental rights in T&Cs as a means of addressing the unequal contractual relationship between platforms and users, drawing on external critiques of the law to highlight the risks of overreliance on fundamental rights in this context. In our view, three interrelated limitations are noteworthy. First, the potential unsuitability of fundamental rights as a sole solution to collective or structural injustices. Second, and more concretely, the inadequacy of fundamental rights to address structural inequalities whose solution lies outside their “toolbox,” such as the advertisement-based business model driving social media platforms. Third, the consideration that the fundamental rights framework does not truly tackle the background conditions of the power to set and enforce online speech norms. That is to say, despite proportionality and procedural safeguards, it is difficult for the fundamental rights framework to reach and influence in practice the design, application and enforcement of T&Cs and content moderation practices. If these criticisms hold, then the real risk of overreliance on fundamental rights is that of legitimizing platforms that adopt that discourse while failing to uphold it, and of depoliticizing content moderation practices while obfuscating structural questions of power and political economy.

In this light, two lessons can be drawn from our analysis. First, in working to align content moderation practices with fundamental rights, it is important to appreciate the limitations of this framework and take a more holistic view. Efforts to realign platforms’ T&Cs towards public values and a safer, more equal online environment must orient themselves not only towards fundamental rights but also towards competition and consumer law, and must place these in the wider context of social injustices, while ensuring their correct implementation through the human and algorithmic practices of platforms. Footnote 216 Second, the relevant DSA provisions must be properly interpreted and enforced, so as to ensure that the legitimacy platforms gain from fundamental rights compliance exercises is justified. On this point, as we have shown, the devil is in the details. Article 14 is a framework provision. If it operates in isolation, it might amount to little more than a paper tiger. However, our analysis shows that the DSA already contains concrete norms on compliance and enforcement. If these are interpreted in light of international and European standards and properly integrated with national legal systems, they can help to substantiate and operationalize Article 14 DSA towards desirable outcomes, and to fulfil its revolutionary potential.

Acknowledgments

The authors would like to thank the participants of the following seminars, workshops and conferences for helpful comments on earlier drafts of this Article: the 2021 Digital Legal Talks conference (hosted by the University of Amsterdam, Tilburg University, Radboud University Nijmegen and Maastricht University); Percolating the DSA Package – Perspectives for a Digital Decade 2021 (Online Workshop hosted by the Weizenbaum Institute and the Internet Policy Review); the IE Private Law Series seminar 2022 (hosted by the IE Law School); and the TILT Seminar Series 2022 (hosted by the Tilburg Institute for Law, Technology, and Society (TILT)).

Competing Interests

The authors declare none.

Funding Statement

João Pedro Quintais’s research in this article is part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).

References

1 Cengiz et al. v. Turkey, App. No. 48226/10 & App. No. 14027/11, (Mar. 1, 2016), http://hudoc.echr.coe.int/.

2 Rep. of the Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association, at para. 4, U.N. Doc. A/HRC/41/41 (2019).

3 Jillian C. York & Corynne McSherry, Content Moderation is Broken. Let Us Count the Ways., Electronic Frontier Foundation (Apr. 29, 2019), https://www.eff.org/deeplinks/2019/04/content-moderation-broken-let-us-count-ways (last visited July 22, 2022). See also Elizabeth Culliford & Katie Paul, Facebook Offers Up First-Ever Estimate of Hate Speech Prevalence on Its Platform, Reuters, (Nov. 19, 2020), https://www.reuters.com/article/uk-facebook-content-idINKBN27Z2QY (last visited July 22, 2022).

4 Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights, Amnesty International (Nov. 21, 2019), https://www.amnesty.org/en/documents/pol30/1404/2019/en/ (last visited July 22, 2022).

5 Dottie Lux & Lil Miss Hot Mess, Facebook’s Hate Speech Policies Censor Marginalized Users, Wired (Aug. 14, 2017), https://www.wired.com/story/facebooks-hate-speech-policies-censor-marginalized-users/ (last visited July 22, 2022).

6 Jessica Guyinn, Facebook While Black: Users call it Getting “Zucked,” Say Talking About Racism is Censored as Hate Speech, USA TODAY (July 9, 2020), https://www.usatoday.com/story/news/2019/04/24/facebook-while-black-zucked-users-say-they-get-blocked-racism-discussion/2859593002/ (last visited July 22, 2022).

7 Justine Calma, Facebook says It ‘Mistakenly’ Suspended Hundreds of Activists’ Accounts, The Verge (Sept. 24, 2020), https://www.theverge.com/2020/9/24/21454554/facebook-acitivists-suspended-accounts-coastal-gaslink-pipeline (last visited July 22, 2022).

8 Akin Olla, Facebook is Banning Leftwing Users like Me—and It’s Going Largely Unnoticed, The Guardian (Jan. 29, 2021), https://www.theguardian.com/commentisfree/2021/jan/29/facebook-banned-me-because-i-am-leftwing-i-am-not-the-only-one (last visited July 22, 2022).

9 Rep. of the Special Rapporteur on the Promotion and Prot. of the Right to Freedom of Op. and Expression, U.N. Doc A/HRC/38/35 (Apr. 6, 2018).

10 See Rep. by the Special Rapporteur on the Promotion & Prot. of the Right to Freedom of Op. & Expression, U.N. Doc. A/HRC/38/35 (2018). See also Side-Stepping Rights: Regulating Speech by Contract (Policy Brief), Article 19 (June 9, 2018), https://www.article19.org/resources/side-stepping-rights-regulating-speech-by-contract/.

11 Convention for the Protection of Human Rights and Fundamental Freedoms, 213 U.N.T.S. 221 (Nov. 4, 1950); Charter of Fundamental Rights of the European Union 364/1, 2000 O.J. (C 364)1.

12 Basic principle of contract law: freedom of contract. See Jürgen Basedow, Freedom of Contract in the European Union, 16 Eur. Rev. Priv. L. 901 (2008); Maria Rosaria Marella, The Old and the New Limits to Freedom of Contract in Europe, 2 Eur. Rev. Cont. L. 257 (2006).

13 Niva Elkin-Koren, Giovanni De Gregorio & Maayan Perel, Social Media as Contractual Networks: A Bottom Up Check on Content Moderation, Iowa L. Rev., (forthcoming 2010).

14 Michael Karanicolas, Too Long; Didn’t Read: Finding Meaning in Platforms’ Terms of Service Agreements, 52 Tol. L. Rev. 3 (2021); Wayne R. Barnes, Social Media and the Rise in Consumer Bargaining Power, 14 Pa. J. Bus. L. 661 (2012); Ellen Wauters, Eva Lievens & Peggy Valcke, Towards a Better Protection of Social Media Users: A Legal Perspective on the Terms of Use of Social Networking Sites, 22 Intʼl J. L. & Info. Tech. 254 (2014). See also the longstanding discussion within contract law and consumer law on “standard contracts” or “adhesion contracts.” See e.g., Friedrich Kessler, Contracts of Adhesion—Some Thoughts About Freedom of Contract, 43 Colum. L. Rev. 629 (1943); Todd D. Rakoff, Contracts of Adhesion: An Essay in Reconstruction, 96 Harv. L. Rev. 1173 (1983). Recently in the European context, see Caterina Gardiner, Unfair Contract Terms in the Digital Age: The Challenge of Protecting European Consumers in the Online Marketplace (2022).

15 For the Terrorist Content Regulation, see Regulation 2021/784 of the European Parliament and of the Council of 29 April 2021 on Addressing the Dissemination of Terrorist Content Online, art. 5(1), 2021 O.J. (L 172) [hereinafter Commission Regulation 2021/784].

16 For the Digital Services Act, see Regulation 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act), 2022 O.J. (L 277) 1. For a summary and analysis see Folkert Wilman, The Digital Services Act (DSA) – An Overview, (2022), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4304586; Putting the DSA into Practice: Enforcement, Access to Justice, and Global Implications, Verfassungsbooks (Joris van Hoboken, João Pedro Quintais, Naomi Appelman, Ronan Ó Fathaigh, Ilaria Buri & Marlene Straub eds., 2023), https://verfassungsblog.de/wp-content/uploads/2023/02/vHoboken-et-al_Putting-the-DSA-into-Practice.pdf.

17 Charter of Fundamental Rights of the European Union 326/02, 2012 O.J., 391–407 (ELI), http://data.europa.eu/eli/treaty/char_2012/oj.

18 Generally on this topic in the context of digital innovation, see Evangelia Psychogiopoulou, Judicial Dialogue and Digitalization: CJEU Engagement with ECtHR Case Law and Fundamental Rights Standards in the EU, 13 J. Intell. Prop. Info. Tech. & Elec. Com. L. (2022), http://www.jipitec.eu/issues/jipitec-13-2-2022/5541.

19 For a definition of “online platforms,” see supra note 16, at art. 3(i). VLOPs are regulated in Articles 33–43. See supra note 16, at arts. 33–43.

20 See Digital Services Act: Commission Designates First Set of Very Large Online Platforms and Search Engines, European Commission (April 25, 2023), https://ec.europa.eu/commission/presscorner/detail/en/IP_23_2413.

21 There is significant scholarly literature discussing human and algorithmic content moderation. See e.g., Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media (2018); Jennifer Cobbe, Algorithmic Censorship by Social Platforms: Power and Resistance, Phil. & Tech. (2020); Robert Gorwa, Reuben Binns & Christian Katzenbach, Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance, 7 Big Data & Socʼy (2020); James Grimmelmann, The Virtues of Moderation, 17 Yale J. L. & Tech. 42 (2015); Sarah T. Roberts, Behind the Screen (2019); Sarah T. Roberts, Content Moderation (2017).

22 For a historical and critical examination of Big Tech platforms and how they have developed their content moderations policies, including their T&Cs, see Jillian C. York, Silicon Values: The Future of Free Speech Under Surveillance Capitalism (2021).

23 For a description of regulatory efforts at EU level, see infra Part B.II. For an overview of different proposals on so-called “lawful-but-awful” speech, see Daphne Keller, If Lawmakers Don’t Like Platforms’ Speech Rules, Here’s What They Can Do About It. Spoiler: The Options Aren’t Great., Techdirt (2020), https://www.techdirt.com/2020/09/09/if-lawmakers-dont-like-platforms-speech-rules-heres-what-they-can-do-about-it-spoiler-options-arent-great/ (last visited July 22, 2022). For a recent description of civil society efforts, see the efforts described in York, supra note 22. In scholarship, see for example David Kaye, Speech Police: The Global Struggle to Govern the Internet (2019); Nicolas Suzor, Lawless: The Secret Rules That Govern Our Digital Lives (2019); Roberts (2019), supra note 21; Gillespie, supra note 21; Frank Pasquale, The Black Box Society (2016); Rebecca MacKinnon, Consent of the Networked (2017).

24 See York, supra note 22. Evgeny Morozov, The Net Delusion: The Dark Side of Internet Freedom (2011).

25 The origin of the term “techlash” is difficult to determine with certainty, but the Cambridge Dictionary defines the term as “a strong negative feeling among a group of people in reaction to modern technology and the behavior of big technology companies.” See Techlash, Cambridge Dictionary Online, https://dictionary.cambridge.org/dictionary/english/techlash (last visited Sept. 20, 2022). See also Evelyn Douek, Transatlantic Techlash Continues as U.K. and U.S. Lawmakers Release Proposals for Regulation, Lawfare (Aug. 8, 2018), https://www.lawfareblog.com/transatlantic-techlash-continues-uk-and-us-lawmakers-release-proposals-regulation (last visited July 22, 2022).

26 York, supra note 22; Kaye, supra note 23; Philip M. Napoli, Social Media and the Public Interest: Media Regulation in the Disinformation Age (2019).

27 Illustrations of these claims abound in the highly publicized scandals that have plagued companies such as Metaʼs Facebook and Instagram, YouTube, Twitter, and TikTok in the past few years. See e.g., Ben Smith, Inside the Big Facebook Leak, The New York Times (Oct. 25, 2021), https://www.nytimes.com/2021/10/24/business/media/facebook-leak-frances-haugen.html (last visited July 22, 2022); Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military, The New York Times (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html (last visited Feb. 18, 2022).

28 See e.g., Jean-Christophe Plantin, Carl Lagoze, & Christian Sandvig, Infrastructure Studies Meet Platform Studies in the Age of Google and Facebook, 20 New Media & Socʼy 293 (2018); Roberts (2019), supra note 21; Niva Elkin-Koren & Maayan Perel (Filmar), Algorithmic Governance by Online Intermediaries, in Oxford Handbook of International Economic Governance and Market Regulation (Eric Brousseau, Jean-Michel Glachant, & Jérôme Sgard eds., 2018); Elkin-Koren et al., supra note 13; Sarah T. Roberts, Digital Detritus: “Error” and the Logic of Opacity in Social Media Content Moderation, 23 First Monday (2018).

29 York, supra note 22; Edoardo Celeste, Terms of Service and Bills of Rights: New Mechanisms of Constitutionalisation in the Social Media Environment?, 33 Intʼl Rev. L., Comput. & Tech. 122 (2018); Luca Belli & Jamila Venturini, Private Ordering and the Rise of Terms of Service as Cyber-Regulation, 5 Internet Polʼy Rev. (2016).

30 Marco Bassini, Fundamental Rights and Private Enforcement in the Digital Age, 25 Eur. L. J. 182 (2019).

31 See infra at Part B.II.

32 See e.g., Matthias C. Kettemann & Anna Sophia Tiedeke, Back up: Can Users sue Platforms to Reinstate Deleted Content?, 9 Internet Polʼy Rev. 1 (2020); Maria Luisa Stasi, Keep It up: Social Media Platforms and the Duty to Carry Content, federalismi.it (2022).

33 See e.g., Human Rights in the Age of Platforms (Rikke Frank Jørgensen ed., 2019).

34 See e.g., Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, App. No. 22947/13, para. 86 (Feb. 2, 2016) http://hudoc.echr.coe.int/; Ronan Ó Fathaigh, The Chilling Effect of Liability for Online Reader Comments, 22 Eur. Hum. Rts. L. Rev. 387 (2017). A similar point has been made by the CJEU in Case C-401/19, Poland v. Parliament and Council, para. 84 et seq. (July 15, 2021) (especially paras. 85–6), and Case C-18/18, Eva Glawischnig-Piesczek v. Facebook Ireland Ltd. (Nov. 22, 2019).

35 Directive 2000/31/EC, arts. 12–14, 2000 O.J. (L 178) http://data.europa.eu/eli/dir/2000/31/oj.

36 Id. at art. 15.

37 See the rules on injunctions against intermediaries in Directive 2001/29/EC, art. 8(3), 2001 O.J. (L 167), and Directive 2004/48/EC, art. 11, 2004 O.J. (L 157).

38 See supra note 35, arts. 13(2), 14(3), 18.

39 Remuszko v. Poland, App. No. 1562/10, para. 85 (July 16, 2013), http://hudoc.echr.coe.int/.

40 See Joined Cases C-236/08, C-237/08 & C-238/08, Google France SARL v. Louis Vuitton Malletier SA, 2010 E.C.R. I-2417; Case C-324/09, LʼOreal SA v. eBay Intʼl AG, 2011 E.C.R. I-6011; Case C-360/10, Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA v. Netlog NV, ECLI:EU:C:2012:85 (Feb. 16, 2012), https://curia.europa.eu/juris/liste.jsf?num=C-360/10. See Eva Glawischnig-Piesczek, Case C-18/18; Joined Cases C-682/18 and C-683/18, YouTube v. Cyando, ECLI:EU:C:2021:503 (June 22, 2021). The Court has also refused to apply the liability exemption in Article 14 to a newspaper publishing company which operates a website on which the online version of a newspaper is posted. See ECJ, Case C-291/13, Sotiris Papasavvas v. O Fileleftheros Dimosia Etaireia Ltd. et al., ECLI:EU:C:2014:2209 (Sept. 11, 2014), http://curia.europa.eu/juris/liste.jsf?num=C-291/13.

41 See Remuszko, App. No. 1562/10 at para. 85.

42 Palomo Sánchez et al. v. Spain, App. No. 28955/06, para. 57 (Sept. 12, 2011), http://hudoc.echr.coe.int/.

43 See, in the context of copyright, the Courtʼs development of a multi-factor test to consider whether a service provider like YouTube carries out a “deliberate intervention” sufficient for the attribution of direct liability for communicating the infringing works to the public, thereby setting aside the hosting liability exemption in Article 14 of the e-Commerce Directive. See e-Commerce Directive 2000/31, art. 14, 2000 O.J. (EC). For a commentary, see infra note 101 and Christina Angelopoulos, Harmonising Intermediary Copyright Liability in the EU: A Summary, in The Oxford Handbook of Online Intermediary Liability 315–34 (Giancarlo Frosio ed., 2020).

44 See Google France SARL, C-236/08; Case C-401/19, Poland v. Parliament and Council, ECLI:EU:C:2022:297 (Apr. 26, 2022), http://curia.europa.eu/juris/liste.jsf?num=C-401/19.

45 See e.g., Giancarlo Frosio & Martin Husovec, Accountability and Responsibility of Online Intermediaries, in The Oxford Handbook of Online Intermediary Liability (2020) (arguing that this emerging notion of responsibility combines legal liability rules with “a mixture of voluntary agreements, self-regulation, corporate social responsibility, and ad-hoc deal-making”).

46 See e.g., Frosio & Husovec, supra note 45.

47 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Online Platforms and the Digital Single Market Opportunities and Challenges for Europe, at para. 20, COM (2016) 288 final (May 25, 2016) [hereinafter Communication on Online Platforms in the Digital Single Market].

48 See id.

49 Id. at para. 3.

50 Id.

51 Id. at paras. 10–13, 18–19.

52 Id. at para. 11.

53 Id. at paras. 16, 18–19.

54 Commission Recommendation 2018/334 of Mar. 1, 2018, On Measures to Effectively Tackle Illegal Content Online, 2018 O.J. (L 63) 50.

55 Id. at para. 35.

56 Id. at para. 30.

57 Id. at paras. 23, 33. Also see id. at 56, 58–59. One related but important aspect is that Recital 34 recommends that competent authorities “should be able to request the removal or disabling of access to content which they consider to be terrorist content either with reference to the relevant applicable laws or to the terms of service of the hosting service provider concerned.” In other words, it is suggested that T&Cs can be used to define a category of illegal content, such as terrorist content, and determine relevant consequences, like removal or disabling, without the need to refer to applicable laws.

59 See Directive 2018/1808 of 14 Nov. 2018, Amending Directive 2010/13/EU on the Coordination of Certain Provisions laid down by Law, Regulation or Administrative Action in Member States Concerning the Provision of Audiovisual Media Services (Audiovisual Media Services Directive) in View of Changing Market Realities, 2018 O.J. (L 303) 69.

60 Directive 2019/790 of 17 Apr. 2019, On Copyright and Related Rights in the Digital Single Market and Amending Directives 96/9/EC and 2001/29/EC, 2019 O.J. (L 130) 92.

61 Another relevant instrument which we do not analyze in this Article is the Digital Markets Act. See generally, Pierre Larouche & Alexandre de Streel, The European Digital Markets Act: A Revolution Grounded on Traditions, 12 J. of Eur. Competition L. & Prac. 542 (2021).

62 Miriam C. Buiten, The Digital Services Act: From Intermediary Liability to Platform Regulation, 12 J. Intell. Prop. Info. Tech. & Elec. Com. L. (2022).

64 Netzwerkdurchsetzungsgesetz [Network Enforcement Act], Sept. 1, 2017, BGBl. I at 3352 (Ger.), last amended by Article 3 of the Law of July 21, 2022, BGBl. I at 1182 (Ger.). For English-language context, see The Federal Government of Germany, Targeted Steps to Combat Hate Crimes (2017), https://www.bundesregierung.de/breg-en/news/bekaempfung-hasskriminalitaet-1738462.

65 Nationalrat [NR] [National Council] Gesetzgebungsperiode [GP] 27 Beilage [Blg] No. 151/2020, https://www.ris.bka.gv.at/GeltendeFassung.wxe?Abfrage=Bundesnormen&Gesetzesnummer=20011415 (Austria) [hereinafter Communication Platforms Act]. See European Commission, Directorate-General for Internal Market, Industry, Entrepreneurship and SMEs (Sept. 1, 2020). For a critical first analysis, see epicenter.works, First Analysis of the Austrian Anti-Hate Speech Law, EDRi (Sept. 10, 2020), https://edri.org/our-work/first-analysis-of-the-austrian-anti-hate-speech-law-netdg-koplg/.

66 See European Commission, Law Aimed at Combating Hate Content on the Internet (Aug. 21, 2019).

67 See Conseil constitutionnel [CC] [Constitutional Court] decision No. 2020-801DC, June 18, 2020 (Fr.) (Loi visant à lutter contre les contenus haineux sur internet). See also France: Constitutional Council Declares French Hate Speech ‘Avia’ Law Unconstitutional, Article 19 (June 18, 2020), https://www.article19.org/resources/france-constitutional-council-declares-french-hate-speech-avialaw-unconstitutional/.

68 See infra Part D.I.2.

69 Certainly, the EU acquis on consumer law contains a number of rules that should apply to the contractual relationship between users, as consumers, and platforms, as service providers. N.B. the Platform to Business Regulation contains a detailed definition of “terms and conditions,” as well as certain related obligations aimed at the providers of “online intermediation services.” See Commission Regulation 2019/1150 of June 20, 2019, Promoting Fairness and Transparency for Business Users of Online Intermediation Services, 2019 O.J. (L 186) 57, arts. 2–3 [hereinafter P2B Regulation]. But contrast the subject matter and scope of Article 1 of the P2B Regulation with that of the DSA (at 1 and 1a), and note that the DSA does not apply to matters regulated in the P2B Regulation (Art. 2(4) DSA).

70 For the definition of “video-sharing platform service,” see Directive 2018/1808, art. 1(aa).

71 See id. at art. 28(c).

72 See id. art. 28b(3)(a)–(b).

73 This new type of provider is defined in art. 2(6), whereas the new liability regime is contained in art. 17. See Directive 2019/790, art. 2(6), 17.

74 See id. art. 17(1)–(4).

75 See id. art. 17(9).

76 Poland, Case C-401/19, at para. 88.

77 See Commission Regulation 2021/784, art. 1.

78 Id. arts. 3–5.

79 Id. art. 2(8).

80 Id. art. 2(7).

81 Id. art. 5(1).

82 Id. art. 7(1).

83 Commission Regulation 2022/2065 of Oct. 19, 2022, On a Single Market for Digital Services, 2022 O.J. (L 277) 1, 41 [hereinafter DSA].

84 Id. art. 3(b).

85 See Gillespie, supra note 21.

86 See generally DSA.

87 Id. arts. 7, 9–10.

88 Id. arts. 11–48.

89 Id. art. 33.

90 See, for example, the average number of users in the EU for these platforms, as reported in the DSA Impact Assessment. Id. art. 34.

91 Id. art. 3(u).

92 Id. at para. 45.

93 Id. arts. 14(1)–(3), (5)–(6).

94 Id. art. 14(4).

95 DSA art. 2(g).

96 Article 3(t) of the DSA contains the definition of “content moderation.” Id. art. 3(t).

97 See e.g., Maranke Wieringa, What to Account for when Accounting for Algorithms: A Systematic Literature Review on Algorithmic Accountability, in Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency 1 (2020); Lilian Edwards & Michael Veale, Slave to the Algorithm? Why a “Right to an Explanation” Is Probably Not the Remedy You Are Looking For, 16 Duke L. & Tech. Rev. 18 (2017). See also Committee of Ministers Recommendation CM/Rec(2020)1 of Apr. 8, 2020, On the Human Rights Impacts of Algorithmic Systems, 2020 O.J. (C 1) 1, 9.

98 See Commission Regulation 2016/679 of Apr. 27, 2016, On the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1, arts. 7(2), 12(1), 14(2).

99 Recital 45 adds that service providers “may use graphical elements in their terms of service, such as icons or images, to illustrate the main elements of the information requirements set out in this Regulation.” See DSA, at para. 45.

100 See Draft Report 2020/0361(COD) of May 28, 2021, On the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services and Amending Directive 2000/31/EC, 2021 O.J. (C 361) 1, 42. Recital 45 provides some guidance on the meaning of “significant” by referring to changes that “could directly impact the ability of the recipients to make use of the service.” See DSA, at para. 45.

101 João Pedro Quintais et al., Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis 284 (2022).

102 See DSA, at para. 46.

103 See id. at para. 71.

104 See, respectively, Articles 28, 34, and 35. See id. arts. 28, 34–35.

105 See id. art. 14(5).

106 See id. art. 14(6).

107 See id. at paras. 48–49.

108 See id. at para. 102.

109 As noted, Article 14(4) of the DSA is worded similarly to Article 5 of the Terrorist Content Online Regulation.

110 See DSA, at para. 47.

111 Id.

112 See infra Part B.

113 In the examples below we do not make reference to DSA Article 21(3)(b), which refers to certification of out-of-court dispute settlement bodies by the Digital Services Coordinator, and imposes as a requirement that such a body “has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute” (emphasis added).

114 See id., at art. 34.

115 See id., at art. 34(1).

116 See id.

117 See id., at para. 83 (giving as examples the dissemination or amplification of misleading or deceptive content, including disinformation).

118 See id., at art. 34(2)(c); id., at para. 84.

119 See id., at art. 35(1)(b); id., at para. 87 (“They should adapt and apply their terms and conditions, as necessary, and in accordance with the rules of this Regulation on terms and conditions.”).

120 See id., at art. 48(2)(c); id., at para. 90.

121 See Charter of Fundamental Rights of the European Union, Dec. 18, 2000, 2000 O.J. (C 364) 1.

122 See supra note 10, at para. 45.

123 Id.

124 See Committee of Ministers Recommendation CM/Rec(2018)2 of Mar. 7, 2018, On the Roles and Responsibilities of Internet Intermediaries, 2018 O.J. (C 2) 1, app. § 2.1.1. See also Article 19, supra note 10, at 39 (arguing that platforms should ensure that their T&Cs comply with international human rights standards).

125 See Committee of Ministers Recommendation CM/Rec(2018)2 app. § 2.3.1.

126 See Rep. of the Special Rep. of the Secretary-General, Guiding Principles on Business and Human Rights: Implementing the United Nations “Protect, Respect, and Remedy” Framework, U.N. Doc. A/HRC/17/31 (June 16, 2011).

127 Id. at paras. 15–22.

128 See Committee of Ministers Recommendation CM/Rec(2018)2, at para. 11 (“In line with the UN Guiding Principles on Business and Human Rights and the “Protect, Respect and Remedy” Framework, intermediaries should respect the human rights of their users and affected parties in all their actions.”).

129 Id. at app. § 2.3.4.

130 See e.g., Tamiz v. United Kingdom, App. No. 3877/14, para. 90 (Sept. 19, 2017), https://hudoc.echr.coe.int/eng?i=001-178106 (confirming that Google enjoys a “right to freedom of expression guaranteed by [Article 10 of the European Convention on Human Rights]”).

131 See infra Part D.I.2.

132 Case C-345/17, Sergejs Buivids v. Datu valsts inspekcija, ECLI:EU:C:2019:122, para. 65 (Feb. 14, 2019).

133 See Einarsson v. Iceland, App. No. 24703/15, para. 52 (Nov. 7, 2017), https://hudoc.echr.coe.int/eng?i=001-178362.

134 Yasin Özdemir v. Turkey, App. No. 14606/18, paras. 11–13 (Dec. 7, 2021), https://hudoc.echr.coe.int/eng?i=001-213773.

135 Belkacem v. Belgium, App. No. 34367/14, para. 79 (June 27, 2017), https://hudoc.echr.coe.int/eng?i=001-200344.

136 Mariya Alekhina et al. v. Russia, App. No. 38004/12, paras. 70–80 (July 17, 2018), https://hudoc.echr.coe.int/eng?i=001-184666.

137 Volodina v. Russia (no. 2), App. No. 40419/19, paras. 50–68 (Sept. 14, 2021), https://hudoc.echr.coe.int/eng?i=001-211794.

138 See Savva Terentyev v. Russia, App. No. 10692/09, (Aug. 28, 2018), https://hudoc.echr.coe.int/eng?i=001-185307.

139 Bédat v. Switzerland, App. No. 56925/08, para. 52 (Mar. 29, 2016), https://hudoc.echr.coe.int/eng?i=001-161898.

140 An example would be the Article 8 rights of a non-Facebook user targeted by a Facebook post.

141 See e.g., OOO Regnum v. Russia, App. No. 22649/08, para. 59 (Sept. 8, 2020), https://hudoc.echr.coe.int/eng?i=001-204319.

142 Savva Terentyev v. Russia, App. No. 10692/09, para. 54 (Aug. 28, 2018) https://hudoc.echr.coe.int/eng?i=001-185307.

143 Irene Khan (Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression), Disinformation and Freedom of Opinion and Expression, at para. 70, U.N. Doc. A/HRC/47/25 (Apr. 13, 2021). For the policies at issue in their current version, see Nathaniel Gleicher, Coordinated Inauthentic Behavior, Meta (Dec. 6, 2018), https://about.fb.com/news/tag/coordinated-inauthentic-behavior/.

144 Vladimir Kharitonov v. Russia, App. no. 10795/14, para. 36 (June 23, 2020), https://hudoc.echr.coe.int/eng?i=001-203177.

145 Terentyev, App. No. 10692/09, at para. 64.

146 For an analysis of the compatibility of Facebook’s content policies over time with applicable international standards on freedom of expression, see Konstantinos Stylianou, Nicolo Zingales & Stefania Di Stefano, Is Facebook Keeping up with International Standards on Freedom of Expression? A Time-Series Analysis 2005–2020 (Feb. 11, 2022), https://ssrn.com/abstract=4032703.

147 See generally Oversight Board, https://oversightboard.com/decision/ (last visited May 17, 2023) (containing case decisions and policy advisory opinions on Meta’s content decisions). See also Welcome to the FOB Blog: Overseeing the Facebook Oversight Board, Lawfare Blog, https://www.lawfareblog.com/fob-blog (last visited May 17, 2023). For an examination of the setting up of the Oversight Board, see Kate Klonick, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L. J. 2418 (2020).

148 See Swedish Journalist Reporting Sexual Violence Against Minors, Oversight Board (Feb. 16, 2021), https://www.oversightboard.com/decision/FB-P9PR9RSA (last visited May 17, 2023) (overturning Meta’s decision).

149 International Covenant on Civil and Political Rights, Dec. 16, 1966, 999 U.N.T.S. 171.

150 See supra note 148, at para. 8.1.

151 Id., at para. 8.3.

152 See Oversight Board, https://www.oversightboard.com/appeals-process/ (last visited May 17, 2023) (describing the board’s appeal process).

153 See supra Part C.I.

154 See Kettemann & Tiedeke, supra note 32.

155 See Rb. Amsterdam Sept. 9, 2020, NJ 2020, 650 m.nt. MDB (Café Welschmertz/YouTube) (Neth.); Rb. Amsterdam Oct. 13, 2020, NJ 2020, 783 m.nt. A.W. Hins (Viruswaarheid/Facebook) (Neth.); Rb. Amsterdam Aug. 18, 2021, NJF 2021, 384 m.nt. (van Haga/Youtube) (Neth.); Rb. Amsterdam Sept. 15, 2021, NJ 2021, 621 m.nt. M. Klos (Forum/YouTube) (Neth.); Rb. Noord-Holland Oct. 6, 2021, NJ 2021, 432 m.nt. M. Weij (van Haga/LinkedIn) (Neth.); Rb. Rotterdam Oct. 29, 2021, NJ 2021, 622 m.nt. (Engel/Facebook) (Neth.).

156 These claims are based directly on Article 10 ECHR due to a constitutional provision which prohibits Dutch courts from conducting constitutional review of Dutch legal provisions, but explicitly allows courts to directly apply international human rights treaties. See Gw. [Constitution], arts. 93–94, 120 (Neth.).

157 See Forum/YouTube, C/13/704886, at para. 4.8.

158 See Van Haga/LinkedIn, C/15/319230, at para. 4.14.

159 See Viruswaarheid/Facebook, C/13/689184, at para. 4.24.

160 See Forum/YouTube, C/13/704886, at para. 4.5.

161 See Engel/Facebook, C/10/622391, at para. 4.12.

162 Appleby et al. v. U.K., App. No. 44306/98, at para. 40 (May 6, 2003), https://hudoc.echr.coe.int/eng?i=001-61080.

163 See van Haga/Youtube, C/13/703563, at para. 4.11 (quoting Appleby, App. No. 44306/98, at para. 47).

164 See generally A.W. Hins, Nr. 17: Rb. Noord-Holland (vzr.) Oct. 6, 2021, Van Haga/LinkedIn, Mediaforum (Jan. 7, 2022), https://www.mediaforum.nl/art/90-4753_Nr-17-Rb-Noord-Holland-vzr-6-oktober-2021-Van-Haga-LinkedIn.

165 Bundesverfassungsgericht [BVerfG] [Federal Constitutional Court], Jan. 15, 1958, 7 Entscheidungen des Bundesverfassungsgerichts [BVerfGE] 198, 198–230 (Ger.) [hereinafter Lüth decision].

166 See Kettemann & Tiedeke, supra note 32, at 11.

167 See Bundesgerichtshof [BGH] [Federal Court of Justice] Press Release No. 149/2021, On Claims Against the Provider of a Social Network that has Deleted Posts and Blocked Accounts on Charges of “Hate Speech” (July 29, 2021) (discussing cases III ZR 192/20 and III ZR 179/20) [hereinafter Press Release No. 149/2021]; Bundesverfassungsgericht [BVerfG] [Federal Constitutional Court], May 22, 2019, 42 Neue Juristische Wochenschrift [NJW] 19 (Ger.) [hereinafter Order of May 22, 2019].

168 See Press Release No. 149/2021.

169 Mauritz Hennemann & Amelie Heldt, Prozedurale Vermessung: Das Kuratieren kommunikativer Räume durch soziale Netzwerke Zugleich Besprechung von BGH, Urteile vom 29.7.2021—III ZR 179/20 (ZUM 2021, 953) und III ZR 192/20, 65 ZUM 981 (2021). See also with a similar conclusion under earlier case law, Kettemann & Tiedeke, supra note 32, at 11 (“Thus, the violation of the terms of service does not always suffice to justify a deletion of a statement if it is protected under Art. 5 (1) (1) Basic Law (GG), thus restricting the rights of Facebook under Arts. 2, 12, 14 Basic Law (GG)”).

170 Kettemann & Tiedeke, supra note 32.

171 Berdien van der Donk, European Views on the Privatisation of the “Online Public Space,” Media Research Blog (2022), https://leibniz-hbi.de/de/blog/european-views-on-the-privatization-of-the-online-public-space.

172 See e.g., Case C-140/20, G.D. v. Commissioner of An Garda Síochána et al., ECLI:EU:C:2022:258, para. 49 (Apr. 5, 2022), https://curia.europa.eu/juris/liste.jsf?num=C-140/20. See also Aleksandra Kuczerawy, The Power of Positive Thinking: Intermediary Liability and the Effective Enjoyment of the Right to Freedom of Expression, 8 J. Intell. Prop. Info. Tech. & Elec. Com. L. 226 (2017); Aleksandra Kuczerawy, Intermediary Liability and Freedom of Expression in the EU: From Concepts to Safeguards (2018).

173 Palomo Sánchez et al. v. Spain, App. No. 28955/06, para. 59 (Sept. 12, 2011), https://hudoc.echr.coe.int/eng?i=001-106178.

174 Sánchez, App. No. 28955/06, at para. 59.

175 See Remuszko v. Poland, App. No. 1562/10 (July 16, 2013), https://hudoc.echr.coe.int/eng?i=001-122373.

176 Remuszko, App. No. 1562/10, at para. 57.

177 Id., at para. 62.

178 Id., at para. 85.

179 Id.

180 Sánchez, App. No. 28955/06, at para. 57.

181 Opinion of Advocate General Saugmandsgaard Øe, Case C-401/19, Poland v. Parliament and Council (July 15, 2021). In the ensuing judgment, the CJEU is less explicit than the Advocate General but clearly endorses the underlying principles of his approach as regards the applicability of ECtHR case law on freedom of expression. See Case C-401/19, Poland v. Parliament and Council, paras. 44–47, 68, 74 (Apr. 26, 2022).

182 Opinion of Advocate General Saugmandsgaard Øe at n.90, Case C-401/19, Poland v. Parliament and Council (July 15, 2021).

183 Opinion of Advocate General Saugmandsgaard Øe, supra note 182, at n. 90.

184 See DSA, at art. 17(3)(e).

185 DSA, at para. 59.

186 See Jörg Wimmers, The Out-of-Court Dispute Settlement Mechanism in the Digital Services Act: A Disservice to Its Own Goals, 12 J. Intell. Prop. Info. Tech. & Elec. Com. L. 381 (2022). See also Daniel Holznagel, The Digital Services Act Wants You to “Sue” Facebook Over Content Decisions in Private de facto Courts, Verfassungsblog (June 24, 2021) https://verfassungsblog.de/dsa-art-18/.

187 NB, the right to lodge a complaint is limited to the DSC of the Member State where the recipient of the service is located or established.

188 “[I]ndividuals or representative organizations should be able to lodge any complaint related to compliance with this Regulation.” Similarly argued by Bengi Zeybek, Joris van Hoboken & Ilaria Buri, Redressing Infringements of Individuals’ Rights Under the Digital Services Act, DSA Observatory (May 4, 2022), https://dsa-observatory.eu/2022/05/04/redressing-infringements-of-individuals-rights-under-the-digital-services-act/.

189 See Zeybek, Hoboken & Buri, supra note 188.

190 See e.g., Giulia Gentile & Orla Lynskey, Deficient by Design? The Transnational Enforcement of the GDPR, 71 Intʼl & Compar. L. Q. 799 (2022).

191 See supra Part C.II.

192 Cynthia Khoo, Coordinated Flagging, in Glossary of Platform Law & Policy Terms (Luca Belli & Nicolo Zingales eds., 2021).

193 Similarly, institutional solutions such as the fundamental rights assessments given by the Facebook Oversight Board are inadequate, as the Board only offers its assessment ex post and in a small number of cases.

194 Evelyn Douek, Content Moderation as Administration, 136 Harv. L. Rev. 526 (2022).

195 See infra Part 3.2. See also DSA, at arts. 34(1)(b), 34(2)(c), 35(1)(b).

196 See DSA, at art. 34(1)(b).

197 Id.

198 See supra Part D.I., as well as DSA, at para. 47.

199 Joan Barata, The Digital Services Act and Its Impact on the Right to Freedom of Expression: Special Focus on Risk Mitigation Obligations, DSA Observatory (July 27, 2021), https://dsa-observatory.eu/2021/07/27/the-digital-services-act-and-its-impact-on-the-right-to-freedom-of-expression-special-focus-on-risk-mitigation-obligations/; Zohar Efroni, The Digital Services Act: Risk-Based Regulation of Online Platforms, Internet Policy Review (Nov. 16, 2021), https://policyreview.info/articles/news/digital-services-act-risk-based-regulation-online-platforms/1606; Johann Laux, Sandra Wachter & Brent Mittelstadt, Taming the Few: Platform Regulation, Independent Audits, and the Risks of Capture Created by the DMA and DSA, 43 Comput. L. & Sec. Rev. 105613 (2021).

200 See Barata, supra note 199.

201 See Laux, Wachter, & Mittelstadt, supra note 199.

202 See supra Part B.II.

203 Edoardo Celeste, Digital Constitutionalism: A New Systematic Theorisation, 33 Intʼl Rev. L. Comput. & Tech. 76 (2019). Generally on the topic, see Giovanni De Gregorio, Digital Constitutionalism in Europe: Reframing Rights & Powers in the Algorithmic Society 77 (2022).

204 Nicolas Suzor, Digital Constitutionalism: Using the Rule of Law to Evaluate the Legitimacy of Governance by Platforms, 4 Soc. Media & Socʼy 11 (2018); Giovanni De Gregorio, The Rise of Digital Constitutionalism in the European Union, 19 Intʼl J. Con. L. 41 (2021).

205 See e.g., David Kennedy, The International Human Rights Regime: Still Part of the Problem?, in Examining Critical Perspectives on Human Rights 19 (Colin Murray et al. eds., 2012); Elizabeth Kiss, Alchemy or Fool’s Gold? Assessing Feminist Doubts About Rights, Dissent Magazine, Summer 1995.

206 Rachel Griffin, Rethinking Rights in Social Media Governance: Human Rights, Ideology, and Inequality, 2 Eur. L. Open (forthcoming 2022).

207 See Evelyn Douek, Governing Online Speech: From “Posts-As-Trumps” to Proportionality and Probability, 121 Colum. L. Rev. 759 (2021).

208 See Griffin, supra note 206. See also Duncan Kennedy, The Stakes of Law, or Hale and Foucault!, 15 Legal Stud. F. 327 (1991).

209 See Gillespie, supra note 21; Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 73 (2018); Roberts, supra note 28.

210 See Griffin, supra note 206; Jørgensen, supra note 33. Note that freedom to conduct a business is guaranteed under Article 16 Charter, but not under the ECHR; and the right to property is guaranteed under Article 17 Charter, but was not included in the ECHR and is instead guaranteed under Article 1 of Protocol No. 1 to the ECHR.

211 However, the broader Digital Services Act Package, including the Digital Markets Act, contains provisions aimed at platforms’ advertisement-based business models. See e.g., DSA, at arts. 34(2)(d), 39; Commission Regulation 2022/1925 of Sept. 14, 2022, On Contestable and Fair Markets in the Digital Sector, 2022 O.J. (L 265) 1, arts. 5(g), 6(1)(g).

212 Elettra Bietti, A Genealogy of Digital Platform Regulation, 7 Geo. L. & Tech. Rev. (forthcoming 2022); Evelyn Douek, Content Moderation as Administration, 136 Harv. L. Rev. 526 (forthcoming 2022); Nathaniel Persily & Joshua A. Tucker, Social Media and Democracy: The State of the Field, Prospects for Reform (2020).

213 See DSA, at art. 1(1).

214 See Suzor, supra note 23.

215 Daniel Berliner & Aseem Prakash, “Bluewashing” the Firm? Voluntary Regulations, Program Design, and Member Compliance with the United Nations Global Compact, 43 Polʼy Studies J. 115 (2015).

216 For a consideration of how T&Cs can be embedded in a system of “cooperative responsibility” to ensure that the governing of online platforms aligns with “public values,” see Natali Helberger, Jo Pierson & Thomas Poell, Governing Online Platforms: From Contested to Cooperative Responsibility, 34 Info. Socʼy 1 (2018).