A. Introduction
As the European Court of Human Rights (ECtHR) has emphasized, online platforms, such as Meta/Facebook, Twitter and YouTube, provide an “unprecedented” means for exercising freedom of expression online. Footnote 1 International human rights bodies have recognized the “enormous power” platforms wield over participation in the online “democratic space.” Footnote 2 However, it is increasingly clear that the systems operated by platforms, where automated content moderation decisions are taken based on a platformʼs terms of service, are “fundamentally broken.” Footnote 3 Content moderation systems are said to “undermine freedom of expression,” Footnote 4 especially where important public interest speech ends up being suppressed, such as speech by minority and marginalized groups, Footnote 5 black activist groups, Footnote 6 environmental activist groups, Footnote 7 and other activists. Footnote 8 Indeed, the UN Special Rapporteur on freedom of expression has criticized these content moderation systems for their overly vague rules of operation, inconsistent enforcement, and overdependence on automation, which can lead to over-blocking and pre-publication censorship. Footnote 9 This criticism is combined with, and amplified by, the notion that Big Tech exercises too much power over our online public sphere. Therefore, to better protect free expression online, the UN Special Rapporteur and civil society have argued platforms “should incorporate directly” principles of fundamental rights law into their terms and conditions (T&Cs). Footnote 10
Crucially, as the legal relationship between platforms and users is conceptualized as a private one, with the T&Cs as the contractual basis, in principle, EU fundamental rights are not directly applicable to this relationship. The European Convention on Human Rights (ECHR) and the Charter of Fundamental Rights of the EU (the Charter) primarily govern the vertical relationship between states and citizens, not the horizontal relationship between private parties. Footnote 11 Moreover, the contractual nature of T&Cs implies that the law conceptualizes the relationship between intermediary service providers and users as an equal one. Service providers, as private companies, have the right to offer their services on their own terms and, for their part, users are free to enter into this contract, negotiate other terms, or find an alternative provider. Footnote 12 From this perspective, the provider’s ability to set its own terms of service can be seen as a direct extension of its fundamental right to property and freedom of contract. However, this legal abstraction of two equal contractual parties does not hold in practice. Footnote 13 The reason is that large providers, such as major social media platforms, have significantly more bargaining power than their individual users when determining how their services are used. Users, for their part, are mostly in a dependent position and must accept the “house rules” of providers, which ultimately shape how individuals participate in the online exchange of ideas and information. This inequality is a feature of platform contractual relationships between users and providers. It has long been identified by researchers and, to some extent, is increasingly recognized in certain areas of EU law. Footnote 14
Apart from a rule in the Terrorist Content Regulation, Footnote 15 until recently EU law did not impose on intermediary services in general, and online platforms in particular, an obligation to incorporate fundamental rights into their T&Cs. But an important provision in the Digital Services Act (DSA) Footnote 16 aims to change this. Article 14 DSA lays down new rules on how platforms can enforce their T&Cs, including that they must have “due regard” to the “fundamental rights” of users under the EU Charter. Footnote 17 In this article, we examine the topic of enforceability of fundamental rights via T&Cs through the prism of Article 14 DSA. We ask whether this provision requires online platforms to apply EU fundamental rights law, and to what extent it may curb the power of Big Tech over online speech or merely serve to legitimize their content moderation practices concerning the protection of human rights. In this inquiry, we also explore the opportunities and limitations of a rule that pushes platforms towards enforcement of fundamental rights in the application of their T&Cs.
The focus of our analysis is on EU law, including its interpretation by the Court of Justice of the EU (CJEU). Because the Charter is, under Article 52(3), internally connected to the ECHR and its interpretation by the ECtHR, these are included in the analysis.Footnote 18 In addition, we are primarily concerned with the role of the largest Big Tech platforms, what in the DSA terminology are called “very large online platforms” (VLOPs) Footnote 19 —in essence, large-scale user-upload platforms such as YouTube, Meta’s Facebook and Instagram, TikTok or Twitter—and how their T&Cs shape their content moderation activities.Footnote 20 The article proceeds as follows. After this introduction, Part B provides the necessary context on the inherent tension regarding the desirable role and responsibility of platforms, the emergent landscape of platform regulation, and the lack of regulation on platforms’ T&Cs. Part C examines the mechanics of Article 14 DSA, both as regards information obligations, and application and enforcement of fundamental rights. Based on the previous analysis, Part D identifies three elements for critical discussion: Possible horizontal application of fundamental rights and how this relates to differences in application in EU Member States; compliance and enforcement of Article 14 DSA; and the potential limitations arising from overreliance on fundamental rights in this area. Part E concludes that Article 14 will make it possible, in principle, to establish the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users. But in order for the application and enforcement of T&Cs to have due regard to fundamental rights, Article 14 must be operationalized within the framework of the international and European fundamental rights standards, along the lines suggested in our analysis.
B. The Emergence of EU Platform Regulation and Its Limited Regulation of Terms and Conditions
I. The Case for Platform Regulation and Its Perils
For the most part, EU law leaves a broad margin of discretion for platforms to set and enforce their content moderation policies through a combination of their T&Cs and human and algorithmic practices. Footnote 21 This freedom and the power it bestows on platforms have given rise to pointed criticism. Footnote 22 As a result, calls to rein in the power of Big Tech over online speech have increased over time from civil society, academia and even governments. Footnote 23 While in the first few years of their existence, platforms were seen as the champions of free speech vis-à-vis government intervention and censorship, their public image and perception has shifted significantly. Footnote 24 In this age of the “techlash,” Footnote 25 Big Tech platforms and their content moderation practices are increasingly viewed as a potential threat to free expression and a healthy public debate. Footnote 26 This criticism includes a range of concrete and often contradictory charges. Some critics chastise platforms for intervening too much in the content they host, leading to over-removal and suppression of lawful content, for example, from vulnerable groups or specific political orientations. Others criticize platforms for doing too little to address harmful or outright illegal conduct on their services, such as the spread of disinformation or hate speech. Footnote 27 In addition, because these platforms are private companies, their actions—including on content moderation—are primarily driven by commercial or business interests and values. Footnote 28 Such actions are rarely, and perhaps only accidentally, aligned with public values, even if most Big Tech platforms explicitly adopt the language of public values or even fundamental rights. Footnote 29
However, from a policy perspective, there is uncertainty on the best approach to tackle the power of Big Tech. On the one hand, there is a growing realization that the outsized role and influence of certain platforms on online speech has negative consequences for the polity and public interest. From this perspective, the regulatory push towards an increased role and responsibility for intermediaries is understandable. As the primary mediators of online content, intermediary service providers like platforms are in a better position to control the flow of content as compared to governments and law enforcement agencies. Increasingly, the EU has been relying on this position of control by transferring responsibility for the ex-ante and ex-post filtering, blocking and removal of specific types of illegal content directly to intermediaries. Footnote 30 Many of the EU and national level instruments we mention below display features of this approach, including obligations to impose proactive preventive measures, such as filtering and notice-and-stay-down mechanisms; reactive removal obligations, such as notice-and-takedown mechanisms—some of which in time frames as short as 24 hours; injunctions against non-liable intermediaries for third party unlawful content on their services, such as website or DNS blocking, account termination or content removal; and extensive information obligations on potential infringers, to name just a few. Footnote 31 In addition, it is also possible to identify at national level several recent laws, legislative proposals and national court judgments advancing rules and interpretations that range from “must-carry”-like obligations—imposing a duty on platforms to host certain types of legal content, sometimes from specific sources like politicians—to more nuanced rulings where in specific circumstances, certain platforms can be obligated to reinstate accounts and specific content. Footnote 32
On the other hand, there is a countervailing concern that the wrong type of regulatory intervention might cause more harm than good. Therefore, even if understandable, this “enhanced responsibility” turn for intermediaries has been subject to sharp criticism. From a constitutional legitimacy perspective, it has been argued that private entities should not be conducting government tasks. Footnote 33 Concretely, there are concerns about the effect such regulatory interventions may have inter alia on the right to freedom of expression of platform users. A crucial concern is that this new wave of platform-regulation rules results in a form of undesirable privatized enforcement that poses serious risks of over-removal of legal content, putting undue restrictions on people’s ability to express themselves online, which, as described by the ECtHR, can have a “chilling effect” on freedom of expression. Footnote 34 This claim was made at the very conception of this type of regulation in the 1990s and has gained force with the increased vertical EU regulation and national enforcement acts, as described below.
II. The Emergence of EU ‘Platform Regulation’ and the Concept of ‘Enhanced Responsibility’
The above concerns and criticisms have led to policy and legislative action to rein in the power of platforms through regulation of illegal and harmful content online. In Europe, this is occurring both at the EU and national levels. For most of the early 21st century, EU law on intermediaries was sparse, with no comprehensive harmonization of intermediary liability. The centerpiece of the legal framework was the 2000 e-Commerce Directive, which contains mere conditional liability exemptions, or “safe harbors,” for certain types of intermediary services involving claims for damages: mere conduit or access, caching, and hosting. Footnote 35 The Directive further contains a prohibition on the imposition by Member States on intermediary service providers of general monitoring obligations. Footnote 36 Under this regime, intermediaries may still be required to take measures against the infringement of third-party rights, because it remains possible to subject intermediaries to injunctions, for instance as regards intellectual property rights, Footnote 37 and duties of care. Footnote 38
The interpretation of this constellation of provisions is complex and far from settled. Footnote 39 The most relevant CJEU case law for online platforms—as types of hosting service providers—refers to Article 14 e-Commerce Directive. The CJEU has applied Article 14 to a search engine’s advertising service (Google France/Louis Vuitton), an online sales platform (L’Oréal/eBay), social networking platforms (Netlog; Glawischnig-Piesczek) and video-sharing platforms (YouTube and Cyando). Footnote 40 In its case law, the CJEU has noted that safe harbors require a sufficient degree of “neutrality” from the intermediary. This approach has created a grey area for the qualification of certain online platforms as “neutral” or “passive” compared to “active” intermediaries for the purposes of the hosting safe harbor. Footnote 41 Generally, when a provider plays “an active role of such a kind as to give it knowledge of, or control over, the data stored,” it falls outside the scope of safe-harbor protection. Footnote 42 However, the determination of what constitutes such an active role in practice might depend on the type of platform at issue, the content it hosts, and several factors that qualify its conduct or knowledge vis-à-vis the illegal content it hosts.Footnote 43
A further aspect of the legal framework is controversial. Article 15 e-Commerce Directive—supported by Recital 47—requires that a distinction be made between prohibited “general” monitoring obligations and permitted obligations to monitor in “specific” cases. The Court has attempted to draw lines to clarify this distinction, including in L’Oréal/eBay, Scarlet Extended, Netlog, Eva Glawischnig-Piesczek, YouTube and Cyando, and Poland v. Commission and Council. Footnote 44 However, there remains significant legal uncertainty as to what would constitute specific monitoring for distinct types of illegal content in different scenarios.
Against this background, and alongside the development of CJEU case law interpreting specific subject matter rules to extend the reach of harmonized EU law to online intermediaries, for instance in the context of intellectual property, there has been an increasing push towards additional regulation of online platforms. This push has been justified around a somewhat blurry concept of legal, societal, political and even moral “responsibility” of online platforms. Footnote 45 The potential result, as noted by Frosio and Husovec, could “represent a substantial shift in intermediary liability theory,” signaling a “move away from a well-established utilitarian approach toward a moral approach by rejecting negligence-based intermediary liability arrangements,” practically leading to a “broader move towards private enforcement online.” Footnote 46
At the policy level, this push is visible in Commission instruments dating back to 2016. Starting timidly in its 2016 Communication on “Online Platforms in the Digital Single Market,” the Commission brings up the need for platforms to act responsibly, but largely endorses self- and co-regulation as the most flexible and future-proof approaches to address the increasing role and importance of platforms in the online ecosystem. Footnote 47 The Commissionʼs approach becomes sharper in its “Tackling Illegal Content Online” policy agenda, first with a Communication in 2017 and then with a Recommendation in 2018. Footnote 48
In line with the conceptual shift noted above, the Communication is titled “Towards an Enhanced Responsibility for Online Platforms” and provides a “set of guidelines and principles for online platforms to step up the fight against illegal content online in cooperation with national authorities, Member States and other relevant stakeholders.” Footnote 49 Its objectives are to improve a range of content moderation actions by platforms in the notice-and-action spectrum, transparency rules, protection of fundamental rights online, as well as to clarify the liability regime of platforms under the e-Commerce Directive. Footnote 50 Among the structural proposals by the Commission is the endorsement of voluntary proactive measures by platforms to detect and remove illegal content online, as well as notice-and-stay-down mechanisms, the deployment of which by platforms is considered not to lead to the loss of the hosting liability exemption. Footnote 51 This is said to be particularly the case when proactive measures “are taken as part of the application of the terms of services of the online platform,” Footnote 52 presumably because this would reinforce their voluntary status. As regards T&Cs, the remaining references in the Communication amount to an endorsement of their use to impose information and transparency obligations on platforms regarding content moderation practices. Footnote 53
This approach sets the tone for the subsequent Recommendation, including some of its most problematic aspects from a freedom of expression perspective. In particular, the instrument offers specific recommendations relating to terrorist content, including demanding proactive measures. Footnote 54 For instance, it recommends that if hosting providers remove or disable access to content identified in government agencies’ referrals, they should do so “as a general rule, within one hour from the moment at which they received the referral.” Footnote 55 In this context, it is recommended that hosting service providers “expressly set out in their terms of service that they will not store terrorist content.” Footnote 56 In addition, as regards T&Cs, the Recommendation echoes the Communication’s call for ex ante transparency in such documents regarding platforms’ content moderation practices, while retaining their freedom to “set and enforce their terms of service in accordance with Union law and the laws of the Member States.” Footnote 57
At this point, it should be noted that the policy agenda for tackling illegal content online covers different issues, such as “incitement to terrorism, illegal hate speech, child sexual abuse material, infringements of intellectual property rights and consumer protection.” Footnote 58 These topics are dealt with in a combination of binding and non-binding instruments, which reflect the policy shift towards the “enhanced responsibility” of platforms, characterized by changes to the liability framework incentivizing and in some cases imposing proactive measures, additional obligations on platforms, for example, as regards transparency, and procedural ex post safeguards, such as complaint and redress mechanisms.
Among the noteworthy binding instruments that manifest this new approach are the 2021 Terrorist Content Regulation, the 2018 revision of the Audiovisual Media Services Directive (AVMSD), which extended rules and obligations to video sharing platforms and social media services, Footnote 59 the new liability regime for online content-sharing platforms in Article 17 of the Copyright in the Digital Single Market Directive (CDSMD), Footnote 60 and, most relevant among them, the 2022 DSA. Footnote 61 By moving the “responsibility of intermediaries away from the area of liability and deeper into the realm of regulation,” the DSA becomes the exponent of EU lawʼs turn from intermediary liability towards “platform regulation.” Footnote 62 As for the non-binding instruments, which manifest a co- or self-regulatory approach, they include the EU Code of Conduct on countering illegal hate speech online, the Memorandum of Understanding on the sale of counterfeit goods on the internet, and the strengthened Code of Practice on Disinformation. Footnote 63
Finally, it should be noted that neither national legislators nor courts have been quiet in this regard. Notable enacted or proposed platform regulation instruments include the German NetzDG as amended, Footnote 64 Austria’s “Anti-Hate Speech Law,” Footnote 65 and France’s “Avia Law” Footnote 66 —the first iteration of which had its main portions struck down by the French Constitutional Council. Footnote 67 It remains to be seen to what extent parts of these national initiatives will be preempted by the DSA. Still, from our perspective, the most relevant developments at national level on the intersection of platform regulation, T&Cs and fundamental rights are found in national court decisions, which we further examine below. Footnote 68
III. The Limited Rules on T&Cs and Fundamental Rights in EU Platform Regulation
Until the DSA, EU law contained relatively little explicit regulation of platforms’ T&Cs, with the few exceptions found for instance in the AVMSD, the CDSMD and the Terrorist Content Regulation. Footnote 69 The consequence is that platforms enjoy significant discretion and power in determining their content moderation practices vis-à-vis users in their “house rules.”
The revised AVMSD imposes on “video-sharing platforms,” a type of online platform, Footnote 70 a set of “appropriate measures” to protect minors and the general public from certain types of harmful and illegal content. Footnote 71 These measures consist inter alia of including and applying in the T&Cs of the platform services certain requirements set out in the Directive. Footnote 72 The CDSMD establishes a new liability regime for “online content-sharing service providers,” a type of online platform hosting copyright-protected content. Footnote 73 The new rules impose direct liability on these providers for user-uploaded copyright-protected content they host, set aside the application of the hosting safe-harbor in the e-Commerce Directive, and mandate best efforts obligations on providers to deploy proactive and reactive preventive measures to detect and remove copyright-infringing content. Footnote 74 Among the safeguards recognized in this regime, these providers are required to inform their users in their T&Cs that they can use protected content under exceptions to copyright. Footnote 75 Crucially, this includes user-uploaded content that qualifies as “quotation, criticism, review” or “use for the purpose of caricature, parody or pastiche,” recognized by the CJEU as freedom of expression-based “user rights.” Footnote 76
Finally, the Terrorist Content Regulation lays down uniform rules to address the misuse of hosting services for the dissemination to the public of terrorist content online. Footnote 77 Building on the approach of the 2018 Recommendation on Tackling Illegal Content Online, this Regulation imposes on providers of hosting services a number of obligations regarding measures to address the dissemination of terrorist content online, including being subject to orders to remove or disable access to this type of content in all Member States. Footnote 78 This Regulation defines T&Cs as “all terms, conditions and clauses, irrespective of their name or form, which govern the contractual relationship between a hosting service provider and its users.” Footnote 79 Article 5(1) is the clear inspiration for Article 14 DSA. The provision states that hosting service providers exposed to terrorist content “shall, where applicable, include in [their T&Cs] and apply provisions to address the misuse of its services for the dissemination to the public of terrorist content.” Footnote 80 Furthermore, these obligations shall be carried out “in a diligent, proportionate and non-discriminatory manner, with due regard, in all circumstances, to the fundamental rights of the users and taking into account, in particular, the fundamental importance of the freedom of expression and information in an open and democratic society, with a view to avoiding the removal of material which is not terrorist content.” Footnote 81 Finally, T&Cs are also relevant in the context of safeguards, in particular as regards transparency obligations for hosting service providers. These providers must spell out clearly in their T&Cs their policy for addressing the dissemination of terrorist content, including the explanation of the functioning of specific measures, such as the use of automated tools. Footnote 82 To the best of our knowledge, this obligation is underexplored in theory and practice, with little research or evidence on how platforms have implemented it.
It is against this background, where the push towards additional platform regulation clashes with concerns over its effect on users’ freedom of expression, that any proposals to regulate platforms’ T&Cs must be considered. Article 14 DSA is meant to fill an important gap in addressing the wide margin of discretion enjoyed by intermediary service providers in their contractual relationship with users, which governs the human and automated moderation of illegal and harmful content online. The provision is particularly significant due to the horizontal nature of the DSA in EU law. The DSA lays down harmonized rules on the provision of intermediary services in the internal market and will apply to intermediary services provided to recipients of the service—including individual users—that have their place of establishment or are located in the EU. Footnote 83 As such, rules on T&Cs in the DSA will govern the contractual relationship between service providers and a “recipient of the service”—including individual users Footnote 84 —in the EU, directly regulating the conditions under which intermediaries can offer their services. When applied to Big Tech platforms, Article 14 DSA therefore has the potential to shape the provision of what some authors consider to be their core service: content moderation. Footnote 85
C. Mechanics of Article 14 DSA
The DSA was adopted and published in the Official Journal of the EU on October 19, 2022, Footnote 86 and is a regulation divided into five chapters. Chapter II sets out the regime for the liability of intermediary service providers. This consists of a revised version of the liability exemptions—for services of “mere conduit,” “caching” and hosting—and the prohibition on general monitoring obligations found in Articles 12 to 15 e-Commerce Directive, with some noteworthy additions in the form of a quasi “Good Samaritan” clause and rules on orders or injunctions. Footnote 87 Chapter III deals with due diligence obligations that are independent of the liability assessment made under the previous chapter, and which constitute a novelty in relation to the e-Commerce Directive. Footnote 88 It distinguishes between specific categories of providers, by setting out asymmetric obligations that apply in a tiered way to all providers of intermediary services, hosting providers, online platforms, very large online platforms (VLOPs) and very large online search engines (VLOSEs). VLOPs are online platforms that reach a number of average monthly active recipients of the service in the EU equal to or greater than 45 million—a number that represents about 10% of the EU population—and which are so designated pursuant to a specific procedure. Footnote 89 In practice, only the largest user-upload Big Tech platforms operating in the EU, like YouTube, Facebook, Instagram or TikTok, will likely qualify as VLOPs. Footnote 90 Hosting providers are a type of provider of intermediary services, online platforms a type of hosting provider, and VLOPs a type of online platform. The due diligence obligations are cumulative. Consequently, providers of intermediary services are subject to the fewest obligations and VLOPs and VLOSEs are subject to the most obligations. Crucially, all providers are subject to Article 14 DSA.
Article 14 is titled “Terms and conditions,” a concept that is defined as “all terms and conditions or clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.” Footnote 91 The rationale behind Article 14 is that while the freedom of contract of intermediaries is the general rule, “it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.” Footnote 92 As such, Article 14’s aim is to increase the transparency of T&Cs and bring their enforcement in direct relation to fundamental rights.
A crucial feature of Article 14 is that it not only applies to illegal content, like Chapter II, but also covers content contrary to the T&Cs of a provider of intermediary services. In doing so, and because it applies to all providers of intermediary services, the provision extends the obligations of Chapter III beyond illegal content. Consequently, the DSA covers a much broader scope of content moderation decisions than the e-Commerce Directive.
Article 14’s aims of transparency and enforcement are dealt with in two distinct sets of obligations. One set of obligations deals with information and transparency obligations. Footnote 93 The other deals with application and enforcement and, arguably, brings providers’ T&Cs within the scope of EU fundamental rights. Footnote 94 We discuss each set of obligations in turn, with the emphasis on the fundamental rights provision, which is central to our subsequent analysis.
I. Information and Transparency Obligations
Article 14(1) sets out a broad information obligation for providers of intermediary services regarding certain content moderation practices outlined in their T&Cs. It aims to ensure that the T&Cs are transparent and clear as to how, when and on what basis user-generated content can be restricted. The object of the obligation appears to be acts of content moderation by providers that impose “any restrictions” on recipients of the service. But it is unclear whether content moderation actions by the provider that do not strictly restrict what content users can post, such as ranking, recommending or demonetizing content, are within the scope of Article 14.
The second sentence of paragraph (1) explicitly refers to “content moderation,” a concept that covers activities undertaken by providers to detect, identify and address user-generated content that is either (i) “illegal content” Footnote 95 or (ii) incompatible with their T&Cs. Footnote 96 Further, the provision explicitly mentions “algorithmic decision-making,” raising the important question of what providing information on “any policies, procedures, measures and tools” might look like. Footnote 97 However, the exact scope of the paragraph remains unclear, as the phrasing in the first sentence of “any restrictions” appears wider than the definition of content moderation in the DSA, thereby likely broadening the provision’s scope.
In its last sentence, Article 14(1) sets out how this information should be conveyed. Echoing similar obligations in the General Data Protection Regulation (GDPR), Footnote 98 the T&Cs should be “clear.” However, where the GDPR refers to “clear and plain” language, Article 14(1) DSA goes one step further by requiring “intelligible, user friendly and unambiguous language,” which appears to result in a higher threshold obligation. Footnote 99 The same sentence further adds the requirement that the information at issue should be publicly available in an easily accessible and machine-readable format.
Building on this general clause, paragraphs (2), (3), (5) and (6) add more specific information and transparency obligations. First, inspired by an amendment proposed by the IMCO Committee, paragraph (2) requires that providers inform the recipients of the service of any significant change to the T&Cs. Footnote 100 The obligation is consequential because providers’ T&Cs change frequently, making it difficult for users to determine which obligations they are subject to over a period of time, a phenomenon that has been characterized and criticized as the “complexification and opacification” of platforms’ rules. Footnote 101 Second, paragraph (3) states that services (i) primarily directed at minors or (ii) predominantly used by them must explain their conditions and restrictions of use in a way that minors can understand. Footnote 102 This obligation reflects the fact that “the protection of minors is an important policy objective of the Union,” Footnote 103 and tracks other DSA provisions aiming to ensure additional protection to minors, such as those on transparency in recommender systems, negative effects on minors as a systemic risk, and mitigation measures. Footnote 104 Third, paragraphs (5) and (6) impose on VLOPs and VLOSEs additional obligations to: provide recipients of a service with a “concise, easily accessible and in machine-readable format summary of their T&Cs,” which must include “the available remedies and redress mechanisms, in clear and unambiguous language;” Footnote 105 and publish their T&Cs “in the official languages of all Member States in which they offer their services.” Footnote 106 These providers are encumbered with more onerous obligations due to their “special role and reach,” whereas smaller providers are spared so as to “avoid disproportionate burdens.” Footnote 107
Finally, some DSA provisions outside Article 14 can be viewed as complementing its transparency and information obligations. For instance, Article 27(1) sets out a somewhat similar, although less detailed, information obligation for all online platforms vis-à-vis recipients of the service to set out in plain and intelligible language the main parameters used in their recommender systems, as well as any options to modify or influence those parameters. In another illustration, Article 44(1)(b) states the Commission shall support the development and implementation of voluntary standards set by relevant European and international standardization bodies inter alia for “templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto.” Footnote 108
II. Fundamental Rights Application and Enforcement
From a fundamental rights perspective, the exciting part of Article 14 is paragraph (4), which regulates the application and enforcement of T&Cs: Footnote 109
Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.
The scope of the provision is the same as paragraph (1): It only applies to the enforcement of T&Cs that restrict user-generated content. The core obligation is directed at the providers to weigh the “rights and legitimate interests of all parties involved” in a “diligent, objective and proportionate” manner when applying their T&Cs. Special emphasis is placed on the fundamental rights of recipients of the service, with an explicit mention of their right to freedom of expression, including freedom and pluralism of the media.
Some guidance is provided in the recitals. Footnote 110 First, it is stated that the obligation applies not just to the application and enforcement of restrictions, but also to its design. Second, although Article 14(4) mentions the rights and interests of “all parties,” the recitals refer exclusively to the fundamental rights of the “recipients of the service.” Third, when complying with this obligation, providers should act not only in a “diligent, objective and proportionate manner” but also in a “non-arbitrary and non-discriminatory manner.” Fourth, it is emphasized that VLOPs should pay particular regard to “freedom of expression and of information, including media freedom and pluralism.” Fifth, a clear connection is made between the provision and the need for all providers to “pay due regard to relevant international standards for the protection of human rights, such as the United Nations Guiding Principles on Business and Human Rights.” Footnote 111 We return to these points below. Footnote 112
In addition, some provisions in the DSA directly or indirectly refer to Article 14(4) and the application and enforcement of T&Cs in light of fundamental rights. Footnote 113 One instance is found in the “risk assessment” regime for VLOPs. Footnote 114 Under this regime, VLOPs shall “diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.” Footnote 115 To do so, they must carry out yearly risk assessments, including vis-à-vis specific categories of systemic risks, such as the dissemination of illegal content through their services, and any actual or foreseeable negative effects for the exercise of fundamental rights. Footnote 116 Crucially, VLOPs must focus not only on illegal content but also on information that is legal but contributes to the systemic risks identified in the DSA, like disinformation. Footnote 117 When carrying out such assessments, VLOPs are required to take into account whether and how certain facts influence the specified systemic risks, including the appropriateness of the applicable T&Cs and their enforcement. Footnote 118 If such a risk is identified, VLOPs must then “put in place reasonable, proportionate and effective mitigation measures” tailored to the systemic risk identified, “with particular consideration to the impacts of such measures on fundamental rights.” In this case, such a measure would entail the adaptation of a VLOP’s T&Cs and their enforcement, in line with Article 14 and the remaining provisions on T&Cs. Footnote 119 Similar mitigation measures may be applicable in the context of the special regime for crisis protocols. Footnote 120
D. The Application and Enforcement of Fundamental Rights through Terms and Conditions in the Digital Services Act
Building on the analysis above, this section identifies three crucial issues related to the application and enforcement of fundamental rights through T&Cs under Article 14 DSA. First, we explore whether Article 14(4) leads to the horizontal application of fundamental rights in platforms’ T&Cs, considering ECtHR and CJEU case law, as well as some recent national judgments on the topic (D.I.). Second, we explore whether and how it is possible to operationalize the vague wording of Article 14 through other, more concrete provisions in the DSA (D.II.). Third, we reflect on the limitations of applying fundamental rights in T&Cs for addressing the unequal contractual relationship between platforms and users, drawing on external critiques of the law to highlight the risks of overreliance on fundamental rights in this context (D.III.). Guiding our analysis is the overarching question of whether Article 14, and in particular applying EU fundamental rights to the enforcement of platforms’ T&Cs, constitutes a suitable mechanism to rebalance the unequal position between platforms and their users and effectively curb the power of Big Tech.
I. Indirect Horizontal Effect of Fundamental Rights through Terms and Conditions
The first crucial issue we examine refers to the implications of interpreting Article 14(4) as imposing an obligation on intermediary service providers—in particular online platforms—to apply and enforce restrictions on content under their T&Cs “with due regard” to the rights of all parties involved, including the fundamental rights of recipients of the service “as enshrined” in the EU Charter. Footnote 121 To answer this question, we first highlight relevant human rights standards under the ECHR and Charter (D.I.1), followed by a reference to select national decisions in Germany and the Netherlands (D.I.2), and conclude with a reflection on the potential horizontal effect of Article 14(4) DSA (D.I.3).
1. The ECHR and the Charter
The first point to note is that such a reading of Article 14 would be perfectly consistent with influential recommendations from international and European human rights bodies. As noted, under international human rights standards, the UN Special Rapporteur on freedom of expression has explicitly stated that platforms should “incorporate directly” principles of fundamental rights law into their T&Cs. Footnote 122 This is to ensure that content moderation is guided by the “same standards of legality, necessity and legitimacy that bind State regulation of expression.” Footnote 123 The Council of Europe’s Committee of Ministers and international human rights organizations have made similar recommendations. Footnote 124
Notably, the incorporation and application of fundamental rights is not envisaged as merely conducting box-ticking human rights impact assessments of a platform’s T&Cs. Rather, it would include the actual application of fundamental rights principles to content moderation decisions, such as the principle of proportionality, under which any restriction of content should align with the “least restrictive means” principle and be “limited in scope and duration.” Footnote 125 As such, Article 14(4) could be read as translating such human rights recommendations into a legal obligation on platforms to apply fundamental rights law under their T&Cs not just through general obligations, like risk assessments, but also in concrete decisions.
For instance, Recital 47 DSA makes specific reference to human rights standards, namely the UN Guiding Principles on Business and Human Rights. Footnote 126 This framework includes a set of foundational and operational principles on corporate social responsibility to respect human rights that would be relevant in this context. For this purpose, business enterprises should inter alia have in place certain policies and processes, such as: Embed their responsibility to respect human rights in clear policy statements; carry out human rights due diligence processes and risk assessments for actual or potential adverse human rights impacts from their activities; deploy and monitor the effectiveness of mitigation measures to address the risks identified; put in place transparency reporting obligations regarding such assessments and measures; and provide for or cooperate in the remediation of adverse impacts through legitimate processes. Footnote 127 Importantly, the application and development of these principles and the “Protect, Respect and Remedy” Framework to online intermediaries—including Big Tech platforms—has been endorsed by the Council of Europe’s (CoE) Committee of Ministers, and recommended to member states and online platforms. Footnote 128 Interestingly, it is possible to find echoes of these principles not only in Article 14 DSA but also in the risk assessment and mitigation measures referring to it in the DSA.
First, these standard-setting instruments provide further helpful guidance on how Article 14(4) DSA could be operationalized by platforms. For example, the aforementioned CoE Committee of Ministers has issued detailed guidance for platforms on the implications of applying fundamental rights, including ensuring that all staff who are “engaged in content moderation” should be given “adequate” training on the applicable “international human rights standards, their relationship with the intermediaries’ terms of service and their internal standards, as well as on the action to be taken in case of conflict.” Footnote 129 This is a crucial point for ensuring platforms apply fundamental rights in content moderation decisions. While the wording of Article 14 might not suggest an obligation for platforms to ensure adequate training and an alignment between their public-facing T&Cs and internal policies for staff, it is difficult to see how Article 14 could function without these requirements.
Second, the question arises as to what EU fundamental rights law a platform would apply, and what specific principles would be applied. The fundamental right most readily engaged through restrictions on user content under a platform’s T&Cs would be the right to freedom of expression, guaranteed in Article 11 Charter. However, content moderation decisions may also affect different rights, such as one user’s post which implicates another user’s right to private life, guaranteed under Article 7 Charter, or another user’s right to protection of personal data, under Article 8 Charter, or even the right to non-discrimination under Article 21 Charter. And finally, it must be recognized that platforms themselves also enjoy fundamental rights under the Charter, including the right to freedom of expression, Footnote 130 the right to property (Article 17) and the freedom to conduct a business (Article 16). As discussed below, these rights and freedoms play a part in national case law. Footnote 131
As such, when a platform makes a content moderation decision about a user post it may have to engage in a complex fundamental rights balancing exercise. It will be required to have due regard to a user’s right to freedom of expression, but may also be required to balance that user’s freedom of expression with another user’s rights, or indeed, the platform’s own rights. However, the wording of the rights guaranteed in the EU Charter is broad and open-ended, and gives little concrete guidance to platforms. Therefore, in order to apply these rights, platforms must consider the relevant CJEU case law, as well as the considerable ECtHR case law on freedom of expression online. This is because in an important 2019 judgment, the CJEU expressly confirmed that Article 11 Charter should be given the “same meaning and the same scope” as Article 10 ECHR, “as interpreted by the case-law of the European Court of Human Rights.” Footnote 132
Although this is no easy task, the significant body of ECtHR case law within the remit of Article 14(4) DSA could be beneficial from the standpoint of fundamental rights protection. The reason is that while the CJEU has some case law on freedom of expression, the ECtHR has delivered many—and more detailed—judgments concerning online platforms and content posted by users, which would provide concrete guidance for the application of Article 14(4) DSA. For example, the ECtHR has delivered important judgments on issues relevant for content uploaded to platforms, with prominent examples including judgments on whether posts on Instagram as part of the #metoo movement, describing a public figure as a “rapist bastard,” were protected by freedom of expression; Footnote 133 whether posts on Facebook should be considered terrorist propaganda; Footnote 134 whether religious videos uploaded to YouTube constituted hate speech; Footnote 135 whether uploading Pussy Riot videos to YouTube constituted “extremist” content; Footnote 136 whether posting intimate images of a person to social media without consent violated the right to private life; Footnote 137 or whether posting hostile remarks about the police was extremist content. Footnote 138 Importantly, some of these judgments have involved not only freedom of expression, but also how to balance this right with the rights of others, such as a person’s right to private life under Article 8 ECHR.
In this regard, it is noteworthy that the ECtHR has held that when balancing rights under Article 8 and Article 10, both rights enjoy “equal protection” under the ECHR, and the outcome of the balancing exercise should not vary depending upon whether it is the person asserting their Article 10 rights or the person targeted asserting their Article 8 rights, because these rights deserve, “in principle, equal respect.” Footnote 139 As such, although the requirement in Article 14(4) DSA for platforms to balance the rights and interests of “all parties” is consistent with ECtHR case law, we are less certain about the supporting Recitals’ exclusive reference to the fundamental rights of the “recipients of the service.” That is to say, to consider that platforms should take into account only a sub-set of fundamental rights in their balancing exercise is questionable, because the fundamental rights of non-recipients of a service must also be taken into consideration as per ECtHR case law. Footnote 140 The upshot is likely to be an even more complex balancing exercise for intermediary service providers.
Of course, the judgments mentioned above involve the application of fundamental rights in the “vertical” relationship between individuals and the State. They are judgments on platform users initiating applications to the ECtHR over government interference with freedom of expression, such as a prosecution for incitement to hatred over a YouTube video. Crucially, however, there is no case law from the ECtHR or CJEU concerning a user initiating legal proceedings against a platform for removing content. Still, the important principles contained in the ECtHR judgments can be applied by platforms, particularly on what constitutes incitement to violence, incitement to hatred, terrorist propaganda, or extremist material, and the kind of weight to attach to certain political content, such as expression on “matters of public interest,” which enjoys the highest protection under Article 10 ECHR. Footnote 141
Thus, for example, platforms should ensure that content rules set out in their T&Cs satisfy the requirements of Article 10 ECHR, namely that restrictions are “accessible” to users and “foreseeable” as to their effects, and “formulated with sufficient precision” to enable users to “foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail and to regulate their conduct.” Footnote 142 In this regard, the UN Special Rapporteur on freedom of expression has stated that Facebook’s “coordinated inauthentic behavior” policies can have an “adverse effect” on freedom of expression due to their “lack of clarity,” and can have “adverse effects” on legitimate campaigning and activist activities online. Footnote 143 Notably, the ECtHR has also elaborated upon the principle that Article 10 incorporates important procedural safeguards beyond mere ex ante transparency and information obligations. These include that users should have “advance notification” before content is blocked or removed, know the grounds for content being blocked, and have a forum to challenge a restriction on expression. Footnote 144 Further, content moderation decisions should “pursue a legitimate aim” and be “necessary in a democratic society,” meaning that any measure taken must be “proportionate.” Footnote 145
Indeed, one platform, Meta, the owner of Facebook and Instagram, which are both qualified as VLOPs under the DSA, is already attempting to put in place mechanisms to engage in content moderation through the application of international human rights law, similar to the application of ECtHR case law mentioned above. Footnote 146 In this regard, Meta’s Oversight Board’s decisions are particularly instructive. Footnote 147 For example, the Oversight Board recently examined Meta’s decision to remove a post by a Swedish user describing incidents of sexual violence against two minors, for violating Meta’s T&Cs on child sexual exploitation, abuse and nudity. Footnote 148 In overturning Meta’s decision to remove the post, the Oversight Board applied principles developed by UN human rights bodies under Article 19 of the International Covenant on Civil and Political Rights (ICCPR), Footnote 149 which guarantees the right to freedom of expression, including the framework of (a) legality; (b) legitimate aim; and (c) necessity and proportionality. The Board found that Meta failed to define key terms, such as “sexualization of a minor,” and concluded that the post was “not showing a minor in a ‘sexualised context,’” as the “broader context of the post makes it clear that the user was reporting on an issue of public interest and condemning the sexual exploitation of a minor.” Footnote 150 As such, removing content “discussing sex crimes against minors, an issue of public interest and a subject of public debate, does not constitute the least intrusive instrument.” Footnote 151
Thus, the Meta Oversight Board’s approach of applying international human rights standards is instructive for how Article 14(4) DSA could be applied to platforms’ content moderation decisions. Of course, one major limitation of Meta’s Oversight Board is that—at the time of writing—the Board “receives thousands of appeals per week” but “only selects a small number of cases for review;” as such, the application of international fundamental rights is only occurring on appeal following Meta’s removal of content. Footnote 152 That is to say, the scope of competence of the Oversight Board only covers a limited portion of what under the DSA would qualify as restrictions imposed on the users of platforms’ services, such as in the context of broadly defined content moderation activities. As such, Article 14(4) DSA would require the application of fundamental rights already at the initial content moderation decision.
What seems clear from the analysis so far is that certain international human rights standards may be partly met by the imposition on platforms of information and transparency obligations of the type Article 14 DSA imposes. Footnote 153 Still, the most substantive requirements stemming from these standards would necessitate a concrete implementation of Article 14(4) in the content moderation practices of platforms.
2. National Courts
National courts in EU member states are also seeking to apply fundamental rights in the relationship between online platforms and users. Dutch and German courts in particular have delivered several notable judgments on this issue in the context of ‘put-back requests,’ where users want platforms to reinstate their content or accounts. Footnote 154 As Article 14(4) DSA targets the T&Cs, the provision will ultimately have its impact in civil litigation based on tort or contract law, which fall within national competencies.
In the short span of a little more than a year, several Dutch district courts issued at least six judgments on requests by public figures to reinstate their content after it was removed by Facebook (Meta), LinkedIn or YouTube for violating those platforms’ T&Cs. Footnote 155 Notably, in all six cases the claims are explicitly based on a direct or indirect application of the right to freedom of expression under Article 10 ECHR, an admissible cause of action under Dutch law. Footnote 156 The cases all concern violations of the providers’ T&Cs through Covid-19 related disinformation, and the courts recognize the importance of large social media platforms for online public debate and for people’s ability to express themselves as a factor in their reasoning. The judgments emphasized that in modern society these platforms are “of great importance to convey a message” Footnote 157 and are, “to a certain extent, entrusted with the care for public interests.” Footnote 158 Nevertheless, in all cases this reach and role of the large social media platforms were deemed insufficient ground to require them to carry or reinstate content based on the right to freedom of expression.
Despite the novelty of the topic, it is possible to identify a shared approach in this case law, whereby the courts are cautious and reserved in their application of fundamental rights. A direct horizontal application of the right to freedom of expression is rejected and, in the indirect horizontal application, the fair balance between the different fundamental rights is only marginally reviewed, setting a high bar for a reinstatement request on these grounds. Footnote 159 This indirect horizontal effect is constructed via open norms governing the private contractual relationship between platform and users. Footnote 160 The fundamental rights of the user, such as freedom of expression, are weighed against the right of property of the platform of which the T&Cs are a manifestation. Footnote 161 Subsequently, the Dutch courts rely on ECtHR case law to infer that there is only space for a marginal review of the fair balance between these conflicting fundamental rights. Footnote 162 Only if the restrictions in a platform’s T&Cs can be said to make “any effective exercise” impossible and “the essence of the right has been destroyed” are the courts required to grant a request for reinstatement. Footnote 163 Notably, there has been criticism that this case law is too conservative in the application of fundamental rights. Footnote 164
The German case law differs from the Dutch not only because of the particular rules of Dutch constitutional review but also, in large part, because of Germanyʼs unique constitutional tradition of indirect horizontal application (mittelbare Drittwirkung) of fundamental rights. Footnote 165 Analyzing earlier case law involving “put-back requests” on, inter alia, social media platforms like Facebook and Twitter, Kettemann and Tiedeke conclude that within the German context “the violation of the terms of service does not always suffice to justify a deletion of a statement if it is protected under the fundamental right to freedom of expression.” Footnote 166
Since 2019, two influential cases show how German courts are less reluctant to impose rights-based obligations on platforms based on a fundamental rights analysis. Footnote 167 Both cases involved Facebook and the suspension of accounts. In the latter case, decided in July 2021, Germany’s Federal Court of Justice ordered Facebook to reinstate posts it had deleted from a user’s account based on its T&Cs on hate speech. Footnote 168 The Court held that, in an interest-based balancing of the conflicting basic rights, Facebook was required to inform the user of the intended blocking of his account in advance, state the reason, and give him the opportunity to reply. Examining this case law, Hennemann and Heldt conclude that, notwithstanding the right of even the largest platforms to set contractual prohibitions that go beyond illegal content, the indirect horizontal application of constitutional rights imposes on these platforms procedural obligations when applying and enforcing their T&Cs to content that is otherwise lawful. Footnote 169
Based on this brief overview, at least two insights emerge that are relevant to our analysis. First, through the prism of the indirect effect of fundamental rights, both German and Dutch case law show how public values seep into, and can be integrated into, private relations between platforms and users. Footnote 170 This insight matters for the correct design, application and enforcement of fundamental rights because the judgments in question identify concrete factors to consider in this context, especially when the information or content at issue is legal but incompatible with a platformʼs T&Cs. Such factors include the size, role, market power and resources of the platform making the content moderation decision, as well as the type of user in question—private individual, politician or other public figure—and the nature of their communication, for example whether or not it is a political or public interest statement. Second, as argued by van der Donk, it appears clear that national differences in the legal conceptualization of this private relationship between platforms and users greatly influence the possibility and chances of a successful appeal to fundamental rights. Footnote 171 This aspect matters when considering a provision like Article 14 DSA, which aims at EU-wide harmonization of this diverse playing field of application and enforcement of fundamental rights through T&Cs.
3. Horizontal Effect
In light of the analysis of the ECHR, the Charter and national case law above, the question arises whether, and to what extent, Article 14(4) DSA implies the horizontal application of fundamental rights between platforms and recipients of the service, especially users. Indeed, Article 14(4) may result in a regulatory body or court having to review how a platform—a private company—applied fundamental rights affecting a user—a private individual. In this regard, a question raised by the enforceability of Article 14(4) is how a decision by a platform, in which fundamental rights were applied pursuant to that provision, would or should be reviewed by an independent body, a regulator or indeed a national court.
In answering this question, it must be noted that an important element of Article 10 ECHR and Article 11 Charter is the principle of positive obligations. Footnote 172 Under the right to freedom of expression, States not only have a negative obligation to “abstain” from interfering with free expression, but they also have a positive obligation to protect free expression, “even in the sphere of relations between individuals,” including between an individual and a “private company.” Footnote 173 Helpfully, the ECtHR has a line of case law on positive obligations, where it has reviewed interferences with free expression by private companies, and given guidance on how any balancing of fundamental rights should be reviewed. Indeed, the 17-judge Grand Chamber of the Court has confirmed that States can have a positive obligation to protect freedom of expression, “even against interference by private persons.” Footnote 174
In its landmark Remuszko v. Poland judgment, the ECtHR allowed a private individual to make an application to the Court where a Polish media outlet had refused to publish a paid advertisement, with the Court reviewing the decision under Article 10 ECHR. Footnote 175 The Court rejected the Polish government’s argument that the ECtHR could not review the decision because it was a dispute “between private parties, whereas the rights and freedoms enshrined in the Convention were of a vertical nature, in that they concerned relations between the State and individuals.” Footnote 176 The Court held that “genuine, effective exercise of the freedom of expression does not depend merely on the State’s duty not to interfere,” and that a State’s positive obligations under Article 10 ECHR may require measures of protection “even in the sphere of relations between individuals.” Footnote 177 The Court then reviewed the domestic courts’ decisions upholding the media outlet’s refusal, and held that the domestic courts (a) “carefully weighed the applicant’s interests against the legitimate rights of the publishers, such as their own freedom of expression and economic freedom;” Footnote 178 (b) were “aware” of the “human rights issues” of the case and applied the ECtHR’s principles on freedom of expression; and (c) applied the “principle of proportionality.” As such, the ECtHR concluded that the analysis was “fully compatible with the Convention standards.” Footnote 179
The review framework applied by the ECtHR could be an appropriate framework for weighing the rights of a user against the rights of platforms. Notably, the ECtHR applies a relatively light standard of review: if the reasoning of the domestic courts’ decisions concerning the limits of freedom of expression is “sufficient and consistent” with the criteria established by the Court’s case law, the Court would require “strong reasons” to “substitute its view for that of the domestic courts.” Footnote 180
Finally, in the context of the EU Charter, it should be mentioned that Advocate General (AG) Øe of the CJEU has discussed the issue of platforms and positive obligations flowing from Article 11 EU Charter in Poland v. Parliament and Council. Footnote 181 Although the issue was not necessary to the case, and was merely obiter dictum, AG Øe emphasized the “importance” of certain platforms, which had become “essential infrastructures,” for the exercise of freedom of online communication. Footnote 182 And of particular note, AG Øe admitted that it was an open question whether certain platforms are “required,” given their important role, to “respect the fundamental rights of users,” citing the ECtHR’s case law. Footnote 183
In our view, the analysis in this section points clearly towards an acceptance of the indirect horizontal effect of human rights in the relationship between online platforms and their users. If that is the case, then for the application and enforcement, and arguably the design, of T&Cs to take due regard of fundamental rights, they must do so within the framework of the international and European standards discussed herein. This is especially true where the restrictions imposed by platforms relate to content that is otherwise lawful. Still, even with the rich case law of the ECtHR outlining some of the substantive and operational principles in this context—and national case law pioneering its application—more concrete guidance is needed on how exactly to make a framework provision like Article 14(4) operational in the horizontal relationship between platform and user. We now turn to this crucial issue.
II. Operationalization and Compliance: How Article 14(4) Interacts with the Rest of the DSA
The text of Article 14(4) DSA provides little guidance on the concrete obligations and rights it creates for, respectively, platforms and users. This section teases out the unanswered questions of how platforms can operationalize and comply with Article 14(4), and what claims users can base on the provision, by placing it in the context of the DSA as a whole, with a focus on VLOPs. Building on our explanation of the mechanics of Article 14(4) above (C.II), it is possible to identify several mechanisms within the DSA that could mandate how platforms should operationalize Article 14(4), and possibly offer users a way to ensure platforms apply this provision to their content moderation.
The DSA takes a mixed approach to the operationalization of Article 14(4). On the one hand, it contains a set of provisions that focus on individual procedural safeguards against platformsʼ content moderation decisions vis-a-vis their T&Cs (Articles 17, 20, 21, 23 and 53). On the other hand, it also attempts to regulate the application and enforcement of fundamental rights through T&Cs in a more holistic fashion for VLOPs, by treating it as a systemic risk subject to mitigation measures in Articles 34 and 35. We look at these two dimensions in turn.
1. Individual Procedural Safeguards
First, Article 17 DSA requires that providers of hosting services—including online platforms—offer a “clear and specific” statement of reasons to affected users for a range of content moderation restrictions, including when the decision is based on T&Cs. This statement must include a “reference to the contractual ground relied on and explanations as to why the information is considered incompatible with that ground.” Footnote 184 In light of Article 14(4), it can be argued that statements of reasons issued by platforms would need to include the platform’s application of fundamental rights principles, similar to the decisions issued by Meta’s Oversight Board. In this regard at least, the operationalization of Article 14(4) is carried out by reinforcing the information and transparency obligations of online platforms.
Second, Article 20(1) DSA requires platforms to establish internal complaint-handling systems, where users can complain about a platform’s content moderation decisions, including those based on T&Cs. When Article 20(1) is read together with Article 14(4) DSA, it can be argued that platforms’ complaint-handling systems should issue decisions “having due regard” to the user’s fundamental rights, including freedom of expression. Moreover, Article 20(4) seems to imply that users can complain directly to the platform about the way in which their fundamental rights have been considered in a platform’s content moderation decision. This would mean the platform must not only review whether the conduct or content was illegal or in violation of its T&Cs, but also review its own enforcement and application process and whether the userʼs fundamental rights have been weighed appropriately. Importantly, Article 21 DSA provides that users shall be entitled to select a certified out-of-court dispute settlement body “to resolve disputes” relating to platform decisions, “including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems” Footnote 185 under Article 20. Crucially, under Article 21(2), these dispute settlement bodies “shall not have the power to impose a binding settlement of the dispute.” In our reading, Article 21 suggests that a user can ask a certified settlement body to review not only a platform’s content moderation decision but also whether, in the process leading up to that decision, the platform considered the relevant fundamental rights pursuant to Article 14(4). The lack of detail in Article 21 on the establishment of these dispute resolution bodies, the standards they should employ, and the applicable procedural safeguards has led to severe criticism for creating a possible “race to the bottom” as well as legal uncertainty. Footnote 186 Depending on the implementation and possible further guidance, the combination of these provisions could provide for a concrete, albeit non-binding, application or consideration of fundamental rights in the enforcement of T&Cs by platforms, in the context of both internal complaint-handling systems and out-of-court dispute settlement. Thus, especially for users, these provisions could offer some procedural paths to invoke fundamental rights against all online platforms as regards content moderation restrictions.
Third, Article 53 DSA provides that, inter alia, platform users—or any representative body, organisation or association—shall have a right to lodge a complaint against providers of intermediary services “alleging an infringement of this Regulation” with a Digital Services Coordinator (DSC), further requiring that the DSC “assess the complaint.” Footnote 187 Crucially, the broad wording of Article 53, and especially the wording of supporting Recital 118, implies that a user can lodge a complaint with the DSC on the grounds that a platform violated Article 14 by failing to adequately apply fundamental rights in a content moderation decision. Footnote 188 Notably, this right to lodge a complaint extends to the compliance of all intermediary services, whereas Articles 17 and 20 DSA apply only to hosting service providers and online platforms respectively, and by extension VLOPs. While these provisions do indicate some minimum norms on how online platforms and VLOPs are supposed to operationalize and comply with Article 14, they offer nothing to the broader category of intermediary services covered by Article 53.
Read together, the provisions examined above offer some indication not only of what compliance with Article 14(4) means but also of the possibilities it creates for users. It is important to note that these are due diligence provisions, which are mostly procedural in nature. That is to say, they do not create a new substantive right or a right of action as a judicial remedy for recipients of the service, namely individual platform users. Footnote 189
Overall, the guidance provided in the DSA as regards the application and enforcement of these provisions in connection with Article 14 is limited, and significant uncertainty remains. It is possible to identify a number of challenges. First, at a minimum, Articles 17 and 20 mean that, in their content moderation enforcement, online platforms and VLOPs will have to offer clarity not only on the contractual or legal basis for a decision but also on whether the enforcement of these norms is in line with the applicable fundamental rights. Yet, given the enormous volume of content reviewed by platforms, the level of detail and nuance required for compliance in individual cases remains to be seen.
Second, and similarly, the lack of guidance in Article 14(4) DSA itself on how the weighing of the different interests and rights involved should be carried out adds another layer of uncertainty as to whether the provision will, in practice, offer meaningful transparency and force platforms to actually weigh fundamental rights in individual cases.
Third, there is uncertainty surrounding the right to lodge a complaint created by Article 53 DSA. The DSC appears to have extensive discretionary power to decide whether or not to take up a complaint, which is particularly relevant because the provision does not create a separate right of action for the recipient of the service.
Fourth, as DSCs will handle complaints in accordance with national administrative law, we can expect divergences in this context similar to those observed for Data Protection Authorities under the GDPR across different Member States, especially as regards the capacity and policy focus of DSCs. Footnote 190
Fifth, a shared challenge for the several individual complaint procedures is the possibility of coordinated abuse. With regard to Articles 17 and 20 DSA, the potential for abuse is explicitly foreseen in Article 23 DSA, which we have mentioned above. Footnote 191 This provision obligates online platforms to suspend users that “frequently submit notices or complaints that are manifestly unfounded.” Article 23(4) imposes an information obligation by requiring that the procedure leading up to such a suspension be set out clearly and in detail in platforms’ T&Cs. However, the focus on individual entities or persons frequently submitting manifestly unfounded notices calls into question the extent to which Article 23 DSA can address coordinated abuse and harassment through these complaint mechanisms. For example, so-called “coordinated flagging,” where different users decide to simultaneously submit notices “with the purpose of having the impacted individual(s) banned or suspended,” Footnote 192 does not necessarily clear the high bar set by Article 23(3) if the people involved do not engage in this type of harassment “frequently” or when their notices are not “manifestly unfounded.” Similarly, there is no specific measure addressing this in the context of Article 53 DSA, which means DSCs will have to fall back on national administrative law.
2. Systemic Obligations
The obligations discussed thus far tackle one important dimension of the operationalization of Article 14(4), namely procedural safeguards for individuals’ fundamental rights vis-a-vis restrictive content moderation decisions by platforms. To be sure, this type of provision is a necessary component of any serious suite of legal rules to ensure the concrete application and enforcement of fundamental rights via T&Cs. This is because general obligations on platforms, such as human rights impact assessments, would be per se insufficient in this regard. Footnote 193 On the one hand, their implementation with any degree of comprehensiveness and effectiveness would likely be disproportionate for small and medium-sized platforms, which lack the capacity and resources to meet such obligations. On the other hand, because such obligations speak only to the structure of the system and not to its application in individual cases, they lack the concreteness required to be effective at the individual level.
However, on the flip side, a sole focus on individual decisions fails to take into account the effect of T&Cs enforcement at the systemic level. Such an outcome would be woefully inadequate from the perspective of the fundamental rights protection of users on the largest platforms, such as VLOPs. As Douek argues, the massive scale and speed of content moderation decisions on these platforms make this endeavor “a project of mass speech administration,” and “looking past a post-by-post evaluation of platform decision-making reveals a complex and dynamic system that need a more proactive and continuous form of governance than the vehicle of individual error correction allows.” Footnote 194
As previously discussed, the “risk assessment” regime for VLOPs in the DSA is meant to address this systemic level and to subject the T&Cs as a whole to a type of human rights assessment procedure. Footnote 195 The way in which the T&Cs are applied and enforced is explicitly referred to as one of the factors that should be assessed with regard to their “actual or foreseeable negative effects for the exercise of fundamental rights.” Footnote 196 In addition, adapting the T&Cs and their enforcement is one of the possible measures VLOPs can take to mitigate such systemic risks. Footnote 197 In our view, these assessment and mitigation obligations provide an opportunity to assess the wider context and framework of content moderation practices, beyond the remit of individual decisions. From this perspective, these obligations are consistent with some of the requirements stemming from international human rights standards mentioned above, including the application to platforms of procedural principles of corporate social responsibility. Footnote 198 This would also be one possible approach to having “due regard” for fundamental rights in the “design” of T&Cs, as mentioned in Recital 47 DSA, as a necessary result of the implementation of risk mitigation measures to bring those terms in line with the requirements of Article 14(4) DSA.
But this approach embodied in the DSAʼs systemic risk provisions is not immune to criticism. Footnote 199 Barata, for instance, points to the vagueness of the risk assessment obligations, noting their uncertainty, the challenges associated with applying them in practice, and the excessive discretion they grant to the relevant authorities. Footnote 200 Laux, Wachter and Mittelstadt further warn against the peril of “audit capture,” where “VLOPs may leverage their market power against their new mandatory auditors and risk assessors.” Footnote 201 These valid concerns, if realized, would substantially weaken the potential positive effect of the risk assessment regime.
Finally, as hinted at in the preceding paragraphs, open questions remain with regard to the enforcement of Article 14(4) DSA. These range from the different national approaches to the interaction of tort law with fundamental rights, to the embedding of DSCs in national administrative law, and the coordination between supervisory authorities at the European level. One possibility is that, to further substantiate platforms’ obligations and clarify enforcement, the Commission and the European Board for Digital Services will draw up codes of conduct pursuant to Article 45 DSA. This could offer more legal certainty as well as create the possibility of a tiered application of Article 14(4) DSA, under which only the VLOPs would take on more substantial obligations that would not be proportionate for all service providers. However, such a construction would rely on platforms’ voluntary cooperation.
III. Fundamental Rights in T&Cs: Limitations of Relying on Fundamental Rights
In the preceding sections, we analyzed from an internal legal perspective how Article 14 DSA aims to rebalance the unequal contractual relationship between users and platforms by injecting fundamental rights into the heart of that contract, the application and enforcement of T&Cs. However, questions on the efficacy as well as the aim of the provision itself should, similarly, be raised from an external perspective so as to appreciate the scope and limitations of the fundamental rights framework in the context of content moderation.
Within the broader context of EU platform regulation, Footnote 202 the push for the protection of fundamental rights in the content moderation process can be seen as part of a “digital constitutionalism” response to the relative dominance of the big social media platforms. Celeste describes this as the “ideology that adapts the values of contemporary constitutionalism to the digital society” and as the “set of values and ideals that permeate, guide and inform” normative instruments regulating platforms. Footnote 203 Forcing platforms to engage directly with fundamental rights in the enforcement of their content moderation policies fits squarely within the broader movement of scholars arguing for rule-of-law-based responses to the particular challenges of platforms’ private ordering, as it normatively constrains their actions without relying on inappropriate state intervention. Footnote 204 Notwithstanding the potential importance of Article 14’s fundamental rights intervention, as well as the value of the broader rule of law response, three important limitations to applying the fundamental rights framework in this way must be noted from a systemic perspective. Appreciating these limitations helps us understand where the value of Article 14 lies and can guard against inflated claims co-opting the provision.
The first limitation relates to the scope of—and the inherent risks of over-reliance on—the fundamental rights framework itself. There is a rich academic tradition, mainly in critical legal studies and feminist legal theory, that criticizes, amongst other things, the individualistic, apolitical and acontextual orientation of fundamental rights, which often makes them unsuitable to deal with collective or structural injustices. Footnote 205 Building on this tradition, Griffin argues that this individual-oriented framework is unable to address important societal issues, such as those faced by marginalized communities on big social media platforms. Footnote 206 For example, an individual approach is arguably unsuitable to articulate the unequally distributed error margins of algorithmic systems of content moderation, Footnote 207 or the collective harm of perpetuating certain stereotypes. Additionally, an over-reliance on fundamental rights can eclipse the very existence of collective and systemic harms. Because they fall outside the frame of individualized harms or are not connected to the language of rights infringements, these systemic issues become difficult to articulate and easy to obscure. As such, an emphasis on human rights might eclipse many systemic harms in favor of an individualized and procedural weighing exercise. This danger is compounded by the depoliticizing effect for which human rights are at times criticized. Footnote 208 The universalistic language, the focus on individual cases, and the emphasis on procedure could serve to neutralize the political nature of content moderation norms. Indeed, considering that the DSA, and Article 14(4) in particular, extend the regulatory framework to cover lawful content that violates the T&Cs, an overly strong focus on human rights compliance can neutralize the very real political nature of these speech norms. Even though there are clear arguments for applying instruments designed to depoliticize these types of decisions rather than relying on the platform’s own discretion, the criticism remains that the human rights framework is unable to address certain harms and that its depoliticizing function can obfuscate both its own ideological background and the political discussions it addresses.
A second and related limitation builds on the critique that human rights are not well positioned to address structural inequalities. In particular, the advertisement-based business model that social media platforms rely upon is not easily questioned from this perspective. As is well established in the online speech governance literature, the broader commercial interests of platforms are a main driver of their content moderation policies. Footnote 209 These interests range from shielding themselves from liability and creating an inviting space for users, to attracting enough advertisers by offering access to enough of the right type of people and their associated data points. As such, the political economy of social media platforms forms the primary logic of their content moderation practices and is one of the most important structural conditions that can produce inequality in those practices. The fundamental rights framework disregards how this political-economic dimension forms the structural background and, rather, legitimizes this dynamic by including the platforms’ own right to property and/or freedom to conduct a business, which are to be balanced against users’ rights. Footnote 210 In Part B we discussed how the unequal contractual relationship between platforms and users in the T&Cs forms one of the major challenges in regulating content moderation that the European fundamental rights framework was hoped to address. And yet, we see here how the structural conditions underlying this imbalance remain mostly untouched by that framework, and are even given space to reproduce themselves by weighing platforms’ commercial interests on the same level as users’ fundamental rights. Footnote 211
A third limitation is that, while Article 14 DSA can serve to constrain platforms in their content moderation practices and to reorient them towards their users’ fundamental rights, it does not address the background conditions determining where the power to set and enforce online speech norms is located. Even though fundamental rights do contain proportionality and procedural requirements in their de facto enjoyment and protection, they do not center on questions of power as such. Who can influence the T&Cs themselves, as well as the content moderation response to a violation—removal, downranking, labelling, et cetera—including the design and deployment of algorithmic systems to enforce those T&Cs, remains largely uncontestable. This connects to the broader critiques on the depoliticizing effect of the fundamental rights discourse and on the distribution of power and involvement of different actors in online speech governance. Footnote 212
Looking at these limitations of the fundamental rights framework, one could, of course, argue that it is not fair to expect too much from a single provision such as Article 14 DSA and that, in its limited scope, the provision can still have a positive effect. Indeed, Article 14 is only one part of a larger regulatory framework aimed at ensuring a safer and fairer online environment within the overarching EU law goal of ensuring the functioning of the internal market. Footnote 213 Nevertheless, this critique does carry weight when combined with the other challenges described in the previous sections and, crucially, the normative weight of the fundamental rights discourse. In arguing for fundamental rights-based regulation, Suzor already hints at this latter point when noting that it might be in platforms’ “long-term interests to adopt human rights processes that make their decision-making process more legitimate.” Footnote 214 Being able to claim that their content moderation practices align, in whatever way, with fundamental rights standards affords platforms a higher degree of legitimacy in how they govern online speech. The powerful legitimizing effect of the fundamental rights discourse can work to depoliticize content moderation decisions and to deflect attention from structural questions of power and political economy. This resonates with the wider discussions on corporate social responsibility and “bluewashing,” and the legitimacy companies stand to gain by paying lip service to human rights without substantially changing their business practices. Footnote 215
E. Conclusion
This article examines the issue of enforceability of fundamental rights via T&Cs through the prism of Article 14(4) DSA. The DSA is the centerpiece of an expanding and complex puzzle of platform regulation at EU level. A major thread running through the EU’s policy approach is the attempt to rein in the power of Big Tech platforms by imposing on them “enhanced” responsibility for the content they host, thereby reducing the potential harms of their content moderation practices to free expression and a healthy public debate.
Until the DSA, only a few sectoral instruments contained minimal restrictions on platforms’ significant discretion and power in determining their “house rules.” Article 14 DSA fills an important gap as it addresses the wide margin of discretion enjoyed by intermediaries in their contractual relationship with users, which governs the moderation of illegal and harmful content online. The provision is particularly important due to the horizontal nature of the DSA in EU law. Rules on T&Cs in the DSA will govern the contractual relationship between intermediary service providers, including Big Tech platforms, and their users in the EU. Article 14 aims to increase the transparency of T&Cs and bring their enforcement in direct relation to fundamental rights. It does so via two sets of obligations, the first focusing on information and transparency, and the second on application and enforcement. These obligations apply to all intermediaries and cover restrictions they impose in their—human and algorithmic—moderation of content that is illegal or merely incompatible with their T&Cs.
At the heart of our inquiry is the question of whether applying EU fundamental rights to the enforcement of platforms’ T&Cs is a suitable mechanism to rebalance the unequal position between platforms and their users, and to effectively curb the power of Big Tech. After explaining the mechanics of the provision and its systematic place within the DSA, our analysis focused on three crucial issues related to the application and enforcement of fundamental rights through T&Cs under Article 14 DSA. First, we explored whether Article 14(4) leads to the horizontal application of fundamental rights in platforms’ T&Cs, considering ECtHR and CJEU case law, as well as select national judgments. Our analysis concludes that it is possible, in principle, to establish the indirect horizontal effect of human rights in the relationship between online platforms and their users. But for the application and enforcement, and arguably the design, of T&Cs to take due regard of fundamental rights, they must do so within the framework of international and European standards. This is particularly relevant as regards restrictions on content that is lawful but incompatible with platforms’ T&Cs. However, Article 14(4) DSA remains in essence a framework provision. Thus, even relying on ECtHR and leading national case law for guidance on substantive and operational principles, more detail is needed on how to operationalize it in the horizontal relationship between platform and user.
This leads to our second crucial issue, namely whether and how it is possible to operationalize the wording of Article 14 through other, more concrete provisions in the DSA. In this respect, we identify several mechanisms within the DSA that could mandate how platforms should operationalize Article 14(4), and possibly offer users a way to ensure platforms apply this provision to their content moderation. The DSA takes a mixed approach to the operationalization of Article 14(4). On the one hand, it contains a set of provisions that focus on individual procedural safeguards against platforms’ content moderation decisions vis-a-vis their T&Cs (Articles 17, 20, 21, 23 and 53). At this level, the operationalization of Article 14(4) is carried out partly by reinforcing the information and transparency obligations of online platforms, and partly by procedural due diligence obligations to the benefit of users. Especially regarding the latter, there is significant uncertainty that may hamper the effective protection of users. On the other hand, the DSA also attempts to regulate the application and enforcement of fundamental rights through T&Cs in a more holistic fashion for VLOPs, by treating it as a systemic risk subject to mitigation measures in Articles 34 and 35. In our view, these obligations offer an opportunity to assess the wider context and framework of content moderation practices, beyond individual decisions. As such, they are consistent with some of the requirements stemming from international human rights standards, including the application to platforms of procedural principles of corporate social responsibility. They also provide an avenue to have “due regard” for fundamental rights in the “design” of T&Cs. However, the systemic risk regime is also open to criticism due to its vagueness, the broad discretion it offers to authorities, its susceptibility to “audit capture,” and the concerns as to its effective enforcement at the EU and national levels. All these challenges combined may imperil the effective application of this regime, to the detriment of fundamental rights protection.
Finally, we reflected on the limitations of applying fundamental rights in T&Cs to address the unequal contractual relationship between platforms and users, drawing on critiques external to the law to highlight the risks of overreliance on fundamental rights in this context. In our view, three interrelated limitations are noteworthy. First, the potential unsuitability of fundamental rights as a sole solution to collective or structural injustices. Second, and more concretely, the inadequacy of fundamental rights to address structural inequalities whose solution lies outside their “toolbox,” such as the advertisement-based business model driving social media platforms. Third, the consideration that the fundamental rights framework does not truly tackle the background conditions of the power to set and enforce online speech norms. That is to say, despite proportionality and procedural safeguards, it is difficult for the fundamental rights framework to reach and influence in practice the design, application and enforcement of T&Cs and content moderation practices. If these criticisms hold, then the real risk of overreliance on fundamental rights is that of legitimizing platforms that adopt the discourse while failing to uphold it, and of depoliticizing content moderation practices while obfuscating structural questions of power and political economy.
In this light, two lessons can be drawn from our analysis. First, in working to align content moderation practices with fundamental rights, it is important to appreciate the limitations of this framework and take a more holistic view. Efforts to realign platforms’ T&Cs towards public values and a safer, more equal online environment must orient themselves not only towards fundamental rights but also towards competition and consumer law, place these in the wider context of social injustices, and ensure their correct implementation through the human and algorithmic practices of platforms. Footnote 216 Second, the relevant DSA provisions must be properly interpreted and enforced, so as to ensure that the legitimacy platforms gain through fundamental rights compliance exercises is justified. On this point, as we have shown, the devil is in the details. Article 14 is a framework provision. If it operates in isolation, it might amount to little more than a paper tiger. However, our analysis shows that the DSA already contains concrete norms on compliance and enforcement. If these are interpreted in light of international and European standards and properly integrated with national legal systems, they can help to substantiate and operationalize Article 14 DSA towards desirable outcomes, and fulfil its revolutionary potential.
Acknowledgments
The authors would like to thank the participants of the following seminars, workshops and conferences for helpful comments on earlier drafts of this Article: the 2021 Digital Legal Talks conference (hosted by the University of Amsterdam, Tilburg University, Radboud University Nijmegen and Maastricht University); Percolating the DSA Package – Perspectives for a Digital Decade 2021 (Online Workshop hosted by the Weizenbaum Institute and the Internet Policy Review); the IE Private Law Series seminar 2022 (hosted by the IE Law School); and the TILT Seminar Series 2022 (hosted by the Tilburg Institute for Law, Technology, and Society (TILT)).
Competing Interests
The authors declare none.
Funding Statement
João Pedro Quintais’s research in this article is part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).