
Competing Legal Futures – “Commodification Bets” All the Way From Personal Data to AI

Published online by Cambridge University Press:  18 September 2024

Marco Giraudo*
Affiliation:
University of Turin, Turin, Italy
Eduard Fosch-Villaronga
Affiliation:
University of Leiden, Leiden, Netherlands
Gianclaudio Malgieri
Affiliation:
University of Leiden, Leiden, Netherlands
*
Corresponding author: Marco Giraudo; Email: [email protected]

Abstract

This Article explores the implications of Artificial Intelligence (AI)-driven innovation across sectors, highlighting the resulting legal uncertainties. Despite the transformative influence of AI in healthcare, retail, finance, and more, regulatory responses to these developments are often contradictory, contributing to the opacity of the legal underpinnings of AI business. Our Article notes the common trend of commercializing AI tools amidst legal uncertainty, using innovative contractual solutions to secure claims. Over time, these innovations trigger overlooked legal conflicts, sometimes leading to outright bans on AI products due to negative impacts on fundamental rights and democratic governance. The core argument of our Article is that an over-reliance on co-regulatory strategies, such as those proposed by the European AI Act, exacerbates legal instability in emerging technological markets. This panorama creates an ’extended legal present’ in which alternative legal expectations coexist, causing economic and political uncertainty that may lead to legal instability in the future. The concept of ’competing legal futures’ is introduced to illustrate how economic actors must bet on a legal future in the absence of guarantees that this future will materialize. To help analyze this complex narrative, we propose a theoretical framework for understanding legal, technological, and economic dynamics, highlighting anomalies in market exchanges within the co-regulatory model. Despite the focus on European developments, the practical and theoretical implications extend beyond the EU, making the Article relevant to a broader understanding of the legal-economic challenges posed by AI and digital innovation. We conclude by arguing for a course correction, proposing institutional diversification for resilient governance of legal innovation under uncertainty.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of the German Law Journal

A. Introduction

Waves of technological innovations based on AI fuel narratives of unprecedented economic and social progress in every imaginable field: from healthcare to retail, from the stock market to marketing and advertising, from farming to sports.Footnote 1 The novelty of AI tools makes their impact on society unknown or difficult to know, as it is often masked under a veil of extreme short-term efficiency bearing on the stability of their legal foundations.Footnote 2 At the same time, forms of folie normative (or regulatory madness) and the adoption of countless resolutions, ethical guidelines, standards, and laws, often contradictory and unclear, further blur the visibility of the legal robustness of the foundations of AI-powered business. It is therefore unsurprising that AI tools are marketed in a context of legal uncertainty, oftentimes through the use of innovative contractual solutions to secure legal entitlements to perform such new activities and attract further investments in these newly emerging markets.Footnote 3

Over time, such technological waves are often accompanied by legal shockwaves arising from formal and substantive conflicts between claimed legal entitlements to market new products and counterclaims to protect the prevailing legal order.Footnote 4 Many times, the exaggerated expectations of the legal sustainability of innovative solutions that accompany the commercial release of new products are contradicted by widespread recognition of adverse effects on fundamental rights and the democratic order, as well as on other constitutionally protected interests.Footnote 5 As a result, some AI products are already being rejected outright, either through the intervention of law enforcement agencies or through judicial pronouncements retroactively declaring them unlawful. For example, facial recognition technologies and AI-based virtual friendship services have been banned in some jurisdictions by the courts, with such bans likely to be confirmed by legislative action.Footnote 6

In economic terms, many investors may soon face significant losses as a result of the outright prohibition of markets that they had imagined to be “legal” and whose legal foundations eventually “disappear” because they are found to be incompatible with fundamental rights and the democratic public order. Yet, many investors and policymakers seem unconcerned about the potential economic and political consequences of increasing uncertainty about the stability of the legal foundations of AI-based products, as if the legal rules announced by courts, or the enforcement decisions of specialized agencies in the EU or the FTC in the U.S., did not affect the economic value of new activities and services.Footnote 7 Such disregard for the legal dynamics and the increasing inalienability of the entitlements being traded is in striking contrast to what would be expected of investors faced with such profound fluctuations in legal uncertainty within an industry.Footnote 8

All the more so when we consider the lack of learning from similar legal-economic patterns that have been playing out for more than a decade in the context of the first phase of information capitalism, whose legal foundations are currently in a stalemate.Footnote 9 For a long time now, economic actors have been undeterred by faltering legal claims to commodify personal data, the “new currency of the Internet.”Footnote 10 Even there, despite widespread warnings about the legal dangers of “everydayness as a commercialization strategy,” we have witnessed a sustained flow of investment into an industry whose core resource has long been suspected of not qualifying as a “commodity.”Footnote 11 Today, the entire industry is hanging by a thread, riding on what appears to be a cluster of “legal bubbles,” as legal support for the commodification of personal data diminishes and surveillance practices become increasingly costly, if not prohibited. And yet, with no apparent rational explanation, the very same private actors whose behavioral commodification bets are in jeopardy are the ones raising the stakes with AI gambles of an even higher order of magnitude, met with an inadequate response from rule makers and enforcers. If history is any guide, this anomalous economic inaction in the face of legal dynamism portends legal instability ‘on steroids’ for the AI-based industry.Footnote 12

In this Article, we provide a legal-economic canvas of these institutional co-evolutionary dynamics shattering the legal foundations of digital markets, in order to make sense of contested commodification bets all the way from personal data to AI.Footnote 13 From the experience of the early phase of information capitalism and the current flare-ups of legal uncertainty in the face of AI-powered services, we articulate theoretical insights concerning the anomalous coordination between legal, technological, and economic dynamics, as amplified by an overreliance on the co-regulatory strategies proposed by current regulatory actions, for example, the European AI Act proposal, as a solution to the “pacing problem” of when to regulate technology.Footnote 14 In this spirit, we elaborate on the growing literature on the legal fragility of the market for personal data and AI-based services and connect it with reflections on economic anomalies in the ‘market for legal rules’, which are increasingly supported by judicial evidence and litigation.Footnote 15

We argue that the co-regulatory model exacerbates legal instability in emerging markets also because it is subject to moral hazard dynamics, all too favored by an over-reliance on the goodwill of private actors as well as on the enforcement priorities of Member States’ DPAs.Footnote 16 Stalling strategies and opportunistic litigation can flourish within such a model to the advantage of economic agents’ commodification bets, thus prolonging what we call the “extended legal present,” in which a plurality of possible legal futures compete with each other, and which is fraught with economic and political uncertainty.Footnote 17

In particular, different legal futures also map onto complementary economic futures, as they affect the legal existence and the cost structure of markets for these new “exemplary goods.”Footnote 18 During such a period, economic agents have to “bet” on one legal future to ground their business models, with no guarantees of legal success.Footnote 19 We call this phenomenon competing legal futures, which can fuel dangerous legal bubbles if not properly identified and addressed, due to the overlooked instability of legal entitlements at the core of innovative business models.Footnote 20 At the same time, legal dissonances and tensions between innovative legal practices and the prevailing institutional order may lead to geopolitical tensions or internal constitutional crisis.

Against this background, we articulate a common intellectual framework for thinking in the face of the current legal-economic waves of uncertainty affecting AI and other digital innovations, which are likely to be amplified by the political negotiations and compromises leading to the final text of the so-called Artificial Intelligence Act (AIA).Footnote 21 The AIA is poised to be the first regulatory framework addressing the impacts of AI on society, laying down rules for the developers building these systems and rights for their users. This proposed piece of legislation relies heavily on private standardization activities, an approach also known as co-regulation. The presence of co-evolutionary legal, economic, and technological dynamics at the frontier of innovation requires a comparative and interdisciplinary approach, and so we propose a series of “neologisms” as conceptual experiments for the holistic study of complex social objects.Footnote 22 Although the Article mainly refers to European events as a case study, their theoretical and practical implications are by no means limited to the EU legal-economic area, given the well-known “Brussels effect,” nor are the theoretical insights we draw from them.Footnote 23

The Article continues as follows. The first part briefly recounts how the foundational commodification bets on which the digital economy has been built have been partly rejected by the EU judiciary and DPAs, pointing to the poor adaptation of economic agents’ legal practices despite emerging legal fault lines at the core of the European digital markets. It then outlines the sense of déjà vu in the making of the legal foundations for AI, which are currently being shaken by flare-ups of legal uncertainty as the commercial release of AI-based products unfolds. The second part sketches out a theoretical framework of the functioning of the “market for legal rules” to navigate uncertain legal futures. It also emphasizes the presence of anomalies and distortions typically associated with market exchanges at large, as they are exacerbated by the institutional functioning of the co-regulatory model. It concludes by calling for a course correction in the over-reliance on co-regulation and proposes a number of strategies for the resilient governance of legal innovation in the face of legal uncertainty.

B. The Case of Information Capitalism

I. Foundational Commodification Bets

The emergence of personal data as an available means of exchange has been hailed as a breakthrough in economic innovation, enabling a host of new activities, products and services that were previously unimaginable.Footnote 24 Despite the technological availability of personal data, there has long been legal uncertainty about its actual legal qualification as a tradable commodity, as well as little visibility as to the level of legal transaction costs that would eventually arise at critical points in the personal data pipelines.Footnote 25 However, legal categorization and interpretation also define the economic qualities of these resources, such as tradability and fungibility, to the point of determining the very existence of a market for personal data.Footnote 26 This lack of legal clarity and visibility has, in the long run, called into question the economic nature of the new resources as commodities.Footnote 27

In this frontier context, economic agents have dealt with uncertain futures along both the economic and the legal dimensions of their business models. However, the mere possibility of such a market was enough to create considerable incentives and opportunities to invest in exploiting the new resource. Following the industry’s economic interest and legal imagination, economic agents have not been deterred by legal uncertainty concerning the validity of commodification claims over new resources. On the contrary, these entities have been making interrelated investments in the legal and economic aspects of their innovative businesses, emphasizing the use of personal data as a supplementary form of value alongside fiat currency.Footnote 28 Several legal advancements have been introduced at various levels to establish a legal foundation for incorporating “behavioral commodification bets” based on personal data within their business models. The commodification of personal data has long served as a legal channel for “exploiting human experience as a commodity through covert commercial practices involving extraction, prediction, and sales” – a phenomenon known as surveillance capitalism.Footnote 29 Within this context, investments have flowed into broad-based bets on commodifying human behavior, treating personal data as if it were the currency of the internet.

To legally enable these markets of “future behavior,” economic agents have placed multifold legal bets with the prospect of high profits in case of success in securing tradeable entitlements over the new resource.Footnote 30 First, the primary bet was on characterizing personal data as a tradeable commodity within the imagined personal data market. The very existence of such a market for personal data was indeed conditional on such an implicit bet. Second, downstream legal bets relate to the most “efficient” way of balancing commodification claims with the prevailing legal interests and rights over personal data. Economic agents have been brokering contract solutions for the industry to govern legal relations with personal data subjects, competitors, and public authorities.Footnote 31

At the same time, understanding that legal innovation requires judicial support, economic agents have crafted measured narratives and frameworks. Their objective was to gain judicial approval for their ventures in the emerging personal data market. This approach aimed to integrate their activities into legal precedents, establishing a firm legal foundation for their market practices.Footnote 32 They deployed socio-economic narratives and “ethical” theories to ensure that the novel use and application of legal categories and concepts to market new products “is both proper and permissible,” in order to turn their legal imaginaries into reality.Footnote 33 A whole ecosystem supported and encouraged these narratives.Footnote 34 Personal data extracted through behavioral surveillance started being seen as commodities that could be easily and cheaply extracted with few consequences, given that these operations were based on perceived lawless spaces or on favorable legal rules announced in early legal decisions and sustained by early optimistic academic and policy doctrines.Footnote 35 In this respect, national data protection authorities within the EU have also been inconsistent in defining the economic nature of personal data in the digital market. For example, while the Italian consumer protection authority affirmed in its 2018 Facebook case that personal data are commodities, a position later confirmed by the Italian Council of State in 2021, the German antitrust authority held in its well-known 2016 Facebook case that personal data are not commodities and cannot be sold for a price, a position affirmed by the CJEU in 2023 in Meta Platforms Inc. and Others v. Bundeskartellamt.Footnote 36

The industry perceived the fact that the courts did not immediately reject the “commodification consensus” as a green light.Footnote 37 Soon, Big Tech and most technology companies released innovations pointing to the commodification of these previously unexploited resources, investing in them as if their commodification bets could not fail and as if everything were legally permitted.Footnote 38 Over the years, we have witnessed a growing industry built upon these commodification bets, buoyed by initial success in securing some leeway from courts and some specialized agencies—if not temporary support.Footnote 39 The pouring of investments into the multi-billion industry and a policy consensus that personal data are an irreplaceable resource for the future have long fed expectations that the commodification bet was successful. A sort of normalization of surveillance appears to have changed the social norm.Footnote 40

From the industry’s legal viewpoint, priority was often given to economically profiting from personal data, assuming proper procedures and risk assessments were in place. In this context, the tech industry considered the legal costs of using personal data quite low, similar to handling a commonly traded asset like currency.Footnote 41 Similarly, possible fines or compensation for unlawful processing were perceived as the mere cost of doing business.Footnote 42 It is no surprise that personal data have long been used and circulated as freely as a commodity to feed data-driven business models devised “without much concern about social, economic, or legal consequences […] As the saying goes, it was better to ask for forgiveness than permission.”Footnote 43

After an initial period in which the support of the courts and the acceptance of the techno-optimist narratives and ethics promulgated by enthusiastic academics in favor of the commodification of personal data seemed secure, a widening gap has emerged between the legal practices within the economy and the rules upheld in the courts. The ECJ has played a major role in this shift, given that it has tended to favor the protection of fundamental rights and freedoms—rather than economic claims.Footnote 44 The big question about the viability and stability of personal data as the currency of the Internet is closely related to the fate of these systemic commodification bets placed by primary economic actors and elements of governmental bodies. If the bet fails for good, the legal basis of the market will disappear, and liabilities and bankruptcies could eventually follow.

II. Inadaptations as a Denial of the Legal-Economic Drift

Recent developments have complicated the general agreement on data commodification, especially due to various cases being considered or already decided by the Court of Justice of the European Union (CJEU). A major issue arising from these cases is determining the legal grounds for processing behavioral data on social media, which is proving to be a complex legal challenge. Social media platforms could choose among different lawful grounds under the GDPR, for example, consent (Article 6(1)(a)), contractual necessity (Article 6(1)(b)), or legitimate interests (Article 6(1)(f)). Interestingly, all of these potentially lawful bases are now being strongly questioned by the CJEU. For instance, in the Meta v. Bundeskartellamt Case,Footnote 45 the Court declared that social media has a dominant position in the market, and that freedom of consent could be problematic to prove. Accordingly, consent would not be the best option for processing behavioral data. In another pending case, Noyb v. Meta, the European Data Protection Board (EDPB) and the Irish Data Protection Authority declared that it is not lawful and fair to use “contractual necessity” as a lawful basis for processing the personal data of social media users for behavioral advertising purposes.Footnote 46 Similarly, several DPAs have already clarified that even the “legitimate interest” cannot be used as a lawful basis; see, for example, the decision of the Italian Data Protection Authority (DPA) on TikTok, but also the latest decision of the Norwegian DPA to suspend Meta’s behavioral advertising services in Norway.Footnote 47 These considerations are even more meaningful if we consider the recently introduced ban on behavioral advertising based on profiling of special categories of data in Article 26(3) of the Digital Services Act and the parallel—and unpredictable—extension of the notion of sensitive data to indirect inferences, for example, quasi-sensitive data that can only indirectly refer to sensitive characteristics, following the recent 2022 CJEU case OT v. Vyriausioji.Footnote 48

Except for a few significant cases, the industry has largely disregarded fines and legal rulings, seemingly unaffected by their impact on investment value.Footnote 49 Despite the growing likelihood of legal challenges and the potential dismantling of business models reliant on extensive personal data exploitation and surveillance-based governance, the industry has shown little substantial adaptation.Footnote 50 In a sense, these models are teetering on a precipice, waiting for a single decision that could trigger their collapse, as some corporate lawyers have commented to the press.Footnote 51 Whenever the industry successfully weathers collective actions for data protection and significant fines, while also maintaining EU-U.S. data transfers, both the industry and the majority of investors tend to view this as a conclusive victory.

Companies and European governments, including the European Commission, are failing to adapt their perspectives and step up their actions in response to the emerging foundational shifts concerning the commodification and trade of personal data. This lack of adaptation is evident in the ongoing transatlantic legal disputes concerning the legal basis for transferring personal data to the United States. Whenever the CJEU invalidates political agreements between the European Commission and the U.S. Department of Commerce regarding mechanisms for transatlantic data transfers in support of commerce, the European Commission consistently renews agreements that largely fail to address the fundamental issues highlighted in the CJEU rulings.Footnote 52

These temporary legal remedies are treated as if they could reverse the ongoing structural reshuffling of the industry’s legal foundations. The potential “existential threat” to the industry’s data-driven advertising model is becoming more apparent to the general public, with tabloids and newspapers recognizing the legal risks associated with “ads signal loss,” as Meta’s CEO has acknowledged.Footnote 53 At the same time, there is a growing recognition of the existential threat posed to liberal democracies by mass surveillance and the commodification of individual rights and attention.Footnote 54 The predictions of the so-called “Zuckerberg Laws” of a shift in social norms towards a greater willingness to share personal information have been largely disproved by the success of Apple’s App Tracking Transparency (ATT) framework, which allows people to opt out of sharing their data with third parties when given the choice.Footnote 55

The protagonists from inside the industry publicly and privately acknowledge the fact that their business is increasingly dissonant—if not entirely incompatible—with the emerging legal foundations in courts.Footnote 56 Yet, instead of adapting to the legal rules and constitutional constraints being affirmed by top EU courts and by EU or national regulators, which substantially constrain the tradability of personal data both internally and externally, they either double down on their bets on behavioral commodification or resort to even more fragile legal solutions, which are promptly struck down by the court system and some data protection authorities.Footnote 57 In short, they are gearing up for the legal battles ahead.Footnote 58

Moreover, the implicit reintegration of failing regulatory strategies buys time for corporations to entrench surveillance practices within the political-economic texture. The so-called “folie normative” of relentlessly adopting legislative measures bearing unclear and often contradictory notions and rules is convenient for economic agents, at least in the short term.Footnote 59 They can strategically stall enforcement with endless litigation, thus jeopardizing the speed and scope of any possible shift of legal regime. This is particularly evident where certain Data Protection Authorities (DPAs) have fostered close relationships with the technology industry, which can give an edge through initial light-touch enforcement. This is the case, for instance, for major tech companies based in Ireland, where private entities and some public officials seem to have overestimated their discretionary power in shaping enforcement strategies for the GDPR and other EU laws and treaties.Footnote 60 Given the significant presence of US-based tech giants, Irish regulators have developed a mutual understanding with the regulated entities, characterized by an optimistic regulatory approach that prioritizes personal data exploitation over data protection.

This approach favors the perception of the legal viability of commodification claims. However, this excessive reliance on early regulatory autonomy has proven detrimental, as the majority of decisions made by the Irish Data Protection Commission regarding cross-border enforcement of the GDPR have been overturned by the European Data Protection Board (EDPB).Footnote 61 The rapid conclusion we could draw from this debate is that the whole business model on which digital capitalism was built might be largely unlawful and that the “commodification bet” was based mainly on a legal bubble that is about to burst. Nevertheless, most economic agents directly or indirectly involved in the industry seem unfettered by the corrosion of the legal foundations of digital markets.

III. Legal Storms in AI: A Déjà-vu?

There is a kind of déjà vu happening in the creation of the legal foundations of AI-enabled markets when compared to the experience of the commodification dynamics of personal data discussed so far. In the EU in particular, the continued reliance on co-regulation and industry standards to regulate new waves of digital innovation,Footnote 62 with AI products at the forefront of legal innovation, is happening with considerable disregard for the core concerns about the “digital services package” that the EDPB and the EDPS have already pointed out with alarm.Footnote 63 EU lawmakers are scrambling to create rules for the exploitation of personal data at the core of AI systems, without addressing the core, fundamental questions about the rules governing the circulation and exploitation of personal data per se, and their implications for constitutional democracy. They are regulating a downstream ecosystem for AI whose core resource is subject to strong circulation restrictions, if not outright inalienability.Footnote 64

Meanwhile, economic actors are again prototyping legal solutions for “exemplary goods” powered by AI systems that test the traditional boundaries of constitutionally protected rights and interests without prior legal authorization or guarantee of success, even though the legal existence of the markets they imagine for their innovative products is already highly uncertain.Footnote 65 In short, we are once again witnessing entrepreneurs willing to commercialize “never-before-seen” digital products who must overcome uncertainty about the tradability of the entitlements they claim over them due to their unknown legal, social, and political implications, and who are now facing fierce counterclaims from major industrial players in the creative industries, on top of already existing privacy and data protection lawsuits.Footnote 66

The very same private actors whose bets on the commodification of behavior are at risk are the ones who are raising the stakes with AI gambles of an even higher order of magnitude. Unfettered by the potential unraveling of the legal underpinnings of the digital economy that provided the “fuel” to train AI systems, the AI industry is replicating at a higher scale the very same strategies to capture both the narrative and the regulatory process in order to have their commodification bets on AI supported by legislators and, ultimately, the courts.Footnote 67 These private entities are pushing their commodification bets into new areas, further testing the constitutional limits of market exchange, while lawmakers—especially the EU Commission—continue to adhere to co-regulatory models that rely heavily on contractual innovation as the primary form of balancing the short-term benefits of commodifying AI systems against systemic threats to fundamental liberties, such as biometric surveillance in public spaces (see, for example, Article 5 AI Act).Footnote 68

Despite the poor performance revealed by the state of disarray of the legal foundations of digital markets, the EU and other major legal-economic blocs are doubling down on co-regulation to address AI legal issues via the AI Act (see, for example, Article 40). For example, the EU Commission and Council continue to support the ’commodification consensus’, rejecting the EU Parliament’s proposal for a fundamental rights impact assessment and relying instead on standardization bodies,Footnote 69 which, combined with widespread risk-based approaches, may ultimately allow the deployment of authoritarian technologies such as emotion recognition or remote biometric control systems that seriously threaten individual freedoms and dignity.Footnote 70

In particular, the AIA proposal’s structural reliance on a supposedly “clearly” defined risk-based approach to the design of market boundaries for new AI-based solutions feeds the myth that the vagueness of the legal rules it lays down can be supplemented by the rigorous application of “risk management science.”Footnote 71 A science which, when applied to the specific cases of protection of fundamental rights, simply does not exist.Footnote 72 What exist instead are mere risk-focused, subjective, and discretionary forecasting techniques, which might help

[S]ystemize decision making and render what is tacit explicit, but what it cannot in itself do, is provide a plan for what agencies should do. It does not determine how to construct discrete ‘risks’ or suggest how risk creators are to be dealt with in order to increase compliance or the furthering of statutory objectives.Footnote 73

These heuristics alone do not provide any objective, predictable, and enforceable criteria as to which activities are lawful or unlawful, let alone what the legal consequences of the use of the technologies under discussion are. This means that “risk calculations” and formal procedures do not lead to normative, predictable guidance on how AI products can be used and deployed without undermining the protection of fundamental rights.Footnote 74 Such a convention, which exaggerates the ability of risk management techniques to generate a “degree of certainty based on probabilistic logic,” risks becoming ideological or wishful thinking.Footnote 75 For the rhetorical move of labeling what is “uncertain” as “risky” makes it possible to play down the problem of knowledge about the actual impact of new products on constitutionally protected interests and the ensuing questions of democratic public order. The move is also instrumental in justifying a de facto transfer of regulatory power to the EU Commission (for example, Article 7 AIA), to standardization bodies (for example, Article 65(6) AIA), and to an entire private ecosystem, by claiming that the agency problems associated with its exercise are limited by well-defined and verifiable criteria and limits.Footnote 76

Negotiations have been choppy and often on the verge of breaking down.Footnote 77 Once again, the EU Commission has succeeded in pushing an economic-primacy approach, showing that a kind of deference to corporate expertise and techno-epistemic communities, especially when closely linked to corporate interests, is still very influential and conducive to the repetition of regulatory failures.Footnote 78 In doing so, it is ignoring warning signals about the legal sustainability of newly formulated AI regulatory frameworks coming from Data Protection Authorities (DPAs) and the European Data Protection Board (EDPB), civil society, and academia.Footnote 79 However, despite some authoritative statements and positions, the way forward seems to be locked into an old-fashioned regulatory culture hardwired within the EU, in line with previous experiences of experts hired in advisory positions within the EU bureaucracy.Footnote 80 All of this works to the benefit of the moral hazard dynamic leveraged by private actors, who are already strategically exploiting existing open-ended legal materials and delaying public enforcement through the opportunistic use of multiple legal bases, contractual flexibility, and strategic litigation. Unsurprisingly, anomalous patterns are already visible in the context of AI-based solutions, albeit at a much faster rate of development and on a greater scale and with greater impact. The legal underpinnings of some AI-based products, like ChatGPT, are already under such strain that there is reason to suspect that, if not promptly addressed, these anomalies may inflate multiple legal bubbles with even more disruptive effects on the rule of law and institutional stability.Footnote 81

C. Uncertain Legal Futures

The experiences of the commodification of personal data and the current AI legal storms are instructive for the creation of the legal foundations of innovative markets, as they show the role trust plays as a complement to legal and economic “real” uncertainty in shaping expectations about the future.Footnote 82 They also show some important economic consequences of the fact that even if the legal future is unknown, this does not mean that it is “unimaginable,” nor does it prevent economic agents from building entire industries on these imaginaries, however fictional they may be.Footnote 83 The impact of “imagined futures on the dynamics of capitalism” is not limited to competition and credit, but extends to the expected future legal existence of a given market. If properly appreciated, this aspect adds another layer to the multiple effects of imagination on the dynamics of economic systems. Indeed, only if there is a common expected future in which the legal system supports its commodification over time against other competing claims will investment in any new resource be attractive. This second part explores the legal-economic implications of uncertain legal futures to understand the origins of the current state of disarray of the legal foundations of digital markets and to possibly prevent even more disruptive legal shockwaves in AI-based markets.Footnote 84

I. Co-Regulation Models as Markets for Legal Rules

In recent decades, the “collaborative state” has embraced co-regulation as the dominant regulatory model to enable new “markets in legal rules” for newly emerging resources to cope with the legal uncertainty and risks associated with many “imagined markets.” Footnote 85 Seen as a solution to the lack of technical know-how and resources readily available to legislators, such a regulatory model has created the institutional conditions for discovering the legal qualities of emerging activities and products through competitive selection of legal innovations. The idea behind co-regulation has long been to harness the knowledge and ingenuity of private parties through the prototyping of different solutions to frame and exploit the new resources without compromising the prevailing hierarchically superior rights, such as the fundamental rights of users, the rights of competitors, and the democratic order as a whole. On the enforcement side, co-regulation envisages an important role for specialized agencies as they are supposed to have special institutional capacity, whose role in shaping legal expectations will also become important.Footnote 86

Delegating basic decision-making to organized private bodies, as co-regulation does, is not free of charge, nor does it magically guarantee legal certainty.Footnote 87 On the contrary, the functioning of co-regulation, which also involves standardization bodies—composed mainly of private parties—and decision-making processes that lack the formal political deliberative procedures that allow feedback from the community, academia, and NGOs, creates fundamental legitimacy problems without solving the knowledge problem either.Footnote 88 In this sense, the extension of market-based solutions to the selection of suitable legal innovations turns into a discovery process about the legal qualities of competing legal solutions, which may affect both the existence and the profitability of the imagined market built upon these de facto bets on several legal innovations. It is indeed a market with specific characteristics, because the success of legal innovations is not only determined by their acceptance by market participants, but also by the judicial system, which must uphold the solution after having assessed it in the light of fundamental principles and the limits of the constitutional order.Footnote 89

Thus, the role that courts’ trust in legal innovators plays in co-regulatory models is crucial, as it can compensate for the initial impossibility of ascertaining the uncertain qualities of legal solutions.Footnote 90 Lee and See define trust as “the attitude that an agent will help achieve an individual’s goals in a situation of uncertainty and vulnerability”; entrepreneurs therefore need to secure and maintain trust from rule makers and courts in order to maintain agency over legal innovations as new solutions become contested and waves of litigation gain traction.Footnote 91 Trust is neither costless, nor granted, nor stable over time, as legislators and courts can learn about the trustworthiness of those they regulate through their behavior and statements, adjusting their enforcement and interpretive strategies accordingly. Indeed, as has been argued about the market in general, the market for legal rules is a process that extends to discovering “who not to trust” and allows rule enforcers to adjust accordingly.Footnote 92 As a result, despite an early tolerance of contractual innovation and market expansion, courts and rule makers may later restrict the scope of private legal innovation and impose top-down legal solutions to newly emerging conflicts, also because of a loss of trust in the “disruptors.”

II. Competing Legal-Economic Futures

As happens with some economic aspects of products, the legal qualities of innovative business models are “not inherent in the product but inter-subjectively determined” and contingent “on future developments, which are not yet knowable.”Footnote 93 Among the most critical legal qualities to be found in a product is that of being tradeable as a commodity through contract-based exchanges. The legal characterization of new resources as commodities, and the set of available entitlements over them (in other words, property, liability, or inalienability rules), does not follow from mere de facto power over them.Footnote 94

This is particularly true where co-regulation enables the contract-driven expansion of markets which dangerously test the constitutional limits of legal orders, namely fundamental rights and democratic public order. Footnote 95 In such cases, it may be unclear whether the commodification of new products is compatible with the prevailing constitutional boundaries. The closer the commodification bets come to testing the boundary between contract and property, as well as the distinction between legal objects and legal subjects, the more unstable the legal foundations of new industries become.Footnote 96

Entrepreneurs who wish to commodify so-called “exemplary goods” must overcome uncertainty about the tradability of the claims they make to them, and must also persuade regulators and judges that this is appropriate and legitimate in the face of non-commodification claims.Footnote 97 Indeed, commodification claims are valid and robust to the extent that they are supported over time by a “collective intentionality” of regulators in general and courts in particular. Footnote 98 However, the repugnance and perceived illegality of current practices may emerge and reinforce each other in time too, thus placing unexpected ex-post constraints on market expansion.Footnote 99

The use of open-ended and collaborative frameworks—such as the GDPR or the AIA proposal—may favor initial compromises that are easier to reach through framing techniques and general propositions. Yet, these upstream negotiations adopting fuzzy norms and often conflicting principles will inevitably allow for very different downstream practical entitlement structures, ranging from strong fundamental rights protection to pro-commodification solutions leading to lower protection combined with higher profitability. As a result, reliance on upstream standardized solutions as a means of providing a stable legal basis for business in areas characterized by legal uncertainty must be resisted, because these compromises may not stand up to judicial scrutiny when unexpected constitutional issues and incompatibilities arise.Footnote 100

Sometimes these incompatibilities are more obvious, while at other times they are more uncertain. If the new resource at the heart of innovative business models is far removed from fundamental rights, then there is little room for legal uncertainty to flare up. For example, if the new market involves the trading of new types of sneakers, there is little doubt about their tradability per se, nor about their implications for the protection of fundamental rights or public order. Conversely, in the most extreme case of innovative activities involving new resources that are closely intertwined with fundamental rights, the legal foundations may become unstable over time, to the point of calling into question the “legal existence” of innovative markets. In this vein, the emergence of constitutional externalities may trump innovative business models by triggering retroactive shifts in entitlements.Footnote 101 Outbreaks of legal uncertainty may elicit the “disappearance of markets,” thus hampering the ability of businesses to adapt in the short term.

In such a context of volatile legal-economic expectations, the ability of economic agents to anticipate the legal future—in other words, the legal consensus that will eventually prevail within the judiciary and the legal system—is largely limited, yet economically vital.Footnote 102 Economic agents have to decide and act in the light of expectations based on one of the possible legal futures that may come to be established and that will affect the legal qualification of new activities and the balancing of the various interests involved. They have to plan investments in innovative business models and markets whose legal foundations will be assessed and confirmed ex post, by courts’ interpretations in the light of future knowledge of legal implications and consensus dynamics within the judiciary.Footnote 103

Borrowing from the economics literature, we use the term extended legal present to denote the period during which multiple possible legal futures are temporarily co-present and in competition with each other.Footnote 104 During such periods, legal futures coexist and economic agents have to use one of these legal innovations to ground their emerging economic activities, with no guarantee of legal success or of the existence of what we may call the correlative “imagined markets.”Footnote 105 These alternative prospects, each resulting from different legal characterizations, balancings of rights, and political preferences, give rise to a form of pluralism of legal futures that may persist until a consensus is consolidated within the judiciary and no adverse effects on the prevailing legal order materialize.Footnote 106

This is the case with the commodification of personal data and some AI-based products.Footnote 107 In particular, the behavioral commodification bets at the core of surveillance capitalism test both of the above-mentioned boundary conditions of markets: the distinction between contract law and property law, and the divide between a legal subject and a legal object. Especially in the case of personal data, the decision as to whether these commodification claims were compatible with the prevailing order of rights has been left open by the EU legislator, which did not make its view clear, offering instead a spectrum of possible solutions. The industry has been experimenting with commodifying resources closely related to human dignity and liberties, using contractual solutions whose validity and robustness have long been questioned even as they were considered the legal standard of the industry.Footnote 108

III. Anticipating Future Legal Consensus

When alternative legal futures compete, the cost structures and outcomes of transactions already conducted within the imagined markets may be retroactively reshuffled. At the end of the “extended legal present,” when the actual legal future is realized, winners and losers are finally and retrospectively defined. Indeed, legal decisions in both civil and common law countries tend to be “typically treated as merely a recognition of what the law has always been,”Footnote 109 even though such a shift may amount to an ex-post reshuffling of legal entitlements, causing legal bubbles to burst, with ripple effects of unpredictable social and political consequences.

Investing in competing legal futures becomes a matter of anticipating future legal consensus, which may evolve in multiple and largely unpredictable ways. Footnote 110 Economic agents’ inability to foresee the costs and benefits of alternative entitlement structures prevents them from making reliable economic calculations to plan their investments. Nevertheless, the incentives for economic actors to participate in the “race to the market” and to bet on some plausible solutions are such as to justify the gamble and the legal uncertainty. At the end of the race to the market, the winners are rewarded with the legal entitlement to engage in profitable, innovative activities over newly emerging resources and the ability to claim residuals and revenues from their investments, often in quasi-monopoly contexts.Footnote 111 The losers face the possibility of having legal innovations at the core of their business models rejected, with the consequence of having to bear costly liabilities or even bankruptcy if the resources they have been trading in are declared inalienable. As one can imagine, these consequences bring instability at multiple levels, including social—for example, job losses—and ethical—for instance, how responsible these dramatic market behaviors are.

The key to stabilizing legal foundations and selecting surviving legal innovations lies in the courts’ shared expectations of the legal future, which may consolidate over time.Footnote 112 Indeed, courts are prominent in legal communities as the guardians of the legal system, as they combine rules and principles to decide cases that determine winners and losers in the market for legal rules.Footnote 113 As the saying goes, “the legislator speaks, but no one listens,” and it is up to the courts to translate general, often incomplete or contradictory, statutory or regulatory propositions into norms of behavior.Footnote 114 As a result, legal success depends to a large extent on anticipating which legal consensus will prevail at the end of the extended legal present, thus retroactively (de)stabilizing or consolidating the legal foundations of new markets.

Unless we make “heroic cognitive assumptions” about the judges sitting in the courts, the courts’ shared legal expectations as a group are hardly predictable, even for the judges themselves.Footnote 115 Judges also suffer from cognitive and foresight deficits in characterizing and weighing the legal qualities of unknown technological solutions, let alone in guessing the future consensus of their peers.Footnote 116 They are also constrained by time, as judges cannot wait to see how new activities will develop, nor can they wait for the effects of the legal rules they will apply to be “adequately proven.”Footnote 117 As a result, courts and judges themselves are forced to implicitly predict the legal future, trying to anticipate the implications of newly released legal solutions and the likely legal consensus on how to balance and characterize them.Footnote 118

Their predictions and expectations are not made in solitude or in an institutional vacuum.Footnote 119 Their conjectures can be exposed to the narratives provided by primary corporate epistemic agents, who have a strategic advantage over the uncoordinated collective responses of users and public enforcement agencies in supporting their contractually designed commodification solutions, which are ultimately embedded in the technological infrastructure of the new industries.Footnote 120 The legal characterization of new activities and the balance between commodification claims and prevailing rights can fall prey to these imagined legal futures embedded in such general ideas that prevail in society.Footnote 121

In this way, the courts’ filtering function of legal innovation may initially be subject to the fallacy of composition and short-sighted conventions that limit their ability to imagine all the adverse effects and the systemic implications of solutions that seem to work in a single case.Footnote 122 As a result, the legal perception of innovation in this area may initially be biased in favor of new solutions due to a lack of readily available conceptualization of unintended harms, as has happened, for example, in relation to ubiquitous surveillance and invasions of privacy or safety. Footnote 123

However, adherence to these “conventions” fluctuates over time as adverse effects, competing doctrines, and alternative conceptualizations gain traction and persuasiveness. This is why it is hard to predict court decisions, even using AI.Footnote 124 As a result, legal consensus can shift from over-optimism to disillusionment, and rejecting once-dominant legal views can disrupt reliance interests. This explains why over-reliance on early legal entitlement structures can backfire, as they may eventually be dismissed or reshaped due to learning dynamics within the judiciary and enforcement agencies at large. In other words, as long as the claims of commodification are not stably underpinned by the judicial system and confidence in the contractual statements of economic agents is not maintained, these markets will remain only imaginary because their actual legal existence will have to be discovered and confirmed over time. If it is not, economic disruption may appear.

IV. The Hidden Hazards of “Competitive Co-Regulation”

A further element of instability and uncertainty within these “markets for legal rules” is added by the central role of enforcement authorities in co-regulation, as their competing enforcement strategies, playing out in the “extended legal present,” may further blur which legal innovation will ultimately prevail.Footnote 125 Indeed, even the enforcement strategies of these independent authorities can be disqualified by ex-post intervention by the judiciary and other regulatory bodies, with the consequence that those economic actors who rely on these enforcement practices as a proxy for future legal consensus may suffer investment devaluation. Contrary to what has long been argued, co-regulation is not a form of delegated legislation in favor of privileged relationships between private parties’ legal innovations and specialized regulatory and enforcement bodies, that is, in the case of data protection, the leading supervisory authorities’ peculiar interpretation of EU law.Footnote 126 Ultimate control over the constitutional limits of legal innovation remains in the hands of the European Court of Justice and national judiciaries.Footnote 127

“Collaborative governance” legislative frameworks, often referred to as co-regulation, such as the GDPR, have an often-overlooked competitive dimension that provides check-and-balance accountability mechanisms to control and select the outcomes of the interlocution between independent authorities and private entities. Regulatory competition is not limited to the global stage; it is also unfolding within the EU in enforcing EU data protection law. For this reason, we propose to call this regulatory model “competitive co-regulation.”Footnote 128

As far as the collaborative side of the model envisaged in the GDPR and other EU instruments is concerned, private parties are encouraged to develop prototypical rules that propose a balance of rights and interests that suits their business. The relevant DPAs will first assess the legal qualities of these business-driven prototypical rules as a “primary disambiguator” of legal solutions, and may support them rather than reject them in the first place.Footnote 129 In turn, such a collaborative system allows different DPAs to experiment with alternative enforcement strategies in order to find the most effective way to combine the enforcement of the GDPR with the prevailing rights and interests under EU law and those of the economic operators under their supervision. Indeed, some DPAs may prioritize data sharing over data protection and privacy and shape their enforcement agenda accordingly, while others may experiment with the opposite strategy.Footnote 130 Thanks to this initial and exclusive dialogue, the competent enforcement authorities, together with private actors, enjoy a kind of first-mover advantage in shaping the interpretive strategies and legal practices within their jurisdiction, with effects throughout the EU, so that they can direct investments and induce the rise of reliance interests based on these interpretations.Footnote 131 In a sense, the definition of enforcement strategies by different DPAs is also based on expected legal futures, in that it is shaped according to public actors’ expectations of the future legal consensus on the appropriateness and legality of their administrative actions.

Then comes the competitive side of the collaborative model. The ex-ante regulatory autonomy that DPAs enjoy through enforcement is neither unaccountable nor unconstrained, as it will be assessed ex post in the light of the actual results of the chosen strategy by several overseeing EU bodies.Footnote 132 This competitive dimension between different public authorities allows the most desirable enforcement strategy to be selected in the light of the outcomes of other enforcement actions. Of course, in the absence of an already settled interpretation of the legal rules to be enforced, the competition between alternative enforcement strategies will also be decided ex post. Again, the European Court of Justice will referee the competition by ensuring ex-post control of fair and effective compliance with and enforcement of EU law.Footnote 133

The existence of regulatory competition between different DPAs provides accountability mechanisms through which DPA enforcement strategies may be rejected by the EDPB, the European Commission, and ultimately, the European Court of Justice. Competition between DPAs, which may have alternative views on how the EDPB—the collective body of all DPAs—and ultimately the European Court of Justice (ECJ), the guardian of the EU legal order, will interpret EU law, is an important mechanism of accountability. The existence of an ex-post selection of the most appropriate enforcement strategies used by DPAs aims to mitigate the risk that any of them will fall prey to “regulatory capture” at the expense of the protection of EU citizens without being accountable to their peers and to the rule of law.Footnote 134 In other words, such freedom to experiment with data protection enforcement comes with the risk of losing out to better legal solutions or strategies adopted by competing economic actors or DPAs, as endorsed by the EDPB and, ultimately, the ECJ.

Costs and liabilities can arise for private actors and Member States whose enforcement strategies are ultimately rejected. As in the price system and in any competitive institutional arrangement, there are inevitably winners and losers in the competition for legal innovation, revealed by the discovery of the actual legal qualities of the resources traded in imagined markets as well as of the relative merits of competing enforcement strategies. This holds for both private and public actors involved in the race to develop such collaborative legal futures. For private actors who fail to comply with EU law, the GDPR is very clear about the economic costs to be borne in case of failed legal bets or rejected regulatory strategies.Footnote 135

First and foremost, Article 7(1) of the GDPR places on economic actors the burden of proof and, with it, the costs of failing to secure a sound legal basis for their personal data processing activities. It is not the users, the public, or other constituencies, but the entity processing personal data that bears the risk and the cost of failed legal innovations, mistaken anticipations of the legal future, and, ultimately, baseless data processing. If economic agents over-invest in bad legal innovations, they will bear the costs of their wrong anticipation of the legal future as ultimately determined by the courts.

Similarly, public DPAs, which enjoy regulatory authority as lead authorities, are also liable in case of misinterpretation of EU law, if their use of EU legal material is rejected by the EU Court of Justice or by their assembled peers in the EDPB.Footnote 136 As for the liability of Member States that fail to effectively enforce EU law, this may lead to the activation of the infringement procedure under Articles 258 and 260 TFEU. Indeed, the regulatory competition between the DPAs of the different Member States is subject to the control of the European Commission, the EDPB, and, ultimately, the European Court of Justice, to ensure that it does not take place at the expense of the rights of EU citizens under EU law.

However, in practice, this institutional structure has failed to produce credible and deterrent feedback on the quality of the legal innovations being massively deployed within the common market, thus defeating its efficiency promise, let alone deterring risky investments or ensuring legal stability during the selection process. On the contrary, agents appear to have been exploiting and weaponizing the differences, delays, and inconsistencies between the enforcement strategies of some DPAs to their advantage.Footnote 137 They have successfully exploited the well-known “pacing problem” to disseminate convenient legal innovations and capture the market so as to become “too big to be banned.”Footnote 138 Such perceived legal unaccountability has encouraged dangerous innovation strategies aimed at stalling and manipulating law enforcement, thus removing much of the incentive to carefully negotiate the most balanced and least disruptive legal innovations for the prevailing legal order.Footnote 139

Such risk-taking behavior amounts to a form of moral hazard in the market for legal innovation, calling into question the suitability of co-regulation for governing legal innovation around disruptive technologies. This behavior is also visible in the practice of some enforcement agencies and EU institutions, most notably the Irish DPC, whose enforcement decisions have been overturned by the EDPB at a staggering rate.Footnote 140

The neglect of existing accountability mechanisms in the enforcement phase of the market for legal rules has several anomalous consequences, which are also detrimental to the very actors who seek to take advantage of such a regulatory model. On the one hand, it leaves economic agents unprepared for the possibility that legal innovations temporarily upheld by some DPAs will be rejected by the ECJ and the legal system through the competitive selection of inappropriate legal interpretations and enforcement strategies. On the other hand, it may encourage some regulators to use their short-term agency over legal interpretation to attract investment and to water down the limits of commodification through non-enforcement.Footnote 141 Beyond mere regulatory capture explanations, regulatory forbearance appears to have become a silent industrial policy to create local jobs and prevent layoffs, all at the expense of fundamental rights and the rule of law across the EU.Footnote 142 These anomalous patterns may pave the way for institutional conflicts between EU Member States, which may lead to further litigation before the ECJ, adding uncertainty and political tension.

D. Conclusions

This Article has examined anomalies in the creation of the legal foundations of digital markets, tracing what has happened from the early days of the information economy to the current phase of exuberance in the field of AI. The anomalous functioning of the market for legal rules appears to be exacerbated by the problematic inner workings of co-regulation as a regulatory model that relies heavily on contractual innovation and industry initiatives. The overall unintended consequence of competitive co-regulation within the EU is that expected legal futures, often supported by the enforcement strategies of some DPAs and on which some economic actors have based their expectations, may over time be rejected by higher EU bodies. As a consequence, the legal foundations of some business models will inevitably be reshaped, and unexpected costs and liabilities may arise for the public and private actors involved.Footnote 143 Conversely, if legal bailouts are adopted by governments to save techno-social systems from legal bankruptcy, fundamental democratic issues may come to the fore with unpredictable political consequences.Footnote 144

In order to address these competing legal futures in emerging markets for AI, we put forward the following policy considerations to limit the upsurge of legal uncertainty. First, the adoption of co-regulation should be narrowly circumscribed, so that a straightforward rights-based approach clarifies ex-ante, as far as possible, whether these imagined markets legally exist.Footnote 145 From this perspective, in order not to fall prey to the very institutional dynamics at the heart of the current state of disorder in digital markets, ex-ante regulation should be expanded to shorten—if not avoid—the extended legal present in which commodification and anti-commodification claims coexist. Second, where co-regulation is maintained, institutional improvements are urgently needed to strengthen the accountability mechanisms required to prevent moral hazard and regulatory capture.

If economic agents are to be allowed to prototype ways of reconciling the economic interest in exploiting AI with the prevailing legal order, an iterative regulatory feedback process with strong involvement of courts and DPAs must be enabled through substantial public investment.Footnote 146 From this perspective, significant resources need to be allocated to foster the emergence of shared expectations about the legal future and to enforce them consistently, so as to quell the likely strategic attempts of economic actors to buy time and capture the market for legal rules. Similarly, it is crucial to let firms fail when their legal bets are unsuccessful and rejected by the judicial system, even at the cost of their profitability. Freedom to innovate without accountability measures, such as potential bankruptcy in the event of poor-quality legal solutions, would amount to inefficient and illegitimate private lawmaking. Without the credible prospect of possible failure, there is no incentive to compete along the qualitative dimension of legal innovation or to preserve the creative and selective features of competition. Any attempt to hijack the market for legal rules in order to prolong the extended legal present and become “too big to ban,” such that enforcement could not proceed without bringing down entire industries, must be vigorously resisted.

Conversely, firms should be encouraged to diversify their investments across alternative legal solutions in order to strengthen the institutional resilience of their portfolios from the early stages of innovative markets, so that alternative legal-technological solutions are at their disposal if some prove incompatible with the prevailing legal order. The more diversified markets, companies, and their bets on legal innovations are, the better they can adapt to the emerging legal consensus and switch to alternative institutional solutions. In this sense, anticipating long-term legal dynamics, rather than relying on day-to-day legal assessments of court decisions, may prove crucial to avoiding specific investments that lock in inefficient and unlawful business models. It will be essential to anticipate how the courts will view the future existence of these imagined markets in order to avoid unexpected losses, if not bankruptcy. Caution is therefore needed to avoid “putting all one’s eggs in one basket” and betting everything on a single uncertain legal future. The stakes are high, and it will be crucial for all concerned to assess the true legal future correctly while planning for different scenarios.

Today, the case for caution is both urgent and strong with respect to AI-based solutions. As many have already pointed out, the incentive structures resulting from poor legal innovation and intense market competition are already destroying the possibility of brokering balanced solutions that minimize the harm of AI-based tools.Footnote 147 On closer inspection, the re-emergence of the spiral of moral hazard that permeates the creation of the legal foundations of AI markets is not surprising. Indeed, the attempt to “mimic the market” in the process of selecting legal innovations has also brought with it the anomalies typical of such an institutional mechanism, including moral hazard, market failure, and bubble-like phenomena.Footnote 148

The recurring emergence of “conventions” and convenient narratives to overcome the uncertainty at the heart of digital markets, including AI, should be neither surprising nor unique to the legal domain.Footnote 149 The recent past of the international economy offers several examples of “grave myopia” on the part of major epistemic groups and experts, including economists and political scientists, who failed to anticipate disruptive events such as the end of the Soviet Union and the 2008 financial crisis.Footnote 150 This time it may be the turn of the legal community to be so blinded by collective myopia that it fails to anticipate the next major institutional crisis at the heart of surveillance capitalism once the AI hype has died down. Group dynamics, conformism, and conflicts of interest within the legal profession may lead epistemic groups to overestimate their own predictive capacity, often in order to oversell their consultancy services and capture the lucrative market for public and corporate expertise in law-making.Footnote 151

With what now appears to be prophetic foresight, a former European Data Protection Supervisor warned a few years ago of the fragility of the legal architecture designed for trading digital products: “10 years ago the global financial system faced a meltdown because of a lack of accountability for millions of micro decisions in a system that nobody could understand anymore. I am afraid that we are reaching a similar boiling point now with the global digital information ecosystem.”Footnote 152 In legal terms, artificial intelligence closely resembles the exotic structured finance that led to the 2008 financial crisis. Akin to structured financial products, we are witnessing “structured legal products” with multiple legal bets at their core. These innovative AI-based products—such as facial recognition, generative AI, or voice recognition—amount to commodification bets made on top of those concerning personal data, along a continuous spectrum whose extreme pole is the total commodification of human experience.Footnote 153 It should come as no surprise if legal uncertainty continues to escalate following the adoption of the AIA and other regulatory instruments that rely too heavily on co-regulation.

Acknowledgments

The Authors are grateful for the meaningful comments received on a preliminary draft during the workshop at the University of Amsterdam on 13 November 24, especially from Dr. Kristina Irion, Dr. Candida Leone, and Dr. Eldar Haber. They also thank Dr. Giovanni Boggero and Prof. Michele Graziadei for their feedback. Errors are our own. This paper was jointly conceived and written; Marco Giraudo was responsible for Part C.

Competing Interests

The authors declare none.

Funding Statements

Eduard Fosch-Villaronga was partly funded by the Safe and Sound project while writing this Article, a project that received funding from the European Union’s Horizon-ERC program (Grant Agreement No. 101076929). Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Council. Neither the European Union nor the granting authority can be held responsible for them. Dr. Giraudo acknowledges that this paper has been partly funded by Next Generation EU - National Recovery and Resilience Plan (PNRR) - Mission 4, Component 2, Investment 1.4, National Research Centre for Agricultural Technologies - AGRITECH, identification code: CN00000022, CUP: D13C22001330005.

References

1 See e.g., Cindy Gordon, AI Is A Game Changer: PWC AI Predictions Report, Forbes (May 18, 2021), https://www.forbes.com/sites/cindygordon/2021/05/18/ai-is-a-game-changer-pwc-ai-predictions-report/.

2 See e.g., Thomas S. Mullaney, Your Computer Is on Fire (Thomas S. Mullaney et al. eds., 2021).

3 See Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019); Julie Cohen, Between Truth and Power: The Legal Construction of Information Capitalism (2019); Anthony J. Evans, Information, Classification and Contestability: A Cultural Economics Approach to Uber’s Entry Into the Taxi Industry, 36 Rev. Austrian Econ. 1, 1 (2023).

4 See e.g., Sjef van Erp, Fluidity of Ownership and the Tragedy of Hierarchy, 4 Eur. Prop. L. J. 56, 56 (2015) (discussing the tragedy of hierarchy); Eduard Fosch-Villaronga & Michiel A. Heldeweg, “Regulation, I Presume?” Said the Robot. Towards an Iterative Regulatory Process for Robot Governance, 34 Comp. L. & Sec. Rev. 1258, 1258 (2018); Roland Gori, La Folie normative de nos sociétés de contrôle (The normative madness of our control societies), Université de Poitiers (2022), https://uptv.univ-poitiers.fr/program/rencontres-michel-foucault-2022/video/66955/la-folie-normative-de-nos-societes-de-controle-roland-gori/index.html; Alberto Martinetti, Peter Chemweno, Kostas Nizamis & Eduard Fosch-Villaronga, Redefining Safety in Light of Human-Robot Interaction: A Critical Review of Current Standards and Regulations, 3 Frontiers in Chem. Eng’g 1, 1 (2021).

5 See e.g., Julie E. Cohen, Law for the Platform Economy, 51 U.C. Davis L. Rev. 133, 133 (2017); Maurice E. Stucke & Paul Grunes, Introduction: Big Data and Competition Policy, Big Data and Competition Policy, Oxford University Press, 1, 1 (2016) (discussing the impacts of information capitalism on the competitive order of the market economy); Eduard Fosch-Villaronga et al., A legal Sustainability Approach to Align the Order of Rules and Actions in the Context of Digital Innovation, in Technology and Sustainable Development: The Promise and Pitfalls of Techno-Solutionism (Herink Sætra, ed., 2023).

6 See e.g., Natasha Lomas, Replika, a ‘Virtual Friendship’ AI Chatbot, Hit With Data Ban in Italy Over Child Safety, Techcrunch (Feb. 3, 2023), https://techcrunch.com/2023/02/03/replika-italy-data-processing-ban/; Emre Kazim, Osman Güçlütürk, Charles Kerrigan, Elizabeth Lomas, Adriano Koshiyama, Airlie Hilliard & Markus Trengove, Proposed EU AI Act—Presidency Compromise Text: Select Overview and Comment on the Changes to the Proposed Regulation, 3 AI Ethics 381–87 (2023).

7 See Arvind Narayanan & Sayash Kapoor, AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference (2024), passim.

8 See e.g., Hanoch Dagan, Avihay Dorfman, Roy Kreitner & Daniel Markovits, The Law of the Market, 83 L. & Contemp. Probs. I-xviii (2020) (describing role played by law in enabling markets and in securing wealth); Susan Rose-Ackerman, Inalienability and the Theory of Property Rights, 85 Colum. L. Rev. 931, 932 (1985) (describing the economic meaning of inalienability); Nicholas Bloom, Fluctuations in Uncertainty, 28 J. Econ. Perspect. 153, 153 (2014); David M. Trubek, Max Weber on Law and the Rise of Capitalism, 1972 Wis. L. Rev. 720, 720 (1972).

9 See e.g., Morgan Meaker, The Slow Death of Surveillance Capitalism Has Begun, Wired UK (Jan. 5, 2023) https://www.wired.co.uk/article/meta-surveillance-capitalism; Remarks of CFPB Director Rohit Chopra at White House Roundtable on Protecting Americans from Harmful Data Broker Practices, Consumer Financial Protection Bureau (Aug. 16, 2023) https://www.consumerfinance.gov/about-us/newsroom/remarks-of-cfpb-director-rohit-chopra-at-white-house-roundtable-on-protecting-americans-from-harmful-data-broker-practices/.

10 See e.g., Jessica Bruder, What if Web Users Could Sell Their Own Data?, New York Times (Oct. 2, 2012) http://boss.blogs.nytimes.com/2012/10/02/what-if-web-users-could-sell-their-own-data/.

11 See Bart Custers & Gianclaudio Malgieri, Priceless Data: Why the EU Fundamental Right to Data Protection Makes Data Ownership Unsustainable, 45 Comp. L & Sec. Rev. 1, 2 (2022).

12 For a list of current class actions involving newly released AI systems, see AI Litigation Database, George Washington University (accessed June 7, 2024), https://blogs.gwu.edu/law-eti/ai-litigation-database/.

13 To do so we built upon heterodox institutional law and economics approaches. See e.g. Guido Calabresi & A. Douglas Melamed, Property Rules, Liability Rules, and Inalienability: One View of the Cathedral, 85 Harv. L. Rev. 1089 (1972); Warren J. Samuels, Interrelations between Legal and Economic Processes, 14 J.L. & Econ 435 (1971); Coasean Economics Law and Economics and the New Institutional Economics (Steven G. Medema ed., 1998) (discussing some of the general implications of the application of price theory to the study of dynamics of legal rules); Guido Calabresi, The Future of Law and Economics (2016) (discussing the economic role and implications of how lawyers’ see the legal world); Massimiliano Vatiero, The Theory of Transaction in Institutional Economics (2021) (displaying a notable attempt to bring legal, organization and economic theory together by discussing the legal dimension of transactions also underpinning political and competitive aspects); Marco Giraudo, Legal Bubbles, in Encyclopedia of Law and Economics (Alain Marciano & Giovanni Ramello eds., 2nd edition, 2022).

14 See e.g. Gary E. Marchant, Braden R. Allenby, Joseph R. Herkert, The Growing Gap Between Emerging Technologies and the Law, in 7 The International Library of Ethics, Law and Technology (2011 ed.).; Audley Genus & Andy Stirling, Collingridge and the Dilemma of Control: Towards Responsible and Accountable Innovation, 47 Rsch pol’y 61, 61 (2018).

15 See e.g. Custers & Malgieri, supra note 11, at 6; Václav Janeček & Gianclaudio Malgieri, Commerce in Data and the Dynamically Limited Alienability Rule, 21 German. L.J. 924, 940–41 (2020).

16 See Karen Jones, Trust as an Affective Attitude, 107 Ethics 4, 4 (1996).

17 See Uncertain Futures: Imaginaries, Narratives, and Calculation in the Economy (Jens Beckert & Richard Bronk eds., 2019) (arguing economics is much about fictional futures and capitalism is a future oriented institutional system with credit and economic competition at its heart); Warren J. Samuels, The Legal-Economic Nexus, 57 Geo. Wash. L. Rev. 1556, 1556 (1988-1989).

18 See Erwin Dekker & Pavel Kuchař, Exemplary Goods: The Product as Economic Variable, 136 Schmollers Jahrbuch 237, 237 (2016).

19 Katharina Pistor, The Code of Capital: How the Law Creates Wealth and Inequality 186 (2019).

20 See Gianclaudio Malgieri & Frank A. Pasquale, From Transparency to Justification: Toward Ex Ante Accountability for AI, Brook. (2022); Marco Giraudo, On Legal Bubbles: Some Thoughts on Legal Shockwaves at the Core of the Digital Economy, 18 J. Inst. Econ. 587, 591 (2022) (describing the economic consequences of failed systemic “commodification bets,” which can lead to the disappearance of the fundamental economic value perceived to be legally protected by entitlements to new resources by triggering a “substantial destruction of economic value” in a manner similar to speculative bubbles in financial markets).

21 In February 2024, the ambassadors of the 27 countries of the European Union approved the final text of the AIA and soon after, the European Parliament Committee on the Internal Market and Consumer Protection and Committee on Civil Liberties, Justice and Home Affairs, approved the Proposal. See Proposal of the European Parliament and of the Council Laying Down Harmonized Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, COM (2024) final (Feb. 2, 2024).

22 See Guido Calabresi, Neologisms Revisited, 64 Md. L. Rev. 736, 741 (2005); Peter T. Boettke & Alain Marciano, The Past, Present and Future of Virginia Political Economy, 163 Public Choice 53, 53 (2015) (discussing the “save the books” and “save the ideas” agenda of the Virginia School of Political Economy).

23 See e.g., Data Protection Commissioner v. Facebook Ireland Ltd, 134 Harv. L. Rev. 1567 (2020); Joshua P. Meltzer, The Court of Justice of the European Union in Schrems II: The Impact of GDPR on Data Flows and National Security, Brookings Inst. (Aug. 5, 2020) https://www.brookings.edu/research/the-court-of-justice-of-the-european-union-in-schrems-ii-the-impact-of-gdpr-on-data-flows-and-national-security; Anu Bradford, The Brussels Effect, 107 Nw. U. L. Rev. 1, 1 (2012); Woodrow Hartzog & Neil Richards, Privacy’s Constitutional Moment and the Limits of Data Protection, 61 B.C. L. Rev. 1687, 1687 (2020) (discussing alternative strategies in the US to constitutionalize personal data protection); Adam Moore, Towards Informational Privacy Rights, 44 San Diego L. Rev. 809, 809 (2007); Katharina Pistor, Rule by Data: The End of Markets?, 83 L. & Contemp. Probs. 101, 101 (2020).

24 See e.g., Personal Data: The Emergence of a New Asset Class, World Economic Forum (2011) https://www3.weforum.org/docs/WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf; Paul M. Schwartz, Property, Privacy, and Personal Data, 117 Harv. L. Rev. 2056, 2057 (2004); Sarah Spiekermann & Jana Korunovska, Towards a Value Theory for Personal Data, 32 J. of Info. Tech. 62, 62 (2017).

25 See Custers & Malgieri, supra note 11, at 6; Ian C. Ballon, Cyber Boot Camp: Data Security at the Intersection of Law and Business, in E-Commerce and Internet Law: A Legal Treatise with Forms 345 (Ian C. Ballon, 2nd ed. 2016).

26 See e.g., Daniel Cole, ‘Economic Property Rights’ as ‘Nonsense Upon Stilts’: A Comment on Hodgson, 11 J. Inst. Econ. 725, 729 (2015); Pistor, supra note 19.

27 See e.g. James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh & Angela Hung Byers, Big Data: The Next Frontier for Innovation, Competition, and Productivity, McKinsey Global Institute 1, 11 (2011); Anupam Chander, Future-Proofing Law, 51 U.C. Davis L. Rev. 1, 19 (2017) (discussing the idea of the constitutive function of the law for digital activities.)

28 See Nadezhda Purtova, The Illusion of Personal Data as No One’s Property, 7 L., Innovation, & Tech. 83, 83 (2015).

29 Zuboff, supra note 3 at Part I, Chapter 3, 63-97.

30 See e.g., Urbano Reviglio, The Untamed and Discreet Role of Data Brokers in Surveillance Capitalism: A Transnational and Interdisciplinary Overview, 11 Internet Pol’y Rev. 1, 2 (2022) (describing the role of data brokers in trading the predictive promise of “behavioral insights” inferred through profiling).

31 See Harold Demsetz, Toward a Theory of Property Rights, 57 Am. Econ. Rev. 347 (1967); John R. Umbeck, A Theory of Property Rights with Application to the California Gold Rush (1981) (detailing how, in the tradition of Harold Demsetz, John Umbeck argued many decades ago, when analyzing the gold rush in California, that the propertization of publicly available valuable goods that lack a proper default legal allocation, for example, public goods whose owners are not identified by law, like the gold in Californian mines when California was annexed to the United States, depends on the practical capability of some entities to collect those goods and exclude others from their use). See also Nadezhda Purtova, Property Rights in Personal Data: Learning from the American Discourse, 25 Comp. L & Sec. Rev. 507 (2009) (describing how Purtova applied Umbeck’s theory to personal data, suggesting that the ambiguity about the ownership of personal data across the different legal systems would rapidly create the basis for an incredible market power asymmetry to the advantage of the few entities—Big Tech—capable of monetizing personal data).

32 See Alessandro Morelli & Oreste Pollicino, Metaphors, Judicial Frames and Fundamental Rights in Cyberspace, 63 Am. J. Comp. L. 616, 616 (2020).

33 See Pavel Kuchař, Entrepreneurship and Institutional Change, 26 J. of Evolutionary Econs. 349, 350 (2016); Nora McDonald & Andrea Forte, Powerful Privacy Norms in Social Network Discourse, 5 Ass’n for Computing Mach. 421 (2021); Anna Lauren Hoffmann, Terms of Inclusion: Data, Discourse, Violence, 23 New Media & Soc’y 35, 39 (2021); Anna Lauren Hoffmann, Nicholas Proferes & Michael Zimmer, “Making the World More Open and Connected”: Mark Zuckerberg and the Discursive Construction of Facebook and its Users, 20 New Media & Soc’y 199, 199 (2018); Antonio Casilli, Four Theses on Digital Mass Surveillance and the Negotiation of Privacy (8th Annual Privacy Law Scholar Congress, Berkeley Center for Law & Technology 2015).

34 See Michaela Padden, The Transformation of Surveillance in the Digitalisation Discourse of the OECD: A Brief Genealogy, 12 Internet Pol’y Rev. 1, 1 (2023) (illustrating the results of a genealogy of OECD digitalization discourse from the 1970s to the present to show how both harms and benefits of surveillance practices have been problematized.)

35 See Shoshana Zuboff, Big Other: Surveillance Capitalism and the Prospects of an Information Civilization, 30 J. Inf. Tech. 75, 85–86 (2015).

36 See Cons. Stato Sent., decision of the Consiglio di Stato (court of last appeal in administrative matters) Mar. 29, 2021, n. 2631; Gesetz gegen Wettbewerbsbeschränkungen [GWB] [German Competition Act], Jun. 26, 2013, Bundesgesetzblatt (Federal Law Gazette) at 1, 1750, 3245 (aff’d, Case C-252/21 Meta Platforms Inc and others v Bundeskartellamt ECLI:EU:C:2023:537 (Jul. 4, 2023)) [hereinafter Meta Case].

37 See Ryan Calo, Privacy Harm Exceptionalism, 12 Colo Tech. L. J. 361, 362 (2014); Danielle Citron & Daniel J. Solove, Privacy Harms, 102 B.U. L. Rev. 793, 793 (2022); Gerhard Schnyder, Anna Grosman, Kun Fu, Mathias Siems & Ruth V. Aguilera, Legal Perception and Finance: The Case of IPO Firm Value, 33 British J. Mgmt. 88, 93 (2022) (describing “legal signaling” focusing on the economic implications of the perception of the quality of law and thus complements the existing institutional approaches to IPO valuation).

38 Giraudo, supra note 20, at 591.

39 See Carmen Langhanke & Martin Schmidt-Kessel, Consumer Data as Consideration, 4 J. of Eur. Consumer and mkt. L. 218, 218 (2015); Philippe Jougleux, Facebook and the (EU) Law, in 48 Law, Governance and Technology Series (2022).

40 See Woodrow Hartzog, Evan Selinger & Johanna Gunawan, Privacy Nicks: How the Law Normalizes Surveillance, 101 Wash. U. L. Rev. 717 (2023). See also Stefano Dughera & Marco Giraudo, Privacy Rights in Online Interactions and Litigation Dynamics: A Social Custom View, 67 Eur. J. Pol. Econ. 101967 (2021).

41 Janeček & Malgieri, supra note 15, at 940.

42 Ballon, supra note 25, at 418.

43 Manyika et al., supra note 27, at 12–13.

44 Karen Yeung & Lee A. Bygrave, Demystifying the Modernized European Data Protection Regime: Cross-Disciplinary Insights From Legal and Regulatory Governance Scholarship, 16 Regul. and Governance 137, 150 (2022); Hielke Hijmans, The European Union as Guardian of Internet Privacy: The Story of Art. 16 TFEU, in 31 Law, Governance and Technology Series 202 ff (2016).

45 Meta Case, supra note 36.

46 See European Data Protection Board (EDPB) Binding Decision 1/2023 on the dispute submitted by the Irish SA on data transfers by Meta Platforms Ireland Limited for its Facebook service (Art. 65 GDPR) (2023) https://edpb.europa.eu/our-work-tools/our-documents/binding-decision-board-art-65/binding-decision-12023-dispute-submitted_en

47 Datatilsynet [Norwegian Data Protection Authority], Decision n. 21/03530-16 (July 14, 2023) https://www.datatilsynet.no/contentassets/36ad4a92100943439df9a8a3a7015c19/urgent-and-provisional-measures--meta_redacted.pdf (citing Article 58(2)(f) and 66(1) of the General Data Protection Regulation “GDPR,” and ordering Meta not to process personal data for Behavioral Advertising based on Article 6(1)(b) or 6(1)(f) GDPR).

48 See Case C-184/20, OT v. Vyriausioji tarnybinės etikos komisija, ECLI:EU:C:2022:601 (Aug. 1, 2022).

49 Adam Rogers, The Dirty Little Secret That Could Bring Down Big Tech, Business Insider (July 18, 2023) https://www.businessinsider.com/venture-capital-big-tech-antitrust-predatory-pricing-uber-wework-bird-2023-7?mc_cid=9f8f0ac5a5&mc_eid=addda1ba40&r=US&IR=T; Adam Satariano, Meta’s Ad Practices Ruled Illegal Under E.U. Law, New York Times (Jan. 4, 2023) https://www.nytimes.com/2023/01/04/technology/meta-facebook-eu-gdpr.html; Matt Burgess, Meta’s $1.3 Billion Fine Is a Strike Against Surveillance Capitalism, Wired (May 22, 2023) https://www.wired.com/story/meta-gdpr-fine-ireland/.

50 In particular, economic agents have already “visualized” the economic implications and costs associated with legal issues arising from “accidents” with personal data in case of malicious cyber-attacks or data breaches. See e.g., Kate Beioley, Lawyers Take Frontline Role in Business Response to Cyber-Attacks, Financial Times (July 27, 2023) https://www.ft.com/content/2af44ae8-78fc-4393-88c3-0d784a850331. Only recently some Big Tech companies seem willing to consider partially adjusting their personal data harvesting practices to the emerging legal landscape, at least in the EU. See e.g., Sam Schechner, Meta Offers to Seek Consent for Highly Personalized Ads in Europe, Wall Street Journal (Aug. 1, 2023) https://www.wsj.com/articles/meta-offers-to-seek-consent-for-highly-personalized-ads-in-europe-b520cbeb; Jess Weatherbed, TikTok’s Algorithm Will be Optional in Europe, The Verge (Aug. 4, 2023) https://www.theverge.com/2023/8/4/23819878/tiktok-fyp-algorithm-eu-dsa-personalization-data-tracking.

51 See e.g., Dan Milmo, Top UK Court Blocks Legal Action Against Google Over Internet Tracking, Guardian (Nov. 10, 2021) https://www.theguardian.com/law/2021/nov/10/top-uk-court-blocks-legal-action-against-google-over-internet-tracking (discussing Google’s lawyers’ arguments about not “open[ing] the floodgates” of vast claims over companies handling millions of people’s data).

52 See European Commission Adequacy decision for the EU-US Data Privacy Framework (DPF) (July 10, 2023) https://commission.europa.eu/law/law-topic/data-protection/international-dimension-data-protection/eu-us-data-transfers_en. Eur. Parl. Doc. 2023/2501(RSP) (2023) (describing the resolution the EU Parliament adopted on May 11, 2023, against the adoption of the draft EU-US Data Privacy Framework).

53 See Mark Zuckerberg, Meta Platforms, Inc. (META) Third Quarter 2022 Results Conference Call (Oct. 26, 2022) https://s21.q4cdn.com/399680738/files/doc_financials/2022/q3/Meta-Q3-2022-Earnings-Call-Transcript.pdf. In practical terms, the ongoing shift may amount to a massive drying up of the “data pipelines.” Behavioral advertising might be the first area of the industry whose legal foundations may eventually collapse due to unanticipated legal shifts, as the recent EDPB and CJEU decisions seem to confirm. This trend implements what the CJEU already made clear in its 2014 Google Spain decision: the fundamental rights of data subjects to privacy and personal data protection override, as a rule, a controller’s economic interests. See Case C-131/12, Google Spain SL v. Agencia Espanola de Proteccion de Datos (AEPD), ECLI:EU:C:2014:317 ¶ 99. The latest indication of the costs entailed by the ebbing commodification consensus is that META openly considers banning political advertisements out of concern that compliance with the emerging legal rules would be too costly. See Javier Espinoza & Cristina Criddle, Meta Bosses Look at Political Ads Ban in Europe, Financial Times (Mar. 30, 2023) https://www.ft.com/content/1a133b5c-35f9-4776-99fc-7c02095ff2aa.

54 See e.g., Carissa Véliz, The Surveillance Delusion, in Oxford Handbook of Digital Ethics, Oxford Handbooks (Carissa Véliz ed. 2023).

55 See e.g., John Koetsier, Apple’s ATT Burned Facebook Bad. Google’s Privacy Sandbox Is A Kiss In Comparison, Forbes, (Feb. 19, 2022) https://www.forbes.com/sites/johnkoetsier/2022/02/19/apples-att-burned-facebook-bad-googles-privacy-sandbox-is-a-kiss-in-comparison/?sh=25fe3e13382d. See also, Daniel J. Solove, The Myth of the Privacy Paradox 89 Geo. Wash. L. Rev. 1, 18 (2021).

56 See, e.g., Ari E. Waldman, Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power, 5 (2021); Lorenzo Franceschi-Bicchierai, Facebook Doesn’t Know What It Does With Your Data, Or Where It Goes: Leaked Document, Vice (Apr. 26, 2022) https://www.vice.com/en/article/akvmke/facebook-doesnt-know-what-it-does-with-your-data-or-where-it-goes.

57 See Garante per la protezione dei dati personali, provvedimento n. 248 del 7 luglio 2022 TikTok Technology Ltd. (Italian Data Protection authority Decision: Issuance of warning to controller that intended processing is likely to infringe national legislation transposing directive 2002/58/EC and Article 6(1)(f) GDPR)).

58 See e.g., Samuel Stolton, EU Braces for Big Tech’s Legal Backlash Against New Digital Rulebook, Politico.eu (Aug. 10, 2022) https://www.politico.eu/article/eu-brace-legal-assault-against-digital-clampdown/.

59 See Gori, supra note 4.

60 Thomas Conefrey, Edna Keenan, Michael O’Grady & David Staunton, The Role of the ICT Services Sector in the Irish Economy, Quarterly Bulletin, Central Bank of Ireland (Mar. 1, 2023) https://www.centralbank.ie/docs/default-source/publications/quarterly-bulletins/quarterly-bulletin-signed-articles/the-role-of-ict-services-sector-irish-economy.pdf; Stephen Beard, Is Ireland Too Economically Dependent On Big Tech To Regulate It Properly?, Competition Policy International (Feb. 20, 2023) https://www.competitionpolicyinternational.com/is-ireland-too-economically-dependent-on-big-tech-to-regulate-it-properly/.

61 See Irish Council for Civil Liberties, 5 Years: GDPR’s Crisis Point, ICCL Report on EEA Data Protection Authorities (2023) https://www.iccl.ie/wp-content/uploads/2023/05/5-years-GDPR-crisis.pdf (showing that “67% of the Irish Data Protection Commission’s GDPR investigation decisions in EU cases were overruled by majority vote of its European peers at the European Data Protection Board, who demand tougher enforcement action. Only one other country, in one single case, has ever been overruled in this manner.”)

62 See e.g., Jonas Schuett, Risk Management in the Artificial Intelligence Act, Eur. J. of Risk Regul. 1, 11 (2023).

63 Custers & Malgieri, supra note 11, at 2; Statement, European Data Protection Board, Statement on the Digital Services Package and Data Strategy (Nov. 18, 2021) (on file with the author).

64 See Elisabeth M. Renieris, Beyond Data: Reclaiming Human Rights at the Dawn of the Metaverse (2023).

65 Dekker & Kuchař, supra note 18, at 20.

66 George Washington University, supra note 12 (providing an up to date list of current lawsuits).

67 See e.g., Paresh Dave & Jeffrey Dastin, Google Told its Scientists to “Strike a Positive Tone” in AI Research—Documents, Reuters (Dec. 23, 2020), https://www.reuters.com/article/us-alphabet-google-research-focus-idUSKBN28X1CB; Thao Phan, Jake Goldenfein, Declan Kuch & Monique Mann, Economies of Virtue: The Circulation of ‘Ethics’ in AI, Institute of Network Cultures (2022), https://networkcultures.org/blog/publication/economies-of-virtue-the-circulation-of-ethics-in-ai/; Anna Jobin, Marcello Ienca & Effy Vayena, The Global Landscape of AI Ethics Guidelines, 1 Nature Mach. Intel. 389, 398 (2019); Daniela Tafani, Do AI Systems Have Politics? Predictive Optimisation as a Move Away From the Rule of Law, Liberalism and Democracy, postprint, forthcoming, in Ethics & Politics (2024), https://doi.org/10.5281/zenodo.10229060.

68 See e.g., Douwe Korff, Police real-time remote biometric ID in the AI Act, Data Protection and Digital Competition (Feb. 1, 2024), https://www.ianbrown.tech/2024/02/01/police-real-time-remote-biometric-id-in-the-ai-act/; Michael Veale & Frederik Zuiderveen Borgesius, Demystifying the Draft EU Artificial Intelligence Act—Analysing the Good, the Bad, and the Unclear Elements of the Proposed Approach, 22 Comput. L. Rev. Int’l. 97, 101 (2021).

69 See e.g., Brussels Privacy Hub, More Than 150 University Professors From All Over Europe and Beyond Are Calling on the European Institutions to Include a Fundamental Rights Impact Assessment in the Future Regulation on Artificial Intelligence, Brussels Privacy Hub (Sept. 12, 2023), https://brusselsprivacyhub.com/2023/09/12/brussels-privacy-hub-and-other-academic-institutions-ask-to-approve-a-fundamental-rights-impact-assessment-in-the-eu-artificial-intelligence-act/.

70 See e.g., Giuseppe Mobilio, Your Face is Not New to Me – Regulating the Surveillance Power of Facial Recognition Technologies, 12 Internet Pol’y Rev. 1, 23–24 (2023); Paul Nemitz, Constitutional Democracy and Technology in the Age of Artificial Intelligence, 376 Phil. Transactions of the Royal Soc’y A: Mathematical, Physical, & Eng’g Sciences (2018).

71 Commission Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonized Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, COM (2021) 206 final (Apr. 21, 2021).

72 See Julia Black, Paradoxes and Failures: ’New Governance’ Techniques and the Financial Crisis, 75 Mod. L. Rev. 1037 (2012); Tobias Mahler, Between Risk Management and Proportionality: The Risk-Based Approach in the EU’s Artificial Intelligence Act Proposal, Nordic Yearbook of Law and Informatics 247, 247 (2022); Yeung & Bygrave, supra note 44.

73 Julia Black & Robert Baldwin, When Risk-Based Regulation Aims Low: Approaches and Challenges, 6 Regul. and Governance 31, 144 (2012).

74 Id.

Risk-based regulation seeks to calculate the risks attached to certain behaviors, structures or states of the world so that resources can be allocated accordingly. Although it is sold as a rationalistic and technocratic solution to a host of complex technical, social and political problems, in practice it is no such thing.

75 See Pietro Dunn & Giovanni De Gregorio, The European Risk-Based Approaches: Connecting Constitutional Dots in the Digital Age, 59 Common Mkt. L. Rev. 473, 473 (2022).

76 See Eduard Fosch-Villaronga & Angelo Golia, Robots, Standards and the Law. Rivalries Between Private Standards and Public Policymaking for Robot Governance, 35 Comp. Law & Sec. Rev. 129, 129 (2019); Fabrizio Esposito, The Consumer Welfare Hypothesis in Law and Economics: Towards a Synthesis for the 21st Century 55 (2022) (discussing the policy issues concerning the agency problems in EU consumer law).

77 Jedidiah Bracy, AI Act Negotiations Face Tight Timeline Ahead of EU Elections, Int’l Ass’n Privacy Profs. (Nov 6, 2023) https://iapp.org/news/a/eu-ai-act-negotiations-face-tight-timeline-ahead-of-eu-elections/.

78 See Jonas Schuett, Risk Management in the Artificial Intelligence Act, 14 Eur. J. Risk Reg. 1—19 (2023) (discussing the essential role of private standards envisaged in the AIA).

79 See, e.g., Johann Laux, Sandra Wachter & Brent Mittelstadt, Trustworthy Artificial Intelligence and the European Union AI Act: On the Conflation of Trustworthiness and Acceptability of Risk, 18 Reg. & Governance (2023); Veale et al., supra note 68; Irish Council for Civil Liberties, supra note 61; Carlos Calleja, Hadassah Drukarch & Eduard Fosch-Villaronga, Harnessing Robot Experimentation to Optimize the Regulatory Framing of Emerging Robot Technologies, 4 Data & Pol’y (2022); Fosch-Villaronga & Heldeweg, supra note 4.

80 See, e.g., Laura Cram, Policy-making in the European Union: Conceptual Lenses and the Integration Process (1997); Altug Yalcintas, Intellectual Path Dependence in Economics: Why Economists Do Not Reject Refuted Theories (2016) (discussing how economic and policy ideas survive in the market of economic ideas even when they are falsified or invalidated by criticism and an abundance of counterevidence).

81 See Eduard Fosch-Villaronga, Hadassah Drukarch, & Marco Giraudo, A Legal Sustainability Approach to Align the Order of Rules and Actions in the Context of Digital Innovation, in Technology and Sustainable Development. The Promise and Pitfalls of Techno-Solutionism (Herink Sætra, ed., 2023); Ioannis Kampourakis, Sanne Taekema, & Alessandra Arcuri, Reappropriating the Rule of Law: Between Constituting and Limiting Private Power, 14 Juris. 76 (2022).

82 Paul Davidson, Is Probability Theory Relevant for Uncertainty? A Post Keynesian Perspective, 5 J. Econ. Persp. 129 (1991) (discussing the notion of “true” uncertainty from a post Keynesian viewpoint); Peter J. Boettke & Rosolino A. Candela, The Common Sense of Economics and Divergent Approaches in Economic Thought: A View from Risk, Uncertainty, and Profit, 17 J. Inst. Econ. 1033 (2021) (displaying an Austrian economics perspective on Knightian uncertainty).

83 Beckert & Bronk, supra note 17. See also Ludwig M. Lachmann, From Mises to Shackle: An Essay on Austrian Economics and the Kaleidic Society, in Expectations and the Meaning of Institutions 230 (Don Lavoie, 2nd ed. 2004).

84 See Marco Giraudo, Incertezza Giuridica ed Instabilità Economica (Legal Uncertainty and Economic Instability), 1 Sistemi Intelligenti 179 (2023) (describing the difference between uncertainty and risk).

85 See Jack Stilgoe, Who’s Driving Innovation? New Technologies and the Collaborative State (Springer 2020) (describing co-regulation–or hybrid governance–as a broad category of regulatory models across the globe, which most notably includes the co-regulatory model adopted in the EU: a “mechanism whereby a Community legislative act entrusts the attainment of the objectives defined by the legislative authority to parties which are recognized in the field (such as economic operators, the social partners, non-governmental organizations, or associations). This mechanism may be used on the basis of criteria defined in the legislative act so as to enable the legislation to be adapted to the problems and sectors concerned, to reduce the legislative burden by concentrating on ‘essential’ aspects and to draw on the experience of the parties concerned.”). See European Economic and Social Committee, European Self- and Co-Regulation, Single Market Observatory (2023) https://www.eesc.europa.eu/sites/default/files/resources/docs/auto_coregulation_en--2.pdf; Dennis D. Hirsch, The Law and Policy of Online Privacy: Regulation, Self-Regulation, or Co-Regulation, 34 Seattle U. L. Rev. 439 (2010) (describing how, in the EU, policy programs like better regulation and smart regulation—Better Regulation Guidelines, 2021—rely on hybrid governance methods that combine public and private actors).

86 See discussion infra Section B.IV. See also, Giulia Gentile and Orla Lynskey, Deficient by Design? The Transnational Enforcement of the GDPR, 71 Int’l & Comp. L. Q. 799 (2022).

87 Patricia Popelier, Governance and Better Regulation: Dealing with the Legitimacy Paradox, 17 Eur. Pub. L. 555 (2011); Fabrizio Cafaggi, New Foundations of Transnational Private Regulation, 38 J. L. and Soc’y 20 (2011).

88 Irene Kamara, Co-Regulation in EU Personal Data Protection: The Case of Technical Standards and the Privacy by Design Standardization ‘Mandate’, 8 Eur. J. L. & Tech 1 (2017); Hadassah Drukarch, Carlos Calleja, & Eduard Fosch-Villaronga, An Iterative Regulatory Process for Robot Governance, 5 Data & Pol’y 8 (2023).

89 Vatiero, supra note 13, at 9 (describing the fundamental role of State authority in providing legal foundations to any contract-based exchanges and property rights protection).

90 See Karen Jones, Trust as an Affective Attitude, 107 Ethics 4 (1996); Martin Krygier, The Rule of Law: Pasts, Presents, and Two Possible Futures, 12 Ann. Rev. L. & Soc. Sci. 199 (2016).

91 John D. Lee & Katrina A. See, Trust in Automation: Designing for Appropriate Reliance, 46 Hum. Factors 50, 54 (2004). See also Joseph William Singer, No Freedom without Regulation: The Hidden Lesson of the Subprime Crisis 145 (2015) (“efficient and well-working market system depends on trust, both among market actors and contracting parties and between market actors and government regulators, including judges.”).

92 Ginny Seung Choi & Virgil Henry Storr, The Market as A Process for the Discovery of Whom not to Trust, 18 J. Inst. Econ. 467 (2022) (arguing that the market is also a discovery process through which market participants acquire knowledge about trustworthy and untrustworthy individuals); Stavros Makris, EU Competition Law as Responsive Law, 23 Camb. Y.B. Eur. Legal Stud. 228 (2021) (describing the responsive open-ended nature of law in general and EU law in particular). See also Rosalind Dixon, The New Responsive Constitutionalism, 86 Mod. Law Rev. 1—34 (2023).

93 Jens Beckert, Markets from Meaning: Quality Uncertainty and the Intersubjective Construction of Value, 44 Camb. J. Econ. 285, 286 (2020); Lord Goff, Judge, Jurist and Legislature, 2 Denning L. J. 79, 80 (1978) (“seen in the perspective of time all statements of the law [. . .] are, quite simply, temporary approximations which some people in their wisdom have found to be convincing at certain points of time”).

94 See Cohen, supra note 5; Calabresi & Melamed, supra note 13.

95 See Van Erp, supra note 4; Bram Akkermans, The Numerus Clausus of Property Rights, in Comparative Property Law: Global Perspectives (M. Graziadei & L. Smith eds., 2016); Naomi R. Lamoreaux, The Mystery of Property Rights: A U.S. Perspective, 71 J. Econ. Hist. 275 (2011).

96 Marie Daou & Alain Marciano, Commodification: The Traditional Pro-Market Arguments, in The Routledge Handbook of Commodification (Daou and Marciano eds., 2024).

97 See Pavel Kuchař, Entrepreneurship and Institutional Change, 26 J. Evol. Econ. 349 (2016); Dekker & Kuchař, supra note 18; Guido Calabresi, Ideals, Beliefs, Attitudes, and the Law (1985).

98 See, e.g., Simon Deakin, David Gindis, Geoffrey M. Hodgson, Kainan Huang, & Katharina Pistor, Legal Institutionalism: Capitalism and the Constitutive Role Of Law, 45 J. Comp. Econ. 188 (2017); Thomas W. Merrill & Henry E. Smith, What Happened to Property in Law and Economics?, 111 Yale L. J. 357 (2001); Geoffrey Hodgson, Conceptualizing Capitalism: Institutions, Evolution, Future (2015).

99 See, e.g., Michael J. Sandel, What Money Can’t Buy: The Moral Limits of Markets (2012); Margaret J. Radin, Market-Inalienability, 100 Harv. L. Rev. 1849 (1987); Alvin E. Roth, Repugnance as a Constraint on Markets, 21 J. Econ. Persp. 37 (2007).

100 Privacy and data protection are the most obvious rights that are violated. However, their systematic infringement leads to a condition of domination over the autonomy of individuals, which jeopardizes their dignity and their ability to freely exercise most of their political and economic freedoms, including freedom of contract. See e.g. Marianna Capasso, Manipulation as Digital Invasion, in The Philosophy of Online Manipulation (Fleur Jongepier & Michael Klenk eds., 1st ed. 2022).

101 See Fosch-Villaronga et al., supra note 4; Marco Giraudo, Riflessioni sul ripristino del dibattito pubblico. Fare i conti con le “esternalità costituzionali” (Reflections on Restoring Public Debate: Coming to Terms with “Constitutional Externalities”), Media Laws/Rivista di Diritto dei Media 108 (2021).

102 See generally Elizabeth Pollman & Jordan M. Barry, Regulatory Entrepreneurship, 90 S. Cal. L. Rev. 383 (2017).

103 See Jason S. Johnston, Uncertainty Chaos and the Torts Process: An Economic Analysis of Legal Form, 76 Cornell L. Rev. 341 (1991) (arguing that unavoidable legal uncertainty stems from the fact that “judges and juries seek, ex post, to answer questions about ex ante behavior”).

104 Peter J. Boettke, Christopher J. Coyne & Peter T. Leeson, Earw(H)Ig: I Can’t Hear You Because Your Ideas Are Old, 38 Cambridge J. Econ. 538 (2014) (describing the notion of “extended present” in economics); Kenneth Boulding, After Samuelson, Who Needs Adam Smith?, 3 Hist. Pol. Econ. 225 (1971) (describing the notion of “extended present” in economics).

105 See Pistor, supra note 19 (detailing the enabling function of lawyers in capitalist market expansion).

106 See Cyril Hedoin, Institutions, Rule-Following and Conditional Reasoning, 15 J. Inst. Econ. 1 (2019) (showing that the nature, the stability, and the dynamics of any institutional process depend on how people reason about states of affairs that do not occur).

107 See, e.g., Paul Nemitz, Constitutional Democracy and Technology in the Age of Artificial Intelligence, 376 Phil. Transactions of the Royal Soc’y A: Mathematical, Physical, & Eng’g Sciences (2018).

108 See, e.g., Michael Veale, Midas Nouwens & Cristiana Santos, Impossible Asks: Can the Transparency and Consent Framework Ever Authorize Real-Time Bidding After the Belgian DPA Decision?, 2022 Tech. & Reg. 12 (2022) (discussing the implications of the decision of the Belgian Data Protection Authority concerning IAB Europe and its Transparency and Consent Framework (TCF), a system designed to facilitate compliance of real-time bidding (RTB)—a widespread online advertising approach—with the GDPR. They argue that, absent a fundamental change to RTB, IAB Europe will be unable to adapt the TCF to bring RTB into compliance with the decision. A decision concerning the TCF framework is now pending before the European Court of Justice: Case C-604/22, IAB Europe); Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1880 (2013); Helen Nissenbaum, Symposium, Privacy as Contextual Integrity, 79 Wash. L. Rev. 119 (2004).

109 Guido Calabresi, Reflections of a Tort Teacher on the Bench, 11 J. Tort L. 161, 164 (2018). See also Huiyi Chen, Balancing Implied Fundamental Rights and Reliance Interests: A Framework for Limiting the Retroactive Effects of Obergefell in Property Cases, 83 U. Chi. L. Rev. 1417 (2016); Giraudo, supra note 20, at 595.

110 See Cohen, supra note 5, at 203–204 (discussing the multifold ideas of “future-proof” law). See also Chander, supra note 27.

111 See Pollman & Barry, supra note 102.

112 See Yeung & Bygrave, supra note 44, at 149 (stating “they inevitably shape the direction and focus of jurisprudence as cases are brought before them; hence some norms receive greater judicial attention while others are marginalized or overlooked”).

113 See John Bell, Judiciaries within Europe: a Comparative Review (2006); Rodolfo Sacco, Mute Law, 43 Am. J Comp. L. 455 (1995); Rodolfo Sacco, Legal Formants: A Dynamic Approach to Comparative Law (Installment I of II), 39 Am. J. Comp. L. 1 (1991).

114 Christoph Engel, Alon Klement & Keren Weinshall, Diffusion of Legal Innovations: The Case of Israeli Class Actions, 15 J. Empirical Legal Stud. 708 (2018).

115 See, e.g., Goff, supra note 93. See also Owen M. Fiss, Objectivity and Interpretation, 34 Stan. L. Rev. 739 (1982).

116 Jonathan R. Macey, The Internal and External Costs and Benefits of Stare Decisis, 65 Chi.-Kent L. Rev. 93 (1989).

117 Guido Calabresi, The Costs of Accidents: A Legal and Economic Analysis 3 (1970). See also Macey, supra note 116; Koen Lenaerts & José A. Gutiérrez-Fons, To Say What the Law of the EU Is: Methods of Interpretation and the European Court of Justice, 20 Colum. J. Eur. L. 3 (2014) (providing an example on the creative role of the ECJ in interpreting EU law).

118 See Arthur Corbin, What is a Legal Relation?, 5 Ill. L. Q. 50, 51 (1921) (“The lawyer studies history and predicts the future.”); Calabresi, supra note 117, at 3; Macey, supra note 116.

119 See e.g., Jake Goldenfein, Deirdre K. Mulligan, Helen Nissenbaum & Wendy Ju, Through the Handoff Lens: Competing Visions of Autonomous Futures, 35 U.C. Berkeley Tech. L. J. 835 (2020) (describing the role of competing visions of technological development in the domain of autonomous vehicles).

120 See Andrea Saltelli, Dorothy J. Dankel, Monica Di Fiore, Nina Holland & Martin Pigeon, Science, The Endless Frontier of Regulatory Capture, 135 Futures 1 (2022); Meredith Whittaker, The Steep Cost of Capture, 28 Interactions 50 (2021). See also T. J. Jackson Lears, The Concept of Cultural Hegemony: Problems and Possibilities, 90 Am. Hist. Rev. 567 (1985) (critically discussing Antonio Gramsci’s classical concept of cultural hegemony).

121 See Hoffmann, supra note 33.

122 Maurice A. Finocchiaro, The Fallacy of Composition: Guiding Concepts, Historical Cases, and Research Problems, 13 J. Applied Logic 24 (2015); Benjamin J. Cohen, A Grave Case of Myopia, 35 Int’l Interactions 436 (2009) (discussing the role of “conventions” to cope with uncertainty).

123 See Ignacio N. Cofone & Adriana Z. Robertson, Privacy Harms, 69 U.C. Hastings L.J. 1039 (2018); Citron & Solove, supra note 37; Martinetti et al., supra note 4.

124 Masha Medvedeva, Martijn Wieling, & Michel Vols, Rethinking the Field of Automatic Prediction of Court Decisions, 31 A.I. & L. 195 (2023).

125 See Gentile & Lynskey, supra note 86.

126 Jody Freeman, The Private Role in the Public Governance, 75 N.Y.U. L. Rev. 543 (2000); Kenneth A. Bamberger, Regulation as Delegation: Private Firms, Decision making, and Accountability in the Administrative State, 56 Duke L. J. 377 (2006); Margot E. Kaminski & Gianclaudio Malgieri, Algorithmic Impact Assessments Under the GDPR: Producing Multi-Layered Explanations, 11 Int’l Data Priv. L. 125 (2021).

127 See e.g., Arjen Boin & Susanne K. Schmidt, The European Court of Justice: Guardian of European Integration, in Guardians of Public Value 135 (Arjen Boin et al. eds., 2021).

128 Kostina Prifti & Eduard Fosch-Villaronga, Towards Experimental Standardization for AI Governance in the EU, 52 Comput. L. & Sec. Rev. (2024).

129 See Yeung & Bygrave, supra note 44, at 146; Carlos Calleja et al., supra note 79 (providing examples of other cases where co-regulation is concerned, such as the ISO 13482:2014 on safety requirements for personal care robots, where there were no ‘disambiguators’ and the industry managed to get that standard harmonized, that is, recognized as showing compliance with the Machinery Directive).

130 Bart Custers, Linda Louis, Maria Spinelli & Kalliopi Terzidou, Quis Custodiet Ipsos Custodes? Data Protection in the Judiciary in EU and EEA Member States, 12 Int. Data Priv. L. 93 (2022); Thomas Streinz, The Evolution of European Data Law, in The Evolution of EU Law 902 (Paul Craig & Gráinne de Búrca eds., 2021) (describing the idea of “Hawks and Doves” in GDPR enforcement).

131 See Chris Brummer, Yesha Yadav & David T. Zaring, Regulation by Enforcement, S. Cal. L. Rev. (forthcoming 2023), https://ssrn.com/abstract=4405036.

132 See Joel S. Hellman, Geraint Jones & Daniel Kaufmann, Seize the State, Seize the Day: State Capture and Influence in Transition Economies, 31 J. of Compar. Econ. 751 (2003); Peter T. Leeson & Peter J. Boettke, Two-Tiered Entrepreneurship and Economic Development, 29 Int’l R. of L. & Econ. 252 (2009) (discussing the role that productive entrepreneurs play in the creation of unproductive legally protected rents).

133 It is not a coincidence that the field of foresight science is now becoming increasingly important in the legal sector, even among public regulators. Interestingly, even the EDPS has adopted a foresight approach to the regulation of new technologies. In 2022, the EDPS launched TechSonar, a project through which it adopts “anticipatory compliance” for a list of potentially problematic future technological business solutions. The UK data protection authority had a similar initiative. See European Data Protection Supervisor, TechSonar, https://edps.europa.eu/press-publications/publications/techsonar_en (last visited June 7, 2024).

134 See generally Ernesto Dal Bó, Regulatory Capture: A Review, 22 Oxford Rev. Econ. Pol’y 203 (2006).

135 In other pieces of EU legislation covering other areas of innovation, this is far less clear, because many private regulations (standards) are developed without adequate oversight and can therefore be enacted containing problematic provisions. As a result, these provisions may allow unsafe products to enter the market and escape “penalties for noncompliance” because the product may well have complied with a standard that was never adequately supervised. See Eduard Fosch-Villaronga, ISO 13482:2014 and Its Confusing Categories: Building a Bridge Between Law and Robotics, in New Trends in Med. and Serv. Robots (P. Wenger et al. eds., 2016).

136 See Yeung & Bygrave, supra note 44.

137 See Gentile & Lynskey, supra note 86.

138 Joanna Mazur & Marcin Serafin, Stalling the State: How Digital Platforms Contribute to and Profit from Delays in the Enforcement and Adoption of Regulations, 56 Compar. Pol. Stud. 101 (2022).

139 Id.

140 See, e.g., Irish Council for Civil Liberties, supra note 61; Irish DPC “Handles” 99.93% of GDPR Complaints, Without Decision, Noyb (Apr. 28, 2021), https://noyb.eu/en/irish-dpc-handles-9993-gdpr-complaints-without-decision.

141 See Matías Dewey & Donato Di Carlo, Governing Through Non-Enforcement: Regulatory Forbearance as Industrial Policy in Advanced Economies, 16 Regul. & Governance 930 (2021).

142 See Conefrey et al., supra note 60.

143 As noted above, the ability of economic agents to prototype legal solutions in ways profitable to them, even when relying on harmonized standards endorsed by some data protection authorities, is not without possible liability. See Fosch-Villaronga & Golia, supra note 76; Kamara, supra note 88.

144 See Prifti & Fosch-Villaronga, supra note 128.

145 See Malgieri & Pasquale, supra note 20.

146 See Calleja, et al., supra note 79.

147 See Dan Milmo, AI-Focused Tech Firms Locked in ‘Race to the Bottom’, Warns MIT Professor, The Guardian (Sept. 21, 2023), https://www.theguardian.com/technology/2023/sep/21/ai-focused-tech-firms-locked-race-bottom-warns-mit-professor-max-tegmark.

148 See Calabresi, supra note 13.

149 See Mario Cedrini & Magda Fontana, Just Another Niche in the Wall? How Specialization Is Changing the Face of Mainstream Economics, 42 Cambridge J. Econ. 427 (2018); Dan Simon & Nicholas Scurich, Judicial Overstating, 88 Chi.-Kent L. Rev. 411 (2013).

150 See e.g., Stephen C. Nelson & Peter J. Katzenstein, Uncertainty, Risk, and the Financial Crisis of 2008, 68 Int’l Org. 361, 362 (2014) (discussing how economist did not anticipate the financial crisis of 2008). See also Cohen, supra note 122 (discussing how political scientist failed to anticipate the collapse of the Soviet Union); Timur Kuran, Sparks and Prairie Fires: A Theory of Unanticipated Political Revolution, 61 Pub. Choice 41 (1989) (discussing how political scientist failed to anticipate the collapse of the Soviet Union).

151 See, e.g., Ralf Michaels, Comparative Law by Numbers? Legal Origins Thesis, Doing Business Reports, and the Silence of Traditional Comparative Law, 57 Am. J. Compar. L. 765 (2009) (discussing the competition between economists and lawyers within the consultancy market for “law reform”).

152 Giovanni Buttarelli, European Data Protection Supervisor, Speech to LIBE on Annual Report (March 18, 2018) (transcript available at the EDPS Europa website).

153 See, e.g., Steffen Krüger & Christopher Wilson, The Problem with Trust: On the Discursive Commodification of Trust in AI, 38 AI & Soc’y 1753 (2023); Vian Bakir, Alexander Laffer, & Andrew McStay, Blurring the Moral Limits of Data Markets: Biometrics, Emotion and Data Dividends, 39 AI & Soc’y (2023).