
Part I - Networks

Published online by Cambridge University Press:  24 August 2020

Kevin Werbach
Affiliation: University of Pennsylvania Wharton School of Business

Type: Chapter
Information: After the Digital Tornado: Networks, Algorithms, Humanity, pp. 33–34
Publisher: Cambridge University Press
Print publication year: 2020
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 (https://creativecommons.org/cclicenses/).

1 The Regulated End of Internet Law, and the Return to Computer and Information Law?

Christopher T. Marsden Footnote *

This chapter is both a retrospective on, and even a requiem for, the ‘unregulation’ argument in Internet law of the past twenty-five years, and a prospective on the next twenty-five years of computer (or cyber) law,Footnote 1 in which many of the expert treatises of the 1990s need to be dusted down and reabsorbed.Footnote 2

The global communications network connected by the Internet Protocol has transformed the consumer/prosumer and small business experience of electronic communication.Footnote 3 The Internet is not a lawless, special unregulated zone; it never was.Footnote 4 Now that broadband Internet is ubiquitous, mobile, and relatively reliable in urban and suburban areas, it is being regulated as were all mass media before it. The major gatekeepers are regulated for the public good and public interest, whether that be access providers through infrastructure sharing, electronic privacy, cybersecurity and network neutrality regulation, or the social media, e-commerce and search giants through various duties of care, including those for notice and rapid action – in many cases requiring takedown of allegedly illegal material in a day or even an hour,Footnote 5 and notification of breach of security and privacy to the customer.Footnote 6 An Internet law expert arriving in a time machine from the mid-1990s would find all this quite shocking.

We have now come full circle: from computer law prior to the Internet, explaining the importance of robotics, cybernetics, and Electronic Data Interchange (EDI) in the 1980s; to an explanation of the Internet’s impact on the law in the 1990s that ranged across the entire syllabus, including constitutional law and jurisprudence;Footnote 7 to more specialist examinations of law in such areas as intellectual property and telecommunications in the 2000s; to a realization that the future was delayed, not denied, and that cyberlaw is vital to understanding regulation of platforms, of artificial intelligence and robotics, of blockchains, of automated vehicles, and of disinformation in our democracies.

The 2020s will finally be the decade of cyberlaw, not as ‘law of the horse’, but as digital natives help bring the law syllabus, legal practice, and even legislatures into the Information Society.

In the first part of the chapter, I explain how the cyberlawyers of the 1990s dealt with regulation of the then novel features of the public Internet. Internet law was a subject of much interest in the 1990s in the US, and of some specialist interest in the UK and Europe.

In Part 2, I explain how the foundational rules for the adaptation of liability online initially focused on absolving intermediaries of legal responsibility for end-user-posted content. This exceptionalist approach gradually gave way. While some US authors are hamstrung by a faith in the myth of the superuser and in the somewhat benign intentions of corporations as opposed to federal and state government, there has been a gradual convergence on the role of regulated self-regulation (or co-regulation)Footnote 8 on both sides of the Atlantic.Footnote 9

In Part 3, I argue that the use of co-regulation has been fundamentally embedded since European nations began to enforce these rules, with limited enforcement in which judges and regulators stated that business models largely focused on encouraging illegal posting would not be protected. Settled policies on liability, privacy, trust, encryption, and open Internet rules against filtering were arrived at as a result of expert testimony and exhaustive hearings.

Finally, in Part 4, I argue that changing those policies on a whim risks potentially catastrophic results, untying the Gordian knots of intermediary safe harbour, privacy, copyright enforcement, and open Internet regulation in Europe.

It is often forgotten that Werbach’s ‘Digital Tornado’ paperFootnote 10 heralded a model of limited state regulation, but very substantial responsible collective self-regulation (‘consensus and running code’) within transnational law.Footnote 11 When that pact was broken by 4Chan script kiddies and two billion Facebook users, regulation moved away from the responsible collectivism of the pioneers’ Internet.

There were three views of regulation in 1997: the type of self-regulation I have described; a belief in state regulation by those existing vested interests in broadcast, telecommunications and newspapers; and a third view that state regulation was inevitable as the Internet became ubiquitous but needed to be as reflexive and responsive as could be maintained with human rights responsibilities.

The perspective of today allows us to rethink the apparent triumph of the first view. If 2018 can in retrospect be seen as the year that the ‘Tech Bros’ view of regulation faltered and was replaced (to some extent) by state and supranational intervention, then the third option, of what I describe as co-regulation, appears to be supplanting that self-regulation option.Footnote 12 The state intervention was most notable in both scale and scope in European Union law, for data protection, consumer/prosumer protection, and also for competition enforcement.

Part 1: 1990s’ History of Internet Law

The Internet was developed in the 1960s at a group of research institutes in the United States and the United Kingdom.Footnote 13 The Internet is a network of approximately 50,000 autonomous systems, which are interconnected by the Internet Protocol. The Internet became an information network of critical mass in the 1980s with the rise of Bulletin Board Services (BBS),Footnote 14 still more so with the growth of commercial Internet service providers (ISPs) in the late 1980s, and eventually a mass-market artefact with the development of the World Wide Web (‘WWW’) and the release of commercial web browsers in 1993–1994. The Internet developed as a self-regulated academic network,Footnote 15 and its emergence as a commercial platform that would rapidly permeate through society was largely unpredicted.Footnote 16 Kahin and Nesson explained that the development of the Internet was bottom up and self-regulatory, and explored the emerging tensions as other nation-states began to assert a regulatory role.Footnote 17

Internet growth, together with its increasing commercial exploitation, was accompanied by an explosive growth in United States scholarship. In 1993, Reidenberg explained that information had become an international commodity, ill served by existing legal frameworks that were poorly adapted because of their focus on the tangible aspects of information-intensive products and their insufficient attention to the intangible aspects of information content.Footnote 18 Reidenberg extended the argument that technology can create an environment in the absence of legal rules in his ground-breaking conception of lex informatica. In the absence of ex ante sovereign power and legal rules, technology can symbiotically create de facto commercial regulation in much the same way as the mediaeval lex mercatoria.Footnote 19 He extensively spelled out the use of technology as a parallel form of regulation.

Building on Reidenberg’s insights, Johnson and Post made the classic argument for the Internet as a borderless self-regulatory medium that should be permitted to develop with fewer of the state-imposed restrictions that impeded the growth and development of earlier media.Footnote 20 The growth of the application of law to the emerging medium was also unpredictable, although Johnson and Post argued for an ‘exceptionalism’ to permit this globalized unregulated medium to grow unfettered by state censorship, which they saw as both normatively and substantively unjustified. They drew on United States constitutional law and history in the argument, and suggested a structured, principled, and internationally acceptable manner for national legislators to respond to the Internet. Lessig, while rejecting excessive state intervention, warned that self-regulation could lead to an Internet controlled by corporate interests.Footnote 21 Lessig argued that state forbearance was rapidly resulting in private regulation by new monopolies, supplementing the existing regulation by technical protocols.

Although cyber-exceptionalism became the dominant viewpoint among scholars, it was not without its opponents. Goldsmith made a legal positivist stand against the Johnson–Post Internet exceptionalism, seeing as both normatively and substantively flawed any ‘claim that cyberspace is so different from other communication media that it will, or should, resist all governmental regulation’.Footnote 22 He asserted that it can be regulated, including via conflict of laws rules, although this was not a normative position on whether law should utilize its tools to regulate the Internet. In an early trans-Atlanticist article arguing against Internet exceptionalism and reactive national Internet regulation, Mayer-Schönberger and Foster argued that the global information infrastructure limits both absolutists and regulators.Footnote 23 As they foresaw, the emerging internationalization of the Internet would lead to jurisdictional conflicts as well as a clash of rights principles. Samuelson argued persuasively that legislators must ensure that the impending rule making for the Internet is proportional in both economic and human rights terms to the needs and demands of users, as well as coordinated internationally.Footnote 24 Samuelson accepted the rise of the state, the need for sovereign intervention, and the efficiency self-regulation had provided, in arguing for principles for legislating on the Internet.

There have been extensive discussions as to the provenance of a field termed ‘Internet’ or ‘cyber’ law since the mid-1990s. As the law was colonizing the metaphorical ‘cyberspace’ – communications between computer users over the Internet – much of the most authoritative and pioneering legal scholarship on the new medium dates to the 1990s. Several offline subjects have incorporated large literatures on their digital forms, including intellectual property, non-networked computer law, telecommunications, privacy, cybercrime, and media content regulation. As the Internet was ‘born global’ but first became widely deployed in the United States, much of the literature has a bias in that direction.

Many argue that the effects of digital information retrieval on the law apply across all areas with some relevance, especially for intellectual property, and that Internet law should be considered part of the law of contracts, competition, the Constitution, and so on, with narrow exceptions for such issues as legal informatics and telecommunications law, which are being transformed by technology and therefore cannot remain distinct.Footnote 25 Easterbrook famously argued along these lines that there is no field of ‘Internet law’, any more than there is a ‘law of the horse’.Footnote 26 Lessig responded that the transformative effects of the Internet on law, in areas including free expression, privacy, and intellectual property, are such that it offers lawyers a radically new route to thinking about private regulation and globalization, the limits of state action, as well as a powerful metaphor for explaining these wider changes to law students.Footnote 27 Sommer dismissed Lessig’s claims regarding the exceptionalism of cyberlaw, arguing that ‘a lust to define the law of the future’ is dangerous, and can create bad taxonomy and bad legal analysis.Footnote 28

Academics have constantly argued that the lack of general academic expertise and the newness of the field mean that Internet law is a necessary short-term distinct study area, which may eventually be reintegrated into its constituent parts in an inevitable assimilation. Kerr explained two divergent views of Internet law: the first an internalized expert view of the law, the second a technophobic view. Kerr concluded that the two perspectives will converge and evolve, as more people understand the underlying technologies involved and the useful middle ground.Footnote 29 In a survey essay on the origins of the Internet law debate, Guadamuz argued that several new fields are emerging from the study of computers and law, including legal informatics and artificial intelligence (AI) and law, and that Internet law can offer new insights into established fields, providing contemporary context for the theoretical study of several subjects and for the profession’s development as a whole.Footnote 30 Guadamuz argued that the ‘Attack of the Killer Acronym’ was preventing accessibility to Internet law for the wider legal profession, clients (and faculty).

Larouche later argued that the object of information law has mutated, the scope for public intervention has been rolled back, implementation of any form of public intervention has been made more difficult, and information law has seen its main topics expropriated by more traditional subjects. The law syllabus is being digitized, literally (e-books, e-syllabi, e-libraries). He predicted the end of Internet law as a subject and the abstraction of information law away from any specific technology (telecoms and media law excepted). As a result, he argued that a ‘future information law’ would be radically amended.Footnote 31 Goldman argued for an Internet law that can be taught using new pedagogical elements employed on a survey-type course, and argued against Easterbrook that the volume of Internet-specific legislation and case law means that common law cannot provide a sufficient grounding for students to understand the transformations wrought by Internet law.Footnote 32

Specialization happened to some extent, with e-commerce part of standard contract law, platform dominance in competition law, digital copyright (and patent) law, cybercrime in criminal law, and so on, as Murray described.Footnote 33 Some of the more interesting specialist Internet law academic literature from the 1990s (and early 2000s) has also stood the test of time,Footnote 34 for instance, on network effects,Footnote 35 cyberlaw, and control by code or lex informatica,Footnote 36 free and open source software and control of the online environment,Footnote 37 network neutrality and the regulation of intermediaries by their networked environment,Footnote 38 and the creation of monopoly gatekeepers resisting yet also predicting the dominance of Google, Amazon, Facebook, Apple, and Microsoft (GAFAM).Footnote 39 Internet law has been approached as a private and public law, with policy perspectives from law and economics as well as sociolegal studies. The overviews that best introduce the topic to general readers contain contributions that provide both a commercial and a public law perspective. Some important contributions have focused on US law and policy,Footnote 40 and relatively few works provide a trans-Atlantic context.Footnote 41

The world has changed less than we think it has in the last generation, and the battle between tyranny and freedom is eternal and geographical.Footnote 42 Both the twenty-first-century Internet and the nineteenth-century telegraph are controlled by the Five Eyes (the Anglo-American powers and their former colonies in Singapore and Oceania). While the reach of international human rights law was severely limited in the nineteenth century, largely a matter of humanitarian aspects of the law of war and the extraterritorial application of domestic anti-slavery laws by the hyper-power Great Britain, we now live in what are claimed to be more enlightened times. The cabling of the planet for the Internet uses much the same undersea telegraph lanes and developments from those technologies. The first Internet link outside North America was to Norway (a member of the North Atlantic Treaty Organization) in 1973. We have wired Africa and have an interplanetary Internet. Geography matters, and so does territorial sovereignty. Information flows through those cables, and whoever controls the cables controls the information. The tapping of telegraph lines and blocking of encrypted messages were de rigueur in the Victorian era, but this policy has been challenged under international human rights law in the twenty-first century.

The likelihood that multistakeholder civil society is able to exercise useful scrutiny and control over hyper-power politicians and their obedient corporate clients or partners may appear remote, and the call for international norms for human rights law quixotic. Were such scrutiny to succeed, it could mark what some might call a tectonic shift in governance of communications. Cables may girdle the Earth in only 66.8 light milliseconds, but we continue to observe covert Internet surveillance in the shadowy half-light of governance of the corporations and surveillance agencies that have for so long controlled our information.Footnote 43

Part 2: A Very Short Internet Liability Legislative History

These foundational rules for the adaptation of liability online focused on absolving faultless (and low-fault, the line is shifting) intermediaries of liability for end-user-posted content. More than two decades after ACLU v. Reno and the ‘Information Superhighway’ metaphor of Al Gore and Bill Clinton’s first term, this is as useful a time as any to look back to the future. Settled policies on liability, privacy, trust, encryption, and open Internet rules against filtering were arrived at as a result of expert testimony and exhaustive hearings. Changing those policies now may result in potentially catastrophic untying of the Gordian knots of intermediary safe harbour, privacy, copyright enforcement, and open Internet European regulations.

The legislation that underpins intermediary liability was introduced in an extraordinary ‘dot-com’ boom in the period 1996–1999, frequently dated as starting on 12 April 1996, when Yahoo! underwent an initial public offering, its shares making a 270 per cent profit for investors in a single day. The growth of Yahoo! reflects the heady valuations of Internet stocks in the period: its peak of $118.75 a share on 3 January 2000 crashed to $8.11 on 26 September 2001 – lower than the price at its IPO.Footnote 44 The forty-two-month rise and fall of broader telecoms stocks (the Internet’s infrastructure plumbing) was documented by Malik as amounting to an excessive valuation of about $750 billion.Footnote 45 One regulatory outcome of the large-scale fraud, accounting irregularity, and generalized lack of regulation in that period was the failure, beyond the Sarbanes-Oxley Act 2002, to investigate properly and learn the lessons of that boom and bust.Footnote 46 This may have contributed in small part to the failure of regulation, and far greater losses, of the ‘Great Recession’ of 2008–2009 and the ‘Age of Austerity’ that followed.Footnote 47

Two myths need rebutting to understand the ‘self-regulatory settlement’ of Internet law. The first is that the United States settled on self-regulation and a hands-off approach. While this was the spirit of the Digital Tornado paper, it did not reflect the 104th Congress, which voted through the Communications Decency Act as part of the Telecommunications Act 1996.Footnote 48 In the US, liability regimes have differed according to speech-based and copyright-based liabilities. Communications Decency Act 1996 s.230 provides that ‘No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.’Footnote 49 This language might shield ISPs from liability for subscriber copyright infringement as well. However, Section 230(e)(2) specifically states: ‘Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.’ Section 230 established the concept of limited liability.Footnote 50 The Digital Millennium Copyright Act 1998 s.512 laid out detailed rules for copyright infringement and the action required of intermediaries when a notice of infringement, as set out in the DMCA, was sent. The Internet Freedom and Family Empowerment Act, introduced on 30 June 1995 to amend the omnibus Communications Act of 1934, was designed in part to mandate filters against adult pornography in United States households; the eventual law as amended was voted through 420–4 on 4 August 1995,Footnote 51 remaining federal law until partly struck down in the famous ACLU v. Reno Supreme Court case on 26 June 1997.Footnote 52

This non-filtered Internet regime, which arrived by accident as a result of constitutional convention, has been developed over time, and maintains a significant degree of difference from the gradually less permissive intermediary regime now permitted in the European Union.Footnote 53 Note that the 105th and 106th Congresses were largely obsessed with attempting to impeach President Clinton for perjury, related to sexual misconduct that was first publicized via the unrestricted Internet that Congress had attempted to control in 1995–7.Footnote 54 Attempts to reform the law from 2000 onwards were partially successful in restricting government-funded Internet services in, for instance, libraries (e.g. the Children’s Internet Protection Act 2001),Footnote 55 although statutes such as the Child Online Protection Act 1998 were struck down by the Supreme Court.Footnote 56

There is thus a patchy history of US federal legislators attempting to restrict Internet harms and limit Internet access, struck down by a Supreme Court defending individual liberty against censorship.Footnote 57 In the absence of such an active supreme court, Europe’s lawmakers have faced fewer restrictions on controlling the Internet, although the liability regime is only modestly different. As Holznagel indicates, US courts have applied ‘safe harbour’ provisions to protect Internet service providers (ISPs) widely, even where [a] the ISP was aware of unlawful hosted content; [b] it had been notified of this by a third party; or [c] it had paid for the data.Footnote 58 According to Yen: ‘[T]he general philosophy motivating these decisions – namely, that the liability against ISPs for subscriber libel would result in undesirable censorship on the Internet – remains vitally important in assessing the desirability of ISP liability.’Footnote 59 Despite multiple recent proposals to amend the limited liability safe harbour of s.230 Communications Decency Act to counter ‘revenge porn’, disinformation, and terrorist content, the broad exemption from liability for ISPs has continued into 2020.Footnote 60 Frydman and Rorive see courts as having, ‘in line with the legislative intent … applied the immunity provision in an extensive manner’.Footnote 61

The second myth that needs exposing is that Europe was entirely reactive to the US Internet liability regime. While it is true that European telecoms were only formally liberalized in 1998, moves to regulate liability for online services predate the public Internet. European consumer Internet use roughly dates to 1998, with the opening of the Telecoms Single Market, and broadband to 2000, with the Local Loop Unbundling Regulation. However, a high-level group of experts led by Professor Luc Soete was set up in May 1995 to advise the European Commission on ‘social and societal changes associated with the Information Society’, which set out over one hundred initial policy suggestions in January 1996, including the infamous ‘bit tax’ to prevent e-commerce eroding the local tax base.Footnote 62 Among these suggestions was a recommendation to investigate further ‘appropriate ways in which the benefits of the Information Society can be more equally distributed between those who benefit and those who lose’. Given the upheavals of the ‘zero hours’ precariat economy of the 2010s, and the scandals of Apple, Amazon, Alphabet, Facebook and other multinationals’ failure to pay tax on in-country activities, the bit tax may be returning in 2020.Footnote 63

In the German Teleservices Act of 1997Footnote 64 and the Bavaria v. Felix Somm (CompuServe) case,Footnote 65 Germany showed that it wished to see a limited liability regime similar to that in the US. This led, with British support, to the adoption of the Electronic Commerce Directive of 2000, creating the Digital Single Market in e-commerce. 1999 seems very late in the dot-com boom – but the legislative history of the ECD is directly traceable to 16 April 1997, months before the Teleservices Act was finally ratified. The coordination of US and European lawmaking came in the International Ministerial Conference ‘Global Information Networks: Realizing the Potential’ in Bonn (then the seat of the German government) on 6–8 July 1997, which addressed ‘international policy-making amongst others for electronic commerce with a view to adopting a Ministerial Declaration’.Footnote 66 As with the US Telecommunications Act 1996, it was an eighteen-month legislative process.

‘Safe harbour’ protection of ISPs from liability was only implemented on 17 January 2002, the deadline for transposition of the ECD into national law. Article 12 protects the ISP where it provides ‘mere conduit’ with no knowledge of, or editorial control over, content or receiver (‘does not initiate [or] select the receiver’). Frydman and Rorive establish that it was based on the 1997 German Teleservices Act, albeit with ‘slightly more burden on the ISPs in comparison with the former German statute’.Footnote 67 Where ISPs provide hosting services, under Article 14, they are protected from liability in two ways:

  1. the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity is apparent; or

  2. the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.

Like the proverbial three wise monkeys, ISPs and web hosting services should ‘hear no evil, see no evil, speak no evil’.Footnote 68 As mere ciphers for content, they are protected; should they engage in any filtering of content, they become liable. Thus masterly inactivity, except when prompted by law enforcement, is the most economically advantageous policy open to them. Frydman and Rorive state ‘undoubtedly the Directive seeks to stimulate coregulation’. It does this by formally permitting national courts to override the safe harbour in the case of actual or suspected breach of national law, including copyright law.
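The two-limb structure of the Article 14 test can be rendered as a simple decision procedure. The sketch below is a toy illustration only – the function name and boolean inputs are my own simplifications, not the Directive’s language, and real cases turn on contested questions such as what counts as ‘actual knowledge’:

```python
# Toy model of the ECD Article 14 hosting safe harbour (illustrative only).
# The function name and boolean inputs are hypothetical simplifications.

def hosting_safe_harbour(
    actual_knowledge: bool,     # actual knowledge of illegal activity or information
    aware_of_facts: bool,       # awareness of facts making illegality apparent (damages claims)
    acted_expeditiously: bool,  # removed or disabled access once knowledge/awareness arose
) -> bool:
    """Return True if the host keeps Article 14 protection under this toy model."""
    if not actual_knowledge and not aware_of_facts:
        return True             # limb 1: no knowledge or awareness
    return acted_expeditiously  # limb 2: expeditious removal or disabling on notice

# The 'wise monkeys' incentive in one line: a host that never looks never
# acquires knowledge or awareness, so it remains protected without acting at all.
assert hosting_safe_harbour(False, False, False) is True
```

The toy model makes the incentive visible: every input that preserves protection is within the host’s control, and the cheapest protected state is total ignorance.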

Whereas in the US the absolute speech protection of the First Amendment and procedural concerns mean that Notice and Take Down is counterbalanced by ‘put back’ procedures, in Europe, where no such protection of free speech exists, speech freedom is qualified by state rights. Of Notice and Take Down regimes in both jurisdictions, Frydman and Rorive state: ‘[T]his may lead to politically correct or even economically correct unofficial standards that may constitute an informal but quite efficient mechanism for content-based private censorship.’Footnote 69 It is clear that the economic incentive for ISPs is simply to remove any content notified, otherwise do nothing to monitor content, and let end users, the police and courts, and ultimately the ethics of the content providers decide what is stored and sent over their access networks. Frydman and Rorive state that: ‘Business operators should never be entrusted with … guidelines defining the limits of the right to free speech and offering procedural guarantees against censorship … which belong to the very core of the human rights of a democratic people.’Footnote 70 That is nevertheless the situation that ISP Codes of Conduct seek to self-regulate.

Could a stronger case be made to make ISPs responsible for a class of their content, where it serves their commercial benefit? This idea was suggested in the 1990s, before the CDA and ECD supplanted it. It has returned in the US with Balkin and Zittrain’s concept of information fiduciaries,Footnote 71 adapted to Europe in Perrin and Woods’ recent work on duty of care.Footnote 72

Vicarious liability tests the ability to benefit and control: [i] the right and ability to supervise, and [ii] a direct financial interest. This tends to make ISPs choose not to monitor, even for law enforcement. The direct financial benefit is interesting in view of the ‘killer application’ for broadband deployment in the 2000s: did this include peer-to-peer, if the access charges received by the ISP were based on traffic (i.e. adverts on a portal, or bandwidth usage)? ISPs arguably benefitted from the existence of copyright infringement on the Internet. Thousands of users desired Internet service precisely because it offered free access to copyrighted materials. As Yen argued, an ISP (like the Polygram trade show operatorFootnote 73) could make copyright compliance part of its system rules and then monitor for violations.Footnote 74 The Viacom v. YouTube case in 2010 failed to fully establish the burden in such cases.Footnote 75

Similar controversies have arisen beyond content and intellectual property. The landmark 2000 French criminal case of Yahoo v. LICRA confirmed that US multinationals must conform to national criminal law on hate speech.Footnote 76 With regard to privacy, in 2000 the EU and US published the ‘safe harbour’ agreement. Negotiated from 1998, it was always legal nonsense, if sound policy, and was struck down by the European Court of Justice in Schrems in 2015.Footnote 77 Its replacement, the ‘privacy shield’, is equally a sticking plaster over trans-Atlantic differences, and may also be struck down. While this chapter will not describe the data protection law developments of the last twenty-five years, it is noteworthy that the Data Protection DirectiveFootnote 78 was continually attacked as unsuitable for an Internet it was not expressly designed to regulate,Footnote 79 just as the new General Data Protection Regulation is already subject to much attack for its failure to regulate artificial intelligence and robotics, yet again technologies for which it was not expressly designed … but to which it may be adapted.Footnote 80

Part 3: The Development of Co-Regulation

The early period of frenetic legislative activity in 1997–2001 matched the growth of the Internet sector in Europe, which was very small and not officially measured until 1998; in the United Kingdom, for example, Internet use grew from 9 per cent in 1998 to over 42 per cent in 2002.Footnote 81 This unprecedented growth of a single electronic medium was driven by broadband, mobile and WiFi-enabled Internet access as well as the growth of social media: seven in ten Europeans were using the Internet by 2010.Footnote 82 By the end of 2017, 86 per cent of European Union citizens used the Internet – 433 million users, among them 252 million users of Facebook and approximately 400 million Google users.Footnote 83

The European Commission has conducted continuous monitoring of Internet self-regulation throughout the twenty-first century. A 2004 report for the European Commission concluded:

An imperfect self-regulatory solution may be better than no solution at all, and we must not raise our standards so high that self-regulation is never attempted. But there are limits to how much imperfection can be tolerated, and for how long. If self-regulatory codes and institutions are insufficiently transparent and accountable, and if they do not observe accepted standards of due diligence, they will lose the trust of the public and fail. There is a danger that some aspects of internet self-regulation fail to conform to accepted standards. We recommend co-regulatory audit as the best balance of fundamental rights and responsive regulation.Footnote 84

The development of Internet regulation has been scrutinized in real time. Self-regulation continues, and even in the absence of any new laws we would not expect the development of the Internet to be static.Footnote 85 Legislative impact assessments of Internet law that ask, ‘What happens if we do nothing?’, do not involve stasis: the zero option is that the Internet continues to develop.Footnote 86 Self-regulation is viewed as the making of standards and practices across industry that the European Commission, or a Member State, views agnostically in legislative terms (or pre-legislative terms, given the focus on emerging areas that are not yet regulated), but which it intends to monitor, to analyse the extent to which the self-regulation approaches the standards of ‘representativeness’ that co-regulation is meant to demonstrate as best practice. The Commission’s insistence that this is not an inevitable journey is backed by its actions in such areas as technical standard setting.

The largest European Internet companies are United States based. Half of the world’s ten largest public companies by capitalization are computer technology, Internet-based advertising, media and e-commerce conglomerates: Google (trading as Alphabet Inc.), Apple, Facebook, Amazon, and Microsoft (GAFAM). Apple is in the global top twenty corporations by revenues, with two Internet access providers in the top thirty (AT&T and Verizon). Large Internet companies have very high profit margins, driven in part by their avoidance of sales and corporate taxes through transfer pricing, as well as by merger activity. The European Commission explained that: ‘Google’s search engine has held very high market shares in all EEA countries, exceeding 90% in most. It has done so consistently since at least 2008.’Footnote 87 Regulation by states of the failings of those private actors is in general much slower, with the Google competition breach investigated from November 2010 until a record fine was finally issued in June 2017. The actors that enforce regulation on the Internet are thus young but globally successful multinationals, an unprecedented group of private actors regulating speech and commerce on a communications medium. In 2017, the European Commission found all these companies to have breached competition or state aid rules:

  • Apple in Ireland and Amazon in Luxembourg had received illegal state aid of €13 billion and €1.5 billion respectively.

  • Google had abused its dominance through its search business, with the EC imposing a €2.4 billion fine.

  • Facebook had flagrantly breached the terms of its merger with WhatsApp in 2014, with an EC fine of €110 million imposed in May 2017.

  • Previously dominant software and Internet company Microsoft had been found guilty of abusing its dominance three times since 2007, and was fined a total of €2.2 billion.

This total of fines is a record for any sector, as are the individual fines. To give a sense of the scale of mergers by the companies in that period, they made 436 acquisitions worth a total of $131 billion in the decade to June 2017.Footnote 88 These private actors operate with enormous scale and scope, yet they are legally regulated exactly like small commercial websites. The size and scale of their operations make their regulation more difficult than that of equivalents in other industries – for instance, the infamous ‘Seven Sisters’ energy companies whose regulation inspired both energy and, to some extent, environmental law.Footnote 89 Such regulation between states and firms has been termed ‘para-diplomacy’,Footnote 90 and it is constantly engaged in by the GAFAM group.

Major platforms (now including Google, Yahoo!, Facebook, Microsoft) and access providers formed a self-regulatory group, the Global Network Initiative (GNI), in 2008 to respond to government demands for better enforcement. GNI members publish transparency reports which can be audited by the board of GNI, an example of self-regulation by a group.Footnote 91 Google first published a report in 2010; in 2018 it reported almost 4 billion annual copyright removal requests, as compared to 495,000 annual ‘right to be forgotten’ delisting requests and only 16,000 annual government content requests (affecting 221,000 websites), demonstrating that its most substantial enforcement actions are carried out on behalf of copyright owners.Footnote 92 Facebook, Twitter (since 2012), Amazon (since 2015) and others also produce annual transparency reports.Footnote 93

Co-regulation was a term noted by the United States Congress in 2002 to describe certain aspects of European regulation: ‘government enforcement of private regulations’.Footnote 94 The term actually came from Australia.Footnote 95 The European adventure in co-regulation, in wider consumer protection legislation as well as standards setting, was set out in detail in 2002,Footnote 96 and became official policy in December 2003 with the Inter-Institutional Agreement on Better Law-Making (IIA), which defines co-regulation.Footnote 97 Although a non-legislative act, the IIA is virtually a constitutional document in European law, and its importance cannot be overestimated, as it agrees the rules of engagement between the European Parliament, Council of Ministers, and Commission.Footnote 98 The Commission confirms that forms of regulation short of state regulation ‘will not be applicable where fundamental rights or important political options are at stake or in situations where the rules must be applied in a uniform fashion in all Member States’.

De jure co-regulation involves legislation that tells the industry ‘regulate or else’. The UK Digital Economy Act 2010 included two specific elements of co-regulation, for the domain name authority (Nominet) and audiovisual media services online (the Authority for Television on Demand). De facto co-regulation exists where the regulators have used their powers of extreme persuasion. It is an area in which the industry players are very aware that the regulator has power. There can be de facto co-regulation taking place alongside de jure co-regulation.

The Commission in 2005 analysed co-regulation in terms of ‘better regulation’.Footnote 99 This was immediately made part of internal EC practice in the Impact Assessment Guidelines,Footnote 100 which the Commission must follow before bringing forward a new legislative or policy proposal.Footnote 101 Price and Verhulst (2005) focused significantly on AOL and internal self-organization.Footnote 102 They identified, even then, increasing realism in recognizing competition problems, emerging monopolies, and dominance. Verhulst and Latzer provided excellent analysis of the types of co-regulation beginning to develop and their institutional path dependency.Footnote 103 They identified five types of regulation, short of statutory agency-led regulation:

  • Co-regulation,

  • State-supported self-regulation,

  • Collective industry self-regulation,

  • Single company self-organization,

  • Self-help/restriction by users including rankings to impose restrictions on access to content.

Note the direction of travel: both bottom-up transformations from self- into co-regulatory bodies, and top-down delegation from regulation into co- but not self-regulation. Also note examples of ‘zombie’ self-regulation – where no one will declare the patient dead or switch off the life support machine. I described these as ‘Potemkin’ self-regulators, where there was a website and the appearance of a regulator but few resources, no physical address containing offices and little or no apparent adjudication and enforcement.Footnote 104 We should note the gains and losses in the lifecycle of regulation – will self-regulation ossify if it stays true to its principles of self-regulation? If ossification were to result, would it matter other than to self-regulatory purists if a mature self-regulator were then to be made into a co-regulator? UK converged communications regulator Ofcom’s own managerial and regulatory analysis of co- and self-regulation arrives at similar conclusions.Footnote 105

The EC has found it pragmatic to fund standards and to support self-regulation ex ante in cases where the US would simply regulate ex post via competition law. This leads to substantial US–European differences of approach, which may create ‘transatlantic competition of standardization philosophies … [in] consumer protection systems’.Footnote 106 Examples of co-regulation have become frequent in this field in the 2000s, notably in data privacy, domain name governance, content filtering, Internet security, and network neutrality, as well as standard setting and social network privacy regulation.Footnote 107 Both soft law and soft enforcement play a vital regulatory role, which legal positivists would be in danger of overlooking by failing to consider the law in its co-regulatory context.

A Beaufort scale of co-regulation was developed for the European Commission, based on the Beaufort scale of wind speed (from calm to hurricane).Footnote 108 The wind in this case is the degree to which government breathes on the forms of self-regulation taking place. Zero was a state of calm, which would be an entirely technical standards body whose standards were formed totally within the technical community, such as the Internet Engineering Task Force; up to a state of storm, which could be the forms of co-regulation that were formalized in the Digital Economy Act. Between zero and eleven, there is a lot of room to see the different elements of influence that have been exerted. That wind is blowing much more strongly from European governments and parliaments, towards achieving something much closer to co-regulation than to self-regulation. There are three alternatives:

  1. not to regulate, and let the world develop without regulation

  2. to regulate all the platforms that legislators are concerned about

  3. to regulate only the dominant platforms.

It is this regulatory dilemma that I consider in the final part of the chapter.

Part 4: Back to the Future of Cyberlaw in the Ubiquitous Networked Computing Era

Internet lawyers are widening their horizons and returning to the broader notion of being information lawyers whose interests extend beyond a public IP network. The end of the special place for Internet law, and its absorption into media law, has been prematurely announced. It is not only the European institutions that are becoming excited about more Internet regulation, driven in part by self-preservation and the rise of disinformation (‘fake news’ – sic). Reed and others question how we regulate AIFootnote 109 and the dominance of the ‘surveillance-industrial’ state in these post-Snowden/Schrems/GDPR times, pushing digital law even into constitutional studies.Footnote 110 These are exciting times to be an information lawyer.

To put a damper on too much recurrent techno-optimism or cynicism, I argue that most arguments for regulating the Internet and cyber-technologies today remain old wine in new bottles.Footnote 111 The United Kingdom regulator Ofcom has called for more regulation, and potentially a new regulator, of the Internet.Footnote 112 Most developed legal systems have many legal regulators of information, even if none of them is entirely shiny, new, and ‘cyber’. There is the UK Information Commissioner, the Electoral Commission, Ofcom itself, the Advertising Standards Authority, and others. There are technical support institutions such as the National Cyber Security Centre,Footnote 113 and a variety of non-governmental organizations such as the Nuffield Foundation-supported Ada Lovelace Institute, the Turing Institute, and the venerable Foundation for Information Policy Research.Footnote 114 In constructing what I call ‘OffData’, a regulator of electronic communications and content,Footnote 115 we need to learn the lessons of previous regulatory mergers both inside (Ofcom) and outside (Ofgem) communications. We need to recall what is known about sectoral regulation. UK Ofcom was set up almost twenty years ago as a result of technological convergence between broadcasting and telephony,Footnote 116 but deliberately constructed not to regulate Internet content. It is now required to do so. This is not a moment for unique solution peddling or an ahistorical view of the need to extend competences beyond a privacy, a security, a sectoral competition, and a communications regulator.

While information law is maturing, and the old Internet law/cyberlaw nomenclature may be fading, what we do as lawyers dealing with computers and their impact on society is growing more important. Some of the new ideas about regulating the Internet and artificial intelligence (AI) betray a naive faith in technology companies’ intentions towards law enforcement. It is now the job of grizzled, veteran information lawyers to help policy makers understand how to make better laws for cyberspace.Footnote 117 Hildebrandt explains the scale and scope that can create disinformation problems in social media platforms:

Due to their distributed, networked, and data-driven architecture, platforms enable the construction of invasive, over-complete, statistically inferred, profiles of individuals (exposure), the spreading of fake content and fake accounts, the intervention of botfarms and malware as well as persistent AB testing, targeted advertising, and automated, targeted recycling of fake content (manipulation).Footnote 118

Some of the claims that AI can ‘solve’ the problem of disinformation (‘fake news’) betray just such a naive faith. Limiting the automated execution of decisions (e.g. account suspension) on AI-discovered problems is essential in ensuring human agency and natural justice: the right to appeal. That does not prevent Internet platform operators’ suspension of ‘bot’ accounts at scale, but it ensures the correct auditing of the system processes deployed.Footnote 119

Technological solutions to detect and remove illegal/undesirable content have become more effective, but they also raise questions about who is ‘judge’ in determining what is legal/illegal, desirable/undesirable in society. Underlying AI use is a difficult choice between different elements of law and technology, public and private solutions, with trade-offs between judicial decision making, scalability, and impact on users’ freedom of expression. Public and private actors have suggested that AI could play a larger role in future identification of problematic content – but these systems have their own prejudices and biases. It is worth restating that neither law nor technology is neutral: they both embody the values and priorities of those who have designed them (‘garbage in, garbage out’).

Does the use of AI that employs algorithmic processes to identify ‘undesirable’ content and nudge it out of consumers’ view provide a means for effective self-regulation by platforms? The UK Parliament Artificial Intelligence Committee reported on some of these issues in 2017.Footnote 120 There are an enormous number of false positives in taking material down. It is very difficult for AI to tell the difference between a picture of fried chicken and a Labradoodle, simply because of the nature of the attempts by algorithms to match these things.Footnote 121 Human intervention is needed to analyse these false positives. AI can be deployed, but Google and Facebook are employing 50,000 more people because they recognize that a mixture of automated and human review will be required to achieve any kind of aim.Footnote 122 Artificial intelligence and algorithms cannot be the only way to regulate content in future.Footnote 123
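The scale problem behind those false positives is base-rate arithmetic. A minimal sketch follows, with wholly assumed illustrative numbers (a 99 per cent accurate classifier, one billion daily uploads, 0.1 per cent genuinely illegal content); the point is the structure, not the figures:

```python
# Base-rate arithmetic for automated takedown, using assumed illustrative numbers.

daily_items = 1_000_000_000  # assumed daily uploads
illegal_rate = 0.001         # assumed share of genuinely illegal content
accuracy = 0.99              # assumed classifier accuracy on both classes

illegal_items = daily_items * illegal_rate
lawful_items = daily_items - illegal_items

false_positives = lawful_items * (1 - accuracy)  # lawful content wrongly flagged
true_positives = illegal_items * accuracy        # illegal content correctly flagged

print(f"lawful items wrongly flagged per day: {false_positives:,.0f}")    # ~10.0 million
print(f"illegal items correctly flagged per day: {true_positives:,.0f}")  # ~1.0 million
print(f"share of flags that are wrong: "
      f"{false_positives / (false_positives + true_positives):.0%}")      # ~91%
```

On these assumptions, roughly nine out of ten takedowns would hit lawful content, which is why appeals and human review are structural requirements rather than optional extras.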

‘Mechanical Turks’ are people employed – subcontracted, typically – to carry out these activities,Footnote 124 in parts of the world where their own cultural understanding of the content they are dealing with may not be ideal.Footnote 125 One of the problems is that they are responding to a perceived need to remove more content, rather than addressing fair process and due process. Subcontracting to people on very low wages in locations other than Europe is a great deal cheaper than employing a lawyer to work out whether there should be an appeal to put content back online. The incentive structure will be for platforms to demonstrate how much content they have removed.

Transparency and explanation are necessary, but remain a small first step towards greater co-regulation.Footnote 126 Veale et al. have explained how to move beyond transparency and explicability to replicability: being able to rerun the process and produce an answer that matches the answer the company achieved.Footnote 127 The greater the transparency, the greater the amount of information given to users who do not read the terms of service online: the degree to which that helps is limited. Prosumers are told: ‘If you do not agree to the effectively unilateral terms of service you may no longer use Facebook.’ A better approach would be the ability to replicate the result achieved by the company producing the algorithm. Algorithms change all the time; the algorithm for Google search, for instance, is changed constantly, and there are good reasons to keep it a trade secret. Replicability would be the ability to look at the algorithm in use at the time and, as an audit function, run it back through the data to produce the same result. Replication is used in medical trials as a basic principle of scientific inquiry. It would help prosumers and regulators have more faith in what is otherwise a black box they are required to trust. The European Commission has used the overarching phrase ‘a fair deal for consumers’.Footnote 128
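A replicability audit of this kind can be sketched in a few lines. This is a hypothetical illustration of the principle, not any platform’s actual interface: the auditor takes the archived algorithm version and the logged inputs that were in use at the time, reruns the decisions, and checks that the outputs match what was recorded:

```python
# Hypothetical sketch of a replicability audit: replay a pinned algorithm
# version over logged inputs and verify it reproduces the recorded outputs.
import hashlib
import json

def fingerprint(obj) -> str:
    """Stable hash of a decision record, so outputs can be compared exactly."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def audit(algorithm, logged_cases) -> bool:
    """Replay each logged input through the archived algorithm version.
    The algorithm itself can remain a trade secret: the audit needs only
    deterministic replay and output matching, not disclosure of the code."""
    return all(
        fingerprint(algorithm(case["input"])) == fingerprint(case["recorded_output"])
        for case in logged_cases
    )

# Toy example: a stand-in 'ranking' rule and two logged decisions.
def rank_v42(query: str) -> list:
    return sorted(query.split())

decision_log = [
    {"input": "b a c", "recorded_output": ["a", "b", "c"]},
    {"input": "z y", "recorded_output": ["y", "z"]},
]
print(audit(rank_v42, decision_log))  # True: the pinned version replicates its results
```

The design choice mirrors the medical-trials analogy in the text: what is disclosed is not the mechanism but the ability to reproduce the result, which is enough to distinguish a faithful black box from a misrepresented one.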

Platform regulation is a new version of an existing regulated problem, with potentially dramatic negative effects on democracy and media pluralism.Footnote 129 In tackling disinformation (and other undesirable uses of online communication, as the history of electoral and defamation reform shows), not only must the effectiveness of technological measures be considered, but also the raising of awareness of individual and social responsibility for the provision and appreciation of verifiably truthful content, by independent platforms rather than a single central authority. Media pluralism and literacy go hand in hand with any technological intervention.

I predict that 2020 will see the implementation of hard law requiring ‘notice and action’ within one hour of complaints about illegal content online.Footnote 130 Vigorous action on social network regulation has not yet happened, in spite of urging from national and European politicians in view of terrorist content, sexual abuse, fake news, and the other vile elements of human society manifested on the Internet. European regulators continue to rely more on corporate social (ir)responsibility than hard law. The European Commission’s record fine for Google is being appealed, but Google will have to accept some kind of co-regulation of its vertically integrated advertising business in time.

I explained in the Introduction to this chapter that Werbach’s Digital Tornado, along with Reidenberg’s conception of lex informatica, heralded a model of limited state regulation but very substantial responsible collective self-regulation. Hard law, in the shape of the proposed European Digital Services Act to be introduced in 2020, will continue in the 2020s to be accompanied by Codes of Conduct and other self- or co-regulatory measures. At the time of writing, the world was plunging into a deep economic and social depression due to the pandemic, with broadband connectivity and Internet platforms ever more vital. Even as legislatures introduce hard law to combat their particular favourite online harm, continued emphasis will fall on giant platforms’ self-regulatory practices. Cyberlaw has become mainstream in the most dramatic manner imaginable.

2 Networks, Standards, and Network-and-Standard-Based Governance

Julie E. Cohen Footnote *

The Net interprets censorship as damage and routes around it.

—John Gilmore, Interview, Time Magazine
Introduction

This chapter, adapted from a forthcoming book on the evolution of legal institutions in the networked information society, situates the disruptive effects of networked digital technologies within a longer process of institutional change catalyzed by the gradual emergence of an informationalized global political economy. Over the last half century, institutions for transnational economic governance have multiplied. The landscape of world trade agreements and enforcement processes has grown increasingly complex. New structures for transnational regulation of economic activity have emerged that seem to operate according to their own rules in ways influenced by states but not controlled by them. Other new institutions, created to govern the Internet and its constituent protocols and processes, do not operate based on state representation at all. This chapter juxtaposes the various governance processes and treats them explicitly as iterations of a new – or, more precisely, emergent – networked legal-institutional form.Footnote 1 It also considers the relationship(s) between that institutional form and new platform entities that wield enormous de facto power – though not (yet) formally acknowledged sovereign authority – based on their control of infrastructures and protocols for networked, social communication.

Although networked governance institutions differ from one another in many ways, they share a common structure: They are organized as networks constituted around standards. Each of the scholarly literatures that has grown up around the various institutions described in this chapter has grasped some essential aspects of the network-and-standard dynamic but not others. Legal scholars who study transnational business regulation have interrogated the political legitimacy of networked governance processes, and they have also explored the political issues surrounding the development of international technical standards. Even so, they have paid less attention to the ways that standards bind networks together, and so the two conversations do not fully join up.Footnote 2

Legal scholars who study “code as law” have explored how technical standards structure the markets organized around them, and they also have raised persistent, serious concerns about the relationships between and among automated enforcement, lock-step conformity, and authoritarian modes of governance. They have tended, however, to situate standards processes within market-based governance frameworks and to understand code’s mandatory nature as illustrating how code differs from law. Consequently, they have not taken network-and-standard-based governance seriously as a new legal-institutional type.Footnote 3 And, for the most part, the different scholarly communities have not engaged in much dialogue with one another.

To posit networked governance institutions as an emergent category of legal institutions is, of course, to beg some basic questions about what makes an institution distinctively legal. One traditional set of answers has to do with the ways that the outcomes produced by such institutions are linked to rulemaking and enforcement authority. Another traditional set of answers is more explicitly normative: what makes an institution distinctively legal is its adherence to regular procedural rules and associated rule-of-law values. Communities are accountable only to themselves and markets may mete out consequences that seem arbitrary. According to a thick conception of what makes a legal institution, law’s authoritarian bite is (or should be) mitigated by procedural fairness and conformance with principles of public reason.Footnote 4

As we are about to see, network-and-standard-based governance institutions satisfy each of these definitions in some respects while challenging them in others. For some, that means they are not law at all, but I think that answer is too pat. The shift to a networked and standard-based governance structure poses important challenges both to the realizability of rule-of-law values and to traditional conceptions of the institutional forms that those values require, but the rule-of-law constructs that legal theorists traditionally have articulated are themselves artefactual – outgrowths of the era of text-based communication and of accompanying assumptions about the feasible mechanisms for formulation, justification, and transmission of claims of authority that are now rapidly being outpaced by sociotechnical change.Footnote 5 If the new governance institutions are to serve the overarching values that traditionally have informed thicker versions of rule-of-law thinking, both institutions and constructs will need to adapt. Here, I lay some groundwork for that project.

The first part of the chapter provides an overview of the rich and varied assortment of transnational, networked governance arrangements. The next part identifies five important features of the network-and-standard-based legal-institutional form that challenge traditional understandings of how legal institutions – institutions constrained by the rule of law – ought to operate. The final part of the chapter provides a brief introduction to information platforms and the functions they perform and considers whether platforms are best understood as stakeholders or as emergent information-era sovereigns.

Networks and Standards in Transnational Governance

The processes of world trade regulation, transnational business regulation, and Internet governance span many different subject areas and involve many different participants and interests. The institutions through which those forms of regulation are conducted also vary considerably from one another in terms of their rules for membership and participation. Some assign membership and participation rights to nation states while others do not, and some are more highly formalized than others. Even so, juxtaposing the various institutions and processes also reveals equally important ways in which they resemble one another: They are organized as networks, the networks are constituted around standards designed to facilitate and structure flows of economic and communicative activity, and the formulation and administration of those standards reflect the increasing influence of private economic power.

The global logics of production and extraction that have become characteristic of informational capitalism rely heavily on governance arrangements for facilitating crossborder flows of trade. Norms of liberalization do not simply relate to manufactured goods or even to crossborder flows of raw materials and intermediate inputs to more complex products. Following the important Uruguay Round of negotiations, which produced the World Trade Organization (WTO), the General Agreement on Trade in Services (GATS), and the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), liberalization imperatives relating to services, information, and intellectual goods have emerged as separate, powerful logics driving the articulation and expansion of trade obligations.Footnote 6 The Uruguay Round also produced two important agreements on international technical standardization that have generated increasing momentum toward scientific (and quasi-scientific) rationalization of liberalization rules.Footnote 7

For many decades, the multilateral regime organized around the framework established under the General Agreement on Tariffs and Trade was the principal source of trade liberalization standards, but a series of rapid and pronounced shifts in the institutional structure of world trade governance began to occur in the mid-1990s. As noted earlier, the Uruguay Round produced several new multilateral instruments and a powerful new enforcement body, the WTO, which began operations in 1995. Following the Uruguay Round, however, the process of reaching new agreements under the established multilateral framework has ground to a halt, and trade negotiators have shifted their efforts toward framing and securing new bilateral free trade agreements. The thickening network of bilateral agreements has in turn shaped proposals for new multilateral and regional instruments.Footnote 8 Although the initial impetus for the turn toward bilateral and multilateral agreements negotiated outside the WTO framework came from the United States and other developed economies, the so-called Washington Consensus on trade liberalization has begun to fragment, and other significant initiatives have emerged. For example, negotiations over the Regional Comprehensive Economic Partnership (RCEP) represent an effort to conclude a new, pan-Asian trade agreement.

In parallel with the changes in institutional structure, the landscape of world trade governance and world trade activism also has broadened to include a more heterogeneous assortment of actors and interests. In particular, transnational corporations and business associations wield increasing de facto power in setting trade policy priorities. In part, that power flows through traditional channels of influence; powerful economic actors have long enjoyed privileged access to national policymakers and have learned to exploit that access to demand stronger and more effective protection for their global supply chains.Footnote 9 But global logics of production and extraction also translate into new networked models of influence that flow outside state-sanctioned channels, and assertions of corporate interest also have prompted experimentation with new forms of dispute resolution, such as investor-state arbitration, that allow corporations to assert claims directly against states.Footnote 10 Meanwhile, exploiting the same networked connectivity that has facilitated global concentrations of economic power, civil society groups have worked to challenge asserted failures of transparency and accountability, building alliances with one another and coordinating their efforts for maximum effect.Footnote 11

The landscape of transnational economic governance also includes a large and varied group of regulatory arrangements, some well-established and others more emergent, that extend through and around the boundaries of nation states. Some arrangements originate with the United Nations (UN) or its member agencies. Others are cooperative ventures among national regulators or among other entities that play well-established quasi-regulatory roles. For example, financial regulators and central bankers engage in extensive, cooperative crossborder governance of financial market activities, and data protection regulators work collaboratively on various policy issues.Footnote 12 Other regulatory arrangements involve UN officials or national regulators in collaboration with private industry oversight bodies and trade associations.

As in the case of world trade, pervasive and crosscutting themes in both the theory and the practice of transnational regulation are the increasing importance of standard-setting activities and the growing power of private “stakeholders.”Footnote 13 The universe of standard-making activities is large and diverse and comprises a thickening network of “soft law” that structures and coordinates economic conduct. Many of the UN’s standard-making initiatives are structured as public–private collaborations.Footnote 14 Additionally, in 1996, the UN adopted a consultative process intended to give civil society organizations and other nongovernmental organizations (NGOs) a formal avenue for providing input into its policymaking processes. Business NGOs have been especially active users of that process.Footnote 15 Transnational corporations engage in standard making to facilitate their own operations and those of their global supply chains, and industry associations may work to coordinate those activities.Footnote 16 In the domain of financial governance, private transnational associations spanning fields from securities to insurance to accounting perform a wide variety of governance functions.Footnote 17

Also notably, the outputs of both private and public–private standard-making processes have begun to migrate into the domain of world trade. In particular, new bilateral and multilateral trade agreements covering labor, environmental regulation, and corporate social responsibility often refer to such standards.Footnote 18 As a result, standard-making activities constitute a new and fruitful avenue for private economic actors wanting to shape the formulation of trade provisions intended to delineate the appropriate reach of domestic protective mandates.

A final important site of transnational legal-institutional entrepreneurship is the Internet and its constituent protocols and processes. The most prominent governance arrangements for the Internet are formally non-state-based and multistakeholder-oriented. The Internet Engineering Task Force (IETF), a voluntary membership organization of computer technologists, oversees the continuing evolution of the Internet’s foundational standards for information transmission, and the Internet Corporation for Assigned Names and Numbers (ICANN), a not-for-profit transnational governance corporation chartered under California law, oversees governance of the domain name system. An assortment of other organizations – some formally multilateral and some private – also play important roles, however. For example, the International Telecommunication Union, a UN-affiliated body, superintends standards for wireless telephony, and the Institute of Electrical and Electronics Engineers, a technical professional organization, coordinates the evolution of standards for wireless interconnection. The databases that map human-readable domain names to network addresses are maintained by a small group of entities – including universities, research consortia, government entities, and a few private corporations – pursuant to contracts with the Internet Assigned Numbers Authority, an entity for many years overseen by the US Department of Commerce and now administered by an affiliate of ICANN.Footnote 19

Technical standard making is front and center in Internet governance, but Internet governance arrangements also play more substantive and comprehensive roles in the governance of global networked communications, and do so via increasingly elaborate institutional structures.Footnote 20 Under pressure from a diverse mix of global stakeholders, ICANN has developed regularized pathways for participation, including formal consultative procedures for national governments and civil society organizations.Footnote 21 At its inception, the IETF was a self-selected community of volunteers that rejected “kings, presidents, and voting” in favor of “rough consensus and running code.”Footnote 22 Today, although membership remains voluntary and policymaking consensus-based, the IETF comprises two principal divisions made up of over one hundred working groups, overseen by two steering groups and advised by two different boards. Working groups follow elaborate protocols for documenting their activities, communicating with other groups, and reporting to the steering groups and advisory boards. There is a process (so far, never used) for administrative appeals. At the same time, as in the cases of trade and transnational economic regulation, private economic power also plays a highly visible role in Internet governance. Private technology firms are well-represented in myriad working groups and steering committees, and as the Internet’s constitutive liberalization norms have been filtered through the lens of multistakeholder-based institutional design, they have produced institutional responses optimized to the needs of the most active and well-resourced stakeholders.Footnote 23

Networks, Standards, and the Rule of Law: Five Problematics

A network is a mode of organization in which hubs and nodes structure the flows of transactions and interactions. Some commentators have characterized structures for networked participation and governance as radically democratizing, while others have worried that the absence of definite chains of command undermines democratic accountability.Footnote 24 Important recent books about power and global political economy by David Singh Grewal and Manuel Castells explore the significance of networked organization for political economy generally, articulating new theoretical models of networked social, political, and communication power.Footnote 25 Network-and-standard-based governance arrangements, however, are not simply networks; they are also institutions.Footnote 26 This section seeks to identify with greater precision the various points of mismatch between the rule-of-law tradition in legal theory and the operation of the network-and-standard-based legal-institutional form. It begins by reconsidering two points of conventional wisdom about network organization and its relationship to legal power.

First, the assertion that network organization is inherently more democratic than other forms of organization because it facilitates the expression and circulation of dissenting views is open to serious question. It is true that, because network organization is nonhierarchical, even an enormously powerful hub cannot prevent information from flowing around it through other nodes.Footnote 27 Within networked governance arrangements, however, the ability to navigate interruptions works most reliably to the benefit of the powerful. The same networked affordances that enable the dissident to evade the censor also enable economically or politically dominant parties to circumvent negotiating stalemates and avoid inconvenient but localized regulatory burdens. Within networked governance arrangements, power interprets regulatory resistance as damage and routes around it.

Second, the observation that network organization is nonhierarchical can be somewhat misleading. From an internal perspective, network organization around a standard imposes a form of hierarchical ordering that inheres in the standard itself. If other networks organized around other standards are available, that may not matter much. But a standard invested with legal significance is not just a standard: participants lack the authority to depart from it. So too with a standard, such as the basic Internet protocol, that exacts universal adherence as a practical matter. Network organization under conditions of legally or practically mandated standardization may be quite exacting as to the forms of compliance, and it also may afford new opportunities for the exercise of economic and political power. From the perspective of traditional legal theory, this point is easy to miss because legal theory traditionally has drawn a distinction between rules and standards that drives in the opposite direction. Within that scholarly tradition, “rules” are granular and demand precise compliance, while “standards” are more flexible and are fleshed out via norms and interpretative conventions.Footnote 28 Network organization under conditions of legally or practically mandated standardization is a different creature entirely.

The powerful critiques of transnational governance arrangements that have emerged within legal scholarship still have not fully assimilated the hybridity of the network-and-standard-based legal-institutional form. Both the ability of power to route around inconvenient regulatory resistance and the relocation of authority into the standard strain traditional accounts of law, reliably eliciting institutional features that seem very different from those that a system of the rule of law would require. The same developments also strain conventional understandings of standards and standardization, reliably foreclosing the kinds of pathway that facilitate competition, correction, and stabilization in the contexts where standards are more usually studied. It has become vitally important to understand the ways that the intersecting vectors of governance, law, and standardization are transforming one another. This section identifies five important directions for inquiry, which relate to the nature of standard-making authority, the available pathways for contesting and changing the reigning standard, the available pathways for coopting governance mechanisms to advance authoritarian political and geopolitical interests, the mechanisms for political accountability, and the vernaculars in which mandatory standards are articulated, applied, and contested.

Dominance as Hegemony: The Problem of Unchecked Authority

One distinctive characteristic of emergent global networked legal-institutional arrangements is the way that network-and-standard-based organization reshapes the exercise of lawmaking authority. Within such arrangements, a dominant party’s ability to shape policy is both more absolute than it typically is within more traditional legal settings and more immediate than it typically is in technology standards markets. When instituted against a background of vastly unequal geopolitical power, network organization under conditions of mandated standardization has resulted in policy hegemony relatively unchecked by political or structural constraints.

In democratic societies with rule-of-law traditions, legal institutions are recognizable as such in part because of their adherence to regular, reasoned processes for making policy and for contesting policy choices. This is not to suggest that such processes always work perfectly or even well. But certain high-level constraints on institutional behavior – in particular, principles of separation of powers and procedural due process and commitments to giving reasons for official actions – also have been widely acknowledged in democratic societies.

Dominance in technology standards markets confronts different kinds of limit. Although networks do exhibit lock-in effects, various forms of competition remain possible (we will consider those forms more closely in the next section).Footnote 29 Additionally, in paradigmatic, discrete technology standards markets, the connection between market dominance and policy dominance tends to be indirect. The standards governing such matters as the layout of a typewriter keyboard or the arrangement of prongs on an appliance plug are thoroughly agnostic as to their users’ political beliefs and policy commitments. Many contemporary disagreements over technology policy arise precisely because the emergence of networked information and communications technologies has set protocol and policy on converging paths.

Network-and-standard-based legal-institutional arrangements connect protocol and policy directly to one another and eliminate separation between them. Within such arrangements, the point of mandated standardization is exactly to specify the kinds of flow that must, may, and may not travel via the network. The policy is the standard and vice versa, and that equivalence sets up the two interlocking dynamics that produce policy hegemony. On one hand, a dominant network enjoys network power – which David Grewal defines as the self-reinforcing power of a dominant network and Manuel Castells explains as a power that is “exercised not by exclusion from the networks, but by the imposition of the rules of inclusion” – simply by virtue of its dominance.Footnote 30 On the other, if a particular hub within a dominant network exercises disproportionate control over the content of the standard, then networked organization will amplify that hub’s authority to set policy and legally mandated standardization will amplify it still further.

Developments in the domains of world trade governance and transnational business regulation over the second half of the twentieth century mapped straightforwardly to this lock-in-based theoretical model (we will consider some more recent anomalies in the next section), enabling the consolidation of US policy hegemony across a wide and varied set of domains.Footnote 31 The case of Internet governance is more complicated. US observers, in particular, tend to think that Internet governance processes have avoided the worst excesses of US policy hegemony precisely because of their sui generis, multistakeholder design. As noted earlier, however, private technology companies, including especially the dominant US technology firms, wield considerable influence within Internet governance processes, and the turn to multistakeholderism reflects a long-standing and largely bipartisan preference in the United States for a strong private-sector role in Internet governance.Footnote 32 It is unsurprising, then, that the responses of marquee institutions such as ICANN and the IETF to the policy problems that have repeatedly bedeviled them – from privacy and surveillance to content regulation and censorship to intellectual property enforcement to network security – have tended to reflect the particular norms of flow enshrined in US information law and policy.

Legal Standards Wars: The Problem of Regulatory Arbitrage

A second striking characteristic of emerging global network-and-standard-based legal-institutional arrangements relates to the mechanisms available for changing a governing standard. On one hand, mandated standardization intensifies lock-in to the current standard by foreclosing many of the pathways for change that ordinarily would exist. On the other, it incentivizes efforts at regulatory disintermediation by those favoring a different or modified standard, and those efforts may gain purchase to the extent that the network remains accessible via new points of interconnection.

It is useful to begin by considering the mechanisms through which standards can change over time in market settings. Carl Shapiro and Hal Varian distinguish between evolution and revolution, with the former consisting of gradual change while maintaining backward compatibility with the original standard and the latter involving a sharp, disjunctive break between new and old standards.Footnote 33 Such changes may be implemented cooperatively, or two (or more) parties may seek conflicting changes, as in the case of the Blu-ray and HD DVD standards for digital video storage and playback, which maintained backward compatibility with the regular DVD format but were incompatible with one another. If the parties cannot agree on which course is best, a standards war may ensue.

In struggles to shape the future of a legally mandated standard, the mandatory structure of networked legal-institutional arrangements narrows the universe of possible outcomes. Gradual evolution is most feasible when it moves in directions that are compatible with the dominant standard’s underlying policy commitments. In theory, gradual retrenchment from the hegemonic norm is also possible; in practice, however, one cannot fall below the threshold level of compliance that the standard requires unless there is cooperative agreement to extend forgiveness.

Revolution against a background of mandated standardization is more difficult still. Absent cooperative agreement to depart from the dominant standard, revolutionary change – or, in the language of technologists, forking the standard – requires not only confidence in one’s installed base but also willingness to court diplomatic or even geopolitical instability. In the domain of world trade, disjunctive changes without backward compatibility risk starting trade wars; in the various domains of transnational business regulation, departure or threatened departure from agreed conventions can roil markets and create diplomatic incidents. Internet governance institutions have powerful norms against forking network standards. When such proposals have originated – generally from states that are geopolitical outsiders – they have commanded little support and have been unable to generate momentum.Footnote 34 A systemic shock can create impetus for a mutually agreed disjunctive break; so, for example, the 2008 financial crisis generated the momentum required to tighten standards for measuring bank capital adequacy.Footnote 35 Absent such a shock, however, revolutionary change is unlikely.

Standards wars can be horizontal or vertical, however, which means that even dominant standards remain open to disintermediation by a rival standard that sits closer to the relevant activity. So, for example, although Microsoft’s Windows operating system still dominates the personal computing market, it is no longer the most important interface for those wishing to market applications to personal computer users. Web browsers provide an alternative interface for many applications, as do social networks and mobile operating systems. More recently still, the “Internet of things” and the emergent market for smart home assistants have opened new channels for companies seeking to become the intermediary of choice for as many online interactions as possible.

Networked governance arrangements organized around legally mandated standardization are similarly vulnerable to disintermediation by adjacent governance arrangements. So, for example, when developing nations began to balk at additional extensions to the TRIPS agreement that they saw as benefiting developed economies, US trade negotiators simply routed around the WTO, negotiating new bilateral and multilateral agreements incorporating the stronger provisions they wanted to see enshrined as new network standards. Developing nations fought back, gradually organizing around a proposed “development agenda” for the World Intellectual Property Organization (WIPO), a constituent body of the United Nations. The WIPO Development Agenda thereby briefly became an entry in an intellectual property standards war. Developing nations’ effort at “regime shifting” enjoyed only temporary success, however, because developed countries returned to WIPO in force and asserted their own interests.Footnote 36 Meanwhile, the copyright industries of the Global North have appropriated regime-shifting tactics to their own ends, working to introduce interdiction mandates directly into arrangements for Internet governance.Footnote 37

As a different example, consider evolving arrangements for governance of crossborder transfers of personal information. The European Union has worked to export its high standards for personal data protection to the rest of the world, whereas parties seeking greater liberalization – including especially dominant global platform firms headquartered in the US – have shifted their emphasis toward inserting strengthened mandates for crossborder flow into bilateral and multilateral trade agreements, including especially agreements involving the Asian nations that are increasingly significant players in the emerging crossborder data servicing economy.Footnote 38 Privacy NGOs have worked to thwart trade workarounds for data protection obligations, but that project has become more difficult as the center of gravity has shifted into trade governance, which had not traditionally been a focus of transnational privacy activism, and toward Asia, where civil society organizations focused on privacy and data protection have not traditionally had a significant presence. And here again, Internet governance has emerged as an important focus of regime shifting efforts; for example, even European data protection authorities have largely acquiesced in ICANN’s continuing failure to require adequate privacy protections for WHOIS registry data.Footnote 39 Each of these developments destabilizes settled expectations about where authority to regulate crossborder transfers of personal data resides and about what the reigning standard requires.

In theory, at least, a system of the rule of law is not supposed to work this way. An important principle associated with the ideal of the rule of law is that legal rules should be applied consistently, and the ideal of consistency in turn implies a degree of constancy. In fact, those ideals have been under siege since the complex legal ecologies of the late twentieth century began to offer a far wider and more complex array of possibilities for regulatory arbitrage than existed when the rule-of-law ideal was first articulated. Even so, in domestic settings each strategy confronts built-in limits. At the end of the day, there is an institutional actor with the power to exercise jurisdiction over the challenged conduct, to superintend a reasoned but finite process of contestation, and to say what the law is. Relative to that benchmark, the new networked governance arrangements manifest both frustrating path dependence and a destabilizing failure of finality.

Network Power and Moral Hazard: The Problem of the Authoritarian End Run

A third distinctive attribute of global network-and-standard-based governance arrangements is a particular kind of moral hazard that concerns the relative importance of economic and political liberalization. Economic liberalization has become the primary driver of innovation in transnational legal ordering, and the overriding importance often ascribed to facilitating flows of crossborder economic activity sets up the conditions for a dynamic that I will call the authoritarian end run. In brief, an authoritarian regime wishing to stint its liberalization obligations in the interest of maintaining its political control often may do so with impunity because of the dominant network’s interest in maintaining and consolidating its economic dominance.

Recall that network power operates by harnessing and disciplining the desire for inclusion. That mechanism presents tradeoffs for the policy hegemon – the party that enjoys dominant hub status – as well as for other network participants. Simply put, there are downsides to sanctioning or expelling members for standards violations, and those downsides may lead both the policy hegemon and other network participants to overlook certain types of infraction – especially those that can plausibly be characterized as purely domestic in scope – to preserve flows of goods, services, and information across borders and within corporate supply chains. So, for example, developed nations historically have been willing to minimize the importance of certain labor practices in developing countries, to overlook local restrictions on religious and press freedoms, and to excuse certain endemic forms of gender discrimination.Footnote 40

To the extent that the authoritarian end run entails subverting the dominant standard for purposes dictated by conflicting political goals, it is broadly consistent with the dynamic of the legal standards war described in the previous section, but it is also different. In the short term, it is not an exit strategy but rather a shirking strategy available to entities lacking the power or the motivation to provoke a standards war. In the longer term, it is a strategy for alternative network making around standards that blend elements of economic liberalization with elements of mercantilist central planning and political control. Above all else, authoritarian states seek to control unwanted flows of information within and across their borders.

In principle, any state with sufficient economic and geopolitical power can wield what Manuel Castells calls network-making power – or the power to constitute its own network by establishing alternative conditions of interconnection.Footnote 41 In the contemporary geopolitical landscape, the principal author of the authoritarian end run is China. Chinese trade policy and information technology policy have emerged as powerful and mutually reinforcing components of a larger strategy for pursuing dominance of standards for global economic and technical exchange. The Chinese program for physical infrastructure development, now known in English as the Belt and Road initiative, seeks to facilitate flows of labor, goods, and raw materials across continents and oceans under conditions that advance Chinese economic interests.Footnote 42 The Chinese information technology sector has grown rapidly and now includes two firms that rank among the world’s twenty largest: social networking and gaming conglomerate Tencent and e-commerce giant Alibaba.Footnote 43

As the Chinese information technology sector has matured and turned toward new markets, affordances for both economic development and state control of communications infrastructure play key roles. Tencent, Alibaba, and other Chinese platform companies have begun to make inroads in developing markets across Asia, Africa, and the Middle East, and Chinese hardware manufacturers like Huawei and Xiaomi sell equipment ranging from backbone servers to mobile phones across the developing world. In terms of development, capabilities for mobile payment, banking, and credit have driven rapid penetration within populations hungry for modernization.Footnote 44 For client states inclined to control information flows to their own populations, meanwhile, Chinese firms’ relative willingness to work with host governments to implement filtering and surveillance in their own markets is a selling point.Footnote 45 The combined result of these technology policy initiatives is “a geopolitical enclave in which computational architectures and informational actors are coming together into what could be deservedly termed the Red Stack” – a networked communications infrastructure offering the ability to layer separation and control on top of the underlying connectivity afforded by the basic Internet protocols.Footnote 46

The authoritarian end run has an ambivalent relationship to the rule of law. On one hand, both Chinese trade policy and Chinese technology policy emphasize centralized control by state institutions. One byproduct of China’s accession to membership in the WTO and its movement toward greater economic liberalization has been modernization of domestic courts and other formal governance institutions along the lines that the WTO’s obligations require. To the extent that concerns about the rule of law in the era of networked governance hinge on the disintegration of sovereign authority, one might argue that some components of the Chinese strategy address those concerns. On the other, the rule-of-law construct that Chinese global governance initiatives enshrine is thin, emphasizing regularity and predictability over transparency and contestability – features that Chinese Internet policy, in particular, works to eliminate.

Extreme Multistakeholderism: The Problem of Public Accountability

A fourth striking characteristic shared by the processes described in this chapter is their unusual mechanisms for political accountability. Emergent networked governance arrangements are inhospitable to traditional mechanisms for instilling accountability within legal institutions, and they have invited powerful new variations on rent seeking by nonstate actors. That development marks the emergence of a new model of public participation in governance, which I will call extreme multistakeholderism. It is amenable to practice by those entities or coalitions that are both sufficiently well-resourced to monitor governance processes unfolding concurrently at multiple sites and sufficiently well-connected to gain access to processes and documents that may be shrouded in secrecy.

In theory, many of the transnational regulatory processes described in this chapter incorporate delegation-based accountability mechanisms.Footnote 47 In the US, for example, trade policy is the domain of the executive, and the executive in turn is accountable to the Senate and to the voting public. Interventions in transnational business regulatory processes also emanate from the executive branch and its constituent agencies and commissions. In practice, however, both trade policy processes and transnational regulatory processes are far more accountable to special interests than to their official constituencies. In the US, members of the industries affected by trade agreements sit on trade advisory councils that operate outside the purview of open-government laws, and new “fast-track” procedures have been devised to move newly negotiated agreements through the congressional approval process; both arrangements are thought to be justified by the executive’s broad authority to conduct diplomatic relations with foreign countries.Footnote 48 Transnational regulatory processes are procedurally entrepreneurial and may also incorporate substantial privatization components, so the traditional mechanisms for executive branch accountability tend not to reach them directly. The courts, for their part, are inclined to regard both trade policy choices and transnational regulatory undertakings as nonjusticiable because they involve matters committed to the discretion of political actors.Footnote 49

Internet governance processes that rely on delegation-based accountability work differently and somewhat better. The particular form of incorporation chosen for ICANN – that of a California public benefit corporation – imposes a set of accountability mandates that include both stakeholder representation and transparency obligations. ICANN has adopted a variety of measures to keep its constituencies informed, some taken from the corporate toolkit (e.g., quarterly performance calls) and others from the transnational governance toolkit (e.g., multilingual reporting).Footnote 50 In practice, however, the stakeholder-based model for public input has produced – and was intended to produce – a significant policy tilt toward relatively well-resourced interests concerned chiefly with protection of trademarks and other intellectual property and more recently also concerned with access to the rich trove of personal information contained in WHOIS domain registry databases.Footnote 51

The other traditional mechanism for political accountability involves direct participation. Some Internet standards governance arrangements adopt this model, but here too underlying patterns of power and access can operate to impede participatory democracy. Although membership in the IETF and a number of other technical professional organizations active in Internet standard making is exercised on an individual basis, as a practical matter participation is heavily corporatized. The World Wide Web Consortium, a private membership organization that oversees the development of the Web’s hypertext protocols, has different tiers of membership for different types of stakeholder, with large technology corporations paying the highest fees and wielding corresponding levels of clout.

From a theoretical perspective, these developments are unsurprising. Within networked governance arrangements, one would expect both assertions of power and assertions of counterpower to exhibit returns to scale.Footnote 52 The lengthy, intricate, and globally distributed nature of transnational legal-institutional processes sets an effective lower bound on the kinds of entity that can participate effectively. The affordances of networked media and communication infrastructures offset geographical limits to some extent but also favor those best positioned to make use of them to coordinate interventions across multiple, far-flung sites. Civil society organizations too have learned to play the multistakeholder game, forming transnational networks that enable them to pool their resources and act cooperatively, but corporate actors and business NGOs, including the International Chamber of Commerce and the International Trademark Association, have followed suit, mobilizing the comparatively greater resources of their memberships to shift policymaking efforts into more congenial arenas.Footnote 53

The flipside of procedures guaranteeing both orderly contestation and finality is a political culture prepared to honor their requirements and abide by their results. The political culture of extreme multistakeholderism is different. The rewards flow to those who can access the most up-to-date information and marshal it most effectively on a global playing field. Those who lack comparable resources are doomed to play catch-up, continually pursuing a threshold of influence that remains out of reach.

Technocracy and Its Discontents: The Problem of Publicly Available Law

A final distinctive attribute of emergent arrangements for global business governance and global network governance is their highly technocratic character. Some legal scholars who study transnational regulatory processes have worried that those processes lend themselves to capture by elites.Footnote 54 It is helpful to understand that tendency as bound up with essential but imperfectly assimilated sociotechnical shifts. The mandated standards around which networked legal institutions are organized exemplify an approach that scholars who study sociotechnical assemblages for financial regulation have called the numericization of governance.Footnote 55 They are developed via expert proceedings and encoded in lengthy, highly technical specifications whose implementation requires ongoing supervision by cadres of managerial elites.

The particular expert register in which transnational governance is conducted varies from setting to setting. In the Internet governance context, the language of governance is produced by and for computer scientists and engineers. In world trade governance and transnational financial regulation, the language of governance is predominantly economic and, particularly in financial governance settings, highly quantitative. Environmental and food and drug regulatory processes incorporate technical vernaculars from fields such as climate science, marine ecology, and epidemiology. In other transnational settings, the prevailing vernacular is more generally managerial. For example, detailed operational standards geared to the rhythms of organizational processes and to the benchmarks and reporting conventions used by professional auditors are increasingly common features of transnational environmental and labor regulation.Footnote 56

In each case, reliance on technical vernaculars produces both some obvious entry barriers and some less obvious obstacles to broadly democratic policymaking. Even where participation in network governance processes is formally open to all comers, as in the case of the IETF’s working groups, the learning curve for those without appropriate technical facility is often steep. Civil society organizations in particular have struggled to attain technical parity with their better-resourced counterparts in the business and technology communities.Footnote 57 Expertise is required, as well, to understand the ways in which methods and analytical commitments that are ostensibly technical also implicate, reflect, reinforce, and sometimes predetermine policy commitments. Disentangling fact from value and understanding the social construction of technology are perennial problems in science and technology policy, but network organization under mandated standardization exacerbates them.Footnote 58 As substantive policy choices are folded into mandated standards, they become more and more difficult to disentangle, and certain types of especially incommensurable concern – for example, concerns relating to development of capabilities for human flourishing and protection of fundamental rights – may seem to disappear altogether.Footnote 59

A corollary is that, as technocratic oversight of regulatory functions becomes more solidly entrenched, the (explicit or implicit) political commitments of the expert regulators themselves may become more difficult to identify, contest, and dislodge. So, for example, the pathbreaking “end-to-end” design of technical protocols for the Internet reflected solid technical judgment about robustness to certain kinds of disruption and also encoded the generally libertarian commitments of the original Internet pioneers. As a result, although the Internet overall is extraordinarily resistant to disruptions of service, it has proved extraordinarily hospitable to other kinds of threat that exploit networked interconnection.Footnote 60 The discourses of risk management and cost-benefit analysis that play increasingly important roles in financial and environmental regulation can fail to reckon adequately with certain kinds of large systemic threat.Footnote 61 In the domain of world trade, the leading theoretical models generally have viewed liberalization as an unqualified good and have tended to discount evidence suggesting that it is also important to provide for equitable distribution and domestic capability building.Footnote 62

An important element of the rule-of-law ideal is commitment to publicly accessible rules and publicly accessible reasoning about the justifications for official decisions. From that perspective, network organization under mandated standardization creates a paradox: Effective control of highly informationalized processes requires governance institutions capable of responding in kind, but the very process of optimizing regulatory controls to highly informationalized processes makes governance arrangements more opaque and less accountable to broader global publics.

Platforms as Emergent Transnational Sovereigns?

So far, the discussion has presumed that, within networked governance arrangements, nonstate entities act as stakeholders but only sovereign states function as policy hubs. But that implicit division of roles ignores both the leveling effects of network logics and the amenability of standards to disintermediation. Commentators have long puzzled over the undeniable fact that, although they are nominally stakeholders in transnational networked governance processes, transnational corporations speak with increasingly independent voices in their relationships with sovereign states and also wield considerable governance authority of their own over globally distributed labor and supply chains.Footnote 63 Dominant global platform firms – firms that have attained positions as dominant intermediaries within the emerging global, networked information and communications infrastructure – push both tendencies to new extremes.

Over the last several decades, the platform has emerged as the core organizational logic of the political economy of informationalism. In other work, I have explored the intertwined functions that platforms provide – intermediation between would-be counterparties and techniques for rendering users legible – and explained how those functions afford unprecedented control of commercial and social interaction.Footnote 64 From an economic perspective, platforms represent infrastructure-based strategies for introducing friction into networks. Platforms provide services that participants view as desirable and empowering, thereby generating network externalities and enabling participants to leverage them. At the same time, they use a combination of legal boilerplate and technical protections to assert ownership and control of user data, to police access by app developers and potential competitors, and to maintain their algorithms for search, user profiling, and ad targeting as valuable trade secrets. Those strategies work to pull users closer and keep would-be competitors further away, generating the rich-get-richer dynamics that have produced dominant platforms and continually reinforce their preeminence.

As a result of their enormous and growing power over the conditions of information exchange, platforms are unmatched by other transnational corporations in the extent of the authority they wield over the day-to-day experiences and activities of their users. Here again, the distinctions developed by Manuel Castells in his exploration of communication power in the networked digital era are useful for explicating the various kinds of power that platforms possess. By virtue of their privileged and infrastructural access to flows of information, platforms wield both network power – which, as we have seen, inheres in the self-reinforcing power of a dominant network and by extension in its standards – and network-making power – or the power to constitute the network and perhaps to reconstitute it along different lines by altering the conditions for interconnection.Footnote 65

From the traditional international relations perspective, it makes no sense to speak of platforms or any other private corporations as sovereigns. Within the Westphalian international legal order, a sovereign state is, most minimally, an entity with a defined territory and a permanent population, the authority to govern its territory, and the capacity to enter into relations with other states.Footnote 66 Platform firms own premises within the territories of nation states and provide services to citizens of those states. Unlike state sovereigns, they lack authority to use physical force to assert the primacy of their laws or defend the sanctity of their borders.

Yet the growing practical sovereignty of platforms over many aspects of their users’ everyday lives blurs the boundaries that the traditional criteria impose. The network power and the network-making power of platforms are rooted in the very considerations of territory, population, and enforcement authority that platforms supposedly lack. Both technically and experientially, platform territories are clearly demarcated spaces.Footnote 67 Google and Facebook also operate substantial privatized Internet “backbone” infrastructures – the interconnection facilities that link different pieces of the Internet together.Footnote 68 Dominant platforms such as Facebook, Google, and Apple have user populations that number in the billions, vastly eclipsing the populations of all but the largest nation states.Footnote 69 The logic of platform membership is a network logic that relies on lock-in, and it persistently undercuts the strategies of exit and voice through which users police more ordinary commercial relationships. The benefits of membership accrue most visibly and predictably to users who maintain permanent and consistent membership. In part for that reason and in part because of the way that platform protocols mediate access to platform territories, the enforcement authority of platforms is real and immediate. Platform protocols structure the forms and flows of permitted conduct – e.g., sponsored search results, Facebook “likes” and “tags,” Twitter retweets – and enable swift imposition of internal sanctions that may range from content removal to account suspension or cancellation.Footnote 70

Sovereign authority also must be recognized as such by other sovereigns, and here the picture is muddier. On one hand, the dominant US platform firms actively and theatrically resist both incursions by nation states on their governance authority and various kinds of national and local regulation. On the other, platform firms have at times pursued more collaborative relationships with governments on both matters of national security and law enforcement and matters of technology policy more generally.Footnote 71 As noted earlier, Chinese platform firms have more systematically collaborated with government authorities both at home and abroad. On the global stage, platform firms increasingly practice both diplomacy and transnational networked policymaking in the manner of sovereign actors. Facebook’s privacy team travels the world meeting with government officials to determine how best to satisfy their concerns while continuing to advance Facebook’s own interests, much as a secretary of state and his or her staff might do.Footnote 72 Speaking at a recent network security conference, Microsoft’s president sketched a vision of a future in which platform firms function as “a trusted and neutral digital Switzerland,” safeguarding private communications against all types of sovereign incursion.Footnote 73

In short, the networked legal-institutional form allows for the possibility of nonstate but functionally sovereign power, and platforms represent an (arguable and emergent) example of such power. Concentrated stakeholder control of the networked communications infrastructure can produce, and perhaps is beginning to produce, an inversion of law- and policymaking authority, through which certain very powerful platform stakeholders become policy hubs in their own right. Theories of international relations that deny the possibility of private sovereignty are ill-equipped to respond to that possibility. Reconceptualizing transnational governance in a way that accounts for the network-making power of dominant platforms has become an increasingly important project.

Conclusion

Taking networks and standards seriously as organizing principles for a new legal-institutional form provides a helpful framework for understanding various features of transnational governance arrangements that have perplexed scholars across a number of disciplines. If what such institutions do is not “just” governance – if they represent an emergent form of law for the informational economy, and an increasingly important one – the disconnects between network-and-standard-based governance and rule-of-law ideals point to the beginning of an important institutional design project directed toward rendering them more accountable to global networked publics. That project also must contend with the growing practical sovereignty of platforms and with the challenges such sovereignty poses for the practical realization of rule-of-law aspirations within networked information environments.

3 Tech Dominance and the Policeman at the Elbow

Tim Wu Footnote *
Introduction

What drives the “digital tornado,” to use the evocative phrase coined by Kevin Werbach to describe the fierce, concentrated winds of technological change? One school of thought, neo-libertarian at its core, sees it as an entirely private process, driven by brave scientists, risk-taking entrepreneurs, and the capital markets. If government is relevant, it is merely through the guarantee of property rights and contract; otherwise it does best by staying out of the way.

But what if powerful firms seek to slow down, modulate, or co-opt the winds of change? The view just described takes this as an inherently hopeless task, for it is axiomatic that the rate of technological change is always accelerating, so that any firm or institution dependent on a given technology is automatically doomed to rapid obsolescence. Even well-meaning laws designed to catalyze innovation, at best, merely risk interfering with a natural progression toward a better technological future, hindering “the march of civilization.” As the general counsel of Standard Oil once put it, government cannot control the aggregation of private power: “You might as well endeavor to stay the formation of the clouds, the falling of the rains, or the flowing of the streams.”Footnote 1

This view, which was widely held from the early 2000s through the 2010s, has great relevance for the antitrust law, the subject of this chapter, and particularly for the parts of the law concerned with monopolization. For if we can indeed assume that the rate of technological innovation is always accelerating, it follows that there can be no such thing as lasting market power, the concern of the law. The dominant firm, necessarily dependent on an older technology, will be quickly surpassed and replaced by a new firm. In its strongest version, this view suggests that the antimonopoly portions of the antitrust law are obsolete.

From the 1980s through the 2010s, a series of powerful anecdotes supported this narrative, so much so that it became broadly accepted wisdom. After all, IBM, once thought lord of everything in the 1970s and 1980s, was bested by a college dropout named Bill Gates and a few of his buddies. Microsoft, in turn, was ravaged by a series of garage startups with goofy names like Yahoo!, Google, and Facebook. AOL rose and then fell like a rocket that fails to achieve orbit, as did other firms, such as MySpace, Netscape, and so on. The chaos and rapid change made it obvious to many that there could be no such thing as a lasting monopoly. A three-year-old firm was already middle-aged; a five-year-old firm was almost certainly near death, for “barriers to entry” were a twentieth-century concept. The best, indeed the only, thing the antitrust law should do was to stand well back and watch.

But what if the supposed new order were itself just a phase? What if the assumption of constantly accelerating technological change is wrong – or a function of market structure? As these questions may suggest, this chapter joins the challenge to the narrative just described. I say join because it is a larger conversation, not to be settled with a single chapter. The contribution of this chapter is to examine a foundational part of the narrative – the erosion of IBM’s dominance in the 1970s and the case of United States v. IBM.

Why focus on IBM? The decline of IBM’s dominance over the 1980s has long been a foundational part of the story described in the introduction, one that casts the “new economy” as an exception to the usual rules of industrial organization. As the story goes, IBM, bested by Microsoft, Compaq, Dell, Intel, and other competitors, serves as strong proof that lasting monopoly is unachievable in high-tech industries. Even the mighty IBM could not hold out, given the inevitable challenge from new inventions and innovators.

Unfortunately, that account tends to overlook the fact that IBM was not subject only to the forces of technological change, but also faced significant legal challenges, targeted directly at the exercise of monopoly power. This chapter suggests, with the benefit of decades of hindsight, that subjecting IBM to an antitrust lawsuit and trial actually catalyzed numerous transformational developments key to growth and innovation in the computing industries. The undeniable fact is that the “policeman at the elbow” can and does change conduct. The IBM case put a policeman at the elbow of the world’s dominant computer firm during a crucial period of change and development in the technology industries. This, I suggest, aided an ongoing process of transformational or Schumpeterian innovation.Footnote 2 Contrary to conventional wisdom, I also think that United States v. IBM is a valuable guide to enforcement policy in the technology-centered industries. This chapter, in short, is a revisionist history of the IBM case, one that casts serious doubt on the narrative of law’s irrelevance in aiding technological change.

The goal is not just to give the IBM case its due among those who study law and technology, but also to rehabilitate its reputation within antitrust law, where, admittedly, conventional wisdom has not been kind. United States v. IBM has been cast as among antitrust's lowest moments, and among the Justice Department's greatest mistakes. Robert Bork memorably dubbed the litigation "Antitrust's Vietnam"; John Lopatka termed it a "monument to arrogance"; while an appellate judge quipped that it "went on longer than World War II and probably cost as much."Footnote 3 Lasting from 1969 through 1982, the case included a six-year trial; the government's presentation took 104,000 pages of transcript, while for its part, IBM called 856 witnesses and cited 12,280 exhibits. Yet after years of trial, the Justice Department withdrew the case in 1982, without settlement or judicial remedy. The failure of the case to reach a verdict added ammunition to a Reagan-era movement against antitrust's "big case" tradition.Footnote 4 This has yielded, for many, one lasting "lesson" from IBM: that "big antitrust" – the industry-clearing Section 2 cases – should be used sparingly, at best, given the costs and potential for failure.

This chapter challenges that conventional wisdom, suggesting that the lawsuit and trial, despite never reaching a verdict, catalyzed developments essential to the growth and innovation of the computing industries.

I do not seek to defend everything about the IBM trial. It is admittedly difficult, if not impossible, to defend the manner in which the Justice Department and court managed the litigation and allowed it to last so long. It is also true, as the critics have charged, that the government could have had a clearer theory from the outset. However, in spite of the lack of a remedy, the historical materials made available since the litigation have made it clear that the antitrust case did substantially change IBM's behavior in specific ways. Perhaps the most important was the case's role in pushing IBM to unbundle its software offerings from its hardware, and therefore to leave room for the birth of an independent software industry. While the effects are less direct, the case seems also to have influenced the manner of IBM's PC launch and its conduct thereafter. These developments appear to have at least contributed to the thriving of an independent computer software industry, and later, to a new market for competing, IBM-compatible personal computers, as well as a slew of related, independent industries in storage, processing, printing, modems, and otherwise. During this period, IBM's avoidance of exclusive contracts and its failure to acquire or seek control of obvious targets (like Microsoft itself) all suggest a firm with "antitrust phobia," and thereby one that allowed competition to flourish.

Of course, there were a great number of other factors in the late 1970s affecting the software and hardware industries, and there is no claim here that the IBM antitrust litigation drove everything that happened during this era. However, many of the existing narratives are too quick to assume that the developments were “inevitable,” or, alternatively, all the byproduct of the particular genius of Bill Gates, a favorite thesis of the Microsoft-centered books. As discussed earlier, it is well understood by legal scholars that both firms and individuals may behave differently when enforcement is more likely, especially “with a policeman at the elbow.”Footnote 5 The operating theory of this chapter is that a pending monopolization case, which focuses on exclusionary and anticompetitive acts and scrutinizes efforts to dominate new industries, may affect firm conduct in recognizable ways. And the thesis is that the policeman standing at the elbow of the dominant computing firms during the 1970s and early 1980s had an important impact on the development of the software and personal computing industries.

This reexamination of IBM also has important implications for antitrust enforcement policy – for an enforcer interested in "big cases" filed under Section 2 of the Sherman Act in the tech industries. To say that antitrust has, in the late 2010s, regained relevance is to state the obvious. Since the turn of the millennium, when it seemed that market power was indeed always going to be fleeting, the tech industries have consolidated into a much smaller number of "big tech" firms. The question, therefore, of when to bring a big tech case and against whom has returned to first-order importance.

This chapter suggests three things. First, government lawyers should look for situations where it appears that a single firm is sitting as gatekeeper on what might, plausibly, be several innovative industries, and where breakups or lesser remedies might therefore unleash substantial growth. Specifically, beyond the usual search for monopoly power and anticompetitive practices, enforcers should be looking for "bundled" or "tied" markets that have the potential to be those nascent industries. The presence of stunted cottage industries might suggest an underlying potential. Second, the IBM case suggests the importance of a credible threat – that is, an investigation that seeks dissolution or other important remedies – so as to induce actual changes in conduct and deter anticompetitive behavior. Finally, the IBM case cautions enforcers to be concerned, but not overly concerned, with the costs of investigation and trial, which are multimillion-dollar questions when there are billions and possibly trillions at stake.

Background: The Company and the Market

The predecessor firm to International Business Machines was founded in 1911, as a manufacturer of machines for tabulating and data processing. By the 1960s, IBM had become a dominant manufacturer of general purpose, or “mainframe” computers designed to be used by corporations, government agencies, and other large institutions. “Big Blue” was, by then, the largest computer manufacturer in the world. By 1971, IBM had grown to 258,662 employees and $7.2 billion in annual revenues, and its IBM System/360 was the nation’s, indeed the world’s, most successful computer line.Footnote 6 It was a proud company, and its anthem went as follows:

EVER ONWARD – EVER ONWARD!
That’s the spirit that has brought us fame!
We’re big, but bigger we will be
We can’t fail for all can see
That to serve humanity has been our aim!

During the 1960s, the “mainframe” was the dominant computer design – one large computer, covered with blinking lights, designed to handle the most challenging tasks, or to serve many users at once. There was no such thing as a personal computer: At that point the cheapest computers sold by IBM cost over $100,000, and the more expensive units were priced in the millions.

IBM's design philosophy was characteristic of the era of its greatest success – it embodied the system design thinking of the 1950s and 1960s, which favored centralized, fully integrated designs, of which AT&T's "Bell System" was the exemplar.Footnote 7 Hence, IBM's mainframe computers integrated, or bundled, all hardware, software, and peripherals in one package. Of particular interest to us, software was not made available for sale or lease as an independent product: It was a service provided to buyers of IBM hardware.

IBM was not without competitors. The mainframe market was lucrative, and by the mid-1960s, IBM faced competition from seven smaller firms with their own mainframes – Burroughs, Univac, NCR, CDC, GE, RCA, and Honeywell – known as the "seven dwarfs." Firms like Univac typically targeted the lower end of the mainframe market, and attempted to win consumers with lower prices. In the early parts of mainframe history, all of the computers offered for sale were incompatible: That is, a firm usually bought all of its computers and peripherals from either IBM or Univac, for the computers were incapable of working together. Later, some firms began to offer peripheral hardware, like disk drives, designed to be "plug-compatible" with IBM's System/360 mainframes, which meant one could plug the peripherals into IBM's machines. Finally, other firms, like Control Data, focused on superior performance, and in particular, the supercomputer market, crafting computers faster (and even more expensive) than IBM's offerings.

The Case

Over the 1960s, there were long-standing complaints that IBM was maintaining its mainframe monopoly and scaring people away from supercomputers using anticompetitive, predatory, and unethical practices. IBM and its management had faced antitrust complaints before: Tom Watson Sr., IBM's longtime CEO, had been convicted of criminal antitrust violations back in 1913 (when he worked for NCR), and actually sentenced to prison.Footnote 8 What's more, in 1956, IBM entered into a consent decree with the Justice Department concerning its leasing practices.Footnote 9

Matters came to a head when, in 1968, rival Control Data sued IBM in a private antitrust action, focusing on its predatory conduct in the supercomputer and mainframe markets.Footnote 10 In 1969, after a long investigation, the Justice Department filed its own suit, also charging IBM with monopoly maintenance in violation of Section 2 of the Sherman Act.Footnote 11 According to the Justice Department, IBM had undertaken “exclusionary and predatory conduct” to maintain its dominant position in “general purpose digital computers.”Footnote 12 That was a market, according to the government’s estimates, in which IBM took some 70 percent of annual revenue.

Most important for our purposes were the government’s allegations surrounding software.Footnote 13 IBM was accused of tying software to “related computer hardware equipment” for a single price in a manner that the Justice Department alleged to be anticompetitive. IBM, it was alleged, also gave away software for free for “the purpose or with the effect of . . . enabling IBM to maintain or increase its market share.”Footnote 14

Beyond the software practices, the government also accused IBM of predatory practices. In particular, it accused IBM of developing specific “fighting machines” designed not to make a profit but rather to intimidate would-be competitors. It also accused IBM of vaporware practices, that is, announcing “future production and marketing [of certain products] when it believed or had reason to believe that it was unlikely to be able to produce and market such products within the announced time frame.”

For these violations, the government sought divestiture – that is, a full breakup of IBM into constituent parts. In that sense, the case was a classic example of the “big case” tradition in antitrust, in the model of the Northern Securities or Standard Oil litigation, whose goal was to restructure the industry entirely.Footnote 15

After some six years of discovery, the case finally went to trial in 1975. The early stages of the trial were somewhat complicated by the fact that IBM was also defending the private antitrust lawsuit brought by Control Data, the manufacturer of supercomputers. At the beginning, Control Data actively cooperated with the Justice Department, and had accumulated a massive database of alleged predatory acts perpetrated by IBM and its salesmen over the 1960s. It was information from this file that the Justice Department hoped to deploy in its lawsuit. However, in 1973, Control Data settled with IBM, and agreed to hand over its file.Footnote 16 IBM immediately destroyed (in some accounts burned) the files, thereby setting back the Justice Department’s discovery.

During the trial, IBM put on a vigorous defense, and spent untold millions of 1970s dollars defending the case.Footnote 17 The judge, David Edelstein, permitted the calling of a seemingly unlimited number of witnesses, for indefinite periods of time. One government witness testified for more than six months. Other trial days consisted of the reading of depositions into the record.Footnote 18 Many of these details were chronicled by legal writer Steven Brill, in a scathing piece that portrayed the entire trial as a complete fiasco, or, in his words, "a farce of such mindboggling proportions that any lawyer who now tries to find out about it … will be risking the same quicksand that devoured the lawyers involved in the case."Footnote 19 The trial continued for an astonishing six years, until the Justice Department finally rested in 1981.Footnote 20

But here we are interested less in the trial itself, and more in the effects of the litigation on IBM’s conduct and decision making. For during the lengthy trial and its aftermath, there is little dispute among business historians and journalists that IBM’s management was influenced by the very fact of being under investigation and being on trial. As Don Waldman writes, “the filing and prosecution of the antitrust case affected IBM’s business behavior for the next twenty years.”Footnote 21 Furthermore, as he writes, “lawyers gained control over even the most technical elements of IBM’s business.”Footnote 22 William Kovacic concurs: “DOJ’s law-suit exacted a high price from IBM. Along with the private lawsuits, the DOJ case caused IBM to elevate the role of lawyers in shaping commercial strategy and seems to have led the firm to pull its competitive punches.”Footnote 23 Reporter Paul Carroll, in his insider account Big Blues, gave a detailed portrayal of the effects of efforts to avoid strengthening the antitrust case by producing evidence of market share or anticompetitive conduct: “Lawyers, who were developing a stranglehold on the business, decided what could be said at meetings. No one could talk about IBM’s market share, or if they did, they’d talk in meaningless terms, describing the market for word processors as though it included everything from the supercomputer on down to paper and pencils. Executives couldn’t do any competitive analysis. Developers weren’t allowed to buy a competitor’s machine; they were just supposed to know what was in it.”Footnote 24

Critics have emphasized the sheer size of the case, which did last an astonishing thirteen years, at the end of which the Reagan Administration simply dropped it.Footnote 25 Was it, then, all just a waste of resources? That's no trip to the county courthouse, and no one can defend how the Justice Department managed the litigation, which became as bloated as a 1970s muscle car. On the other hand, consider the stakes: The computer and software industries were already bringing in billions in revenue and today are collectively worth trillions of dollars, encompassing many of the most valuable companies on earth. Small effects on this industry would and did have major long-term consequences. Nor was the IBM case, as G. David Garson writes, "without its effects" for the early computing industry.Footnote 26

Effects and Impact

It is one thing to suggest that the IBM trial may have caused IBM to behave more cautiously, eschew obvious anticompetitive conduct, and generally avoid strengthening Justice's case. Among other things, that may simply have weakened IBM as a competitor. But it seems more important to point out specific decisions and outcomes that seem to have been strongly influenced by the "antitrust phobia" resulting from being the subject of a Sherman Act case designed to break up the company. In this section I focus on three key moments: (1) IBM's unbundling of software from hardware, (2) its entry into the microcomputer market with the IBM PC, in partnership with Microsoft and others, and (3) its pattern of non-acquisitions in the aftermath of the PC's success.

Unbundling and the Rise of an Independent Software Industry

The clearest impact of the antitrust case was its contribution to the rise of an independent software industry. This development is no small matter, given that software today is a $1.2 trillion industry in the US ($3 trillion globally), employing 2.5 million people. However, most of the legal and economics critics of the IBM litigation have, unaccountably, failed to acknowledge the case's contribution to the software industry.

In the 1960s, it was IBM's practice, and the practice of most other mainframe manufacturers, to "bundle" software with hardware.Footnote 27 That is, software was sold as a service that was tied to the sale of its hardware – the IBM mainframe unit came with a contract by which IBM programmers wrote software customized to the needs of the customer. Any prepackaged software was meant merely to illustrate for the customer what software might look like – a model home for a prospective buyer.Footnote 28 There were those within IBM who thought that there might also be a profitable market for packaged software, but they were unable to persuade the firm to break from its traditional practices.Footnote 29 The software industry itself was a "small, offbeat, almost cottage industry" and there was, interestingly, little customer demand for independent software.Footnote 30

In the late 1960s, as it became apparent that the Justice Department was planning on bringing an antitrust lawsuit, IBM's legal team began to conclude, as a legal matter, that the software–hardware tie would be difficult to defend. A key figure was IBM's general counsel, Burke Marshall, who "saw bundling as a glaring violation of antitrust law" and suggested that, if forced to defend the tie, IBM "would lose."Footnote 31 Faced with this assessment, and hoping for a quick settlement, IBM President and CEO Thomas Watson Jr. made the decision, late in 1968, to begin the process of unbundling IBM's software offerings from its hardware offerings.Footnote 32 While the unbundling decision was made before the formal filing of the complaint, it was an effort to avoid the complaint's being filed; such preemptive efforts to settle are common in antitrust practice.Footnote 33 If there was once some controversy over why IBM unbundled, Watson's description of the decision in his autobiography, coupled with the writings of other IBM insiders, seems to have settled the matter.

On June 23, 1969 – sometimes called the “independence day” for the software industry – IBM, for the first time, made seventeen applications independently available for lease (not yet for sale). With the first release of prepackaged software products by the world’s dominant computer firm, the world of computing was never the same again. Richard Lilly, founder of a prominent 1970s software firm, said in 1989, “It created the industry we’re in.”Footnote 34

Consistent with Lilly's statement, most experts agree that IBM's unbundling was one key factor in the development of an independent software industry.Footnote 35 Nonetheless, a few caveats are in order. First, IBM was not the first firm to release software as a product – there were others, albeit very few. Rather, by becoming the largest firm to enter the industry itself, IBM played a role in validating the idea that software could be a product at all, and also the idea that software was valuable. Second, it is also true that there were other factors necessary for the birth of a software industry. One was IBM's own development of the System/360 mainframe architecture, which was a standardized platform; another, the rise of the minicomputer, a smaller and cheaper alternative to the mainframe. Nonetheless, in its action, IBM both gave life to the industry and, critically, reconceptualized what software was. Its unbundling, as Martin Campbell-Kelly writes, transformed "almost overnight the common perception of software from a free good to a tradable commodity."Footnote 36 The decision also had important consequences for IBM, both in terms of what it was and what it could control. As Burton Grad, of IBM, writes: "As a consequence of unbundling, IBM unquestionably became the largest supplier of computer software and services during the 1970s and 1980s. However, it never could control that business in the same way that it had (and has) dominated the mainframe hardware market."Footnote 37 The ongoing antitrust suit, moreover, prevented IBM from rebundling and risking new liability. Hence, the suit and the unbundling helped create the model of computing that drove development through the 1970s and beyond – the concept of a platform for which applications are developed.
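For readers more familiar with code than with 1960s procurement, the platform/application split can be put in concrete (if anachronistic) terms. The sketch below is purely illustrative – the names and interfaces are hypothetical, invented for this chapter, not IBM's actual System/360 services – but it shows the structural point: once a platform vendor publishes and commits to a stable interface, an application can be written, priced, and sold by an independent vendor that never sees the platform's internals.

```c
/* A minimal sketch of the platform/application model that unbundling
 * enabled. All names and interfaces here are hypothetical, for
 * illustration only. */
#include <stdio.h>
#include <string.h>

/* --- The "platform": a published, stable contract that the hardware
 *     vendor commits to honoring across machine generations. --- */
#define RECORDS 16
#define RECLEN  80

static char storage[RECORDS][RECLEN];   /* stand-in for a storage device */

int platform_write_record(int slot, const char *rec) {
    if (slot < 0 || slot >= RECORDS) return -1;  /* documented error code */
    strncpy(storage[slot], rec, RECLEN - 1);
    storage[slot][RECLEN - 1] = '\0';
    return 0;
}

int platform_read_record(int slot, char *out, int cap) {
    if (slot < 0 || slot >= RECORDS || cap <= 0) return -1;
    strncpy(out, storage[slot], cap - 1);
    out[cap - 1] = '\0';
    return 0;
}

/* --- An "independent software product": written against the published
 *     interface alone, by a vendor with no access to the platform's
 *     internals, and priced separately from the hardware. --- */
int main(void) {
    char line[RECLEN];
    platform_write_record(0, "payroll: 1,250.00");  /* application logic */
    platform_read_record(0, line, sizeof line);
    printf("ledger entry -> %s\n", line);
    return 0;
}
```

The economics of unbundling follow directly from this structure: so long as the contract holds, the application vendor's product survives hardware upgrades, which is what makes software "a tradable commodity" rather than a service tied to one machine.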

Estimating the economic importance of this development – and the contribution of IBM's unbundling – is not easy, if only because the transformation was so far-reaching. An important fact, however, is that the impact was not felt all at once. Hardware continued to be more important than software, and even into the 1980s, the software industry remained relatively small. One economic analysis suggests that "in 1987, the receipts of U.S. software programming service companies (SIC 7371) were $14.2 billion, the receipts for computer integrated systems design (SIC 7373) were $7.1 billion, and the receipts from prepackaged software (SIC 7372) sales were $5.9 billion."Footnote 38 Other developments, like the triumph of the personal computer over all aspects of business computing, were yet to come. Yet by the 2010s, providers of software, even narrowly construed, were responsible for over a trillion dollars in US revenue, and broadly construed, far more of the world's economic activity. In the words of Marc Andreessen, "software is eating the world."Footnote 39
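To put those numbers on one line – a back-of-the-envelope sum using only the figures quoted above, in nominal (inflation-unadjusted) dollars:

$$ \$14.2\,\text{B} + \$7.1\,\text{B} + \$5.9\,\text{B} = \$27.2\,\text{B} \quad \text{(total 1987 US software-related receipts)} $$

Set against the more than $1,000 billion in US software revenue by the 2010s, that is upward of a thirty-fold nominal increase in under three decades – which is both why the industry remained "relatively small" in the 1980s and why its later growth matters so much to any assessment of the case.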

We cannot know for sure whether, without IBM's decision, software would have become unbundled anyhow – that it was, in some way, the natural order of things. But it seems hard to deny that the antitrust case sped that development. And the failure to take account of the significance and effect of IBM's unbundling of its software is a major flaw in many of the critiques of the IBM litigation. Take, for example, the work of economic historians Robert W. Crandall and Charles L. Jackson and their highly skeptical analysis, published in 2011, of the effects of the IBM antitrust litigation on the computing industry. The two do mention that unbundling was among the Justice Department's goals, yet fail to note that the case actually achieved this goal. That omission allows them to draw the incorrect conclusion that: "It is difficult to see how an antitrust action brought in 1969 and dismissed in 1982 could have been a major contributor"Footnote 40 to dramatic changes in the industry.

* * *

If unbundling succeeded in transforming the history of computing, unfortunately for Watson and IBM, it failed in its goal of mollifying the Justice Department, which persisted with other claims of predation and anticompetitive behavior. It seems that the Justice Department continued to litigate based on its large collection of potentially predatory activities – in particular, related to pricing and misleading consumers about IBM's forthcoming supercomputers (a vaporware strategy). The lawsuit's persistence gave it influence over another transformational development: The arrival of the personal computer.

The Personal Computer and Microsoft

Another major development that occurred during the pendency of the IBM antitrust lawsuit was the rise of the personal computer industry. While Apple and competitors Commodore and Tandy may have ignited the market with the successful launch of the first mass-market personal computers, IBM would come to play a central role by developing and releasing the IBM PC, the personal computer whose design would come to dominate the market, with its radically modular, or open, design. This section examines the influence of IBM's "antitrust phobia" over the core decisions made during this period.

There are two dominant, and conflicting, narratives surrounding the development of the personal computer. The first lionizes IBM's rapid creation of a personal computer that would come to dominate the new industry. The second describes a blundering IBM, and credits Bill Gates for his brilliant outwitting of a myopic and foolish IBM based on an inability of the latter to understand the future. It isn't hard to see the contradiction: If IBM was too stupid or backward to understand the personal computer market, how did it come to dominate it in just a few years? And if Bill Gates had such a brilliant sense of the future, why did he only grudgingly come to understand the potential in selling a standardized operating system? But most mysterious of all: Once the IBM PC began to take off, why was IBM so inert in the defense of its position? Why, for example, did it decline to acquire control of the companies critical to the PC, as IBM's philosophy of centralized control would suggest? The point of this section is not to suggest that the traditional narrative is entirely wrong, but that it is incomplete. Particularly when it comes to Microsoft, it implies that IBM was simply too bone-headed to appreciate the future, rather than, potentially, concerned about its fate in the antitrust trial, and later, the possibility of reigniting antitrust litigation. There were so many dogs that did not bark during the launch of the PC and its aftermath, and their silence is consistent with a fear that too aggressive a posture might yield renewed antitrust pressures based on the monopolization of the PC markets or software.

A moment, first, on the IBM PC itself and its significance. IBM, as already discussed, was the great champion of the mainframe computer – "big iron," the monster machines designed to serve the needs of an entire business unit. (To be fair, mainframes were actually much smaller than the computers of the 1950s, which sometimes took up entire buildings, and were employed mainly by the military, but that is another matter.) Priced in the millions of dollars, they were obviously not intended for home usage. In the late 1960s through the 1970s, however, a number of researchers, particularly those associated with Xerox's Palo Alto Research Center, and hobbyists like Steve Jobs and Steve Wozniak, had come up with the idea of a "personal" computer, intended for individual usage or for small businesses. A group of firms, led by Apple and its introduction of the Apple II, and also including Commodore, Atari, Sinclair, Tandy, and others, had by the late 1970s proved both that personal computers could be produced for less than $2,000, and that there was a market for them.

IBM had experimented with smaller, entry-level computers, but they were generally smaller versions of its mainframes, and still priced far beyond the reach of any individual. In 1980, with the success of Apple and others, IBM decided to enter the microcomputer market in earnest. Within an impressive twelve months, it had introduced the IBM PC, coupled with an advertising campaign featuring Charlie Chaplin's "Little Tramp." More powerful than the Apple II or Commodore 64, and soon the beneficiary of a "killer app" in the VisiCalc and then Lotus 1-2-3 spreadsheet programs for businesses, the PC would soon become the best-selling personal computer in history to that point, and the dominance of its design had a profound influence on the subsequent history of computing.

What made that design so interesting is that the IBM PC was built in a manner unlike any other IBM product, a fact that would have enormous long-term consequences for the industry. As suggested earlier, IBM, whose system design traditions dated from the 1950s, had long been among the champions of a fully integrated system design. Like AT&T, the company whose design philosophy was the most similar, IBM had long believed that the best products required that every component and service be provided in-house. Its practice of bundling, stated differently, was not limited to software; it included all hardware as well, for IBM tended to source everything from itself.

The first IBM PC, however, was an extraordinarily radical break from that tradition, with a modular, open design that was essentially the opposite of IBM's closed and centralized philosophy. The IBM PC team (as we shall see, an experimental subunit of IBM proper) selected a hard drive manufactured by Seagate, a printer made by Epson, and a processor made by Intel, instead of using its own hardware. Most important over the long term, it selected an operating system provided by the company then named "Micro-Soft," a small startup headed by one Bill Gates, who was just twenty-four at the time and lacked a college degree. Gates, for his part, did not, in fact, write the operating system, but acquired it from a partner (Seattle Computer Products) to whom he paid a license fee of $25,000 (reportedly, he didn't mention to Seattle Computer Products that the customer was IBM, or that it had paid him $700,000).Footnote 41 In the end, when the PC came out, only the keyboard, screen, motherboard, and its hardware BIOS (Basic Input Output System) were actually produced by IBM's internal divisions. Of those, only the BIOS was proprietary.

There is one competitive detail of particular importance in the story of the IBM PC. When IBM contracted with Microsoft to provide the main operating system for the computer,Footnote 42 it neither bought the rights to the software nor required an exclusive license. The agreement was, instead, nonexclusive, leaving Microsoft free to sell its MS-DOS to other computer manufacturers as well. This nonexclusivity was a crucial factor in facilitating competition in the market for IBM-compatible PCs (like those from Compaq or Dell). But this is something of a tangent. The question this chapter is trying to assess is what role, if any, the antitrust investigation played in influencing IBM's original design and subsequent strategic conduct. Unlike with unbundling, the causation here is more diluted, but no less important.

First, we need to return to the original 1969 software unbundling and its effect on the development of the personal computer. Among the effects of IBM's prior unbundling decision was the creation of the conditions for the platform/application model that would become the foundation of the personal computer industry. Burton Grad, again, writes that "[u]nbundling mainframe software established a framework for the independent microcomputer software industry's later growth, which followed the model of separately priced offerings by major software suppliers."Footnote 43 In other words, perhaps the larger influence of unbundling in 1969 was in setting a model for firms like Microsoft, VisiCalc, and Lotus (the last two being among the first spreadsheet producers) to follow – firms that were a major factor in the success of the IBM PC. The unbundling also made Microsoft itself possible, as we shall discuss in greater detail in a moment.

The second matter was the question of how IBM would come to enter the market for personal computers. A dominant view is that the industry had too little revenue to attract or interest a mighty firm like IBM. However, there is evidence that, as early as the mid-1970s, IBM management, including its chairman, had gained interest in low-cost, entry-level computers, based on the accurate perception that lower-cost computers would come to be important, and also that rival Xerox might dominate that market. (By 1973, Xerox had developed an advanced personal computer, the Alto, which even had a graphical user interface; but it kept the Alto for internal use, and did not bring a personal computer to market until the early 1980s.) IBM considered the acquisitions that might help it forestall the emergence of competitors, but that path was discouraged by its lawyers, who were at that point involved in every level of decision making. For example, IBM considered the acquisition of Atari, and later considered working with Atari to produce its PC, but never did so – whether out of antitrust concerns or not is unclear.

The IBM PC's production was also influenced by IBM's internal restructuring. By the late 1970s, both in the hope of promoting innovation and in anticipation of a potential breakup, IBM had divided the firm. Frank Cary, IBM's CEO, created a separate division to contain all of IBM's non-mainframe businesses, and it was in this division that the PC was launched. More specifically, IBM created a series of independent business units designed to be innovative and also potential survivors of a breakup, just as AT&T concentrated its choice assets (or so it thought). It was one of these, the "entry level computing" division based in Boca Raton, Florida, that both proposed and then built the IBM PC (codenamed Project Chess).

It is this team that made the decisions to create the modular and open design already discussed. That decision, in turn, is said to have been inspired both by Wozniak's Apple II design and by the Chess team's desire to get around IBM's slow-moving, methodical culture and arrive at market quickly and cheaply. The evidence leaves little doubt that this was the primary reason behind the design. As Bill Lowe, the first head of the PC team, later said: "[A]t IBM it would take four years and three hundred people to do anything, I mean it's just a fact of life. And I said no sir, we can provide with product in a year. … To save time, instead of building a computer from scratch, [the team] would buy components off the shelf and assemble them – what in IBM speak was called 'open architecture.' IBM never did this."Footnote 44 Lowe explained that "the key decisions were to go with an open architecture, non-IBM technology, non-IBM software, non-IBM sales and non-IBM service."

But while speed was surely a predominant factor in the design, it does not provide an explanation for everything. It fails to explain certain matters, like the crucial agreement to a nonexclusive license for Microsoft's PC-DOS (an exclusive contract would not have been slower), and the decision to leave Microsoft with full control of the source code. Those who have studied that agreement and its creators have tended to focus on Bill Gates' side of the story, and attribute to him great savvy and foresight, based mainly on his own testimony.Footnote 45 While seeing the potential in the operating system and in nonexclusivity deserves enormous credit, it doesn't explain why IBM would agree to such a thing. It takes two parties to reach agreement, and as a small firm negotiating with the dominant computing power, Microsoft would not necessarily have had the final say. We also have the fact that IBM made an offer to buy a competing operating system, CP/M, outright.Footnote 46 What is lacking is a reason for IBM's non-assertion of its market power, and the implicit assumption that IBM was simply too stupid or short-sighted is an inadequate explanation.

The studies of IBM’s side of the deal and its motivations for agreeing to nonexclusivity are fewer, but more important for our purposes. Joseph Porac, in a comprehensive study of the deal, suggests a variety of factors. One was that IBM did not want to manage the code or develop the operating system itself, having had several bad experiences with doing so in recent years. But the other leading reason, according to Porac, was “antitrust phobia.” As he writes: “[A] reluctance to overcontrol small companies that could become potential competitors [was an] offshoot of the company’s antitrust phobia. Signing a nonexclusive contract with Microsoft that was clearly to Microsoft’s benefit was one way of avoiding any future claim that IBM was dominating the personal computer market.”Footnote 47 His assertion is echoed by Charles Ferguson and Charles Morris, who write: “[B]ecause of the still-pending antitrust action, IBM was wary of owning operating system software for fear of suits from software writers” for “IBM was extremely sensitive to even the appearance of having an unfair advantage over a small supplier.”Footnote 48

It is difficult, in the final analysis, to deny that IBM's antitrust lawyers strongly influenced elements of the legal and technical design of the IBM PC, for it was essentially antitrust-proof. Indeed, after IBM's successful rise to dominance in the PC market, there was some complaining about "IBM dominance" but no serious antitrust scrutiny or assertion that the firm had employed tying or other exclusionary strategies. The avoidance of even a hint of exclusivity in the design is notable. It is the non-assertion of market power and the obvious failures to protect itself from competition that suggest a firm exceptionally intent on avoiding anything that might strengthen the antitrust case against it. Unfortunately for IBM, but fortunately for the economy, the very design of the PC made it much harder for IBM to control the PC market.

The Non-Acquisitions, and More Dogs that Did Not Bark

In 1984, after the success of its PC, IBM was unquestionably the world's dominant computer firm. In fact, at that point, IBM was the world's single most valuable company, its market value rising to $79 billion by the end of that year. It had upended the personal computer industry, and achieved, for a while, a dominant share of that market. The last point we consider, in terms of the lasting effects of antitrust scrutiny, is IBM's astonishing failure to take some of the classic measures used by a monopolist to defend its market position.

As already suggested, IBM entered the market in a manner that can only be described as exceptionally stimulating to competition, and indeed in a manner that breathed life into firms and nascent industries. The level of competition ended up being much greater than IBM could possibly have anticipated. Some of this was surely a blunder on the part of IBM. A key matter was IBM's assumption that its ownership of the BIOS code would protect it from those seeking to create 100 percent IBM compatibles, based on the premise that it might use copyright infringement lawsuits to block any direct copies. However, Compaq and other firms effectively reverse-engineered and cloned IBM's BIOS, using clean-room techniques (that is, the engineers who wrote the new code had no access to the original code).Footnote 49 This development, which IBM clearly did not anticipate, created a major breach in the barriers to competition it thought it had, and yielded a flourishing PC market from the 1980s onward.
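For the non-engineer, the clean-room technique can be sketched in a few lines of code. The example below is purely illustrative – the function and its specification are hypothetical, not part of the real IBM BIOS interface – but it captures the two-team structure: one team studies the original's behavior and writes a functional specification; a second team, which has never seen the original code, reimplements from the specification alone, producing software that is behaviorally compatible yet copies none of the original's expression.

```c
/* A schematic sketch of clean-room reimplementation. The interface here
 * is hypothetical, invented for illustration; it is not the real IBM
 * BIOS interface. */
#include <stdio.h>

/* Specification (written by Team A, from observed behavior only):
 *
 *   video_print(s) writes the NUL-terminated string s to the console
 *   and returns the number of characters written; if s is NULL, it
 *   writes nothing and returns -1.
 *
 * Nothing about the original vendor's source code is recorded. */

/* Team B's independent implementation: written solely from the
 * specification above, so it is behaviorally compatible but shares
 * no expression with the original code. */
int video_print(const char *s) {
    int n = 0;
    if (s == NULL) return -1;
    while (s[n] != '\0') {
        putchar(s[n]);
        n++;
    }
    return n;
}

int main(void) {
    int written = video_print("hello from a compatible machine\n");
    printf("characters written: %d\n", written);
    return 0;
}
```

Because copyright protects expression rather than function, a comparison of Team B's code with the original reveals compatibility but not copying – which is why IBM's copyright strategy, unlike a patent or an exclusive contract, could not keep Compaq and the other cloners out.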

An easier way to protect its dominance would have been control over Microsoft's DOS,Footnote 50 and we have already discussed IBM's decision not to insist that Microsoft's DOS be exclusive to IBM, or even partially exclusive. That single decision, as many have noted, might have ensured a far longer domination of the PC market for the firm. But even if this might have been, at least in part, an oversight, it is hard to explain IBM's subsequent non-acquisition of PC-DOS, of Microsoft, or even of a share of Microsoft, without crediting some concern about renewed antitrust problems.

In the 1990s, reporters first revealed that, in fact, IBM was offered various opportunities to buy Microsoft or its software. In 1980, according to the Wall Street Journal, Microsoft offered to let IBM buy its operating system outright, an opportunity that IBM declined, for the reasons discussed above. Later, in 1986, after the IBM PC had successfully taken over personal computing, Bill Gates offered IBM the opportunity to buy 10 percent of Microsoft.Footnote 51 There is some reason to think that, while surely the price would have been steep, IBM might even have tried to acquire Microsoft in its entirety. IBM – still the most valuable firm in the world – demurred, concerned that its purchase would reignite antitrust concerns by being seen as "throwing its weight around."Footnote 52 Paul Carroll reports Bill Lowe of the PC team stating that "IBM didn't want to be seen as dominating the PC market too thoroughly."Footnote 53

Given the runaway success of the PC, it is also plausible that an IBM behaving more like Facebook, Microsoft, or other less inhibited giants would have sought to either acquire or clone the other suppliers, including hard drive manufacturer Seagate and printer maker Epson. However, it made no efforts whatsoever to acquire these firms, which would go on to earn most of the profits in the PC industry.

* * *

The various design decisions surrounding the IBM PC and IBM's subsequent failure to defend its position are of such great long-term importance for both the computer industry and the entire high-tech economy that they must be carefully examined.Footnote 54 We have already noted that the nonproprietary design left the market open to challengers to the IBM PC, which manufactured cheaper but compatible computers, also using MS-DOS. Over the long run, this development would erode IBM's early dominance. But more generally, an important consequence was the fostering of a whole series of new and independent industries: an industry for hard drives and other forms of storage, another for processors and memory, and, of course, the market for personal computer software. It is true that these industries existed previously, but their growth was catalyzed by the IBM PC and its later competitors. It is interesting to contrast the PC, in this respect, with Apple, the previous leader, which was slightly less open than IBM and which, with the launch of the Apple Macintosh, retreated to an entirely closed architecture with most components manufactured by Apple itself. In an alternative universe without IBM, had some version of the Apple Macintosh become the dominant business computer, it is very possible that the storage and software industries, in particular, would have been greatly reduced in independence and significance.

Lessons for Enforcers

Today, the concentration of the high-tech industries has become, once again, a matter of major public concern. That's why the IBM case merits careful study by enforcers, for it was the first major tech antitrust case of the computing era, and a neglected member of the big three cases of the late twentieth century (IBM, AT&T, and Microsoft). Close study of the case offers guidance and insight into what might be the dynamic justifications for bringing a major antitrust lawsuit in the 2020s.

IBM was just one company. Yet, in retrospect, it was sitting atop what turned out to be an enormous number of important industries, from software through storage, processing and operating systems. The subsequent flourishing of the industries that were once in areas controlled, or potentially controlled, by IBM has unquestionably transformed the modern economy.

Given the complexity of any historical period, it is almost always impossible to claim that one factor – one leader, one case, one invention, or one critical decision – changed everything. However, it is not impossible to claim a contribution or a catalyst. And even if we don’t know what would have happened in the absence of antitrust – if, in fact, software might have been unbundled anyhow, or the PC might have developed more or less the same – there is good reason to believe that the pervasive evidence of antitrust phobia hastened the outcome.

If an enforcer wanted to duplicate the catalytic effects of the IBM case, what would he or she do? A close look at the history suggests that antitrust enforcers, particularly when thinking about big Section 2 cases, should spend time asking what industries – or potential future industries – the dominant firm sits on top of or can potentially control the development of. The intuition is that there is a difference, say, between a shoe monopoly and a monopoly on sidewalks, given that the latter might be an input into, and entwined in, so many other businesses. This suggests that antitrust enforcers should, when considering cases, begin by thinking about how many markets are influenced by or dependent on the product or service.

A similar logic suggests prioritizing Section 2 cases where the problem isn't just competition in the primary market, but where competition in adjacent markets looks to be compromised or threatened. The long cycles of industrial history suggest that what are at one point seen as adjacent or side industries can sometimes emerge to become of primary importance, as in the example of software and hardware. Hence, the manner in which something is sold can make a big difference. In particular, "bundling" or "tying" one product to another can stunt the development of an underlying industry. Even a tie that seems like "one product" at the time the case is litigated, as, for example, software and hardware were in the 1960s, or physical telephones and telephone lines, might contribute to such stunting. If successful, an antitrust prosecution that breaks the tie and opens a long-dominated market to competition may have very significant long-term effects.

While all things are clearer in retrospect, the existence of a cottage or nascent industry might serve as a clue that a vibrant industry could emerge from breaking the tie. If some firms manage to survive even in the presence of a tie, that suggests the possibility that there is far more potential there. In the example of the software industry, as we've suggested, the concept of prepackaged software existed, and a scattering of firms sold it and made a profit thereby. That pattern suggested plenty of room to grow, even if it would take many years for prepackaged software to develop into the thriving industry it became.

A last lesson that might be gleaned from the IBM litigation is this: If the government wants to spur a change in conduct, it should seek maximum remedies in the cases it brings. In IBM, the Justice Department brought a complaint seeking dissolution – a full structural remedy. IBM’s concern about such a remedy had a major effect on its thinking and decision making, and seemed to yield the antitrust phobia that proved important over the 1970s. That suggests that the Justice Department cannot induce improved behavior without making credible threats of punishment at the outset of the case. In particular, there is a risk that a case that begins seeking a mere consent decree might not have any real effect on firm conduct.

A full discussion of what should go into case selection is beyond the scope of this or perhaps any article. But the final caveat or warning is this: While I think it is correct to suggest that the costs of litigation should not be fetishized – the millions matter less when there are billions at stake – one thing that does matter is the effect of litigation on the company itself. IBM was, arguably, a reduced entity during the antitrust case (although it remained extremely profitable, and actually quadrupled its stock price). Knowing that the effect of litigation is to damage the firm, the enforcer needs to have some confidence that the damage will be salutary – that is, that it will yield room for competitors to take the ball and run with it. While "do no harm" cannot be the mantra of antitrust enforcement, no one should seek outcomes that make things worse. And this is what makes timing all-important, and suggests that enforcement policy will always be a matter requiring, above all, good judgment and a clear eye toward what one hopes to achieve.

Footnotes

1 The Regulated End of Internet Law, and the Return to Computer and Information Law?

* I wish to thank the contributors to this edited collection and its editor, the panelists and participants at the Wharton School symposium “After the Digital Tornado” on 10 November 2017, and participants at the Georgetown Technology Law Review symposium on 23 February 2018 on “Platform Law,” especially Julie Cohen and Mireille Hildebrandt. I also wish to thank the contributors and participants at the Münster Institute for Information and Telecommunications Law twentieth anniversary symposium in Berlin, Germany on 15 July 2017, especially Bernd Holznagel, and the contributors and participants at the eleventh annual Gikii symposium in Winchester, England, on 15 September 2017, especially Lilian Edwards, Andres Guadamuz, Paul Bernal, Daithi MacSithigh, and Judith Rauhofer. All errors and omissions remain my own.

1 Eastham, Laurence (2011) Interview with SCL’s New President, Richard Susskind, Society for Computers and Law, 23 August, at www.scl.org/articles/2191-interview-with-scl-s-new-president-richard-susskind. See also Susskind, Richard (2018) Sir Henry Brooke – A Tribute, Society for Computers and Law, at www.scl.org/articles/10221-sir-henry-brooke-a-tribute.

2 A good introduction is Reed, Chris (2010) Making Laws for Cyberspace, Oxford: Oxford University Press, especially at pp. 29–47.

3 See generally for European law, Edwards, Lilian (ed., 2018) Law, Policy, and the Internet, Oxford: Hart Publishing; for US law, Goldman, Eric (2018) Internet Law Cases and Materials. For an annotated bibliography of classic academic legal writing, see Marsden, Chris (2012) Internet Law, Oxford Bibliographies Online, New York: Oxford University Press.

4 For early UK cases, see Athanasekou, P. E. (1998) Internet and Copyright: An Introduction to Caching, Linking and Framing, Journal of Information, Law and Technology (JILT); Opinion of Lord Hamilton in The Shetland Times Ltd v. Dr Jonathan Wills and Zetnews Ltd. Court of Session, Edinburgh 24 October 1996, at www.linksandlaw.com/decisions-87.htm.

5 European Commission (2017) Communication on Tackling Illegal Content Online: Towards an Enhanced Responsibility of Online Platforms; European Commission (2018) Recommendation on Measures to Effectively Tackle Illegal Content Online, published 1 March.

6 Belli, Luca and Zingales, Nicolo (eds. 2017) Platform Regulations: How Platforms Are Regulated and How They Regulate Us, FGV Direito Rio, Brazil, at https://bibliotecadigital.fgv.br/dspace/handle/10438/19402.

7 An excellent review is provided by chapters 1–3 in Murray, Andrew and Reed, Chris (2018) Rethinking the Jurisprudence of Cyberspace, Cheltenham: Edward Elgar.

8 Holznagel, Bernd and Hartmann, Sarah (2017) Do Androids Forget European Sheep? – The CJEU’s Concept of a ‘Right to Be Forgotten’ and the German Perspective, in Russell Miller (ed.) Privacy and Power – A Transatlantic Dialogue in the Shadow of the NSA-Affair, Cambridge: Cambridge University Press, pp. 586–614.

9 See, for instance, Frischmann, Brett M. (2005) An Economic Theory of Infrastructure and Commons Management, Minnesota Law Review, Vol. 89, 917–1030, at https://ssrn.com/abstract=588424, discussed in Marsden, Chris (2017) Network Neutrality: From Policy to Law to Regulation, Manchester: Manchester University Press.

10 Werbach, Kevin (1997) Digital Tornado: The Internet and Telecommunications Policy, Federal Communications Commission Office of Plans and Policies Working Paper 29. Washington, FCC.

11 See most recently, Mahler, Tobias (2019) Generic Top-Level Domains: A Study of Transnational Private Regulation (Elgar Studies in Law and Regulation) Cheltenham: Edward Elgar. See also Marsden, Chris (forthcoming) Transnational Information Law, in Peer Zumbansen (ed.) Oxford Handbook of Transnational Law, Oxford: Oxford University Press.

12 For co-regulation, see Senden, Linda A. J. (2005) Soft Law, Self-regulation and Co-regulation in European Law: Where do they Meet?, Electronic Journal of Comparative Law, Vol. 9, No. 1, January 2005, at https://ssrn.com/abstract=943063; Marsden, Chris (2011) Internet Co-Regulation, Cambridge: Cambridge University Press. Historically, see the Max Planck Institute study: Collin, Peter (2016) Justice without the State within the State: Judicial Self-Regulation in the Past and Present, Moderne Regulierungsregime, Vol. 5, IX, 373.

13 Clark, David D., Field, Frank, and Richards, Matt (2010) Computer Networks and the Internet: A Brief History of Predicting Their Future, CSAIL Working Paper, at http://groups.csail.mit.edu/ana/People/DDC/Working%20Papers.html; first international links were from the United States to Norway, see Brown, Ian (ed., 2012) Research Handbook on Governance of the Internet, Cheltenham: Edward Elgar, chapter 1.

14 Goldman, Eric S. (1994) Cyberspace, the Free Market and the Free Marketplace of Ideas: Recognizing Legal Differences in Computer Bulletin Board Functions, Hastings Comm/Ent Law Journal, Vol. 16, 87.

15 Clark, David D. and Blumenthal, Marjory S. (2011) The End-to-End Argument and Application Design: The Role of Trust, Federal Communications Law Journal, Vol. 16, 357–70.

16 De Sola Pool, Ithiel (1983) Technologies of Freedom, Cambridge, MA: Harvard University Press. Pool analyzed the confrontation between the regulators of the new communications technology and the First Amendment, presciently forecasting the legal conflict that the Internet created between freedom of expression and government control/censorship. See also Kahin, Brian and Keller, James H. (eds., 1997) Coordinating the Internet, Cambridge, MA: MIT Press.

17 Kahin, Brian and Nesson, Charles (eds., 1997) Borders in Cyberspace: Information Policy and the Global Information Infrastructure, Cambridge, MA: MIT Press.

18 Reidenberg, Joel (1993) Rules of the Road for Global Electronic Highways: Merging the Trade & Technical Paradigms, Harvard Journal of Law and Technology, Vol. 6, 287, at http://jolt.law.harvard.edu/articles/pdf/v06/06HarvJLTech287.pdf.

19 Reidenberg, Joel (1998) Lex Informatica: The Formulation of Information Policy Rules through Technology, Texas Law Review, Vol. 76, 553–93.

20 Johnson, D. and Post, D. (1996) Law and Borders: The Rise of Law in Cyberspace, Stanford Law Review, Vol. 48, 1367–75.

21 Lessig, Lawrence (2006) Code and Other Laws of Cyberspace, New York: Basic Books. Revised 2nd ed. titled Code v2.0, at http://codev2.cc/.

22 Goldsmith, Jack L. (1998) Against Cyberanarchy, University of Chicago Law Review, Vol. 65, 1199.

23 Mayer-Schönberger, Viktor and Foster, Teree E. (1997) A Regulatory Web: Free Speech and the Global Information Infrastructure, Michigan Telecommunications and Technology Law Review, Vol. 3, 45, at www.mttlr.org/volthree/foster.pdf.

24 Samuelson, P. (2000) Five Challenges for Regulating the Global Information Society, in Marsden, Chris (ed.) Regulating the Global Information Society, London: Routledge.

25 Marsden, C. (2010) Network Neutrality: Towards a Co-Regulatory Solution, London: Bloomsbury, at 216–19.

26 Easterbrook, Frank H. (1996) Cyberspace and the Law of the Horse, University of Chicago Legal Forum, 207.

27 Lessig, Lawrence (1999) The Law of the Horse: What Cyberlaw Might Teach, Harvard Law Review, Vol. 113, 501.

28 Sommer, Joseph H. (2000) Against Cyberlaw, Berkeley Technology Law Journal, Vol. 15, 3, at www.law.berkeley.edu/journals/btlj/articles/vol15/sommer/sommer.html.

29 Kerr, Orin S. (2003) The Problem of Perspective in Internet Law, Georgetown Law Journal, Vol. 91, 357, at http://ssrn.com/abstract=310020.

30 Guadamuz, Andrés (2004) Attack of the Killer Acronyms: The Future of IT Law, International Review of Law, Computers & Technology, Vol. 18, No. 3, 411–24.

31 Larouche, Pierre (2008) On the Future of Information Law as a Specific Field of Law, TILEC Discussion Paper No. 2008-020, at http://ssrn.com/abstract=1140162.

32 Goldman, Eric (2008) Teaching Cyberlaw, Santa Clara University School of Law Legal Studies Research Papers Series Working Paper No. 08-57, at http://ssrn.com/abstract=1159903.

33 Murray, A. (2013) Looking Back at the Law of the Horse: Why Cyberlaw and the Rule of Law are Important, SCRIPTed, Vol. 10, No. 3, 310, at http://script-ed.org/?p=1157.

34 See, for instance, Marsden, Chris (2012) Oxford Bibliography of Internet Law, New York: Oxford University Press.

35 Lemley, Mark and McGowan, David (1998) Legal Implications of Network Economic Effects, California Law Review, Vol. 86, 479, at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=32212.

36 Lessig, Lawrence (1999) Code and Other Laws of Cyberspace, New York: Basic Books.

37 Benkler, Yochai (2002) Coase’s Penguin, or Linux and the Nature of the Firm, Yale Law Journal, Vol. 112, 369, at www.benkler.org/CoasesPenguin.html.

38 Wu, Tim (2003) When Code Isn’t Law, Virginia Law Review, Vol. 89, 679, at papers.ssrn.com/sol3/papers.cfm?abstract_id=413201.

39 Zittrain, Jonathan (2006) The Generative Internet, Harvard Law Review, Vol. 119, 1974, at papers.ssrn.com/sol3/papers.cfm?abstract_id=847124.

40 Lemley, Mark, Menell, Peter S., Merges, Robert P., and Samuelson, Pamela (2011) Software and Internet Law, Gaithersburg, MD: Aspen Law & Business, 4th ed.; Thierer, Adam (ed., 2003) Who Rules the Net? Internet Governance and Jurisdiction, Washington, DC: Cato Institute.

41 Akdeniz, Yaman, Walker, Clive, and Wall, David (eds., 2001) The Internet, Law and Society, London: Longman; Edwards, Lilian and Waelde, Charlotte (eds., 2009) Law and the Internet, 3rd ed., Oxford: Hart Publishing; Marsden, Chris (ed., 2000) Regulating the Global Information Society, London: Routledge; Hedley, S. (2006) The Law of Electronic Commerce and the Internet in the UK and Ireland, London: Routledge-Cavendish.

42 Marsden, Chris (2004) Hyperglobalized Individuals: the Internet, Globalization, Freedom and Terrorism, Foresight, Vol. 6, No. 3, 128–40.

43 Marsden, Chris (2014) Hyper-Power and Private Monopoly: the Unholy Marriage of (Neo) Corporatism and the Imperial Surveillance State, Critical Studies in Media Communication, Vol. 31, No. 2, 100–108, at www.tandfonline.com/doi/full/10.1080/15295036.2014.913805.

44 Odlyzko, Andrew (2003) Pricing and Architecture of the Internet: Historical Perspectives from Telecommunications and Transportation, December 29, at www.dtc.umn.edu/~odlyzko. For Yahoo!’s rise and fall, see https://en.wikipedia.org/wiki/Yahoo!#Expansion.

45 See, generally, Malik, Om (2003) Broadbandits: Inside the $750 Billion Telecom Heist, Wiley & Sons.

46 Pub. Law No. 107-204, 15 U.S.C. §§ 7201 et seq. (2002).

47 Wren-Lewis, Simon (2015) The Austerity Con, London Review of Books, Vol. 37, No. 4, 9–11. UK neoliberal austerity lasted until 2018, in contrast to the US under President Obama’s stimulus programme from 2010. For a US perspective, see Paul Krugman (2015) The Austerity Delusion, The Guardian, 29 April, at www.theguardian.com/business/ng-interactive/2015/apr/29/the-austerity-delusion; Romano, Roberta (2004) The Sarbanes-Oxley Act and the Making of Quack Corporate Governance, New York University Law and Economics Working Paper 3, at http://lsr.nellco.org/nyu_lewp/3.

48 47 U.S.C. § 230. See Cannon, Robert (1996) The Legislative History of Senator Exon’s Communications Decency Act: Regulating Barbarians on the Information Superhighway, Federal Communications Law Journal, Vol. 51, 74.

49 47 U.S.C. § 230(c)(1).

50 The Communications Decency Act was Title V of the Telecommunications Act of 1996; Section 222 of that Act deals with privacy and transparency.

52 Reno v. ACLU, 521 U.S. 844 (1997), overturned s.223. Rappaport, Kim L. (1997) In the Wake of Reno v. ACLU: The Continued Struggle in Western Constitutional Democracies with Internet Censorship and Freedom of Speech Online, American University International Law Review, Vol. 13, 765.

53 Guadamuz, Andrés (2018) Chapter 1: Internet Regulation, in Lilian Edwards (ed.) Law, Policy and the Internet, Oxford: Hart/Bloomsbury Publishing.

54 DrudgeReport Archives (1998), Newsweek Kills Story On White House Intern, 17 January, at www.drudgereportarchives.com/data/2002/01/17/20020117_175502_ml.htm.

55 Upheld in United States v. American Library Association, 539 U.S. 194 (2003).

56 Ashcroft v. American Civil Liberties Union, 542 U.S. 656 (2004).

57 Goldman, Eric (2018) An Overview of the United States’ Section 230 Internet Immunity, in Giancarlo Frosio (ed.) Oxford Handbook of Online Intermediary Liability, at https://ssrn.com/abstract=3306737.

58 Holznagel, B. (2000) Responsibility for Harmful and Illegal Content as Well as Free Speech on the Internet in the United States of America and Germany, in C. Engel and H. Keller (eds.) Governance of Global Networks in Light of Differing Local Values, Baden-Baden: Nomos.

59 Yen, Alfred (2000) Internet Service Provider Liability for Subscriber Copyright Infringement, Enterprise Liability and the First Amendment, Georgetown Law Journal, Vol. 88, 1.

60 Holznagel, supra note 58.

61 Frydman, B. and Rorive, I. (2002) Regulating Internet Content Through Intermediaries in Europe and the USA, Zeitschrift für Rechtssoziologie, Vol. 23, No. 1, July 2002, Lucius & Lucius.

62 CORDIS (1996) The ‘Bit Tax’: The Case for Further Research, at https://cordis.europa.eu/news/rcn/6988/en. The bit tax is a tax on the transmission of information by electronic means – literally, on bits.

63 Dickson, Annabelle (2018) UK to Introduce ‘Google Tax’ in 2020, Politico, 29 October, at www.politico.eu/article/uk-to-bring-in-digital-services-tax-in-2020/.

64 Also known as the Information and Communications Services Act (Informations- und Kommunikationsdienstegesetz – IuKDG). See IRIS Legal Observations of the European Audiovisual Observatory, IRIS 1997-8:11/16, at http://merlin.obs.coe.int/iris/1997/8/article16.en.html.

65 Bender, G. (1998) Bavaria v. Felix Somm: The Pornography Conviction of the Former CompuServe Manager, IJCLP Vol. 1, at www.digital-law.net/IJCLP/1_1998/ijclp_webdoc_14_1_1998.html.

66 See IP/97/313 Brussels, 16 April 1997: Electronic Commerce: Commission presents framework for future action, at http://europa.eu/rapid/press-release_IP-97-313_en.htm?locale=en.

67 Frydman and Rorive, supra note 61, at 54.

68 Marsden, C. (2011) Network Neutrality and Internet Service Provider Liability Regulation: Are the Wise Monkeys of Cyberspace Becoming Stupid? Global Policy, Vol. 2, No. 1, 112.

69 Frydman and Rorive, supra note 61, at 56.

70 Ibid at 59.

71 Balkin, Jack and Zittrain, J. (2016) A Grand Bargain to Make Tech Companies Trustworthy? The Atlantic, October, https://perma.cc/WW5N-98UZ.

72 Perrin, W. and Woods, L. (2018) Harm reduction in social media – what can we learn from other models of regulation? Carnegie Trust, www.carnegieuktrust.org.uk/blog/harm-reduction-social-media-can-learn-models-regulation/. For criticism, see Smith, Graham (2018) Take care with that social media duty of care, Inforrm Blog, 23 October, https://inforrm.org/2018/10/23/take-care-with-that-social-media-duty-of-care-graham-smith/.

73 Polygram International Publishing v. Nevada/TIG, Inc., 855 F. Supp. 1314, 1317-18 (D. Mass. 1994).

74 Yen, supra note 59, at 19.

75 Viacom International, Inc. v. YouTube, Inc., No. 07 Civ. 2103, US District Court for the Southern District of New York, settled in 2013.

76 Reidenberg, J. (2005) Technology and Internet Jurisdiction, University of Pennsylvania Law Review, Vol. 153, 1951, at http://ssrn.com/abstract=691501.

77 Case C-362/14, Maximillian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650.

78 Directive 95/46/EC (the Data Protection Directive).

79 For which, see the Electronic Privacy Directive 2002/58/EC, which specifically regulates personal data protection on electronic networks.

80 Regulation EU 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L119. See Veale, Michael and Edwards, Lilian (2018) Clarity, Surprises, and Further Questions in the Article 29 Working Party Draft Guidance on Automated Decision-Making and Profiling, Computer Law & Security Review, Vol. 34, No. 2, 398–404, at http://dx.doi.org/10.2139/ssrn.3071679. See also O’Conor, M. (2018) GDPR Is for Life Not Just 25th of May, Computers and Law, 18 April, at www.scl.org/blog/10192-gdpr-is-for-life-not-just-the-25th-of-may.

81 Office of National Statistics (2012) Internet Access – Households and Individuals, 2012, Figure 1: Households with Internet Access, 1998 to 2012, at www.ons.gov.uk/ons/rel/rdit2/internet-access–households-and-individuals/2012/chd-figure-1.xls.

82 OECD (2017) Digital Economy Outlook, Paris: OECD, at www.oecd.org/internet/ieconomy/oecd-digital-economy-outlook-2017-9789264276284-en.htm.

83 Eurostat (2018) Archive: Internet Access and Use Statistics – Households and Individuals, Revision as of 15:34, 28 March, at https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Internet_access_and_use_statistics_households_and_individuals&oldid=379591. Drawing on a range of official statistics, the best current source is Internet World Stats (2017) Internet User Statistics, Facebook & 2017 Population for the 28 European Union member states, at www.internetworldstats.com/stats9.htm.

84 Directorate-General for Communications Networks, Content and Technology (European Commission), Programme in Comparative Law and Policy (2004) Self-Regulation of Digital Media Converging on the Internet: Industry Codes of Conduct in Sectoral Analysis, Final Report of IAPCODE Project for European Commission DG Information Society Safer Internet Action Plan, 30 April, Section 12.7, at https://publications.europa.eu/en/publication-detail/-/publication/b7c998d9-75d6-464d-9d91-d59aa90a543c/language-en.

85 Marsden, C. (2017) How Law and Computer Science Can Work Together to Improve the Information Society: Seeking to remedy bad legislation with good science, Communications of the ACM, Viewpoint: Law and Technology.

86 Marsden, C., Cave, J. and Simmons, S. (2008) Options for and Effectiveness of Internet Self- and Co-Regulation, TR-566-EC, Santa Monica, CA: RAND Corporation.

87 European Commission (2017) Antitrust: Commission Fines Google €2.42 Billion for Abusing Dominance as Search Engine by Giving Illegal Advantage to Own Comparison Shopping Service, Factsheet, Brussels, 27 June 2017, at http://europa.eu/rapid/press-release_MEMO-17-1785_en.htm.

88 European Commission (2017) Speech by Johannes Laitenberger, Director-General for Competition, EU competition law in innovation and digital markets: fairness and the consumer welfare perspective, at http://ec.europa.eu/competition/speeches/text/sp2017_15_en.pdf.

89 Sampson, Anthony (1973) The Sovereign State of ITT, New York: Stein and Day.

90 Stopford, John and Strange, Susan (1991) Rival States, Rival Firms, Cambridge: Cambridge University Press; Duchacek, Ivo D. (1984) The International Dimension of Subnational Self-Government, Publius: The Journal of Federalism, Vol. 14, No. 4, 5–31, at https://doi.org/10.1093/oxfordjournals.pubjof.a037513.

92 https://transparencyreport.google.com/copyright/overview – noting many companies have such reports, linking to 42 others (some have since merged or discontinued reports).

94 H. Rept. 107–803 – Legislative Review Activities of the Committee On International Relations 107th Congress (2001-2002). See, generally, for US Internet co-regulation, Weiser, P. (2009) The Future of Internet Regulation, U.C. Davis Law Review, Vol. 43, 529–90.

95 Marsden, Internet Co-Regulation, supra note 12.

96 See COM/2002/275, COM/2002/0278, COM/2002/704.

97 Inter-Institutional Agreement on Better Law-Making (OJ C 321, 31.12.2003), pp. 1–5.

98 European Union (2016) Better Regulation, at www.consilium.europa.eu/en/policies/better-regulation/.

99 COM/2005/97.

100 SEC /2005/791.

101 This is now codified in the new Interinstitutional Agreement between the European Parliament, the Council of the European Union and the European Commission on Better Law-Making OJ L 123, 12.5.2016, pp. 1–14, at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:L:2016:123:FULL&from=EN.

102 Price, M. and Verhulst, S. (2005) Self-Regulation and the Internet, Amsterdam: Kluwer.

103 Latzer, Michael, Price, Monroe E., Saurwein, Florian and Verhulst, Stefaan G. (2007) Comparative Analysis of International Co- and Self-Regulation in Communications Markets, Research report commissioned by Ofcom.

104 Marsden, Internet Co-Regulation, supra note 12, at pp. 60, 147, 222.

105 Ofcom (2008) Identifying Appropriate Regulatory Solutions: Principles for Analysing Self- and Co-Regulation, 10 December.

106 Newman, Abraham L. and Bach, David (2004) Self-Regulatory Trajectories in the Shadow of Public Power: Resolving Digital Dilemmas in Europe and the United States, Governance: An International Journal of Policy, Administration, and Institutions, Vol. 17, No. 3, July 2004, 388.

107 Froomkin, A. Michael (2000) Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution, Duke Law Journal, Vol. 50, 17, at www.law.miami.edu/~froomkin/articles/icann.pdf.

108 Marsden, Cave and Simmons, supra note 86.

109 Reed, Chris (2018) How Should We Regulate Artificial Intelligence? Philosophical Transactions of the Royal Society A, Vol. 376, 20170360.

110 See, for instance, Frischmann, Brett M. (2005) An Economic Theory of Infrastructure and Commons Management, Minnesota Law Review, Vol. 89, 917–1030, https://ssrn.com/abstract=588424.

111 Marsden, C. (2018) Oral Evidence to Lords Communications Committee, “The internet: to regulate or not to regulate?” Parliamentlive.tv, 24 April, at https://parliamentlive.tv/Event/Index/4fac3ac3-3408-4d3b-9347-52d567e3bf62.

112 White, Sharon (2018) Tackling online harm – a regulator’s perspective: Speech by Sharon White to the Royal Television Society, 18 September, at www.ofcom.org.uk/about-ofcom/latest/media/speeches/2018/tackling-online-harm.

113 Merging CESG (the information security arm of GCHQ), the Centre for Cyber Assessment (CCA), Computer Emergency Response Team UK (CERT UK) and the cyber-related responsibilities of the Centre for the Protection of National Infrastructure (CPNI).

115 Marsden, C. (2018) Prosumer Law and Network Platform Regulation: The Long View Towards Creating Offdata, Georgetown Technology Law Review, Vol. 2, No. 2, pp. 376–98.

116 Oftel (1995) Beyond the Telephone, the TV and the PC: Consultation Document. Note that further consultations followed, the last in 1998; these are seen as a forerunner to the convergent communications agenda for government and, eventually, Ofcom. See Barnes, Fod (2000) Commentary: When to Regulate in the GIS? A Public Policy Perspective, chapter 7, pp. 117–24 in Marsden, C. (ed., 2000) Regulating the Global Information Society, London: Routledge.

117 See Kroll, Joshua A., Huey, Joanna, Barocas, Solon, Felten, Edward W., Reidenberg, Joel R., Robinson, David G. and Yu, Harlan (2017) Accountable Algorithms, University of Pennsylvania Law Review, Vol. 165, at https://ssrn.com/abstract=2765268. See also Reed, supra note 2.

118 Hildebrandt, Mireille (2018) Primitives of Legal Protection in the Era of Data-Driven Platforms, Georgetown Technology Law Review, Vol. 2, 252, at 253 footnote 3.

119 See Marsden, Chris and Meyer, Trisha (2019) Regulating Disinformation with Artificial Intelligence (AI): The Effects of Disinformation Initiatives on Freedom of Expression and Media Pluralism, Report for Panel for the Future of Science and Technology (STOA), Scientific Foresight Unit of the Directorate for Impact Assessment and European Added Value, Directorate-General for Parliamentary Research Services (EPRS) of the Secretariat of the European Parliament.

120 House of Lords (2017) AI Select Committee: AI Report Published, at www.parliament.uk/business/committees/committees-a-z/lords-select/ai-committee/news-parliament-2017/ai-report-published/ (note the report is published in non-standard URL accessed from this link).

121 Reddit poster (2017) Artificial Intelligence Can’t Tell Fried Chicken from Labradoodles, at www.reddit.com/r/funny/comments/6h47qr/artificial_intelligence_cant_tell_fried_chicken/.

124 Hara, Kotaro, Adams, Abi, Milland, Kristy, Savage, Saiph, Callison-Burch, Chris and Bigham, Jeffrey (2017) A Data-Driven Analysis of Workers’ Earnings on Amazon Mechanical Turk. arXiv:1712.05796, Conditionally accepted for inclusion in the 2018 ACM Conference on Human Factors in Computing Systems (CHI’18) Papers program.

125 YouTube Transparency Report (2018), at https://transparencyreport.google.com/youtube-policy/overview.

126 Edwards, Lilian and Veale, Michael (2017) Slave to the Algorithm? Why a “Right to Explanation” is Probably Not the Remedy You are Looking for, at https://ssrn.com/abstract=2972855; Erdos, David (2016) European Data Protection Regulation and Online New Media: Mind the Enforcement Gap, Journal of Law and Society, Vol. 43, No. 4, 534–64, at http://dx.doi.org/10.1111/jols.12002.

127 Veale, Michael, Binns, Reuben and Van Kleek, Max (2018) The General Data Protection Regulation: An Opportunity for the CHI Community? (CHI-GDPR 2018), Workshop at ACM CHI’18, 22 April 2018, Montreal, Canada, arXiv:1803.06174.

128 Vestager, M. (2018) Competition and a Fair Deal for Consumers Online, Netherlands Authority for Consumers and Markets Fifth Anniversary Conference, The Hague, 26 April, at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/competition-and-fair-deal-consumers-online_en.

129 A recent Bird & Bird study for the European Commission evaluated the first triennial review of Net Neutrality under Regulation 2015/2120; it concluded that the lack of enforcement to date makes it too early to tell how useful the Regulation will be. Zero rating is more controversial in developing nations, not least because the use of zero-rated WhatsApp in data-poor Brazil appears to have helped swing the presidential election toward Bolsonaro: Belli, Luca (2018) WhatsApp Skewed Brazilian Election, Proving Social Media’s Danger to Democracy, The Conversation, 5 December, at https://theconversation.com/whatsapp-skewed-brazilian-election-proving-social-medias-danger-to-democracy-106476.

130 Marsden, C. (2019) Predictions 2019: Professor Chris Marsden, Society for Computers and Law, at www.scl.org/articles/10379-predictions-2019-professor-chris-marsden.

2 Networks, Standards, and Network-and-Standard-Based Governance

* My thanks to Laura DeNardis, Paul Ohm, Greg Shaffer, and Kevin Werbach for their helpful comments and to Jade Coppieters, Natalie Gideon, Sherry Safavi, and Tom Spiegler for research assistance.

1 Julie E. Cohen (2019), Between Truth and Power: Legal Constructions of Informational Capitalism, New York: Oxford University Press.

2 On networked governance, see, for example, John Braithwaite (2006), “Responsive Regulation and Developing Economies,” World Development 34(5): 884–898; Kal Raustiala (2002), “The Architecture of International Cooperation: Transgovernmental Networks and the Future of International Law,” Virginia Journal of International Law 43(1): 1–92; Anne-Marie Slaughter (2004), A New World Order, Princeton, NJ: Princeton University Press. On standards in transnational governance, see, for example, Panagiotis Delimatsis, ed. (2015), The Law, Economics, and Politics of International Standardization, New York: Cambridge University Press; Harm Schepel (2005), The Constitution of Private Governance: Product Standards in the Regulation of Integrating Markets, Portland, OR: Hart Publishing.

3 See, for example, Lawrence Lessig (1999), Code and Other Laws of Cyberspace, New York: Basic Books; Joel R. Reidenberg (1998), “Lex Informatica: The Formulation of Information Policy Rules through Technology,” Texas Law Review 76(3): 553–593. For a brief flirtation with the idea of Internet governance processes as “hybrid” code- and law-based institutions, see Laurence B. Solum (2009), “Models of Internet Governance,” in Internet Governance: Infrastructure and Institutions, eds. Lee A. Bygrave and Jon Bing, New York: Oxford University Press, pp. 48–91.

4 For a summary and analysis of the major strands of Anglo-American rule-of-law theorizing, see Richard H. Fallon, Jr. (1997), “‘The Rule of Law’ as a Concept in Constitutional Discourse,” Columbia Law Review 97(1): 1–56. For a broader comparative discussion, see Mireille Hildebrandt (2016), Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology, Northampton, MA: Edward Elgar, pp. 133–156.

5 Hildebrandt, cited in note 4, pp. 174–185.

6 For two very different perspectives on the origins and effects of trade liberalization logics, see William J. Drake and Kalypso Nicolaidis (1992), “Ideas, Interests, and Institutionalization: ‘Trade in Services’ and the Uruguay Round,” International Organization 46(1): 37–100; Jane Kelsey (2008), Serving Whose Interests? The Political Economy of Trade in Services, New York: Routledge-Cavendish, pp. 76–88.

7 See, generally, Delimatsis, ed., cited in note 2.

8 For a sampling of perspectives on these developments, see Todd Allee and Andrew Legg (2016), “Who Wrote the Rules for the Trans-Pacific Partnership?,” Research and Politics July–September 2016: 1–9; Kyle Bagwell, Chad P. Bown, and Robert W. Staiger (2016), “Is the WTO Passé?,” Journal of Economic Literature 54(4): 1125–1231; Nitsan Chorev and Sarah Babb (2009), “The Crisis of Neoliberalism and the Future of International Institutions: A Comparison of the IMF and the WTO,” Theory and Society 38(5): 459–484.

9 See Christopher Ingraham, “Interactive: How Companies Wield Off-the-Record Influence on Obama’s Trade Policy,” Washington Post, February 8, 2014, https://perma.cc/UPN6-DHKD.

10 Joachim Pohl, Kekeletso Mashigo, and Alexis Nohen (2012), “Dispute Settlement Provisions in International Investment Agreements: A Large Sample Survey,” OECD Working Papers on International Investment 2012/02, https://perma.cc/VN6T-GT64.

11 Margaret E. Keck and Kathryn Sikkink (1998), Activists Beyond Borders: Advocacy Networks in International Politics, Ithaca, NY: Cornell University Press, pp. 1–43.

12 Chris Brummer (2012), Soft Law and the Global Financial System, New York: Cambridge University Press; Charles D. Raab (2011), “Networks for Regulation: Privacy Commissioners in a Changing World,” Journal of Comparative Policy Analysis: Research and Practice 13(2): 195–213.

13 See, generally, Mark Raymond and Laura DeNardis (2015), “Multistakeholderism: Anatomy of an Inchoate Global Institution,” International Theory 7(3): 572–616.

14 Benedicte Bull and Desmond McNeill (2007), Development Issues in Global Governance: Public-Private Partnerships and Market Multilateralism, New York: Routledge, pp. 1–22; Marco Schäferhoff, Sabine Campe, and Christopher Kaan (2009), “Transnational Public‐Private Partnerships in International Relations: Making Sense of Concepts, Research Frameworks, and Results,” International Studies Review 11(3): 451–474.

15 Melissa J. Durkee (2017), “Astroturf Activism,” Stanford Law Review 69(1): 201–268; Melissa J. Durkee (2018), “International Lobbying Law,” Yale Law Journal 127(7): 1742–1826.

16 Klaas Hendrik Eller (2017), “Private Governance of Global Value Chains from Within: Lessons for Transnational Law,” Transnational Legal Theory 8(3): 296–329; Li-Wen Lin, “Legal Transplants through Private Contracting: Codes of Vendor Conduct in Global Supply Chains as an Example,” American Journal of Comparative Law 57(3) (2009): 711–744; Schepel, The Constitution of Private Governance, cited in note 2.

17 Heather McKeen-Edwards and Tony Porter (2013), Transnational Financial Associations and the Governance of Global Finance: Assembling Wealth and Power, New York: Routledge.

18 See, for example, Jordi Agusti-Panareda, Franz Christian Ebert, and Desiree LeClerq (2015), “ILO Labor Standards and Trade Agreements: A Case for Consistency,” Comparative Labor Law and Policy Journal 36(2): 347–380; Orr Karassin and Oren Perez (2018), “Shifting between Public and Private: The Reconfiguration of Global Environmental Regulation,” Indiana Journal of Global Legal Studies 25(1): 97–130; Kevin Kolben (2011), “Transnational Labor Regulation and the Limits of Governance,” Theoretical Inquiries in Law 12(2): 403–437.

19 On the complexity of assemblages for Internet namespace and protocol governance, see generally Laura DeNardis (2014), The Global War for Internet Governance, New Haven, CT: Yale University Press, pp. 45–55, 63–76.

20 For comprehensive refutations of the view that Internet governance is a purely technical activity, see Laura DeNardis (2009), Protocol Politics: The Globalization of Internet Governance, Cambridge, MA: MIT Press; DeNardis, cited in note 19; Milton Mueller (2004), Ruling the Root: Internet Governance and the Taming of Cyberspace, Cambridge, Mass: MIT Press. See also Roger Cotterrell (2012), “What Is Transnational Law?,” Law and Social Inquiry 37(2): 500–524.

21 On the evolving civil society role, see Stefania Milan and Niels ten Oever (2017), “Coding and Encoding Rights in Internet Infrastructure,” Internet Policy Review 6(1). On the involvement of governments and on ICANN’s design more generally, see Raymond and DeNardis, cited in note 13.

22 Andrew L. Russell (2006), “‘Rough Consensus and Running Code’ and the Internet-OSI Standards War,” IEEE Annals of the History of Computing 28(3): 48–61.

23 DeNardis, cited in note 19, pp. 70–71, 226–230; Raymond and DeNardis, cited in note 13.

24 Well-known expressions of network optimism include Yochai Benkler (2006), The Wealth of Networks: How Social Production Transforms Markets and Freedom, New Haven, CT: Yale University Press; Anupam Chander (2013), The Electronic Silk Road: How the Web Binds the World Together in Commerce, New Haven, CT: Yale University Press; Joshua Cohen and Charles F. Sabel (2005), “Global Democracy?,” N.Y.U. Journal of International Law & Politics 37(4): 763–797. More measured evaluations include Jack Goldsmith and Tim Wu (2008), Who Controls the Internet? Illusions of a Borderless World, New York: Oxford University Press; Laurence R. Helfer (2004), “Regime Shifting: The TRIPs Agreement and New Dynamics of International Intellectual Property Lawmaking,” Yale Journal of International Law 29(1): 1–84; Anna di Robilant (2006), “Genealogies of Soft Law,” American Journal of Comparative Law 54(3): 499–554.

25 David Singh Grewal (2008), Network Power: The Social Dynamics of Globalization, New Haven, CT: Yale University Press; Manuel Castells (2009), Communication Power, New York: Oxford University Press.

26 See, generally, Milton L. Mueller (2010), Networks and States: The Global Politics of Internet Governance, Cambridge, MA: MIT Press, pp. 41–46.

27 On network organization generally, see Albert-Laszlo Barabasi (2002), Linked: The New Science of Networks, Cambridge, MA: Perseus Publishing.

28 Duncan Kennedy (1976), “Form and Substance in Private Law Adjudication,” Harvard Law Review 89(8): 1685–1778; Pierre Schlag (1985), “Rules and Standards,” UCLA Law Review 33(2): 379–430.

29 On network lock-in, see Michael L. Katz and Carl Shapiro (1985), “Network Externalities, Competition, and Compatibility,” American Economic Review 75(3): 424–440.

30 Grewal, cited in note 25, pp. 4–8; Castells, cited in note 25, p. 43.

31 See generally John Braithwaite and Peter Drahos (2000), Global Business Regulation, New York: Cambridge University Press.

32 Kal Raustiala (2016), “Governing the Internet,” American Journal of International Law 110(3): 491–503; see also Raymond and DeNardis, cited in note 13.

33 Carl Shapiro and Hal R. Varian (1999), “The Art of Standards Wars,” California Management Review 41(2): 8–32.

34 See, generally, Daya Kishan Thussu (2015), “Digital BRICS: Building a NWICO 2.0?”, in Mapping BRICS Media, eds. Kaarle Nordenstreng and Daya Kishan Thussu, New York: Routledge, pp. 242–263; see also Tracy Staedter, “Why Russia Is Building Its Own Internet,” IEEE Spectrum, January 17, 2018, https://perma.cc/6UU4-NNJG.

35 See Brummer, cited in note 12, pp. 233–265.

36 Laurence R. Helfer (2004), “Regime Shifting: The TRIPs Agreement and New Dynamics of International Intellectual Property Lawmaking,” Yale Journal of International Law 29(1): 1–84.

37 Annemarie Bridy (2017), “Notice and Takedown in the Domain Name System: ICANN’s Ambivalent Drift into Online Content Regulation,” Washington and Lee Law Review 74(3): 1345–1388; Peter Bright, “DRM for HTML5 Finally Makes It as an Official W3C Recommendation,” ArsTechnica, September 18, 2017, https://perma.cc/Z9P6-2JLW.

38 For analyses of the interplay between data protection and multilateral trade instruments, see Svetlana Yakovleva and Kristina Irion (2016), “The Best of Both Worlds? Free Trade in Services, and EU Law on Privacy and Data Protection,” European Data Protection Law Review 2(2): 191–208; Graham Greenleaf (2017), “Free Trade Agreements and Data Privacy: Future Perils of Faustian Bargains,” in Transatlantic Data Privacy Relations as a Challenge for Democracy, eds. Dan Svantesson and Dariusz Kloza, Antwerp: Intersentia, pp. 181–212; Graham Greenleaf (2018), “Looming Free Trade Agreements Pose Threats to Privacy,” International Report: Privacy Laws & Business 152: 123–127.

39 Stephanie E. Perrin (2018), “The Struggle for WHOIS Privacy: Understanding the Standoff between ICANN and the World’s Data Protection Authorities,” unpublished dissertation, Faculty of Information, University of Toronto, pp. 243–253.

40 Anu Bradford and Eric A. Posner (2011), “Universal Exceptionalism in International Law,” Harvard International Law Journal 52(1): 3–54, at 36.

41 Castells, cited in note 25, pp. 45–46.

42 Yiping Huang (2016), “Understanding China’s Belt & Road Initiative: Motivation, Framework and Assessment,” China Economic Review 40: 314–321; Dane Chamorro, “Belt and Road: China’s Strategy to Capture Supply Chains from Guangzhou to Greece,” Forbes, December 21, 2017, https://perma.cc/4LYV-EFNW.

43 PwC, “Global Top 100 Companies by Market Capitalisation,” March 31, 2017, 35, https://perma.cc/8TNB-TBCA.

44 McKinsey Global Institute, “China’s Digital Economy: A Leading Global Force,” August 2017, https://perma.cc/X4BD-75TB; Charles Arthur, “The Chinese Tech Companies Poised to Dominate the World,” Guardian, June 3, 2014, https://perma.cc/W7TP-E89Z.

45 Samm Sacks, “Beijing Wants to Rewrite the Rules of the Internet,” Atlantic, June 18, 2018, https://perma.cc/YFY8-KYM5.

46 Gabriele de Seta, “Into the Red Stack,” Hong Kong Review of Books, April 17, 2018, https://perma.cc/J5VD-7SH3.

47 Ruth W. Grant and Robert O. Keohane (2005), “Accountability and Abuses of Power in World Politics,” American Political Science Review 99(1): 29–43.

48 Margot E. Kaminski (2014), “The Capture of International Intellectual Property Law through the U.S. Trade Regime,” Southern California Law Review 87(4): 977–1052.

49 For an illustrative discussion, see Golan v. Holder, 565 U.S. 302, 335–336 (2012).

50 ICANN, “Accountability and Transparency,” https://www.icann.org/resources/accountability.

51 Mueller, Ruling the Root, cited in note 20; DeNardis, The Global War for Internet Governance, cited in note 19; Perrin, cited in note 39.

52 In general, the innovations that “go viral” within networks are those originating from more connected nodes within the network. See Barabasi, cited in note 27, pp. 131–135.

53 Durkee, “Astroturf Activism,” cited in note 15.

54 For an especially compelling articulation of this worry, see David Kennedy (2014), “Law and the Political Economy of the World,” in Grainne de Burca, Claire Kilpatrick, and Joanne Scott, eds., Critical Legal Perspectives on Global Governance: Liber Amicorum David M. Trubek, Portland, OR: Hart Publishing, pp. 65–102.

55 Hans Krause Hansen and Tony Porter (2012), “What Do Numbers Do in Transnational Governance?,” International Political Sociology 6(4): 409–426.

56 For a detailed example, see Kernaghan Webb (2015), “ISO 26000 Social Responsibility Standard as ‘Proto Law’ and a New Form of Global Custom: Positioning ISO 26000 in the Emerging Transnational Regulatory Governance Rule Instrument Architecture,” Transnational Legal Theory 6(2): 466–500.

57 Timothy H. Edgar (2017), Beyond Snowden: Privacy, Mass Surveillance, and the Struggle to Reform the NSA, Washington, DC: Brookings Institution Press, p. 123.

58 See, generally, Sheila Jasanoff (1990), The Fifth Branch: Science Advisers as Policymakers, Cambridge, MA: Harvard University Press.

59 See Sakiko Fukuda-Parr (2011), “The Metrics of Human Rights: Complementarities of the Human Development and Capabilities Approach,” Journal of Human Development and Capabilities 12(1): 73–89; Sakiko Fukuda-Parr and Alicia Yamin (2013), “The Power of Numbers: A Critical Review of MDG Targets for Human Development and Human Rights,” Development 56(1): 58–65; AnnJanette Rosga and Margaret Satterthwaite (2009), “The Trust in Indicators: Measuring Human Rights,” Berkeley Journal of International Law 27(2): 253–315.

60 For a prescient early treatment of this problem, see Jonathan Zittrain (2008), The Future of the Internet—And How to Stop It, New Haven, CT: Yale University Press, 36–57.

61 See, for example, Frank Ackerman, Lisa Heinzerling, and Rachel Massey (2005), “Applying Cost-Benefit to Past Decisions: Was Environmental Protection Ever a Good Idea?,” Administrative Law Review 57(1): 155–192; Kenneth A. Bamberger (2010), “Technologies of Compliance: Risk and Regulation in a Digital Age,” Texas Law Review 88(4): 669–740; James Fanto (2009), “Anticipating the Unthinkable: The Adequacy of Risk Management in Finance and Environmental Studies,” Wake Forest Law Review 44(3): 731–756.

62 William Krist (2013), Globalization and America’s Trade Agreements, Washington, DC: Woodrow Wilson Center Press with Johns Hopkins University Press; Erik Reinert (2007), How Rich Countries Got Rich … and Why Poor Countries Stay Poor, New York: Carroll & Graf.

63 See, for example, Claudio Grossman and Daniel D. Bradlow (1993), “Are We Being Propelled Towards a People-Centered Transnational Legal Order?,” American University Journal of International Law and Policy 9(1): 1–26; Gunther Teubner (2011), “Self-Constitutionalizing TNCs? On the Linkage of ‘Private’ and ‘Public’ Corporate Codes of Conduct,” Indiana Journal of Global Legal Studies 18(2): 617–638.

64 Julie E. Cohen (2017), “Law for the Platform Economy,” U.C. Davis Law Review 51(1): 133–204.

65 Castells, cited in note 25, pp. 45–46.

66 See Convention on Rights and Duties of States art. 1, December 26, 1933, T.S. No. 88.

67 On the spatial dimension of user experiences of the Internet, see Julie E. Cohen (2007), “Cyberspace as/and Space,” Columbia Law Review 107(1): 210–255.

68 DeNardis, The Global War for Internet Governance, cited in note 19, pp. 45–55.

69 Facebook, “Newsroom: Company Info,” https://perma.cc/5RC7-ZPGG (2.13 billion monthly active users as of December 2017); Xavier Harding, “Google Has 7 Products with 1 Billion Users, Popular Science,” February 1, 2016, https://perma.cc/2ZYC-LU5C; Credit Suisse, “Apple Inc. (AAPL.OQ) Company Update,” p. 1 (2016) (estimated 588 million users as of April 2016), https://perma.cc/TK8F-JTKW.

70 See Tarleton Gillespie (2017), “Governance of and by Platforms,” in Jean Burgess, Alice Marwick, and Thomas Poell, eds., The SAGE Handbook of Social Media, Thousand Oaks, CA: Sage, pp. 254–278; Kate Klonick (2018), “The New Governors: The People, Rules, and Processes Governing Online Speech,” Harvard Law Review 131(6): 1598–1670.

71 David Dayen, “The Android Administration: Google’s Remarkably Close Working Relationship with the Obama White House, in Two Charts,” The Intercept, April 22, 2016, https://perma.cc/QL5K-VT7Y.

72 Mike Swift, “Facebook to Assemble Global Team of ‘Diplomats,’” San Jose Mercury News, May 20, 2011, https://perma.cc/396G-SUGX; Gwen Ackerman, “Facebook and Israel Agree to Tackle Terrorist Media Together,” Bloomberg, September 12, 2016, https://perma.cc/E4UU-SRVF; My Pham, “Vietnam Says Facebook Commits to Preventing Offensive Content,” Reuters, April 27, 2017, https://perma.cc/FN45-XLNY; Adam Taylor, “Denmark is Naming an Ambassador Who Will Just Deal with Increasingly Powerful Tech Companies,” Washington Post, February 4, 2017, https://perma.cc/PCV3-L2J3.

73 Kate Conger, “Microsoft Calls for the Establishment of a Digital Geneva Convention,” TechCrunch, February 14, 2017, https://perma.cc/78Q3-Q38S.

3 Tech Dominance and the Policeman at the Elbow

* This essay benefited from comments by Kevin Werbach, discussions with Randy Picker, and an illuminating conversation with Bill Gates.

1 Quoted in Tim Wu, The Curse of Bigness (2018).

2 I elaborate this theory in Tim Wu, The Master Switch (2010), at pp. 138, 158.

3 Steven Brill, What to Tell Your Friends About IBM, American Lawyer (April 1982), 1.

4 See Gary L. Reback, Free the Market!: Why Only Government Can Keep the Marketplace Competitive (2009); William E. Kovacic, Failed Expectations: The Troubled Past and Uncertain Future of the Sherman Act as a Tool for Deconcentration, 74 Iowa L. Rev. 1105 (1989).

5 For a recent account of the influence of a high probability of law enforcement over compliance with the law, see Aaron Chalfin and Justin McCrary, Criminal Deterrence: A Review of the Literature, 55 J. Econ. Lit. 5 (2017) (reviewing effects of increased law enforcement on crime); see also Robert C. Ellickson, Order Without Law: How Neighbors Settle Disputes, pp. 147–148 (1991) (dismissing extreme view of “legal peripheralism”).

6 Daniela Hernandez, Tech Time Warp of the Week: 50 Years Ago, IBM Unleashed the Room-Sized iPhone, Wired (June 27, 2014), https://www.wired.com/2014/06/tech-time-warp-ibm-system360/.

7 The centralized design ideology is described in Wu, supra note 2, at pp. 45–60.

8 Kevin Maney, The Maverick and His Machine: Thomas Watson, Sr. and the Making of IBM (2003).

9 Ibid at p. 423.

10 James Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon, pp. 332–333 (2019).

11 The Complaint is reprinted in the appendix of Franklin M. Fisher et al., Folded, Spindled and Mutilated: Economic Analysis and U.S. v. IBM, 353 (1983).

12 Plaintiff’s Statement of Triable Issues (dated Sep. 23, 1974), United States v. IBM, 69 Civ. 200 (S.D.N.Y. 1969).

13 See Amended Complaint, U.S. v. IBM, 69 Civ. 200, ¶ 19(a) (S.D.N.Y. 1969).

14 Plaintiff’s Statement, supra note 12.

15 See Wu, supra note 1.

16 James Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon, p. 333 (2019).

17 Ibid at p. 331.

18 See Fisher, supra note 11, at 16.

19 Brill, supra note 3, at 1.

20 Peter Behr, IBM, Justice Rests Cases In Historic Antitrust Trial, Wash. Post (June 2, 1981), available at https://www.washingtonpost.com/archive/business/1981/06/02/ibm-justice-rests-cases-in-historic-antitrust-trial/5cc16db0-8e7f-4763-a17d-fdfb6fef0464/?noredirect=on.

21 Don Waldman, IBM, in Market Dominance: How Firms Gain, Hold, Or Lose it and the Impact on Economic Performance, p. 140 (David Ira Rosenbaum, ed., 1998).

23 Andrew I. Gavil, William E. Kovacic, and Jonathan B. Baker, Antitrust Law in Perspective: Cases, Concepts, and Problems in Competition Policy 11–12 (2008).

24 Paul Carroll, Big Blues: The Unmaking of IBM 57 (1994).

25 Cortada, supra note 16, at p. 346.

26 G. David Garson, Public Information Technology and E-governance: Managing the Virtual State 229 (2006).

27 In antitrust jargon, bundling and tying are differentiated by the idea that tying is non-optional, while a bundle allows the customer to buy the constituent products separately, or in a (usually cheaper) bundle. However, in business usage, the two terms are used interchangeably, and in this piece “bundling” is used as a synonym for “tying.”

28 Burton Grad, A Personal Recollection: IBM’s Unbundling of Software and Services, 24 IEEE Annals of the History of Computing 64, 66 (2002).

29 Ibid at p. 67.

30 Stanley Gibson, Software Industry Born with IBM’s Unbundling, Computerworld, 6 (June 19, 1989).

31 Thomas J. Watson Jr. and Peter Petre, Father, Son & Co.: My Life at IBM and Beyond (1990). There were also some within IBM who thought that the firm was missing out on an opportunity to make money in software. See Grad, supra note 28, at p. 65.

32 There was, at some point, controversy over what caused IBM to unbundle software. In 1983, Fisher, McKie, and Mancke disputed the argument that it was antitrust pressure, and suggested that cutting the costs of support was the primary motive. See Franklin M. Fisher, James W. McKie, and Richard B. Mancke, IBM and the U.S. Data Processing Industry: An Economic History (1983). However, later admission in Watson’s autobiography and corroboration by insiders like Grad seems to have ended the controversy.

33 For example, in the 2010s, when under FTC investigation, Google preemptively abandoned several practices that investigators had deemed anticompetitive. See https://www.vox.com/2017/12/27/16822150/google-ftc-yelp-scraping-antitrust-ftc.

34 Gibson, supra note 30, at p. 6.

35 See, e.g., Grad, supra note 28; see also W. Edward Steinmueller, The U.S. Software Industry: An Analysis and Interpretive History, in The International Computer Software Industry: A Comparative Study of Industry Evolution and Structure (David C. Mowery ed., 1995).

36 Martin Campbell-Kelly, Development and Structure of the International Software Industry, 1950–1990, 24 Bus. & Econ. History 73, 88 (1995).

37 Grad, supra note 28, at p. 71.

38 Steinmueller, supra note 35, at p. 7.

39 Marc Andreessen, Why Software is Eating the World, Wall St. J. (August 20, 2011), available at https://a16z.com/2016/08/20/why-software-is-eating-the-world/.

40 Robert W. Crandall and Charles L. Jackson, Antitrust in High-Tech Industries, 38 Rev. Ind. Organ. 319, 327 (2011).

41 Mat Honan, Bill Gates Spent the Best Money of His Life 30 Years Ago Today, Gizmodo (July 27, 2011).

42 The IBM PC was actually offered with three operating systems, but Microsoft’s was the cheapest ($40) and may have been the default, for it quickly became the dominant choice. Eric G. Swedin and David L. Ferro, Computers: The Life Story of a Technology, p. 95 (2007).

43 Grad, supra note 28, at p. 71.

44 Robert X. Cringely, “Triumph of the Nerds: The Rise of Accidental Empires,” PBS (June 1996), http://www.pbs.org/nerds/part2.html.

45 Walter Isaacson, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution 360 (2015).

46 Eric G. Swedin and David L. Ferro, Computers: The Life Story of a Technology 95 (2007).

47 Joseph F. Porac, Local Rationality, Global Blunders, and the Boundaries of Technological Choice: Lessons from IBM and DOS, in Technological Innovation: Oversights and Foresights 129, 137 (Raghu Garud et al. ed., 1997).

48 Charles H. Ferguson and Charles R. Morris, Computer Wars: The Post-IBM World 26, 71 (1993). See also Eli M. Noam, Media Ownership and Concentration in America 187 (2009).

49 For a description of one firm’s cloning techniques, see James Langdell, Phoenix Says Its BIOS May Foil IBM’s Lawsuits, PC News (Jul. 10, 1984), available at https://books.google.com/books?id=Bwng8NJ5fesC&lpg=PA56&ots=_i5pxGorF7&dq=ibm+pc+program+use+extra+bios+chip&pg=PA56&hl=en#v=onepage&q&f=true.

50 A contrasting theory, not explored further, is that exclusivity would have hindered the spread of the PC platform, or perhaps made CP/M rivals more successful.

51 Don E. Waldman, The Rise and Fall of IBM, in Market Dominance: How Firms Gain, Hold, or Lose It and the Impact on Economic Performance, p. 141 (David Ira Rosenbaum ed., 1998).

53 Carroll, supra note 24, at p. 57.

54 See, e.g., Jimmy Maher, The Complete History of the IBM PC, Ars Technica (July 31, 2017), https://arstechnica.com/gadgets/2017/06/ibm-pc-history-part-1/.
