This book-length history of the semiconductor industry is the first to focus on the national security rationale for semiconductor industrial policies. Short, punchy chapters and colorful anecdotes create an engaging read.
Miller examines industrial competition in semiconductors as a dimension of global political-military rivalry. Numerous studies document the military’s role in American chipmakers’ rise to global industrial dominance; Miller’s book is the first to detail how those efforts interacted with Soviet and Chinese policy responses. It is less successful as an economic history of global competition in semiconductors, with significant omissions and occasional implausible claims.
Miller briefly describes how military procurements for U.S. missile and space systems created the initial demand for the first pricey chips. His account largely skips over a rich variety of Cold War U.S. defense initiatives that, episodically, helped push (with R&D subsidy) and pull (with procurement) the American semiconductor industry’s technological frontiers.
This book focuses primarily on catch-up strategies deployed in the Soviet Union, China, and Taiwan. Europe and South Korea merit only an occasional aside. Minimal attention is given to the fate of U.S. chip firms other than Fairchild and its successor, Intel, which are treated as the U.S. national champions. Other important players, like IBM, pop up only sporadically, primarily when transferring their technology to others.
Miller documents how, early in the Cold War, the Soviet Union adopted a “copy it” strategy toward Western technology developments, complemented by state-sponsored industrial espionage. He presents this as the poor judgment of a single, powerful Soviet bureaucrat who “called the shots behind a ministerial desk in Moscow” (p. 44). Constant advances in microelectronics technology (shorthand: “Moore’s Law”) and the vastly larger scale that Western commercial electronics markets soon attained, Miller argues, doomed the strategy to failure. Russian microelectronics fell behind and stayed there.
Miller does not consider how individual semiconductor firms tried similar “fast follower” strategies (“first movers” pay for costly R&D on alternative technology solutions; nimble follower-copycats reap most of the benefits at much lower cost), typically with poor results. Dynamic scale economies (i.e., “learning” or “experience” curves, with unit cost falling as cumulative semiconductor output grows) gave early entrants cost advantages that persisted over time. How learning curves, along with conventional economies of scale arising from leading-edge chip R&D costs or minimum efficient scale for facility investment, constrained national policy or shaped global industrial structure is never considered.
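To make the logic concrete (a standard experience-curve formulation, not one taken from the book), unit cost can be written as a declining power function of cumulative output:

\[ c(Q) = c_1 Q^{-b}, \]

so each doubling of cumulative output Q multiplies unit cost by \(2^{-b}\). At learning rates of roughly 20 to 30 percent per doubling (b between about 0.3 and 0.5), of the sort often estimated for chip fabrication, a first mover several doublings ahead holds a cost advantage that a copycat entrant cannot quickly erase.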
Miller describes how Cold War American engineer-defectors persuaded Russian politicians to invest resources at scale in a 1960s Russian “Silicon Valley,” the closed city of Zelenograd. The KGB then established a “Directorate T” to illicitly acquire Western technology. But as the increasingly complex equipment and materials required to manufacture rapidly evolving chip technology were spun off into separate, upstream Western industries, the roster of technologies and equipment that had to be imported to copy leading-edge chip production grew ever longer. Massive covert acquisition programs for Western semiconductor materials and manufacturing equipment were countered by ever-higher export control walls.
Perceived Soviet numerical weapons advantages in the 1970s led U.S. military planners to explicitly articulate an “offset strategy” using semiconductors incorporated into cutting-edge electronics to create technologically superior weapons systems. This U.S. strategy was challenged in the 1980s not by the Soviet Union, but by Japanese chipmakers wresting away superiority in memory chip fabrication.
Citing national security concerns, the United States in the late 1980s imposed semiconductor trade restrictions on Japan, combined with market-opening pressure, and subsidized a semiconductor industry R&D consortium (SEMATECH). Miller says little about the effects of trade policy. He says even less about SEMATECH, ending his narrative with the death of its first CEO, Robert Noyce, in 1990, only three years into SEMATECH’s 20-year-plus history. SEMATECH’s subsequent role in an industry-wide R&D coordination initiative that accelerated the pace of innovation, cutting the lag between new generations of semiconductor manufacturing equipment from three years to two, is ignored.
Economic studies confirm that semiconductor innovation accelerated after 1994: semiconductor prices declined at unprecedented rates, and the U.S. industry’s global market share rebounded past that of a now-declining Japan. The large declines in IT capital prices and the resulting boost to U.S. productivity growth from faster technological advances go unmentioned. Instead, Miller implausibly credits the 1990s U.S. semiconductor resurgence to the entry of Micron Technology into memory chip production in the late 1970s. Arguing against this claim, non-memory chips, particularly microprocessors, microcontrollers, and application-specific logic designs, accounted for the bulk of American chip companies’ aggregate sales growth in the 1990s.
Miller argues that the effect of Western export controls in slowing Soviet catch-up is overstated (p. 144). Oddly, when later discussing China, he finds export restrictions “could also be a devastatingly powerful weapon” (p. 303) and “demonstrate just how powerful the chip industry’s choke points are” (p. 329). The reader is left wondering what changed.
In China, Miller claims a Cultural Revolution-era policy “that every worker could produce their own transistors had been an abject failure” (p. 250). A Maoist struggle campaign slogan is interpreted as a literal description of policy. Other histories of China’s semiconductor industry document chaotic decentralization—establishment of dozens of regional chip fabrication centers across China—but do not describe workers in electronic equipment factories literally manufacturing their own transistors. After 2000, significant resources were invested in semiconductor self-sufficiency in a much more centralized fashion. Miller says little about the new policy beyond naming recipient national champion firms. While “substantial sums of money” (p. 320) are involved, the policy details are vague.
Taiwan Semiconductor Manufacturing Company (TSMC) is currently the global leader in advanced semiconductor manufacturing. Miller’s account credits its success to U.S. semiconductor technologist Morris Chang, who, after leaving a career at U.S. chipmaker Texas Instruments, organized Taiwanese support for a pioneering new business model: the “foundry,” which manufactures the chip designs of others but none of its own. Miller thinks Chang remade an entire industry “with vision and with government support” (p. 329) but supplies few details on the forms this support took over TSMC’s 37-year history.
Chang’s vision built on U.S. government-supported development of software and design tools needed to implement the foundry model in the 1970s and early 1980s, some of which Miller describes. But the question of why the “pure” foundry business model was not realized first in the U.S. goes unasked and unanswered. The book does not mention a second, alternative “vision with government support” that competed contemporaneously for policymaker attention—the failed effort to create a Taiwanese memory chip industry in the 1990s. In the early 2000s, Taiwan’s memory chip sales were roughly equal to foundry chip sales.
Miller’s historical narrative ends with a currently lagging American Intel, Korea’s Samsung, and TSMC as Western semiconductor champions jousting for strategic advantage with an insurgent Chinese industry in “the fight for the world’s most critical technology.” He attributes Intel’s current problems to its earlier failure to embrace the foundry business model, an explanation that cries out for details the book never supplies. Is the foundry model now the only economically viable business model utilizing leading-edge chip manufacturing technology? If so, why?
Soviet failures to stay abreast of chip technology may be less clear-cut than Miller suggests. He acknowledges that Russian scientific capabilities were always top-notch, even when industrial production was second-rate. Numerous declassified CIA reports (only a few of which are cited in Chip War) found Soviet chips lagging behind U.S. equivalents. But CIA analysts were careful to add caveats: the comparisons covered chips sold commercially, not laboratory models (difficult to obtain) or engineering prototypes used in military systems (feasible at small military production volumes, but not in mass-market consumer electronics).
The U.S. intelligence community was divided in assessing Soviet capabilities in microelectronics: the CIA was dismissive, and the NSA was more respectful. In the mid-1980s, the NSA discovered sophisticated electronic “key loggers” concealed within electric typewriters used within the American embassy in Moscow. These implants, dating back to the mid-1970s, are reported to have used then-state-of-the-art integrated circuits and core memories that impressed NSA engineers.
Highly sensitive military applications apparently secured cutting-edge, Soviet-designed semiconductors. The Soviet Union had a dual economy, with defense industries separated from civil production. Declassified CIA reports estimated that the Soviets secured enough imported semiconductor equipment and materials to meet military equipment requirements, albeit at a very high cost. It was the civil industry that suffered most acutely from low quality and supply constraints.
Economists will be disappointed by the book’s inattention to how changing economics have reshaped the industrial structure of the global chip business. By 1965, just three years after the first integrated circuits shipped in 1962, there were already 17 American producers making them, and the capital investment required for a leading-edge, high-volume fabrication line using 1-inch silicon wafers was around $1 million. In 2001, roughly three dozen producers globally were making leading-edge ICs, at a time when a leading-edge chip fabrication facility using 12-inch wafers cost $2 billion. Today, competition at the leading edge has shrunk to just three firms globally, while fabrication line costs have ballooned to around $15 billion. This steadily increasing industrial concentration over recent decades goes unnoticed.
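A back-of-the-envelope calculation using the figures just cited (taking “today” as roughly the early 2020s) makes the point starkly:

\[ \left(\frac{\$2\ \text{billion}}{\$1\ \text{million}}\right)^{1/36} \approx 1.235, \qquad \left(\frac{\$15\ \text{billion}}{\$2\ \text{billion}}\right)^{1/22} \approx 1.10. \]

That is, the cost of a leading-edge fabrication line grew at roughly 23 percent per year from 1965 to 2001 and roughly 10 percent per year since, an exponential capital burden that only a handful of firms, or well-funded states, can still shoulder.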
Chip innovation is portrayed as a constant, exogenous, unchanging background trend. But Miller’s chosen metric for Moore’s Law progress, device density (tracked by the doubling time for the number of electronic devices squeezed onto a fixed silicon area), has fluctuated dramatically over time: the pace of doubling accelerated in the mid-1990s, then slowed, and the metric finally lost meaning entirely as electronic devices shifted from 2D areas patterned on silicon wafers to 3D vertical structures.
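Expressed as a formula (an illustrative gloss, not notation Miller uses), the density metric follows

\[ D(t) = D(t_0)\, 2^{(t - t_0)/T}, \]

where T is the doubling time. Treating chip innovation as a constant background trend amounts to holding T fixed; in fact T shortened to roughly two years in the mid-1990s, stretched out again afterward, and ceases to be well defined once devices are stacked vertically rather than packed more densely onto a planar surface.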
In addition to greater density, Moore’s Law semiconductor manufacturing innovation delivered steady, rapid declines in device cost and energy consumption, along with increases in device speed. These other dimensions of chip improvement go largely unnoticed in this account, yet they make key contributions to both military and economic value.
Unfortunately, recent studies find that quality-adjusted price per computer chip is no longer declining rapidly. While military interest in accessing leading-edge chips is certain to continue, the economic case for using the most advanced available chipmaking technology is increasingly tied to application specifics. The “world’s most critical technology” may now be improved design tools that reduce skyrocketing fixed costs for creating customized system designs that effectively utilize leading-edge manufacturing technology.