Creating Abundance emphasizes that accounting for the biological innovations that facilitated the control of plant diseases significantly affects our understanding of the dynamics of American agricultural development.Footnote 1 However, Creating Abundance barely touches on the efforts to control livestock diseases. This subject is so essential to our understanding of human welfare that it warrants a more extensive treatment. To understand its importance, it helps to compare the impacts of crop and livestock diseases. Crop losses cut into the food supply available to humans and their livestock. Livestock diseases have a similar direct effect on output by diminishing the supply of high-protein meat and milk products. But livestock diseases also reduce the supplies of fertilizer and draft power, which could be devastating in the age before petroleum-based fertilizers and the internal combustion engine. In addition, many animal diseases cross over and infect humans—a consequence not associated with crop pests.
IMPACTS OF LIVESTOCK DISEASE
Technological and institutional changes are the primary engines of modern economic growth. A few general purpose technologies such as the steam engine, electricity, and the computer are especially important because they created paradigm shifts and opened the way for myriad spin-off technologies. The gradual discovery and acceptance of the germ theory of disease was one of those fundamental technological advances—one which revolutionized the understanding of human and animal health and gave rise to the sciences of bacteriology, microbiology, virology, and immunology, among others. In the late nineteenth century, at the very time when scientists, physicians, and veterinarians were gaining a better understanding of disease overall, many livestock diseases were spreading across the United States and Western Europe, some at alarming rates. The same transportation revolution that facilitated commodity trade also greatly increased the movement of animals harboring infectious diseases. Faster transportation meant that recently infected and contagious livestock could travel great distances before showing symptoms. The very actions needed to improve herd quality (such as the concentration of livestock in dairies and stockyards and the increased intermingling of prized breeding stock) also contributed to the rapid spread of diseases.
Enlightened local and state animal health officials often enacted measures to control and stamp out diseases, but these policies were hampered by many factors. Externalities, imperfect information, and economies of scale in enforcement doomed many efforts. Infectious diseases paid no heed to political boundaries and local and state initiatives were often overwhelmed as recently cleansed areas were reinfected. Costly legal disputes and beggar-thy-neighbor policies were predictable outcomes in the absence of national legislation. The push for federal intervention was further enhanced when foreign governments banned or restricted importation of American products due to the threat of contagious bovine pleuropneumonia, foot-and-mouth disease, trichinosis, hog cholera (swine fever), and other diseases. The stakes were enormous.Footnote 2
The story of rinderpest illustrates the dreadful impact that an animal disease can have on supplies of food and draft power. Rinderpest is a highly infectious viral disease that over the millennia has devastated cattle, water buffalo, sheep, goats, and other domestic and wild animals across Asia, Africa, and Europe. It never reached the United States, but it could have.Footnote 3 In August 1942 the United States and Canadian governments entered into a secret compact to construct a jointly managed biological weapons facility on Grosse Isle in the Saint Lawrence River. The project was code named GIR, which stood for Grosse Isle Rinderpest. For the binational commission that established biological warfare priorities, rinderpest, even more than anthrax, foot-and-mouth disease, and botulism, was the mother of all fears.Footnote 4 History helps explain why.
In the mid-nineteenth century when the movement to build systematic animal disease defenses in the United States was in its infancy, a few farsighted leaders gazed fearfully across the Atlantic appalled by the havoc that rinderpest had wrought.Footnote 5 The disease had repeatedly entered Western Europe from Russia and Asia Minor, where it was enzootic. In the eighteenth century, destruction came in titanic waves with 50 to 90 percent of the cattle in large regions succumbing in a matter of months. In his encyclopedic treatment of this disease, Clive Spinage noted that in 1713/14, 70,000 head of cattle succumbed in Piedmont and in 1781, Holland, which already had been struck several times in the century, lost more than 300,000 head. Such enumerations of local losses go on for pages. Spinage also documented how the loss of food sources and draft power caused widespread privation and political turmoil. In 1865–1867 rinderpest inflicted terrible damage in Britain before being stamped out by an extensive quarantine and slaughter program. It is commonly asserted that rinderpest killed well over 200 million cattle in Western Europe in the eighteenth century. The American observers of 1865 accepted this estimate.Footnote 6
Rinderpest had an alarming impact in Europe, but the effects of the epizootic that began in Ethiopia (in what is now Eritrea) in 1888 reached biblical proportions. The disease swept the length of Africa in about eight years, battering present-day South Africa by 1896. In many areas, over 90 percent of all cattle died; most susceptible wildlife also perished. The near total loss of meat and milk supplies and the destruction of the draft power needed to plant and tend crops unleashed one of the worst famines in Ethiopian history. Rinderpest also led to starvation among the Maasai of Kenya and helped spark the Matabele rebellion in Zimbabwe. From April 1896 to August 1899, roughly 2.5 million cattle perished in southern Africa. In many areas, the transport system collapsed, with wagons and stages left abandoned along the roads where the oxen had fallen. One account asserts that in South Africa overland freight charges per hundredweight increased over eightfold in the first six months of 1896. Farmers found it impossible to dispose of the carcasses that littered the countryside and fouled the wells, ponds, and rivers.Footnote 7
Rinderpest left a trail of misery, but it did not directly infect humans. Animal diseases that can be transmitted to humans are of special concern because they have been responsible for the deaths of millions of people.
LIVESTOCK DISEASE AND HUMAN HEALTH
In recent years, we have witnessed a number of highly publicized episodes of human-animal disease interaction. Mad cow disease (variant Creutzfeldt-Jakob disease in humans and Bovine Spongiform Encephalopathy in cattle) offers one of many examples. Far more ominous is the ever-looming threat that new influenza strains will emerge in swine and birds, mutate to infect humans, and then pass from human to human. Such mutations have occurred repeatedly. The 1918 Spanish influenza pandemic is credited with perhaps 50 million deaths worldwide, although the true toll will never be known.
Approximately four-fifths of all known infectious diseases of humans are shared by other vertebrate animals. These shared diseases are called zoonoses and include hundreds of nasty characters such as anthrax, influenza, brucellosis, cholera, variant Creutzfeldt-Jakob disease, Ebola, malaria, plague, rabies, Salmonellosis, trichinosis, tuberculosis, and yellow fever. Humans contract many zoonoses solely from other animal species, and the most effective method to combat many zoonoses is to attack the animals that carry and transmit the diseases.Footnote 8
Richard Easterlin and numerous others have emphasized the importance of public health initiatives in improving human health.Footnote 9 Take Easterlin's analysis in his splendid 1999 article as an example. He did not note (although he likely knew) that the majority of the infectious diseases that he analyzed are zoonoses and that there was a close synergy between animal and human medicine. The control of livestock diseases often had a direct and immediate impact on human health. Furthermore, animal researchers, trained as veterinarians, physicians, and research scientists, made many of the breakthroughs that advanced basic and applied knowledge, including important laboratory methodologies associated with the development of the germ theory of disease.Footnote 10
Robert Koch and Louis Pasteur were certainly among the most eminent of the elite group of scientists-entrepreneurs who transformed human medicine. Koch was trained as a physician, but he made fundamental contributions to the understanding of anthrax, tuberculosis (including bovine tuberculosis), cholera, and other zoonoses. His first breakthrough was with anthrax where the primary concern was the health of farm animals, and he worked with and on animals throughout his professional life. Pasteur was trained as a chemist. In addition to his famous work on wine and milk, which led to the concept of pasteurization, Pasteur made fundamental advances in the understanding and development of vaccines for anthrax, cholera, and rabies. All of this work was closely linked to livestock. Koch and Pasteur were not alone in merging the study of animal health with human medicine. Many of the researchers leading the biomedical revolution were studying seemingly “spontaneously occurring diseases of domestic animals with the object of acquiring medical knowledge also applicable to man.”Footnote 11
Perhaps America's most renowned research physician of the late nineteenth and early twentieth centuries was Theobald Smith. Between 1889 and 1892 Smith worked on the cattle disease Texas fever. In 1893 Smith and fellow Bureau of Animal Industry (BAI) scientist F. L. Kilborne published a seminal report proving that a vector—in this case a tick—transmitted a microorganism that caused an infectious disease. This electrifying discovery sped advances in the understanding of other vector-borne diseases including malaria, yellow fever, typhus, and African sleeping sickness. One authority on medical history asserts that “the initiation of vector control … was probably the single most unprecedented event in the history of disease control.” Smith “spent virtually his entire career in veterinary medicine and never had even the remotest connection with human medical practice.”Footnote 12
Most people probably care more about their own health than that of their livestock and think that a human's life is more valuable than an animal's life. For these reasons, one might predict that progress in human health would have preceded advances in animal health. However, the reverse seems to have been the case. For centuries law and custom in the West prohibited researchers from using human cadavers for medical training and research, and thus much of what physicians learned about human physiology was based on extrapolation from animal models. Moreover, the species of primary concern, be it humans or large livestock, was often inaccessible or too valuable for research.Footnote 13 Livestock owners had an incentive to invest individually to protect their animals and to organize collectively to demand that governments use the police power of the state to combat animal plagues. Employers of wage labor did not have a similar incentive. The concentration of wealth and political power in the hands of the elite contributed to entitlement failures whereby the health requirements of the masses went unheeded. In addition, it was easier to control livestock diseases because authorities could test and quarantine animals more indiscriminately, slaughter sick and suspect animals, and even depopulate entire districts. This ability to more aggressively attack diseases in livestock helps explain the more rapid demise of tuberculosis in cattle than in humans in the United States. Between 1917 and 1940 veterinarians administered about 232 million tuberculin tests to cattle in the United States and ordered the slaughter of over 3.8 million suspect animals.Footnote 14
CONTROLLING LIVESTOCK DISEASE
Progress in the control of human and animal diseases rested upon the same two forces that drove economic growth more generally: advances in scientific knowledge and the development of institutions.Footnote 15 Each of these forces fueled the other. New institutions were needed to overcome serious market failures in order for the new knowledge to be converted into policy. A better understanding of diseases spurred consumer and producer demands to create and strengthen disease-control institutions, and new knowledge also weakened the credibility of those opposed to institutional reform. The new veterinary institutions in turn invested in research and extension activities that enhanced the understanding of the diseases.
By 1900 the United States had emerged as a world leader in animal disease control, even though the nation was a laggard in human medicine and the biological sciences more generally. In the last half of the nineteenth century, most of America's future veterinary researchers and leaders received their scientific training in Europe, most notably in Germany, France, England, and Scotland. All of these nations had rich medical and veterinary traditions with research and training academies superior to anything in the United States. The first dedicated veterinary college in Europe opened in Lyon in 1762 with the goal of finding a cure for rinderpest. International students trained at Lyon and at Alfort, France's second veterinary institution, and went on to spearhead the creation of prestigious institutions across Europe. By 1829 Europe had 29 veterinary schools. Scientific veterinary training in the United States did not begin to creep forward until 1868, when Cornell initiated an undergraduate veterinary program. As a mark of Europe's preeminence, Cornell's program was directed by James Law, a distinguished DVM trained at the Edinburgh Veterinary College in Scotland.Footnote 16
One Leader
In 1876 Cornell granted Daniel E. Salmon the first DVM awarded in the United States.Footnote 17 This was the Salmon of Salmonella fame. More than any other individual, Salmon was responsible for America's leadership in the control of livestock diseases. He helped forge the USDA's Bureau of Animal Industry (BAI) in 1884 and served as its chief for 21 years. The BAI literally became the first line of defense in fighting animal, and many human, diseases in the United States. It has fallen out of fashion in our profession to eulogize the accomplishments of the great inventors and innovators in the style of Jonathan Hughes' stimulating work.Footnote 18 After all, what is the counterfactual? Wouldn't someone else have done the same thing a short time later? But in fighting infectious diseases a “short time” could be an eternity. Many of the diseases Salmon and his successors squelched, if left unattended for even a few extra days, likely would have mushroomed totally out of control. The campaigns that prevented foot-and-mouth disease and contagious bovine pleuropneumonia from spreading through Chicago's Union Stockyards are just two of many such examples. Without rapid action, suppression would have been nearly impossible given the technology of the day. The histories of Australia, Argentina, and much of Europe, Africa, and Asia offer abundant testimony to the cost of inaction.
Salmon was not the prototypical Schumpeterian entrepreneur. His enterprise—the BAI—never turned a profit. The number of employees started at 23 (plus Salmon) in 1884, grew rapidly to over 200, but never rivaled the labor force of America's corporate giants. Salmon's 1884 salary of $3,000 was a mere pittance compared to the earnings of the captains of industry. Yet Salmon set in motion the institutional machinery that by the 1940s had saved hundreds of thousands of American lives and had eradicated seven major animal diseases from the United States: contagious bovine pleuropneumonia (1892), fowl plague (1929), foot-and-mouth disease (1929), glanders (1934), bovine tuberculosis (1941), dourine (1942), and Texas fever (1943). In addition, BAI scientists spearheaded the quest to understand and control scores of other diseases enzootic in the United States including scrapie in sheep (related to mad cow disease) and hog cholera. One of Salmon's first acts as chief of the BAI was to establish what is now regarded as the “first significant microbiological laboratory in the United States in 1884.”Footnote 19 Many of the BAI's scientific breakthroughs represented fundamental advances. Smith's work on vector-borne diseases and tuberculosis and the discovery of Salmonella head a long list. The agency's monitoring and quarantine network repeatedly blocked the entry of diseases into the United States.Footnote 20
As great as Salmon's scientific achievements were, his real genius was as a master political tactician. The science underlying the BAI's triumphs was available to animal health officials everywhere. But it is doubtful that any nation rivaled the United States in converting science into the public policy needed to control diseases. Advances in political economy accompanied scientific progress as officials learned how to build better institutions. Salmon and his successors repeatedly devised and fine-tuned incentive-compatible schemes that gained the cooperation of most farmers while overcoming entrenched special interest groups hostile to particular initiatives. He had to do this within the constraints imposed by the constitutional division of powers that initially restricted Salmon's authority to act without state cooperation—something that was not always forthcoming. He built the BAI into a powerful hierarchical administrative agency that could monitor the movement of diseases in the United States and overseas and ruthlessly and rapidly intervene with near dictatorial authority to nip infectious diseases in the bud. This concentration of power was dubbed the “one man principle.” He was the first to introduce the revolutionary strategy of regional (and later national) disease eradication that was used in the fights against contagious bovine pleuropneumonia, Texas fever, bovine tuberculosis, and other diseases.Footnote 21 These campaigns required a scale and complexity unprecedented in the history of human and animal health. The template that Salmon created would be copied around the world in campaigns against both animal and human diseases. As an example, researchers intent on eradicating smallpox credited the BAI's campaign against contagious bovine pleuropneumonia (CBPP) with establishing “the precedent and mechanisms” for “area-wide eradication programs….”Footnote 22
Baptism by Fire
A series of spectacular successes, starting with the BAI's first campaign against CBPP initiated in 1884, increased public favor and persuaded Congress to expand the budget and power of Salmon's enterprise. CBPP is insidious because infected cattle are highly contagious yet show few, if any, visible symptoms in the early stages of the disease, making it very difficult to control the pathogen's spread. CBPP had long plagued cattle in Europe and had spread to many New World lands. It became deeply entrenched in Australia following its introduction in 1858. In the United States, CBPP had gained a foothold in at least five states and the District of Columbia by 1884. In July 1884, less than two months after the BAI's founding, CBPP was discovered west of the Allegheny Mountains. Without immediate and forceful action it almost surely would have become enzootic because the intermingling of cattle on the open range would have made it nearly impossible to stamp out. This is what had happened in Australia.Footnote 23
BAI agents rushed to hot spots in several states to quarantine and destroy suspect cattle. Their efforts were often delayed by a lack of state cooperation. As the eradication campaign gained momentum, most states strengthened their own animal health bureaucracies and modified their constitutions and laws to facilitate federal-state cooperation. Most governors also signed documents that granted the BAI chief absolute power to declare quarantines within his state, slaughter infected and exposed animals, and inspect stockyards engaged in interstate commerce. The prospect of other states quarantining holdouts, along with pressure from the BAI, helped ensure rapid compliance. The traditional boundary of state versus federal rights was shifting rapidly.Footnote 24
On 26 September 1892 Secretary of Agriculture J. M. Rusk triumphantly declared the United States free of CBPP. With a total federal government expenditure of about $1.5 million (a sum roughly equal to the annual loss attributed to the disease and substantially less than what the cost would have been had CBPP spread), the United States became the first large nation extensively infected with CBPP to eradicate the disease.Footnote 25 This victory gave the BAI the confidence to undertake even greater challenges. The partnerships forged with state officials and the new legal structures enacted in most states gave the BAI vastly more power and a network of local allies.
Institutional Deepening
The institutions created to fight CBPP were inadequate to deal with all diseases. Tensions continued between the advocates of states' rights and supporters of a strong central power that could more effectively deal with asymmetric information problems and other market failures associated with infectious diseases. In order to foster broad-based support, BAI leaders learned to tailor control policies to meet the specific threats of different diseases. One size did not fit all, and it required considerable trial and error to develop policies that would work in the field. In addition, officials faced a moving target as knowledge and the disease environment changed.
There was a wide range of policy options, and the methods chosen typically depended on a disease's infectious characteristics and on the economic damage it might inflict. If rinderpest had entered the United States, the BAI would have quarantined infected areas and ruthlessly depopulated susceptible livestock and wildlife. CBPP and foot-and-mouth disease were also infectious enough to warrant the extensive slaughter of sick livestock and many seemingly healthy animals. Bovine tuberculosis was considered dangerous but less infectious so that a more targeted test-and-slaughter program could be implemented. Texas fever could be controlled by killing the ticks that transmitted the disease. Still other diseases could be controlled with vaccinations, inoculations, and in more modern times with antibiotics. Eradication campaigns would attack a disease in a large target area, such as a county, in order to reduce the possibility that cleansed farms would be reinfected.
These control decisions had to be coordinated with a variety of incentive and penalty schemes. Policymakers learned that farmer cooperation was essential and that paying compensation for destroyed animals encouraged farmers to reveal their diseased animals rather than hide, or worse, sell them to others. But officials also learned that paying too much compensation created a moral hazard by discouraging farmers from taking proper precautions and at times enticing them to actually infect their animals to qualify for government payments. Compensation programs became calibrated to the severity and ease of transmission of diseases. As a rule it was good policy to pay full compensation for highly infectious diseases such as foot-and-mouth disease and partial compensation for less infectious diseases such as bovine tuberculosis.
The perceived risk of a given disease spreading to other livestock and to humans also governed policies on disposal of the meat. In the pre-World War II era, the carcasses of animals with highly infectious diseases such as foot-and-mouth were buried or incinerated. By contrast, most of the meat from cattle with tuberculosis was allowed into the human food supply after the carcasses were trimmed of visible lesions. As incomes increased and bovine tuberculosis became less common, depopulation policies replaced individual test-and-slaughter programs, and governments banned the sale of all meat from infected livestock.
In the early phases of enzootic disease eradication programs, there was typically considerable opposition. Many farmers, especially those with infected stock, were not convinced that the diseases posed much of a danger, and others doubted that control was possible. Opposition required the BAI to build coalitions and sometimes offer more generous compensation. Initial successes quieted many skeptics. Eradication campaigns had a common political dynamic. Once a county or state had been cleansed, farmers in that area had a strong incentive to support aggressive policies that forced laggards to participate in order to prevent reinfection. So those who might have been ambivalent at first often became strong supporters as the eradication programs advanced. Resolving legal and organizational problems could take decades and often had to wait for corresponding advances in knowledge.
The campaign against Texas fever clearly highlights the interaction of science and policy. It also demonstrates Salmon's vision and resolve. In 1891 the USDA drew a roughly 2,000 mile quarantine line that restricted the movement of southern cattle. Gradually the line was pushed southward as areas were cleansed of the ticks that carried the disease. There was no precedent for an effort of this magnitude. Figure 1 shows the quarantine area in 1906.
The campaign against bovine tuberculosis illustrates the need for institutional learning. In 1910 federal agents uncovered a nefarious interstate trade in dairy stock infected with bovine tuberculosis. Several businessmen operating out of the dairy-shed northwest of Chicago discovered a profitable market niche. They purchased diseased and suspect animals from farmers in areas subject to state and local cleanup campaigns, doctored the cattle so they would not react to tuberculosis tests for months, and gave the animals bogus certificates of health. One prominent Chicago-area cattle dealer, James Dorsey, knowingly infected at least 10,000 dairy herds in the United States, Canada, and Mexico, exposing tens of thousands of families to tuberculosis.Footnote 26 State and local efforts to control bovine tuberculosis actually backfired by encouraging farmers to sell their animals. The tort system provided injured parties with few remedies due to high enforcement costs and the lack of sufficient resources to pay damages. The microscopic nature of the TB organism, its long incubation period, and the innumerable channels of infection made it almost impossible to document in court when and how an animal or person contracted the disease. For these reasons, it was more efficient to prevent damages via ex ante technological regulation than to compensate for damages ex post.
Interstate jurisdictional issues and protection from well-placed political cronies in Illinois hamstrung the efforts of state livestock sanitary officials to regulate the trade in tuberculous cattle. The BAI was also unable to act even though legislation dating back to 1884 made it illegal to knowingly move diseased animals across state lines.Footnote 27 The key problem was proving that Dorsey and his ilk had prior knowledge of the disease when they shipped the animals. The BAI also had to wait for legal and public acceptance of the validity of the tuberculin test and for general acceptance of the still highly controversial finding that bovine tuberculosis was actually a danger to humans rather than a blessing as many maintained. It therefore took more than three years for the BAI to shut down the illegal trafficking. Cleaning up the Chicago milkshed required a 22-month quarantine of five northern Illinois counties. As knowledge was gained, BAI officials endeavored to eradicate bovine tuberculosis by repeatedly testing all dairy cows and breeding stock and slaughtering the reactors. Between 1917 and 1962, the annual discounted benefits of the state-federal cooperative test-and-slaughter program to the livestock sector alone were approximately 12 times the annual costs (including the cost borne by farmers). Adding the effects on human health and lives would increase this estimate substantially—probably severalfold.Footnote 28 The returns on other animal health initiatives were also extremely favorable.
ASSESSING FOOD REGULATIONS
This account of early livestock disease control stands in sharp contrast to the literature on the origins of livestock and food regulations in the United States. In his analysis of the Long Drive, David Galenson maintained that the legislation passed by northern states to prohibit the entry of southern cattle was primarily motivated by the desire to limit competition. In a similar fashion, the pursuit of the public interest played little role in Gary Libecap's explanation of the origins of the Meat Inspection Act of 1891. He argued that the legislation represented a classic case of special interests within the meatpacking industry capturing the regulatory process to limit competition from the large and more efficient Chicago producers. Libecap asserted that there were no serious threats to public health.Footnote 29 Building on this interpretation, Edward Glaeser and Andrei Shleifer offered a synopsis of the broader literature on consumer protection regulation. “The list goes on, but the basic point remains: Progressive Era regulation was captured by industry, leaving consumer interests in the dustbin.”Footnote 30
This conclusion would indeed have made interesting reading for the hundreds of thousands of Americans who, without government regulations, surely would have suffered horribly before eventually perishing from bovine tuberculosis and a number of other zoonoses, including anthrax, brucellosis, and rabies. The livestock owners who saw the efficiency of their operations increase, and the consumers who were the ultimate beneficiaries of the production efficiencies, might also have found merit in Progressive Era food safety and animal health regulations. The findings that many meat processors were using meat infected with tuberculosis and infested with trichinae to make sausages should give ample cause to rethink the claim that the Meat Inspection Act of 1891 answered no genuine public health need. Most European countries had meat inspection acts, some going back decades, and many of these countries had recently strengthened their laws. Europe had no interindustry rivalries comparable to those in the United States. The widespread movement for stronger laws in the United States and elsewhere was primarily fueled by common forces that transcended national boundaries—new scientific information and greater public awareness. Justifiably, American consumers were profoundly concerned with food safety, and there is no doubt that Progressive Era legislation on food safety and on livestock and human disease control was a resounding success.
Libecap, Galenson, and others do point to a very real problem. It is standard practice for special interest groups to exploit health concerns to limit competition. How do we know if a particular health claim is merited or simply a subterfuge for protection? In recent years, we have seen just how passionately people care about meat safety. Were the concerns over mad cow disease blown out of proportion as the American cattle industry claimed? Were the Canadians warranted in excluding U.S. beef (as the United States once restricted imports of Canadian beef)? Was the South Korean prohibition on American beef imports a ruse to support protectionism? The demonstrations in Seoul suggest that many Korean consumers harbored serious health concerns. On the other side of the coin, why should one trust producers or producing nations to honestly report animal diseases? In 2006 Chinese authorities hid the existence of a new and highly infectious swine disease—dubbed “blue ear”—much as they had done with SARS in 2002 and 2003. By 2008 blue ear had killed millions of pigs and had spread to neighboring countries. The global pork industry is at risk, and scientists from around the world are scrambling to prevent the infection from spreading.Footnote 31 The recent spate of melamine poisonings from tainted milk and pet food adds to the case for international monitoring.
When the Meat Inspection Act was passed in 1891, similar issues had long been boiling. But in this era it was often Americans who were accused of denying the existence of serious diseases. European and Canadian governments sent representatives to investigate disease outbreaks and meatpacking and inspection procedures in the United States. Numerous nations banned American imports due to the threat of swine fever, CBPP, and foot-and-mouth disease. However, by far the most serious international controversy dealt with trichinosis. This story brings to light a significant failure of the American infrastructure to combat livestock disease and protect human health.
TRICHINOSIS AND INSTITUTIONAL FAILURE
Trichinosis is a parasitic zoonosis that affects humans and numerous other mammals. Humans usually acquire the disease by consuming raw or undercooked pork. Trichinosis is no longer a serious threat in the United States. Between 1997 and 2001, the Centers for Disease Control reported an average of only 12 cases a year with no deaths.Footnote 32
Scientific advances brought trichinosis to the forefront in the nineteenth century. In 1835 London medical student James Paget first observed trichinae larvae in human tissue. Trichinosis was added to the list of human diseases in 1860 when the German physician Friedrich von Zenker discovered trichinae larvae in the tissue of a woman who had died from what had been diagnosed as typhoid fever. Europe and the United States suffered numerous frightful epidemics. Germany was the hardest hit due to the national penchant for consuming raw and undercooked pork. One of the worst episodes occurred in Hedersleben in 1865 when 337 people became ill and 101 died. As with Dr. Zenker's patient, most victims were initially misdiagnosed—often with typhoid fever, cholera, or influenza.Footnote 33 Trichinosis exhibits a wide range of symptoms and mimics scores of other diseases. It was rarely properly diagnosed.
International Condemnation and American Policy
Trichinosis played a prominent role in the origins of food safety regulation in the United States and Europe, in the creation of the Bureau of Animal Industry, and in the prolonged diplomatic squabbles over the safety of American meat exports. As early as 1863, some German localities passed regulations requiring microscopic inspection of pork at the time of slaughter, and in 1875 Prussia passed the first of a series of pork inspection laws that made inspection compulsory. Mandatory inspection spread to many other European nations.Footnote 34
In 1879 an international pork trade war ignited, and by the end of 1880 most European nations restricted or prohibited American imports. This was not a trivial matter, because hog products accounted for about 10 percent of all American exports—exceeded in value only by breadstuffs and cotton. The European embargos hit hard. French imports of salt pork products from the United States fell from over 70 million pounds in 1880 to about 460,000 pounds in 1882. Exports to Germany fell from about 43 million pounds in 1881 to around 4.5 million pounds in 1882.Footnote 35 Britain, which was by far the largest importer of U.S. pork, lifted its ban after a few months, but most continental countries maintained their embargos.
Porkophobia gripped Europe and America. Reports that American pork was crawling with trichinae and had caused outbreaks of the disease across Europe gave protectionists cover for embargoing American pork. Exposés published in Germany reported that the infection rates of American pork were frequently 100 times that of the European product. Germans were by no means united in supporting the condemnation of American pork. Many politicians and leading scientists, including Rudolf Virchow, sided with the American position.Footnote 36
The BAI devoted considerable effort to documenting the extent of the disease in American swine and to defending American interests. In 1885 Salmon reported that only 2.1 percent of the nearly 300,000 microscopic examinations of American pork found trichinae. However, given that one contaminated hog could conceivably infect hundreds of people this was not a particularly sanguine finding. Nevertheless, American officials maintained that the U.S. infection rate was lower than in western Canada, and compared favorably with many areas of Europe. Salmon rejected the data on European swine infection rates, charging that many Prussian inspectors were “utterly incompetent” and that their equipment and methodology were often flawed. “An examination in 1877 showed that many of the microscopes were useless, that the glasses used were too dirty to permit the examination, and that some of the inspectors were incapable of detecting the parasite.” By contrast, Salmon proclaimed that America's well-trained microscopists “would not overlook a single case….”Footnote 37
From the beginning of the embargoes, many exporters clamored for a U.S. government microscopic inspection program, but most meat packers opposed federal inspection. By the end of the 1880s, the continued loss of foreign markets intensified the calls for inspection. Inspection advocates understood that anything less than a rigorous and honest inspection system would not have credibility in Europe. In March 1891 Congress finally adopted legislation requiring microscopic inspection of pork destined for export—American consumers were not afforded this protection. Inspection, which began in June 1891 in major Chicago plants, was witnessed by representatives of several countries sent to certify the procedures. Many Prussian localities refused to honor American inspection certificates and insisted on reinspecting the meat locally. American officials complained that reinspections represented a serious restraint on trade, but the German Imperial Government responded that localities had the right to impose health standards.Footnote 38 In 1906 the United States did an about-face and abolished the microscopic inspection of pork destined for export. The BAI's studies showed that too many infected animals slipped through the inspections and that in some cases the parasite was only discovered after twenty or more samples were examined.
The BAI's conclusion was reinforced by the problems Germany continued to have in spite of its own inspection system. German microscopists evidently missed many infected animals because a high percentage of trichinosis victims had eaten meat previously certified by German inspectors. One of America's most accomplished zoologists, Charles Wardell Stiles, was posted to Germany in 1897 and spent two years investigating the “alleged presence of trichinae in inspected American meats” and trichinosis outbreaks in Germany and neighboring countries. He deduced that 53 percent of all human trichinosis cases and 41 percent of the deaths in Germany between 1881 and 1898 “appear to have been due to faults in the German inspection system.” Given the relatively small number of trichinosis cases reported in the United States, Stiles argued that universal inspection was not economically justified. “Our methods of curing and cooking” were superior to spending 3 to 4 million dollars a year on a system that gives consumers a false sense of security and perpetuates “that exceedingly unhygienic German custom of eating raw or rare pork.” For Stiles, American pork was far safer than German pork. In fact, he reckoned that Germany could significantly lower the incidence of human infections by consuming American imports instead of domestic pork.Footnote 39
Most American historians have sided with Salmon and his lieutenants, interpreting the embargos as thinly veiled protectionist measures. Protectionism surely played a role, but so did the fear of disease. Otherwise, European countries could have imposed tariffs on pork and avoided the political hassles that the embargoes created. In addition, the costly systems of microscopic inspection—Prussia alone employed an average of 25,000 pork inspectors a year around 1890—clearly point to a very real health concern.Footnote 40
Reassessing the Trichinosis Problem
The historical accounts of the pork trade war have missed much of the story. Whatever the motives of those railing against U.S. imports, there is little doubt that American pork products were far more dangerous than American representatives claimed. Just as it is helpful to have a knowledge of the past to interpret the present, it helps to have a sense of what happened after an event to give it perspective. Science did not stand still.
Most noteworthy were the United States National Institute of Health investigations conducted by Willard H. Wright and others in the 1930s and 1940s.Footnote 41 Extrapolation from 11,931 postmortem examinations indicated that circa 1940 roughly one out of six Americans had been infected with trichinae, and that there were roughly 1 to 2 million new infections every year.Footnote 42 Wright's own analysis, based on 5,313 postmortems conducted between 1936 and 1941 in 37 states and the District of Columbia, found that roughly 45 percent of the infected corpses harbored live larvae, suggesting relatively recent exposures—probably within the previous five to ten years. Wright's findings were consistent with many suppositions about expected exposure patterns. The infection rate increased with age because older individuals had more opportunity to ingest infected pork. Persons of German and Italian extraction were nearly twice as likely to harbor trichinae as the overall population, whereas only 2.1 percent of the Jewish population in the sample was infected.Footnote 43
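As a rough check on these orders of magnitude, the figures used in the Appendix (an age-adjusted live-larvae prevalence of 7.27 percent, a population of 132 million, and an assumed larval lifespan of five to ten years) imply an annual flow of new infections of roughly 1 to 2 million. The following sketch simply restates that arithmetic; it is an illustrative cross-check, not a reconstruction of how the NIH investigators themselves derived the figure.

```python
# Back-of-the-envelope check on the "1 to 2 million new infections a year" figure.
# The prevalence and lifespan inputs are taken from the Appendix below; they are
# that study's assumptions, not independent estimates.

population_1940 = 132_000_000      # approximate U.S. population circa 1940
live_prevalence = 0.0727           # age-adjusted share with live trichinae (Appendix)
larval_lifespans = (5, 10)         # assumed larval lifespan range, in years

stock_of_live_infections = live_prevalence * population_1940
for years in larval_lifespans:
    new_per_year = stock_of_live_infections / years
    print(f"{years}-year lifespan: ~{new_per_year:,.0f} new infections per year")
# Prints roughly 0.96 million (10-year lifespan) and 1.9 million (5-year lifespan),
# consistent with the 1-to-2-million range cited above.
```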
The high prevalence of trichinosis shocked the medical community, but the implications for human health are cloudy. The severity of the symptoms depends on myriad factors including the density of parasites in the victim's gut and tissue, the parts of the body infected, the victim's general health, and the immune system response to the parasite. There is no strict correlation between the intensity of infection and the clinical symptoms—immunological responses vary considerably. If even a relatively small number of larvae migrated to the brain, heart, or central nervous system, the outcome could be serious and even fatal. As few as five larvae per gram of tissue had been reported to cause death, and as many as 1,000 larvae per gram had been discovered in the tissue of individuals who had reportedly died from other causes.Footnote 44
Drawing on medical literature, it is possible to construct crude guidelines for translating the number and intensity of infections (as measured by the number of trichinae per gram of tissue) into a range of estimates of the number of clinical cases. (See the Appendix for estimates of the number of clinical cases in the United States.) While it had been assumed that circa 1940 about 300 to 500 people a year suffered bouts of trichinosis, a cautious reading of the NIH research suggests that there were at least 40,000 clinical cases a year.Footnote 45 For every reported case of trichinosis, there were likely at least 80 clinical cases that either went unreported or were misdiagnosed as influenza, typhoid fever, food poisoning, rheumatism, or some other affliction. It was assumed that 20 to 30 people a year died from trichinosis, but the NIH studies implied that the true toll may have been at least ten times that number. In fact, one of the NIH's most prominent researchers, Maurice C. Hall, reached this very conclusion in 1937. Hall later speculated that the incidence of human infections had probably fallen since the 1880s because of the decline in the previously “widespread practice of feeding offal from slaughtered hogs to other hogs.”Footnote 46 If Hall was correct, every year tens of thousands of Americans suffered clinical cases of trichinosis from the late nineteenth century to 1940.
Agricultural Policy and Trichinosis
Estimates for 1966–1970 suggest that about 4.2 percent of the U.S. population harbored trichinae—one-quarter the level of 1936–1941. Nevertheless, the new findings still indicated that there were approximately 150,000 to 300,000 new infections a year.Footnote 47
The decline in the incidence of trichinosis was due primarily to policies that attacked the disease in swine and had very little to do with a concern for human illness. Although modern veterinary opinion is divided, Salmon and other American leaders were probably correct in opposing microscopic inspection of pork. The process was too costly and too unreliable to warrant adoption. The real lapse was the failure to clean up the swine population by changing farming practices. By the 1880s researchers showed that garbage-fed swine had significantly higher infection rates than animals raised on pasture and grain. The decrease in the prevalence of trichinosis in American hogs before the 1950s was due primarily to the gradual decline in the practice of feeding swine offal containing raw hog scraps, which in turn was largely due to the changing location and economics of hog-raising.Footnote 48
Public health leaders had long advocated restrictions on feeding uncooked garbage to swine, but they were stymied by farmer and packer opposition. In 1952 Brock Chisholm, Director General of the World Health Organization, singled out the United States for its lax trichinosis policies. “There is no other well-developed country which allows trichinosis among its hogs as this country does, affecting the health literally of millions of people all the time.”Footnote 49 Coincidentally, U.S. policy began to change at this time. In 1952 a serious viral disease, vesicular exanthema, infected swine across the nation. Unlike trichinosis, this represented a serious financial threat to hog farmers. But like trichinosis, the new disease spread via infected garbage. Farmers now demanded government controls. In 1953 states began requiring that garbage be cooked, a practice long mandated in Canada and much of Europe. By 1957 all but one state had adopted mandatory cooking laws. The USDA also limited the interstate trade of garbage-fed swine products. In 1962 a national campaign to eradicate hog cholera increased the emphasis on cooking garbage. Between 1955 and 1965, the number of swine marketed that had been fed raw garbage fell from 374,000 to about 11,000 animals. At the same time, the prevalence of larvae in pork sausage plummeted.Footnote 50
The new evidence on the prevalence of trichinosis in swine and humans, along with recognition of the deficiencies of early testing procedures, casts a new light on the European-American pork trade wars. The embargoes were indeed justified on health grounds. Based on a comparison of autopsies, Hall concluded that, even allowing for significant measurement errors, Americans were several times more likely to be infected than Europeans; indeed, Americans were twice as likely to be infected as Germans, who more frequently consumed raw pork. Furthermore, Americans of German ancestry were far more likely to be infected than their cousins in Europe. Hall's disdain for those who had touted the safety of U.S. pork was palpable: “Fifty years ago competent authorities concluded … that trichinosis was a minor public health problem in the United States, and this fiction, once established, had maintained itself … for a half century … . [t]here was an export trade involved, and … a desire to make facts fit the needs of that export trade played a role in the minds of observers who did not lack the mental qualifications to draw sound conclusions from adequate data, provided that they had an unprejudiced status in the matter.”Footnote 51
Other research buttresses Hall's conclusion. Almost all American pork exports were salted or cured. For decades American officials had assured the world that these processes killed the trichinae. But in 1920 USDA scientists noted that these guarantees had “no apparent basis except the opinions of the writers that made them,” and that several common curing methods yielded meat infested with dangerous levels of live trichinae.Footnote 52 Curing processes are extremely sensitive to several variables and seemingly small variations could mean total failure in destroying trichinae. As late as 1971 Zimmermann questioned the safety of American hams because “few ham processors exactly follow … the proscribed methods.” He further noted that “salt alone did not destroy trichinae in hams through a curing process of 40 days,” and even with very high salt concentrations “the results were sporadic.”Footnote 53
The BAI's handling of trichinosis represents a glaring blemish on the agency's otherwise sterling record of building institutions to control livestock diseases and protect human health. The delays in attacking the problem stemmed in good part from the opposition of farmers and meatpackers to regulation. In many other instances, the BAI's leaders fought toe-to-toe with special interests to protect public health, but not in the case of trichinosis. Once the USDA got serious about cleaning up swine feeding practices because of the threat of other swine diseases, trichinosis infections in swine and humans plummeted.
CONCLUSION
Future research will expand on many of the issues raised in this article. The control of animal diseases was vital for improving agricultural productivity. There has been an important synergy between veterinary and human medicine, and in the nineteenth and early twentieth centuries advances in veterinary medicine preceded those in human medicine. Controlling livestock diseases had enormous spillovers for human health. The United States was a world leader in building the institutions for livestock disease control—largely due to the vision and organizational skills of Daniel E. Salmon. The vehicle for Salmon's success was the BAI. Under his leadership, it grew into the strong central institution needed to overcome many of the market failures and free rider problems associated with infectious diseases. The BAI and the broader institutional structure that it helped create, including parallel state institutions, had to evolve in order to address many new and difficult challenges. The BAI excelled both in creating new knowledge and in transferring scientific findings into effective public policy.
An analysis of the costs and benefits of livestock disease regulations and policies, and a fresh look at the problems of food safety more generally, suggest that Progressive Era animal health regulations were a spectacular success. They greatly increased the efficiency of farm operations, promoted the international trade in livestock and livestock products, and were responsible for saving hundreds of thousands of American lives by the World War II era. As with the study of crop pests, the choice of a reasonable counterfactual world is necessary to the analysis of livestock diseases. Without the BAI, many livestock diseases would have spread out of control. Comparing the United States with Western Europe, Australia, Argentina, and other countries that harbored some of the same diseases and had access to the same science offers a crude basis for evaluating the BAI's record. In most cases, the United States was well ahead of the curve—with the utter failure to control trichinosis being a key exception.
Appendix
The Appendix first discusses the correlation between the intensity of trichinae infections, as measured in terms of larvae per gram of tissue, and the severity of the disease experienced. It next estimates the number of new trichinosis infections around 1940 utilizing data found in the NIH studies.Footnote 54 In order to extrapolate the NIH results to the entire U.S. population, it is necessary to account for differences in the age profiles of the study population and the U.S. population. The age-adjusting procedure is explained in detail in the second section of this Appendix.
The Intensity of Infections and Clinical Symptoms
The clinical course of trichinosis is quite varied and the severity of the disease cannot be classified based solely on the intensity of the infection. Nevertheless, the literature does offer rough guidelines that allow for the creation of a range of estimates linking the intensity of infection with clinical symptoms. Wright noted “that infections of 51 to 100 larvae per gram [of sample tissue] are capable of causing severe illness and it is highly probable that infections of 11–50 larvae per gram may cause pronounced symptoms.”Footnote 55 Wright was uncertain about the clinical significance of infections with fewer than 11 larvae per gram. Others have suggested different correspondences of larvae counts and symptoms. In their classic treatment of parasitic diseases, Franklin Neva and Harold Brown noted that infections with 1–10 larvae tend to be “mild or moderate,” those with 10–100 “can vary from moderate to severe, while infections with 100 or more larvae per gram are likely to be very severe or even fatal.” Lynne Shore Garcia suggests a less ominous picture—infections with 11–49 larvae per gram were apt to cause “mild” symptoms, those with 50–500 might generate “moderate” symptoms, and those above 500 larvae were likely to lead to “severe” cases. But Garcia's description of “mild” symptoms closely mimicked a prolonged case of influenza with fever, vomiting, headaches, muscle and joint pain, and many other symptoms lasting for as long as two months. This was a “mild” disease only in relation to more serious cases.Footnote 56

[Appendix Table 1: Estimated annual number of clinical trichinosis cases, circa 1940. Sources: Derived from Appendix text and Appendix Table 2.]
Appendix Table 1 reports a range of estimates for the annual number of clinical cases circa 1940. The estimates in the first column assume that all individuals with more than 10 larvae per gram of tissue experienced clinical symptoms (and that no one with 10 or fewer larvae experienced any symptoms). The estimates in the second column, following Garcia's lead, assume a demarcation line of 50 larvae per gram. The assumed lifespan of trichinae larvae varies across rows. The literature typically uses five and ten years. Given this range of assumptions, the autopsy data indicate that there were between 40,000 and 225,000 clinical cases per year.Footnote 57
The death rate of officially diagnosed trichinosis patients varied substantially, sometimes reaching 30 percent. It averaged about 6 percent of reported cases in the United States in the 1930s.Footnote 58 Assuming that only 6 percent of those with more than 500 larvae per gram of tissue died would still imply approximately 200 to 400 deaths per year—over 10 times the official numbers. Using a 50 larvae per gram benchmark yields an estimated 2,400 to 4,900 deaths a year.
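The 50-larvae-per-gram figure can be reproduced from numbers reported later in this Appendix: roughly 0.31 percent of the population with active infections above that threshold, a population of 132 million, a larval lifespan of five to ten years, and the 6 percent case-fatality rate cited above. A minimal sketch of that arithmetic follows; the comparable calculation for the 500-larvae benchmark would require concentration data not reproduced here.

```python
# Reconstruction of the death estimate at the 50-larvae-per-gram benchmark.
# The 0.31 percent active-infection share and the five-to-ten-year larval
# lifespan come from the age-adjusted calculations later in this Appendix;
# the 6 percent case-fatality rate is the 1930s average cited above.

population_1940 = 132_000_000
active_share_over_50 = 0.0031      # share with live infections above 50 larvae/gram
case_fatality_rate = 0.06

for lifespan_years in (5, 10):
    annual_cases = active_share_over_50 * population_1940 / lifespan_years
    annual_deaths = case_fatality_rate * annual_cases
    print(f"{lifespan_years}-year lifespan: ~{annual_deaths:,.0f} deaths per year")
# Yields roughly 2,500 (10-year lifespan) to 4,900 (5-year lifespan) deaths a year,
# in line with the 2,400 to 4,900 range reported in the text.
```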
Age-Adjusting Procedure
The age profile of Wright's sample population is older (median age about 50) than that of the U.S. population (median age 29). Because younger people are more likely to have milder infections and are more likely to have live larvae present, Wright's data need to be age-adjusted in order to extrapolate his results to the whole U.S. population. The adjustment is made using the following procedure. The percentage of the population that lies within each age group in table III of the 1940 U.S. Census of Population is calculated. The percentage of Wright's sample population within each age group is also calculated.Footnote 59 The ratio of the percentage from the census data divided by the percentage from Wright's data provides the adjustment factor for each age group. For example, the census reports that 8 percent of the population was in the 55 to 64 age group while 20 percent of Wright's sample was in this age category, yielding an adjustment factor of approximately 0.4 for this age group. Columns 1–3 of Appendix Table 2 report the relevant age distributions and adjustment factors. For each infection data series (columns 4–7), the number of positives in a given age group is multiplied by its adjustment factor. These are reported in columns 8–11. To obtain the age-adjusted percentage of the population, it is necessary to sum across all age groups and divide by the total sample size of 5,271.Footnote 60

[Appendix Table 2: Age distributions, adjustment factors, and age-adjusted infection counts. Sources: † U.S. Census Bureau, Census of Population, p. 10, table III; ‡ Wright, Jacobs, and Walton, “Studies on Trichinosis XVI,” p. 676, table 3; § Wright, Kerr, and Jacobs, “Studies on Trichinosis XV,” p. 1307, table 6.]
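The weighting arithmetic can be summarized in a short sketch. Only the 55-to-64 shares and the sample size of 5,271 come from the text; the remaining age brackets, shares, and positive counts are hypothetical placeholders standing in for the full data of Appendix Table 2, so the sketch illustrates the procedure rather than reproducing the published estimates.

```python
# Illustration of the age-adjustment procedure described above. For each age
# group, the adjustment factor is (census share of the population) divided by
# (share of Wright's sample); positives are reweighted by that factor, summed,
# and divided by the total sample size. Only the 55-64 shares come from the
# text; the other groups and all positive counts are hypothetical placeholders.

sample_size = 5_271

# age group: (census share, share of Wright's sample, positives in sample)
age_groups = {
    "55-64":    (0.08, 0.20, 250),  # shares from the text; count hypothetical
    "65+":      (0.07, 0.25, 320),  # hypothetical
    "under 55": (0.85, 0.55, 300),  # hypothetical
}

adjusted_positives = 0.0
for group, (census_share, sample_share, positives) in age_groups.items():
    factor = census_share / sample_share   # e.g., 0.08 / 0.20 = 0.4 for ages 55-64
    adjusted_positives += positives * factor

age_adjusted_rate = adjusted_positives / sample_size
print(f"age-adjusted infection rate: {age_adjusted_rate:.2%}")
```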
The next objective is to estimate the percentage of the population with both live larvae and total larvae concentrations per gram of tissue of more than 10 and more than 50 (Wright's and Garcia's benchmarks). Wright reported both the number with live infections and the various trichinae concentration levels by age group, but did not report the number of infections with live larvae by concentration level and age group together. This must be imputed by first determining the percentage of all infections with live larvae and multiplying this by the percentage with concentrations greater than 10 (or 50). Column 9 estimates that 7.27 percent of the U.S. population had live trichinae and column 8 estimates that 12.27 percent had live or dead larvae.Footnote 61
Dividing column 9 by column 8 suggests that about 59 percent of all infections contained live larvae. Multiplying the percent of the age-adjusted sample with concentrations greater than 10 (1.44 percent) and greater than 50 (0.52 percent) larvae per gram by 59 percent suggests that between 0.31 percent and 0.85 percent of the U.S. population had active clinical cases. To obtain the number of new clinical cases per year around 1940, multiply these estimates by the U.S. population (132 million) and divide by the estimated lifespan of trichinae larvae—between five and ten years.Footnote 62 The results are reported in Appendix Table 1.
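Pulling the pieces together, the following sketch reconstructs the arithmetic behind Appendix Table 1 using only the figures quoted in this Appendix (the 12.27 and 7.27 percent age-adjusted prevalences, the 1.44 and 0.52 percent concentration shares, the 132 million population, and the five-to-ten-year larval lifespan). It is a reconstruction of the calculation as described, not the original worksheet.

```python
# Reconstruction of the Appendix Table 1 arithmetic, using only the figures
# quoted in this Appendix.

population_1940 = 132_000_000
share_any_larvae  = 0.1227   # age-adjusted share with live or dead larvae (column 8)
share_live_larvae = 0.0727   # age-adjusted share with live larvae (column 9)
thresholds = {
    "> 10 larvae/gram (Wright)": 0.0144,   # age-adjusted share above the threshold
    "> 50 larvae/gram (Garcia)": 0.0052,
}

# Step 1: fraction of all infections that contained live larvae (about 59 percent).
live_fraction = share_live_larvae / share_any_larvae

for label, share_above in thresholds.items():
    # Step 2: share of the population with active, clinical-level infections.
    active_share = share_above * live_fraction
    # Step 3: convert the stock of active infections into an annual flow of new
    # clinical cases by dividing by the assumed larval lifespan (5 or 10 years).
    for lifespan_years in (5, 10):
        annual_cases = active_share * population_1940 / lifespan_years
        print(f"{label}, {lifespan_years}-year lifespan: ~{annual_cases:,.0f} cases/year")
# The four combinations span roughly 41,000 to 225,000 clinical cases a year,
# consistent with the 40,000-225,000 range reported from Appendix Table 1.
```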