Cambridge Companions are a series of authoritative guides, written by leading experts, offering lively, accessible introductions to major writers, artists, philosophers, topics, and periods.
When he assumed the presidency in January 1961, the forty-three-year-old John F. Kennedy and his glamorous wife Jacqueline transformed the White House into an exciting and inspiring set of images. Television, still a young medium, was for the first time in virtually every household in America. Three broadcast networks controlled what was seen on the national screen. The news appeared at dinnertime for a fifteen-minute period, and would soon expand to a half-hour. This was followed by “prime-time” entertainment consisting mainly of Westerns and family situation comedies such as Father Knows Best. Heavily censored, these shows provided Americans with an idealized reflection of themselves. Kennedy and his family brought to the news the same telegenic good looks, knowledge of Hollywood and the media, and innate sense of drama and high style found in the nation's entertainment. Novelist Norman Mailer had predicted before the election that with Kennedy in the White House the American frontier myth would “emerge once more, because America's politics would now be also America's favorite movie, America's first soap opera, America's best-seller.”
Two events which took place near the turn of the twentieth century are instructive places to begin to understand the dynamics of American religion: the World's Parliament of Religions in Chicago and the passage of the Eighteenth Amendment to the US Constitution. The former, which was held in conjunction with the World's Columbian Exposition in 1893, was the first time in US history that a conscious attempt was made to promote interreligious dialogue on a significant scale. Representatives of every conceivable tradition were invited by the organizers, who were themselves from the liberal wing of the already liberal Unitarian movement, and a wide variety of spokespeople took advantage of the opportunity to explain to the American public exactly what their traditions taught. The event was not without controversy: Roman Catholic bishops, for example, were divided over the wisdom of participating, although the forward-looking Archbishop John Ireland of Minnesota decided that the risks of seeming to relativize his church were worth the opportunity of gaining a sympathetic hearing. In addition to prompting a more general acknowledgment of religious pluralism in the nation, the Parliament gave the American public an opportunity to witness the diversity that already characterized the national religious scene, as well as a chance to learn about heretofore exotic traditions such as Hinduism, which now began for the first time to reach an audience beyond the minute number of ethnic South Asians then resident in the country.
About half of the nearly ten million African Americans living in 1900 had been born during the slavery period, and while slavery had not yet receded into the distant past, it seemed important to the former slaves and their descendants to stress the distance they had traveled from that past. Only forty years earlier, the overwhelming majority of black Americans - more than 85 percent - had belonged to and could be bought and sold by white owners, a deep-seated contradiction in one of the world's oldest democracies with a founding document that declared that “all men are created equal.” “Natally alienated” (to use Orlando Patterson's term), slaves were forced to perform unpaid labor, without any civil status that would guarantee them even such basic human rights as the right to marry, to raise their own children, or to learn how to read and write. Slavery was, and remained for a long time, a haunting and troubling memory, a scar of shame. Emancipation, which seemed like a rebirth from a state of social death, was indeed a “resurrection” from the tomb, as Frederick Douglass's famous slave narrative had represented his own transformation from the status of a slave to that of a self-freed man.
American women's lives changed in many crucial respects over the course of the twentieth century. In 1900, domesticity framed most women's lives; few obtained education after the age of fourteen. Yet while almost all white women left the formal labor force after marriage, many African American women remained economically active throughout their adult lives. The vast majority of women married by the age of twenty-two or twenty-three, and stayed with their partner until he died. Divorce was a rarity. The average woman had four children. She typically survived the departure from home of her youngest child by only a handful of years, so living alone was a rarity. Many women participated in social and political events outside the home, even though only a small number could vote, mostly in local elections and in some western states. Few women held public office, yet they worked effectively outside the political mainstream in various reform movements and voluntary activities.
By 2000, economic activity throughout adulthood had become the norm, with a shift in predominant occupations from domestic service and factory labor to paper-based employment in the professions and offices of the country. While domesticity still figured in the female experience, especially when children were young, few women remained outside the labor force entirely, and race and marital status had less effect on employment rates. Education levels skyrocketed for women as well as men from every racial and ethnic group. Almost all young women graduated from high school in 2000, and about half entered higher education.
Religion in the United States currently plays a very visible - and in some ways puzzling and disturbing - role in public life. In 2004, George W. Bush was re-elected President of the United States with strong support from evangelical Christians. His God-and-country rhetoric and support for government funding of faith communities signaled a worrisome alliance between political neoconservatives and evangelical Christianity and led to a blurring of boundaries between religion and government, despite an official legal separation of church and state. To critics, it looked, and looks, as if a national religion has been “institutionalized.”
Recent developments have spurred secular reaction. One of the clearest signs of the reaction was the lawsuit brought before the Supreme Court in 2004 by Michael Newdow, an atheist who charged that the phrase “one nation under God” in the Pledge of Allegiance, as recited in public schools, violated the separation of church and state and was therefore unconstitutional. Though the Court has not ruled on the substance of the case, the suit has spawned considerable controversy, touching on a sensitive and unresolved issue in American national identity.
Perhaps a million immigrants came to America between 1565 and 1800, about 20 million in the nineteenth century, and at least 55 million in the twentieth century. During the twentieth century, particularly after World War II, as American immigration laws and regulations became more complex, the phenomenon of illegal immigration became increasingly significant. The numbers above include some 10 million illegal twentieth-century immigrants.
Even these approximate numbers are, in a sense, illusory, as they seem to record a permanent move from one nation to another. Yet, from the earliest colonial times, many who came either returned or went somewhere else, and many of those came back again. Specialists estimate that perhaps one immigrant in three later left. Many of these, often called sojourners, always intended to return; but many who came as sojourners - usually to make money - actually stayed, while others, who came intending to remain, eventually left. Almost certainly the most reliable statistic about American immigration is the incidence of immigrants - that is, persons who were born somewhere else - in the total population.
What we know today as early modern philosophy was forged in the opening years of the seventeenth century, in the writings of such thinkers as Francis Bacon, Thomas Hobbes, and René Descartes. We think of this period as the beginning of modern philosophy in part because these philosophers saw themselves as the vanguard of an intellectual revolution, whose goal was to break with the philosophy of the past. Here they identified their most important target as Aristotle, whose teachings in logic and metaphysics had dominated educated opinion in Europe through most of the previous millennium. Almost all of the best-known philosophers and scientists of the seventeenth century saw Aristotle’s views as a significant impediment to the advance of knowledge, and believed that progress could only begin once the edifice of Aristotle’s system had been razed and philosophy could begin to rebuild on solid foundations. The metaphor of demolishing the old to make room for the new is familiar to students of philosophy from Descartes’s First Meditation, but the English philosopher Francis Bacon had employed it some twenty years before Descartes. In his New Organon (another allusion to Aristotle, whose logical works were known as the organon, or “instrument”), Bacon declares: “It is futile to expect a great advancement in the sciences from overlaying and implanting new things on the old; a new beginning has to be made from the lowest foundations, unless one is content to go round in circles for ever, with meagre, almost negligible, progress” (New Org., I.31).
Every year, on March 22, Riverside, Iowa, celebrates an event that has not yet happened and never will. It is the place and date designated for the birth of James Tiberius Kirk, Captain of the Starship Enterprise. America has so successfully colonized the future that it has mastered the art of prospective nostalgia. Its natural tense is the future perfect. It looks forward to a time when something will have happened. It is a place, too, where fact and fiction, myth and reality dance a curious gavotte. It is a society born out of its own imaginings.
There are those who believe they can remember alternative past lives. The science fiction writer Philip K. Dick claimed to remember a different present life. In his case it may have had something to do with amphetamines, but in fact we do inhabit different and parallel presents. The 1920s constituted the jazz age, except for those who tapped their feet to different rhythms. The 1960s were about drugs and rock and roll, except for the majority for whom they were not. Thoreau once wrote of wriggling his toes in the mud of Walden Pond in search of the rock beneath. The search for a secure foundation is understandable but cannot always be satisfied. Nineteenth-century American writers dealt in symbols for a reason. Unlike the metaphor, the symbol suggested a field of meaning, an ambiguity which in the end could perhaps more truthfully capture a world in flux - desperate for clear definitions yet aware that, in a society wedded to the idea of possibility, always coming into being and never fixed, stasis was a denial of meaning rather than a route to it.
When the French writer Simone de Beauvoir visited the United States in 1947, she was deeply saddened by the conformism that seemed so pervasive. “This country, once so passionate about individualism,” she later recalled, “had itself become a nation of sheep; repressing originality, both in itself and in others; rejecting criticism, measuring value by success, it left open no road to freedom except that of anarchic revolt; this explains the corruption of its youth, their refuge in drug-taking and their imbecile outbreaks of violence.” Beauvoir conceded that some books and films pointed to political resistance; and “a few literary magazines, a few almost secret political newsletters” also “dared oppose public opinion.” But such artifacts could gain little traction against “the anti-communist fanaticism of the Americans” which “had never been more virulent. Purges, trials, inquisitions, witch-hunts - the very principles of democracy had been rejected.” The air that she had breathed in the United States had become “polluted.”
No major advanced industrial nation has suffered less or profited more from its twentieth-century wars than the United States. Nor has any nation dispatched its troops to as many places across the globe in the late twentieth century to defend and extend its national interest. At the end of the nineteenth century, the United States possessed one of the smallest armies in the industrial world; a century later its armed forces spanned the globe, bristling with deadly hardware and sophisticated technology, a military power without peer. To a large extent, this remarkable transformation had resulted from participation in two European wars, which had necessitated a reorganization of society and the establishment of new controls over its citizens.
The Spanish-American and Philippine-American Wars
By the 1890s, many influential Americans believed their economy required access to foreign markets to avoid future depressions. Incorporating this notion into a broader ideological framework, influential policymakers sought to establish an indirect control of large areas of the Caribbean and the Pacific. These ideas, informed by notions of racial hierarchy and articulated through a gendered vocabulary, provided the larger context for the war of 1898, as two presidents faced a growing Cuban insurrection against Spanish rule.
At the beginning of the twentieth century, the American theatre was for the most part a medium of mass entertainment. In the cities, the theatre meant popular melodrama in enormous theatres like the Bowery in New York as well as the likes of Sarah Bernhardt from France touring in plays by Rostand and Racine and Ellen Terry and Henry Irving from England playing in Shakespeare and Shaw. American stars E. H. Sothern, Julia Marlowe, and Richard Mansfield acted in Shakespeare, and Ethel, Lionel, and John Barrymore starred in contemporary plays by American and English writers. The theatre also meant numerous American companies touring in old American standards like the ubiquitous Uncle Tom's Cabin and James O'Neill's thirty-year vehicle, Monte Cristo. Increasingly risqué revues like the Ziegfeld Follies played alongside minstrel shows and the wholesome family entertainment of vaudeville. To the early twentieth-century public, the theatre included burlesque, circus, and “extravaganza,” as well as the Yiddish theatre, the settlement house theatre, and the puppet theatre.
What knowledge consists in, how it is to be secured, the means by which discoveries are to be made, and the means by which purported knowledge is to be legitimated or confirmed were all questions disputed intensely in the course of the sixteenth and seventeenth centuries. These disputes were partly the outcome of developments in natural philosophy, but in some cases they lay partly at the source of these developments. They began, in the early sixteenth century, with reflection on Aristotle’s doctrine of method and scientific explanation, but soon turned into increasingly radical revisions to this doctrine. By the beginning of the seventeenth century, they took the form of a search for a wholly new approach, with several different, novel methodological models being advocated. The search for a satisfactory method is not a wholly linear development, however, and two sets of factors serve to overdetermine what is already quite a complex issue. The first turns on the fact that questions of method have direct connections not only to substantive developments in natural philosophy itself, but also to the relation between natural philosophy and the other disciplines (most notably metaphysics and theology), as well as to the question of what kinds of skills and virtues the practitioner of natural philosophy requires. Secondly, questions about the appropriate method for scientific disciplines become translated into questions about the legitimation of the scientific enterprise as a whole.
In the nineteenth century, poets like Whitman and Dickinson seemed to thrive on the impulse to push at boundaries and to seek out new idioms for an American vernacular poetics. In reaction to the weary genteel romanticism of much poetry at the turn of the twentieth century, this transgressive impulse became more pronounced with the innovations generated by modernist poets such as William Carlos Williams, T. S. Eliot, and especially Ezra Pound, whose mantra “make it new” encapsulates this energetic thrust. While modern American poetry is indeed a broad, disparate field, embracing a range of practices and styles, no study of American poetry in the twentieth century can legitimately ignore the signal contributions of the modernists T. S. Eliot, Ezra Pound, William Carlos Williams, Robert Frost, and Wallace Stevens. Many of the driving formulations and elaborations of contemporary poetics derive from Pound's intervention in what he saw as the dilapidated and dead-end poetics of late-nineteenth-century romanticism.
Historians commonly date the beginning of early modern epistemology and metaphysics from Descartes’s attempt in the Meditations to find a foundation for knowledge that is immune to skeptical challenge for an individual self-critical mind. There is no comparable consensus about when early modern ethical philosophy begins, but, as J. B. Schneewind has argued, it makes sense to link it similarly to an engagement with forms of ethical skepticism in the writings of Montaigne in the late sixteenth century and Hugo Grotius in the early seventeenth. If one were to seek a parallel canonical moment, one might do no better than a passage in Grotius’s On the Law of War and Peace (1625), in which Grotius puts into the mouth of the ancient skeptic Carneades the challenge that “[T]here is no law of nature, because all creatures … are impelled by nature towards ends advantageous to themselves … [C]onsequently, there is no justice, or if such there be, it is supreme folly, since one does violence to his own interests if he consults the advantage of others.”
To appreciate the force of this challenge, we must know what Grotius and his contemporaries would have understood by a “law of nature.” Natural laws (of the normative or ethical sort) were thought of as universal norms that impose obligations on anyone who is capable of following them - on all moral agents, rather than on citizens of a more specific jurisdiction. And, unlike positive law, they were thought to require no positing, no legislative act, at least no human one.