This is being written from Bielefeld, Germany, where I am a guest professor this year. At the moment (right before Christmas) speculation abounds that Foreign Minister Guido Westerwelle, leader of the liberal Free Democratic Party, will be driven out by his own party. There are many reasons for this, although unwelcome revelations via Wikileaks did not help, especially when it turned out that U.S. diplomats had access to confidential German coalition discussions via one of Westerwelle's own assistants in the FDP headquarters. As with most of the documents appearing on Wikileaks, publication right now—as opposed to decades from now in Foreign Relations of the United States—causes problems, but it is hard to fault U.S. diplomats for gathering information on such matters and discussing them candidly.
Eventually, the news and tensions and rumors set off by Wikileaks will die down, along with the melodrama surrounding that organization's predictably unpleasant leader. The main lessons will surely relate to the security of candid discussion in the digital age. But there are also large lessons to consider regarding the role of the United States in the world. Most optimistically, U.S. diplomats are clearly effective at their basic responsibility of figuring out and reporting what is happening in the countries where they are stationed. Wikileaks caused turmoil in numerous countries because the published cables contain many accurate observations and unwelcome truths. On the other side of the ledger, the revelation of thousands of confidential U.S. diplomatic discussions caused such widespread damage because the United States maintains a presence almost everywhere. Wikileaks underscored the point by publishing a State Department list of strategically valuable sites around the world. It would be a challenge to find a 500 × 500 mile square on any continent other than Antarctica that someone official in the United States has not identified as vital to national security for this reason or that. One does not have to be a neo-isolationist nostalgic for “no entangling alliances” to worry that such globalism is dysfunctional for the country and for everyone it deals with.
Amanda McVety's essay takes one back to when none of the above was true. At the time of Robert Skinner's 1903 mission to Ethiopia, American officials still had only the sketchiest information about vast areas of the globe, and it took an active imagination to argue that the United States had a concrete security interest in most places outside the Americas and the Pacific. Ethiopia, the article explains, could readily become a screen (at least in American minds) onto which different Americans, from President Theodore Roosevelt on down, projected hopes, fancies, and enthusiasms. Tracing the story forward to the Italian invasion of 1935, McVety recalls how difficult it was for Americans to become more specific when an accurate picture began to matter. To be sure, the pervasiveness of the United States originated in the crises of the 1930s, World War II, and the international situation that catastrophe left behind. All the places around me right now and the family histories of everyone I know here underscore that reality.
The detached relation of most Americans to even traumatic upheaval overseas forms the backdrop to Lynn Dumenil's article. Women's groups in Los Angeles well understood that World War I was real and miserable, but it was still a distant event. A vast variety of women in the southern California city sought to help the war effort in the best ways that they could imagine or that authorities thought to request of them. Yet the women's groups remained in Los Angeles and very much aware of it. Dumenil documents the extent to which women mingled their efforts to aid troops overseas with existing agendas in and for their home city.
War grows distant in time as well as space, except for the veterans who left part of themselves there and for their families. Russell Johnson demonstrates that even in the war that came closest to most Americans, the Civil War, persistence in inherited practices combined with assumptions about how different social classes should live in peacetime to override any real imagination of the human damage the war itself had done. The pensions granted to the wounded veterans Johnson traces were paltry, even after supposedly extravagant reforms, both in comparison to the experiences they went through during the war and to the situations they faced in their postwar lives. Johnson's data cast doubt upon the current historical argument that the Civil War pension system created a practical and ideological foundation for later welfare practices. Well, maybe it did, and that's a problem.
The remarks above and the three articles that prompted them all assume that people can derive meaningful knowledge and skills from learning about and reflecting upon history. What if that proposition is true but cannot be documented by any systematic method available to social-science researchers? Thomas Fallace explains that by the early 1900s, perceptual, behavioral, and educational psychology had more or less demonstrated that all conventional explanations for the value of studying history had no empirical basis. When justifying required courses to freshmen, nearly all of us fall back on some loose version of the faculty psychology discredited a century ago. We insist with assurance that all students, including young people with no intention of studying history further, may benefit from the skills history encourages in research, the assessment of evidence, and the explanation of complex situations. That is, history develops judgment, a claim exposed as pseudoscience long before any of us were born, to the extent that such a moral, civic, and literary argument was ever meant to have a scientific basis.