Johannes Lenhard reports in the foreword to this excellent book that only two of its nine chapters had Ann Johnson’s direct input before she passed away in 2016, with the remainder requiring him to “reinvent and reexperience” their joint discussions from the past. Though I did not know her well, I had the pleasure of interacting with Ann Johnson at various philosophical events—reading groups and lectures—during my doctoral studies at the University of South Carolina. I am glad to report that Ann’s voice, and more importantly, her ideas and contributions to the history and philosophy of science, echo loudly throughout the pages of Cultures of Prediction.
As the title suggests, Lenhard and Johnson describe and analyze various cultures of prediction within the past four hundred years of the history of engineering. They define “cultures of prediction” as the practices of prediction making that developed alongside complementary ideas and tools in mathematics and technology. Thus, the book explores the coevolution of predictive practices with a variety of mathematical tools and other elements of science: epistemology, technology, and social organization. Through their intentionally interdisciplinary analysis (which integrates philosophy of science, history of science, and science and technology studies), they identify four primary cultures of prediction that developed over the course of the history of science and engineering.
The first two cultures of prediction are closely intertwined in the history of science: the rational culture and the empirical culture. The rational culture begins by assuming that there are mathematical laws of nature that “capture the world’s structure and determine predictions through mathematical analysis and derivation” (13). The empirical culture, by contrast and as the name suggests, begins with empirical data and observations, from which extrapolative predictions can be made. Johnson and Lenhard elucidate the two cultures through three episodes from the history of ballistics (ch. 2). Their key insight comes in the third episode, which concerns Benjamin Robins and Leonhard Euler. Robins’s project exemplified an empirical mode of prediction, relying upon the collection of new data from, among other things, numerous experiments, new measurement devices for air resistance, and the standardization of gunpowder. Euler’s approach, however, was to take the manuscript written by Robins and apply mathematical analysis and tools to better understand the true trajectory of a projectile. His approach simplified various empirical data for tractability, allowing predictions to be derived from equations. Of course, these two cultures are not completely distinct, and later episodes show how the empirical and rational modes of prediction hybridize (ch. 3), but the difference in approaches is evident: one focused on experimental data and measurements, the other on mathematical derivations.
The third culture of prediction they explore is the iterative-numerical culture, in which predictions are created through the iteration of simple algorithms. This culture emerged in large part because digital computers made practical iterative methods that had previously been too time-consuming to pursue. Johnson and Lenhard explore the development of this culture through the example of The Limits to Growth, a landmark study from the 1970s that used a computational algorithm to predict global collapse under continued growth of the economy, pollution, and population (ch. 5). The use of computational models represented a new stage of predictive culture in part because it allowed for predictions even in cases involving complex dependencies with no clear analytical solution. Within this culture, “predictions based on computer models become a sine qua non for predicting” (125). Notably, at this stage, limited access to computers constrained the iterative-numerical culture to a particular form of computer use, one in which the models were not routinely changed.
Widely available access to capable and networked computer systems provided the foundation for the final culture of prediction they discuss in the book: what they call the exploratory-iterative culture. In this mode of prediction, the predictive computational models themselves become the subject of the predictive practice. Scientists modify models by examining the relationship between various kinds of inputs and outputs, a practice that the wide availability of computer systems made feasible for the first time.
Johnson and Lenhard (along with Hans Hasse, the head of a thermodynamic engineering laboratory, who coauthored chapter 8) demonstrate how this approach is used in the context of thermodynamic engineering. When developing equations of state to explore thermodynamic behavior under various conditions (e.g., what will the pressure of this novel chemical be when held at 15 degrees Celsius, in this quantity, and in this container?), thermodynamic engineers use exploratory models with adjustable parameters. These models are used to create simulated data that can be compared to experimental data, allowing for a feedback loop in which the model parameters can be adjusted. Because they have sufficient computational power available in their laboratory, they are able to adjust parameters again and again without taking excessive time to yield results.
Of course, it is worth being clear—as Johnson and Lenhard rightly note in the conclusion (ch. 9) and, indeed, throughout the course of the whole book—that these cultures of prediction do not neatly sort out into pure instances of empirical, rational, iterative-numerical, or exploratory-iterative. In reality, historical cases (as their detailed explorations demonstrate) almost always involve elements from a range of cultures. Such is the case with the empirical and rational modes, but also with the iterative-numerical and exploratory-iterative modes.
Indeed, the hybridized, multidimensional analysis of the case studies in the book is among Johnson and Lenhard’s greatest achievements and stood out as my favorite part of the book. Rather than telling a neat and clean story about four distinct phases of the history of prediction, each with clear-cut and pure examples of the given culture of prediction, the authors lean into the messiness of the historical cases. The complexity of the cases is not a detriment to their four-part categorization of cultures of prediction—that categorization is not meant to provide hard and fast distinctions, but rather to provide scaffolding for understanding a variety of predictive practices. Their careful study of the cases from the history of science and engineering reveals the intricate means by which these cultures of prediction in science were developed contextually and sociologically, as an interplay between individuals, scientific practice, and the various mathematical and scientific tools available within that practice. In short, their case studies studiously demonstrate that prediction is not a given within scientific practice, but rather one of many laborious arts (like representation [Boesch 2022]) that scientists undertake in light of and because of the broader context of scientific practices.
It is also worth paying attention to the way that cultures of prediction are likely to continue to evolve, especially as various new kinds of computational tools and systems become available. In the conclusion (ch. 9), Johnson and Lenhard offer some preliminary insights into whether some new predictive culture is likely in the offing, given the rise of artificial neural networks. Several distinctive features of these systems give reason to believe that they are a computational tool that will lead to a new culture of prediction. One such feature is the opacity of the model and the way in which adjustments to parameters may be made. Depending on the complexity of the model developed by the artificial neural network, it may be intractable to human users, resulting in an opaque system (Humphreys 2004). Furthermore, artificial neural networks tend to produce models that depend entirely on a large set of parameters, rather than on broader theories. For these reasons, we have grounds to expect that a new culture of prediction may indeed emerge from the novel tools of machine learning. However, as the authors note, we must wonder whether such a culture would prioritize prediction over other goals of science, like explanation.
Whether or not some new culture of prediction emerges in the future—be it because of the development and greater use of artificial neural networks or some other novel mathematical method or tool—the work of Johnson and Lenhard will provide us with a road map for identifying and understanding that new culture and how it relates to the others they so helpfully identified in this excellent book.