
Phenomena complexity, disciplinary consensus, and experimental versus correlational research in psychological science

Published online by Cambridge University Press:  05 February 2024

Dean Keith Simonton*
Affiliation:
Department of Psychology, University of California, Davis, Davis, CA, USA [email protected] https://simonton.faculty.ucdavis.edu/
*Corresponding author.

Abstract

The target article ignores the crucial role of correlational methods in the behavioral and social sciences. Yet such methods are often mandated by the greater complexity of the phenomena investigated. This necessity is especially conspicuous in psychological research, where the discipline's position in the hierarchy of the sciences implies the need for both experimental and correlational investigations, each featuring distinct assets.

Type: Open Peer Commentary

Copyright: © The Author(s), 2024. Published by Cambridge University Press

Almaatouq et al. describe an innovative way to improve experimental research in the behavioral and social sciences. Yet one serious oversight in their proposed solution stands out: the complete omission of any discussion of correlational methods. Correlations are nowhere mentioned in the text, nor is there any reference to the common statistical procedures associated with correlational research, such as multiple regression, factor analysis, and structural equation models. Correlational research is especially common in various social sciences, such as sociology, cultural anthropology, political science, and economics, and such research plays a major role in psychological science as well. The last point was treated in a classic paper by Cronbach (1957) titled “The Two Disciplines of Scientific Psychology,” the two disciplines being experimental and correlational (see also Tracy, Robins, & Sherman, 2009). This bifurcation dates back to the earliest years of psychological research: Whereas Wilhelm Wundt founded experimental psychology, Francis Galton initiated correlational psychology, both in the latter half of the nineteenth century. But why do behavioral and social scientists adopt correlational methods when everybody knows that experimental methods are superior at making causal inferences? After all, “correlation can't prove causation” has become a proverb in research methods courses.

Ironically, Almaatouq et al. themselves provide a partial answer when they note that “Social and behavioral phenomena exhibit higher ‘causal density’ (or what Meehl called the ‘crud factor’) than physical phenomena, such that the number of potential causes of variation in any outcome is much larger than in physics and the interactions among these causes are often consequential” (target article, sect. 2.1, para. 2). In other words, physical phenomena are less complex than behavioral and social phenomena. To provide an illustration, when Newton formulated his universal law of gravity with respect to two bodies, he needed only three independent variables: the mass of each body and the distance between the body centers (the formula is written out below). With that key formula he could accurately predict both the trajectories of projectiles and the orbits of the known planets. In contrast, imagine what would be necessary to account for the romantic attraction between two human bodies. Easily dozens of variables would be required – far more than the 20 in the target article's title. These would include numerous demographic variables, personality traits, situational factors, and various determinants of physical attractiveness. Moreover, these variables would have to be combined in such a way as to allow for unrequited love, when one body is attracted but the other repelled, a situation that has no counterpart in the physical world. That much given, it is extremely doubtful that even the most complicated equation would ever predict romantic attraction as precisely and universally as Newton's gravitational formula. Many intangible factors, such as interpersonal “chemistry,” would necessarily be left out.
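For reference, the Newtonian formula at issue contains exactly those three variables, together with the universal gravitational constant $G$:

$$F = G\,\frac{m_1 m_2}{r^2}$$

where $m_1$ and $m_2$ are the two masses and $r$ is the distance between their centers. Nothing remotely this compact is available for predicting the attraction between two persons.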

To be sure, phenomena complexity is by no means the only reason why researchers adopt correlational rather than experimental methods. Experimenters often face severe practical and ethical limitations that undermine their capacity to address certain significant questions. Variable manipulation and random assignment to treatment and control conditions are frequently rendered impossible except under draconian circumstances (as in Nazi death camps). Nevertheless, in this commentary I would like to focus on the complexity issue because it relates most closely to the rationale for the integrative experiment design advocated by Almaatouq et al. (for other important implications, see Sanbonmatsu, Cooley, & Butner, 2021; Sanbonmatsu & Johnston, 2019).

The philosopher Auguste Comte (1839–1842/1855) was the first to suggest that the empirical sciences – those that deal with concrete subject matter (unlike abstract mathematics) – can be arrayed into a hierarchy. One of the criteria he used to determine a discipline's ordinal placement was the complexity of the phenomena under investigation. This complexity helped explain why certain sciences, such as astronomy, were able to emerge and mature before other sciences, such as biology. Because the disciplines were not defined in the same way in Comte's time, the social and behavioral sciences being largely absent, his scheme has undergone some modifications to make it more consistent with modern disciplinary categories (Cole, 1983). This transformation then supported empirical research on whether Comte's hierarchy of the sciences could claim any validity (Benjafield, 2020; Fanelli, 2010; Fanelli & Glänzel, 2013; Simonton, 2004, 2015; Smith, Best, Stubbs, Johnston, & Archibald, 2000). The hierarchy has been validated using multiple indicators, almost entirely objective but also including subjective ratings of the relative “hardness” of disciplines. The following results are representative (Simonton, 2015):

$$\mathrm{Physics} > \mathrm{Chemistry} \gg \mathrm{Biology} > \mathrm{Psychology} \ggg \mathrm{Sociology}$$

Here the physical sciences come first, followed by biology and psychology, and then sociology. The number of “>” symbols indicates the degree of separation, for the hierarchy is quantitative, not just ordinal. Thus, physics and chemistry are very close, and so are biology and psychology, but biology is more distant from chemistry and sociology is even more distant from psychology.

It is noteworthy that this hierarchy corresponds strongly with disciplinary consensus about what constitute the key findings in a field (Simonton, 2015), the very problem that Almaatouq et al. were trying to solve with their integrative experiment design. Yet it is evident that the hierarchy also aligns inversely with the relative prominence of experimental versus correlational methods, with psychology appropriately placed near the middle. However, that should not lead us to belittle correlational research as inferior. In fact, such methods feature definite advantages over the experimental. Probably the most significant is the multivariate response to the 20 questions problem: Why not answer all 20 questions at once using the suitable number of independent variables? And why not use multiple indicators that simultaneously implement alternative operational definitions for the central substantive variables? In short, why not use latent-variable structural equation models (sketched below)? It is not psychology's or sociology's fault that their phenomena are often so complex that these models must incorporate many more variables than a typical experiment. Better yet, within psychological science, the subdisciplines that are more correlational exhibit higher replication rates than those that are more experimental (Youyou, Yang, & Uzzi, 2023; see also, e.g., Soto, 2019). That advantage certainly deserves consideration.
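As a purely schematic sketch – in standard LISREL-style notation, not a model proposed in the target article – a latent-variable structural equation model combines a measurement component, which links multiple observed indicators to each latent variable, with a structural component that relates the latent variables to one another:

$$\eqalign{\mathbf{x} &= \Lambda_x\,\boldsymbol{\xi} + \boldsymbol{\delta}, \qquad \mathbf{y} = \Lambda_y\,\boldsymbol{\eta} + \boldsymbol{\varepsilon} \cr \boldsymbol{\eta} &= B\,\boldsymbol{\eta} + \Gamma\,\boldsymbol{\xi} + \boldsymbol{\zeta}}$$

Here $\boldsymbol{\xi}$ and $\boldsymbol{\eta}$ are the latent exogenous and endogenous variables, $\mathbf{x}$ and $\mathbf{y}$ their observed indicators, $\Lambda_x$ and $\Lambda_y$ the loading matrices that implement the alternative operational definitions, and $B$ and $\Gamma$ the structural coefficients, each row of which can address a separate substantive question.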

Acknowledgment

I thank Richard W. Robins for help in identifying two essential references.

Financial support

This research received no specific grant from any funding agency, commercial or not-for-profit sectors.

Competing interest

None.

References

Benjafield, J. G. (2020). Vocabulary sharing among subjects belonging to the hierarchy of sciences. Scientometrics, 125, 1965–1982. https://doi.org/10.1007/s11192-020-03671-7
Cole, S. (1983). The hierarchy of the sciences? American Journal of Sociology, 89, 111–139. https://doi.org/10.1086/227835
Comte, A. (1839–1842/1855). The positive philosophy of Auguste Comte (H. Martineau, Trans.). New York: Blanchard. (Original work published 1839–1842)
Cronbach, L. J. (1957). The two disciplines of scientific psychology. American Psychologist, 12, 671–684. https://doi.org/10.1037/h0043943
Fanelli, D. (2010). “Positive” results increase down the hierarchy of the sciences. PLoS ONE, 5(4), e10068. https://doi.org/10.1371/journal.pone.0010068
Fanelli, D., & Glänzel, W. (2013). Bibliometric evidence for a hierarchy of the sciences. PLoS ONE, 8(6), e66938. https://doi.org/10.1371/journal.pone.0066938
Sanbonmatsu, D. M., Cooley, E. H., & Butner, J. E. (2021). The impact of complexity on methods and findings in psychological science. Frontiers in Psychology, 11, 580111. https://doi.org/10.3389/fpsyg.2020.580111
Sanbonmatsu, D. M., & Johnston, W. A. (2019). Redefining science: The impact of complexity on theory development in social and behavioral research. Perspectives on Psychological Science, 14, 672–690. https://doi.org/10.1177/1745691619848688
Simonton, D. K. (2004). Psychology's status as a scientific discipline: Its empirical placement within an implicit hierarchy of the sciences. Review of General Psychology, 8, 59–67. https://doi.org/10.1037/1089-2680.8.1.59
Simonton, D. K. (2015). Psychology as a science within Comte's hypothesized hierarchy: Empirical investigations and conceptual implications. Review of General Psychology, 19, 334–344. https://doi.org/10.1037/gpr0000039
Smith, L. D., Best, L. A., Stubbs, D. A., Johnston, J., & Archibald, A. B. (2000). Scientific graphs and the hierarchy of the sciences. Social Studies of Science, 30, 73–94. https://doi.org/10.1177/030631200030001003
Soto, C. J. (2019). How replicable are links between personality traits and consequential life outcomes? The Life Outcomes of Personality Replication Project. Psychological Science, 30, 711–727. https://doi.org/10.1177/0956797619831612
Tracy, J. L., Robins, R. W., & Sherman, J. W. (2009). The practice of psychological science: Searching for Cronbach's two streams in social-personality psychology. Journal of Personality and Social Psychology, 96, 1206–1225. https://doi.org/10.1037/a0015173
Youyou, W., Yang, Y., & Uzzi, B. (2023). A discipline-wide investigation of the replicability of psychology papers over the past two decades. Proceedings of the National Academy of Sciences of the United States of America, 120(6), e2208863120. https://doi.org/10.1073/pnas.2208863120