The short- and long-term effects of early developmental experiences (e.g., harsh parenting, neglect) and environmental exposures (e.g., poverty, single parenthood) have long concerned scholars, clinicians, parents, and policymakers alike. Initially, the most pressing question was determining what the effects of early adversity and support were on child, adolescent, and even adult development. As work in this area evolved, the core questions shifted – in both basic science and applications to interventions – to understanding how such effects become instantiated (i.e., mechanisms of influence); why, from the perspective of human evolution, contextually induced development operates the way it does; and which core underlying dimensions of adversity influence development. Here we seek to address these questions within an integrated model of dimensions of environmental experience. We argue that how and why are not alternative explanations. Instead they are central to any complete explanatory framework and thus essential to understanding which dimensions of the environment matter.
A highly influential approach to documenting and explaining the consequences of early adversity has been cumulative risk (Evans et al., 2013; Felitti et al., 1998). The general hypothesis guiding cumulative-risk research is that the more environmental risks to which children are exposed, the poorer or more compromised their development will be, whether cognitive, linguistic, social, emotional, or even physical. The cumulative-risk approach assumes that discrete forms of adversity have additive effects on developmental outcomes, and that different kinds of adversity do not produce distinct changes in behavioral or neural development (e.g., Smith & Pollak, 2021). In this view, no single form of adversity is more important or influential than another in shaping human development. Although the cumulative-risk approach has advanced the field by establishing the profound power of early adversity to statistically predict the emergence of developmental problems, it must be appreciated that prediction does not equal explanation. The cumulative-risk approach was developed to explain the what of development, not how, why, or which.
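The additive logic at the heart of cumulative-risk research can be made concrete with a brief sketch (the risk indicators, variable names, and data below are hypothetical illustrations, not measures from the cited studies):

```python
# Toy illustration of the cumulative-risk assumption: discrete adversities
# are coded present/absent and simply summed, so no form of adversity
# carries more weight than any other, and their distinct qualities are
# discarded. All indicator names and data are hypothetical.

RISK_INDICATORS = ["poverty", "neglect", "harsh_parenting", "family_violence"]

def cumulative_risk_score(exposures):
    """Count how many discrete risks a child has been exposed to."""
    return sum(int(exposures.get(risk, False)) for risk in RISK_INDICATORS)

child = {"poverty": True, "neglect": False,
         "harsh_parenting": True, "family_violence": False}
print(cumulative_risk_score(child))  # → 2
```

Note that a child exposed mainly to threat-based adversity and a child exposed mainly to deprivation-based adversity can receive identical scores; dimensional models are motivated precisely by the information such a count discards.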
The central goal of this paper is to move beyond cumulative risk by addressing the how, why, and which of development in contexts of adversity. Toward this end, we focus our analysis on two recent dimensional models of adversity, each of which reconceptualizes the central question of interest in cumulative risk research: What are the effects of early adversity? Ellis et al. (2009) reconceptualized this question from an evolutionary why perspective based in life history theory. This approach led Ellis and colleagues to propose which dimensions of early adversity matter – harshness and unpredictability – for regulating variation in development of life history strategies. McLaughlin and Sheridan (McLaughlin, Sheridan, & Lambert, 2014; Sheridan & McLaughlin, 2014) reconceptualized this question through a mechanistic how lens, focusing on neurodevelopmental pathways linking adversity with developmental outcomes. This neurobiological approach led McLaughlin and Sheridan to highlight which dimensions of early adversity matter – threat and deprivation – in relation to neurodevelopmental mechanisms that link these dimensions with psychopathology and related emotional and cognitive outcomes. Although the harshness-unpredictability and the threat-deprivation models developed independently and focus on different dimensions of adversity, a major goal of this paper is to integrate these two frameworks.
We begin by reviewing the main theoretical assumptions on which the harshness-unpredictability and threat-deprivation models are based. This involves discussing why, based on a history of natural selection, development operates the way it does across a range of environmental contexts, and how the neural mechanisms that underlie plasticity and learning in response to environmental experiences influence brain and psychological/behavioral development in children. Then we present the basic tenets of the harshness-unpredictability and threat-deprivation models. With that foundation in place, we endeavor to integrate these frameworks.
To demonstrate the added value of jointly addressing the questions of why and how for advancing an understanding of which, we propose an integrated model of dimensions of environmental experience. That model is presented in five sections. The first (Core Integrative Concepts) discusses three guiding assumptions that underpin the integrated model: (a) the directedness of learning, (b) different levels of developmental adaptation to the environment, and (c) relations between adaptive and maladaptive processes in regulating developmental adaptations to adversity. The second and third sections (Integrative Discussion of Harshness and Threat; Integrative Discussion of Harshness and Deprivation) conceptualize threat and deprivation as separate sources of harshness. Harshness constitutes at least two distinct adaptive problems: morbidity–mortality from harm imposed by other agents (indicated by threat-based forms of harshness) and morbidity–mortality from insufficient environmental inputs (indicated by deprivation-based forms of harshness). Cues to each of these adaptive problems range from more proximal to the child (immediate experiences of threat and deprivation) to more distal to the child (ecological factors linked to threat and deprivation). Experiencing these cues does “double duty” in terms of calibrating development to both immediate rearing environments and broader ecological contexts, current and future. Although deprivation primarily constrains development, it may alter phenotypes in ways that enable individuals to make the best of bad circumstances. The fourth section (Integrative Discussion of Environmental Unpredictability) focuses on unpredictability in both proximal and distal cues to threat-based and deprivation-based forms of harshness. Discussion centers on potential mechanisms through which individuals detect and respond to environmental unpredictability. 
The final section (Summary of Integrated Model of Dimensions of Environmental Experience) attempts to pull these pieces together; it reviews how a mechanistic and neurobiological analysis of development informs understanding of dimensions of adversity in ways that refine and extend the harshness-unpredictability model, and how an evolutionary analysis of development informs understanding of adaptation to adversity in ways that refine and extend the threat-deprivation model.
Throughout these integrative sections, we outline actionable directions for research – aimed at understanding the pathways through which early-life adversity shapes development – that are needed to advance the proposed integrated model of dimensions of environmental experience. Our overarching goal is to show how this integrated model can be leveraged to advance an understanding of why, how, and which dimensions of adversity influence cognitive, emotional, and neurobiological development, as well as physical health. Before proceeding further, however, it is important to acknowledge that most of the research reviewed herein is not genetically informed. This means, of course, that apparent effects of early adversity on developmental outcomes could reflect gene–environment correlations and not environmental causation. Further, individuals may systematically differ in their susceptibility to environmental influences (Belsky & Pluess, 2009; Boyce & Ellis, 2005; Pluess & Belsky, 2013), and thus apparent effects of early adversity could apply to some children more than others. These limits to understanding should be kept in mind in the conceptual analysis to follow.
Why? An evolutionary-developmental approach to early adversity
Over the course of human evolutionary history, our ancestors encountered recurring environmental contexts that influenced their survival and reproductive success. These different contexts constitute adaptive problems – distinct selection pressures posed by the environment – that individuals had to solve to survive and pass on their genes to future generations, the latter being the ultimate goal of all forms of life. A core assumption of an evolutionary approach is that solutions to adaptive problems differ markedly across domains and environmental conditions (e.g., Buss, 1995). In children, the adaptive problems posed by food scarcity, emotional neglect, parental violence, and parental relationship dissolution – which in all probability have characterized human developmental experiences since time immemorial – require different kinds of solutions. As Symons (1992) pointed out, there is no such thing as a general solution (e.g., high stress reactivity) because there is no such thing as a general problem (e.g., early adversity). Evolved psychological, behavioral, and physiological processes should be sensitive to different experiences that, over our evolutionary history, recurrently indicated the presence of different adaptive problems. In sum, an evolutionary-developmental perspective strongly implies that natural selection shaped our developmental systems to detect and respond to particular forms of adversity. This prioritization of particular sources of environmental information that were historically linked to fitness provides the evolutionary foundation for dimensional models of early adversity.
Developmental adaptation to stress
A widespread assumption in developmental science is that children raised in supportive and well-resourced environments tend to develop normally and manifest so-called “optimal” trajectories and outcomes. Notions of “normal” and “optimal” in this context all too often imply that the natural course of human development, from youth to later in life, is to become secure, autonomous, self-regulating, prosocial, intimate in the context of pair bonds, hardworking, and sensitively responsive in parenting, as well as physically healthy, happy, and long-lived. By contrast, children exposed to early adversities are at risk for developmental dysfunction, dysregulation, and disorder – the opposite of “normal” and “optimal” – leading to problem behaviors that are destructive to themselves and others. In scholarly, policy, and journalistic writing, these assumptions are powerful and pervasive, even if usually implicit, and underlie prominent models of development that focus on dysregulation and pathology, such as cumulative risk.
An evolutionary-developmental analysis challenges the prevailing view of so-called dysfunctional, dysregulated, or maladaptive development, especially when it arises within contexts of early-life stress (Belsky, 2008; Belsky et al., 1991; Chisholm, 1999; Ellis & Del Giudice, 2014; Ellis et al., 2011; Ellis et al., 2012; Hinde & Stevenson-Hinde, 1990; Richters & Cicchetti, 1993). This analysis begins with the assumption that early-life stress has always been part of the human experience. Indeed, almost half of children in hunter–gatherer societies (the best model for human demographics before the agricultural revolution) die before reaching adulthood (e.g., Kaplan & Lancaster, 2003; Volk & Atkinson, 2013), making childhood – the time of the human life cycle when the force of selection is the strongest (Jones, 2009; Volk & Atkinson, 2008) – an intensive window for natural selection to operate on biobehavioral adaptations to stress. From an evolutionary–developmental perspective, therefore, early adversity should not so much impair biobehavioral systems as direct or regulate them toward patterns of functioning that, even if costly, are adaptive under stressful conditions (Belsky et al., 1991; Ellis & Del Giudice, 2014, 2019; Frankenhuis & Amir, 2021).
Life history strategies
Life history theory (Roff, 2002; Stearns, 1992) addresses how organisms allocate their limited time and energy to the various activities that comprise the life cycle, namely physical and cognitive growth, maintenance of bodily tissues, mating, and parenting. Since all these activities ultimately contribute to an individual’s (reproductive) fitness, devoting time and energy to one typically involves both benefits and costs, thereby engendering tradeoffs between competing domains. As applied to developmental science (e.g., Belsky et al., 1991; Del Giudice et al., 2015; Ellis et al., 2009), the central focus of life history theory is on delineating these resource-allocation tradeoffs; explaining how developmental experiences and environmental exposures shape these tradeoffs; and modeling the effects of these tradeoffs on variation in demographic life history traits (i.e., traits related directly to rates of reproduction, such as age at sexual maturation and offspring number) and their phenotypic mediators (such as speed of biological aging and quality of parenting). Life history traits and their mediators form a constellation of coadapted characteristics that, together, are referred to as life history strategies (Del Giudice et al., 2015; Ellis et al., 2009; Figueredo et al., 2006).
Over both evolutionary and developmental time, life history strategies coordinate morphology, physiology, and behavior in ways that historically enhanced expected fitness within the constraints of a given environment (Ellis et al., 2009), even if links to fitness no longer hold in contemporary contexts.
At the broadest level of analysis, life history-related traits covary along a dimension of slow versus fast. Although there is ongoing debate about the robustness of the slow-fast continuum across and within species (for a review, see Del Giudice, 2020), and about the best ways to characterize human life history variation (e.g., demographic traits vs. phenotypic mediators; see Copping et al., 2014; Figueredo et al., 2015), substantial empirical evidence supports a slow-fast continuum in humans. Specifically, some individuals adopt slower strategies characterized by later reproductive development (especially in girls) and delayed sexuality, preferences for stable pair bonds and high investment in parenting, an orientation toward future outcomes, low impulsivity, and allocation of resources toward enhancing long-term survival; others display faster strategies characterized by the opposite patterns (e.g., Belsky et al., 1991; Del Giudice et al., 2015; Ellis et al., 2009; Figueredo et al., 2006). Fast life history strategies are comparatively high risk, focusing on mating opportunities (including more risky and aggressive behavior), reproducing at younger ages, and producing a greater number of offspring with more variable developmental outcomes. Tradeoffs incurred by faster strategies include reduced health, vitality, longevity, and offspring quality. Importantly, there is no presumption that one strategy is inherently better – more adaptive – than the other; developmental context determines that.
Life history models provide a framework for explaining why, based on a history of natural selection, development operates the way it does across a range of environmental contexts. As discussed below (see The Harshness-Unpredictability Framework), life history strategies have been shaped by natural selection to be developmentally responsive to specific dimensions of adversity.
How? The centrality of mechanism
Adverse environmental experiences can have particularly strong and enduring consequences when they occur early in life. This is due at least in part to the heightened neural plasticity that characterizes brain development during childhood and adolescence (Kolb & Gibb, 2014). This plasticity reflects the capacity of the brain to be shaped by environmental experiences – a property of early brain development that confers many advantages. Experience-driven plasticity facilitates early learning, allowing neural circuits to be sculpted in ways that are adapted to the environment in which the child develops. At the same time, however, developmental plasticity can also carry costs, especially in early environments characterized by adversity. Indeed, substantial evidence suggests that early-life adversity leads to changes in neural structure and function that confer long-term costs in terms of health, well-being, and longevity. For example, extensive evidence indicates that the changes in cognitive, affective, and neural development that occur following experiences of adversity serve as developmental mechanisms contributing to elevated risk for psychopathology, poor school achievement, and physical health problems (Jenness et al., 2021; Koss & Gunnar, 2018; Luby et al., 2017; McLaughlin, Sheridan, Winter et al., 2014; Miller et al., 2011; Noble et al., 2015; Rosen et al., 2018; Shackman & Pollak, 2014).
It is critical, therefore, to delineate the mechanisms through which early experiences influence health and development – the how of development. Uncovering these mechanisms is central to identifying targets for early interventions aimed at preventing the wide-ranging negative consequences of early-life adversity (McLaughlin, DeCross, Jovanovic et al., 2019). Indeed, knowledge of these mechanisms has helped to generate novel interventions to reduce disparities in health and educational outcomes among children exposed to adversity (Romeo et al., 2021; Yousafzai et al., 2016).
Experience-driven plasticity mechanisms
The developmental processes that contribute to experience-driven plasticity in early life include both experience-expectant and experience-dependent mechanisms (Black et al., 1997; Greenough et al., 1987; Kolb & Gibb, 2014). Importantly, although the current discussion of experience-driven plasticity focuses specifically on mechanisms involving learning and brain plasticity, numerous other mechanistic pathways are also important (e.g., epigenetic, immune, endocrine, and those involving the microbiome).
Experience-expectant plasticity
Experience-expectant processes involve neural circuits that require specific types of input from the environment to develop. Experience-expectant plasticity is triggered by particular species-typical developmental experiences that the brain ‘expects’ to encounter, based on a species’ evolutionary history, during specific developmental windows referred to as sensitive or critical periods (Hensch, 2004; Werker & Hensch, 2015). These sensitive periods are characterized by specificity in their developmental timing, the types of environmental stimuli that trigger plasticity, and the long-term stability of the neural changes that occur during these periods, with more limited later plasticity once the sensitive window closes (Takesian & Hensch, 2013). The timing of these periods of heightened plasticity differs across brain regions and extends across both childhood and adolescence (Reh et al., 2020). Development of the visual system provides a simple demonstration of experience-expectant plasticity: Light input to the retina during a critical period in the first months of life is required for normal visual development; when this input is absent or atypical (e.g., occurs only in one eye), it produces lasting changes in vision and in the structure and function of brain circuits that support vision (Hensch, 2005; Hubel & Wiesel, 1970; Wiesel & Hubel, 1965). In some domains, the molecular “brakes” that dampen plasticity after a sensitive window closes can be lifted to allow further plasticity at later points in development (Bavelier et al., 2010; Frankenhuis & Nettle, 2020; Werker & Hensch, 2015).
Within the sensitive window, the heightened plasticity that occurs in response to experience results from re-wiring mechanisms that strengthen certain synaptic connections and from synaptic pruning that eliminates other connections (Takesian & Hensch, 2013). Critically, the timing and quality of the expected experience shape the degree of learning and plasticity that occur (Gabard-Durnam & McLaughlin, 2020; McLaughlin & Gabard-Durnam, 2021; Werker & Hensch, 2015).
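As a caricature of these ideas, the dependence of learning on the timing of input relative to a sensitive window can be sketched in a few lines (the window bounds, learning rates, and inputs are arbitrary illustrative values, not empirical estimates):

```python
# Toy sketch of experience-expectant plasticity: a circuit parameter is
# pulled toward the environmental input, but only strongly while the
# sensitive window is open. All numbers below are illustrative choices.

def plasticity(age, open_age=0.0, close_age=2.0, peak=0.9, residual=0.05):
    """Learning rate: high during the sensitive window, low after it closes."""
    return peak if open_age <= age <= close_age else residual

def develop(ages, inputs, weight=0.0):
    """Nudge the circuit toward each input, scaled by current plasticity."""
    for age, x in zip(ages, inputs):
        weight += plasticity(age) * (x - weight)
    return weight

# The same input (x = 1.0) entrenches the phenotype when delivered inside
# the window, but barely moves it when delivered only after the window closes.
early = develop(ages=[0.5, 1.0, 1.5], inputs=[1.0, 1.0, 1.0])
late = develop(ages=[3.0, 3.5, 4.0], inputs=[1.0, 1.0, 1.0])
print(round(early, 3), round(late, 3))  # → 0.999 0.143
```

Raising the `residual` rate after the window closes would mimic the lifting of molecular “brakes” on later plasticity described above.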
Experience-dependent plasticity
Experience-dependent plasticity involves changes in existing neural circuits that occur in response to specific learning experiences that vary across individuals. In contrast to experience-expectant plasticity, these forms of learning are not constrained to specific developmental periods; thus experience-dependent changes facilitate learning throughout life, although plasticity tends to be most pronounced in childhood and adolescence (Fu & Zuo, 2011). This enhanced plasticity during development allows neural structure and function to be influenced more readily by lived experiences that occur during these early phases of life (Kolb & Gibb, 2014), potentially shifting longer-term developmental trajectories. Experience-dependent learning involves the selective strengthening of particular synaptic connections in response to experience as well as the elimination of others that are under-utilized or inefficient, although these neural changes are less dramatic than in experience-expectant learning (Kolb & Gibb, 2014; Takesian & Hensch, 2013; Trachtenberg et al., 2002). These changes occur in the specific neural circuits that encode and process particular experiences; in other words, the type of experience determines the specific neural circuits involved in the experience-dependent change. This has been demonstrated in diverse domains ranging from music and sports training to language learning and meditation (Dayan & Cohen, 2011; Gotink et al., 2016; Hyde et al., 2009; Kraus & Chandrasekaran, 2010).
For example, variation in linguistic experience within the species-typical or expected range, as measured observationally in children’s natural environments, is associated with the structure and function of circuits specialized in language processing (Romeo, Leonard et al., 2018; Romeo, Segaran et al., 2018).
Across domains, the intensity and duration of environmental experiences influence the degree of neuroplasticity and learning that occurs (Dayan & Cohen, 2011; Gourevitch et al., 2014; Kolb & Gibb, 2014; Kraus & Chandrasekaran, 2010). Both experience-expectant and experience-dependent plasticity mechanisms are sensitive to particular types of experiences and to the intensity and duration of those experiences. The specific nature of these experiences determines the types of changes that occur, both in neural circuits and ultimately behavior. The precise mapping of changes in neural structure and function in response to particular environmental experiences affords a nuanced approach to characterizing the early environment and is critical to uncovering how early-life adversity shapes brain and behavioral development.
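A similarly stylized sketch can capture experience-dependent change, in which use selectively strengthens particular connections while chronic disuse leads to elimination (all connection names, rates, and thresholds are hypothetical):

```python
# Toy sketch of experience-dependent plasticity: connections that are used
# are selectively strengthened; under-utilized connections weaken and are
# eventually eliminated (pruned). Rates and thresholds are illustrative.

def train(weights, usage_log, gain=0.2, decay=0.4, prune_below=0.05):
    updated = {}
    for synapse, w in weights.items():
        uses = usage_log.count(synapse)
        for _ in range(uses):
            w += gain * (1.0 - w)   # each use strengthens the connection
        if uses == 0:
            w *= decay              # disuse weakens it
        if w >= prune_below:
            updated[synapse] = w    # connections below threshold are pruned
    return updated

circuit = {"music": 0.1, "language": 0.1, "unused": 0.1}
circuit = train(circuit, usage_log=["music", "music", "language"])
print(sorted(circuit))  # → ['language', 'music']  (the unused synapse is gone)
```

More frequent use (the "music" connection above) yields a stronger weight than less frequent use, mirroring the intensity and duration effects noted in the text.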
The harshness-unpredictability framework
Using an evolutionary why perspective – life history theory – Ellis et al. (2009) identified distinct environmental dimensions that account for much of the variation in patterns of development both across and within species. This analysis articulated how environmental harshness and unpredictability jointly shape life history strategies. Through a combination of evolutionary and developmental responses to environmental harshness and unpredictability, organisms make predictable resource-allocation tradeoffs that result in adaptive coordination between life history strategies and environmental conditions. This regulation of life history strategies is complex, however, because of various moderating factors. Indeed, the effects of environmental harshness and unpredictability depend on related factors such as the competitive ability or physical condition of individuals, the severity of intrasexual competition, the extent to which morbidity and mortality are sensitive to the resource-allocation decisions of parents and offspring, population densities and associated levels of resource scarcity, and the extent to which fluctuating environmental risks affect individuals versus populations over short versus long timescales (Ellis et al., 2009). Here we summarize the theory, focusing on the main effects of environmental harshness and unpredictability, and selectively review empirical findings, focusing on development of life history-related traits and behaviors.
Together with genetic factors, key dimensions of the environment that regulate the development of life history strategies include energy availability, extrinsic morbidity–mortality (harshness), and unpredictability (Ellis et al., 2009). A central assumption of the harshness-unpredictability model is that, over evolutionary time, humans experienced environments that varied in energy availability, harshness, and unpredictability within and across generations, and that this variation recurrently affected survival and reproduction. Natural selection thus favored conditional adaptations that enabled shifts in life history strategies (within evolved reaction norms) along the slow-fast continuum in response to these environmental factors, in interaction with other related factors as noted above (Ellis et al., 2009).
Energy availability (energetic deprivation)
Energetic resources – caloric intake, energy expenditures, and related health conditions – set the baseline for many developmental processes. Energetic deprivation (i.e., inadequate energetic resources) slows growth, delays sexual maturation, and suppresses functioning of the mature reproductive axis (Ellis, 2004; Ellison, 2003), generally resulting in slower life history strategies. However, when bioenergetic resources are adequate to support growth and development, cues to greater harshness and unpredictability generally promote faster strategies (Ellis et al., 2009). The effects of physical and psychosocial stressors, therefore, are hierarchically ordered. For example, pubertal timing is contingent first on health and nutrition (see especially Kyweluk et al., 2018) and second, when these are adequate, on socioemotional conditions (Coall & Chisholm, 2003; Ellis, 2004). Other traits related to faster life history strategies, however, are less contingent on energetic conditions (see below, Integrative Discussion of Harshness and Deprivation).
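This hierarchical ordering can be summarized as a simple decision sketch (the variable names, scales, and cutoffs are illustrative assumptions, not parameters from the cited work):

```python
# Toy decision sketch of the hierarchical model described above: pubertal
# timing responds first to energetic adequacy and only secondarily to
# psychosocial stress. Inputs are hypothetical 0-1 scales; the 0.5 cutoffs
# are arbitrary illustrative thresholds.

def predicted_pubertal_timing(energetic_resources, psychosocial_stress):
    if energetic_resources < 0.5:
        # Energetic deprivation dominates: maturation is delayed regardless
        # of socioemotional conditions.
        return "delayed"
    # Given adequate energetics, cues to harshness/unpredictability shift
    # timing earlier (consistent with a faster life history strategy).
    return "earlier" if psychosocial_stress > 0.5 else "typical"

print(predicted_pubertal_timing(0.3, 0.9))  # → delayed
print(predicted_pubertal_timing(0.8, 0.9))  # → earlier
print(predicted_pubertal_timing(0.8, 0.2))  # → typical
```

The point of the sketch is only the conditional structure: psychosocial stress is evaluated at all only once the energetic precondition is met.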
Environmental harshness
Extrinsic morbidity–mortality, or harshness, refers to external sources of disability and death that are relatively insensitive to the adaptive decisions and actions of the organism (i.e., external sources of morbidity–mortality that cannot generally be attenuated or prevented). Extrinsic morbidity–mortality may result from such environmental factors as warfare, neighborhood violence, family violence, infectious disease, and famine. As denoted by the phrase “relatively insensitive,” morbidity–mortality is almost never purely extrinsic (Abrams, 1993); the adaptive decisions and actions of the organism almost always have at least some impact on morbidity–mortality risks. The key feature of high extrinsic morbidity–mortality, therefore, is that allocations of time and effort toward reducing morbidity–mortality/increasing survival (e.g., long-term investments in growth and health) have low return on investment; that is, risks of disability and death “cannot be significantly reduced at a reasonable cost” (André & Rousset, 2020, p. 6). For example, high extrinsic morbidity–mortality means that increased parental care has rapidly diminishing returns on investment (resulting in tradeoffs that favor offspring quantity over quality; e.g., Belsky et al., 1991).
Over evolutionary time, extrinsic morbidity–mortality is an important selection pressure on the evolution of life history strategies (Ellis et al., 2009). Over developmental time, individuals cannot directly detect levels of extrinsic morbidity–mortality and, instead, may rely on observable cues to harsh environmental conditions (e.g., premature disability and death among other people in one’s local ecology). Given adequate energetic resources to support growth and development, exposures to extrinsic morbidity–mortality cues should cause individuals to develop faster life history strategies (Chisholm, 1999; Ellis et al., 2009). Faster strategies in this context – a context in which allocation of resources toward reducing morbidity–mortality and/or increasing survival has low returns on investment – function to preferentially allocate resources toward reproduction prior to disability or death. An underlying assumption of this hypothesis is that early adversity carries predictive information about the larger and/or future state of the environment (e.g., danger and consequent high mortality), and that development of a faster life history strategy in this context is a conditional adaptation (e.g., Belsky et al., 1991); it functions to match the individual to this expected harsh state, despite the costs (including potential mismatch if early environmental cues prove unreliable, as may be common in the modern world).
In research on industrialized populations, extrinsic morbidity–mortality is often operationalized in terms of socioeconomic adversity (e.g., Belsky et al., Reference Belsky, Schlomer and Ellis2012; Simpson et al., Reference Simpson, Griskevicius, Kuo, Sung and Collins2012; Li et al., Reference Li, Liu, Hartman and Belsky2018) because of the systematic (though not necessarily deterministic) relationship between poverty and higher levels of virtually all forms of morbidity and mortality (e.g., Adler et al., Reference Adler, Boyce, Chesney, Folkman and Syme1993; Chen et al., Reference Chen, Matthews and Boyce2002). However, socioeconomic adversity is a rather broad measure of extrinsic morbidity–mortality risk, as it encompasses – and confounds – multiple environmental factors. Thus, some studies have used more direct indicators of risk of disability or death by assessing local mortality rates, neighborhood dangers, crime victimization, developmental exposures to death and injury, sibling death, family violence, and/or perceived danger in the environment (e.g., Chang et al., Reference Chang, Lu, Lansford, Skinner, Bornstein, Steinberg, Dodge, Chen, Tian, Bacchini, Deater-Deckard, Pastorelli, Alampay, Sorbring, Al-Hassan, Oburu, Malone, Di Giunta, Tirado and Tapanya2019; Chang & Lu, Reference Chang and Lu2018; Copping & Campbell, Reference Copping and Campbell2015; Gettler et al., Reference Gettler, McDade, Bragg, Feranil and Kuzawa2015; Gibbons et al., Reference Gibbons, Roberts, Gerrard, Li, Beach, Simons, Weng and Philibert2012; Johns, Reference Johns2011; McCullough et al., Reference McCullough, Pedersen, Schroder, Tabak and Carver2013; Nettle, Reference Nettle2010; Wilson & Daly, Reference Wilson and Daly1997). 
This body of research has generally supported the prediction that cues indicating heightened extrinsic morbidity–mortality, whether assessed indirectly through socioeconomic status or more directly through morbidity–mortality risk indicators, predict the development of faster life history strategies.
Environmental unpredictability
In addition to extrinsic morbidity–mortality, environmental unpredictability regulates the evolution and development of life history strategies. Environmental unpredictability is defined as stochastic variation in extrinsic morbidity–mortality (Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009). Harsh environments can be predictable (e.g., short life expectancy with low stochastic variation around the mean) or unpredictable (e.g., high random variation around the mean). Environmental unpredictability corresponds to low autocorrelation (the correlation of an environmental condition with itself over time or space) and/or low cue reliability (the extent to which current experiences or events predict other states of the world, current or future) in environmental conditions related to morbidity and mortality risks (Young et al., Reference Young, Frankenhuis and Ellis2020), as driven by relatively high spatial or temporal variation in harshness. In environments that fluctuate unpredictably in this way, long-term investments, and particularly investment in the development of a slow life history strategy, may not optimize fitness. This is because unpredictable environments, over both evolutionary and developmental timeframes, bias developmental systems toward discounting future costs and benefits relative to current ones (Hill et al., Reference Hill, Ross and Low1997; Wilson & Daly, Reference Wilson and Daly1997). Accordingly, individuals should respond to signals of environmental unpredictability by adopting faster life history strategies (Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009).
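The distinction drawn here – equally harsh environments that differ only in autocorrelation – can be made concrete with a small simulation. The sketch below is purely illustrative: the AR(1) process and all parameter values are our assumptions, not drawn from Ellis et al. (2009) or Young et al. (2020).

```python
import math
import random

def simulate_harshness(phi, n=10_000, mean=5.0, sd=1.0, seed=0):
    """Simulate a harshness time series as an AR(1) process.

    phi is the lag-1 autocorrelation: 0 yields an unpredictable
    environment, values near 1 a predictable one. The innovation
    variance is scaled so every series shares the same stationary
    mean and standard deviation -- only predictability differs.
    """
    rng = random.Random(seed)
    innov_sd = sd * math.sqrt(1 - phi ** 2)
    x = [mean]
    for _ in range(n - 1):
        x.append(mean + phi * (x[-1] - mean) + rng.gauss(0, innov_sd))
    return x

def lag1_autocorrelation(x):
    """Correlation of the series with itself shifted one step in time."""
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

predictable = simulate_harshness(phi=0.9)    # harsh but stable
unpredictable = simulate_harshness(phi=0.0)  # equally harsh, but stochastic

# Same average harshness, very different forecastability:
print(round(lag1_autocorrelation(predictable), 2))   # high, near 0.9
print(round(lag1_autocorrelation(unpredictable), 2)) # near 0
```

The point of the sketch is that mean harshness and its autocorrelation are separable statistical properties of an environment, which is what allows harshness and unpredictability to act as distinct dimensions.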
Over evolutionary time, the statistical structure of extrinsic morbidity–mortality (e.g., variance, autocorrelation) defines environmental unpredictability as a selection pressure (Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009). Over developmental time, individuals cannot directly detect the statistical structure of extrinsic morbidity–mortality and, instead, may estimate environmental unpredictability based on integration of ongoing experiences over time (Young et al., Reference Young, Frankenhuis and Ellis2020) and/or rely on ancestral cues to environmental unpredictability (i.e., cues that recurrently indicated unpredictability in ancestral environments) (see below, Integrative Discussion of Environmental Unpredictability). Such cues have typically been operationalized as changes in familial and ecological conditions (e.g., parental transitions, frequent residential changes, fluctuating family economic conditions, erratic neighborhood conditions; see Belsky et al., Reference Belsky, Schlomer and Ellis2012; Li et al., Reference Li, Liu, Hartman and Belsky2018; Simpson et al., Reference Simpson, Griskevicius, Kuo, Sung and Collins2012). This has meant that theory and research guided by the harshness-unpredictability framework have generally conceptualized – and measured – unpredictability at the ecological level. This contrasts with a larger body of research in developmental science focusing on more micro-level (day-to-day, moment-to-moment) unpredictability in family routines and parent-child interactions (e.g., Cohodes et al., Reference Cohodes, Kitt, Baskin-Sommers and Gee2021; Evans et al., Reference Evans, Gonnella, Marcynyszyn, Gentile and Salpekar2005; Noroña-Zhou et al., Reference Noroña‐Zhou, Morgan, Glynn, Sandman, Baram, Stern and Davis2020). We attempt to bring these different levels of analysis together in the later integrative section on unpredictability.
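One way to picture "estimating unpredictability based on integration of ongoing experiences over time" is an exponentially weighted updating rule: track a running expectation of cue levels and treat the running magnitude of prediction errors as the unpredictability estimate. This specific rule is our illustrative assumption, not the model proposed by Young et al. (2020).

```python
import random

class UnpredictabilityEstimator:
    """Running estimate of environmental unpredictability.

    Maintains exponentially weighted estimates of the mean cue level
    and of the variance around that mean; the variance estimate serves
    as an index of unpredictability. The learning rate alpha controls
    how heavily recent experience is weighted over past experience.
    """

    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.mean = 0.0
        self.var = 0.0

    def update(self, cue):
        # Prediction error: how surprising was this experience?
        error = cue - self.mean
        self.mean += self.alpha * error
        # Exponentially weighted average of squared surprise.
        self.var += self.alpha * (error ** 2 - self.var)
        return self.var

est_stable = UnpredictabilityEstimator()
est_volatile = UnpredictabilityEstimator()

rng = random.Random(1)
for _ in range(2000):
    est_stable.update(5.0 + rng.gauss(0, 0.2))    # stable harsh environment
    est_volatile.update(5.0 + rng.gauss(0, 2.0))  # fluctuating harsh environment

# The fluctuating environment yields a much higher unpredictability estimate.
print(est_stable.var < est_volatile.var)  # True
```

Note that both simulated environments have the same mean harshness; only the estimator tracking the volatile cue stream converges on a high unpredictability value, mirroring the conceptual separation of the two dimensions.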
In middle-class, Western populations in which people generally do not face significant food insecurity or life-threatening violence (and thus may have limited exposure to ancestral cues in these domains), the most powerful cue to environmental unpredictability may be parental transitions (Hartman et al., Reference Hartman, Sung, Simpson, Schlomer and Belsky2018): changes in parental caregivers resulting from either the onset of a new or the termination of an established marriage or cohabiting relationship. Parental transitions result in changes in morbidity and mortality risks to children (e.g., Daly & Wilson, Reference Daly and Wilson1998) as well as fluctuations in household rules/routines, relationships, and economic conditions. Other commonly used measures of unpredictability include residential changes, parental job changes, family member deaths, homelessness, and various indices of chronic instability or chaos (e.g., Brumbach et al., Reference Brumbach, Figueredo and Ellis2009; Evans et al., Reference Evans, Gonnella, Marcynyszyn, Gentile and Salpekar2005; Coley et al., Reference Coley, Lynch and Kull2015; Richardson et al., Reference Richardson, Sanning, Lai, Copping, Hardesty and Kruger2017; Sturge-Apple et al., Reference Sturge-Apple, Davies, Cicchetti, Hentges and Coe2017; Szepsenwol et al., Reference Szepsenwol, Simpson, Griskevicius and Raby2015).
Notably, across these different forms of measurement, heightened unpredictability has been linked to presumed indicators of a fast life history strategy, including greater number of sexual partners and higher levels of aggressive and delinquent behavior (Belsky et al., Reference Belsky, Schlomer and Ellis2012; Simpson et al., Reference Simpson, Griskevicius, Kuo, Sung and Collins2012), as well as an unrestricted sociosexual orientation, reduced investment in parenting, increased intimate partner violence, greater risk taking, reduced effortful control, and temporal discounting (e.g., Deater-Deckard et al., Reference Deater‐Deckard, Godwin, Lansford, Tirado, Yotanyamaneewong, Alampay, Al‐Hassan, Bacchini, Bornstein, Chang, Di Giunta, Dodge, Oburu, Pastorelli, Skinner, Sorbring, Steinberg and Tapanya2019; Griskevicius et al., Reference Griskevicius, Ackerman, Cantú, Delton, Robertson, Simpson, Thompson and Tybur2013; Sturge-Apple et al., Reference Sturge-Apple, Davies, Cicchetti, Hentges and Coe2017; Szepsenwol et al., Reference Szepsenwol, Simpson, Griskevicius and Raby2015, Reference Szepsenwol, Griskevicius, Simpson, Young, Fleck and Jones2017, Reference Szepsenwol, Zamir and Simpson2019). Nevertheless, best practices for measuring unpredictability, and harshness, remain to be determined; some measures, such as family member deaths, are conceptualized as an indicator of harshness by some researchers (e.g., Gettler et al., Reference Gettler, McDade, Bragg, Feranil and Kuzawa2015) and of unpredictability by others (e.g., Richardson et al., Reference Richardson, Sanning, Lai, Copping, Hardesty and Kruger2017).
Conclusion
Life history theory provides an informed basis for making predictions about how early adversity shapes the development of multiple psychological and behavioral traits, as well as the associations of these traits with patterns of growth, sexual maturation, metabolism, and other life history-related systems. The life history approach has guided research on the links between early familial and ecological conditions and a range of life history-related traits and behaviors, such as impulsivity and risk taking, pubertal maturation, biological aging, sexual behavior, reproductive timing, parenting, and health (e.g., Belsky, Reference Belsky2019; Brumbach et al., Reference Brumbach, Figueredo and Ellis2009; Copping & Campbell, Reference Copping and Campbell2015; Colich et al., Reference Colich, Rosen, Williams and McLaughlin2020; James et al., Reference James, Ellis, Schlomer and Garber2012; Mell et al., Reference Mell, Safra, Algan, Baumard and Chevallier2018; Sheppard et al., Reference Sheppard, Pearce and Sear2016; Sung et al., Reference Sung, Simpson, Griskevicius, Kuo, Schlomer and Belsky2016; Szepsenwol et al., Reference Szepsenwol, Simpson, Griskevicius and Raby2015). This work has been instrumental in advancing our understanding of the key distinction between childhood experiences of environmental harshness and unpredictability (Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009), with each of these dimensions uniquely – and often distinctly – predicting the development of life history traits and their behavioral and psychological mediators (for a meta-analysis, see Wu et al., Reference Wu, Guo, Gao and Kou2020).
The threat-deprivation framework
The threat-deprivation framework concerns the how of development; it derives from a mechanistic and neurobiological analysis of development and experience-driven plasticity (McLaughlin, Sheridan, & Lambert, Reference McLaughlin, Sheridan and Lambert2014; Sheridan & McLaughlin, Reference Sheridan and McLaughlin2014). The model is based on a simple principle – that brain plasticity is a primary mechanism through which environmental experiences shape learning and development. Because neural plasticity mechanisms are responsive to specific types of environmental experiences (Kolb & Gibb, Reference Kolb and Gibb2014; Takesian & Hensch, Reference Takesian and Hensch2013; Trachtenberg et al., Reference Trachtenberg, Chen, Knott, Feng, Sanes, Welker and Svoboda2002), the neurodevelopmental processes influenced by adverse early experiences are unlikely to be identical across all forms of adverse environments. Instead, this dimensional framework attempts to distill complex adverse experiences into core underlying dimensions that cut across multiple forms of adversity in order to identify the active ingredients in adverse environmental experiences that are likely to shape learning and patterns of brain development in similar ways. The central tenet of this model is that different dimensions of adversity will have distinct influences on neural development. This claim rests on existing knowledge about experience-driven plasticity, which indicates clearly that different types of environmental experiences produce distinct changes in neural structure and function that vary based on the timing, nature, intensity, and duration of the environmental experience (Kolb & Gibb, Reference Kolb and Gibb2014; McLaughlin & Gabard-Durnam, Reference McLaughlin and Gabard-Durnam2021; Nelson & Gabard-Durnam, Reference Nelson and Gabard-Durnam2020).
The initial model distinguished two central dimensions: threat and deprivation (McLaughlin & Sheridan, Reference McLaughlin and Sheridan2016; McLaughlin, Sheridan, & Lambert, Reference McLaughlin, Sheridan and Lambert2014; Sheridan & McLaughlin, Reference Sheridan and McLaughlin2014). Threat encompasses learning experiences involving actual harm or threat of harm to the child’s survival, including direct victimization experiences (e.g., physical abuse), as well as those witnessed by the child (e.g., violence between caregivers). Deprivation, in contrast, involves limited or reduced social and cognitive inputs from the environment during development, resulting in reduced opportunities for learning. Although experiences of threat and deprivation often co-occur (Green et al., Reference Green, McLaughlin, Berglund, Gruber, Sampson, Zaslavsky and Kessler2010; McLaughlin et al., Reference McLaughlin, Green, Gruber, Sampson, Zaslavsky and Kessler2012), their developmental consequences – and the neurobiological processes giving rise to them – are considered at least partially distinct. The dimensions of threat and deprivation underlie numerous forms of early experience. For example, threat of harm to the child occurs in physical and sexual abuse, in witnessing domestic violence, and in other forms of violence (e.g., in the school or community). Thus, even if physical abuse typically involves a higher level of danger than witnessing violence in one’s neighborhood, both experiences share the core feature of the potential for harm to the physical integrity of the child. In contrast, deprivation in social and cognitive stimulation is a central feature of neglect, institutional rearing, and other forms of parental absence, including deprivation that arises from low levels of parental involvement or interaction in low resource environments with few social supports, such as poverty in the United States.
Threat
The ability to identify and mobilize defensive responses to environmental threats is essential for survival. Specific neural circuits, identified in animal models (LeDoux, Reference LeDoux2003, Reference LeDoux2012) and conserved across species (Milad & Quirk, Reference Milad and Quirk2012; Phelps & LeDoux, Reference Phelps and LeDoux2005), respond to potential threats, orchestrate defensive responses, and support aversive learning – encompassing amygdala, hippocampus, and dorsal and ventral medial prefrontal cortex. Predictions about the dimension of threat derive from this substantial neuroscience literature. Specifically, the model predicts that threatening experiences during childhood alter the development of neural networks that underlie salience detection and aversive learning in ways that facilitate rapid identification of potential danger in the environment and mobilize defensive responses that promote safety (McLaughlin, Sheridan, & Lambert, Reference McLaughlin, Sheridan and Lambert2014; Sheridan & McLaughlin, Reference Sheridan and McLaughlin2014; McLaughlin & Lambert, Reference McLaughlin and Lambert2017). Because these responses to threat are critical to survival throughout development, it is unlikely that natural selection designed the brain to “expect” to encounter threats only during specific sensitive periods. Instead, these developmental changes in threat-related processing most likely reflect experience-dependent plasticity and learning (McLaughlin & Gabard-Durnam, Reference McLaughlin and Gabard-Durnam2021). 
Such changes are presumably adaptive under threatening conditions, even if there are long-term costs associated with these changes, such as increased vulnerability to internalizing and externalizing psychopathology (Briggs-Gowan et al., Reference Briggs-Gowan, Pollak, Grasso, Voss, Mian, Zobel, McCarthy, Wakschlag and Pine2015; Kim & Cicchetti, Reference Kim and Cicchetti2010; McLaughlin et al., Reference McLaughlin, Sheridan, Gold, Lambert, Heleniak, Duys, Shechner, Wojcieszak and Pine2016).
Strong evidence supports these predictions. A systematic review demonstrates that early-life experiences of threat are associated with changes in the structure and function of brain regions involved in emotional learning, including reduced amygdala and hippocampal volume, and elevated amygdala responses to threat cues (McLaughlin, Weissman, & Bitran, Reference McLaughlin, Weissman and Bitran2019). These neurodevelopmental patterns are consistent with substantial evidence that children raised in threatening early environments exhibit cognitive biases that facilitate the rapid identification of potential threats in the environment (Dodge et al., Reference Dodge, Petit, Bates and Valente1995; Pollak et al., Reference Pollak, Cicchetti, Hornung and Reed2000; Pollak et al., Reference Pollak, Messner, Kistler and Cohn2009; Pollak & Sinha, Reference Pollak and Sinha2002; Pollak & Tolley-Schell, Reference Pollak and Tolley-Schell2003; Pollak et al., Reference Pollak, Vardi, Putzer Bechner and Curtin2005). Elevated emotional responses to stress and heightened responsivity to cues signaling the presence of threat in the environment are well-documented among children exposed to violence (Glaser et al., Reference Glaser, van Os, Portegijs and Myin-Germeys2006; Heleniak et al., Reference Heleniak, Jenness, Van der Stoep, McCauley and McLaughlin2016; Hennessy et al., Reference Hennessy, Rabideau, Cicchetti and Cummings1994; Kim & Cicchetti, Reference Kim and Cicchetti2010; Weissman et al., Reference Weissman, Bitran, Miller, Schaefer, Sheridan and McLaughlin2019). More recent evidence has documented altered patterns of aversive learning in children exposed to threat (DeCross et al., Reference DeCross, Sambrook, Sheridan, Tottenham and McLaughlin2021; Machlin et al., Reference Machlin, Miller, Snyder, McLaughlin and Sheridan2019; McLaughlin et al., Reference McLaughlin, Sheridan, Gold, Lambert, Heleniak, Duys, Shechner, Wojcieszak and Pine2016).
Critically, these patterns have been observed even in studies that control for co-occurring deprivation experiences.
Deprivation
Children exposed to deprived early environments – including those characterized by neglect, the absence of a primary caregiver (e.g., institutional rearing), chronic material deprivation, and low levels of cognitive stimulation, exposure to complex language, parental scaffolding of child learning, and other learning opportunities – do not tend to exhibit the alterations in emotional processing and associated neural circuits that are associated with threat (see McLaughlin, Weissman, & Bitran, Reference McLaughlin, Weissman and Bitran2019 for a review). Instead, children exposed to deprivation experience persistent difficulties in a range of cognitive domains, such as language ability and executive functioning (Lambert et al., Reference Lambert, King, Monahan and McLaughlin2017; Miller et al., Reference Miller, Sheridan, Hanson, McLaughlin, Bates, Lansford, Pettit and Dodge2018; Rosen et al., Reference Rosen, Hagen, Lurie, Miles, Sheridan, Meltzoff and McLaughlin2020; Rosen et al., Reference Rosen, Sheridan, Sambrook, Meltzoff and McLaughlin2018; Sheridan et al., Reference Sheridan, Peverill and McLaughlin2017).
The threat-deprivation model links these patterns to a specific aspect of deprived early environments – the lack of consistent interactions with caregivers – which is presumed to influence development through a different set of mechanisms than do environments characterized by threat (McLaughlin, Reference McLaughlin2016; McLaughlin & Sheridan, Reference McLaughlin and Sheridan2016; McLaughlin, Sheridan, & Lambert, Reference McLaughlin, Sheridan and Lambert2014; McLaughlin et al., Reference McLaughlin, Sheridan and Nelson2017; Sheridan & McLaughlin, Reference Sheridan and McLaughlin2014). Early in life, most learning occurs in the context of caregiver interactions. The sensory, motoric, linguistic, and social experiences provided by caregivers substantially define the complexity of the developmental environment and the degree of cognitive stimulation afforded the child. The absence or unavailability of a caregiver results in gross reductions in social and cognitive stimulation and opportunities for learning. Critically, behavioral and neural functioning associated with deprivation is mediated by the degree of cognitive and social stimulation in the early home environment (Miller et al., Reference Miller, Sheridan, Hanson, McLaughlin, Bates, Lansford, Pettit and Dodge2018; Rosen et al., Reference Rosen, Hagen, Lurie, Miles, Sheridan, Meltzoff and McLaughlin2020; Rosen et al., Reference Rosen, Sheridan, Sambrook, Meltzoff and McLaughlin2018) – a central prediction of the model.
Reductions in expected input from the environment operate in at least two ways to shape future cognition. First, this lack of input constrains early learning and thus alters the foundation on which more complex forms of cognition are scaffolded. Second, deprivation may influence brain development through experience-driven plasticity mechanisms. The selective elimination of synaptic connections that are utilized infrequently is a central force in the remodeling of the brain across development in response to experience (Petanjek et al., Reference Petanjek, Judas, Simic, Rasin, Uylings, Rakic and Kostovic2011) and is a key neural mechanism contributing to experience-driven plasticity (Kolb & Gibb, Reference Kolb and Gibb2014; Takesian & Hensch, Reference Takesian and Hensch2013; Trachtenberg et al., Reference Trachtenberg, Chen, Knott, Feng, Sanes, Welker and Svoboda2002). Environments characterized by limited or reduced social and cognitive stimulation and limited caregiver availability constrain opportunities for learning, which can result in accelerated and extreme synapse elimination. This has been repeatedly demonstrated in animals deprived of sensory and social input during development (Globus et al., Reference Globus, Rosenzweig, Bennett and Diamond1973; Greenough & Volkmar, Reference Greenough and Volkmar1973; OʼKusky, Reference OʼKusky1985; Turner & Greenough, Reference Turner and Greenough1985).
The threat-deprivation model predicts that exposure to social and cognitive deprivation might produce a similar pattern of accelerated synaptic pruning in humans in cortical regions that process the specific inputs that are absent or limited in their intensity or duration, leading to reductions in the thickness and volume of these regions and reduced performance on complex social and cognitive tasks that depend on these brain areas (McLaughlin et al., Reference McLaughlin, Sheridan and Nelson2017; Sheridan & McLaughlin, Reference Sheridan and McLaughlin2016). Existing evidence in line with these hypotheses can be found in research with children raised in orphanages characterized by extreme deprivation as well as in children with less severe forms of deprivation, such as neglect or material deprivation associated with poverty. These are contextual conditions characterized by reduced exposure to complex language, supervision by caregivers, and environmental stimulation (Romeo, Leonard et al., Reference Romeo, Leonard, Robinson, West, Mackey, Rowe and Gabrieli2018; Rosen et al., Reference Rosen, Hagen, Lurie, Miles, Sheridan, Meltzoff and McLaughlin2020; Rowe, Reference Rowe2008). 
In such deprived environments, children display difficulties in language and executive functioning (Bos et al., Reference Bos, Fox, Zeanah and Nelson2009; Lawson et al., Reference Lawson, Hook and Farah2018; Noble et al., Reference Noble, McCandliss and Farah2007; Pollak et al., Reference Pollak, Nelson, Schlaak, Roeber, Wewerka, Wiik, Frenn, Loman and Gunnar2010; Spratt et al., Reference Spratt, Friedenberg, LaRosa, Bellis, Macias, Summer, Hulsey, Runyan and Brady2012; Tibu et al., Reference Tibu, Sheridan, McLaughlin, Fox, Zeanah and Nelson2016; Windsor et al., Reference Windsor, Benigno, Wing, Carroll, Koga, Nelson, Fox and Zeanah2011), reductions in cortical gray matter volume (Hodel et al., Reference Hodel, Hunt, Cowell, Van Den Heuvel, Gunnar and Thomas2015; Noble et al., Reference Noble, Houston, Brito, Bartsch, Kan, Kuperman, Akshoomoff, Amaral, Bloss, Libiger, Schork, Murray, Casey, Chang, Ernst, Frazier, Gruen, Kennedy, Van Zijl, Mostofsky and Sowell2015; Simpson et al., Reference Simpson, Griskevicius, Kuo, Sung and Collins2012), thinner cortex throughout the brain and in the fronto-parietal network (McLaughlin, Sheridan, Winter et al., Reference McLaughlin, Sheridan, Winter, Fox, Zeanah and Nelson2014; McLaughlin, Weissman, & Bitran, Reference McLaughlin, Weissman and Bitran2019; Rosen et al., Reference Rosen, Sheridan, Sambrook, Meltzoff and McLaughlin2018; Sheridan et al., Reference Sheridan, Peverill and McLaughlin2017), and atypical patterns of fronto-parietal recruitment during executive functioning tasks (Rosen et al., Reference Rosen, Sheridan, Sambrook, Meltzoff and McLaughlin2018; Sheridan et al., Reference Sheridan, Peverill and McLaughlin2017). Notably, the precise cellular mechanisms that lead to reductions in cortical gray matter following deprivation are currently unknown.
Conclusion
Experience-driven plasticity mechanisms underlie learning and changes in brain circuits in response to particular types of environmental experiences. Because plasticity is heightened during childhood and adolescence, environmental experiences occurring during these windows of development are most likely to produce lasting changes in the brain and in behavior. These plasticity mechanisms provide a foundation for understanding the pathways through which experiences of threat and deprivation influence development in ways that are at least partially distinct. More broadly, these mechanisms play a key role in facilitating adaptation to the environment in which children are developing, while also contributing to heightened risk for psychopathology and other problematic outcomes as children mature.
Core-integrative concepts
As reviewed in the preceding sections, the harshness-unpredictability framework fundamentally concerns the why of development, focusing on what developmental responses to adversity were designed by natural selection to do, whereas the threat-deprivation framework fundamentally concerns the how of development, focusing on neurodevelopmental processes that are shaped by adversity in the here and now. Although both frameworks seek to explain the consequences of experiencing different forms of adversity, their divergent grounding in why versus how has led to different research questions, hypotheses, and programs of research, which have largely proceeded independently and have not previously been integrated.
Here we attempt to break down these silos. We see the different research programs guided by each perspective as complementary pieces of a developmental puzzle that, if brought together, could meaningfully advance scientific understanding of the consequences of childhood adversity. The first step toward achieving this integration was to clearly articulate the nature of, and thereby essential differences between, the two frameworks (above). We now turn to discussing how these differences can be bridged to move toward an integrated model of dimensions of environmental experience. We begin by discussing three organizing principles that underpin this integration: (a) the directedness of learning, (b) different levels of developmental adaptation to the environment, and (c) relations between adaptive and maladaptive developmental processes in experience-driven plasticity. Then, through a series of integrative discussions of different dimensions of adversity, we build our integrated model (Figure 1). We conclude with a summary of the overall model and its implications.
The directedness of learning
An evolutionary-developmental analysis of dimensions of adversity stipulates that natural selection shaped the brain to treat certain environmental inputs (referred to here as ancestral cues) as privileged sources of information. From this perspective, experience-driven mechanisms of plasticity evolved to promote substantial learning and development in response to certain early experiences and environmental exposures, but not others. This basic assumption of an evolutionary-developmental approach involves three suppositions: (a) the properties of the external environment in which evolutionary adaptation occurred had a defined statistical structure (i.e., many events/experiences were probabilistically linked); (b) this structure included reliable associations between ancestral cues (e.g., harsh parenting) and current or future states of the world that influenced fitness (e.g., warfare, food shortages); and (c) natural selection embedded this structure in our neurobiology (Tooby & Cosmides, Reference Tooby and Cosmides1990). Ancestral cues thus provide information about recurrent sets of conditions and relationships that were present over evolutionary time (what Tooby & Cosmides, Reference Tooby and Cosmides1990, called the “hidden structure” of the environment), but that may or may not be consciously perceived or directly observed through one’s lived experiences. This hidden structure enables evolved mechanisms to operate adaptively based on limited information (i.e., without extensive learning trials, without modeling all of the necessary features of the environment), even though cues are probabilistic and vary in their reliability (see Frankenhuis et al., Reference Frankenhuis, Nettle and Dall2019). From this perspective: “What evolves is the directedness of learning” (Wilson, Reference Wilson1975, p. 156) – the relative ease or efficiency with which some aspects of the developmental environment, but not others, are detected, encoded, and ultimately embedded as biological changes in emerging neural circuits that guide experience-driven plasticity. That is why – ultimately – certain dimensions of the environment matter to the developing organism.
Levels of developmental adaptation to the environment
Taken together, the dimensional models of adversity that are the focus of this paper highlight the multi-level character of the environment to which children developmentally adapt. With its focus on the how of development, the threat-deprivation model centers on the developing child’s immediate experiences of adversity and how such experiences shape brain plasticity and, through it, learning and development. The focus is on how children use information in the proximal environment in which they are developing (e.g., experiences of threat from a caregiver) as a basis for concurrently adapting to that environment (e.g., early development of fear learning) (see Table 1, Level 1). A key assumption is that the nature, intensity, and duration of developmental experiences determine the degree of neuroplasticity and learning that occur. There is no implication that the developing child inferentially uses cues in their immediate environment as a basis for adapting to some broader ecological context, current or future, that they may or may not ever experience. The focus instead is on neural and behavioral adaptation to, and within, the child’s current context.
Note. LG, licking and grooming.
By contrast, with its focus on the why of development, the harshness-unpredictability model extends developmental adaptation to broader ecological contexts. The evolutionary-developmental model assumes that the environment has a “hidden” statistical structure characterized by recurrent sets of conditions and relationships that were present over evolutionary time (see above, The Directedness of Learning). Based on this implicit structure, the child uses early developmental experiences to adapt to multiple levels of the environment, from immediate contexts to broader ecological contexts that are more spatially or temporally distal (see Table 1, Level 2 and Level 3). Developmentally adapting to more distal contexts involves using immediate environmental cues inferentially (e.g., harsh parenting as a probabilistic cue to higher extrinsic morbidity–mortality).
In sum, the threat-deprivation and harshness-unpredictability frameworks have largely focused on different levels – and especially different timescales – of developmental adaptation to the environment, making the insights afforded by each perspective quite different. Critically, however, these are not alternative frames of reference, but complementary ones. Indeed, what each “sees” the other “misses.” An integrated dimensional model of environmental experience therefore needs to systematically address adaptation to the environment across all of the levels shown in Table 1. We discuss this issue further, and attempt to bridge this gap, below (see Integrative Discussion of Harshness and Threat).
Adaptive versus maladaptive processes
Related to the focus on different levels of developmental adaptation to the environment, the threat-deprivation and harshness-unpredictability models conceptualize adaptive versus maladaptive processes differently in relation to adversity-mediated phenotypic variation. Here we discuss these (partially) divergent perspectives in relation to both experience-expectant and experience-dependent developmental processes.
Experience-expectant processes influence development of both adaptive and maladaptive forms of adversity-mediated phenotypic variation. In considering this issue, it is useful to distinguish between developmental pathways underlying species-typicality versus adaptive individual differences (Figure 2). Species-typicality refers to developmental mechanisms that have been shaped by natural selection to produce a particular phenotype (e.g., vision) under expectable environmental conditions. In these cases, deviation from that phenotype (e.g., poor vision, or even blindness) resulting from deviation in expectable environmental conditions (e.g., reduced or absent light input) is maladaptive. Notably, even development of a species-typical phenotype is generally not a yes or no proposition; rather, the timing and quality of the expected experience shapes the degree of learning and plasticity that occur (Gabard-Durnam & McLaughlin, Reference Gabard-Durnam and McLaughlin2020; McLaughlin & Gabard-Durnam, Reference McLaughlin and Gabard-Durnam2021; Werker & Hensch, Reference Werker and Hensch2015). Experience-expectant processes regulate the types and timing of environmental experiences that are needed for specific cognitive, emotional, and social capacities to develop in a typical manner; environmental contexts that deprive individuals of such experiences during sensitive periods compromise those capacities, which may contribute to the emergence of psychopathology and other maladaptive outcomes (McLaughlin & Gabard-Durnam, Reference McLaughlin and Gabard-Durnam2021; Nelson & Gabard-Durnam, Reference Nelson and Gabard-Durnam2020).
In contrast to species-typicality, adaptive individual differences refer to evolved developmental mechanisms that have been shaped by natural selection to produce different – conditional – phenotypes (e.g., body camouflage in caterpillars matched to spring vs. summer foliage; high vs. low number of functional sweat glands in humans) in response to different expectable environmental cues or conditions (e.g., different seasonal diets in caterpillars during the first 3 days of life; human infant development in warmer vs. cooler climates; Gluckman, Reference Gluckman2004; Greene, Reference Greene1989). These conditional responses are structured and predictable. A key feature of adaptive individual differences is that no single species-typical phenotype exists, independent of context. That is, different contextually-induced phenotypes have been maintained by natural selection as adaptations to environmental variation. The effects of maternal licking and grooming in rats on the development of stress physiology and defensive behavior (Figure 2) provide an example of an experience-expectant process that regulates adaptive individual differences in behavior and neurobiology. Variation in physiology and behavior is structured by early caregiving experiences that show marked variation within the species-typical range (i.e., licking and grooming is an expected environmental experience that varies widely in intensity and frequency across rodents). The effects of licking and grooming are limited to a sensitive period in the first week of life, during which time licking and grooming alters glucocorticoid receptor mRNA expression in the pups’ hippocampal neurons, and then stably calibrates individual differences in gene expression, neural function, and behavior in adulthood in ways that appear to be adaptive in different contexts (Cameron et al., Reference Cameron, Champagne, Parent, Fish, Ozaki-Kuroda and Meaney2005; Meaney, Reference Meaney2010). This structured phenotypic plasticity presumably reflects conditional adaptation to recurring environmental variation encountered over evolutionary history.
In addition to experience-expectant processes, experience-dependent processes shape both adaptive and maladaptive forms of adversity-mediated phenotypic variation. Consider experience-dependent responses to threat, which show directedness of learning (Figure 2). The threat-deprivation model stipulates that heightened neural plasticity early in life alters development in threatening environments in ways that are adaptive in context – facilitating rapid identification of potential dangers and mobilizing defensive responses that promote safety – but that these threat-mediated changes may be detrimental in other environments in which the individual finds themselves (e.g., school, workplace), thus increasing vulnerability for psychopathology and other negative outcomes over time. This increased vulnerability is mediated by traits such as altered fear learning, exaggerated emotional reactivity, and difficulties with emotion regulation (e.g., Sheridan et al., Reference Sheridan, Shi, Miller, Salhi and McLaughlin2020; McLaughlin & Lambert, Reference McLaughlin and Lambert2017).
The harshness-unpredictability model converges with the hypothesis that children mount adaptive responses to danger, but extends it in two ways. First, the model conceptualizes developmental adaptations to adversity in terms of tradeoffs: one system is diminished so that another can be enhanced or preserved (as evidenced, for example, by the growing empirical literature on the physical health costs of positive psychosocial adjustment in the context of childhood adversity; Hostinar & Miller, Reference Hostinar and Miller2019). Second, despite such tradeoffs, the model proposes that childhood adaptations to adversity may eventuate in long-term adaptive changes in biobehavioral systems – adaptive individual differences – that regulate development over the life course (reviewed in Ellis & Del Giudice, Reference Ellis and Del Giudice2014, Reference Ellis and Del Giudice2019). We elaborate further on this idea in the following integrative discussion of harshness and threat.
Integrative discussion of harshness and threat
As discussed earlier, extrinsic morbidity–mortality, or harshness, refers to external sources of disability and death that cannot generally be attenuated or prevented. Over human evolutionary history, experiences involving harm imposed by other agents were an important source of extrinsic morbidity–mortality for children (Frankenhuis & Amir, Reference Frankenhuis and Amir2021). In this section, we focus on the adaptive problem of physical harm from other agents – a distinct selection pressure posed by the environment – and how developmental systems evolved to detect and respond to cues indicating the presence of this adaptive problem.
As shown in Figure 1, harshness is an overarching concept that encompasses aspects of threat and deprivation. Cues to harshness resulting from harm imposed by other agents (i.e., threat-based forms of harshness) vary from more proximal to more distal to the child. Proximal cues correspond to the concept of threat in the threat-deprivation model: actual harm or threat of harm to the child’s survival as a result of experiencing (e.g., physical abuse) or witnessing (e.g., violence between caregivers) interpersonal violence or threat of violence. By contrast, more distal cues signal increased risk of morbidity–mortality from harm imposed by other agents, but do not directly involve experiencing or witnessing violence. Examples of such distal cues are living in close proximity to a recent violent crime that was not personally experienced, premature morbidity–mortality among family or community members, and short life expectancies at the local level. Distal cues may signal morbidity–mortality risk either directly (the child detects and encodes the distal cue) or indirectly through their effects on more proximal experiences of threat (Figure 1). The harshness-unpredictability model assumes that directedness of learning enables individuals to adaptively calibrate development on the basis of proximal and distal cues to threat, without extensive learning trials or modeling all of the necessary features of the environment (see below, Directedness and Specificity of Responses to Threat).
A central difference between the threat and harshness literatures is their relative focus on more proximal versus distal cues to morbidity–mortality from harm imposed by other agents (Figure 1), and the role of such cues in guiding developmental adaptation to more proximal versus distal environments (Table 1). In this section, we make clear that these differences are complementary and hierarchically ordered. We then attempt to show how these differences can be integrated in theory and research on threat-mediated acceleration of biological aging.
The adaptive problem of physical harm from other agents: multi-level cues and multi-level responses
Both detection of threat-based forms of harshness (what is detected and encoded in the environment) and phenotypic responses to threat-based forms of harshness (developmental adaptations) occur at multiple levels. In the threat-deprivation framework, individuals use experiences of threat in their proximal environment (proximal cues in Figure 1), such as experiencing harsh physical discipline, as a basis for concurrently adapting to that environment (Level 1, Table 1). Although surviving childhood is critically important, immediate rearing environments – the here and now – are not the only environments that individuals have had to adapt to over evolutionary history in order to successfully reproduce. In the harshness-unpredictability framework, therefore, early experiences of threat do double duty, as they also operate as a conduit through which young children receive information about larger environmental risks (Belsky et al., Reference Belsky, Steinberg and Draper1991; Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009). That is, individuals use harsh experiences as a basis for adapting not only to the environment in which they are developing, but also to more spatially or temporally distal environments (Level 2 and Level 3, Table 1). As children mature, they should increasingly detect and respond to distal cues directly. In total, if developmental systems function to adapt children to the broader ecological contexts into which they are likely to mature (Level 2 and Level 3), then these systems should have evolved to incorporate information from distal (as well as proximal) cues, as shown in Figure 1.
Cues to harshness are defined broadly in terms of probabilistic relations with extrinsic morbidity–mortality. For example, children may use corporal punishment as a cue to the prevalence of violence at a societal level (Level 2, Table 1), such as the prevalence of warfare or other forms of interpersonal violence among adults, which increase morbidity–mortality risk. Such relations between corporal punishment and societal-level violence are well documented cross-culturally (Lansford & Dodge, Reference Lansford and Dodge2008; see also Eltanamly et al., Reference Eltanamly, Leijten, Jak and Overbeek2021). If these two forms of violence reliably covaried over evolutionary history, then natural selection may have shaped the brain to use physically harsh parenting as a cue to extrinsic morbidity–mortality from physical harm from other agents – and thus as a basis for calibrating development (such as by regulating development toward a faster life history strategy). Although the cue in this example (corporal punishment) involves actually experiencing violence, a broader range of environmental factors may operate as cues to morbidity–mortality from physical harm from other agents (distal cues in Figure 1), such as attending the funerals of multiple peers during adolescence.
Directedness and specificity of responses to threat
Responses to threat, we argue, reflect the directedness of learning: threat is one example of a privileged developmental experience that has greater potential to trigger learning and plasticity than many other types of early experience. Threatening experiences during childhood alter the development of neural networks that underlie salience detection and aversive learning in ways that facilitate rapid identification of potential danger and mobilization of defensive responses (see above, The Threat-Deprivation Framework). As articulated by Öhman and Mineka (Reference Öhman and Mineka2001), the neural systems that underlie fear and aversive learning and govern defensive responses to threats evolved as adaptations to environments characterized by high risk of mortality from physical harm from other agents (see also Rakison, Reference Rakison, Hart and Bjorklund2022). They argue that this evolved “fear module” is selective, automatic, and specific: selective in that it responds preferentially to stimuli that have been associated with threat over the course of human evolution; automatic in that threats capture attention rapidly and produce efficient mobilization of defensive responses without the need for complex neural computations, extensive learning, or elaborated cognitive appraisals; and specific in that it relies on specialized neural circuitry that evolved to promote learning and generate appropriate behavioral responses to specific environmental threats (Öhman & Mineka, Reference Öhman and Mineka2001).Footnote 2 This specificity – and directedness of learning – in neurodevelopmental responses to threat is depicted by the pathway in Figure 1 from threat to morbidity–mortality from harm imposed by other agents (versus the absence of a pathway from threat to morbidity–mortality from insufficient environmental inputs).
An integrated approach calls for examining the role of both proximal and distal cues to physical harm from other agents in calibrating underlying mechanisms involved in regulating developmental adaptations to harsh environments. Developmental mechanisms mediating responses to proximal experiences of threat (e.g., aversive learning) are well-articulated in the threat-deprivation framework, including the work of Öhman and Mineka (Reference Öhman and Mineka2001) described above. In contrast, developmental mechanisms mediating responses to more distal cues to physical harm from other agents have received less empirical attention. One mechanism studied within the harshness-unpredictability framework is perceptions of future life expectancy – whether youth believe they will live a shorter versus longer life (e.g., Chisholm et al., Reference Chisholm, Quinlivan, Petersen and Coall2005; Dickerson & Quas, Reference Dickerson and Quas2021) – which may be calibrated (in part) by more distal cues to physical harm from other agents. Integrating across these levels, developmental mechanisms should incorporate information from both proximal and distal cues, particularly in regulating Level 2 and Level 3 adaptations (Table 1). More immediate phenotypic adaptations to childhood experiences of threat (e.g., rapid threat detection) may eventuate in longer-term changes in biobehavioral systems that are adaptive under dangerous or violent conditions (e.g., faster life history strategies), despite the potential costs of such developmental adaptations (e.g., psychopathology and other negative outcomes).
Levels of developmental adaptation to threat: the case of accelerated biological aging
In this section, we present an integrative example of experience-driven plasticity – threat-mediated acceleration of biological aging – that occurs across multiple levels of developmental adaptation to the environment. Our goal is to demonstrate how integrating across the different levels that are the main focus of the threat-deprivation and harshness-unpredictability models can advance understanding of this important developmental phenomenon.
Experiences of threat may accelerate the pace of brain development (reviewed in Tooley et al., Reference Tooley, Bassett and Mackey2021). For example, accelerated cortical thinning in ventromedial prefrontal cortex (vmPFC) – a region involved in multiple forms of social and emotional processing – has been observed consistently in children exposed to threat, although findings on prefrontal-amygdala connectivity have been more mixed (see Colich et al., Reference Colich, Rosen, Williams and McLaughlin2020, and McLaughlin, Weissman, & Bitran, Reference McLaughlin, Weissman and Bitran2019, for systematic reviews). This accelerated development of neural circuits that support emotion regulation and social information processing may allow for earlier independence from caregivers in environments lacking support or characterized by chronic threat (Callaghan & Tottenham, Reference Callaghan and Tottenham2016; Snell-Rood & Snell-Rood, Reference Snell-Rood and Snell-Rood2020). Consistent with the threat-deprivation framework, this accelerated development may facilitate adaptation to the immediate environment (Table 1, Level 1) by sharpening threat detection and response mechanisms (e.g., earlier detection of danger and more rapid mobilization of defensive responses).
A hypothesis advanced within the harshness-unpredictability framework is that harsh experiences in childhood carry predictive information about the larger and/or future state of the environment (i.e., danger and consequent high mortality; Table 1, Level 2 and Level 3), and that development of a faster life history strategy in this context – including earlier puberty – is a conditional adaptation that functions to increase the probability of reproduction prior to disability or death (Belsky, Reference Belsky2019; Belsky et al., Reference Belsky, Steinberg and Draper1991; Ellis, Reference Ellis2004). Consistent with this hypothesis, a recent meta-analysis found that childhood experiences of threat, but not deprivation, were associated with accelerated biological aging in childhood and adolescence, including earlier pubertal development (Colich et al., Reference Colich, Rosen, Williams and McLaughlin2020).Footnote 3 In this case, the phenotypic response – accelerated transition from the pre-reproductive to the reproductive phase of the human lifecycle – was distal to early experiences of threat. Early experiences of threat appear to calibrate development of the hypothalamic-pituitary-gonadal axis in ways that – years later – accelerate pubertal maturation. In turn, accelerated puberty may affect other life history-related traits such as risky sexual behavior and timing of reproduction (reviewed in Baams et al., Reference Baams, Dubas, Overbeek and Van Aken2015; Ellis, Reference Ellis2004). Taken together, early experiences of threat regulate both immediate and predictive adaptive responses, which operate through different mechanisms and have different functions in relation to survival and reproduction. Integrating the threat-deprivation and harshness-unpredictability models brings into focus these multiple levels of developmental adaptation to the environment, thus affording a more complete understanding of development in the context of adversity.
Directions for future research
We highlight concrete directions for future research that could test these propositions about harshness and threat. These include evaluating whether distal cues to harshness and threat engage the same brain processes as direct experiences of violence, as well as other possible biological-embedding mechanisms (e.g., the epigenome, microbiome, or immune system), and whether these processes, in turn, are associated with life history phenotypes. For example, these ideas could be investigated in a study that simultaneously examines proximal (e.g., harsh punishment) and distal (e.g., levels of neighborhood violence) indicators of threat in relation to aversive learning, social information processing, and/or emotional reactivity, or that tests whether these processes are mechanisms linking distal and proximal threat indicators with earlier age of sexual maturation and/or sexual behavior. Specific directions for future research in this area are outlined in Table 2.
Integrative discussion of harshness and deprivation
In addition to harm imposed by other agents, insufficient environmental inputs (material, energetic, or social) were another important source of harshness over human evolutionary history (Frankenhuis & Amir, Reference Frankenhuis and Amir2021). In this section, we focus on the adaptive problem of insufficient environmental inputs – a distinct set of selection pressures posed by the environment – and how developmental systems evolved to detect and respond to cues indicating the presence of this adaptive problem (Figure 1).
In early childhood, experiences of deprivation are often mediated through caregiver behavior, which is responsive to larger ecological conditions such as social/alloparental support, socioeconomic adversity, pathogen stress, and community-level violence (Belsky, Reference Belsky1984; Belsky et al., Reference Belsky, Steinberg and Draper1991; Eltanamly et al., Reference Eltanamly, Leijten, Jak and Overbeek2021; Quinlan, Reference Quinlan2007). The quantity and quality of interactions with caregivers contribute to early childhood experiences of deprivation. In traditional human societies, and by inference over human evolutionary history, some caregiver-mediated forms of deprivation (e.g., early weaning, low provisioning of food, low sleeping proximity to infants, reduced carrying of children, and caregiver neglect) increase childhood morbidity–mortality risk from causes such as malnutrition, disease, physical exposure, predation, and conspecific violence (Frankenhuis & Amir, Reference Frankenhuis and Amir2021; Quinlan, Reference Quinlan2007; Volk & Atkinson, Reference Volk and Atkinson2008, Reference Volk and Atkinson2013). For example, in traditional human societies, maternal mortality has catastrophic and universally negative effects on the survival of young children prior to weaning age (Sear & Mace, Reference Sear and Mace2008). From this perspective, significant experiences of deprivation (especially deprivation experiences that were reliably associated with morbidity–mortality over evolutionary history) are nested within harshness (Figure 1). Our developmental systems should have evolved to detect and respond to these forms of deprivation.Footnote 4
Like cues to threat-based forms of harshness, cues to deprivation-based forms of harshness vary from more proximal to the child (e.g., caregiver neglect, limited social or cognitive input, food scarcity, homelessness and other forms of material deprivation) to more distal to the child (e.g., famine, drought, resource shortages, unemployment, and poverty). Proximal cues correspond to the concept of deprivation in the threat-deprivation model: infrequent or low-quality environmental inputs experienced by the child. Distal cues reflect ecological factors linked to deprivation. As shown in Figure 1, distal cues may signal morbidity–mortality risk from insufficient environmental inputs either directly or indirectly (through proximal cues).
In this paper, we have stipulated that threat-based forms of harshness (as well as unpredictability) both constrain development and regulate it toward strategies that are adaptive under stressful conditions. These adaptive arguments postulate that developmental adaptations to adversity evolved in response to environmental variation and function to match the developing phenotype to relevant conditions. Such arguments may be less applicable to deprivation, however, especially in its more severe forms. In many cases, developmental responses to deprivation can be more parsimoniously explained as attempts to spare or preserve function. Such responses may enable individuals to make the best of bad circumstances in the context of substantial developmental constraints (i.e., low survival, poor growth, reduced reproduction; see Bogin et al., Reference Bogin, Silva and Rios2007) imposed by deprivation-mediated tradeoffs. Making the best of bad circumstances means that individuals adjust their phenotypes to the deprived conditions under which they are developing – allowing them to achieve better survival and reproductive outcomes than if they did not adjust their phenotypes – but still fare worse than peers who did not experience deprivation. In this section, we consider such tradeoffs from the perspective of life history theory. We propose that (a) deprivation-mediated tradeoffs fundamentally involve sacrificing growth to reduce maintenance costs, and that such tradeoffs occur in response to both energetic deprivation (central to the harshness-unpredictability framework) and social/cognitive deprivation (central to the threat-deprivation framework); and (b) individuals mount responses to deprivation that enable them to make the best of bad circumstances.
Energetic deprivation
A major form of deprivation is nutritional, or energetic, deprivation. Within a life history framework, a central assumption is that natural selection has favored physiological mechanisms that track variation in energetic conditions (i.e., resource availability, energy balance, and related physical condition) and adjust growth to match that variation (Ellis, Reference Ellis2004; Ellison, Reference Ellison2003). Consequently, a central resource-allocation tradeoff, beginning in the prenatal period, is between maintenance and growth (for an extensive review, see Bogin et al., Reference Bogin, Silva and Rios2007). A baseline level of energy expenditure is needed to maintain basic functioning and repair or preserve the soma (e.g., brain metabolism, digestion, immune function, cellular/DNA repair, pathogen and predator defenses). Above these baseline investments in maintenance, resources can be allocated to growth and eventually reproduction. Growth builds capacities that enhance overall reproductive potential; it encompasses developmental processes and activities that increase physical size as well as social and cognitive competencies (e.g., development of abstract information-processing capacities, acquisition of skills and knowledge).
Consistently enriched or supportive conditions in early and middle childhood may signal to the individual that investments in growth are sustainable. Conversely, conditions of chronic resource scarcity may shift individuals toward development of an energy-sparing phenotype that economizes somatic maintenance costs by limiting growth. Energetic deprivation was a common feature of ancestral human environments. In subsistence-level populations, low energy availability co-occurs with both periods of negative energy balance (when caloric expenditures exceed caloric intake) and high pathogen burden/immunological challenges (e.g., McDade, Reference McDade2003; Urlacher et al., Reference Urlacher, Ellison, Sugiyama, Pontzer, Eick, Liebert, Cepon-Robins, Gildner and Snodgrass2018). Energetic deprivation, as instantiated in this co-occurring set of conditions, is associated with slower growth, later sexual maturation, and smaller body size (e.g., Bogin et al., Reference Bogin, Silva and Rios2007; Ellis, Reference Ellis2004; Ellison, Reference Ellison2003); relatively low progesterone concentrations and reduced fecundity in women (Ellison, Reference Ellison2003; Jasienska et al., Reference Jasienska, Bribiescas, Furberg, Helle and Núñez-de la Mora2017); and relatively low testosterone concentrations and reduced skeletal muscle tissue in men (Bribiescas, Reference Bribiescas2001, Reference Bribiescas2010). These adjustments of life history-related traits to chronic ecological conditions are generally considered an example of adaptive phenotypic plasticity (Ellison, Reference Ellison2003; Ellis, Reference Ellis2004). In this case, energetic deprivation regulates development toward a slower life history strategy that is maintenance-cost-effective: growth is constrained, and reproductive capacity is achieved later in development.
Although energetic deprivation constrains growth and reproductive maturation, individuals should still mount responses to energetic deprivation that enable them to make the best of bad circumstances. Energetic deprivation, and the closely related conditions of pathogen stress (e.g., McDade, Reference McDade2003; Urlacher et al., Reference Urlacher, Ellison, Sugiyama, Pontzer, Eick, Liebert, Cepon-Robins, Gildner and Snodgrass2018) and warfare related to food shortages/food instability (Ember & Ember, Reference Ember and Ember1992), were major co-occurring causes of extrinsic morbidity–mortality over evolutionary history. These co-occurring environmental factors may have opposing effects on the development of life history strategies. Like developmental exposures to threat, cues to high pathogen stress (e.g., high local fatality rates from infectious disease) predict faster life history-related traits (Lu et al., Reference Lu, Wang, Liu and Chang2021; Quinlan, Reference Quinlan2007).Footnote 5 This means that shifts toward slower life history strategies induced by energetic deprivation often occur in the context of countervailing shifts toward faster life history strategies in response to cues to extrinsic morbidity–mortality from other sources. For example, in a comparison of 22 small-scale human societies (hunter–gatherers and subsistence-based horticulturalists), poor energetic conditions were associated with later ages of menarche and first birth, whereas higher childhood mortality rates were associated with earlier ages of menarche and first birth (Walker et al., Reference Walker, Gurven, Hill, Migliano, Chagnon, De Souza, Djurovic, Hames, Hurtado, Kaplan, Kramer, Oliver, Valeggia and Yamauchi2006). 
Likewise, in cohort studies in Estonia, the Philippines, and Brazil, complex adversity exposures involving energetic deprivation together with other forms of harshness or unpredictability (e.g., poverty, parental instability, sibling death) predicted both later pubertal development and earlier ages at first reproduction (Gettler et al., Reference Gettler, McDade, Bragg, Feranil and Kuzawa2015; Hõrak et al., Reference Hõrak, Valge, Fischer, Mägi and Kaart2019; Valge et al., Reference Valge, Meitern and Hõrak2021; Wells et al., Reference Wells, Cole, Cortina-Borja, Sear, Leon, Marphatia, Murray, Wehrmeister, Oliveira, Gonçalves, Oliveira and Menezes2019). In the Brazilian cohort, other shifts toward faster life history-related traits were also observed, including greater risky behavior indicative of future discounting (i.e., smoking, criminal behavior) (Wells et al., Reference Wells, Cole, Cortina-Borja, Sear, Leon, Marphatia, Murray, Wehrmeister, Oliveira, Gonçalves, Oliveira and Menezes2019). In total, the literature on energetic deprivation in the context of harshness/unpredictability provides a textbook case of the complex – and sometimes countervailing – nature of developmental responses to adversity. Despite the negative effects of energetic deprivation on growth, broader phenotypic responses may still make the best of bad circumstances. Within ecological constraints, this potentially involves diverting resources toward earlier reproduction, as well as other shifts toward faster life history-related traits, particularly in relation to other cues to extrinsic morbidity–mortality or unpredictability.
Social/cognitive deprivation
The complex effects of energetic deprivation on reproductive development and behavior provide a model for considering the effects of social/cognitive deprivation on neurodevelopment and learning. Our developmental systems may have evolved to treat experiences of deprivation as privileged sources of information, using them as a basis for implementing developmental tradeoffs favoring maintenance over growth. Such tradeoffs fundamentally concern neurodevelopment, given the unusually high energetic costs of the human brain. Indeed, glucose consumed by the brain accounts for roughly 66% of the body’s resting metabolic rate and 43% of total energy expenditure in middle childhood (Kuzawa et al., Reference Kuzawa, Chugani, Grossman, Lipovich, Muzik, Hof, Wildman, Sherwood, Leonard and Lange2014). This high energetic cost reflects the size and complexity of the human brain, with its trillions of functional connections. Achieving high levels of neural complexity is costly – in terms of time, energy, and risk – and, critically, depends on adequate parental investment and social support (Snell-Rood & Snell-Rood, Reference Snell-Rood and Snell-Rood2020).
When such investment and support are lacking from the environment, one result may be reduced investment in neural development. Indeed, social and cognitive deprivation related to institutionalization, neglect, and other environments characterized by low levels of cognitive stimulation, such as those associated with lower socioeconomic status, has been consistently linked to reductions in brain volume, cortical surface area, and cortical thickness in children and adolescents. Reduced cortical volume, surface area, and thickness in children who have experienced deprivation are global and widespread, occurring not only in regions of association cortex but also in sensory cortex (Hanson, Hair et al., Reference Hanson, Hair, Shen, Shi, Gilmore, Wolfe and Pollak2013; Herzberg et al., Reference Herzberg, Hodel, Cowell, Hunt, Gunnar and Thomas2018; Hodel et al., Reference Hodel, Hunt, Cowell, Van Den Heuvel, Gunnar and Thomas2015; Mackey et al., Reference Mackey, Finn, Leonard, Jacoby-Senghor, West, Gabrieli and Gabrieli2015; McLaughlin, Sheridan, Winter et al., Reference McLaughlin, Sheridan, Winter, Fox, Zeanah and Nelson2014; Noble et al., Reference Noble, Houston, Brito, Bartsch, Kan, Kuperman, Akshoomoff, Amaral, Bloss, Libiger, Schork, Murray, Casey, Chang, Ernst, Frazier, Gruen, Kennedy, Van Zijl, Mostofsky and Sowell2015; Sheridan et al., Reference Sheridan, Fox, Zeanah, McLaughlin and Nelson2012). The precise cellular mechanisms contributing to thinner cortex, and the question of whether these patterns reflect an acceleration or delay in neurodevelopment, remain unknown.
Social/cognitive forms of deprivation in early life are also associated with reductions in the integrity of structural connections between brain areas across a wide range of white matter tracts (Eluvathingal et al., Reference Eluvathingal, Chugani, Behen, Juhasz, Muzik and Maqbool2006; Govindan et al., Reference Govindan, Behen, Helder, Makki and Chugani2010; Hanson, Adluru et al., Reference Hanson, Adluru, Chung, Alexander, Davidson and Pollak2013; Rosen et al., Reference Rosen, Sheridan, Sambrook, Meltzoff and McLaughlin2018). In total, substantial evidence suggests that deprivation in childhood is associated with a pattern of neurodevelopment that results in a smaller brain, as reflected by global reductions in cortical gray matter volume, surface area, and thickness, as well as a brain that is less connected, as revealed by a global decrease in the structural integrity of white matter tracts.
Deprivation-mediated reductions in neural growth and structural connectivity have been hypothesized to result in lower brain metabolic rates (Snell-Rood & Snell-Rood, Reference Snell-Rood and Snell-Rood2020). Thus, social/cognitive deprivation, like energetic deprivation, may shift individuals toward development of a maintenance-cost effective phenotype by constraining neural development. Stated differently, the effects of energetic and social/cognitive deprivation can be understood in terms of the same underlying tradeoff favoring maintenance over growth.
Research conducted within the threat-deprivation framework, and in developmental cognitive neuroscience more generally, has taken a deficit-based approach to deprivation. This is understandable, given the well-documented associations between social/cognitive deprivation and constraints on neurodevelopment and learning (see above, The Threat-Deprivation Framework). Nonetheless, an evolutionary-developmental perspective implies that children mount responses to deprivation that make the best of bad circumstances (see Ellis et al., Reference Ellis, Abrams, Masten, Sternberg, Tottenham and Frankenhuis2020). Such responses may involve the development of stress-adapted skills, or “hidden talents,” that enable adaptation within high-adversity contexts (Ellis et al., Reference Ellis, Bianchi, Griskevicius and Frankenhuis2017; Frankenhuis & de Weerth, Reference Frankenhuis and de Weerth2013), including rearing environments characterized by deprivation. Homeless youth, for example, generally experience considerable social, cognitive, and material deprivation. Although homeless youth tend to perform worse than comparison youth on executive function tasks, they have been found to perform as well or better on tests of creativity (Dahlman et al., Reference Dahlman, Bäckström, Bohlin and Frans2013; Fry, Reference Fry2018). Heightened creativity in the context of homelessness is presumably a stress-adapted skill for solving problems relevant to surviving in a deprived and unpredictable environment. Likewise, in explore-exploit decision-making tasks, previously institutionalized children use more exploitative strategies than peers raised in families (Humphreys et al., Reference Humphreys, Lee, Telzer, Gabard-Durnam, Goff, Flannery and Tottenham2015; Kopetz et al., Reference Kopetz, Woerner, MacPherson, Lejuez, Nelson, Zeanah and Fox2019; Loman et al., Reference Loman, Johnson, Quevedo, Lafavor and Gunnar2014). 
This exploitative strategy was detrimental under forgiving experimental conditions but beneficial when conditions became harsh (i.e., when parameters of the task changed to hasten punishment) (Humphreys et al., Reference Humphreys, Lee, Telzer, Gabard-Durnam, Goff, Flannery and Tottenham2015). More generally, children exposed to deprived (as well as threatening) early environmental conditions – poverty, maternal disengagement, high neighborhood crime – may develop enhanced problem-solving skills for extracting fleeting or unpredictable rewards from the environment (Li et al., Reference Li, Sturge-Apple and Davies2021; Sturge-Apple et al., Reference Sturge-Apple, Davies, Cicchetti, Hentges and Coe2017; Suor et al., Reference Suor, Sturge-Apple, Davies and Cicchetti2017). Research documenting such hidden talents, however, should not be taken to condone exposing children to experiences that are obviously impairing in modern life. Instead, observations of how neural and cognitive function adapt to harsh early circumstances may support a strengths-based approach to intervention that leverages stress-adapted skills (Ellis et al., Reference Ellis, Bianchi, Griskevicius and Frankenhuis2017, Reference Ellis, Abrams, Masten, Sternberg, Tottenham and Frankenhuis2020).
Directions for future research
Little attempt has been made to study the potentially opposing effects of energetic deprivation and other forms of harshness on life history-related traits, or to link the patterns of cognitive and neurodevelopment associated with social/cognitive deprivation with life history traits. These represent obvious avenues for future research to investigate the ideas advanced here. Research is also needed to test the “maintenance-cost effective phenotype” hypothesis, especially in relation to neural development. See Table 2 for more specific directions for future research.
Integrative discussion of environmental unpredictability
In the harshness-unpredictability framework, unpredictability is a property of harshness. That is, unpredictability is defined as stochastic variation in harshness. In the preceding integrative sections, we discussed threat-based and deprivation-based forms of harshness. Here we advance an integrative approach to unpredictability that focuses on stochastic variation in cues to both of these forms of harshness (depicted by dotted lines in Figure 1), with relevant cues ranging from more proximal to the child (immediate experiences of threat and deprivation) to more distal to the child (ecological factors linked to threat and deprivation). This integration points to important future directions for research, as the threat-deprivation model has not previously addressed how the influences of threat and deprivation vary based upon their unpredictability, and the harshness-unpredictability model has mainly focused on unpredictability at the distal ecological level. For example, we do not know whether children develop similar or different adaptations to stochastic variation in threat (unpredictable threat) compared to stochastic variation in deprivation (unpredictable deprivation). And we do not know how more distal cues to unpredictability relate to more proximal cues. As a first step toward achieving this integration, we focus on integrating the harshness-unpredictability model with recent work on neurodevelopmental processes influenced by environmental unpredictability.
The study of unpredictability fundamentally concerns change or fluctuation in environmental conditions. Existing work on the harshness-unpredictability model and in developmental cognitive neuroscience has focused on how developmental systems detect and respond to unpredictable change or fluctuation in the environment. This issue is equally relevant to understanding stochastic variation in cues to morbidity–mortality from both sources of harshness shown in Figure 1. Following Young et al. (Reference Young, Frankenhuis and Ellis2020), we address how developmental systems may respond to properties of the environment in terms of both the ancestral cue approach (which is common within the harshness-unpredictability framework and focuses on Level 2 and Level 3 of developmental adaptation to the environment) and the statistical learning approach (which aligns with developmental cognitive neuroscience research and focuses on Level 1). Our goal is to compare these approaches and demonstrate how they can inform each other to advance theory and research on environmental unpredictability.
Ancestral cues versus statistical learning
The statistical structure of dimensions of adversity can be defined in terms of means, variances (average deviations from the mean), autocorrelations (the correlation of an environmental condition with itself over time or space), and cue reliabilities (the extent to which an experience or event provides information about a current or future state of the world) (see Frankenhuis et al., Reference Frankenhuis, Nettle and Dall2019; Young et al., Reference Young, Frankenhuis and Ellis2020). Developmental systems evolve to detect and respond to this statistical structure. There are at least two ways in which this may occur.
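To make these parameters concrete, the following minimal sketch (our illustration; the series values are simulated, not drawn from any cited study) estimates the mean, variance, and lag-1 autocorrelation of a hypothetical time series of an adversity cue:

```python
# Illustrative sketch (simulated data, not from any cited study):
# estimating the statistical structure of an environmental time series.
# x[t] = level of an adversity cue (e.g., food scarcity) at time t.

def mean(x):
    return sum(x) / len(x)

def variance(x):
    m = mean(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def lag1_autocorrelation(x):
    """Correlation of the series with itself shifted by one time step.
    High values imply stable, forecastable conditions; values near zero
    or below imply stochastic (unpredictable) variation."""
    m, var = mean(x), variance(x)
    cov = sum((x[t] - m) * (x[t + 1] - m) for t in range(len(x) - 1)) / (len(x) - 1)
    return cov / var

stable = [1, 1, 2, 2, 3, 3, 4, 4]  # slowly drifting conditions
noisy = [1, 4, 1, 4, 1, 4, 1, 4]   # rapidly alternating conditions

print(lag1_autocorrelation(stable))  # positive: conditions persist over time
print(lag1_autocorrelation(noisy))   # negative: conditions flip between steps
```

On this view, a developmental system attuned to autocorrelation is estimating whether current conditions are informative about future conditions, which is exactly the quantity that distinguishes harsh-but-stable from harsh-and-unpredictable environments.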
First, organisms may evolve to use accumulated experiences as raw data (without privileging particular sources of information) to build models of the statistical structure of their environment, such as stability versus change in dimensions of adversity. Through an ongoing computational process (e.g., contingency analysis of caregiver behavior, correcting prediction errors in estimates of danger), organisms develop and revise statistical estimates of patterns of environmental experience and use these estimates to guide learning and development. This method of estimating environmental parameters is referred to as statistical learning (Young et al., Reference Young, Frankenhuis and Ellis2020).
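As a hedged illustration of such an ongoing computational process, the sketch below uses a simple delta-rule estimator (our construction, not a model specified in the cited work): a running estimate of danger is revised after each experience in proportion to the prediction error.

```python
# Hypothetical delta-rule sketch of statistical learning (our construction):
# an estimate of environmental danger is revised after each experience
# in proportion to the prediction error.

def update_estimate(estimate, observation, learning_rate=0.2):
    prediction_error = observation - estimate
    return estimate + learning_rate * prediction_error

danger_estimate = 0.0          # initial belief: environment is safe
experiences = [1, 1, 0, 1, 1]  # 1 = threatening event observed, 0 = none

for obs in experiences:
    danger_estimate = update_estimate(danger_estimate, obs)

print(danger_estimate)  # estimate has shifted toward the observed base rate of threat
```

The key property this sketch captures is that statistical learning requires accumulated experience: the estimate converges on the environment's true statistics only gradually, which is precisely the cost that ancestral cues (discussed next) may allow organisms to bypass.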
Second, in addition to statistical learning, organisms may evolve to use ancestral cues to estimate properties of the environment. This approach conceptualizes ancestral cues, as we have in this paper, as privileged sources of information about fitness-relevant dimensions of the environment. For example, individuals may use an ancestral cue, such as a parental transition or the appearance of a new neighborhood gang, to draw inferences about environmental unpredictability (e.g., that autocorrelations in harshness are likely to change), without requiring the same level of repeated experiences that are needed to make such an inference through statistical learning. Ancestral cues are thus more efficient, potentially enabling individuals to adaptively regulate development based on limited information (as per the hidden structure of the environment). For example, threat is an ancestral cue to morbidity–mortality from harm imposed by other agents (Öhman & Mineka, Reference Öhman and Mineka2001), which children respond to efficiently and selectively, as discussed above.
Measurement of unpredictability
Figure 1 depicts environmental unpredictability in terms of stochastic variation in both proximal cues to harshness (immediate experiences of threat and deprivation) and distal cues to harshness (ecological factors linked to threat and deprivation). For both proximal and distal cues, unpredictability can be measured in terms of ancestral cues (i.e., cues hypothesized to have indicated environmental unpredictability in our evolutionary past) and/or statistical learning (direct measures of statistical patterns of change in the environment).
The harshness-unpredictability framework has largely focused on ancestral cues to environmental unpredictability at the ecological level (distal cues in Figure 1). For example, an influential measurement model within the harshness-unpredictability framework uses parental transitions, residential changes, and parental job changes as observed indicators of latent unpredictability (Belsky et al., Reference Belsky, Schlomer and Ellis2012; Simpson et al., Reference Simpson, Griskevicius, Kuo, Sung and Collins2012). Some distal cues to unpredictability, such as changes in neighborhood safety as a result of changes in gang activity or changes in income as a result of job loss, may lead directly to statistical changes in threat or deprivation (proximal cues in Figure 1), such as increased variation in experiences of danger or food availability. Other distal cues to unpredictability, such as parental transitions or geographic relocations, may signal probabilistic changes in the statistical structure of threat or deprivation. For example, over evolutionary history, parental transitions involving the exit of a biological father from the home followed by the entry of a stepfather may have been reliably associated with change points in threat (e.g., Daly & Wilson, Reference Daly and Wilson1998). Research on distal cues to unpredictability is especially relevant to larger developmental programming questions (e.g., regulation of adaptive individual differences in life history strategies), corresponding to Level 2 and Level 3 developmental adaptations to the environment (Table 1). This body of research was reviewed earlier (The Harshness-Unpredictability Framework).
The statistical learning approach to environmental unpredictability involves directly measuring, or manipulating, environmental parameters that define patterns of statistical variation. The goal is to objectively track, or control, fluctuations in lived experiences over time or space. Statistical learning assumes that neural mechanisms use these experiences (e.g., stability versus change in food availability) to develop models of the environment that guide development and behavior (though not necessarily consciously or explicitly). Some research on the role of distal cues to unpredictability in regulating life history-related traits has employed measures consistent with a statistical learning approach (e.g., Li et al., Reference Li, Liu, Hartman and Belsky2018, which calculated residual variances in income-to-needs ratio across multiple timepoints to capture random variation in socioeconomic conditions). Research on the role of proximal cues to unpredictability in regulating children’s neurodevelopment has also used measures aligned with a statistical learning approach. Consistent with the threat-deprivation framework, this research has focused on Level 1 (Table 1) of developmental adaptation to the environment, with unpredictability measured at the micro-family level based on variation, for instance, in recorded noise levels in the home (e.g., Vrijhof et al., Reference Vrijhof, van der Voort, van IJzendoorn and Euser2018) or inconsistency in observed maternal sensory signals (e.g., Davis et al., Reference Davis, Stout, Molet, Vegetabile, Glynn, Sandman, Heins, Stern and Baram2017). This latter approach has involved calculating a specific unpredictability statistic – entropy (Davis et al., Reference Davis, Stout, Molet, Vegetabile, Glynn, Sandman, Heins, Stern and Baram2017) – based on the degree to which a mother’s next sensory signal (auditory, tactile, or visual) can be predicted based on her last sensory signal.
Measures of entropy have proven useful in predicting neurodevelopmental outcomes in animal and human models, such as performance on hippocampus-dependent memory tasks and motivation to approach rewards (Davis et al., Reference Davis, Stout, Molet, Vegetabile, Glynn, Sandman, Heins, Stern and Baram2017; Molet et al., Reference Molet, Heins, Zhuo, Mei, Regev, Baram and Stern2016).
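To illustrate how such an entropy statistic can be computed, the sketch below (our construction; the published estimator may differ in detail) measures how predictable each caregiver signal is from the one preceding it:

```python
# Illustrative transition-entropy sketch (our construction; the published
# estimator may differ in detail). Higher entropy means the next caregiver
# signal is less predictable from the previous one.
from collections import Counter, defaultdict
from math import log2

def transition_entropy(signals):
    """Average Shannon entropy of P(next signal | current signal)."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(signals, signals[1:]):
        transitions[prev][nxt] += 1
    total = len(signals) - 1
    entropy = 0.0
    for counts in transitions.values():
        n = sum(counts.values())
        h = -sum((c / n) * log2(c / n) for c in counts.values())
        entropy += (n / total) * h  # weight by how often this signal precedes another
    return entropy

predictable = ["auditory", "tactile"] * 10  # strict alternation of signals
unpredictable = ["auditory", "tactile", "tactile", "visual", "auditory", "visual",
                 "tactile", "auditory", "auditory", "visual", "tactile", "visual"]

print(transition_entropy(predictable))    # 0.0: each signal fully determines the next
print(transition_entropy(unpredictable))  # > 0: the next signal is uncertain
```

Note that entropy is insensitive to the overall rate of signaling: a mother who signals rarely but in a fixed order yields low entropy, whereas frequent but haphazard signaling yields high entropy, consistent with unpredictability being a dimension distinct from deprivation itself.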
Integrating unpredictability into the threat-deprivation framework
Integrating the construct of unpredictability into the threat-deprivation framework will require extending this work to evaluate whether the neurodevelopmental consequences of threat and deprivation vary as a function of the predictability of these experiences. Little existing work has examined this possibility directly. However, the role of experimentally manipulated unpredictability in shaping aversive learning has been examined extensively in animal models. This work suggests that unpredictability may enhance fear and alter the types of aversive learning processes that are associated with experiences of threat-related adversity (Foa et al., Reference Foa, Zinbarg and Rothbaum1992). In human research, experiencing unpredictable (but not aversive) auditory stimuli is associated with increased contextual fear (Vansteenwegen et al., Reference Vansteenwegen, Berico, Vervliet, Marescau and Hermans2008), sustained amygdala activation, and enhanced attention bias towards subsequent threatening stimuli (Herry et al., Reference Herry, Bach, Esposito, Di Salle, Perrig, Scheffler, Luthi and Seifritz2007). Taken together, this research suggests that as the unpredictability of early-life threat – or even of sensory stimuli more generally – increases, its influence on emotional reactivity and emotional learning is likely to be accentuated; this possibility is an important direction for future research (see Table 2).
Whether the neurocognitive outcomes associated with deprivation vary based on the predictability of these experiences is also unknown. One of the domains most consistently influenced by early-life deprivation is higher-order cognition, including executive functions. Adults and children who report growing up in more unpredictable family environments, or who experienced more family instability while growing up, tend to display reduced inhibitory control and working memory capacity, but enhanced abilities for flexibly switching between tasks or mental sets and for tracking novel environmental information, particularly when in an experimentally induced mindset of stress/uncertainty (Fields et al., Reference Fields, Bloom, VanTieghem, Harmon, Choy, Camacho, Gibson, Umbach, Heleniak and Tottenham2021; Mittal et al., Reference Mittal, Griskevicius, Simpson, Sung and Young2015; Young et al., Reference Young, Griskevicius, Simpson, Waters and Mittal2018). Importantly, the two prospective studies of family instability (Fields et al., Reference Fields, Bloom, VanTieghem, Harmon, Choy, Camacho, Gibson, Umbach, Heleniak and Tottenham2021; Mittal et al., Reference Mittal, Griskevicius, Simpson, Sung and Young2015) included participants with substantial histories of deprivation (e.g., institutionalization, poverty). This suggests the intriguing possibility that when experiences of deprivation are unpredictable, the impact on at least some aspects of executive functioning (i.e., cognitive flexibility) may be mitigated. Future work is needed to evaluate this possibility directly (see Table 2).
Directions for future research
Developmental cognitive neuroscience research on environmental unpredictability could be extended within the current integrated model of dimensions of environmental experience by considering unpredictability cues at multiple levels (e.g., caregiver transitions in addition to maternal sensory signals and family routines). Existing work on unpredictability from a statistical learning perspective has focused largely on deficits in cognitive performance, but unpredictability has also been linked to enhanced performance in some domains of executive functions (in ways that might enhance functioning in unstable environments; Mittal et al., Reference Mittal, Griskevicius, Simpson, Sung and Young2015; Young et al., Reference Young, Griskevicius, Simpson, Waters and Mittal2018) as well as to regulation of life history strategies. Determining whether neurodevelopmental adaptations to unpredictable environments (e.g., within the family) are associated with developmental adaptations to broader ecological contexts (i.e., life history strategies) is another important goal for future research. See Table 2 for details on future research directions related to unpredictability.
Summary of integrated model of dimensions of environmental experience
Here we summarize and discuss implications of the current integrated model of dimensions of environmental experience. This integration, we argue, advances knowledge in two basic ways. First, a mechanistic and neurobiological analysis of development informs our understanding of dimensions of adversity in ways that refine and extend the harshness-unpredictability model. Second, an evolutionary analysis of why development operates the way it does across different environmental contexts informs our understanding of the developmental consequences of early adversity in ways that refine and extend the threat-deprivation model. We discuss these different implications in turn.
Refining and extending the harshness-unpredictability model
In the original harshness-unpredictability model (Ellis et al., Reference Ellis, Figueredo, Brumbach and Schlomer2009), harshness was presented as a catch-all dimension reflecting age-specific rates of morbidity–mortality. The focus was on cues to harshness from disparate sources. Without a focus on neurodevelopmental mechanisms, Ellis et al. (Reference Ellis, Figueredo, Brumbach and Schlomer2009) stated that “… relevant cues to external morbidity and mortality risks may be conveyed to children by parents through harsh (abusive) or unsupportive (neglectful) childrearing practices” (p. 245), with both kinds of rearing experiences contributing to a common pathway toward faster life history strategies. In Figure 1, we present a revision of this original idea, now decomposing harshness into two distinct adaptive problems: morbidity–mortality from harm imposed by other agents (signaled by proximal and distal cues to threat) and morbidity–mortality from insufficient environmental inputs (signaled by proximal and distal cues to deprivation).Footnote 6 These distinct adaptive problems provide the evolutionary basis for why threat-based and deprivation-based forms of harshness have distinct influences on development. This distinction is supported by research showing that experiences of threat and deprivation have distinct influences on cognitive, emotional, and neural development (McLaughlin et al., Reference McLaughlin, Sheridan, Humphreys, Belsky and Ellis2021).
Most importantly, threat-based and deprivation-based forms of harshness calibrate the development of life history strategies in both convergent and divergent ways. Whereas experiences that historically signaled increased morbidity–mortality risk from threat regulate development toward faster life history strategies, experiences that historically indicated increased morbidity–mortality risk from deprivation have more complex effects. As reviewed above (Integrative Discussion of Harshness and Deprivation), energetic deprivation – and closely related factors such as pathogen stress and high local childhood mortality rates – are associated with shifts toward both slower and faster life history traits, from later puberty to earlier timing of reproduction. Although deprivation in high-income countries does not usually involve high levels of energetic and pathogen stress,Footnote 7 deprivation and threat still appear to have divergent effects on life-history-related traits, as demonstrated in a recent meta-analysis (Colich et al., Reference Colich, Rosen, Williams and McLaughlin2020). Indeed, in an analysis of nearly 10,000 adolescents from the National Comorbidity Survey (Thomas et al., Reference Thomas, Colich, McLaughlin and Sumner2021), experiences of both threat and deprivation were associated with a greater likelihood of having engaged in sexual activity; however, greater exposure to threat – but not deprivation – was associated with more risky sexual behavior (e.g., earlier age at first sex, more sexual partners in the past year, inconsistent condom use).
In summary, extending the original harshness-unpredictability model, threat and deprivation appear to have at least partially divergent effects on life history strategies, which emerge even in populations without significant energetic constraints on development. Much research conducted within the harshness-unpredictability framework has employed general measures of harshness, such as socioeconomic adversity (which encompasses both threat and deprivation; e.g., Green et al., Reference Green, McLaughlin, Berglund, Gruber, Sampson, Zaslavsky and Kessler2010; McLaughlin et al., Reference McLaughlin, Green, Gruber, Sampson, Zaslavsky and Kessler2012). Although such measures are useful for capturing overall morbidity–mortality risk from external causes (i.e., harshness), a more nuanced approach – differentiating between threat and deprivation – could enhance understanding of developmental pathways linking dimensions of adversity to variation in life history strategies.
We have argued that significant experiences of deprivation favor developmental tradeoffs that sacrifice growth to reduce maintenance costs. Conversely, environments characterized by high levels of sustenance, nurturance, and cognitive stimulation promote growth. In the parlance of life history theory, these kinds of experiences function to build “embodied capital” (Kaplan et al., Reference Kaplan, Hill, Lancaster and Hurtado2000), as manifest in a larger and more structurally connected brain that supports development of complex cognitive capacities and acquisition of skills and knowledge. Such costly investments in embodied capital are central to the development of a slower life history strategy because they are hypothesized to yield high returns in the future, which is adaptive in more safe, stable environments in which children can be expected to live long enough to experience these future payoffs (e.g., Sng et al., Reference Sng, Neuberg, Varnum and Kenrick2017). The converse assumption is that deprivation constrains longer-term allocation of resources toward development of a slow life history strategy. This is conceptually distinct from specifically regulating development toward a faster life history strategy, as generally occurs in response to childhood experiences of threat (and unpredictability).
In total, significant experiences of deprivation – particularly forms of deprivation that historically increased morbidity–mortality from insufficient environmental inputs – may (a) constrain physical development in ways that induce slower life history traits in some domains (e.g., slower growth, later sexual maturation) while (b) constraining cognitive and neural development in ways that diminish investment in slower life history traits in other domains (e.g., reduced development of abstract information-processing capacities, lower acquisition of skills and knowledge). This occurs against an evolutionary backdrop in which natural selection shapes developmental systems to respond to extrinsic morbidity–mortality cues by accelerating life history strategies (within ecological constraints). Thus, experiences of deprivation, which constrain investments in embodied capital, may still shift individuals toward accelerated onset of sexual activity or reproduction (e.g., Gettler et al., Reference Gettler, McDade, Bragg, Feranil and Kuzawa2015; Thomas et al., Reference Thomas, Colich, McLaughlin and Sumner2021; Wells et al., Reference Wells, Cole, Cortina-Borja, Sear, Leon, Marphatia, Murray, Wehrmeister, Oliveira, Gonçalves, Oliveira and Menezes2019). Such complex developmental adaptations – reflecting a combination of slower and faster traits – may be the outcome of deprivation-mediated resource-allocation tradeoffs that make the best of bad circumstances.
Understanding the developmental effects of morbidity–mortality cues from threat, deprivation, and related ecological factors is complicated by a developmental landscape characterized by frequent co-occurrence of different forms of adversity. Across small-scale (non-state) human societies, for example, the frequency of warfare (a cue to morbidity–mortality risk from physical harm from other agents) is moderately to strongly associated with various indicators of resource scarcity (a cue to morbidity–mortality risk from insufficient environmental inputs), such as chronic food shortages, famine, and destruction of food supplies by pestilence or weather (Ember & Ember, Reference Ember and Ember1992). Likewise, in population-based studies in Western societies, measures of threat and deprivation are moderately correlated (Dong et al., Reference Dong, Anda, Felitti, Dube, Williamson, Thompson, Loo and Giles2004; Green et al., Reference Green, McLaughlin, Berglund, Gruber, Sampson, Zaslavsky and Kessler2010; McLaughlin, Reference McLaughlin, Harkness and Hayden2020). Such co-occurrence means that both sources of harshness depicted in Figure 1 are commonly experienced at the same time. This can push experience-driven plasticity in diverging directions that sometimes oppose each other (discussed above, Integrative Discussion of Harshness and Deprivation). These countervailing effects of different, but co-occurring, sources of harshness pose challenges for predictive models, as multiple patterns of data can potentially be accommodated. Such complexity highlights the need for formal mathematical models that explore how different forms of harshness interact – when they operate in concert and when they operate in opposition – to regulate specific facets of life history strategy.
The harshness-unpredictability model has focused on interactions between harshness and unpredictability; the current integrated model underscores the importance of studying interactions between threat and deprivation as well.
Refining and extending the threat-deprivation model
The current integrated model of dimensions of environmental experience also suggests meaningful extensions to the threat-deprivation model. In the threat-deprivation model, children use information in the proximal environment in which they are developing (depicted as proximal cues in Figure 1) as a basis for concurrently adapting to that environment (Level 1, Table 1). Based on an evolutionary-developmental analysis, the integrated model extends the scope of the threat-deprivation model in two ways. Both extensions assume that immediate rearing environments – the here and now – are not the only environments to which individuals have had to adapt in order to survive and reproduce.
The first extension concerns the scope of environmental cues (what is detected and encoded in the environment). As shown in Figure 1, the integrated model focuses on cues – proximal and distal – to morbidity–mortality risk from both threat-based and deprivation-based forms of harshness. Proximal cues in Figure 1 reflect opportunities and constraints in one’s immediate rearing environment. However, the proximal environments in which children are developing may be quite variable or idiosyncratic in relation to the broader ecological contexts (current and future) into which children will mature. Thus, developmental systems should have evolved to detect and respond – directly or indirectly – to more distal cues (e.g., premature disability or death of close community members), and ultimately to use both proximal and distal information as a basis for regulating development. As documented in cross-cultural research on determinants of parenting (Eltanamly et al., Reference Eltanamly, Leijten, Jak and Overbeek2021; Mesman et al., Reference Mesman, van IJzendoorn, Behrens, Carbonell, Cárcamo, Cohen-Paraira, de la Harpe, Ekmekçi, Emmen, Heidar, Kondo-Ikemura, Mels, Mooya, Murtisari, Nóblega, Ortiz, Sagi-Schwartz, Sichimba, Soares, Steele and Zreik2016; Quinlan, Reference Quinlan2007), ecological factors related to threat and deprivation (e.g., warfare, resource scarcity, food shortages, pathogen stress) are transmitted to young children through the behavior and physiology of caregivers. As children mature, however, their developmental systems should increasingly track distal information (beyond the family) directly; indeed, children may even become resistant to parental cues and information in adolescence (e.g., Ellis et al., Reference Ellis, Del Giudice, Dishion, Figueredo, Gray, Griskevicius, Hawley, Jacobs, James, Volk and Wilson2012).
Building on this idea, the second extension concerns phenotypic responses to adversity. In the integrated model, cues do double duty in regulating developmental adaptations: children use them to adapt their phenotypes to both immediate contexts and broader ecological contexts that are more spatially or temporally distal (Level 2 and Level 3, Table 1). In other words, the integrated model stipulates that developmental adaptations to adversity are outward-looking and forward-looking in relation to immediate rearing environments.
One step toward examining these proposed extensions of the threat-deprivation model would be to test for developmental mechanisms that mediate responses to distal cues. Determining whether distal cues such as high rates of neighborhood violence influence neural and affective responses to potential threats in ways similar to (or different from) direct experiences of violence is one example of how the current focus of the threat-deprivation model could be meaningfully extended (Table 2). This would enable more systematic evaluation of adaptation-based hypotheses – particularly regarding developmental adaptations to more spatially or temporally distal environments – which the threat-deprivation model has not previously addressed. A working assumption of the model is that heightened neural plasticity early in life alters development in threatening environments in ways that promote safety in the short term but increase vulnerability for psychopathology and other negative outcomes in the long term. Although that is one part of the story, another part is that childhood adaptations to adversity may eventuate in long-term adaptive changes in biobehavioral systems – adaptive individual differences such as variation in life history strategies – that regulate development over the life course (even if these changes translate into making the best of bad circumstances, as likely occurs in relation to deprivation). The current integrated model underscores the need to examine adaptive and maladaptive processes together – and to model them in relation to each other – to more fully understand adversity-mediated variations in development.
Finally, the threat-deprivation framework could be extended to evaluate whether the neurodevelopmental consequences of threat and deprivation vary as a function of the predictability of these experiences. An underlying question is whether developmental systems evolved to respond to these two forms of variation in similar or different ways. Methods in developmental cognitive neuroscience for assessing environmental unpredictability largely align with a statistical learning perspective, either experimentally manipulating environmental unpredictability or using repeated measures designs to calculate variation or probabilities of change over time in observed environmental parameters (e.g., maternal sensory signals). Although these approaches allow for precise calculation of unpredictability in terms of environmental statistics, future research should also consider the neurodevelopmental consequences of ancestral cues to environmental unpredictability (e.g., cues that recurrently indicated unpredictability in threat or deprivation, or change points in the statistical properties of these adversity dimensions, in ancestral environments). Research conducted within the harshness-unpredictability framework provides examples of this approach (see Young et al., Reference Young, Frankenhuis and Ellis2020).
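The two statistical-learning-style metrics described above – variation in, and probability of change over time in, an observed environmental parameter – can be sketched as follows. This is an illustrative computation on invented data (e.g., a binary coding of maternal sensory signals at successive observations), not an analysis from the cited studies.

```python
# Illustrative sketch: two simple unpredictability metrics computed over a
# repeated-measures series of an observed environmental parameter.
# The data below are invented for demonstration.

from statistics import pvariance

def change_probability(series):
    """Proportion of adjacent time points at which the observed state
    changes (a simple 'probability of change over time')."""
    changes = sum(a != b for a, b in zip(series, series[1:]))
    return changes / (len(series) - 1)

# Hypothetical binary coding of a caregiver signal at 10 observations:
signals = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]

print(pvariance(signals))           # variation in the parameter
print(change_probability(signals))  # probability of change between observations
```

Metrics of this kind quantify unpredictability in terms of environmental statistics; the open question raised above is how such statistics relate to ancestral cues that recurrently indicated unpredictability in threat or deprivation.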
Conclusion
The harshness-unpredictability and threat-deprivation frameworks were originally developed – and tested – in relative isolation from each other. It was unclear how the assumptions and implications of each model converged with or diverged from those of the other. Here we have attempted to break down these silos, compare the two frameworks, and build bridges between them by articulating an integrated model of dimensions of environmental experience. The proposed model – integrating threat-based forms of harshness, deprivation-based forms of harshness, and environmental unpredictability – seeks to advance an understanding of why and how these core dimensions of adversity distinctly regulate development. Toward this end, we have highlighted actionable directions for future research (Table 2). The major limitation of the model is underscored by the word future. We have attempted to set a foundation, but the work needed to systematically test the integrated model remains to be done. There are still more questions than answers.
Although our approach attempts to move beyond cumulative-risk thinking to address the why, how, and which of development in contexts of adversity, we trust that the day will come when scholars regard our presumptive advances as something that needs to be moved beyond as well. After all, this is how science works. We stand on the shoulders of other scholars and look forward to the day when our shoulders stimulate and support the insights of others.
Acknowledgements
We thank Marco Del Giudice, Lisa Diamond, Willem Frankenhuis, Steven Holochwost, Kate Humphreys, Greg Miller, Jeff Simpson, and Ethan Young for their insightful comments on earlier drafts of this manuscript. We would also like to thank Beyond Bounds Creative for their help in creating the figure.
Funding statement
Development of this paper was supported by the Consortium for Families and Health Research at the University of Utah and the National Institute of Mental Health (R01-MH103291, R01-MH106482, and R37-MH119194 to KM; R01-MH115004, R01-MH120314, and R24-AG065174 to MS).
Conflicts of interest
None.