People who grow up in poverty may respond to reminders of death by taking more financial risks—but this connection may be much weaker than previously thought. A new study published in the Journal of Experimental Psychology: Applied attempted to replicate an influential 2011 study that tied childhood poverty to riskier and more impulsive financial choices when individuals are reminded of death. The new research found only limited support for those claims, with much smaller effects than originally reported and no evidence of an effect on impulsive choices.
The study was conducted by Joe Gladstone and his colleagues at the Leeds School of Business at the University of Colorado Boulder. Their goal was to test whether early-life economic conditions shape the way people make financial decisions when confronted with thoughts of their own mortality.
The original research by Griskevicius and colleagues had suggested that people from lower-income childhood environments might adopt a faster, more risk-prone strategy for navigating an uncertain and dangerous world. This prediction draws on life history theory, a framework from evolutionary biology.
According to this theory, organisms adjust their survival and reproductive strategies based on the availability of resources and stability in their environments. In humans, it has been proposed that those who grow up in poverty may become more oriented toward short-term rewards and higher risk, especially in situations that trigger thoughts of danger or death.
Gladstone and his team set out to test this idea using a much larger and more diverse sample than the original. Whereas the 2011 study included 71 university students, the new replication included more than 1,000 adults from across the United States, recruited online. The average age of participants was about 40 years, and the sample included a wide range of income levels and educational backgrounds.
Participants were randomly assigned to one of two conditions. One group read a news-style article that emphasized threats to life and safety, such as violence and death, to subtly prompt thoughts about mortality. The other group read a neutral article about someone losing their keys. Afterward, participants completed two decision-making tasks. One asked them to choose between a guaranteed sum of money or a gamble for a larger amount, designed to measure financial risk-taking. The other task measured time preference, or how strongly people prefer smaller immediate rewards over larger delayed rewards.
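The logic of the two tasks can be illustrated with a minimal sketch. The dollar amounts, probabilities, and discount rates below are invented for illustration and are not taken from the study:

```python
# Hypothetical illustration of the two decision tasks.
# All amounts and rates are invented, not taken from the study.

def expected_value(p, amount):
    """Expected value of a gamble paying `amount` with probability `p`."""
    return p * amount

# Risk-taking item: a sure $50 versus a 50% chance of $120.
sure = 50
gamble_ev = expected_value(0.5, 120)  # 60.0
# A risk-neutral chooser takes the gamble, since its expected value is higher.
print(gamble_ev > sure)  # True

# Time-preference item: $100 today versus $150 in one year, compared via
# exponential discounting at a personal discount rate.
def present_value(amount, rate, years):
    """Discounted present value of a delayed reward."""
    return amount / (1 + rate) ** years

patient = present_value(150, 0.10, 1)    # ~136.4: the delayed reward still wins
impatient = present_value(150, 0.60, 1)  # ~93.8: the immediate $100 wins
print(patient > 100, impatient > 100)    # True False
```

A strong preference for the sure amount and the immediate reward would register as lower risk-taking and steeper temporal discounting, respectively.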
The researchers also asked participants to reflect on their socioeconomic background during childhood and in adulthood. This was measured through questions about whether their family had enough money growing up, how wealthy their neighborhood felt, and how they perceived their financial standing compared to peers. A similar set of questions assessed their current financial stability.
When the researchers analyzed the data, they did find a statistically significant interaction between mortality cues and childhood poverty on financial risk-taking. People who reported growing up in lower-income environments were slightly more likely to take financial risks after being exposed to thoughts about death.
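An interaction of this kind is typically tested by regressing the outcome on the experimental condition, the background measure, and their product. The sketch below uses simulated data with invented coefficients and a plain least-squares fit; it illustrates the general technique, not the authors' actual analysis:

```python
import numpy as np

# Minimal sketch of a condition-by-background moderation analysis on
# simulated data. Sample size, coefficients, and effect sizes are
# invented for illustration, not taken from the study.

rng = np.random.default_rng(0)
n = 1000
mortality = rng.integers(0, 2, n)    # 0 = neutral article, 1 = mortality prime
child_income = rng.normal(0, 1, n)   # standardized childhood SES

# Simulate a small negative interaction: lower childhood income combined
# with the mortality prime slightly raises risky choices (0-7 scale).
risk = (3.5 + 0.1 * mortality - 0.1 * child_income
        - 0.2 * mortality * child_income + rng.normal(0, 1, n))

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), mortality, child_income,
                     mortality * child_income])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
print(beta)  # the last coefficient estimates the interaction
```

A reliably negative interaction coefficient would correspond to the reported pattern: the mortality prime nudges risk-taking upward more for people from lower-income childhoods.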
However, the size of this effect was extremely small—so small that the researchers questioned its practical significance. On a seven-question task measuring risk, the difference between low-income and high-income childhood backgrounds under threat amounted to less than one additional risky choice.
In contrast to the original study, the new research found no evidence that mortality salience influenced time preference. People did not show a greater desire for immediate rewards based on their childhood income background, even when reminded of their own mortality. This calls into question one of the original study’s key findings.
To explore whether age might explain the difference in results, the researchers conducted additional analyses. They found that among younger participants—those closer in age to the university students in the original study—the predicted pattern for time preference was somewhat more apparent.
Younger people from lower-income backgrounds showed a slight tendency to prefer immediate rewards when primed with thoughts of death. This effect was not present in older participants. The researchers suggest that age-related differences in decision-making might help explain why the original findings failed to replicate in their broader sample.
The study also included a number of improvements over the original. It was preregistered, meaning the researchers specified their hypotheses and methods before collecting data. They used attention checks to make sure participants engaged with the reading materials, and all data and analysis code were made publicly available. These steps were taken to improve transparency and reduce the risk of bias.
Despite partially replicating the original findings, the authors caution against over-interpreting their results. They note that the effect sizes observed in their study are too small to have much impact on real-world behavior. For example, the observed differences in risk-taking were unlikely to translate into noticeable changes in financial habits like investing or borrowing.
The researchers also raise questions about the broader application of life history theory to individual human behavior. While the theory may help explain broad patterns across species or large populations, it may not be as useful for predicting how individuals will respond to specific life events. The authors suggest that other factors—such as current financial status, psychological traits, or cultural context—might play a larger role in shaping financial decision making.
“Overall, we interpret our results as challenging the practical applicability of [life history theory] to individual differences in risk preferences and temporal discounting under mortality salience,” they concluded.
The study, “Childhood Poverty and Its Impact on Financial Decision Making Under Threat: A Preregistered Replication of Griskevicius et al. (2011b),” was authored by Joe J. Gladstone, Meredith Lehman, and Mallory Decker.