Our biological age — the wear-and-tear associated with the cellular aging process — may be a more significant factor in determining our cognitive abilities than our chronological age, according to new research published in the Journal of Gerontology: Biological Sciences. The findings open up new avenues for research and could have profound implications for how we approach aging and cognitive health in the future.
For years, scientists have been intrigued by the fact that people of the same age can show vastly different levels of cognitive performance. While it’s well-known that our cognitive abilities, such as memory and processing speed, generally decline as we get older, the rate and extent of this decline can vary significantly from person to person. This discrepancy has led researchers to wonder if factors other than chronological age—how long we have been alive—might be at play.
In recent years, attention has turned to the concept of “biological age” — a measure that reflects an individual’s physiological condition rather than the number of years they have lived, often estimated from “epigenetic” markers in DNA.
Scientists have started to explore whether biological age might be a better predictor of cognitive performance than chronological age. This interest is largely due to advances in understanding DNA methylation—a process where certain chemicals are added to our DNA and can influence how genes are expressed. Researchers have developed ‘epigenetic clocks’ based on DNA methylation patterns that can estimate an individual’s biological age, and these clocks are believed to be potentially more accurate indicators of age-related cognitive decline.
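At their core, epigenetic clocks are typically penalized linear models: a weighted sum of methylation levels at a selected panel of CpG sites, plus an intercept. The sketch below illustrates that structure only; the site names, weights, and intercept are invented for the example and are not the coefficients of any published clock.

```python
# Illustrative sketch of how an epigenetic clock estimates age.
# Methylation "beta values" are the fraction of DNA copies methylated
# at a given CpG site (a value between 0 and 1).

def predict_epigenetic_age(beta_values, weights, intercept):
    """Estimate biological age as a weighted sum of CpG methylation levels.

    beta_values: dict mapping CpG site ID -> methylation fraction (0..1)
    weights:     dict mapping the same CpG IDs -> fitted coefficients
    intercept:   model intercept (in years)
    """
    return intercept + sum(weights[cpg] * beta_values[cpg] for cpg in weights)

# Toy example with made-up CpG sites and coefficients
weights = {"cg0001": 12.0, "cg0002": -8.5, "cg0003": 20.0}
sample = {"cg0001": 0.80, "cg0002": 0.30, "cg0003": 0.55}
age_estimate = predict_epigenetic_age(sample, weights, intercept=30.0)
print(round(age_estimate, 2))  # prints 48.05
```

Real clocks such as Horvath’s use hundreds of CpG sites selected by regularized regression, and some apply a nonlinear transform for younger ages, but the weighted-sum logic is the same.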
“Prior research has established that when we test people’s cognitive performance in the lab in terms of how quickly they can process information and how well they can remember it, older people perform more slowly and remember less than younger people,” said study author Stacey B. Scott, an associate professor and director of the Stress, Emotions, & Health Lab at Stony Brook University.
“It’s important to note, though, that this chronological age explanation does not help us understand why two people who are the same chronological age (i.e., two 60 year olds) may show very different performance. Biological changes that occur across the lifespan have been proposed to underlie both the differences we observe on average between younger and older adults, as well as why two people the same chronological age may differ. Until recently it has not been possible to test this explanation; now, biological aging ‘clocks’ can be estimated from individuals’ DNA.”
The recent study analyzed data from 142 participants, aged between 25 and 65 years, who were part of a larger project called the Effects of Cognitive Aging, Physiology, and Emotion study. These participants were chosen because they had provided blood samples suitable for DNA analysis. The group was diverse, with a mix of genders and ethnicities, reflecting the population of the Bronx, New York, where the study was conducted.
Participants were given smartphones equipped with an application to measure their cognitive performance in real-life settings. Over 14 days, they completed cognitive assessments at random times each day. These assessments tested their processing speed and working memory—two key aspects of cognitive functioning that often decline with age. At the end of the two weeks, participants provided blood samples, which were then used to measure their biological age through DNA methylation analysis.
When it came to tasks that measured processing speed (how quickly someone can understand and react to information), participants who had a higher biological age than their chronological age performed worse on average. Interestingly, this decline in performance was seen in measures of biological age derived from two specific epigenetic clocks (known as Horvath 1 and Horvath 2), but not from others.
For tasks testing working memory (the ability to hold and manipulate information over short periods), a similar pattern emerged, but only with one specific epigenetic clock (GrimAge). Participants with a higher biological age according to this clock showed worse performance.
The researchers also looked at how much individuals’ performance varied from one moment to the next. They found that those with a higher biological age showed more variability in their cognitive performance, indicating potentially less stable cognitive abilities. This finding held across various epigenetic clocks.
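Moment-to-moment inconsistency of this kind is commonly quantified as each person’s intraindividual standard deviation across the repeated assessments. A minimal sketch, using invented response times, shows how two people with similar average speed can differ sharply in stability:

```python
# Hedged sketch: quantifying intraindividual variability as the standard
# deviation of a person's response times across repeated assessments.
# The response times (in milliseconds) below are invented for illustration.
from statistics import mean, stdev

sessions = {
    "participant_a": [612, 598, 605, 620, 601],   # stable performer
    "participant_b": [540, 710, 575, 690, 500],   # more variable performer
}

for pid, rts in sessions.items():
    # Similar means, very different within-person spread
    print(pid, "mean:", round(mean(rts), 1), "iSD:", round(stdev(rts), 1))
```

On these toy numbers, both participants average around 600 ms, but the second shows roughly ten times the within-person spread, which is the kind of difference the study linked to higher biological age.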
“We found that even when we account for the fact that chronologically older people tend to respond more slowly and remember information less well than younger people, being biologically older ‘than you should be’ still differentiated people,” Scott told PsyPost. “We expected this would be the case, but this is a new area of research and this question hasn’t been tested in people’s performance in daily life outside of the lab.”
“We found support for our prediction – people who had greater epigenetic age acceleration (whose biological ages were older than their chronological ages) tended to process information more slowly and have poorer memory performance on average across the study. An important takeaway relates to our expectations about someone’s cognitive function based on their age. Some of the effects we found suggested that the accelerated epigenetic aging effects were as big or bigger than the chronological age differences. This means that for some outcomes, knowing how old someone is may not be as informative as other information about them such as their epigenetic age.”
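Epigenetic age acceleration, as described in the quote above, is commonly computed as the residual from regressing epigenetic age on chronological age: a positive residual means someone is biologically “older than they should be.” A minimal sketch with invented ages:

```python
# Hedged sketch: epigenetic age acceleration as the residual from an
# ordinary least-squares regression of epigenetic age on chronological age.
# The ages below are invented for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def age_acceleration(chrono_ages, epi_ages):
    """Residual epigenetic age after removing the chronological-age trend."""
    slope, intercept = linear_fit(chrono_ages, epi_ages)
    return [e - (slope * c + intercept) for c, e in zip(chrono_ages, epi_ages)]

chrono = [30, 40, 50, 60]
epi = [28, 43, 49, 66]
print([round(r, 2) for r in age_acceleration(chrono, epi)])
# prints [-0.5, 2.5, -3.5, 1.5]: the second and fourth people are
# epigenetically "older" than their chronological age predicts
```

Defining acceleration as a residual, rather than a simple difference, removes any systematic offset or scaling between a clock’s output and calendar age, so the measure isolates how unusual a person is for their age.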
While biological age showed a clear link to cognitive performance, chronological age was also a significant predictor, but in a different way. Older participants tended to show less variability in their cognitive performance, suggesting that as people get older, their cognitive abilities might become more consistent, even if not necessarily sharper.
“Interestingly, although we found a consistent pattern for chronological age and epigenetic age acceleration in terms of average performance (older age, poorer average performance), these didn’t operate the same way for variability,” Scott explained. “Older chronological age was associated with less inconsistency (more stability) in performance, whereas epigenetic age acceleration was associated with more inconsistency (less stability). There is much less work on variability/consistency than average levels of performance, but some researchers and theorists have proposed that patterns of greater inconsistency might be an early indicator of dementia or Mild Cognitive Impairment.”
Lead author Daisy V. Zavala, a doctoral candidate at Stony Brook University, summarized: “We found that when someone’s epigenetic age was older than their chronological age, it predicted them being on average slower at matching symbols and worse at recalling the location of the dots. We also found that people whose DNA showed that they were older than their chronological age had wider swings in their performance.”
While the study’s findings are illuminating, it’s important to recognize its limitations. The study focused on a specific group of adults aged 25 to 65 in a particular geographic area. Future research could expand on these findings by including a broader range of ages and locations to see if the patterns hold true across different populations. Another key point for future research is the exploration of how changes in biological age over time might correlate with changes in cognitive performance.
“The current paper examines concurrent associations between individuals’ cognitive performance over two weeks and their epigenetic age acceleration based on blood draws from that same time,” Scott said. “We’re fortunate that in this study, the participants carried the phones and completed the cognitive tasks for two weeks each year. Daisy Zavala’s (the lead author and PhD student here in my lab at Stony Brook) dissertation looks forward – does someone’s epigenetic age acceleration now at the beginning of the study predict their cognitive performance up to 3 years later?”
The current study also has several important strengths.
“Our study differs in several ways from prior research,” Scott told PsyPost. “Most of what we know about cognitive performance is based on how people perform on tests in the laboratory; this is the case for the new research on epigenetic aging and cognitive performance as well. This has been informative, of course, but increasingly researchers are working to understand how people perform in the real-life conditions of their lives at home and work. Additionally, because we studied people in brief tests on smartphones, we have many assessments of their performance.
“This means that our results more likely capture an overall snapshot of their typical performance than if we had to rely on a single day they visited the lab, on which they may have happened to have slept poorly or not felt well. Those ‘bad days’ and ‘good days’ are interesting – they may help to tell us about possible moments-of-risk in which someone may be more likely to make mistakes or work more slowly, or conditions in which they do particularly well – and we get some information about these from our findings that the clocks predict variability as well. We wouldn’t be able to know this if we only had one or a handful of observations of the person’s performance.”
“The sample is also valuable – it wasn’t a convenience sample and didn’t rely on recruiting from college students, which can result in a set of individuals that don’t represent the broader population,” Scott added. “Instead, they were systematically sampled and include people from 25-65 years old – an important span for understanding cognitive development, when the brain has finished maturation but the onset of major diseases has typically not yet occurred.”
The study, “Epigenetic Age Acceleration and Chronological Age: Associations With Cognitive Performance in Daily Life,” was authored by Daisy V. Zavala, Natalie Dzikowski, Shyamalika Gopalan, Karra D. Harrington, Giancarlo Pasquini, Jacqueline Mogle, Kerry Reid, Martin Sliwinski, Jennifer E. Graham-Engeland, Christopher G. Engeland, Kristin Bernard, Krishna Veeramah, and Stacey B. Scott.