
Epigenetic age acceleration linked to poorer memory performance and cognitive functioning

by Eric W. Dolan
November 27, 2023
in Cognitive Science
(Photo credit: OpenAI's DALL·E)

Our biological age — the wear-and-tear associated with the cellular aging process — may be a more significant factor in determining our cognitive abilities than our chronological age, according to new research published in the Journal of Gerontology: Biological Sciences. The findings open up new avenues for research and could have profound implications for how we approach aging and cognitive health in the future.

For years, scientists have been intrigued by the fact that people of the same age can show vastly different levels of cognitive performance. While it’s well-known that our cognitive abilities, such as memory and processing speed, generally decline as we get older, the rate and extent of this decline can vary significantly from person to person. This discrepancy has led researchers to wonder if factors other than chronological age—how long we have been alive—might be at play.

In recent years, attention has turned to the concept of “biological age” (also known as epigenetic aging) — a measure that reflects an individual’s physiological condition rather than the number of years they have lived.

Scientists have started to explore whether biological age might be a better predictor of cognitive performance than chronological age. This interest is largely due to advances in understanding DNA methylation—a process where certain chemicals are added to our DNA and can influence how genes are expressed. Researchers have developed ‘epigenetic clocks’ based on DNA methylation patterns that can estimate an individual’s biological age, and these clocks are believed to be potentially more accurate indicators of age-related cognitive decline.
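In this literature, "epigenetic age acceleration" is commonly quantified as the residual from regressing an individual's clock-estimated epigenetic age on their chronological age. The sketch below illustrates that calculation with made-up numbers (the ages are hypothetical; real clocks such as Horvath or GrimAge derive their estimates from methylation levels at specific CpG sites):

```python
import numpy as np

# Hypothetical data: chronological ages and epigenetic (DNA-methylation-based)
# age estimates for five participants.
chronological = np.array([30.0, 42.0, 51.0, 58.0, 64.0])
epigenetic = np.array([33.5, 40.0, 55.0, 57.0, 70.0])

# Age acceleration is typically defined as the residual from regressing
# epigenetic age on chronological age: positive values mean a person is
# "biologically older" than expected for their chronological age.
slope, intercept = np.polyfit(chronological, epigenetic, 1)
acceleration = epigenetic - (slope * chronological + intercept)

for age, acc in zip(chronological, acceleration):
    print(f"age {age:.0f}: acceleration {acc:+.2f} years")
```

Because the residuals come from an ordinary least-squares fit, they average to zero across the sample by construction; what matters is each individual's deviation from the age-expected trend.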

“Prior research has established that when we test people’s cognitive performance in the lab in terms of how quickly they can process information and how well they can remember it, older people perform more slowly and remember less than younger people,” said study author Stacey B. Scott, an associate professor and director of the Stress, Emotions, & Health Lab at Stony Brook University.

“It’s important to note, though, that this chronological age explanation does not help us understand why two people who are the same chronological age (i.e., two 60 year olds) may show very different performance. Biological changes that occur across the lifespan have been proposed to underlie both the differences we observe on average between younger and older adults, as well as why two people the same chronological age may differ. Until recently it has not been possible to test this explanation; now, biological aging ‘clocks’ can be estimated from individuals’ DNA.”

The recent study analyzed data from 142 participants, aged between 25 and 65 years, who were part of a larger project called the Effects of Cognitive Aging, Physiology, and Emotion study. These participants were chosen because they had provided blood samples suitable for DNA analysis. The group was diverse, with a mix of genders and ethnicities, reflecting the population of the Bronx, New York, where the study was conducted.

Participants were given smartphones equipped with an application to measure their cognitive performance in real-life settings. Over 14 days, they completed cognitive assessments at random times each day. These assessments tested their processing speed and working memory—two key aspects of cognitive functioning that often decline with age. At the end of the two weeks, participants provided blood samples, which were then used to measure their biological age through DNA methylation analysis.

When it came to tasks that measured processing speed (how quickly someone can understand and react to information), participants who had a higher biological age than their chronological age performed worse on average. Interestingly, this decline in performance was seen in measures of biological age derived from two specific epigenetic clocks (known as Horvath 1 and Horvath 2), but not from others.

For tasks testing working memory (the ability to hold and manipulate information over short periods), a similar pattern emerged, but only with one specific epigenetic clock (GrimAge). Participants with a higher biological age according to this clock showed worse performance.

The researchers also looked at how much individuals’ performance varied from one moment to the next. They found that those with a higher biological age showed more variability in their cognitive performance, indicating potentially less stable cognitive abilities. This finding held across various epigenetic clocks.

“We found that even when we account for the fact that chronologically older people tend to respond more slowly and remember information less well than younger people, being biologically older ‘than you should be’ still differentiated people,” Scott told PsyPost. “We expected this would be the case, but this is a new area of research and this question hasn’t been tested in people’s performance in daily life outside of the lab.”

“We found support for our prediction – people who had greater epigenetic age acceleration (whose biological ages were older than their chronological ages) tended to process information more slowly and have poorer memory performance on average across the study. An important takeaway relates to our expectations about someone’s cognitive function based on their age. Some of the effects we found suggested that the accelerated epigenetic aging effects were as big or bigger than the chronological age differences. This means that for some outcomes, knowing how old someone is may not be as informative as other information about them such as their epigenetic age.”

While biological age showed a clear link to cognitive performance, chronological age was also a significant predictor, but in a different way. Older participants tended to show less variability in their cognitive performance, suggesting that as people get older, their cognitive abilities might become more consistent, even if not necessarily sharper.

“Interestingly, although we found a consistent pattern for chronological age and epigenetic age acceleration in terms of average performance (older age, poorer average performance), these didn’t operate the same way for variability,” Scott explained. “Older chronological age was associated with less inconsistency (more stability) in performance, whereas epigenetic age acceleration was associated with more inconsistency (less stability). There is much less work on variability/consistency than average levels of performance, but some researchers and theorists have proposed that patterns of greater inconsistency might be an early indicator of dementia or Mild Cognitive Impairment.”

Lead author Daisy V. Zavala, a doctoral candidate at Stony Brook University, summarized: “We found that when someone’s epigenetic age was older than their chronological age, it predicted them being on average slower at matching symbols and worse at recalling the location of the dots. We also found that people whose DNA showed that they were older than their chronological age had wider swings in their performance.”

While the study’s findings are illuminating, it’s important to recognize its limitations. The study focused on a specific group of middle-aged adults in a particular geographic area. Future research could expand on these findings by including a broader range of ages and locations to see if the patterns hold true across different populations. Another key point for future research is the exploration of how changes in biological age over time might correlate with changes in cognitive performance.

“The current paper examines concurrent associations between individuals’ cognitive performance over two weeks and their epigenetic age acceleration based on blood draws from that same time,” Scott said. “We’re fortunate that in this study, the participants carried the phones and completed the cognitive tasks for two weeks each year. Daisy Zavala’s (the lead author and PhD student here in my lab at Stony Brook) dissertation looks forward – does someone’s epigenetic age acceleration now at the beginning of the study predict their cognitive performance up to 3 years later?”

The current study also has several important strengths.

“Our study differs in several ways from prior research,” Scott told PsyPost. “Most of what we know about cognitive performance is based on how people perform on tests in the laboratory; this is the case for the new research on epigenetic aging and cognitive performance as well. This has been informative, of course, but increasingly researchers are working to understand how people perform in the real-life conditions of their lives at home and work. Additionally, because we studied people in brief tests on smartphones, we have many assessments of their performance.

“This means that our results more likely capture an overall snapshot of their typical performance than if we had to rely on a single day they visited the lab, on which they may have happened to have slept poorly or not felt well. Those “bad days” and “good days” are interesting – they may help to tell us about possible moments-of-risk in which someone may be more likely to make mistakes or work more slowly, or conditions in which they do particularly well – and we get some information about these from our findings that the clocks predict variability as well. We wouldn’t be able to know this if we only had one or a handful of observations of the person’s performance.”

“The sample is also valuable – it wasn’t a convenience sample and didn’t rely on recruiting from college students, which can result in a set of individuals that don’t represent the broader population,” Scott added. “Instead, they were systematically sampled and include people from 25-65 years old – an important span for understanding cognitive development, when the brain has finished maturation but the onset of major diseases has typically not yet occurred.”

The study, “Epigenetic Age Acceleration and Chronological Age: Associations With Cognitive Performance in Daily Life”, was authored by Daisy V. Zavala, Natalie Dzikowski, Shyamalika Gopalan, Karra D. Harrington, Giancarlo Pasquini, Jacqueline Mogle, Kerry Reid, Martin Sliwinski, Jennifer E. Graham-Engeland, Christopher G. Engeland, Kristin Bernard, Krishna Veeramah, and Stacey B. Scott.
