New research suggests that individuals experiencing mild depressive symptoms, which may not yet meet the threshold for a clinical diagnosis, exhibit subtle but distinct changes in their facial expressions. An artificial intelligence system was able to identify these specific facial muscle movements, which were also linked to how the individuals were perceived by others. The study, published in the journal Scientific Reports, offers a potential new avenue for identifying individuals at risk for developing clinical depression before more severe symptoms emerge.
Researchers have long known that clinical depression can alter a person’s facial expressivity, often leading to a reduction in positive expressions like smiling. However, the period before a full diagnosis, known as subthreshold depression, has been less understood. Subthreshold depression involves experiencing mild depressive symptoms that are not severe enough to be formally diagnosed, but it represents a significant risk factor for the future development of major depression. Scientists at Waseda University in Japan wanted to investigate if this milder state was also associated with detectable changes in facial expressions.
A key goal was to see if these subtle cues could be noticed by other people and if they could be objectively measured using modern technology. The researchers also considered cultural context, as studies have shown that individuals in East Asian cultures may have different baseline levels of facial expressivity compared to Western populations, which could influence how depression manifests visually.
To explore this, the research team designed a two-part experiment involving undergraduate students in Japan. The first part focused on creating the visual material and the second on how it was perceived. First, they recruited a group of 64 students, referred to as the “evaluated participants” or “ratees.” These students filled out the Beck Depression Inventory-II, a standard questionnaire used to measure the severity of depressive symptoms.
Based on their scores, the students were categorized into two groups: a healthy group with minimal to no depressive symptoms, and a subthreshold depression group with mild symptoms. Individuals with moderate or severe scores were not included in the analysis. Each of these students then recorded a short, 10-second video of themselves giving a self-introduction. The recordings were standardized, with each participant wearing a white t-shirt against a neutral background and looking directly at the camera, to ensure consistency.
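As an illustration of how this kind of grouping could be implemented, the sketch below assigns participants to groups using the standard published BDI-II severity ranges (0-13 minimal, 14-19 mild, 20-28 moderate, 29-63 severe). The exact cutoffs used in the study are not stated in this summary, so the thresholds and participant IDs here are assumptions.

```python
def bdi_group(score: int) -> str | None:
    """Assign a participant to a study group from a BDI-II total score.

    Uses the standard published BDI-II severity ranges; the study's exact
    cutoffs are an assumption here. Returns None for excluded scores.
    """
    if 0 <= score <= 13:
        return "healthy"        # minimal or no depressive symptoms
    if 14 <= score <= 19:
        return "subthreshold"   # mild symptoms, below the clinical threshold
    if 20 <= score <= 63:
        return None             # moderate or severe: excluded from the analysis
    raise ValueError(f"BDI-II scores range from 0 to 63, got {score}")


# Example with hypothetical participant IDs and scores: keep only the two analyzed groups.
scores = {"P01": 4, "P02": 16, "P03": 27}
groups = {pid: g for pid, s in scores.items() if (g := bdi_group(s)) is not None}
print(groups)  # {'P01': 'healthy', 'P02': 'subthreshold'}
```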
In the second part of the study, a separate group of 63 students, known as the “evaluators” or “raters,” was brought in to watch these videos. This group also completed the Beck Depression Inventory-II to assess their own depressive tendencies. The evaluators watched the silent 10-second video clips of each of the 64 evaluated participants. After each clip, they rated the person in the video on several characteristics using a five-point scale. These included positive traits such as how expressive, natural, friendly, and likeable the person appeared. They also rated negative traits, including how stiff, nervous, or fake the person seemed.
The researchers wanted to answer two main questions with this setup. First, would people with subthreshold depression be perceived differently than their healthy peers? Second, would the evaluators’ own depressive symptoms influence how they judged others?
The evaluated participants who had subthreshold depression consistently received lower scores on all the positive traits. Observers rated them as significantly less expressive, less natural, less friendly, and less likeable compared to the healthy participants.
However, there was no significant difference in the ratings for negative traits. The students with subthreshold depression were not seen as more stiff, nervous, or fake than the healthy students. This finding suggests that subthreshold depression is associated with a muted or diminished positive expressivity rather than an increase in overtly negative expressions.
Additionally, the researchers found that the evaluators’ own mental state did not bias their judgments. Evaluators with subthreshold depression rated the videos in the same way as healthy evaluators did, indicating that the differences in perception were rooted in the facial expressions of the people in the videos, not the mindset of the observers.
Following the human-led evaluation, the researchers turned to artificial intelligence for a more objective analysis. They used an open-source software called OpenFace 2.0 to examine the same 10-second video clips. This program is designed to analyze facial behavior by tracking tiny, specific muscle movements known as Facial Action Units. Each Action Unit corresponds to the contraction of a particular facial muscle, such as the muscle that raises the inner part of the eyebrow or the one that pulls the corner of the lip. The software analyzed the videos frame by frame, measuring both the presence and the intensity of 18 different Action Units.
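To make this step more concrete, here is a minimal sketch of how per-video Action Unit summaries could be computed from OpenFace 2.0 output. OpenFace's FeatureExtraction tool writes a per-frame CSV with intensity columns such as AU01_r (on a 0-5 scale) and presence columns such as AU01_c (0 or 1); the file path and the choice to average over frames are illustrative assumptions, not the authors' exact processing pipeline.

```python
import pandas as pd


def summarize_action_units(csv_path: str) -> dict:
    """Summarize OpenFace 2.0 output for one 10-second video clip.

    Assumes a CSV produced by OpenFace's FeatureExtraction tool, which
    contains per-frame intensity columns (AUxx_r) and presence columns
    (AUxx_c). Averaging over frames is an illustrative choice.
    """
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()  # OpenFace headers can include leading spaces

    # Keep only frames where the face was tracked successfully.
    if "success" in df.columns:
        df = df[df["success"] == 1]

    intensity_cols = [c for c in df.columns if c.startswith("AU") and c.endswith("_r")]
    presence_cols = [c for c in df.columns if c.startswith("AU") and c.endswith("_c")]

    return {
        "mean_intensity": df[intensity_cols].mean().to_dict(),  # average 0-5 intensity per AU
        "presence_rate": df[presence_cols].mean().to_dict(),    # fraction of frames each AU is active
    }


# Hypothetical usage for one participant's clip:
# summary = summarize_action_units("ratee_01_openface.csv")
# print(summary["mean_intensity"]["AU01_r"])  # AU01, the Inner Brow Raiser
```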
The automated analysis uncovered a distinct pattern of facial muscle activity in the students with subthreshold depression. Compared to the healthy group, these individuals showed a higher presence or intensity of specific movements. These included the “Inner Brow Raiser,” the “Upper Lid Raiser,” and the “Lip Stretcher.” They also exhibited more frequent mouth-opening movements, such as “Lips Part” and “Jaw Drop.”
When the researchers looked for a direct relationship between these muscle movements and depression scores, they found that five of these Action Units showed a significant positive correlation. In other words, the higher a participant’s score on the depression inventory, the more pronounced these specific facial movements were. Many of these detected movements are associated with expressions of tension, discomfort, or even fear, suggesting a subtle, underlying affective state that is visible even in a non-clinical population.
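As a purely illustrative sketch of this kind of analysis, one could correlate each participant's per-video Action Unit summary with their BDI-II total. The study's exact statistical procedure is not described in this summary, so the use of Spearman correlation, the data layout, and the numbers below are assumptions.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-participant table: one row per ratee, with the BDI-II total
# and mean intensities of Action Units highlighted in the article
# (AU01 Inner Brow Raiser, AU05 Upper Lid Raiser, AU20 Lip Stretcher).
data = pd.DataFrame({
    "bdi_ii": [3, 5, 12, 15, 17, 19],
    "AU01_r": [0.2, 0.3, 0.5, 0.9, 1.1, 1.4],
    "AU05_r": [0.1, 0.2, 0.4, 0.6, 0.8, 0.9],
    "AU20_r": [0.0, 0.1, 0.3, 0.5, 0.6, 0.8],
})

# Spearman correlation is one reasonable choice for a small sample;
# the study may have used a different test.
for au in ["AU01_r", "AU05_r", "AU20_r"]:
    rho, p = spearmanr(data["bdi_ii"], data[au])
    print(f"{au}: rho={rho:.2f}, p={p:.3f}")
```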
The researchers point out some limitations of their work. The study relied on a self-report questionnaire to measure depressive symptoms, rather than a formal clinical diagnosis from a psychiatrist. The findings are also specific to a population of Japanese university students and may not apply to individuals with major depression or to people from different cultural backgrounds. Cultural norms can play a significant role in how emotions are expressed and perceived, so more research is needed across diverse populations.
Despite these limitations, the study opens up promising directions for future research and application. The combination of short video analysis and artificial intelligence provides a non-invasive way to detect subtle signs of psychological distress. Eriko Sugimori, an associate professor who led the study, suggested that this approach could be adapted for early mental health screening in various settings. “Our novel approach of short self-introduction videos and automated facial expression analysis can be applied to screen and detect mental health in schools, universities, and workplaces,” Sugimori stated.
Such a tool could be integrated into digital health platforms or corporate wellness programs to help identify individuals who might benefit from early support. “Overall, our study provides a novel, accessible, and non-invasive artificial intelligence-based facial analysis tool for early detection of depression, enabling early interventions and timely care of mental health,” Sugimori concluded. Future studies could employ more advanced machine learning techniques to refine the detection of these facial signatures and explore whether they are universal or culturally specific.
The study, “Subthreshold depression is associated with altered facial expression and impression formation via subjective ratings and action unit analysis,” was authored by Eriko Sugimori and Mayu Yamaguchi.