PsyPost

How eye contact shapes the believability of computer-generated faces

by Karina Petrova
April 24, 2026

Where a computer-generated character directs its gaze can shape whether its facial expressions strike human observers as genuine emotional responses. Direct eye contact makes simulated smiles and angry glares look more authentic, while a downward gaze makes a digital face expressing sadness seem more real. These findings were published recently in Cognition and Emotion.

Digital characters frequently appear in online therapy programs, video games, customer service applications, and virtual companionship software. To succeed in these roles, virtual humans must build a sense of rapport with the users interacting with them. Doing so requires the digital characters to display emotional states that human users interpret as authentic.

Because virtual figures do not possess actual feelings, they rely entirely on visual cues to simulate a genuine state of mind. Previous research has explored how physical features shape the way people interpret an emotional display. To determine if a smile is a true reflection of happiness, a person will often look for the crinkling of the skin around the eyes.

Observers commonly interpret these eye wrinkles as a sign of true joy, even though humans are perfectly capable of flexing those muscles when they are not happy. Because visual cues heavily influence human perception, researchers wanted to know if the direction of an avatar’s eyes could also dictate how authentic an emotion appears to a viewer.

A framework in psychology known as the shared signal hypothesis proposes a link between the direction of the eyes and the social intention behind an emotion. Emotions that invite interaction or signal confrontation, like happiness and anger, represent an intention to approach. The theory suggests these approach emotions pair best with direct eye contact.

Conversely, emotions that indicate social withdrawal or a desire to escape, like sadness and fear, represent an intention to avoid. The shared signal hypothesis assumes these avoidance emotions should look most natural when the eyes gaze away from the observer.

Julia C. Haile, a researcher at the University of Western Australia, led a team to test these assumptions using digital human models. The researchers focused entirely on computer-generated faces rather than photographs of actual people. Using digital models allowed the team to completely separate the perception of an emotion from the actual feeling of an emotion, because software cannot experience feelings.

It also gave the researchers the ability to control and adjust eye positions in exact, regular increments without the natural physical variations that occur when real human beings try to hold a pose. To start the project, the research team generated ten highly realistic virtual adults using professional animation software. This underlying technology is widely used to create polished, realistic characters in modern blockbuster video games and animated movies.


Human experts adjusted digital muscle sliders corresponding to different parts of the human face. Because this software maps the digital faces to actual human muscular structures, the experts could manipulate the avatars by targeting specific facial muscle groups. Rather than setting every slider to an arbitrary uniform level, the designers tweaked the tension in the digital cheeks, eyebrows, and jawlines until the models closely mimicked reference photos of real human emotional expressions.

They modified these areas to create digital faces expressing anger, fear, happiness, and sadness. The team presented a large batch of these generated faces to a group of participants who rated how well the features communicated the target emotion. The researchers selected a final set of digital humans that conveyed the intended emotion unmistakably.

They intentionally avoided picking expressions that looked uniformly perfect, leaving room in their data for the perceived authenticity to rise or fall depending on the eye position. In the first main experiment, Haile and her colleagues recruited 150 adults to view the faces on a computer screen. The participants rated how believable each expression appeared, using a numerical scale.

The researchers altered the eyes of the angry and fearful avatars to either look straight ahead or gaze sideways at five increasingly wide angles. For the happy and sad avatars, the eyes either met the viewer directly or shifted downward in varying increments. Before the rating tasks began, participants received explicit instructions on how to evaluate the avatars.

The researchers asked them to differentiate between the sheer strength of an emotion and its authenticity. For example, a subtle frown might be entirely authentic, while a wildly exaggerated crying face might look completely posed or fake. Participants were asked to base their scores solely on the perception of a genuine internal state, regardless of whether the expression was mild or extreme.

During the rating process, the researchers took steps to simulate the physical experience of making eye contact. To standardize the viewing experience, participants rested their heads on a chin support to keep their eyes perfectly level with the digital faces. Before each face appeared, a cross mark flashed on the screen directly between where the avatar’s eyes would be.

This ensured the participant’s direct line of sight aligned exactly with the virtual human, creating a realistic simulation of mutual eye contact before the avatar’s gaze shifted to the side or downwards. Observers also rated the intensity of the expressions on a separate scale. A stronger, more vibrant expression tends to seem more authentic to an observer.

The researchers used statistical models to separate the trait of intensity from the trait of believability in their analysis. Doing so helped the researchers isolate the specific, independent effect of eye direction. For angry and happy avatars, the expressions looked the most authentic when the digital character maintained direct eye contact with the viewer.
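The paper's exact statistical models are not reproduced here, but the general logic of isolating the effect of eye direction from the effect of expression intensity can be sketched with an ordinary linear regression on simulated ratings. All numbers below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical illustration (not the authors' actual analysis): include rated
# intensity as a covariate so the gaze-angle coefficient reflects the effect
# of eye direction on believability, adjusted for how strong the expression is.
rng = np.random.default_rng(0)
n = 300
gaze_angle = rng.uniform(0, 30, n)   # simulated degrees of downward gaze
intensity = rng.uniform(1, 7, n)     # simulated rated expression strength

# Simulated "sad avatar" believability: rises with downward gaze and with
# intensity, plus random noise.
believability = 2.0 + 0.08 * gaze_angle + 0.5 * intensity + rng.normal(0, 0.3, n)

# Design matrix: intercept, gaze angle, and intensity columns.
X = np.column_stack([np.ones(n), gaze_angle, intensity])
coef, *_ = np.linalg.lstsq(X, believability, rcond=None)
intercept, gaze_effect, intensity_effect = coef

print(f"gaze effect per degree, adjusted for intensity: {gaze_effect:.3f}")
```

Because intensity sits in the model alongside gaze angle, the recovered gaze coefficient captures the independent contribution of eye direction rather than a mixture of the two traits.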

When the avatars diverted their eyes away from the center of the screen, the illusion of a genuine feeling faded. The happy faces became consistently less believable with every step the eyes took downward. The emotion of sadness behaved quite differently.

The sad digital faces grew more believable as the avatar looked further downward. The highest ratings of authenticity occurred at the sharpest downward angles. Fear, however, did not follow the expected psychological pattern.

Altering the gaze of the fearful faces to the side produced no statistically significant changes in how real the fear appeared to observers. The viewing angles had virtually no impact on the ratings. The team then conducted a second experiment to see if the specific direction of the averted gaze mattered for the emotion of sadness.

They wanted to know if any diverted gaze worked, or if pointing the eyes down was uniquely suited to sadness. The researchers recruited a new group of 64 participants to evaluate the sad digital characters. This time, the avatars either looked straight ahead, cast their eyes downward, or looked sideways.

The results showed that the specific direction of the gaze matters for how sadness is perceived. Just as in the first test, sad expressions became increasingly believable when the avatar looked down. When the avatar looked sideways, the opposite happened, and the sadness appeared less authentic.

This implies that humans read specific, highly tuned social messages from different types of eye movements, rather than treating all diverted gazes as a generic signal of avoidance. The research team noted a few limitations regarding their methodology. The study utilized static images of forward-facing avatars, which eliminated the realistic motion of a shifting head.

In everyday interactions, human beings frequently rotate their heads to match the movement of their eyes. Static images also lack the continuous timing elements of a naturally unfolding facial expression. Introducing dynamic video might alter how observers interpret a fleeting glance.

Additionally, the researchers generated virtual characters designed to match White European physical characteristics. They also restricted their participant pool to individuals who grew up in majority White European countries. This design choice prevented unfamiliarity with other facial appearances from skewing the ratings, but it limits how far the findings generalize to a global population.

Future research will need to test a wider diversity of digital faces to see if these patterns hold across different cultures. Researchers could also measure implicit physiological responses, like heart rate or pupil dilation. Such automatic bodily measures might capture subtle reactions to emotions like fear that explicit conscious ratings did not catch.

The study, “Eye believe you: gaze direction affects the perceived believability of facial expressions displayed by computer-generated people,” was authored by Julia C. Haile, Romina Palermo, Amy Dawel, Eva G. Krumhuber, Clare Sutherland, and Jason Bell.


(c) PsyPost Media Inc
