Artificial intelligence can predict political beliefs from expressionless faces

by Eric W. Dolan
April 17, 2024
in Artificial Intelligence
(Photo credit: Adobe Stock)

Scientists have demonstrated that facial recognition technology can predict a person’s political orientation with a surprising level of accuracy. Their research, published in the journal American Psychologist, shows that even neutral facial expressions can hold clues to someone’s political beliefs. This finding poses significant privacy concerns, especially since facial recognition can operate without an individual’s consent.

Facial recognition technology is a form of artificial intelligence that identifies and verifies individuals by analyzing patterns based on their facial features. At its core, the technology uses algorithms to detect faces in images or video feeds, and then measures various aspects of the face — such as the distance between the eyes, the shape of the jawline, and the contour of the cheekbones.

These measurements are combined into a numerical representation known as a facial signature. This signature can be compared to a database of known faces to find a match or used in various applications ranging from security systems and mobile unlocking to tagging friends on social media platforms.
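
As a rough illustration of the matching step (not the study's actual pipeline), identifying a face amounts to finding the stored signature most similar to a probe signature. A minimal sketch, using cosine similarity and made-up 256-dimensional vectors:

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine similarity between two facial-signature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

random.seed(0)
probe = [random.gauss(0, 1) for _ in range(256)]
gallery = {
    "alice": [random.gauss(0, 1) for _ in range(256)],  # unrelated face
    "bob": [x + random.gauss(0, 0.1) for x in probe],   # same face, slight noise
}

# Matching: pick the database entry with the highest similarity to the probe
best_match = max(gallery, key=lambda name: cosine_similarity(probe, gallery[name]))
```

Real systems use learned embeddings rather than raw geometric measurements, but the nearest-neighbor comparison works the same way.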

With the growing use of facial recognition technologies in both public and private sectors, there’s an increased possibility that these tools could be used for purposes beyond simple identification, such as predicting personal attributes like political orientation.

“Growing up behind the iron curtain made me acutely aware of the risks of surveillance and the elites choosing to overlook inconvenient facts for financial or ideological reasons,” explained lead author Michal Kosinski, an associate professor of organizational behavior at Stanford University’s Graduate School of Business.

“Thus, in my work, I am focused on auditing new technologies and exposing their privacy risks. In the past, we showed that data that Facebook sold (or exchanged for content) exposed users’ political views, sexual orientation, personality, and other intimate traits. We showed the worrying potential of the personality targeting approach used by Facebook, Cambridge Analytica, and others.

“We exposed how Facebook used a trick to continue selling their users’ intimate data. We showed that facial recognition technologies, widely used by companies and governments, can detect political views and sexual orientation from social-media profile pictures.”

But previous studies often didn’t control for variables that could affect the accuracy of their conclusions, such as facial expressions, orientation of the head, and the presence of makeup or jewelry. In their new study, the researchers aimed to isolate the influence of facial features alone in predicting political orientation, thus providing a clearer picture of the capabilities and risks of facial recognition technology.

To achieve this, they recruited 591 participants from a major private university and carefully controlled the environment and conditions under which each participant’s face was photographed. The participants were dressed uniformly in black T-shirts, used face wipes to remove any makeup, and had their hair neatly tied back. They were seated in a fixed posture, and their faces were photographed in a well-lit room against a neutral background to ensure consistency across all images.

Once the photographs were taken, they were processed using a facial recognition algorithm — a ResNet-50 network trained on the VGGFace2 dataset that outputs 256-dimensional descriptors. This algorithm extracted numerical vectors, called face descriptors, from the images. These descriptors encode the facial features in a form that computers can analyze and were used to predict the participants’ political orientation through a model that mapped these descriptors onto a political orientation scale.
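
The descriptor-to-scale mapping the study describes is, in spirit, a regression from a 256-dimensional vector to a single score. A minimal sketch on synthetic data, using ridge regression; the sample size and descriptor dimension match the article, but the data, model form, and hyperparameters here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

n, d = 591, 256  # participants and descriptor dimension, per the article
X = rng.normal(size=(n, d))             # stand-in face descriptors
true_w = rng.normal(size=d)             # hypothetical descriptor-to-score weights
y = X @ true_w + rng.normal(scale=5.0, size=n)  # noisy orientation scores

# Ridge regression: closed-form solution of (X'X + alpha*I) w = X'y
alpha = 10.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

# Correlation between predicted and reported scores (the study reports
# cross-validated figures; this in-sample value is only illustrative)
pred = X @ w
r = np.corrcoef(pred, y)[0, 1]
```

The regularization term keeps the 256 weights stable when the number of faces is not much larger than the descriptor dimension.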

The researchers found that the facial recognition algorithm could predict political orientation with a correlation coefficient of .22. This correlation, while modest, was statistically significant and suggested that certain stable facial features could be linked to political orientation, independent of other demographic factors like age, gender, and ethnicity.

Next, Kosinski and his colleagues conducted a second study in which they replaced the algorithm with 1,026 human raters to assess if people could similarly predict political orientation from neutral facial images. The human raters were recruited through Amazon’s Mechanical Turk and were presented with the standardized facial images collected in the first study. Each rater was asked to assess the political orientation of the individuals in the photographs.

The raters completed over 5,000 assessments, and the results were analyzed to determine the correlation between their perceived ratings of political orientation and the actual orientations as reported by the participants. Human raters were able to predict political orientation with a correlation coefficient of .21, comparable to the algorithm’s performance.
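
The reported figures are Pearson correlation coefficients between predicted (or perceived) and self-reported orientation. On toy numbers, with an invented 1-to-5 scale, the computation looks like this:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented ratings: perceived orientation of each face vs. self-reported score
perceived = [3, 5, 2, 4, 4, 1, 5, 2]
reported = [2, 4, 3, 5, 3, 2, 4, 1]
r = pearson_r(perceived, reported)
```

A coefficient around .2 indicates a weak relationship at the level of any single judgment, but one that is detectable and consistent across thousands of ratings.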

“We knew that both humans and algorithms can judge intimate traits, ranging from personality to sexual orientation, and political views from social media profile pictures. Much of the signal likely comes from self-presentation, facial expression, head orientation, and other choices made by the person in the photo,” Kosinski told PsyPost.

“I was surprised that both algorithms and humans could predict political orientation also from carefully standardized images of expressionless faces. That suggests the existence of links between stable facial features and political orientation.”

In a third study, the researchers extended their examination of facial recognition’s predictive power to a different context by applying the model to a set of naturalistic images — those of politicians. The study aimed to validate the findings from the controlled laboratory settings in a more real-world scenario where the images were not standardized. The sample consisted of 3,401 profile images of politicians from the lower and upper chambers of legislatures across three countries: the United States, the United Kingdom, and Canada.

The results demonstrated that the facial recognition model could indeed predict political orientation from the naturalistic images of politicians, with a median correlation coefficient of .13. This level of accuracy, while not high, was nonetheless significant and indicated that some of the stable facial features predictive of political orientation in the controlled laboratory images could also be identified in more varied, real-life images.

The findings have worrying implications for privacy.

“While many other digital footprints are revealing of political orientation and other intimate traits, facial recognition can be used without subjects’ consent or knowledge,” Kosinski explained. “Facial images can be easily (and covertly) taken by law enforcement or obtained from digital or traditional archives, including social networks, dating platforms, photo-sharing websites, and government databases.

“They are often easily accessible; Facebook and LinkedIn profile pictures, for instance, can be accessed by anyone without a person’s consent or knowledge. Thus, the privacy threats posed by facial recognition technology are, in many ways, unprecedented.”

“All these findings are inconvenient. For ideological reasons, scientists prefer to avoid discussing links between appearance and traits,” Kosinski added. However, “companies and governments are keen to use facial recognition to identify intimate traits.”

As with any study, the research has limitations to consider. The diversity of the participants was constrained, with a significant majority being Caucasian, and all from a single private university, which might not provide a broad representation of global or even national demographics. While the study controlled for many variables, the influence of inherent biases in human perception or the algorithm’s design cannot be entirely ruled out.

Future research could expand on these findings by including a more diverse participant pool and employing more advanced imaging technologies, such as three-dimensional facial scans. Additionally, exploring these predictions across different cultures and political systems could provide deeper insights into the universality of the findings.

“We should be careful when interpreting the results of any single study,” Kosinski noted. “While our findings are in line with previous work, the results should be treated as tentative until they are replicated by independent researchers.”

Nevertheless, the research raises important questions about the potential uses and abuses of facial recognition technology.

“I hope that our findings will inform the policymaking and regulation of facial recognition technology,” Kosinski said. “Our previous papers often resulted in tightening regulation and tech companies adjusting their privacy protections. I also hope that this research will help us to boost our understanding of the links between appearance and psychological traits.”

The study, “Facial Recognition Technology and Human Raters Can Predict Political Orientation From Images of Expressionless Faces Even When Controlling for Demographics and Self-Presentation,” was authored by Michal Kosinski, Poruz Khambatta, and Yilun Wang.

(c) PsyPost Media Inc
