
Smash or pass? AI could soon predict your date’s interest via physiological cues

by Eric W. Dolan
February 19, 2024
in Artificial Intelligence, Social Psychology
[Adobe Stock]


In a world where technology touches almost every aspect of our lives, it’s not surprising that researchers are now exploring how it can help us understand the nuances of human interaction, particularly in the realm of conversation. A recent study by engineers at the University of Cincinnati takes a fascinating leap in this direction, suggesting that artificial intelligence could soon tell you whether your first date is really into you, based on physiological responses alone.

By analyzing data from wearable sensors that track heart rate, respiration, and perspiration, the researchers were able to classify the type of conversation two people were having with notable accuracy. The research, published in the journal IEEE Transactions on Affective Computing, not only opens up new possibilities for enhancing interpersonal communication but also paves the way for applications in mental health counseling and education.

The inspiration for this study stemmed from a keen interest in physiological synchrony — a phenomenon where individuals’ physiological responses, such as heart rate and breathing patterns, become synchronized during conversation or collaboration. This synchrony is believed to be a marker of how engaged or in tune two people are with each other during an interaction.
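To make the idea concrete, one simple way to quantify synchrony is the peak cross-correlation between two partners’ heart-rate series within a short lag window. The sketch below is illustrative only, uses synthetic data, and is not the feature set the Cincinnati team actually computed.

```python
# Illustrative only: a simple synchrony index as the peak normalized
# cross-correlation between two partners' heart-rate series within a
# short lag window. The study's actual feature set is more elaborate.
import numpy as np

def synchrony_index(sig_a, sig_b, max_lag=10):
    """Return the largest absolute correlation between the two signals
    across time shifts of up to max_lag samples."""
    a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-9)
    b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-9)
    n = len(a)
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = np.corrcoef(a[lag:], b[:n - lag])[0, 1]
        else:
            r = np.corrcoef(a[:n + lag], b[-lag:])[0, 1]
        best = max(best, abs(r))
    return best

# Synthetic example: partner B's heart rate loosely tracks partner A's.
rng = np.random.default_rng(0)
hr_a = 70 + np.cumsum(rng.normal(0, 0.5, 300))   # beats per minute, 1 Hz
hr_b = hr_a + rng.normal(0, 2, 300)
print(f"synchrony index: {synchrony_index(hr_a, hr_b):.2f}")
```

Higher values indicate that the two signals rise and fall together, possibly with a small time offset between them.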

Previous studies have shown that the degree of physiological synchrony can predict how well individuals collaborate on tasks, the empathy patients feel from their therapists, and the engagement students experience with their teachers.

Researchers embarked on this study to explore whether technology could leverage physiological synchrony to infer the dynamics of a conversation without needing verbal feedback or observation. In a world increasingly reliant on digital communication, understanding the underlying emotional connections between individuals could significantly enhance virtual interactions, making this research both timely and relevant.

Eighteen pairs of participants were initially recruited, with the final analysis focusing on sixteen dyads due to data quality issues.

These pairs, encompassing friends, roommates, coworkers, relationship partners, and strangers, engaged in four distinct conversation scenarios: two-sided positive, two-sided negative, one-sided with the person on the left talking, and one-sided with the person on the right talking. These scenarios were crafted to elicit a range of emotional and physiological responses, which were then recorded using wearable sensors measuring heart rate, skin conductance, and respiration.

Participants were also asked to report their feelings using the Self-Assessment Manikin (SAM) and the Interpersonal Interaction Questionnaire (IIQ) to provide a comprehensive picture of each conversation’s emotional landscape.

University of Cincinnati engineering students demonstrate how they taught a computer to distinguish types of conversations based only on physiological cues. (Photo credit: Andrew Higley/University of Cincinnati)

The researchers applied advanced data processing techniques to filter and analyze the signals, extracting features related to both individual responses and the synchrony between participants. This rich dataset served as the input for machine learning algorithms, which were tasked with classifying the type of conversation based on physiological data alone.
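For a sense of what “individual features” might look like in practice, the sketch below computes a few standard descriptors: heart-rate variability as RMSSD, a crude skin-conductance response count, and mean respiration rate. These particular features and thresholds are assumptions for illustration; the paper’s full feature set is broader.

```python
# Illustrative individual features from one participant's signals.
# Feature choices and thresholds are assumptions, not the paper's exact set.
import numpy as np

def rmssd(ibi_ms):
    """Heart-rate variability: root mean square of successive differences
    between inter-beat intervals (milliseconds)."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def scr_count(eda_microsiemens, threshold=0.05):
    """Rough count of skin-conductance responses: upward jumps larger than
    `threshold` microsiemens between consecutive samples."""
    rises = np.diff(np.asarray(eda_microsiemens, dtype=float))
    return int(np.sum(rises > threshold))

def mean_respiration_rate(breath_timestamps_s):
    """Breaths per minute, from timestamps (in seconds) of detected breaths."""
    intervals = np.diff(np.asarray(breath_timestamps_s, dtype=float))
    return float(60.0 / intervals.mean())
```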

The researchers found that a two-stage classification approach, which first distinguished between one-sided and two-sided conversations before further classifying them, achieved the highest accuracy of 75%. This indicates a strong potential for technology to understand conversation dynamics through physiological signals.
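As a rough sketch of the two-stage idea, the snippet below first trains a binary model to separate one-sided from two-sided conversations, then trains a separate model within each branch. The feature matrix, label names, and choice of random forests are assumptions for illustration, not the authors’ exact pipeline.

```python
# Hypothetical sketch of a two-stage conversation classifier: stage 1 separates
# one-sided from two-sided conversations, stage 2 resolves the scenario within
# each branch. Models and label names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_two_stage(X, y):
    """X: (n_conversations, n_features) array of individual + synchrony features.
    y: scenario labels such as 'one_sided_left' or 'two_sided_positive'."""
    X, y = np.asarray(X), np.asarray(y)
    one_sided = np.array([label.startswith("one_sided") for label in y])

    stage1 = RandomForestClassifier(random_state=0).fit(X, one_sided)
    stage2_one = RandomForestClassifier(random_state=0).fit(X[one_sided], y[one_sided])
    stage2_two = RandomForestClassifier(random_state=0).fit(X[~one_sided], y[~one_sided])
    return stage1, stage2_one, stage2_two

def predict_two_stage(models, X):
    stage1, stage2_one, stage2_two = models
    X = np.asarray(X)
    branch = stage1.predict(X).astype(bool)      # True -> one-sided branch
    preds = np.empty(len(X), dtype=object)
    if branch.any():
        preds[branch] = stage2_one.predict(X[branch])
    if (~branch).any():
        preds[~branch] = stage2_two.predict(X[~branch])
    return preds
```

Splitting the task this way lets the first model handle the simpler binary distinction before the harder four-way call, which is one plausible reason the staged approach performed best in the study.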

However, when the classifier relied only on each individual’s physiological features, without the synchrony features that capture how the partners’ signals aligned, accuracy dropped to 65.6%, highlighting how much these features contribute to capturing the nuances of human interaction.

Interestingly, incorporating personality traits into the analysis did not improve the classification accuracy, suggesting that the structured nature of the conversation scenarios may have limited the impact of individual personality differences. The most successful classifications came from using a combination of physiological signals, with heart rate variability, skin conductance responses, and respiration rates among the most informative features.

The potential applications of this research are varied. Lead author and University of Cincinnati doctoral student Iman Chatterjee suggested that the technology developed through their study could serve as a tool for honest feedback in social interactions, effectively playing a “smash or pass” role in evaluating the dynamics of a date or any conversation.

“The computer could tell if you’re a bore,” he explained, highlighting how a modified version of their system might measure a person’s interest level, compatibility, and engagement in a conversation.

But despite the study’s significant achievements, the researchers are quick to acknowledge its limitations and the need for further exploration. One major consideration is the high accuracy (96.9%) achieved using self-report data, raising questions about the added value of physiological measures in contexts where individuals are already aware of their interaction dynamics. This suggests that future research should focus on more spontaneous conversation scenarios that might offer new insights into the role of physiological synchrony.

Moreover, the study’s structured conversation scenarios, while useful for initial exploration, may not fully capture the complexity of natural human interactions, which are often more nuanced and less predictable. Future studies could benefit from incorporating additional data types, such as speech patterns, gestures, and even EEG signals, to enrich the analysis and potentially improve classification accuracy.

The study, “Automated Classification of Dyadic Conversation Scenarios using Autonomic Nervous System Responses,” was authored by Iman Chatterjee, Maja Goršič, Mohammad S. Hossain, Joshua D. Clapp, and Vesna D. Novak.
