PsyPost
Neuroticism predicts stronger emotional bonds with AI chatbots

by Karina Petrova
December 24, 2025
in Artificial Intelligence
[Adobe Stock]

As artificial intelligence becomes a staple of modern life, people are increasingly turning to chatbots for companionship and comfort. A new study suggests that while users often rely on these digital entities for stability, the resulting bond is built more on habit and trust than deep emotional connection. These findings on the psychology of human-machine relationships were published in the journal Psychology of Popular Media.

The rise of sophisticated chatbots has created a unique social phenomenon in which humans interact with software as if it were a living being. This dynamic draws upon social presence theory, which describes the psychological sensation that another entity is physically or emotionally present during a mediated interaction.

Designers of these systems often aim to create a sense of social presence to make the user experience more engaging. The goal is for the artificial agent to appear to have a personality and the capacity for a relationship. However, the academic community has not fully reached a consensus on what constitutes intimacy in these synthetic scenarios.

Researchers wanted to understand the mechanics of this perceived intimacy. They sought to determine if personality traits influence how a user connects with a machine. The investigation was led by Yingjia Huang from the Department of Philosophy at Peking University and Jianfeng Lan from the School of Media and Communication at Shanghai Jiao Tong University.

The team recruited 103 participants who actively use AI companion applications such as Doubao and Xingye. These apps are designed to provide emotional interaction through text and voice. The participants completed detailed surveys designed to measure their personality traits and their perceived closeness to the AI.

To measure personality, the researchers used the “Big Five” framework, which assesses individuals on neuroticism, conscientiousness, agreeableness, openness, and extraversion. The survey also evaluated intimacy along five specific dimensions: trust, attachment, self-disclosure, virtual rapport, and addiction.

In addition to the quantitative survey, the researchers conducted in-depth interviews with eight selected participants. These conversations provided qualitative data regarding why users turn to digital companions. The interview subjects were chosen because they reported higher levels of intimacy in the initial survey.

The study revealed that most users do not experience a profound sense of intimacy with their chatbots. The average scores for emotional closeness were relatively low. This suggests that current technology has not yet bridged the gap required to foster deep interpersonal connections.

When analyzing the components of the relationship, the authors identified trust and addiction as the primary drivers. Users viewed the AI as a reliable outlet that is always available. The researchers interpreted the “addiction” component not necessarily as a pathology, but as a habit formed through daily routines.

The data showed that specific personality types are more prone to bonding with algorithms. Individuals scoring high in neuroticism reported stronger feelings of intimacy. Neuroticism is a trait often associated with emotional instability and anxiety.

For these users, the predictability of the computer program offers a sense of safety. Humans can be unpredictable or judgmental, but a coded companion provides consistent responses. One participant noted in an interview, “He’s always there, no matter what mood I’m in.”

People with high openness to experience also developed tighter bonds. These users tend to be imaginative and curious about new technologies. They engage with the AI as a form of exploration.

Users with high openness are willing to suspend disbelief to enjoy the interaction. They view the exchange as a form of experimental play rather than a replacement for human contact. They do not require the AI to be “real” to find value in the conversation.

The interviews highlighted that users often engage in emotional projection. They attribute feelings to the bot even while knowing it has no consciousness. This allows them to feel understood without the complexities of reciprocal human relationships.

The researchers identified three distinct ways users engaged with these systems. The first is “objectified companionship.” These users treat the AI like a digital pet, engaging in routine check-ins without deep emotional investment.

The second category is “emotional projection.” Users in this group use the AI as a safe container for their vulnerabilities. They vent their frustrations and anxieties, finding comfort in the machine’s non-judgmental nature.

The third category is “rational support.” These users do not seek emotional warmth. Instead, they value the AI for its logic and objectivity, using it as a counselor or advisor to help regulate their thoughts.

Despite these uses, participants frequently expressed frustration with technological limitations. Many described the AI’s language as too formal or repetitive. One user compared the experience to reading a customer service script.

This lack of spontaneity hinders the development of genuine immersion. Users noted that the AI lacks the warmth and fluidity of human conversation. Consequently, the relationship remains functional rather than truly affective.

The study posits that this form of intimacy relies on a “functional-affective gap.” Users maintain a high frequency of interaction for functional reasons, such as boredom relief or anxiety management. However, this does not translate into high emotional intimacy.

Trust in this context is defined by reliability rather than emotional closeness. Users trust the AI not to leak secrets or judge them. This form of trust acts as a substitute for the intuitive understanding found in human bonds.

The authors reference the philosophical concept of “I–Thou” versus “I–It” relationships. A true intimate bond is usually an “I–Thou” connection involving mutual recognition. Interactions with AI are technically “I–It” relationships because the machine lacks subjectivity.

However, the findings suggest that users psychologically approximate an “I–Thou” dynamic. They project meaning onto the AI’s output. The experience of intimacy is co-constructed by the user’s imagination and needs.

This dynamic creates a new relational paradigm. The line between simulation and reality becomes blurred. The user feels supported, which matters more to them than the ontological reality of the supporter.

The researchers argue that AI serves as a technological mediator of social affect. It functions as a mirror for the user’s emotions. The intimacy is layered and highly dependent on the context of the user’s life.

The study relies on a relatively small sample size of users from a specific cultural context. This focus on Chinese users may limit how well the results apply to other populations. Cultural attitudes toward technology and privacy could influence these results in different regions.

The cross-sectional nature of the survey also limits the ability to determine causality. It is unclear if neuroticism causes users to seek AI, or if the interaction appeals to those traits. Longitudinal studies would be needed to track how these relationships evolve over time.

Future investigations could examine how improved AI memory and emotional mimicry might alter these dynamics. As the technology becomes more lifelike, the distinction between functional and emotional intimacy may narrow. The authors imply that ethical design is essential as these bonds become more common.

The study, “Personality Meets the Machine: Traits and Attributes in Human–Artificial Intelligence Intimate Interactions,” was authored by Yingjia Huang and Jianfeng Lan.

(c) PsyPost Media Inc
