Neuroticism predicts stronger emotional bonds with AI chatbots

by Karina Petrova
December 24, 2025
in Artificial Intelligence
[Adobe Stock]

As artificial intelligence becomes a staple of modern life, people are increasingly turning to chatbots for companionship and comfort. A new study suggests that while users often rely on these digital entities for stability, the resulting bond is built more on habit and trust than deep emotional connection. These findings on the psychology of human-machine relationships were published in the journal Psychology of Popular Media.

The rise of sophisticated chatbots has created a unique social phenomenon where humans interact with software as if it were a living being. This dynamic draws upon a concept known as social presence theory. This theory describes the psychological sensation that another entity is physically or emotionally present during a mediated interaction.

Designers of these systems often aim to create a sense of social presence to make the user experience more engaging. The goal is for the artificial agent to appear to have a personality and the capacity for a relationship. However, the academic community has not fully reached a consensus on what constitutes intimacy in these synthetic scenarios.

Researchers wanted to understand the mechanics of this perceived intimacy. They sought to determine if personality traits influence how a user connects with a machine. The investigation was led by Yingjia Huang from the Department of Philosophy at Peking University and Jianfeng Lan from the School of Media and Communication at Shanghai Jiao Tong University.

The team recruited 103 participants who actively use AI companion applications such as Doubao and Xingye. These apps are designed to provide emotional interaction through text and voice. The participants completed detailed surveys designed to measure their personality traits and their perceived closeness to the AI.

To measure personality, the researchers utilized the “Big Five” framework. This model assesses individuals based on neuroticism, conscientiousness, agreeableness, openness, and extraversion. The survey also evaluated intimacy through five specific dimensions: trust, attachment, self-disclosure, virtual rapport, and addiction.

In addition to the quantitative survey, the researchers conducted in-depth interviews with eight selected participants. These conversations provided qualitative data regarding why users turn to digital companions. The interview subjects were chosen because they reported higher levels of intimacy in the initial survey.

The study revealed that most users do not experience a profound sense of intimacy with their chatbots. The average scores for emotional closeness were relatively low. This suggests that current technology has not yet bridged the gap required to foster deep interpersonal connections.

When analyzing the components of these relationships, the authors identified trust and addiction as the primary drivers. Users viewed the AI as a reliable outlet that is always available. The researchers interpreted the “addiction” component not necessarily as a pathology, but as a habit formed through daily routines.

The data showed that specific personality types are more prone to bonding with algorithms. Individuals scoring high in neuroticism reported stronger feelings of intimacy. Neuroticism is a trait often associated with emotional instability and anxiety.

For these users, the predictability of the computer program offers a sense of safety. Humans can be unpredictable or judgmental, but a coded companion provides consistent responses. One participant noted in an interview, “He’s always there, no matter what mood I’m in.”

People with high openness to experience also developed tighter bonds. These users tend to be imaginative and curious about new technologies. They engage with the AI as a form of exploration.

Users with high openness are willing to suspend disbelief to enjoy the interaction. They view the exchange as a form of experimental play rather than a replacement for human contact. They do not require the AI to be “real” to find value in the conversation.

The interviews highlighted that users often engage in emotional projection. They attribute feelings to the bot even while knowing it has no consciousness. This allows them to feel understood without the complexities of reciprocal human relationships.

The researchers identified three distinct ways users engaged with these systems. The first is “objectified companionship.” These users treat the AI like a digital pet, engaging in routine check-ins without deep emotional investment.

The second category is “emotional projection.” Users in this group use the AI as a safe container for their vulnerabilities. They vent their frustrations and anxieties, finding comfort in the machine’s non-judgmental nature.

The third category is “rational support.” These users do not seek emotional warmth. Instead, they value the AI for its logic and objectivity, using it as a counselor or advisor to help regulate their thoughts.

Despite these uses, participants frequently expressed frustration with technological limitations. Many described the AI’s language as too formal or repetitive. One user compared the experience to reading a customer service script.

This lack of spontaneity hinders the development of genuine immersion. Users noted that the AI lacks the warmth and fluidity of human conversation. Consequently, the relationship remains functional rather than truly affective.

The study posits that this form of intimacy relies on a “functional-affective gap.” Users maintain a high frequency of interaction for functional reasons, such as boredom relief or anxiety management. However, this does not translate into high emotional intimacy.

Trust in this context is defined by reliability rather than emotional closeness. Users trust the AI not to leak secrets or judge them. This form of trust acts as a substitute for the intuitive understanding found in human bonds.

The authors reference the philosophical concept of “I–Thou” versus “I–It” relationships. A true intimate bond is usually an “I–Thou” connection involving mutual recognition. Interactions with AI are technically “I–It” relationships because the machine lacks subjectivity.

However, the findings suggest that users psychologically approximate an “I–Thou” dynamic. They project meaning onto the AI’s output. The experience of intimacy is co-constructed by the user’s imagination and needs.

This dynamic creates a new relational paradigm. The line between simulation and reality becomes blurred. The user feels supported, which matters more to them than the ontological reality of the supporter.

The researchers argue that AI serves as a technological mediator of social affect. It functions as a mirror for the user’s emotions. The intimacy is layered and highly dependent on the context of the user’s life.

The study relies on a relatively small sample size of users from a specific cultural context. This focus on Chinese users may limit how well the results apply to other populations. Cultural attitudes toward technology and privacy could influence these results in different regions.

The cross-sectional nature of the survey also limits the ability to determine causality. It is unclear whether neuroticism drives users to seek out AI companions, or whether these interactions simply appeal more to people with that trait. Longitudinal studies would be needed to track how these relationships evolve over time.

Future investigations could examine how improved AI memory and emotional mimicry might alter these dynamics. As the technology becomes more lifelike, the distinction between functional and emotional intimacy may narrow. The authors imply that ethical design is essential as these bonds become more common.

The study, “Personality Meets the Machine: Traits and Attributes in Human–Artificial Intelligence Intimate Interactions,” was authored by Yingjia Huang and Jianfeng Lan.

