PsyPost

Psychologists pinpoint the conversational mechanisms that help humans bond with AI

by Eric W. Dolan
April 22, 2026

New research published in the Journal of Social and Personal Relationships suggests that people can form meaningful social connections with artificial intelligence chatbots when the programs respond in a warm and empathetic way. The findings indicate that the feeling of being understood and validated by a chatbot tends to drive this sense of closeness.

Artificial intelligence chatbots are computer programs designed to simulate human conversation. Originally, people used these tools mostly for customer service or answering basic queries. Now, modern text generators are increasingly serving as companions, offering emotional support and mental health interventions.

Because people are beginning to treat these programs as social partners, scientists wanted to understand what exactly creates a sense of connection between a human and a machine. Historically, psychologists have observed that people tend to treat computers as social actors, applying human rules to interactions with machines. With the rise of highly advanced language models, this tendency has only grown stronger.

“AI chatbots are increasingly used not just to get information or complete tasks, but also in a social and relational way. People often share personal experiences and ask for advice about their lives, engaging with these systems almost as if they were interacting with another person,” said Alessia Telari, a postdoctoral researcher at the Catholic University of the Sacred Heart in Milan, who conducted the research as part of her PhD at the University of Milano-Bicocca.

“This shift made us curious about what drives that sense of connection. Drawing on theories of human relationships, we wondered whether the same dynamics might apply here, whether the way a chatbot responds to users’ self-disclosure plays a key role in making the interaction feel meaningful.”

The scientists wanted to know which plays the bigger role in building rapport: the specific topics people discuss, or the way the chatbot replies. In human interactions, intimacy usually develops when one person shares personal information and the other responds with understanding, validation, and care, a concept psychologists call perceived partner responsiveness.

Testing the impact of a warm and empathetic chatbot

The researchers designed two experiments to test whether this same psychological mechanism applies when the partner is artificial. In the first study, 163 participants from Italy engaged in an eight-minute, unstructured text conversation with a chatbot powered by a popular language model.

The scientists manipulated the software through specific background instructions to respond in one of three ways. The first version used a relational style, designed to be warm, empathetic, and human-like. The second version used a non-relational style, acting factual and task-oriented while avoiding emotional language. The third version was a standard default setting meant to act as a control group.


Participants were free to talk about any topic they chose during the eight-minute window. After the chat ended, they filled out a detailed questionnaire evaluating the program on various social metrics. These metrics included mind attribution, which measures how much agency and emotional capacity a person believes an entity possesses.

The researchers also measured perceived empathy, interaction satisfaction, and the participants’ own sense of interpersonal closeness. The relational chatbot produced significantly higher ratings across almost all of these categories compared to both the default and non-relational versions. People who interacted with the warm chatbot felt it possessed a greater capacity to experience emotions.

They also reported higher satisfaction of basic psychological needs. Specifically, participants felt a greater sense of belonging and meaningful existence after talking to the empathetic chatbot. The researchers noted that the default setting performed very similarly to the factual, non-relational setting.

The role of deep conversations and perceived responsiveness

The second experiment included 158 Italian participants and introduced a more structured conversation to test the impact of conversational depth. The researchers wanted to see if deep conversations prompted different reactions than casual ones. They programmed the chatbot to ask either superficial small talk questions or deep, personal questions designed to build closeness.

These deeper prompts were adapted from a well-known psychological exercise used to generate intimacy between human strangers. The researchers also kept the relational and non-relational response styles from the first experiment, dropping the default setting to focus on the two extremes. Participants interacted with the chatbot until the program signaled the end of the conversation.

The scientists found that people were quite willing to open up and share personal details when the chatbot asked deeper questions. This self-disclosure, in turn, led participants to perceive the chatbot as more responsive to their individual needs. Even with the deeper questions, the specific tone of the chatbot remained the dominant factor in building a bond.

When the program used a warm, relational response style, participants reported the highest levels of satisfaction and closeness. The scientists noted that the depth of the topic only increased closeness indirectly. By sharing more personal details, users gave the chatbot more opportunities to be supportive.

When the chatbot replied supportively to these personal disclosures, the users felt a stronger connection. Perceived responsiveness acted as the primary bridge linking the user’s personal sharing to their feeling of social connection.

“When chatbots respond in a warm and empathetic way, people tend to experience the interaction very differently: the chatbot feels more human-like, the conversation is more enjoyable, and most importantly, people feel more socially connected to it,” Telari told PsyPost.

“What seems to matter is a very familiar human process: when we share something personal and feel understood, validated, and cared for, we develop a sense of connection. Our findings suggest that mechanisms similar to those observed in human relationships may also emerge when the interaction partner is an AI.”

Designing emotionally supportive technology and future directions

These findings offer practical insights for the people who design and program interactive technology. In settings like peer support, education, or companionship for the elderly, a relational response style may help users feel acknowledged. The researchers note that they do not suggest these programs should replace human support networks.

Instead, the research highlights how small design choices can shape a user’s emotional experience. When a program validates a user’s feelings, the user is much more likely to want to interact with the software again in the future.

“Over time, many publicly available chatbots have shifted toward a more relational and human-like way of communicating, potentially leading users to feel socially connected to them,” Telari said. “Thus, as these technologies become more integrated into daily life, understanding these psychological mechanisms becomes increasingly important.”

While the research provides evidence that humans can feel connected to machines, there are some limitations to keep in mind. The experiments relied on brief, single interactions. A single eight-minute chat might not reflect how a relationship with an artificial intelligence develops over a longer period. The participants were mostly young adults from Italy, which limits how well these findings apply to other age groups or cultural backgrounds.

“We also focused on text-based interactions, which are common but only one way people engage with these chatbots,” Telari noted. “Future research should look at more naturalistic, long-term, and diverse interactions to better understand how these processes unfold in everyday life.”

“A key next step is to understand how these dynamics evolve over time and what their psychological consequences might be,” Telari added. “Ultimately, my long-term goal is to better understand when, how, and for whom interacting with these systems can be beneficial in supporting our social needs and when it might instead have unintended negative effects that risk undermining them.”

The study, “Can humans feel connected to AI? Perceived responsiveness drives social connection with AI chatbots,” was authored by Alessia Telari, Alessandro Gabbiadini, and Paolo Riva.

(c) PsyPost Media Inc
