PsyPost

People with attachment anxiety are more vulnerable to problematic AI use

by Karina Petrova
October 17, 2025

A new study finds that individuals with high attachment anxiety are more prone to developing problematic usage patterns with conversational artificial intelligence. The link is partly explained by the emotional bond these individuals form with the technology, and it is amplified in those who tend to view the technology as human-like. The research was published in the journal Psychology Research and Behavior Management.

The recent rise of conversational artificial intelligence, such as chatbots and virtual assistants, has provided people with a new way to interact and find companionship. These programs use natural language to hold personalized, one-on-one conversations. During periods of increased social isolation, like the COVID-19 pandemic, millions of people turned to these technologies. This trend raised an important question for scientists: Does this innovation pose risks for specific groups of people?

Researchers led by Shupeng Heng at Henan Normal University focused on individuals with attachment anxiety. This personality trait is characterized by a persistent fear of rejection or abandonment in relationships, leading to a strong need for closeness and reassurance. People with high attachment anxiety are already known to be at a higher risk for other forms of problematic technology use, like smartphone and online gaming addictions. The research team wanted to see if this same vulnerability applied to conversational artificial intelligence and to understand the psychological processes involved.

The investigation sought to explore the direct link between attachment anxiety and what the researchers call the problematic use of conversational artificial intelligence, a pattern of addictive-like engagement that negatively impacts daily life. Beyond this direct link, the researchers examined two other factors. They explored whether forming an emotional attachment to the artificial intelligence acted as a bridge between a person’s anxiety and their problematic use. They also investigated if a person’s tendency to see the artificial intelligence as human-like, a trait called anthropomorphic tendency, amplified these effects.

To conduct their investigation, the researchers recruited 504 Chinese adults who had experience using conversational artificial intelligence. The participants were recruited through an online platform and completed a series of questionnaires designed to measure four key variables. One questionnaire assessed their level of attachment anxiety, with items related to fears of rejection and a desire for closeness. Another measured their emotional attachment to the artificial intelligence they used, asking about the strength of the emotional bond they felt.

A third questionnaire evaluated their anthropomorphic tendency, which is the inclination to attribute human characteristics, emotions, and intentions to nonhuman things. Participants rated their agreement with statements like, “I think AI is alive.” Finally, a scale was used to measure the problematic use of conversational artificial intelligence. This scale included items describing addictive behaviors, such as trying and failing to cut back on use. The researchers then used statistical analyses to examine the relationships between these four measures.

The results first showed a direct connection between attachment anxiety and problematic use. Individuals who scored higher on attachment anxiety were also more likely to report patterns of compulsive and unhealthy engagement with conversational artificial intelligence. This finding supported the researchers’ initial hypothesis that this group is particularly vulnerable.

The analysis also revealed a more complex, indirect pathway. The study found that people with higher attachment anxiety were more likely to form a strong emotional attachment to the conversational artificial intelligence. This emotional attachment was, in itself, a strong predictor of problematic use. This suggests that emotional attachment serves as a connecting step. Anxious individuals’ need for connection may lead them to form a bond with the technology, and it is this bond that in part drives the problematic usage.


The most nuanced finding involved the role of anthropomorphic tendency. The researchers discovered that this trait acted as a moderator, meaning it changed the strength of the relationship between attachment anxiety and problematic use. When they separated participants into groups based on their tendency to see the artificial intelligence as human-like, a clear pattern emerged.

For individuals with a low anthropomorphic tendency, their level of attachment anxiety was not significantly related to their problematic use of the technology. In contrast, for those with a high tendency to see the artificial intelligence as human, attachment anxiety was a powerful predictor of problematic use. Seeing the artificial intelligence as a social partner appears to make anxious individuals much more susceptible to developing an unhealthy dependency.

This moderating effect also applied to the formation of emotional bonds. Anxious individuals tended to form emotional attachments to the artificial intelligence at all levels of anthropomorphic tendency, but the effect was much stronger for those inclined to see the technology as human. In other words, high attachment anxiety combined with a tendency to anthropomorphize created the strongest emotional bonds with the artificial intelligence, which then increased the risk of problematic use.

The study has some limitations that the authors acknowledge. Because the data was collected at a single point in time, it shows a relationship between these traits but cannot prove that attachment anxiety causes problematic use. Future research could follow individuals over time to better establish a causal link. Another area for future exploration is the design of the technology itself. Different types of conversational artificial intelligence, such as a simple chatbot versus a virtual assistant with a human-like avatar, may have different effects on users.

The researchers suggest that their findings have practical implications for the design of these technologies. For instance, developers could consider creating versions with less human-like features for users who may be at higher risk. They could also embed features into the software that monitor for excessive use or provide educational content about healthy technology engagement. For individuals identified as being at high risk, the study suggests that interventions aimed at reducing anxiety, such as mindfulness practices, could help decrease their dependency on these virtual companions.

The study, “Attachment Anxiety and Problematic Use of Conversational Artificial Intelligence: Mediation of Emotional Attachment and Moderation of Anthropomorphic Tendencies,” was authored by Shupeng Heng and Ziwan Zhang.

(c) PsyPost Media Inc
