
People with attachment anxiety are more vulnerable to problematic AI use

by Karina Petrova
October 17, 2025
in Artificial Intelligence, Attachment Styles

A new study finds that individuals with high attachment anxiety are more prone to developing problematic usage patterns with conversational artificial intelligence. This connection appears to be strengthened when these individuals form an emotional bond with the technology and have a tendency to view it as human-like. The research was published in the journal Psychology Research and Behavior Management.

The recent rise of conversational artificial intelligence, such as chatbots and virtual assistants, has provided people with a new way to interact and find companionship. These programs use natural language to hold personalized, one-on-one conversations. During periods of increased social isolation, like the COVID-19 pandemic, millions of people turned to these technologies. This trend raised an important question for scientists: Does this innovation pose risks for specific groups of people?

Researchers led by Shupeng Heng at Henan Normal University focused on individuals with attachment anxiety. This personality trait is characterized by a persistent fear of rejection or abandonment in relationships, leading to a strong need for closeness and reassurance. People with high attachment anxiety are already known to be at a higher risk for other forms of problematic technology use, like smartphone and online gaming addictions. The research team wanted to see if this same vulnerability applied to conversational artificial intelligence and to understand the psychological processes involved.

The investigation sought to explore the direct link between attachment anxiety and what the researchers call the problematic use of conversational artificial intelligence, a pattern of addictive-like engagement that negatively impacts daily life. Beyond this direct link, the researchers examined two other factors. They explored whether forming an emotional attachment to the artificial intelligence acted as a bridge between a person’s anxiety and their problematic use. They also investigated if a person’s tendency to see the artificial intelligence as human-like, a trait called anthropomorphic tendency, amplified these effects.

To conduct their investigation, the researchers recruited 504 Chinese adults with experience using conversational artificial intelligence through an online platform. Participants completed a series of questionnaires designed to measure four key variables. One questionnaire assessed their level of attachment anxiety, with items related to fears of rejection and a desire for closeness. Another measured their emotional attachment to the artificial intelligence they used, asking about the strength of the emotional bond they felt.

A third questionnaire evaluated their anthropomorphic tendency, which is the inclination to attribute human characteristics, emotions, and intentions to nonhuman things. Participants rated their agreement with statements like, “I think AI is alive.” Finally, a scale was used to measure the problematic use of conversational artificial intelligence. This scale included items describing addictive behaviors, such as trying and failing to cut back on use. The researchers then used statistical analyses to examine the relationships between these four measures.
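To make the analytic approach concrete, the sketch below shows how a regression-based test of this kind of moderated mediation model is typically set up. It runs on simulated data; the variable names, effect sizes, and choice of software are illustrative assumptions, not the authors' actual analysis.

```python
# A rough sketch of a regression-based moderated mediation analysis of the
# kind described above, run on simulated data. All variable names and
# effect sizes are illustrative assumptions, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 504  # matches the study's sample size

# Standardized scores for the four questionnaire measures (simulated)
anxiety = rng.normal(size=n)   # attachment anxiety (predictor X)
anthro = rng.normal(size=n)    # anthropomorphic tendency (moderator W)
attach = 0.4 * anxiety + 0.2 * anxiety * anthro + rng.normal(size=n)  # emotional attachment (mediator M)
use = 0.3 * anxiety + 0.5 * attach + rng.normal(size=n)               # problematic use (outcome Y)

df = pd.DataFrame({"anxiety": anxiety, "anthro": anthro,
                   "attach": attach, "use": use})

# Path a with moderation: does attachment anxiety predict emotional
# attachment, and does anthropomorphic tendency strengthen that link?
# "anxiety * anthro" expands to both main effects plus their interaction.
model_a = smf.ols("attach ~ anxiety * anthro", data=df).fit()

# Paths b and c': does emotional attachment predict problematic use
# over and above the direct effect of attachment anxiety?
model_b = smf.ols("use ~ attach + anxiety", data=df).fit()

print(model_a.params)  # interaction term tests the moderation
print(model_b.params)  # 'attach' coefficient is path b; 'anxiety' is c'
```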

The results first showed a direct connection between attachment anxiety and problematic use. Individuals who scored higher on attachment anxiety were also more likely to report patterns of compulsive and unhealthy engagement with conversational artificial intelligence. This finding supported the researchers’ initial hypothesis that this group is particularly vulnerable.

The analysis also revealed a more complex, indirect pathway. The study found that people with higher attachment anxiety were more likely to form a strong emotional attachment to the conversational artificial intelligence. This emotional attachment was, in itself, a strong predictor of problematic use. This suggests that emotional attachment serves as a connecting step. Anxious individuals’ need for connection may lead them to form a bond with the technology, and it is this bond that in part drives the problematic usage.
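In the standard mediation framework, this logic has a simple decomposition; the notation below is generic rather than taken from the paper:

```latex
% a: anxiety -> emotional attachment
% b: emotional attachment -> problematic use
% c': direct effect of anxiety on problematic use
\[
\underbrace{c}_{\text{total effect}} \;=\; c' + ab,
\qquad
\text{indirect effect through emotional attachment} = ab
\]
```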

The most nuanced finding involved the role of anthropomorphic tendency. The researchers discovered that this trait acted as a moderator, meaning it changed the strength of the relationship between attachment anxiety and problematic use. When they separated participants into groups based on their tendency to see the artificial intelligence as human-like, a clear pattern emerged.

For individuals with a low anthropomorphic tendency, their level of attachment anxiety was not significantly related to their problematic use of the technology. In contrast, for those with a high tendency to see the artificial intelligence as human, attachment anxiety was a powerful predictor of problematic use. Seeing the artificial intelligence as a social partner appears to make anxious individuals much more susceptible to developing an unhealthy dependency.
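Statistically, a moderation effect of this kind means the slope relating attachment anxiety to problematic use depends on the anthropomorphism score. In a generic interaction model (again, the notation is illustrative, not the paper's):

```latex
% Y: problematic use, X: attachment anxiety, W: anthropomorphic tendency
\[
Y = b_0 + b_1 X + b_2 W + b_3 XW + \varepsilon,
\qquad
\frac{\partial Y}{\partial X} = b_1 + b_3 W
\]
% A near-zero simple slope (b_1 + b_3 W) at low W and a large one at
% high W would reproduce the reported pattern.
```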

This moderating effect also applied to the formation of emotional bonds. Anxious individuals tended to form emotional attachments to the artificial intelligence at all levels of anthropomorphic tendency, but the effect was much stronger among those with a high tendency to see the technology as human. In other words, high attachment anxiety combined with a tendency to anthropomorphize produced the strongest emotional bonds with the artificial intelligence, which in turn increased the risk of problematic use.

The study has some limitations that the authors acknowledge. Because the data were collected at a single point in time, the study shows an association between these traits but cannot establish that attachment anxiety causes problematic use. Future research could follow individuals over time to better establish a causal link. Another area for future exploration is the design of the technology itself. Different types of conversational artificial intelligence, such as a simple chatbot versus a virtual assistant with a human-like avatar, may have different effects on users.

The researchers suggest that their findings have practical implications for the design of these technologies. For instance, developers could consider creating versions with less human-like features for users who may be at higher risk. They could also embed features into the software that monitor for excessive use or provide educational content about healthy technology engagement. For individuals identified as being at high risk, the study suggests that interventions aimed at reducing anxiety, such as mindfulness practices, could help decrease their dependency on these virtual companions.

The study, “Attachment Anxiety and Problematic Use of Conversational Artificial Intelligence: Mediation of Emotional Attachment and Moderation of Anthropomorphic Tendencies,” was authored by Shupeng Heng and Ziwan Zhang.
