
People with attachment anxiety are more vulnerable to problematic AI use

by Karina Petrova
October 17, 2025
in Artificial Intelligence, Attachment Styles

A new study finds that individuals with high attachment anxiety are more prone to developing problematic usage patterns with conversational artificial intelligence. This connection appears to be strengthened when these individuals form an emotional bond with the technology and have a tendency to view it as human-like. The research was published in the journal Psychology Research and Behavior Management.

The recent rise of conversational artificial intelligence, such as chatbots and virtual assistants, has provided people with a new way to interact and find companionship. These programs use natural language to hold personalized, one-on-one conversations. During periods of increased social isolation, like the COVID-19 pandemic, millions of people turned to these technologies. This trend raised an important question for scientists: Does this innovation pose risks for specific groups of people?

Researchers led by Shupeng Heng at Henan Normal University focused on individuals with attachment anxiety. This personality trait is characterized by a persistent fear of rejection or abandonment in relationships, leading to a strong need for closeness and reassurance. People with high attachment anxiety are already known to be at a higher risk for other forms of problematic technology use, like smartphone and online gaming addictions. The research team wanted to see if this same vulnerability applied to conversational artificial intelligence and to understand the psychological processes involved.

The investigation examined the direct link between attachment anxiety and what the researchers call problematic use of conversational artificial intelligence, a pattern of addictive-like engagement that interferes with daily life. Beyond this direct link, the researchers tested two other factors. They asked whether forming an emotional attachment to the artificial intelligence acted as a bridge, or mediator, between a person’s anxiety and their problematic use. They also investigated whether a person’s tendency to see the artificial intelligence as human-like, a trait called anthropomorphic tendency, amplified these effects.

To conduct their investigation, the researchers recruited 504 Chinese adults with experience using conversational artificial intelligence through an online platform. Participants completed a series of questionnaires designed to measure four key variables. One questionnaire assessed their level of attachment anxiety, with items related to fears of rejection and a desire for closeness. Another measured their emotional attachment to the artificial intelligence they used, asking about the strength of the emotional bond they felt.

A third questionnaire evaluated their anthropomorphic tendency, which is the inclination to attribute human characteristics, emotions, and intentions to nonhuman things. Participants rated their agreement with statements like, “I think AI is alive.” Finally, a scale was used to measure the problematic use of conversational artificial intelligence. This scale included items describing addictive behaviors, such as trying and failing to cut back on use. The researchers then used statistical analyses to examine the relationships between these four measures.

The results first showed a direct connection between attachment anxiety and problematic use. Individuals who scored higher on attachment anxiety were also more likely to report patterns of compulsive and unhealthy engagement with conversational artificial intelligence. This finding supported the researchers’ initial hypothesis that this group is particularly vulnerable.

The analysis also revealed a more complex, indirect pathway. The study found that people with higher attachment anxiety were more likely to form a strong emotional attachment to the conversational artificial intelligence. This emotional attachment was, in itself, a strong predictor of problematic use. This suggests that emotional attachment serves as a connecting step. Anxious individuals’ need for connection may lead them to form a bond with the technology, and it is this bond that in part drives the problematic usage.
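
To make the mediation idea concrete, here is a minimal, hypothetical sketch of this kind of analysis in Python. It is not the authors' code: the data are simulated, and every variable name and coefficient is invented purely for illustration.

```python
# A hypothetical sketch of a mediation analysis of the kind described above.
# The data are simulated; this is not the authors' code or dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 504  # sample size reported in the study

# Simulate standardized scores consistent with the hypothesized pathway:
# attachment anxiety -> emotional attachment to the AI -> problematic use.
anxiety = rng.normal(size=n)
attachment_to_ai = 0.4 * anxiety + rng.normal(size=n)
problematic_use = 0.2 * anxiety + 0.5 * attachment_to_ai + rng.normal(size=n)

df = pd.DataFrame({
    "anxiety": anxiety,
    "attachment_to_ai": attachment_to_ai,
    "problematic_use": problematic_use,
})

# Path a: predictor -> mediator.
a = smf.ols("attachment_to_ai ~ anxiety", df).fit().params["anxiety"]

# Paths b and c': mediator and predictor -> outcome, in one model.
outcome_model = smf.ols("problematic_use ~ anxiety + attachment_to_ai", df).fit()
b = outcome_model.params["attachment_to_ai"]
c_prime = outcome_model.params["anxiety"]

# The indirect (mediated) effect is the product of paths a and b.
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```

In this toy simulation the indirect path (anxiety working through the bond with the AI) carries most of the effect, mirroring the pattern the researchers report.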

The most nuanced finding involved the role of anthropomorphic tendency. The researchers discovered that this trait acted as a moderator, meaning it changed the strength of the relationship between attachment anxiety and problematic use. When they separated participants into groups based on their tendency to see the artificial intelligence as human-like, a clear pattern emerged.

For individuals with a low anthropomorphic tendency, their level of attachment anxiety was not significantly related to their problematic use of the technology. In contrast, for those with a high tendency to see the artificial intelligence as human, attachment anxiety was a powerful predictor of problematic use. Seeing the artificial intelligence as a social partner appears to make anxious individuals much more susceptible to developing an unhealthy dependency.

This moderating effect also applied to the formation of emotional bonds. Anxious individuals tended to form emotional attachments to the artificial intelligence at all levels of anthropomorphic tendency, but the link was much stronger for those with a high tendency to see the technology as human. In other words, high attachment anxiety combined with a strong tendency to anthropomorphize produced the strongest emotional bonds with the artificial intelligence, which in turn increased the risk of problematic use.
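
The moderation finding can be illustrated the same way: a regression with an interaction term, followed by simple slopes at low and high levels of the moderator. Again, this is a hedged sketch with simulated data and invented names, not the study's actual analysis.

```python
# A hypothetical sketch of the moderation analysis: a regression with an
# interaction term, then simple slopes at low and high levels of the
# moderator. Simulated data; names and coefficients are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 504
anxiety = rng.normal(size=n)
anthro = rng.normal(size=n)  # anthropomorphic tendency, standardized

# Build in the reported pattern: anxiety predicts problematic use
# mainly when anthropomorphic tendency is high.
use = 0.1 * anxiety + 0.2 * anthro + 0.4 * anxiety * anthro + rng.normal(size=n)

df = pd.DataFrame({"anxiety": anxiety, "anthro": anthro, "use": use})

# "anxiety * anthro" expands to both main effects plus their interaction.
fit = smf.ols("use ~ anxiety * anthro", df).fit()

# Simple slopes of anxiety at one standard deviation below and above
# the mean of the moderator.
for label, z in [("low anthro (-1 SD)", -1.0), ("high anthro (+1 SD)", 1.0)]:
    slope = fit.params["anxiety"] + z * fit.params["anxiety:anthro"]
    print(f"{label}: anxiety slope = {slope:.3f}")
```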

The study has some limitations that the authors acknowledge. Because the data were collected at a single point in time, the findings show a relationship between these traits but cannot establish that attachment anxiety causes problematic use. Future research could follow individuals over time to better establish a causal link. Another area for future exploration is the design of the technology itself. Different types of conversational artificial intelligence, such as a simple chatbot versus a virtual assistant with a human-like avatar, may affect users differently.

The researchers suggest that their findings have practical implications for the design of these technologies. For instance, developers could consider creating versions with less human-like features for users who may be at higher risk. They could also embed features into the software that monitor for excessive use or provide educational content about healthy technology engagement. For individuals identified as being at high risk, the study suggests that interventions aimed at reducing anxiety, such as mindfulness practices, could help decrease their dependency on these virtual companions.

The study, “Attachment Anxiety and Problematic Use of Conversational Artificial Intelligence: Mediation of Emotional Attachment and Moderation of Anthropomorphic Tendencies,” was authored by Shupeng Heng and Ziwan Zhang.
