PsyPost

People with attachment anxiety are more vulnerable to problematic AI use

by Karina Petrova
October 17, 2025
in Artificial Intelligence, Attachment Styles
[Adobe Stock]

A new study finds that individuals with high attachment anxiety are more prone to developing problematic usage patterns with conversational artificial intelligence. This connection appears to be strengthened when these individuals form an emotional bond with the technology and have a tendency to view it as human-like. The research was published in the journal Psychology Research and Behavior Management.

The recent rise of conversational artificial intelligence, such as chatbots and virtual assistants, has provided people with a new way to interact and find companionship. These programs use natural language to hold personalized, one-on-one conversations. During periods of increased social isolation, like the COVID-19 pandemic, millions of people turned to these technologies. This trend raised an important question for scientists: Does this innovation pose risks for specific groups of people?

Researchers led by Shupeng Heng at Henan Normal University focused on individuals with attachment anxiety. This trait, rooted in attachment theory, is characterized by a persistent fear of rejection or abandonment in relationships, leading to a strong need for closeness and reassurance. People with high attachment anxiety are already known to be at higher risk for other forms of problematic technology use, such as smartphone and online gaming addictions. The research team wanted to see whether this same vulnerability applied to conversational artificial intelligence and to understand the psychological processes involved.

The investigation sought to explore the direct link between attachment anxiety and what the researchers call the problematic use of conversational artificial intelligence, a pattern of addictive-like engagement that negatively impacts daily life. Beyond this direct link, the researchers examined two other factors. They explored whether forming an emotional attachment to the artificial intelligence acted as a bridge between a person’s anxiety and their problematic use. They also investigated if a person’s tendency to see the artificial intelligence as human-like, a trait called anthropomorphic tendency, amplified these effects.

To conduct their investigation, the researchers recruited 504 Chinese adults who had experience using conversational artificial intelligence. The participants were gathered through an online platform and completed a series of questionnaires designed to measure four key variables. One questionnaire assessed their level of attachment anxiety, with items related to fears of rejection and a desire for closeness. Another measured their emotional attachment to the artificial intelligence they used, asking about the strength of the emotional bond they felt.

A third questionnaire evaluated their anthropomorphic tendency, which is the inclination to attribute human characteristics, emotions, and intentions to nonhuman things. Participants rated their agreement with statements like, “I think AI is alive.” Finally, a scale was used to measure the problematic use of conversational artificial intelligence. This scale included items describing addictive behaviors, such as trying and failing to cut back on use. The researchers then used statistical analyses to examine the relationships between these four measures.

The results first showed a direct connection between attachment anxiety and problematic use. Individuals who scored higher on attachment anxiety were also more likely to report patterns of compulsive and unhealthy engagement with conversational artificial intelligence. This finding supported the researchers’ initial hypothesis that this group is particularly vulnerable.

The analysis also revealed a more complex, indirect pathway. The study found that people with higher attachment anxiety were more likely to form a strong emotional attachment to the conversational artificial intelligence. This emotional attachment was, in itself, a strong predictor of problematic use. This suggests that emotional attachment serves as a connecting step. Anxious individuals’ need for connection may lead them to form a bond with the technology, and it is this bond that in part drives the problematic usage.

The most nuanced finding involved the role of anthropomorphic tendency. The researchers discovered that this trait acted as a moderator, meaning it changed the strength of the relationship between attachment anxiety and problematic use. When they separated participants into groups based on their tendency to see the artificial intelligence as human-like, a clear pattern emerged.

For individuals with a low anthropomorphic tendency, their level of attachment anxiety was not significantly related to their problematic use of the technology. In contrast, for those with a high tendency to see the artificial intelligence as human, attachment anxiety was a powerful predictor of problematic use. Seeing the artificial intelligence as a social partner appears to make anxious individuals much more susceptible to developing an unhealthy dependency.

This moderating effect also applied to the formation of emotional bonds. Anxious individuals tended to develop emotional attachments to the artificial intelligence at all levels of anthropomorphic tendency, but the link was much stronger for those with a high tendency to see the technology as human. In other words, high attachment anxiety combined with a tendency to anthropomorphize produced the strongest emotional bonds with the artificial intelligence, which in turn increased the risk of problematic use.

The study has some limitations that the authors acknowledge. Because the data were collected at a single point in time, the findings show an association between these traits but cannot prove that attachment anxiety causes problematic use. Future research could follow individuals over time to better establish a causal link. Another area for future exploration is the design of the technology itself. Different types of conversational artificial intelligence, such as a simple chatbot versus a virtual assistant with a human-like avatar, may affect users differently.

The researchers suggest that their findings have practical implications for the design of these technologies. For instance, developers could consider creating versions with less human-like features for users who may be at higher risk. They could also embed features into the software that monitor for excessive use or provide educational content about healthy technology engagement. For individuals identified as being at high risk, the study suggests that interventions aimed at reducing anxiety, such as mindfulness practices, could help decrease their dependency on these virtual companions.

The study, “Attachment Anxiety and Problematic Use of Conversational Artificial Intelligence: Mediation of Emotional Attachment and Moderation of Anthropomorphic Tendencies,” was authored by Shupeng Heng and Ziwan Zhang.
