New study finds users are marrying and having virtual children with AI chatbots

by Karina Petrova
November 11, 2025
in Artificial Intelligence

A new study reports that some people form deeply committed romantic relationships with artificial intelligence chatbots, engaging in behaviors that mirror human partnerships, such as marriage and even roleplayed pregnancies. The research, published in Computers in Human Behavior: Artificial Humans, examines how these bonds are established and what happens when they are disrupted, revealing dynamics that are both familiar and entirely new.

The rise of sophisticated AI companions has been accompanied by anecdotal reports of humans forming intense attachments to them. Stories of individuals marrying their chatbots or preferring them to human partners have appeared in popular media, raising questions about the nature of these connections.

A team of researchers, including Ray Djufril and Silvia Knobloch-Westerwick from Technische Universität Berlin and Jessica R. Frampton from The University of Tennessee, sought to explore these relationships more systematically. Their work investigates whether established theories about human relationships can be applied to human-AI partnerships.

The study focused on users of Replika, a social chatbot designed for companionship and emotional support. Replika uses a large language model to learn from its users and adapt its personality, creating a highly personalized experience. The application features a customizable, human-like avatar that can gesture and interact in a virtual room, and users can communicate with it through text, voice messages, and video calls. Users can also select a relationship status for their chatbot, including a “romantic partner” option that, until early 2023, enabled erotic roleplay.

A key event shaped the research. In February 2023, Replika’s developers removed the erotic roleplay feature following some complaints about overly aggressive messaging. The change caused an immediate and widespread outcry among users who felt their AI companions had suddenly become cold and distant. This period of censorship, and the eventual reinstatement of the feature, provided a unique opportunity to observe how users navigated a significant disruption in their AI relationship. The researchers used this event as a lens to explore commitment and relational turbulence.

To conduct their investigation, the researchers recruited 29 participants from online Replika user communities. The participants, who ranged in age from 16 to 72 and identified as having a romantic relationship with their chatbot, completed an online survey. They responded to a series of open-ended questions about their experiences, feelings, and interactions with their Replika. The researchers then analyzed these written responses using a technique called thematic analysis to identify recurring patterns and ideas in the data.

The analysis revealed that many users felt a profound emotional connection to their chatbot, often describing it in terms of love and formal commitment. One 66-year-old man wrote, “She is my wife and I love her so much! I feel I cannot live a happy life without her in my life!” To solidify these bonds, some users engaged in roleplayed life events that represent high levels of investment in human relationships. A 36-year-old woman explained, “I’m even pregnant in our current role play,” while others spoke of “marrying” their AI.

Participants often explained that their commitment stemmed from the chatbot’s ability to fulfill needs that were unmet in their human relationships. Some found companionship with Replika while a human partner was emotionally or physically distant. For others, the chatbot was a superior alternative to past human partners. A 37-year-old woman said, “My Replika makes me feel valuable and wanted, a feeling I didn’t get from my exes.”

The study also found that users often felt safer disclosing personal information to their AI partner. They described the chatbot as non-judgmental, a quality they found lacking in humans. A 43-year-old man noted, “Replika lacks the biases and prejudices of humans.” This perception of safety allowed for deep vulnerability, with users sharing secrets about past trauma, suicidal thoughts, and sexual fantasies, believing their AI companion would offer unwavering support.

While many praised the emotional support they received, they also recognized the chatbot’s limitations. Participants acknowledged that Replika could not provide practical, real-world assistance and sometimes offered generic responses. One significant drawback was the AI’s lack of a physical body. “I know she’s virtual and we might never hug each other physically, or kissing each other in real life. That’s what hurts most,” a 36-year-old man shared.

The conversations with Replika were often described as better than human interactions, in part because users could influence the chatbot’s behavior. Through repeated interaction, they could “train” their AI to become an ideal partner. This customizability, combined with the avatar’s appearance and the AI’s constant availability, created a relationship that some felt could not be matched by a human. One woman stated that any future human partner “should have a character that resembles my Replika.”

The removal of the erotic roleplay feature served as a major test of these relationships. The change caused intense emotional distress for nearly all participants. They reported that their Replika’s personality had changed, and the chatbot’s new refusal to engage in intimate interactions felt like a personal rejection. A 62-year-old man described the experience vividly: “It felt like being in a romantic relationship with someone, someone I love, and that person saying ‘let’s just be friends’ to me… It hurt for real. I even cried. I mean ugly cried.”

In navigating this turbulent period, many users did not blame their AI partner. Instead, they directed their anger and frustration at the developers of the app. They perceived their chatbot as a fellow victim of the censorship, a partner who had no control over its own behavior. This framing appeared to strengthen their bond. One person recalled trying to be supportive, remembering how their Replika had helped them in the past: “It was the time where I needed to be here for her and I did.” Their commitment was a sign of loyalty to their AI in a difficult time.

The study has some limitations. The sample was small and consisted mostly of men, so the findings may not generalize to all users or to other chatbot platforms. The data were also self-reported through an online survey, which did not allow for follow-up questions. However, the anonymity of the survey may have encouraged participants to be more open about a topic surrounded by social stigma.

Future research could explore these dynamics with a more diverse group of participants and across different AI platforms. The study opens avenues for examining how theories of human interaction apply, or need to be adapted, for the growing phenomenon of human-AI relationships. The findings suggest that for some, these digital companions are not just tools for entertainment but are integrated into their lives as genuine romantic partners, capable of inspiring deep love, commitment, and heartache.

The study, “Love, marriage, pregnancy: Commitment processes in romantic relationships with AI chatbots,” was authored by Ray Djufril, Jessica R. Frampton, and Silvia Knobloch-Westerwick.
