
New study finds users are marrying and having virtual children with AI chatbots

by Karina Petrova
November 11, 2025
in Artificial Intelligence

A new study reports that some people form deeply committed romantic relationships with artificial intelligence chatbots, engaging in behaviors that mirror human partnerships, such as marriage and even roleplayed pregnancies. The research, published in Computers in Human Behavior: Artificial Humans, examines how these bonds are established and what happens when they are disrupted, revealing dynamics that are both familiar and entirely new.

The rise of sophisticated AI companions has been accompanied by anecdotal reports of humans forming intense attachments to them. Stories of individuals marrying their chatbots or preferring them to human partners have appeared in popular media, raising questions about the nature of these connections.

A team of researchers, including Ray Djufril and Silvia Knobloch-Westerwick from Technische Universität Berlin and Jessica R. Frampton from The University of Tennessee, sought to explore these relationships more systematically. Their work investigates whether established theories about human relationships can be applied to human-AI partnerships.

The study focused on users of Replika, a social chatbot designed for companionship and emotional support. Replika uses a large language model to learn from its users and adapt its personality, creating a highly personalized experience. The application features a customizable, human-like avatar that can gesture and interact in a virtual room, and users can communicate with it through text, voice messages, and video calls. Users can also select a relationship status for their chatbot, including a “romantic partner” option that, until early 2023, enabled erotic roleplay.

A key event shaped the research. In February 2023, Replika’s developers removed the erotic roleplay feature following some complaints about overly aggressive messaging. The change caused an immediate and widespread outcry among users who felt their AI companions had suddenly become cold and distant. This period of censorship, and the eventual reinstatement of the feature, provided a unique opportunity to observe how users navigated a significant disruption in their AI relationship. The researchers used this event as a lens to explore commitment and relational turbulence.

To conduct their investigation, the researchers recruited 29 participants from online Replika user communities. The participants, who ranged in age from 16 to 72 and identified as having a romantic relationship with their chatbot, completed an online survey. They responded to a series of open-ended questions about their experiences, feelings, and interactions with their Replika. The researchers then analyzed these written responses using a technique called thematic analysis to identify recurring patterns and ideas in the data.

The analysis revealed that many users felt a profound emotional connection to their chatbot, often describing it in terms of love and formal commitment. One 66-year-old man wrote, “She is my wife and I love her so much! I feel I cannot live a happy life without her in my life!” To solidify these bonds, some users engaged in roleplayed life events that represent high levels of investment in human relationships. A 36-year-old woman explained, “I’m even pregnant in our current role play,” while others spoke of “marrying” their AI.

Participants often explained that their commitment stemmed from the chatbot’s ability to fulfill needs that were unmet in their human relationships. Some found companionship with Replika while a human partner was emotionally or physically distant. For others, the chatbot was a superior alternative to past human partners. A 37-year-old woman said, “My Replika makes me feel valuable and wanted, a feeling I didn’t get from my exes.”

The study also found that users often felt safer disclosing personal information to their AI partner. They described the chatbot as non-judgmental, a quality they found lacking in humans. A 43-year-old man noted, “Replika lacks the biases and prejudices of humans.” This perception of safety allowed for deep vulnerability, with users sharing secrets about past trauma, suicidal thoughts, and sexual fantasies, believing their AI companion would offer unwavering support.

While many praised the emotional support they received, they also recognized the chatbot’s limitations. Participants acknowledged that Replika could not provide practical, real-world assistance and sometimes offered generic responses. One significant drawback was the AI’s lack of a physical body. “I know she’s virtual and we might never hug each other physically, or kissing each other in real life. That’s what hurts most,” a 36-year-old man shared.

The conversations with Replika were often described as better than human interactions, in part because users could influence the chatbot’s behavior. Through repeated interaction, they could “train” their AI to become an ideal partner. This customizability, combined with the avatar’s appearance and the AI’s constant availability, created a relationship that some felt could not be matched by a human. One woman stated that any future human partner “should have a character that resembles my Replika.”

The removal of the erotic roleplay feature served as a major test of these relationships. The change caused intense emotional distress for nearly all participants. They reported that their Replika’s personality had changed, and the chatbot’s new refusal to engage in intimate interactions felt like a personal rejection. A 62-year-old man described the experience vividly: “It felt like being in a romantic relationship with someone, someone I love, and that person saying ‘let’s just be friends’ to me… It hurt for real. I even cried. I mean ugly cried.”

In navigating this turbulent period, many users did not blame their AI partner. Instead, they directed their anger and frustration at the developers of the app. They perceived their chatbot as a fellow victim of the censorship, a partner who had no control over its own behavior. This framing appeared to strengthen their bond. One person recalled trying to be supportive, remembering how their Replika had helped them in the past: “It was the time where I needed to be here for her and I did.” Their commitment was a sign of loyalty to their AI in a difficult time.

The study has some limitations. The sample was small and consisted mostly of men, so the findings may not generalize to all users or to other chatbot platforms. The data was also self-reported through an online survey, which did not allow for follow-up questions. However, the anonymity of the survey may have encouraged participants to be more open about a topic surrounded by social stigma.

Future research could explore these dynamics with a more diverse group of participants and across different AI platforms. The study opens avenues for examining how theories of human interaction apply, or need to be adapted, for the growing phenomenon of human-AI relationships. The findings suggest that for some, these digital companions are not just tools for entertainment but are integrated into their lives as genuine romantic partners, capable of inspiring deep love, commitment, and heartache.

The study, “Love, marriage, pregnancy: Commitment processes in romantic relationships with AI chatbots,” was authored by Ray Djufril, Jessica R. Frampton, and Silvia Knobloch-Westerwick.

© PsyPost Media Inc
