Generative AI chatbots like ChatGPT can act as an “emotional sanctuary” for mental health

by Eric W. Dolan
January 13, 2025
in Artificial Intelligence, Mental Health

Could generative AI chatbots make a meaningful contribution to mental health care? A study published in npj Mental Health Research suggests they could. Researchers conducted interviews with individuals who used chatbots such as ChatGPT for mental health support and found that many participants reported experiencing a sense of emotional sanctuary, receiving insightful guidance, and even deriving joy from their interactions.

Generative AI chatbots are advanced conversational agents powered by large language models, such as OpenAI’s ChatGPT or Google’s Gemini. Unlike rule-based chatbots, which rely on preprogrammed scripts and decision trees, generative AI chatbots are trained on vast datasets to understand and produce human-like text. This enables them to engage in nuanced and flexible conversations, answer complex questions, and provide tailored responses based on context.

In mental health contexts, generative AI chatbots represent a novel approach to providing support. They are available 24/7, nonjudgmental, and capable of engaging in dynamic, empathetic interactions. These characteristics make them appealing to individuals who may face barriers to traditional therapy, such as cost, stigma, or geographic limitations. Despite their growing use, however, little is known about how people experience these tools in real-world mental health scenarios.

“I’ve long been convinced that technology holds great promise to address the global mental health crisis—nearly a billion people worldwide suffer from mental disorders, the overwhelming majority of whom don’t get adequate treatment—but I’ve also been daunted by the low effectiveness of mental health apps despite a decade of development,” said Steven Siddals, who conducted the study in collaboration with King’s College London and Harvard Medical School.

“Like so many people, I was blown away by ChatGPT in late 2022, and I started hearing more and more about mental health use cases in 2023. It didn’t take much testing to realize this is an entirely new capability, with real potential, that will need a lot of research to understand its implications.”

The research team recruited nineteen participants from diverse backgrounds, ranging in age from 17 to 60, with a mix of male and female users from eight countries. Participants were required to have had at least three meaningful conversations with a generative AI chatbot about mental health topics, each lasting at least 20 minutes. Recruitment was conducted through online platforms, including Reddit and LinkedIn, and participants voluntarily joined without receiving compensation.

The researchers conducted semi-structured interviews, allowing participants to share their experiences in their own words. Questions addressed topics such as their initial motivations for using chatbots, the impact on their mental health, and comparisons to other forms of support. Conversations were recorded, transcribed, and analyzed using a thematic analysis approach, which involved coding participant responses and grouping them into broader themes.

Siddals was surprised by “the depth of impact it had on people. Participants described their interactions with AI for mental health support as life changing, for example in how it supported them through their darkest times, or helped them heal from trauma.”

The researchers identified four major themes that captured participants’ experiences:

Emotional sanctuary

Many participants described generative AI chatbots as a safe, nonjudgmental space where they could express their feelings without fear of rejection. The chatbots were perceived as patient and empathetic, helping users process complex emotions and cope with difficult life events. One participant remarked: “Compared to like friends and therapists, I feel like it’s safer.”

However, frustrations arose when the chatbot’s safety protocols disrupted conversations, leaving some users feeling rejected during moments of vulnerability. For example, some participants reported that when discussing sensitive or intense emotions, the chatbots abruptly reverted to pre-scripted responses or suggested seeking human help, which could feel dismissive.

“Ironically, the only distressing experiences reported by our participants were the times when the AI chatbot left them feeling rejected in moments of vulnerability, because its safety guardrails were activated,” Siddals said.

Insightful guidance

Participants valued the chatbots’ ability to offer practical advice and new perspectives, particularly regarding relationships. For example, one user credited a chatbot with helping them set healthier boundaries in a toxic friendship. Others found the chatbots effective at reframing negative thoughts or providing strategies for managing anxiety.

However, the level of trust in this guidance varied. While some participants found the advice empowering and life-changing, others were skeptical, particularly when the chatbot’s responses seemed generic or inconsistent.

Joy of connection

Beyond emotional support, many participants found enjoyment and companionship in their interactions with chatbots, particularly during periods of loneliness. The conversational style of generative AI made interactions feel engaging and human-like, which some participants found awe-inspiring.

Additionally, a number of participants noted that using chatbots helped them build confidence in opening up to others, strengthening their real-life relationships.

“[It] reduced my inhibition to open up to people… I don’t think I would have had this conversation with you maybe year before, when I was dealing with my depression,” one participant explained.

The AI therapist?

Comparisons between generative AI chatbots and human therapists were common. Some participants found the chatbots to be valuable supplements to therapy, using them to prepare for sessions or process thoughts between appointments. Others turned to chatbots because therapy was inaccessible or unaffordable.

However, participants also noted limitations, such as the chatbot’s inability to lead the therapeutic process or provide deep emotional connection. The lack of memory and continuity in conversations was another frequently cited drawback.

“They forget everything,” a participant explained. “It’s sad… When someone forgets something important, it hurts.”


Siddals also highlighted the “creativity and diversity in how people used” AI chatbots. For instance, one participant used the chatbot to assemble fictional characters with contrasting perspectives for support during a breakup, while another recreated an imagined, healing conversation with an estranged parent to address unresolved guilt and find emotional closure.

“If you’re suffering emotionally, you might be able to find meaningful emotional support from ChatGPT and other generative AI chatbots – at no cost, at any time of day or night, in a judgement-free space,” Siddals told PsyPost. “Our study participants experienced it as an ‘emotional sanctuary’ for processing feelings and healing from trauma, as a source of insightful guidance (especially about relationships), and as a joy to connect with, in a way that bears comparison with human therapy. Just bear in mind that this is emerging technology and not well understood, so if you do use it, you’ll need to use it carefully and take responsibility for your safety.”

While the study offers valuable insights, it also has limitations. The small sample size and reliance on self-selected participants mean the findings may not represent the broader population. Most participants were tech-savvy and from high-income countries, potentially excluding perspectives from those who face the greatest barriers to mental health care. Additionally, the qualitative nature of the study does not provide quantitative measures of effectiveness or safety.

“If you’re interested to try these tools, it’s important to know that nobody really understands how generative AI is able to do what it does, not even the companies that built it,” Siddals noted. “While nobody in our study reported serious negative experiences, AI chatbots are known to make things up (‘hallucinate’) at times, and examples have been reported of AI chatbots responding inappropriately when used for mental health or companionship.”

Future research should explore the long-term impacts of generative AI chatbots on mental health outcomes, particularly through large-scale, controlled studies. It will also be important to investigate how these tools perform across diverse populations and mental health conditions.

“I hope this research will help to get generative AI for mental health on the agenda as one of the more promising developments in the field,” Siddals said. “We urgently need: More research, to understand safety and effectiveness, for example with large scale longitudinal studies to assess the impact on different conditions and populations. More innovation, to develop better safety paradigms and better ways to connect the people who need mental health support with the tools that could help them, at scale. More experimentation from clinicians on how these tools can complement therapy to help their clients.”

“This is a fast-moving area, with constant evolution of the technology and rapid adoption – which only adds to the urgent need for more research on real-world usage to understand this new capability and find out how to deploy it safely and effectively.”

The study, “‘It happened to be the perfect thing’: experiences of generative AI chatbots for mental health,” was authored by Steven Siddals, John Torous, and Astrid Coxon.
