Why would shoppers prefer chatbots to humans? New study pinpoints a key factor

by Eric W. Dolan
June 26, 2024
in Artificial Intelligence, Business
(Photo credit: Adobe Stock)


Technological advancements are transforming customer service, with firms increasingly relying on chatbots—automated virtual agents that simulate human conversation. While consumers generally prefer interacting with human customer service agents, a new study reveals an interesting twist: when consumers feel embarrassed about their purchases, they actually prefer dealing with chatbots. The study was published in the Journal of Consumer Psychology.

The primary aim of the study was to understand how consumers’ concerns about self-presentation—essentially, their worries about being judged by others—affect their interactions with chatbots compared to human customer service agents. Lead researcher Jianna Jin, an assistant professor at the University of Notre Dame, and her colleagues wanted to explore whether chatbots could mitigate feelings of embarrassment in online shopping scenarios. This inquiry was particularly relevant as chatbots, with their ambiguous or disclosed identities, become more prevalent in the digital marketplace.

The researchers conducted a series of five studies to understand consumer preferences when dealing with chatbots versus human agents in contexts likely to elicit embarrassment. Participants were recruited from Amazon Mechanical Turk and other platforms.

Study 1 involved 403 participants who were asked to imagine buying a personal lubricant from an online store. They interacted with an ambiguous chat agent, meaning the agent’s identity as either human or chatbot was not disclosed. The participants then had to infer the agent’s identity and complete a measure of self-presentation concerns related to sex-related topics.

The results showed that participants with higher self-presentation concerns were more likely to infer that the ambiguous chat agent was human. This finding suggested that in situations where people felt anxious about how they were perceived, they tended to err on the side of caution, assuming the agent might be human to prepare themselves for potential embarrassment.

Study 2 expanded on these findings by comparing reactions to different product categories. Here, 795 female participants imagined purchasing either a personal lubricant or body lotion from an online store and interacted with the same ambiguous chat agent as in Study 1. The study aimed to see if the type of product influenced their perception of the chat agent’s identity.

As predicted, participants inferred the agent to be human more frequently when shopping for personal lubricant compared to body lotion. This demonstrated that the nature of the product could activate self-presentation concerns, affecting how consumers perceive and interact with customer service agents.

Study 3 shifted the focus to clearly identified chatbots and human agents. A large sample of 1,501 participants was asked to imagine buying antidiarrheal medication and interacted with either a non-anthropomorphized chatbot (a chatbot without human-like features), an anthropomorphized chatbot (a chatbot with human-like features), or a human service rep.

Participants showed a higher willingness to engage with the non-anthropomorphized chatbot than with the human agent, particularly when the purchase context involved potential embarrassment. However, this preference diminished when the chatbot was anthropomorphized, indicating that giving chatbots human-like qualities can make consumers feel as judged as they would by a human agent.

Study 4 delved deeper into how self-presentation concerns influenced perceptions of a clearly identified anthropomorphized chatbot versus a human agent. Participants were asked to imagine purchasing a personal lubricant and rated the chatbot or human agent on perceived experience (the capacity to feel emotions and have consciousness).

Those with higher self-presentation concerns ascribed more experience to the anthropomorphized chatbot, despite knowing it was not human. This finding suggested that anthropomorphism introduces ambiguity about a chatbot’s human-like qualities, affecting consumer comfort levels.

Studies 5a and 5b involved real interactions with chatbots. In Study 5a, 386 undergraduate students were asked to choose between two online stores, one with a human service agent and one with a chatbot, for purchasing either antidiarrheal or hay fever medication. Participants preferred the chatbot store for the embarrassing product (antidiarrheal medication) and the human store for the non-embarrassing product (hay fever medication). This choice was mediated by feelings of embarrassment, as indicated by participants’ spontaneous explanations.

Study 5b involved 595 participants interacting with a real chatbot about skincare concerns. Participants were more willing to provide their email addresses to the chatbot than to a human agent, a behavior mediated by reduced feelings of embarrassment when interacting with the chatbot.

“In general, research shows people would rather interact with a human customer service agent than a chatbot,” said Jin, who led the study as a doctoral student at Ohio State’s Fisher College of Business. “But we found that when people are worried about others judging them, that tendency reverses and they would rather interact with a chatbot because they feel less embarrassed dealing with a chatbot than a human.”

While the study offers valuable insights, it has some limitations. The reliance on self-reported measures and hypothetical scenarios in some of the studies may not fully capture real-world behavior. Additionally, the research focused mainly on specific embarrassing product categories, so the results may not generalize to all types of products or services.

Nevertheless, the findings have practical implications. Companies should take them into account when designing customer service strategies, especially for products that might make consumers feel self-conscious. By clearly identifying chatbots and avoiding excessive anthropomorphism, businesses can improve customer comfort and engagement.

“Chatbots are becoming more and more common as customer service agents, and companies are not required in most states to disclose if they use them,” said co-author Rebecca Walker Reczek, a professor at Ohio State’s Fisher College. “But it may be important for companies to let consumers know if they’re dealing with a chatbot.”

The study, “Avoiding embarrassment online: Response to and inferences about chatbots when purchases activate self-presentation concerns,” was authored by Jianna Jin, Jesse Walker, and Rebecca Walker Reczek.


(c) PsyPost Media Inc
