Using AI to verify human advice could damage your professional relationships

by Bianca Setionago
March 17, 2026
in Artificial Intelligence
New research published in Computers in Human Behavior suggests that consulting artificial intelligence (AI) for advice may unintentionally strain relationships with human professionals.

AI tools are rapidly becoming part of everyday decision-making, promising quick answers, personalized guidance, and lower costs. Many people use these tools alongside human professionals to double-check information or get a second opinion.

Previous studies have demonstrated that human advisors sometimes react negatively when clients consult multiple experts. In those situations, advisors may interpret the search for a second opinion as a lack of trust. Yet until recently, little attention had been given to how advisors respond when the second opinion originates from a computer algorithm rather than another person.

Hence, researchers Gerri Spassova (Monash University, Australia) and Mauricio Palmeira (University of South Florida, USA) set out to explore how human advisors react when clients consult AI in addition to seeking professional advice.

To investigate, the pair conducted four experiments, each involving roughly 180 to 300 adult participants. In the first experiment, the participants had real-world advisory experience. In the subsequent three studies, participants were drawn from the general adult population and asked to imagine working in advisory roles such as travel planning, finance, and nutrition. All participants read scenarios in which they had already provided professional advice to a client.

For example, in one experiment, financial advisors were told that after receiving their investment recommendation, the client also sought advice from either another human financial advisor or an artificial intelligence system. The advisors then rated how motivated they felt about the situation and whether it affected their willingness to continue working with the client.

Across all four studies, a clear pattern emerged: advisors were noticeably less motivated to work with clients who had also consulted AI. In fact, the negative reaction was stronger than when clients consulted another human advisor.

The researchers suggested that the negative response is rooted in professional identity. Advisors often view AI systems as far less capable than trained professionals, so when clients treat an AI tool as a source of advice comparable to a human expert, the comparison can feel insulting.


The study also uncovered another surprising effect: advisors tended to judge clients who used AI more negatively. Participants rated those clients as less competent and less warm compared to clients who sought advice from another human expert.

Importantly, the negative reaction persisted even when the AI system was used only for initial background information (rather than a final decision), or as a complementary service (rather than a replacement for the human expert’s advice). In other words, simply checking an AI tool could be enough to change how advisors view their clients.

“Our findings suggest that learning that the client consults AI may, consciously or not, change how the advisor perceives the client and how much effort they are willing to invest in the relationship. Such negative effects, even if subtle, could, in the long run, undermine the advisor’s relationship with the client and potentially result in missed opportunities,” Spassova and Palmeira concluded.

However, the study has several limitations. The research relied heavily on experimental role-playing scenarios rather than real-world advisory relationships, meaning reactions may differ in practice. Additionally, it remains unclear whether these negative responses persist, diminish, or disappear entirely in longer-term advisor-client relationships, particularly when the advisor knows the client well.

The study, “Offended by the Algorithm: The Hidden Interpersonal Costs of Clients Seeking AI Second Opinion,” was authored by Gerri Spassova and Mauricio Palmeira.

(c) PsyPost Media Inc