
News chatbots that present multiple viewpoints tend to earn the trust of conspiracy believers

by Eric W. Dolan
March 20, 2026
in Artificial Intelligence, Conspiracy Theories
[Adobe Stock]

A recent study published in the journal Computers in Human Behavior suggests that automated news chatbots programmed to deliver balanced viewpoints can earn the trust of people with varying ideological backgrounds. The research provides evidence that individuals who hold strong conspiracy beliefs tend to respond well to these chatbots, viewing them as useful tools for reading diverse news. These findings point to new ways technology might help pierce information bubbles and reduce societal division by exposing people to multiple perspectives.

In recent years, generative artificial intelligence has transformed how people interact with information online. These systems process massive amounts of text and produce human-like responses, and news chatbots rely on the same technology to act as automated conversational agents. Such programs let users browse topics and receive real-time text summaries of news articles in a chat window.

The authors behind the new study wanted to see if these chatbots could help solve a growing problem in the modern media landscape. People often engage in selective exposure, meaning they only click on news that matches their existing beliefs. Over time, this habit creates an echo chamber, which tends to increase political and social polarization.

When people are only exposed to one side of a story, they often become defensive or dismissive of alternative viewpoints. The scientists wanted to know if a neutral, automated chatbot could encourage people to step outside their comfort zones. They suspected that people might view a machine as more objective than a human journalist.

“People who believe in conspiracy theories tend to distrust mainstream media, seeing it as biased or agenda-driven,” said study author Shreya Dubey (@sdubey03), a postdoctoral researcher in the Amsterdam School of Communication Research at the University of Amsterdam.

“We wanted to test whether a chatbot, which may be perceived as more neutral than a traditional news outlet, might be better received by this group. We designed a chatbot that presented both mainstream and alternative news articles, then looked at whether conspiracy believers were more willing to trust and use it compared to people who don’t hold such beliefs.”

Specifically, the scientists developed a custom chatbot named Infobot. This program was designed to present users with eight different news headlines about climate change.

Four of the headlines represented mainstream scientific perspectives supporting climate action. The other four headlines represented alternative viewpoints, including arguments against climate action and narratives that framed climate change as a hoax. Users could scroll through the headlines and click on any article to read a brief summary generated by the chatbot.


After a participant read a summary, that article disappeared from the list, prompting them to choose another. The software tracked which articles the users selected and how much time they spent reading them. In the first study, the scientists recruited a sample of 177 adult residents of the United States.

They split these participants into two groups based on their responses to a questionnaire about general conspiracy theories. The final sample included 93 individuals with low generic conspiracy beliefs and 84 individuals with high generic conspiracy beliefs. Participants were instructed to interact with Infobot and read at least four article summaries.

Afterward, they answered survey questions rating the chatbot on its ease of use, perceived usefulness, and potential risks. They also rated their overall trust in the program, their general attitude toward it, and their intention to use such a tool in the future. The data showed that participants who found the chatbot useful and trustworthy tended to have a positive attitude toward it.

This positive attitude directly predicted their intention to use news chatbots again. Unexpectedly, the scientists found that people with high generic conspiracy beliefs trusted the chatbot more than those with low conspiracy beliefs. The high-belief group also reported a more positive attitude and a greater intention to use the program in the future.

Both groups read a similar number of mainstream and alternative articles. However, the software revealed that individuals with higher conspiracy beliefs spent significantly less time actually reading the mainstream summaries compared to the alternative ones. The researchers noticed a potential flaw in their first study.

They had grouped people based on their belief in general conspiracies, rather than their specific beliefs about climate change. In fact, the two groups did not significantly differ in their actual belief in human-caused global warming. To fix this, the scientists conducted a second study.

For the second study, the researchers recruited 58 participants. This time, they specifically screened for beliefs about climate change. The sample included 35 individuals with low climate change conspiracy beliefs and 23 individuals with high climate change conspiracy beliefs.

The procedure was nearly identical to the first experiment. However, participants had to enter a special code from the chatbot to prove they had paid attention to the summaries. The second study replicated the findings of the first.

Once again, trust and perceived usefulness predicted a positive attitude toward the chatbot. Participants with high climate change conspiracy beliefs trusted the chatbot more and showed a greater intention to use it than those with low conspiracy beliefs. The scientists noted that both groups generally responded positively to the program, but the high-belief group was consistently more enthusiastic.

The researchers suspect this happens because individuals with strong conspiracy beliefs often feel that mainstream media is biased against them. Because the chatbot presented their alternative views on equal footing with mainstream science, they likely viewed the machine as a fair and unbiased source of information.

“Most of us, regardless of our beliefs, tend to think we’ve formed our opinions objectively and from good information,” Dubey told PsyPost. “Our findings suggest that a chatbot presenting multiple perspectives feels refreshingly balanced to people across the board, including those who distrust mainstream media.”

“But this raises an uncomfortable question: is balance always desirable? Climate change is not genuinely contested among scientists, yet our chatbot presented mainstream and alternative views side by side. While this approach made the tool widely accepted, it also risks creating a false equivalence. That is, giving fringe or misleading viewpoints the same weight as scientific consensus. The very feature that made our chatbot appealing could, if applied carelessly, end up legitimising misinformation.”

“So the real takeaway is a tension worth sitting with: tools that feel balanced and neutral may be our best shot at reaching people across ideological divides, but ‘balance’ on issues like climate change is not a neutral act in itself,” Dubey said.

While the findings offer hope for reducing polarization, the researchers noted several limitations. First, the studies only compared people at the extreme ends of the conspiracy belief spectrum. Individuals with moderate beliefs were excluded from the main analysis, which means the results might not represent the entire population. Second, the participants only interacted with the chatbot one time in a controlled survey environment.

It is unclear if their positive attitudes would persist after repeated use over weeks or months. It also remains to be seen if people would voluntarily choose to use a balanced news chatbot in the real world when they have access to highly personalized social media feeds.

Future research should investigate exactly which features of the chatbot make it appealing to different groups. Scientists could also explore whether giving users some control over the ratio of mainstream to alternative news might increase their willingness to engage with opposing viewpoints.

The study, “Investigating perceived trust and utility of balanced news chatbots among individuals with varying conspiracy beliefs,” was authored by Shreya Dubey, Paul E. Ketelaar, Tilman Dingler, Hannah K. Peetz, and Hein T. van Schie.

(c) PsyPost Media Inc