
Groundbreaking psychology research sheds light on the trust dynamics of human-machine collectives

by Eric W. Dolan
June 20, 2023
in Artificial Intelligence, Social Psychology

A series of psychological experiments has shown that people treat machines differently than they treat humans, and that this difference shapes how trust is established within human-machine collectives. The new findings provide insight into the dynamics of human-bot interactions and have implications for understanding human behavior in the emerging context of AI systems. The research was published in Nature Communications.

The researchers were interested in how humans and bots interact in online communities and how social norms shape that behavior. They wanted to understand the challenges that arise in mixed human-bot collectives, the same challenges human societies have long faced, such as sustaining cooperation, curbing exploitation, and stabilizing norms.

“Ever since I was a child, I have always been fascinated by human-bot collectives and their societal implications, inspired by classic movies such as Terminator 2 and Blade Runner,” said study co-author Talal Rahwan, an associate professor of computer science at New York University Abu Dhabi.

“My fascination with these issues grew bigger after I received my PhD in Artificial Intelligence, but I felt that studying human-bot interactions should include the perspective of social scientists and psychologists. With this in mind, I sought out two collaborators, Kinga Makovi (a sociologist) and Jean-François Bonnefon (a psychologist), and together we developed the current study of cooperation and punishment in human-bot collectives.”

To conduct their study, the researchers performed a series of online experiments with a total of 7,917 participants. They created a stylized society (a simplified and artificial representation of a social system) in which participants could take on different roles: Beneficiaries, Helpers, Punishers, and Trustors. The participants played economic games with real financial consequences, which served as proxies for real-life interactions.
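To make the structure of these incentivized games concrete, here is a minimal simulation sketch in Python. The role names (Helper, Beneficiary, Punisher, Trustor) are taken from the study, but every probability, payoff, and the bot "attenuation" factor below are hypothetical values chosen only to illustrate the reported pattern, not the authors' actual design or data.

```python
# Minimal sketch of the stylized society described above.
# Role names come from the study; ALL numeric values (sharing rates,
# punishment rates, transfers, the bot attenuation factor) are
# hypothetical illustrations, not the authors' parameters.
import random
from dataclasses import dataclass

random.seed(42)  # reproducible illustration

@dataclass
class Agent:
    kind: str  # "human" or "bot", signaled to participants via labels and images

def helper_shares(beneficiary: Agent) -> bool:
    """Stage 1: a Helper decides whether to share an endowment with a Beneficiary."""
    return random.random() < 0.7  # assumed baseline sharing rate

def punisher_punishes(shared: bool, beneficiary: Agent) -> bool:
    """Stage 2: a Punisher may pay a cost to sanction a Helper who did not share.
    The study found that failing to share with a bot was punished less often."""
    if shared:
        return False
    rate = 0.6 if beneficiary.kind == "human" else 0.4  # assumed rates
    return random.random() < rate

def trustor_transfer(acted_prosocially: bool, beneficiary: Agent) -> float:
    """Stage 3: a Trustor, having observed the earlier stages, decides how much
    to entrust. Trust gains for 'doing the right thing' are attenuated,
    but not eliminated, when the target of the behavior was a bot."""
    base = 2.0                        # assumed baseline transfer
    gain = 3.0 if acted_prosocially else 0.0
    if beneficiary.kind == "bot":
        gain *= 0.5                   # assumed attenuation factor
    return base + gain

for target in (Agent("human"), Agent("bot")):
    shared = helper_shares(target)
    punished = punisher_punishes(shared, target)
    transfer = trustor_transfer(shared or punished, target)
    print(f"beneficiary={target.kind}: shared={shared}, "
          f"punished={punished}, trust transfer={transfer}")
```

With this particular seed both Helpers happen to share, and the Trustor's transfer in the bot round comes out smaller, mirroring the attenuated trust gains the study reports below.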

In all the experiments, the distinction between human and bot players was communicated to participants through textual labels (humans as “MTurk worker” and bots as “Bot”) and stylized images of people or robots. This information was displayed on the screen where participants made their choices.

Humans typically earn trust by sharing resources and by punishing those who fail to share, but the researchers observed that this trust-building effect was weaker when bots were involved. Sharing with or punishing bots did not generate as much trust as the same behaviors did in human-to-human interactions.

In other words, sharing resources with bots resulted in a smaller increase in trust compared to sharing with humans. Similarly, people who did not share with bot beneficiaries were less likely to be punished compared to those who did not share with humans. Bots also did not receive the same level of trust gain as humans when they shared resources. As a result, trust was not easily established in these mixed human-bot communities, which led to worse collective outcomes.

But the trust gains were not completely eliminated when interacting with bots. This suggests that people carried assumptions about social norms from human societies into these mixed communities, the researchers said.

“It is known that people can signal their trustworthiness to others by acting cooperatively, or by punishing those who do not cooperate with others. We show that the same holds in human-bot collectives, albeit to a lesser extent,” explained co-author Kinga Makovi, an assistant professor at New York University Abu Dhabi.

“More specifically, in our experiments, the trust gained by sharing resources with a bot was less than the trust gained when sharing with a fellow human. We saw something similar when it comes to punishing a bot as opposed to a human. Importantly though, the trust-gains for ‘doing the right thing’ (sharing or punishing those who do not share) were only attenuated, rather than eliminated, suggesting that people carry into human-bot societies similar assumptions about the social norms that they have long relied on within human societies.”

Additionally, the researchers found that when participants were informed about the high consensus regarding the norm of sharing, trust gains generally increased. This suggests that people may alter their behavior once they are made aware of social norms.

“Previous attempts to increase trust and cooperation between humans and bots often tried to make bots look more like humans, but this approach led to disappointing results,” noted co-author Jean-François Bonnefon, a research director at the Toulouse School of Economics.

“We show that there is a better approach: instead of trying to pull bots into the circle of trust by giving them a human appearance, you can nudge people to expand the circle of trust so that it reaches out to nonhuman bots. This is done by making them aware that social norms are shifting, and that many people are starting to think it is a good thing to cooperate with bots, even if they don’t yet realize it is a common opinion.”

The researchers noted that stylized societies with incentivized interactions are useful for studying human cooperation in the lab. This approach may be less suitable for studying human-bot cooperation, however, since bots have no use for money. Participants recognized that bots did not desire money but acted as if they did, possibly because prosocial behavior toward bots is seen as a signal to other humans.

“It is totally understandable that people can signal their trustworthiness by acting cooperatively with fellow humans, but it is surprising that they can also do so by acting cooperatively with bots, or by punishing bots who do not act cooperatively,” Rahwan told PsyPost. “After all, machines do not have emotions or needs, and so one could argue that it is perfectly fine not to share with a bot, or that it is pointless to punish a bot. Yet, our study shows otherwise.”

Like any study, the new research comes with some caveats. The sample consisted of online participants, primarily from Amazon Mechanical Turk, who tend to be younger, more educated, and more technologically savvy than the general population. The results may not generalize to older or less technologically savvy groups. In addition, the use of a stylized society allowed for experimental control but may yield different results in other contexts.

“Our conclusions are based on an experimental setup that is meant to capture a ‘stylized society’ of humans and bots,” Makovi said. “Will people act differently when facing bots in the field? This remains to be seen.”

“The paper would not have come to fruition without Wendi Li who was an undergraduate student at NYU Abu Dhabi when we started the study, and Anahit Sargsyan who supported the data collection and analysis of the multiple iterations,” Rahwan added.

The study, “Trust within human-machine collectives depends on the perceived consensus about cooperative norms”, was authored by Kinga Makovi, Anahit Sargsyan, Wendi Li, Jean-François Bonnefon, and Talal Rahwan.
