
Study on fighter pilots and drone swarms sheds light on the dynamics of trust within human-machine teams

by Eric W. Dolan
February 25, 2024
in Artificial Intelligence, Aviation Psychology and Human Factors
(U.S. Air Force photo/R. Nial Bradshaw)


In a new study published in the Journal of Cognitive Engineering and Decision Making, researchers from the U.S. Air Force, Leidos, and Booz Allen Hamilton have taken a significant leap in understanding the dynamics of trust within human-machine teams, specifically in the context of military operations involving unmanned aerial vehicles (UAVs).

This research illuminates how fighter pilots’ trust in one component of a technological system can affect their trust in the system as a whole—a phenomenon known as the pull-down effect. Crucially, the study finds that experienced pilots can differentiate between reliable and unreliable UAVs, suggesting that the pull-down effect can be mitigated, thereby enhancing mission performance and reducing cognitive workload.

The trust between humans and machines is a pivotal factor in the successful deployment of autonomous systems, especially in high-stakes environments like military operations. Trust is considered a cornerstone of effective human-machine interaction, influencing operators’ reliance on technology.

Prior research has shown that trust in automation is directly linked to system reliability. However, when multiple autonomous systems are involved, as in the case of UAV swarms, judging reliability becomes significantly more complex. This complexity introduces the risk of the pull-down effect, where trust in all system components is reduced due to the unreliability of a single element.

“I was interested in conducting this research because the pull-down effect is a phenomenon based in heuristic responding that has relevance to the U.S. Air Force,” explained study author Joseph B. Lyons, the senior scientist for Human-Machine Teaming at the Air Force Research Laboratory and co-editor of Trust in Human-Robot Interaction.

“In the Air Force, we need to understand how humans respond to human-machine interactions and, in this case, if perceptions of one technology can propagate to others, that is something we need to account for when fielding novel technologies. Also, this is a topic that has only been done in laboratory settings, so it was not clear if the observed effects would translate into more Air Force relevant tasks with actual operators.”

To investigate this phenomenon, the researchers employed a highly immersive cockpit simulator to create a realistic operational environment for the participants. The study involved thirteen experienced fighter pilots, including both retired and currently active pilots, with a wealth of flying hours in 4th and 5th generation Air Force Fighter platforms, such as the F-16 and F-35.

Participants flew a series of six flight scenarios: four included an unreliable UAV that exhibited errors, while the other two featured only perfectly reliable UAVs. Each pilot encountered 24 UAV observations in total, with a mix of reliable and unreliable UAVs designed to closely simulate real-world operational conditions.


The simulation environment was designed to reflect the complexities of managing UAV swarms, requiring pilots to monitor and control multiple UAVs (referred to as Collaborative Combat Aircraft or CCAs) simultaneously. The scenarios tasked pilots with monitoring four CCAs for errors, communicating any unusual behaviors, and selecting one CCA for a mission-critical strike on a ground target, all while managing the cognitive workload and maintaining situational awareness.

Contrary to what might have been expected based on previous research, the study revealed that the presence of an unreliable UAV did not significantly diminish the trust that experienced fighter pilots placed in other, reliable UAVs within the same system. This suggests that experienced operators, such as the fighter pilots participating in this study, are capable of nuanced trust evaluations, effectively distinguishing between the reliability of individual system components.

This finding challenges the assumption underpinning the pull-down effect — that the unreliability of one component can tarnish operators’ trust in the entire system. Instead, the pilots demonstrated what can be described as a component-specific trust strategy, suggesting that their expertise and familiarity with operational contexts enable them to make more discerning judgments about technology.

Moreover, the study found a significant increase in cognitive workload associated with the unreliable UAV compared to the reliable ones. This was an expected outcome, logically aligning with the notion that dealing with unreliable system components requires more mental effort and monitoring from human operators.

Yet, the researchers observed that higher trust in UAVs corresponded with lower reported cognitive workload, hinting at the potential for trust to mitigate the cognitive demands placed on operators by unreliable technology.

“After reading about this study, people should take away a couple things,” Lyons told PsyPost. “First, we found no evidence that negative experiences with one technology contaminate perceptions of other similar technologies in realistic scenarios with actual operators (in this case fighter pilots). While this is interesting, it also represents one study and thus requires replication in other settings. Second, people should take away the idea that theories and concepts should be tested in realistic domains with non-student samples wherever possible.”

Interestingly, despite introducing heterogeneity in the UAV systems to see if this could mitigate the pull-down effect — through different naming schemes and suggested capability differences — the study did not find significant evidence that such measures influenced the pilots’ trust evaluations. This result suggests that the pilots’ ability to maintain specific trust towards reliable components was not necessarily enhanced by these attempts at system differentiation.

“I was surprised that our manipulation of different asset types did not seem to have any bearing on the pilots’ attitudes or behaviors,” Lyons said. “However, and as noted in the manuscript, it is possible that the scenario did not pull out the need (or affordance) for this asset heterogeneity as much as it should have. I think this is an area that is ripe for additional research.”

However, the study is not without its limitations. The small sample size and the specific context of military aviation may limit the generalizability of the findings. Furthermore, the researchers acknowledge the need for future studies to explore the mitigating effects of factors such as system heterogeneity and operator training on the pull-down effect.

“There are always caveats with any scientific work,” Lyons said. “This was just one study, with a pretty small sample size. These findings need to be replicated across other samples and other domains of interest. Never put all of your eggs into the basket of just one study.”

This research opens up new avenues for enhancing the design and deployment of autonomous systems in military operations, ensuring that trust calibration is finely tuned to the demands of high-stakes environments.

“Within the Air Force, we seek to understand how to build effective human-machine teams,” Lyons told PsyPost. “A significant part of that challenge resides in understanding why, when, and how humans form, maintain, and repair trust perceptions with advanced technologies. The types of technologies we care about are diverse, and we care about the gamut of Airmen and Guardians across the Air and Space Force.”

“I feel that the heuristics we use in making trust-related judgements are underestimated and underrepresented in the literature,” he added. “This is a great topic for academia to advance our collective knowledge.”

The study, “Is the Pull-Down Effect Overstated? An Examination of Trust Propagation Among Fighter Pilots in a High-Fidelity Simulation,” was authored by Joseph B. Lyons, Janine D. Mator, Tony Orr, Gene M. Alarcon, and Kristen Barrera.

