Explanatory reflection reduces pseudo-profound bullshit receptivity, study finds

by Mane Kara-Yakoubian
January 25, 2024
in Cognitive Science
(Photo credit: Adobe Stock)

A series of three studies published in Applied Cognitive Psychology found that explanatory reflection—that is, critically thinking about, and being able to explain to others, one’s reasons for particular beliefs or behaviors—reduced receptivity to pseudo-profound bullshit, but had no effect on receptivity to scientific bullshit or fake news.

“We had previously published a paper demonstrating that some people have a ‘bullshit blind spot,’ meaning that they not only don’t realize how bad they are at falling for bullshit, they also mistakenly believe they’re better at spotting it than the average person,” said study author Shane Littrell, PhD (@MetacogniShane), a Postdoctoral Fellow at the Munk School of Global Affairs and Public Policy at the University of Toronto.

“There is also some previous work by other researchers that claims that people fall for bullshit and certain types of misinformation largely because they fail to engage in reflective, analytic thinking when exposed to the BS. Unfortunately, most of this work is based on correlations rather than experimental evidence.”

“We wondered if people in general, but especially people with a ‘bullshit blind spot,’ could use analytic thinking to improve their ability to detect and reject bullshit. So, our goal was to experimentally put this to the test and find out whether getting people to engage in reflective, analytic thinking when evaluating bullshit and fake news would decrease how persuasive or convincing it is to them.”

“There’s other work that’s shown that misinformation is more persuasive and convincing when it comes from perceived authoritative sources like spiritual or religious leaders, politicians, celebrities, etc. This is called ‘the guru effect.’ So, we wanted to find out if getting people to engage in reflective thinking could also counteract ‘guru effects’ that might occur when bullshit comes from purported experts.”

Study 1 included 136 participants from the US and Canada, recruited via Amazon’s Mechanical Turk. They rated the profoundness of 10 statements (five pseudo-profound statements and five popular motivational quotes) on a 5-point scale. Next, they completed an explanatory reflection task, writing explanations of why they found each statement profound or not. They then re-rated the profoundness of each statement.

Study 2 involved two separate experiments (2a and 2b), with final samples of 143 participants in Experiment 2a and 138 in Experiment 2b. The procedure mirrored that of Study 1, with the addition of scientific statements and the use of a 7-point rating scale. In Experiment 2a, participants rated the profoundness of pseudo-profound and motivational statements, while in Experiment 2b they rated the truthfulness of scientific bullshit and real scientific statements.

Study 3 also comprised two experiments (3a and 3b), with final samples of 130 and 112 participants, respectively. Participants rated the accuracy of fake and real news headlines on a 7-point scale, before and after the same reflection task used in the previous studies. The fake news headlines were derived from Snopes.com.

Across the studies, “half of the participants rated these statements as if they came from anonymous sources. The statements that the other half of the participants rated were attributed to various expert or authoritative sources, such as famous political or cultural leaders (for the pseudo-profound bullshit), famous scientists (for the pseudo-scientific bullshit), or mainstream news sources (for the fake news headlines),” Littrell explained.
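To make the pre/post design concrete, here is a minimal sketch in Python of how the change in profoundness ratings could be summarized by source condition. The numbers, the participants list, and the mean_change helper are entirely invented for illustration; this is not the authors’ analysis code or data.

```python
# Illustrative sketch only: summarizing hypothetical pre- vs. post-reflection
# profoundness ratings (5-point scale, as in Study 1) by source condition.
from statistics import mean

# Each (made-up) participant rates five statements before and after the
# explanatory reflection task; "source" marks the attribution condition.
participants = [
    {"source": "anonymous",  "pre": [4, 3, 5, 2, 4], "post": [2, 2, 3, 1, 3]},
    {"source": "attributed", "pre": [4, 4, 5, 3, 4], "post": [4, 3, 4, 3, 4]},
    {"source": "anonymous",  "pre": [3, 3, 4, 2, 3], "post": [2, 2, 2, 1, 2]},
    {"source": "attributed", "pre": [5, 4, 4, 3, 5], "post": [4, 4, 4, 3, 4]},
]

def mean_change(group):
    """Average post-minus-pre rating change across participants in a condition."""
    changes = [mean(p["post"]) - mean(p["pre"]) for p in group]
    return mean(changes)

for condition in ("anonymous", "attributed"):
    group = [p for p in participants if p["source"] == condition]
    print(f"{condition}: mean change in profoundness = {mean_change(group):+.2f}")
```

The published studies report inferential tests across statement types and source conditions; the sketch only illustrates the shape of the pre/post data the design produces.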

“As far as the results go, I guess you could say that I’ve got good news and bad news,” Littrell told PsyPost.

“The good news is that, when a person is exposed to certain types of bullshit claims and other misinformation, if they stop to reflect and analytically think about it, this can help decrease how persuasive and appealing the bullshit is. By doing so, the person might realize that there are certain aspects of the claim that seem sketchy, which might prompt them to doubt it and seek out fact-checks from a reliable source. Or, they might already have relevant knowledge that helps them realize that the claim is false or at least not as strongly supported as the person bullshitting them is trying to make it seem.”

“The bad news is that this may only work for bullshit or misinformation about topics the person already has background knowledge in, or for claims that are delivered in an unconvincing ‘bullshitty’ way that would alert the listener that something about it is fishy. For bullshit that involves more complex or technical topics (e.g., bullshit that invokes scientific concepts and jargon), or bullshit that would require some level of expertise or broad knowledge to spot (e.g., fake news headlines about current events), then ‘thinking harder’ or ‘thinking more critically’ about it may not have much of an effect. In such cases, if a person feels like they don’t know enough to judge the information (to either believe it or reject it), they may be more likely to base their level of belief on what a perceived expert says about it.”

“And, unfortunately, bullshit from perceived experts is usually perceived as more persuasive and convincing. And, equally as unfortunate, we also found that critically reflecting on pseudo-scientific bullshit and fake news that comes from perceived experts and authoritative sources wasn’t very effective at reducing how persuasive or appealing the BS was. In fact, in some cases, pseudo-scientific bullshit and fake news from experts was nearly as persuasive and convincing as truthful statements from an anonymous source, even after people critically reflected on it. This finding was unsettling to me, to say the least, and highlights just how important it is to hold experts, authorities, and leaders accountable for the accuracy of what they say, given how much power they possess to irresponsibly misinform and mislead the public.”

I asked Littrell what questions still need answering. He said, “Well, the type of reflective, analytic thinking we had people engage in when evaluating the bullshit is called ‘explanatory reflection.’ Basically, we asked them to explain, in as much detail as possible, what the bullshit/headline meant and why they rated it the way that they did. It could be that there are ways of analytically reflecting on bullshit and misinformation that would be more effective at undermining it.”

“For instance, if we asked people to think about counterarguments (e.g., why it might be wrong), or to focus specifically on sources of evidence for the claims, etc., these types of approaches might yield different results. Also, we used politically neutral bullshit and fake news. There’s a lot of research showing that belief in some types of misinformation is deeply rooted in political partisanship and ideology, so future work should test the effects of different types of reflective, analytic thinking on the persuasiveness of politically and/or ideologically charged bullshit to see if the results are different.”

“I think our results highlight the fact there’s no ‘one-size-fits-all’ solution to reducing the influence of bullshit and other forms of misinformation. Whether it’s bullshit posts on social media, biased media coverage, flashy consumer marketing, or just someone we know bullshitting us, our ability to detect it and reject it depends on several factors – including what type of bullshit it is – and sometimes these factors are outside our control.”

The researcher reflected, “It’s impossible to know everything about everything and sometimes we might trust the wrong people. But Carl Sagan shared a great Latin proverb in ‘The Demon-Haunted World’ that’s stuck with me for years: ubi dubium, ibi libertas, which means ‘where there is doubt, there is freedom.’ So, a great first step to reducing our chances of being misled, and the advice I give everyone, is to practice being more intellectually humble in our day-to-day lives; be more attentive to and skeptical of the information we’re exposed to. As I often say, ‘what if I’m wrong?’ should be the loudest thing the little voice in the back of your head shouts at you every day.”

The research, “Not all bullshit pondered is tossed: Reflection decreases receptivity to some types of misleading information but not others”, was authored by Shane Littrell, Ethan A. Meyers, and Jonathan A. Fugelsang.
