PsyPost
The latest psychology and neuroscience discoveries.

New psychology research reveals the “bullshit blind spot”

by Mane Kara-Yakoubian
May 31, 2023
in Cognitive Science, Social Psychology

Is there a bullshit blind spot? A pair of recent studies found that the people who were worst at detecting bullshit grossly overestimated their detection ability: they not only believed they were better at spotting BS than they actually were, but also believed they were better at it than the average person.

At the same time, those who were best at detecting BS not only underestimated their own performance but also believed that they were slightly worse at detecting BS than the average person. This research was published in Thinking & Reasoning.

“Broadly, I’m interested in figuring out why relatively smart people believe dumb things (and I include myself sometimes in that category!). So, this includes trying to understand what characteristics are common among people who fall for misinformation as well as what characteristics are common in the misleading messages that make them appealing and persuasive to some people (such as the features of the message itself, how it is delivered, etc.),” said Shane Littrell, PhD (@MetacogniShane), a postdoctoral research associate at the University of Miami.

“My co-authors and I recently published a study examining whether people who spread misinformation are also more likely to fall for it – that is, whether one can ‘bullshit a bullshitter’ (open-access version) – and one of the main implications of that work suggests that people who intentionally spread misinformation in some situations can also unintentionally spread it without realizing it in other situations. To me, this seemed to suggest that some people who knowingly spread bullshit are unaware of the fact that they often fall for it themselves, possibly because they think they’re better at detecting it than everyone else.”

“And, on a certain level, that makes intuitive sense. A con man might not think he can be conned because he ‘knows all the tricks,’ so to speak. So, our next set of studies set out to test that idea by examining how confident people who fall for bullshit are in their own bullshit detection skills, and what cognitive processes they use when they evaluate misleading information.”

Across two studies, the researchers recruited 412 participants to examine the link between bullshit detection, overconfidence in one’s abilities, and the perceived thinking processes people engage in when they encounter and evaluate potentially misleading information. In Study 1, the bullshit detection task involved rating 20 statements as profound or not profound.

Half of the statements were real quotes from famous public figures that are typically judged to be profound (e.g., “A river cuts through a rock, not because of its power but its persistence”). The other half were randomly generated by an algorithm to have proper grammatical structure but also be nonsensical and inherently meaningless (e.g., “Wholeness quiets infinite phenomena”).

A bullshit detection score was derived for each person based on the number of real (profound) and fake (not profound) statements that they were able to correctly classify. Participants also estimated their own performance as well as others’ performance on this task, which provided confidence metrics. They also provided a confidence rating for their bullshit detection ability in general.
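The scoring logic described above can be sketched in a few lines. This is an illustrative simplification, not the paper's exact scoring procedure: it assumes the detection score is simply the number of the 20 statements classified correctly, and that overconfidence is the gap between a participant's estimated and actual number correct. The participant data below are invented.

```python
def detection_score(ratings, labels):
    """Count correct classifications.

    ratings: participant's judgments, True = "profound", False = "not profound".
    labels:  ground truth, True = real quote, False = pseudo-profound bullshit.
    A statement is classified correctly when the judgment matches the label.
    """
    return sum(r == l for r, l in zip(ratings, labels))


def miscalibration(estimated_correct, actual_correct):
    """Positive values indicate overconfidence, negative values underconfidence."""
    return estimated_correct - actual_correct


# Hypothetical participant: 10 real quotes followed by 10 bullshit statements.
labels = [True] * 10 + [False] * 10
# This participant rates all real quotes as profound, but also falls for
# 6 of the 10 bullshit statements.
ratings = [True] * 10 + [True] * 6 + [False] * 4

score = detection_score(ratings, labels)
print(score)                                        # 14 of 20 correct
print(miscalibration(estimated_correct=18, actual_correct=score))  # overconfident by 4
```

Under this simplified scheme, a "blind spot" participant is one with a low score and a large positive miscalibration, while "blindsight" corresponds to a high score with a negative miscalibration.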


Past research has suggested that some people fall for bullshit because they are more likely to rely on fast, intuitive thinking rather than slower, reflective thinking. Thus, to test whether this is true, Study 2 examined individual differences in the types of thinking processes people perceive that they engage in when trying to detect bullshit. In other words, did participants feel that they were able to spot bullshit immediately or did they need to reflect on it before making a determination?

To find out, the researchers had participants complete measures assessing their perceptions of the thinking processes they used when evaluating potentially misleading information (i.e., intuitive versus reflective thinking). To check that the perceived speed of their thinking (faster intuition vs. slower reflection) aligned with the actual speed of their evaluations, participants’ subjective ratings were compared with an objective measure of evaluation speed (time spent evaluating statements); the two were positively correlated.

Overall, Study 2 found that both intuitive and reflective thinking processes are involved in detecting – and falling for – bullshit, rather than one particular thinking process being dominant.

“Our main finding was that the people who are the most susceptible to falling for bullshit are not only very overconfident in their ability to detect it, but they also think that they’re better at detecting it than the average person. This applied whether they evaluated the BS quickly/intuitively or spent more time reflecting on it,” said Littrell.

“This is kind of a double-whammy in terms of bullshit susceptibility that we call the ‘bullshit blind spot.’ The other interesting finding was that the people who are best at detecting BS are actually underconfident in their detection skills and think they’re worse at it than the average person (i.e., they have a bullshit ‘blindsight’),” he added.

“It’s objectively worse to be a person who is not only bad at spotting BS but thinks they’re awesome at it than it is to be a person who is good at spotting BS but underconfident at it. So, I think the most important thing to take away from our findings is that everyone would be better off practicing more intellectual humility and skepticism. This is tough for most people, because we all like to believe that we’re smart, and in control of what we think and believe, and that we aren’t easily fooled. Unfortunately, many people who believe this are quite wrong.”

With regard to study limitations, the researcher explained that the pseudo-profound bullshit stimuli used in this work were of the kind one might encounter in conversation, on social media, or from the self-help and inspirational guru industries.

“It could be that people evaluate or otherwise react to bullshit in other types of contexts (e.g., organizational, consumer marketing) differently or that the effect sizes would be different. Past research suggests that our findings would probably generalize to other types of BS and misinformation, but that needs to be empirically tested for us to be sure. Also, we used a Western, English-speaking sample of participants, so we can’t draw any firm conclusions on whether these results would replicate in other types of cultures and languages.”

Are there other lessons we can take from this work? According to Dr. Littrell, “I think our findings underscore the simple truth that all of us not only can be fooled (some more than others), but all of us likely have been fooled at some point in our lives, either by misinformation on social media, biased media coverage, flashy consumer marketing, or even/especially by someone we know bullshitting us.”

“By being more intellectually humble in our day-to-day lives, we’ll be better prepared to resist bullshit and other misinformation by being more mindful of our own cognitive vulnerabilities which will hopefully encourage us to be more attentive to and skeptical of the information we’re exposed to. The phrase, ‘what if I’m wrong?’ can be an incredibly liberating and protective mantra to live by.”

The research, “Bullshit blind spots: the roles of miscalibration and information processing in bullshit detection”, was authored by Shane Littrell and Jonathan A. Fugelsang.

PsyPost is a psychology and neuroscience news website dedicated to reporting the latest research on human behavior, cognition, and society. (READ MORE...)

(c) PsyPost Media Inc
