A new study published in PLOS One sheds light on how well people detect health-related disinformation on social media. The research suggests that people who enjoy thinking critically and analytically tend to be better at identifying false or misleading content. Political affiliation also played a role in some cases, with liberals tending to perform better than conservatives at evaluating politically charged posts. But across the board, the trait known as “need for cognition” was the strongest predictor of successfully spotting disinformation.
The study was motivated by ongoing concern about the spread of false health information online. In recent years, social media platforms have enabled widespread exposure to disinformation, including dangerous claims about vaccines, alternative cures, and public health guidance. While misinformation can be shared innocently, disinformation refers to content that is intentionally deceptive.
Acting on false health claims can have serious consequences, such as avoiding effective treatments or embracing harmful remedies. During the COVID-19 pandemic, disinformation campaigns contributed to poor health decisions, with some estimates linking these beliefs to thousands of preventable deaths.
Against this backdrop, researchers have become increasingly interested in why some people fall for disinformation while others do not. Past studies have pointed to a range of factors, including political beliefs, cognitive style, and personality traits. This new study aimed to determine whether the need for cognition or political affiliation was a more consistent predictor of a person’s ability to detect false claims about health.
“In 2022, I conducted two studies about detecting disinformation in health-related social media posts. The studies featured student participants, and I wanted to extend the work to a larger, general population,” said study author Joey F. George, distinguished professor emeritus at Iowa State University.
“At the same time, there was a debate in the literature about the key factors that affected detection success, specifically need for cognition and political affiliation. I wanted to add to this debate by comparing the roles of need for cognition and political affiliation in disinformation detection, using a large U.S.-based sample.”
To explore this question, the researcher recruited 508 American adults through a survey panel. The participants were shown 10 different social media posts, each making a claim related to health. Some posts were accurate, while others contained false or misleading information. Sixty percent of the posts were considered disinformation. After viewing each post, participants were asked to judge whether it was honest or dishonest and explain the reasoning behind their decision by selecting from a list of possible justifications.
George also collected background data on each participant, including political affiliation, education, income, and gender. In addition, participants completed a short questionnaire to assess their level of need for cognition. This trait refers to how much a person enjoys and engages in analytical thinking. People high in need for cognition tend to prefer complex problems and deliberate reasoning, while those low in this trait may rely more on intuition or surface-level cues.
Each participant was randomly assigned to view one of two versions of the survey, with each version containing a different mix of 10 posts. The posts covered a range of topics, including COVID-19 treatments, vaccines, dieting strategies, and alternative cures. Some featured content from government agencies like the Food and Drug Administration, while others cited public figures or anonymous sources.
Participants identified false health claims with moderate success, correctly flagging disinformation about two-thirds of the time on average. This is better than chance and higher than the success rates typically reported in deception detection studies.
“I was surprised that detection success rates were as high as they were, as the consensus finding in the deception literature is that people successfully detect deception only about 54% of the time,” George told PsyPost.
However, the results also revealed significant variation depending on cognitive style and political leanings.
The most consistent predictor of success across the entire dataset was need for cognition. Participants who scored higher on this trait were significantly better at identifying which posts were false. This pattern held true regardless of the participant’s political affiliation, age, education, or gender. Statistical tests showed that people with a high need for cognition had success rates around 70 percent, while those with a lower need for cognition performed closer to 60 percent.
Political affiliation also played a role, but only in certain situations. For half of the posts, neither political identity nor cognitive style made much difference in how people responded. For the other half, need for cognition was the key factor in most cases, while political affiliation mattered for only a few. The posts most strongly influenced by political beliefs tended to involve COVID-19, especially claims made by political figures or warnings issued by federal agencies.
“I found that both need for cognition and political affiliation were important to detecting disinformation in social media posts about health, but their importance varied by post,” George explained. “In 35% of the posts, those with a higher need for cognition were better at detecting disinformation. Politics played no role. In 15% of the posts, those with conservative politics were worse at detection. These were political posts about COVID-19 vaccines, hydroxychloroquine, and ivermectin. Need for cognition played no role. For the rest of the posts, neither factor was important.”
For example, one post falsely claimed that giving children the COVID-19 vaccine amounted to a government experiment, citing former Housing Secretary Ben Carson. Conservatives were more likely to believe this claim, while liberals were more likely to reject it. In contrast, when shown legitimate warnings from the Food and Drug Administration about the dangers of unapproved COVID-19 treatments like ivermectin or hydroxychloroquine, conservatives were more likely to dismiss the warnings as false.
These results indicate that partisanship may influence how people interpret politically charged health information, especially when it involves trusted figures or controversial treatments. But when the posts did not touch on political hot-button issues, political identity had little to no effect. A higher need for cognition, by comparison, was consistently linked with better performance in detecting false information across a range of topics.
“I was not surprised that political affiliation seemed to get in the way of successful detection for conservative participants, but I was surprised that neither political affiliation nor need for cognition mattered at all for half of the posts I used,” George said.
Participants who were better at identifying false claims often cited specific reasons for their judgments. These included noticing that the post made unsubstantiated claims, that the source lacked credentials, or that it came from an unknown or random person. In contrast, participants who mistakenly identified false content as true often based their trust on the presence of a familiar photo or the perceived credibility of a well-known figure, even when the claim itself lacked evidence.
The findings provide support for the idea that encouraging people to think more analytically could help combat disinformation. Since the need for cognition is related to how much someone enjoys thinking critically, it may be possible to foster this trait through education, critical thinking exercises, or other forms of cognitive engagement. While political beliefs are less easily changed, cognitive habits may be more malleable over time.
“Overall, participants successfully detected disinformation two-thirds of the time,” George told PsyPost. “The major takeaway is that people seem to do pretty well at detecting disinformation about health, but how well they do depends on the post they are evaluating and on their own personalities and beliefs.”
Like all research, this study has some limitations. The sample was restricted to American adults who were members of an online research panel, which may not fully represent the broader population. The posts used in the study were drawn from a specific period in late 2022, and many focused on the COVID-19 pandemic. As public attention shifts to other health issues, the kinds of disinformation circulating online are likely to evolve.
Future studies could expand this research by including participants from other countries and cultural backgrounds. They could also develop new sets of posts that reflect current health concerns and misinformation trends. Another promising direction would be to explore how people with chronic health conditions engage with health-related content, as they may be more vulnerable to disinformation or more motivated to seek accurate information.
George has retired and is not planning additional research projects. However, the findings offer clear avenues for future work by other researchers, including potential interventions aimed at boosting cognitive engagement or exploring how artificial intelligence might help flag deceptive content in real time.
The study, “Political affiliation or need for cognition? It depends on the post: Comparing key factors related to detecting health disinformation in the U.S.,” was authored by Joey F. George and published August 26, 2025.