New research indicates that political biases are a more important predictor of misinformation susceptibility than the ability to distinguish between true and false information. The findings have been published in the Journal of Experimental Psychology: General.
“Misinformation represents one of the greatest challenges for the functioning of societies in the information age,” said study author Bertram Gawronski, a professor at the University of Texas at Austin.
“Because people base their decisions on information that is available to them, misinformation can lead to suboptimal decision outcomes even when the decision-process itself is perfectly rational from a normative point of view. Such misinformed decisions can affect not only the individual who made the decision, but society as a whole.”
The researchers conducted four preregistered experiments, which included 2,423 U.S. adults in total. They used Prolific Academic, a crowdsourcing platform that provides access to diverse samples of participants, to recruit individuals who identified as either Democrat or Republican.
Participants read a set of 60 news headlines. Some of the headlines were true and some were false, and some were biased toward Democrats and some toward Republicans. After reading each headline, participants were either asked to decide whether the headline was true or false, or asked whether they would consider sharing the story on social media.
In all four experiments, the participants exhibited higher truth sensitivity when they were asked to judge whether the headlines were true or false compared to when they were asked if they would share the headlines. “Put differently, although participants showed a considerable ability in distinguishing between true and false headlines when they were asked to judge the veracity of the headlines, sharing decisions were completely unaffected by actual information veracity,” the researchers explained.
Importantly, the researchers found that political bias was a stronger predictor of whether people believed false information than their ability to tell true information from false. This means that people who strongly identify with a particular political party are more likely to believe false information that supports their party’s views and less likely to believe information that goes against their party’s views. This was true both when participants were asked to judge the truth of news and when they were asked if they would share it online.
“Inability to distinguish between true and false information plays a surprisingly minor role for misinformation susceptibility,” Gawronski told PsyPost. “A much more important factor is the tendency to readily accept information that is congruent with one’s personal beliefs and to dismiss information that is incongruent with one’s personal beliefs.”
Self-perceived ability to recognize made-up news and cognitive reflection also played important roles.
In Experiment 1, participants were asked how good they thought they were at recognizing fake news compared to other Americans. Most participants rated themselves as better than average at recognizing fake news. But the study found that people who thought they were better at recognizing fake news also tended to show stronger political biases when deciding if news was true or false.
In Experiment 2, participants were asked to either react quickly to headlines (low-reflection) or take time to think about their answer (high-reflection). Those in the low-reflection condition had only 7 seconds to respond, while those in the high-reflection condition could take as much time as they wanted. Participants in the high-reflection condition were better at telling whether news was true or false. However, this improvement applied only to judgments of the news’ truth, not to decisions about whether to share it.
“The tendency to readily accept information that is congruent with one’s personal beliefs and to dismiss information that is incongruent with one’s personal beliefs is often assumed to be the product of motivated reasoning or wishful thinking,” Gawronski told PsyPost. “We found very little evidence for this assumption. Instead, differences in the acceptance of belief-congruent and belief-incongruent information seem to arise from overconfidence and a failure to recognize the limits of one’s knowledge.”
In Experiment 4, half of the participants were asked to judge the accuracy of news headlines before deciding whether to share them on social media, while the other half were not prompted to make an accuracy judgment. The researchers found that those who were asked to judge the accuracy of the news were less likely to share false information. This effect was stronger for news that supported people’s political views.
“We are currently investigating whether susceptibility to misinformation can be reduced by making people aware of the limits of their knowledge,” Gawronski said.
The study, “Truth Sensitivity and Partisan Bias in Responses to Misinformation”, was authored by Bertram Gawronski, Nyx L. Ng, and Dillon M. Luke.