In a new study published in Frontiers in Psychology, researchers have uncovered that while news media literacy helps individuals recognize and doubt news from unfamiliar sources, strong partisan biases can reverse this skepticism, especially in the case of fake news. This insight emerged from an extensive investigation involving over 2,400 American participants, aiming to untangle how people judge the accuracy of news content in the digital age.
“Understanding what improves detection of misinformation and disinformation is of global concern. A great deal of high quality research has suggested that technological factors, such as a chaotic social media environment, can lead to quick, surface-level consideration of new information,” said study author Daniel Sude, a visiting assistant professor at George Washington University.
“We believed, however, that the problem went deeper than that. Drawing on social-cognitive psychology and communication science, we argued that even people who want to process more deeply may do so in biased ways. Even people who are aware of their potential for bias may have difficulty truly correcting for that bias.”
“Further, we disputed the notion that mere detection of ‘fake news’ should be society’s goal. Even a broken clock is right twice a day. We wanted to know what led people to be justifiably skeptical of content from unfamiliar news outlets, even if this content turned out to be true. Believing true information by accident, or rejecting false information by accident, is not, in our minds, sufficient. We want to know what leads people to take their time and patiently come to, hopefully correct, conclusions. Our two-study paper is just one of many tackling that question.”
The study involved two separate participant groups, with 1,008 individuals in the first study and 1,397 in the second. These participants were recruited through Qualtrics, a platform that facilitates online surveys. To be eligible, participants needed to be eligible voters in the United States and to acknowledge that they could potentially be exposed to misinformation on the web.
The participants were presented with a series of news posts formatted to resemble content on Facebook. These posts included elements typical of social media content, such as a media brand logo, a headline, an accompanying image, and a URL linking to the news organization. This approach allowed the researchers to control the variables in the experiment while maintaining ecological validity — the extent to which the findings can be applied to real-world settings.
The first part of the study focused on politically consonant fake news content. The participants were shown news posts that included content supporting their political leanings. For example, a politically consonant fake news headline for Democrats was: “Trump Say Republicans Are the ‘Dumbest Group of Voters.'”
The participants were also shown other types of content such as politically dissonant fake news and apolitical fake news. This mix was intended to obscure the primary purpose of the study and mimic the diverse content typically encountered on social media. The news posts were attributed to various news outlets, including both mainstream and alternative media sources.
The second study replicated and extended the approach of the first, but with a focus on real news headlines instead of fake news. The real news headlines were also attributed to a mix of mainstream and alternative news outlets, and the study aimed to differentiate impressions of the content from impressions of the outlets themselves.
The researchers controlled for a range of factors that might influence how participants interacted with the news content, such as information literacy, political interest, trust in news media and various institutions, reliance on intuition for assessing facts, the need for evidence in forming beliefs, perceptions that truth is influenced by politics, belief in conspiracy theories, and the extent of news exposure.
A key observation was that content from alternative, less recognized news sources was generally viewed as less accurate than that from mainstream news outlets. This suggests that people are naturally more skeptical of news from sources they are not familiar with.
“People, at least when participating in our online studies, were remarkably suspicious of content from an unfamiliar news outlet,” Sude told PsyPost. “This contrasts with other studies suggesting that people tend to pay more attention to content than to sources. This could have something to do with our design – people were looking at headlines and leads ‘posted’ to social media. The source information was obvious. If they had gone to click on the actual article, and gotten involved with the content, that experience might then overwhelm their memory of the source of that content, dulling their skepticism.”
However, the researchers also found a surprising twist: for individuals with strong political biases, this relationship was reversed. They tended to perceive fake news content from unfamiliar outlets as more accurate, suggesting a vulnerability to misinformation influenced by political leanings.
Moreover, the study highlighted the role of news media literacy in shaping perceptions. Participants with higher news media literacy were more discerning of the news source’s credibility. But, again, this protective effect was not uniform across all participants. For those with strong political commitments, high news media literacy paradoxically led to a greater likelihood of believing in fake news.
“Strong partisans (Democrats or Republicans), particularly the ones who were concerned with media bias, acknowledged that the source was unfamiliar but defended its content anyway,” Sude explained. “Since strong partisans get more out of spreading misinformation and disinformation, this is a concerning finding.”
These findings have significant implications for efforts to combat the spread of fake news and misinformation. They suggest that interventions need to be multi-faceted, targeting not just the enhancement of news media literacy but also addressing the influence of partisan biases on information processing and sharing.
“Information that supports what you already believe feels good. Information that warns you of threats from ‘the other side’ feels important,” Sude said. “However, information that feels good or important is not necessarily accurate. If you’re not familiar with the source of information, take your time with it. It might be true; it might be false; it might be a little of both. The same, of course, can be said for information from ‘mainstream’ news outlets. However, these outlets get a lot more scrutiny and are more likely to have their errors corrected compared to outlets with a small audience or someone’s blog.”
“In the end, other studies have suggested that sharing misinformation and being called out for it hurts both your reputation and the reputation of your ‘side.’ Patience and caution are worth it.”
While the study offers significant insights, it is not without limitations. Its findings are most applicable to the politically polarized context of the United States and might not extend globally. The psychological processes behind how people interpret news sources were not directly examined, and future research could explore this further.
“Any particular study, even one using real world fake and real news headlines, cannot tell you the whole story,” Sude told PsyPost. “This work contributes to the balance of evidence suggesting that while, under the right conditions, people will be skeptical of content from an unfamiliar outlet, content that supports their views can still have a powerful allure. Acknowledging this, we can design and test more effective interventions to improve not just how accurate people are, but how clearly they are thinking about accuracy.”
The study, “True, justified, belief? Partisanship weakens the positive effect of news media literacy on fake news detection,” was authored by Daniel Jeffrey Sude, Gil Sharon, and Shira Dvir-Gvirsman.