New research in Psychological Science provides evidence that belief updating is proportional to the magnitude of prediction error. In other words, people are more likely to update their beliefs after learning that there is a large gap between what they (falsely) thought was true and what is in fact true, but are relatively less likely to update their beliefs when the gap is small. The findings suggest that the element of surprise could play a role in reducing the spread of misinformation.
“Designing and testing belief change strategies is a research direction I became interested in after several large-scale misinformation campaigns were deployed around the world, with long-term disastrous consequences. In our work, my collaborators and I are looking for tools policy makers can use to fight misinformation by changing false beliefs in vulnerable communities,” explained Madalina Vlasceanu, a postdoctoral research fellow at New York University and the corresponding author of the new study.
In two experiments, which included 1,777 individuals in total, the researchers exposed participants to statistical evidence about ideologically neutral topics (such as shark attacks) and ideologically charged topics (such as gun control and abortion). The participants first viewed a set of 36 statements and were asked to indicate the degree to which they believed each statement was accurate.
The participants were then randomly assigned to one of two conditions: In the experimental condition, the participants made predictions about the evidence associated with those statements and were then immediately given the correct answer. In the control condition, the participants were presented with the evidence alone. Finally, participants in both groups were instructed to rate the believability of the initial 36 statements again.
The researchers found that participants who made predictions before seeing the evidence updated their beliefs more than those who were presented with the evidence alone, especially when their predictions were far off the mark. These effects were similar for both Democrats and Republicans, and for ideologically neutral and ideologically charged topics.
“Our minds constantly make predictions about the future. In this study, we used this fundamental property of the cognitive system to change people’s false beliefs by presenting relevant evidence in a prediction-then-feedback format,” Vlasceanu explained.
“For example, to change someone’s false belief that ‘The US justice system is fair to racial minorities,’ you should first ask them to predict ‘How many times is an African American more likely to be imprisoned compared to a White American for a similar crime?’ After the guess, you should give them the correct answer, in this case, ‘An African American is 5 times more likely to be imprisoned compared to a White American for a similar crime.'”
“If the difference between the person’s prediction and the correct answer is significant, our research shows the person is likely to update their original belief by incorporating the new information received,” Vlasceanu told PsyPost. “What’s more, the degree to which they will update their belief is a function of the size of their error. In other words, someone who predicted ‘twice as likely’ will update their belief more than someone who predicted ‘four times as likely.'”
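The proportional relationship Vlasceanu describes resembles a standard error-driven learning rule. The sketch below is purely illustrative and not the authors' actual model; the 0–100 belief scale, the `learning_rate` constant, and the function name are all hypothetical, chosen only to show how a larger prediction error produces a larger belief shift.

```python
# Illustrative sketch (NOT the study's model): belief change scaled by
# the magnitude of the prediction error, delta-rule style.

def updated_belief(prior_belief, prediction, correct_answer, learning_rate=0.1):
    """Return a new belief rating after prediction-then-feedback.

    prior_belief   -- initial believability rating (hypothetical 0-100 scale)
    prediction     -- the participant's numeric guess
    correct_answer -- the true value revealed as feedback
    learning_rate  -- hypothetical scaling constant
    """
    prediction_error = abs(correct_answer - prediction)
    # Larger errors produce larger shifts toward the evidence.
    return prior_belief + learning_rate * prediction_error

# Using the article's example, where the correct answer is 5 (times as likely):
# a guess of "twice as likely" (error = 3) shifts the belief more than a
# guess of "four times as likely" (error = 1).
shift_from_two = updated_belief(50, prediction=2, correct_answer=5)
shift_from_four = updated_belief(50, prediction=4, correct_answer=5)
```

On this toy model, `shift_from_two` exceeds `shift_from_four`, matching the pattern Vlasceanu describes: the person who guessed "twice as likely" updates more than the one who guessed "four times as likely."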
As far as limitations go, Vlasceanu noted that “this belief change intervention was tested in a controlled, lab environment. In future work we are interested in assessing its impact in more ecologically valid contexts.”
The study, “The Effect of Prediction Error on Belief Update Across the Political Spectrum,” was authored by Madalina Vlasceanu, Michael J. Morais, and Alin Coman.