A series of five studies published in the journal Judgment and Decision Making has revealed that inducing feelings of ignorance makes people more receptive to expert (vs. public) opinion. The researchers found that overestimating one’s own knowledge played a role in this behavior: people often believe they know more than they truly do.
“It seems even truer now than when we started this work, but at the time it was difficult to avoid claims made by various members of the media, as well as those in my own social circles, that expertise was under attack – that there was a sort of ‘war on experts.’ What they meant was that the influence over public behavior and opinion experts once held had disappeared – that now people seem to willfully ignore the collective wisdom of experts,” explained study author Ethan Meyers (@ethanameyers), a graduate student at the University of Waterloo. “There was convincing evidence for people not privileging the opinions of experts over those of laypeople, observed in both ‘lab-like’ and ‘real-world’ settings. A natural question emerged: Why do people not listen to experts?”
Meyers and colleagues recruited a total of 2,862 American participants. Generally, participants were presented with an economic issue (such as “Trade with China makes most Americans better off”) and were prompted to rate how well they thought they understood the presented topic. Participants were then asked to explain in detail how the topic worked (e.g., how trading with China affects the US economy). Next, they rated their understanding of the topic a second time and provided an agreement rating for the economic issue. Afterwards, they were presented with consensus information from either expert economists or the general public and were asked to re-rate their agreement with the given issue.
Overall, the researchers found that people revised their beliefs in response to public opinion, but no more so in response to expert opinion. Importantly, when gaps in knowledge were exposed – puncturing what is known as the ‘illusion of explanatory depth,’ the tendency to overestimate how well one understands a topic – people revised their beliefs far more in response to expert (vs. public) opinion, something they had not done before their knowledge gaps were exposed.
The exposed knowledge gap did not have to be topic-relevant. For example, failing to explain how a helicopter takes flight still produced similar belief revision on economic issues, such as Medicare.
“If you’re trying to change another person’s mind about something, try making them recognize the limits of their knowledge, but not in a confrontational way. Instead, your goal should be to have them gently recognize that they might not know something quite as well as they once thought,” Meyers told PsyPost. “One way you could accomplish this is by having them explain how something works. It is in the process of failing to explain something that we recognize what we do not know and experience an intellectually humbled state in which we might be more receptive to information.”
Because this study examined belief revision only for economic issues, with professional economists as the expert group, the effect cannot yet be generalized to other areas (such as medicine).
“While we believe that our effect would likely extend to any case wherein people think they know more than they really do, our paper does not contain the data needed to support or refute that,” Meyers explained.
When asked about research questions that still need answers, Meyers responded, “In our paper, we found that people revised their normative beliefs on economic issues more in response to economists than to the public, even after failing to explain how a helicopter takes flight. This suggests that a general feeling of ignorance might be induced that makes people question their degree of knowledge on many topics, not just the one they tried to explain. We are currently in the process of testing this. So far, we’ve failed to find a boundary condition for the effect across three studies. It seems even failing to explain how snow is formed can make us recognize that maybe we don’t know as much as we thought we did about how a zipper works.”
He added, “We have recently begun exploring whether people will also become more receptive to epistemic trespassers – such as a medical doctor giving their opinion on an economic issue, or an economist giving their opinion on a medical diagnosis – as they are to topic-pertinent experts. This can help us answer whether people are heuristically following cues of intelligence, or whether they are highly selective about the sources in response to which they will revise their beliefs.”
The study, “Inducing feelings of ignorance makes people more receptive to expert (economist) opinion,” was authored by Ethan A. Meyers, Martin H. Turpin, Michał Białek, Jonathan A. Fugelsang, and Derek J. Koehler.