A new study conducted by researchers from Michigan State University suggests that the battle against online disinformation cannot be won by content moderation or banning those who spread fake news. Instead, the key lies in early and continuous education that teaches individuals to critically evaluate information and remain open to changing their minds.
The study was recently featured in SIAM News, a publication of the Society for Industrial and Applied Mathematics (SIAM).
“Disinformation is one of the most important problems of modern times and is poised to worsen as the power of AI increases. Our research group develops models for the spread of ‘contagions,’ so disinformation, like disease, is a natural topic,” explained study author Michael Murillo, a professor in the Department of Computational Mathematics, Science and Engineering.
The researchers used a computational technique called “agent-based modeling” to simulate how people’s opinions change over time. They focused on a model in which each individual can believe the truth, believe the false information, or remain undecided. The researchers created a network of connections between these individuals, similar to how people are connected on social media.
They used the binary agreement model to understand the “tipping point” (the point where a small change can lead to significant effects) and how disinformation can spread.
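The binary agreement model's core interaction rule can be sketched in a few lines. In this minimal illustration (our own, not code from the study), each agent's opinion is a set: {"A"} for the truth, {"B"} for the disinformation, and {"A", "B"} for undecided. A speaker voices one of its opinions; if the listener already holds it, both commit to that opinion, otherwise the listener becomes undecided by adding it.

```python
import random

# Opinion states: {"A"} = truth, {"B"} = disinformation, {"A", "B"} = undecided.
def interact(speaker, listener):
    """One binary-agreement-model interaction.

    The speaker voices one of its opinions at random. If the listener
    already holds that opinion, both collapse to it; otherwise the
    listener adds it to its set and becomes undecided.
    Returns the updated (speaker, listener) opinion sets.
    """
    voiced = random.choice(sorted(speaker))
    if voiced in listener:
        return {voiced}, {voiced}
    return set(speaker), listener | {voiced}
```

For example, when a truth-believer speaks to a disinformation-believer, the listener becomes undecided; a second exposure to the same opinion then converts both parties to it. Repeating such pairwise interactions across a network is what drives the model's large-scale opinion dynamics.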
They tested three main disinformation mitigation strategies under consideration by the U.S. Congress: content moderation (such as banning those who spread fake news), public education (teaching people to fact-check and be skeptical), and counter campaigns (promoting groups committed to spreading the truth).
The researchers implemented each strategy in the simulated environment to test its effectiveness. They created thousands of small networks representing different types of social connections and applied mathematical rules to simulate real-world scenarios.
“Disinformation is an important problem that policy makers are attempting to address,” Murillo told PsyPost. “We have developed models to simulate the spread of disinformation to test various mitigation strategies. From the mathematics and many thousands of simulations, we are able to assess the most fruitful strategies.”
The researchers found that if just 10% of the population strongly believes in disinformation, the rest may follow suit. The findings suggest that disinformation spreads easily because people naturally want to believe things that align with their existing beliefs.
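The committed-minority effect can be demonstrated with a small simulation of the binary agreement model on a fully connected population. This is an illustrative sketch with parameters chosen by us, not the study's actual setup: a fixed fraction of agents are "committed" to the disinformation and never change their minds, while everyone else starts out believing the truth.

```python
import random

def run(n=200, committed_frac=0.10, steps=200_000, seed=1):
    """Binary agreement model on a complete graph with a committed minority.

    The first int(n * committed_frac) agents hold opinion "B" (the
    disinformation) and never update; all others start at "A" (the truth).
    Returns the fraction of agents holding exactly {"B"} at the end.
    Parameters are illustrative, not taken from the paper.
    """
    rng = random.Random(seed)
    n_committed = int(n * committed_frac)
    opinions = [{"B"}] * n_committed + [{"A"} for _ in range(n - n_committed)]
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)           # random speaker i, listener j
        voiced = rng.choice(sorted(opinions[i]))
        if voiced in opinions[j]:
            # Shared opinion: both collapse to it (committed agents excepted).
            if i >= n_committed:
                opinions[i] = {voiced}
            if j >= n_committed:
                opinions[j] = {voiced}
        elif j >= n_committed:
            # New opinion: the listener becomes undecided.
            opinions[j] = opinions[j] | {voiced}
    return sum(o == {"B"} for o in opinions) / n
```

In runs of this kind, committed fractions at or above roughly 10% tend to eventually pull the whole population toward the committed opinion, whereas with no committed agents the disinformation never gains a foothold, which mirrors the tipping-point behavior the article describes.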
Teaching people to recognize their biases, be more open to new opinions, and be skeptical of online information proved the most effective strategy for curbing disinformation.
Early education (teaching people to be skeptical and question information early on, before they form strong opinions) had the most significant effect on stopping disinformation. Late education (trying to correct people’s beliefs after they have already formed opinions) was not as effective as early education but still had some impact.
Strategies like removing people who share fake content or creating counter campaigns were not as effective as education. The researchers explained that even though these strategies might seem like quick solutions, they don’t work as well in the long run.
“We were surprised, and disheartened, by how difficult this problem is,” Murillo said. “If one guesses the cost and time to implement strategies, such as broad education on critical thinking and education, we are looking at a generational-scale problem.”
As with all research, the new study includes some caveats.
“We deliberately created a parsimonious model to uncover the essential factors at play; however, much more detail could be added to better match specific situations,” Murillo explained. “Also, many proposed strategies are only ‘band-aids’ that treat the symptom, such as labeling videos in YouTube, but do not address the underlying cause that may be related to a social or political issue.”
“More research is needed to understand how and why people are drawn toward disinformation in general,” Murillo added. “People tend to be drawn toward sensationalist ideas, which empower and give advantage to the sources of disinformation. Given improved knowledge of this aspect of human nature, we can enhance our models and policy makers could perhaps develop more optimal ‘seat belts’ to control the spread of disinformation.”
The study, “Evaluating the Effectiveness of Mitigation Policies Against Disinformation,” was authored by David J. Butts and Michael S. Murillo.