There are two types of 'superspreaders' of online misinformation: organized actors who intentionally push falsehoods or misleading claims, and people who unwittingly share content they don't realize is false.
We've seen some of the fatal consequences of their combined effect during the COVID-19 pandemic, but we have far less detail on how viewing such misinformation on social media changes people's behavior, particularly around vaccination.
Researchers at the Massachusetts Institute of Technology (MIT) and the University of Pennsylvania set out to connect the dots to show cause and effect, analyzing the impact of more than 13,000 headlines on vaccination intentions among roughly 233 million US-based Facebook users – a pool equivalent to nearly 70 percent of the country's population.
Casting a wide net, the researchers didn't just look at content flagged as false or misleading by the platform's fact-checkers; their dataset included all vaccine-related headlines popular during the first three months of the US vaccine rollout, from January to March 2021. This included 'vaccine skeptical' information, which isn't factually inaccurate but still raises questions about vaccines, and is far less scrutinized on social media.
"By taking an a priori agnostic view of what content might change vaccination intentions, we discover from the bottom-up which types of content drive overall vaccine hesitancy," MIT computational social scientist Jennifer Allen and colleagues write in their published paper.
Many assumptions have been made about the relationship between exposure to misinformation and subsequent behavior, based on studies linking the sharing and believing of online misinformation with lower COVID-19 vaccination rates.
But it's a chicken-and-egg situation: other research has suggested that pre-existing vaccine hesitancy leads people to consume more misinformation, rather than misinformation seeding the doubts that keep them from vaccinating.
To get at the root cause, the researchers first tested the effect of different headlines on vaccination intentions, in two experiments involving more than 18,700 online survey participants.
In the second experiment, they found that regardless of whether a headline was accurate, if it led people to believe that vaccines could be harmful to health, it reduced their intention to vaccinate.
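To make the logic of that measurement concrete, here is a minimal sketch of how a single headline's effect might be estimated from such a randomized survey experiment; the responses and the 1-7 intent scale below are hypothetical stand-ins, not the study's actual data or analysis code:

```python
# Stated vaccination intention (hypothetical 1-7 scale) for participants
# randomly assigned to see the headline versus a control group that did not.
exposed = [3, 4, 5, 3]
control = [5, 6, 5, 6]

# Because assignment is random, the headline's estimated effect is simply
# the difference in mean stated intent between the two groups
# (an average treatment effect); a negative value means reduced intent.
effect = sum(exposed) / len(exposed) - sum(control) / len(control)
print(f"Estimated effect on vaccination intent: {effect:+.2f}")
```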
Next, the researchers extrapolated those causal findings to their pool of 233 million US Facebook users, using a combination of crowdsourcing and machine learning to estimate the impact of some 13,200 vaccine-related URLs popular in early 2021.
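The core of that extrapolation, as described, is scaling each headline's predicted per-view effect by how many people saw it. Here is a rough sketch of that bookkeeping; every URL, effect size, and view count is invented for illustration, and the stand-in effects are meant to be supplied by a model trained on the experimental results and crowdsourced labels:

```python
# Hedged sketch of the extrapolation step: total impact = per-view effect x reach.
from dataclasses import dataclass

@dataclass
class Headline:
    url: str
    views: int               # estimated Facebook exposure (illustrative)
    per_view_effect: float   # predicted change in intent per view (negative = discouraging)

headlines = [
    Headline("example.com/flagged-false-claim", 2_000_000, -0.0030),
    Headline("example.com/mainstream-rare-death-story", 50_000_000, -0.0005),
]

# A mildly skeptical story seen by tens of millions can outweigh a blatant
# falsehood seen by far fewer people, even if each view is less persuasive.
for h in sorted(headlines, key=lambda h: h.per_view_effect * h.views):
    print(f"{h.url}: estimated total impact {h.per_view_effect * h.views:,.0f}")
```

On these invented numbers, the high-reach skeptical story has roughly four times the total impact of the outright falsehood; the team's predictive model found the real-world gap to be far larger.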
They found misinformation flagged by fact-checkers as false or misleading gained relatively little traction on Facebook compared to unflagged stories that reached more people and implied vaccines were harmful to health.
These unflagged stories were largely published by credible mainstream news outlets, viewed hundreds of millions of times and – left to encourage vaccine skepticism unchecked – had an impact some 46 times greater than flagged posts, the team's predictive model showed.
In other words, vaccine-skeptical content from mainstream sites that was not flagged as misinformation had more of an impact on vaccine hesitancy than outright false content published by fringe outlets.
"Unflagged stories highlighting rare deaths after vaccination were among Facebook's most-viewed stories," Allen and colleagues explain, showing that people's exposure to misleading content determines how broadly influential it is.
Of course, many other real-world factors could influence someone's decision to get vaccinated, and vaccine hesitancy might not be the only driving factor.
Vaccination intentions are also not the same as hard data on vaccination uptake. This study focuses on just one country, too, but the findings could provide insights into how information spreads globally.
"Our work suggests that while limiting the spread of misinformation has important public health benefits, it is also critically important to consider gray-area content that is factually accurate but nonetheless misleading," the team concludes.
The study has been published in Science.