Vaccine misinformation moderately curbed by new Facebook policy

Mar 3, 2022

New study provides evidence that social media companies have the tools to reduce the impact of vaccine misinformation on their platforms.
A line of people scroll through their phones on a subway.

Social media platforms have a misinformation problem, and this is nothing new. Whether it concerns climate change, the COVID-19 pandemic, vaccines, or world news, there is mounting concern among experts that, given the nature of platforms like Facebook, TikTok, and Twitter, the problem is becoming a significant, and perhaps insurmountable, public issue.

“As a researcher who studies social and civic media, I believe it’s critically important to understand how misinformation spreads online,” wrote Ethan Zuckerman, associate professor of Public Policy, Communication, and Information at UMass Amherst, in an article published on The Conversation. “But this is easier said than done. Simply counting instances of misinformation found on a social media platform leaves two key questions unanswered: How likely are users to encounter misinformation, and are certain users especially likely to be affected by misinformation?”

While the latter question may be user dependent, some experts believe that limiting exposure to or even helping individuals better identify misinformation may be a way forward.

To address the issue, some tech companies have established policies to identify and curtail the false information being shared on their platforms. In 2019, for example, Facebook established its first policy to counter the spread of vaccine misinformation.

Whether these policies are merely performative or have any real effect has been hotly debated. A team of researchers led by Lorien Abroms, professor of Prevention and Community Health at the George Washington University, set out to determine whether there were any traceable changes as a result of Facebook’s 2019 policy.

“There is a growing consensus that health misinformation on social media platforms presents a critical challenge for public health, especially during the COVID-19 pandemic,” Abroms said in a statement. “While new policies by social media companies are an important first step to stopping the spread of misinformation, it is also necessary to rigorously evaluate these policies to ensure they are effective.”

For the study, the team carried out an interrupted time series analysis, a statistical method that tracks an outcome over a period of time before and after an intervention. They identified 172 anti- and pro-vaccine Facebook pages and, for six months both before and after the policy went into place, monitored “user endorsements” in the form of likes on published posts.
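To make the method concrete, below is a minimal sketch of the segmented-regression form of an interrupted time series in Python. Everything in it, the synthetic weekly like counts, the intervention week, and the statsmodels model, is an illustrative assumption rather than data or code from the study.

```python
# Minimal interrupted time series (segmented regression) sketch.
# All data here are synthetic; this is not the study's code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
weeks = np.arange(52)                      # roughly six months either side
policy_week = 26                           # hypothetical intervention point
post = (weeks >= policy_week).astype(int)  # 1 after the policy, else 0
time_since = np.where(post == 1, weeks - policy_week, 0)

# Synthetic anti-vaccine like counts: an upward trend, then a drop in
# both level and slope once the policy takes effect.
likes = (500 + 3 * weeks - 80 * post - 2 * time_since
         + rng.normal(0, 20, weeks.size))

df = pd.DataFrame({"likes": likes, "time": weeks,
                   "policy": post, "time_since": time_since})

# "policy" estimates the immediate level change at the intervention;
# "time_since" estimates the change in trend after it.
model = smf.ols("likes ~ time + policy + time_since", data=df).fit()
print(model.summary())
```

If an intervention had an effect, the coefficients on the policy indicator and the post-policy trend should be negative and statistically distinguishable from the pre-policy trend.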

The study found that Facebook’s misinformation policy moderately curtailed the number of likes on anti-vaccine content on its platform. “When the number of subscribers was considered, the policy effect on the number of likes for anti-vaccine posts was much smaller, but still statistically significant,” wrote the authors in their paper.

In other words, even when each page’s audience size was taken into account, the drop in likes on anti-vaccine posts after the policy was implemented remained statistically significant, lending credence to the policy’s effectiveness, according to the study.
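As a rough illustration of why adjusting for audience size shrinks the estimated effect, the same sketch can be refit on a subscriber-normalized outcome. The subscriber series below is invented and the snippet continues the code above; the paper’s actual adjustment may well differ.

```python
# Continuing the sketch above: refit on likes per 1,000 subscribers.
# The subscriber counts are invented for illustration only.
subscribers = 10_000 + 50 * weeks          # hypothetical follower growth
df["likes_per_1k"] = df["likes"] / (subscribers / 1_000)

adjusted = smf.ols("likes_per_1k ~ time + policy + time_since",
                   data=df).fit()
# The level-change coefficient shrinks once exposure is normalized,
# mirroring the smaller-but-significant effect the authors report.
print(adjusted.params["policy"])
```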

Though no single policy is a silver bullet, understanding which policies are effective and which aren’t can help social media platforms and policymakers better stem the flow of misinformation on their sites.

“This research is a good first step in developing a process to evaluate the effectiveness of social media policies that are created to stop the spread of misinformation,” said Jiayan Gu, a Ph.D. student in Abroms’ group and an author on the study. “We are excited to continue this work and grow our understanding of how social media policy interventions can positively change online information sharing ecosystems.”

Reference: Jiayan Gu, et al., The Impact of Facebook’s Vaccine Misinformation Policy on User Endorsements of Vaccine Content: An Interrupted Time Series Analysis, Vaccine (2022). DOI: 10.1016/j.vaccine.2022.02.062

Feature image credit: Robin Worrall on Unsplash
