Is Facebook's fight against vaccine misinformation failing?


In a study published in the journal Science Advances, researchers from the United States evaluated the efficacy of Facebook's policies for removing vaccine misinformation from the platform during the coronavirus disease 2019 (COVID-19) pandemic. They found that although Facebook removed some anti- and pro-vaccine content, users' overall engagement with anti-vaccine content did not decrease. Moreover, when pro-vaccine content was removed, the remaining anti-vaccine content became even more misinformative and politically polarized, and was more likely to appear in users' newsfeeds. The authors suggest that Facebook's system architecture may have allowed anti-vaccine content producers to circumvent its anti-misinformation policies.

Study: The efficacy of Facebook’s vaccine misinformation policies and architecture during the COVID-19 pandemic. Image Credit: PeopleImages.com - Yuri A / Shutterstock

Background

Digital misinformation erodes public trust in clinical evidence and recommendations, thereby hampering public health. The spread of misinformation on social media during the COVID-19 pandemic may have lowered vaccine uptake, prompting platforms to curtail such content more stringently. Although platforms imposed both "soft" remedies, such as warnings, and "hard" remedies, such as removing objectionable content and accounts, short-term evidence of their efficacy remains inconclusive, and no systematic long-term examination has been conducted.

Therefore, this study addresses the need to understand whether, and why, a combination of hard and soft remedies could reduce users' exposure to misinformation, particularly anti-vaccine misinformation on Facebook, the world's largest social network. The researchers also examined the platform's system architecture to understand how it may have affected the outcomes of these misinformation-curbing policies.

About the study

Data were downloaded from Facebook using CrowdTangle beginning 15 November 2020, just before Facebook announced hard remedies against the spread of COVID-19 vaccine misinformation.

The search identified 216 English-language pages and 100 English-language groups that discussed vaccines. The data comprised 119,091 posts from pages and 168,419 posts from groups created between 15 November 2019 and 15 November 2020. A further 177,615 and 244,981 posts, respectively, were created on the same pages and groups between 16 November 2020 and 28 February 2022. Pages and groups were treated differently because of their distinct functions in the platform's architecture: pages serve marketing purposes, and only administrators can add posts, whereas groups allow every member to post, offering a forum to discuss common interests. Notably, pages may act as group administrators.

A comparative interrupted time-series (CITS) design was used to study weekly posts in public anti- and pro-vaccine pages and groups, comparing them both to pre-policy trends and to one another. Additionally, total engagement (the sum of comments, shares, likes, and emotional reactions) was measured for each post, since Facebook prioritizes content in users' newsfeeds using these signals. Under this design, a policy was considered efficacious if it reduced engagement with anti-vaccine content relative to pre-policy trends.
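The interrupted time-series comparison described above can be illustrated with a segmented regression on weekly engagement counts. The sketch below uses synthetic data and is not the authors' actual analysis: the week-52 policy cutoff, the noise level, and the underlying trend are all assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical sketch of an interrupted time-series fit on weekly
# engagement counts, with a policy assumed to take effect at week 52.
rng = np.random.default_rng(0)
weeks = np.arange(104)
policy = (weeks >= 52).astype(float)              # 1 after the policy
time_since = np.where(weeks >= 52, weeks - 52, 0)  # weeks since policy

# Synthetic engagement: a mild upward trend with no true policy effect.
engagement = 500 + 0.5 * weeks + rng.normal(0, 20, size=weeks.size)

# Design matrix: intercept, secular trend, level change, slope change.
X = np.column_stack([np.ones(weeks.size), weeks, policy, time_since])
beta, *_ = np.linalg.lstsq(X, engagement, rcond=None)
intercept, trend, level_change, slope_change = beta

# A policy that works should show a negative level and/or slope change
# relative to the pre-policy trend; here both should be near zero.
print(f"level change: {level_change:.1f}, slope change: {slope_change:.2f}")
```

In the comparative (CITS) version, the same model is fit to both anti- and pro-vaccine series, and the post-policy changes are contrasted between the two groups rather than read off a single series.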

Results and discussion

By 28 February 2022, Facebook had removed 49 pages and 31 groups, and 5% of the anti-vaccine groups had switched from public to private. Compared with pre-policy trends, the volume of anti-vaccine page content fell 1.47 times more than that of pro-vaccine page content. However, there was no significant change in users' engagement with anti-vaccine page content.

Additionally, on anti-vaccine pages, misinformative content about severe adverse events (odds ratio 1.41), including hospitalization and death (odds ratio 1.23) following COVID-19 vaccination, increased. Anti- and pro-vaccine discussions in groups also increased relative to pre-policy trends. Furthermore, low-credibility links to external content increased until September 2021 and then declined, yet these links may have exposed users to more politically polarized content, as indicated by increased engagement (odds ratio 2.37). The findings also underscore the need to account for users' demand for misinformative content, which is difficult to address through policies alone.

The researchers suggest that a platform's architecture can allow users to reach interdicted content via alternative paths. Facebook's system architecture is a layered hierarchy of pages, groups, and users, which can be exploited to create such paths. In the top layer, page administrators can link to each other's pages, making them easy to discover. In the middle layer, groups and pages can engage with and share the same content simultaneously, so copies of the content remain on the platform even after the original is removed. In the bottom layer, users who engage with misinformative content in their newsfeeds increase other users' exposure to it. The study is limited to publicly available Facebook data, and the data cannot distinguish unique individuals posting and engaging from the same individuals doing so repeatedly.

Conclusion

The findings suggest that while Facebook's policies reduced the volume of anti-vaccine posts, they did not sustainably reduce engagement with anti-vaccine content, which the authors attribute to the flexibility the platform's system architecture offers its users. The study thus highlights the challenges of curbing the spread of misinformation on social media and encourages platform designers to align their policies and system architecture with scientific evidence to counter the potential threats social media poses to public health.


Written by

Susha Cheriyedath

Susha is a scientific communication professional holding a Master's degree in Biochemistry, with expertise in Microbiology, Physiology, Biotechnology, and Nutrition. After a two-year tenure as a lecturer from 2000 to 2002, where she mentored undergraduates studying Biochemistry, she transitioned into editorial roles within scientific publishing. She has accumulated nearly two decades of experience in medical communication, assuming diverse roles in research, writing, editing, and editorial management.

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Cheriyedath, Susha. (2023, September 18). Is Facebook's fight against vaccine misinformation failing?. News-Medical. Retrieved on April 28, 2024 from https://www.news-medical.net/news/20230918/Is-Facebooks-fight-against-vaccine-misinformation-failing.aspx.



