In a recent study published in Science Advances, researchers used a mathematical model to show how the general public drifted away from best-science guidance early in the coronavirus disease 2019 (COVID-19) pandemic.
They empirically mapped and quantitatively analyzed the emitter-receiver network of COVID-19 guidance among online communities on Facebook, the dominant social media platform worldwide, with over three billion active users across some 156 countries.
Distrust of guidance based on the best available science has reached dangerous levels. During the pre-vaccine period of 2020, marked by maximal uncertainty and social distancing, many people turned to their online communities for guidance on how to avoid catching the virus and on proposed cures. A 13.2% jump in social media users in 2020 brought the total to 4.2 billion, or 53.6% of the global population, as many sought information on protecting themselves and their loved ones from COVID-19.
Unfortunately, many of these members were likely exposed to guidance that was not the best science, which in some cases led to deaths from rejecting masks or drinking bleach. This raises the question of who emits and who receives guidance, and how to intervene in current and future crises beyond COVID-19 (e.g., monkeypox or climate-change misinformation).
In the mapped network, a node represents a Facebook page and a link represents one page recommending another. Each page aggregates people around a common interest, and its analysis does not require access to personal information. A member merely mentioning another page does not create a link; but when a Facebook page recommends another page, all of its members are automatically exposed to that page's content. This is how the emitter-receiver network is established.
Although not all members necessarily pay attention to such content, a recent study showed experimentally and theoretically that as few as 25% of members can tip an online community to an alternative viewpoint.
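The page-and-link structure described above can be sketched as a small directed graph. The page names, stances, and links below are invented for illustration; this is a minimal sketch, not the authors' dataset or code.

```python
# Toy emitter-receiver network: nodes are Facebook pages labeled with a
# stance, and a directed link records one page recommending another.
# All page names and links here are hypothetical.

stance = {
    "ParentingTips": "neutral",
    "NaturalHealth": "anti",
    "VaccineFacts": "pro",
}

# recommends[p] lists the pages that page p recommends to its members;
# those members are thereby automatically exposed to the recommended
# pages' content.
recommends = {
    "ParentingTips": ["NaturalHealth", "VaccineFacts"],
    "NaturalHealth": [],
    "VaccineFacts": [],
}

def exposure(page):
    """Stances of the pages whose content this page's members receive."""
    return sorted({stance[p] for p in recommends[page]})

print(exposure("ParentingTips"))  # ['anti', 'pro']
```

A page's members can thus be exposed to pro, anti, or mixed guidance depending solely on which pages their own community recommends.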
About the study
In the present study, researchers manually searched Facebook pages created in 2018 and 2019 using keywords and phrases related to COVID-19 vaccines and vetted their findings via human coding and computer-assisted filters. They then indexed these pages' links to other Facebook pages. Finally, two independent researchers classified each identified node (Facebook page) as neutral, pro-, or anti-vaccination by reviewing its posts, its 'About' section, and its self-described category.
A pro page promoted best-science guidance; an anti page opposed it; and a neutral page took neither position but had community-level links to pro or anti communities. Parenting pages, for instance, were classified as neutral because they focus on topics such as child education, pets, and organic food.
To make the initial seed of Facebook pages as diverse as possible, the researchers repeated the manual identification process for pages posted in different languages, focused on different geographical locations, and run by managers from a wide range of countries. They then developed a mathematical model that mimicked the collective dynamics of these Facebook communities; its predictions could be verified manually using standard calculus.
This classification methodology yielded a list of 1,356 interlinked Facebook pages comprising 86.7 million people. Analysis of data from December 2019 to August 2020 showed that initial conversations about COVID-19 guidance began primarily among the 501 anti communities, comprising 7.5 million individuals, well before the official declaration of the pandemic in March 2020.
Notably, there were 211 pro-vaccine communities and 644 neutral communities, comprising 13 million and 66.2 million individuals, respectively. The most frequent manager locations were the United States, Canada, the United Kingdom, Australia, Italy, and France.
Nearly seven million individuals were exposed exclusively to COVID-19 guidance from non-pro communities, while 5.4 million were exposed to guidance from both pro and non-pro communities. The imbalance was worse for individuals in parenting (neutral) communities, 1.1 million of whom were exposed exclusively to guidance from non-pro communities. Even after randomly deleting up to 15% of COVID-19-related links from the entire network to mimic links missed during data collection, the researchers found their findings and conclusions robust.
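This kind of robustness check can be illustrated on a toy network: repeatedly drop a random 15% of links and recompute the share of neutral pages exposed only to non-pro guidance. The pages, stances, links, and sample count below are all invented for illustration; this sketches the idea of the perturbation test, not the study's actual data or code.

```python
# Toy robustness check: remove ~15% of links at random, many times, and
# recompute the fraction of neutral pages exposed only to non-pro guidance.
# Pages, stances, and links are hypothetical.
import random

stance = {
    "Anti1": "anti", "Anti2": "anti", "Pro1": "pro",
    "N1": "neutral", "N2": "neutral", "N3": "neutral", "N4": "neutral",
}
links = [
    ("N1", "Anti1"), ("N1", "Anti2"),  # N1 sees only non-pro pages
    ("N2", "Anti1"), ("N2", "Pro1"),   # N2 sees both
    ("N3", "Pro1"), ("N3", "Anti2"),   # N3 sees both
    ("N4", "Anti1"),                   # N4 sees only non-pro pages
]

def share_only_non_pro(edges):
    """Fraction of neutral pages whose recommended pages include no pro page."""
    neutral = [p for p, s in stance.items() if s == "neutral"]
    hits = 0
    for p in neutral:
        targets = [t for src, t in edges if src == p]
        if targets and all(stance[t] != "pro" for t in targets):
            hits += 1
    return hits / len(neutral)

baseline = share_only_non_pro(links)   # 2 of 4 neutral pages -> 0.5

random.seed(0)
k = max(1, int(0.15 * len(links)))     # number of links to drop
samples = [
    share_only_non_pro(random.sample(links, len(links) - k))
    for _ in range(1000)
]
print(baseline, round(sum(samples) / len(samples), 3))
```

If the perturbed shares stay close to the baseline, as in the study, the conclusions do not hinge on any particular subset of collected links.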
Overall, anti communities dominated the conversation before the official declaration of the COVID-19 pandemic, and neutral communities (e.g., parenting) subsequently moved closer to extreme communities, becoming highly exposed to their content.
Thus, parenting communities began receiving COVID-19 guidance from anti communities as early as January 2020, and soon began adding their own guidance to the conversation. Conversely, best-science guidance from pro communities remained scarce throughout the study period.
The combination of network mapping and modeling revealed more possible approaches to flipping the conversation than simply removing all extreme elements from the system. Wholesale removal may not even be the most appropriate solution: it could come across as heavy-handed, run counter to the idea of open participation, and undermine a business model built on maximizing user numbers.
Nevertheless, the study's model could tackle the question of online misinformation more generally, beyond COVID-19 and vaccinations. It could also help predict tipping-point behavior and system-level responses to interventions in future crises.