In a recent study published in Scientific Progress, researchers used a mathematical model to represent how the general public drifted away from the best scientific guidance early in the coronavirus disease 2019 (COVID-19) pandemic.
They empirically mapped and analyzed the sender-receiver network of COVID-19 guidance among online communities on Facebook, the dominant social media platform worldwide, with more than three billion active users across some 156 countries.
Mistrust of guidance based on the best available science has reached a dangerous level. During the pre-vaccination period of 2020, a time of maximum uncertainty and social distancing, many people turned to their online communities for advice on preventing COVID-19 and for suggested treatments. A 13.2% increase in social media users in 2020 pushed the total to a staggering 4.2 billion, equivalent to 53.6% of the world's population, and many of these users joined social media to learn how to protect themselves and their loved ones from COVID-19.
Unfortunately, there is a considerable risk that these members will be exposed to guidance that does not reflect the best science, which can result in deaths from rejecting masks or drinking bleach. This raises the question of who is sending and receiving guidance, and how to intervene in current and future crises beyond COVID-19 (e.g., monkeypox, or misinformation about climate change).
In this network, a node represents a Facebook page, and a link represents one page recommending another. Each page gathers people around a common interest, and analyzing it does not require access to personal information. A page member merely mentioning another page does not create a link. But when a Facebook page recommends another page, all of its members are automatically exposed to the new content; this is how a sender-receiver network comes into being.
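To make that structure concrete, here is a minimal Python sketch of such a sender-receiver network as an adjacency list. The page names, stances, and member counts are invented for illustration and are not from the study's data.

```python
from collections import defaultdict

# Illustrative pages only; stances and member counts are invented.
pages = {
    "ParentChat":  {"stance": "neutral", "members": 120_000},
    "HealthFacts": {"stance": "pro",     "members": 45_000},
    "NoMandates":  {"stance": "anti",    "members": 80_000},
}

# A directed link (a, b) means page a recommended page b, so all of
# a's members are automatically exposed to b's content.
links = [
    ("ParentChat", "NoMandates"),
    ("ParentChat", "HealthFacts"),
    ("NoMandates", "ParentChat"),
]

def exposure_by_stance(pages, links):
    """For each recommending page, count recommended pages by stance."""
    counts = defaultdict(lambda: defaultdict(int))
    for sender, receiver in links:
        counts[sender][pages[receiver]["stance"]] += 1
    return {page: dict(c) for page, c in counts.items()}

print(exposure_by_stance(pages, links))
# e.g. ParentChat's members are exposed to one anti page and one pro page
```

Counting exposures by stance, as above, is the basic bookkeeping needed before asking who dominates the flow of guidance.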
While not all members necessarily pay attention to such content, a recent experimental and theoretical study showed that a committed minority of just 25% of members can tip an online community toward an alternative point of view.
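That tipping-point finding echoes classic threshold models of collective behavior. The sketch below is a generic, deterministic threshold-model illustration, not the model used in the study or the cited experiments; the 20% threshold and the community size are invented for the example.

```python
def final_adoption(committed_fraction, thresholds):
    """Iterate a simple threshold model to a fixed point.

    A committed fraction of the community always holds the alternative
    view; each remaining member adopts it once the overall adopting
    fraction exceeds their personal threshold. Returns the final
    adopting fraction.
    """
    n = len(thresholds)
    adopting = committed_fraction
    while True:
        converts = sum(1 for t in thresholds if t < adopting) / n
        new = committed_fraction + (1 - committed_fraction) * converts
        if abs(new - adopting) < 1e-12:
            return new
        adopting = new

# Invented example: every uncommitted member adopts once 20% of the
# community already has.
thresholds = [0.20] * 100
print(final_adoption(0.25, thresholds))  # a 25% committed minority tips everyone
print(final_adoption(0.10, thresholds))  # a 10% minority stays stuck at 10%
```

The qualitative point is the discontinuity: below the threshold the committed minority changes almost nothing, while just above it the whole community flips.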
About the study
In the current study, researchers manually searched Facebook pages created in 2018 and 2019 using keywords and phrases related to COVID-19 vaccines and vetted their findings through human coding and computer-aided filters. They then indexed these pages’ connections to other Facebook pages. Finally, two independent researchers classified each identified node (or Facebook page) as neutral, pro- or anti-vaccine by reviewing the posts, the “About” section, and the self-described category.
A pro page had content promoting the best scientific guidance; an anti page opposed such guidance; and a neutral page had community-level ties to pro or anti communities. Parenting pages, for example, were considered neutral because they focus on topics such as child rearing, pets, and organic food.
To make the initial set of seed pages as diverse as possible, the researchers repeated the manual identification process for pages posted in different languages, targeting different geographic locations, and run by administrators from a wide range of countries. Furthermore, the researchers developed a mathematical model that mimicked the collective dynamics of these Facebook communities; the model's findings could be verified analytically using standard calculus.
This search-and-ranking process yielded a list of 1,356 interconnected Facebook pages with 86.7 million members. Analysis of data from December 2019 to August 2020 found that initial conversations about COVID-19 guidance started mainly among the 501 anti-vaccine communities, comprising 7.5 million individuals, well before the official declaration of the pandemic in March 2020.
Notably, there were 211 pro-vaccine communities and 644 neutral communities, comprising 13 million and 66.2 million individuals, respectively. The most common page-manager locations were the United States, Canada, the United Kingdom, Australia, Italy, and France.
Nearly seven million individuals were exposed exclusively to COVID-19 guidance from non-professional communities, and 5.4 million were exposed to both professional and non-professional guidance. The imbalance was worse for individuals in (neutral) parenting communities, where 1.1 million individuals were exposed exclusively to guidance from non-professional communities. When the researchers randomly removed up to 15% of the network's COVID-19-related links to mimic missed Facebook links, their findings and conclusions remained robust.
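That robustness check can be sketched as follows, assuming a toy link list and stance labels (none of which come from the study's data): drop a random 15% of links, recompute which communities are exposed to anti-community content, and compare against the full network.

```python
import random

# Invented stance labels for four hypothetical pages.
stance = {"P1": "neutral", "P2": "anti", "P3": "pro", "P4": "anti"}

# Directed links: (a, b) means page a recommends page b to its members.
links = [("P1", "P2"), ("P1", "P3"), ("P3", "P4"), ("P1", "P4")]

def exposed_to_anti(links, stance):
    """Pages whose recommendations expose their members to an anti page."""
    return {a for a, b in links if stance[b] == "anti"}

def drop_links(links, fraction, rng):
    """Keep a random (1 - fraction) subset of links, mimicking missed links."""
    keep = round(len(links) * (1 - fraction))
    return rng.sample(links, keep)

rng = random.Random(0)  # seeded for reproducibility
baseline = exposed_to_anti(links, stance)
trials = [exposed_to_anti(drop_links(links, 0.15, rng), stance)
          for _ in range(100)]
# Exposure sets in the perturbed networks can only shrink relative to the
# baseline, and P1 stays exposed in every trial here because it has two
# independent links to anti pages.
```

If the conclusions (here, who is exposed to anti-community guidance) hold across many such perturbed networks, missing a modest fraction of real links is unlikely to change them.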
In general, anti-communities jumped in to dominate the conversation before the official declaration of the COVID-19 pandemic, and neutral communities (e.g., parenting) subsequently moved closer to these extreme communities and were therefore highly exposed to their content.
For example, parenting communities began receiving COVID-19 guidance from anti-communities as early as January 2020, after which they even began adding their own guidance to the conversation. Conversely, exposure to the best scientific guidance from professional communities remained low throughout the study period.
The combination of network mapping and modeling revealed more possible approaches to turning the conversation around than simply removing all extreme elements from the system. Removal may not even be the most appropriate solution: it can come across as heavy-handed, runs contrary to the idea of open participation, and jeopardizes the platform's business model of maximizing the number of users.
Nevertheless, the study's model could address the issue of online disinformation more broadly, beyond COVID-19 and vaccinations. It can also help predict tipping-point behavior and system-level responses to interventions in future crises.