Carding
Professional
Facebook has failed to deal with misinformation about COVID-19.
During the COVID-19 pandemic, disinformation was actively spread on the Internet. In this regard, many platforms announced measures to combat myths and false information. However, were these measures really effective?
A study published in the journal Science Advances shows that Facebook's policy regarding misinformation about COVID-19 vaccines did not bring the desired result. The study, titled "The Effectiveness of Facebook's Policies and Structures against Vaccination Misinformation during the COVID-19 pandemic," was conducted with the participation of experts from Johns Hopkins University and led by researchers from George Washington University.
The main conclusion: the platform's own core design features hinder the fight against misinformation. "To effectively combat disinformation and other online threats, we need to focus not only on content and algorithms, but also on design and architecture," said David Broniatowski, lead author of the study.
The researchers found that despite Facebook's active attempts to remove anti-vaccination content during the pandemic, overall engagement with such content either did not decrease or even increased. Moreover, the anti-vaccination content remaining on the platform became more misleading, containing links to dubious external sources and "alternative" social platforms.
Broniatowski suggests that social media designers could work collaboratively to develop "building codes" for their platforms, based on scientific evidence, to reduce online harm.
According to the researchers, this is the first and only scientific assessment of the attempts of the largest social platform to systematically remove disinformation and accounts that spread it.