The news: Facebook will remove false claims about covid-19 vaccines that have been “debunked by public health experts,” the company said. In a blog post, the company explained how it plans to apply its existing ban on covid misinformation – which aims to eliminate posts that could cause “imminent physical harm” – as countries around the world move closer to approving and deploying vaccines. The removals will apply to both Facebook and Instagram.
Effective vaccines are coming: The success of covid-19 vaccines is seen as critical to ending the pandemic, and a number of candidates are in late-stage testing. Earlier this week, the UK became the first country to approve a vaccine, granting authorization to the one developed by Pfizer and BioNTech and saying the first doses could be administered to patients within days.
What is Facebook removing? The policy announcement is not exhaustive, but it gives some examples of what will be removed from the site:
“This could include false claims about the safety, efficacy, ingredients or side effects of vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that is not on the official vaccine ingredient list. We will also remove conspiracy theories about COVID-19 vaccines that we now know to be false: like that specific populations are being used without their consent to test the vaccine’s safety.”
So, is this a big deal? Yes and no. It’s significant that Facebook is spelling out in more detail how it will handle vaccine misinformation, especially as we enter what could be the most important public health moment in modern history. Misinformation about vaccines has long thrived on Facebook, so any major ban or crackdown the company announces has the potential to matter a great deal.
The “but” here is also important and multifaceted. Facebook’s policies are only as effective as their enforcement. With health misinformation in particular, these bans will achieve their goals only if they are enforced inside the many private Facebook groups where bogus health claims are promoted and amplified. That has been a weakness of the platform’s previous attempts to crack down on harmful falsehoods.
Uneven enforcement: Even after Facebook began rolling out policies to limit the spread of vaccine misinformation in 2019 – by restricting group recommendations and hashtags that promote such messages, for example – the anti-vaccine ecosystem has continued to thrive in the site’s private spaces. Since the pandemic began, however, Facebook has been more aggressive about removing some health misinformation, citing its policy against content that could cause imminent physical harm. A few weeks ago, Facebook banned prominent anti-vaccine personality Larry Cook, along with a huge Facebook group he ran, for violating its policies on the QAnon conspiracy theory.