Vaccine misinformation has been around since well before the pandemic, but ensuring that anti-scientific conspiracies don’t get boosted online is more crucial than ever as the world races against the spread of a deadly, changing virus.
Now, Facebook says it will expand the criteria it uses to take down false vaccine claims. Under the new rules, which Facebook said it developed in consultation with groups like the World Health Organization, the company will remove posts claiming that COVID-19 vaccines aren’t effective, that it’s “safer to get the disease,” and the longstanding, widely debunked anti-vaxxer claim that vaccines cause autism.
Facebook says it will place a “particular focus” on enforcement against Groups, Pages and accounts that break the rules, noting that they may be removed from the platform outright.
Facebook took steps to limit COVID-19 vaccine misinformation in December, preparing the platform for the vaccine rollout while still lagging well behind the rampant spread of anti-vaccine claims. The company began removing posts containing some misinformation about the vaccine, including “false claims that COVID-19 vaccines contain microchips” and content claiming that the vaccine is being tested on portions of the population without their consent.
Why this kind of stuff didn’t already fall under Facebook’s rules against COVID-19 misinformation is anyone’s guess. The company came out of the gate early in the pandemic with a new set of policies intended to prevent an explosion of potentially deadly COVID-related conspiracies, but time and again it has failed to enforce its own rules evenly and firmly.