Facebook Expands Its Policies Against Covid-19 Vaccine Misinformation to Include Kids

Meta, the tech giant formerly known as Facebook, has partnered with the Centers for Disease Control and Prevention and the World Health Organization in a rare preemptive effort to stop misinformation before it goes viral. The partnership will help remove harmful content about the coronavirus vaccine and its effects on children. The announcement was made in conjunction with the U.S. Food and Drug Administration's approval of the covid-19 vaccine for children aged 5 to 11.

Facebook users will soon start seeing in-feed reminders about the vaccine's approval for children, along with information about where it is available. Meta will distribute the reminders in both English and Spanish.

The company is also expanding its anti-vaccine misinformation policies to eliminate false claims specifically about the vaccine and children. That includes misinformation about the vaccine's availability, efficacy, and vetting within the scientific community, such as claims that the covid-19 vaccine could kill or seriously injure children. Posts claiming that children can be protected from the virus by any treatment other than the covid-19 vaccine will also be removed.

"This is not a one-off update. It is part of an ongoing partnership with health authorities such as the CDC and WHO, both nationally and internationally," Kang-Xing Jin, Meta's head of health, wrote in a blog post announcing the partnership on Friday. "We will continue to clarify and add new claims regarding the COVID-19 vaccination for children."

The FDA approval covers the Pfizer/BioNTech vaccine, which is 90.7% effective at preventing covid-19 among children aged 5 to 11. The vaccine will be available in a child-sized version at a lower dose than the adult formulation, administered as two shots given three weeks apart. No serious adverse reactions to the vaccine were reported in clinical trials.

Meta has been criticized over the years for failing to limit the spread of misinformation on its platform, a problem the covid-19 pandemic threw into sharp relief. Right-wing conspiracy theorists and anti-vaxxers, long a presence on Facebook, quickly began flooding users' feeds with propaganda about covid-19 and the efficacy of masks, and later about the coronavirus vaccine itself.

Amid mounting criticism, Facebook implemented a number of new policies to limit or remove harmful content, though many critics say the company's response was too slow and too ineffective. President Joe Biden claimed the company had killed people by allowing false vaccine claims to circulate on its platform. And Facebook isn't done with damage control: the hashtag #VaccinesKill, which seemed like an obvious candidate for Facebook's ban hammer, remained functional until the company finally blocked it in July.


Meta announced Friday that since the start of the pandemic it has removed a total of 20 million pieces of covid-19 and vaccine misinformation content across Facebook and Instagram, along with 3,000 pages and groups and several hundred related accounts.