Facebook puts tighter restrictions on vaccine misinformation targeted at children

Pfizer's COVID-19 vaccine was approved by the FDA for children between five and eleven years old. Following the approval, Meta, Facebook's parent company, announced it would enforce stricter policies on vaccine misinformation targeting children (via Engadget). Although the platform had placed restrictions on COVID-19 misinformation in late 2020, it didn't have policies specific to children.
In a blog post, Meta said it is partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to remove harmful content about children and the COVID-19 vaccine. That includes posts suggesting the vaccine is unsafe, untested, or ineffective for children. Meta will also show in-feed reminders in English and Spanish that the vaccine has been approved for children, along with information about where it is available.

Meta reports that it has taken down 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram since the start of the pandemic. Those numbers contrast with what leaked internal documents have shown. The Facebook Papers revealed just how unprepared Facebook was for misinformation about the COVID-19 vaccine. Had it been better prepared, Facebook could have launched campaigns earlier in the pandemic to counter misinformation aimed at both children and adults.