Meta Rolls Out Stricter Policies for Covid Vaccine Misinformation Targeted at Children
1 Nov, 2021 / 08:23 am / Reeny Joseph

Meta, Facebook’s newly rebranded parent company, announced that it is rolling out stricter policies for COVID-19 vaccine misinformation targeted at children. The platform first restricted COVID-19 vaccine misinformation in late 2020, but until now had no policies specific to children.

Meta says in a new blog post that it is partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to take down harmful content related to children and the COVID-19 vaccine. This includes any posts implying that the COVID-19 vaccine is unsafe, untested, or ineffective for children. Additionally, Meta will show in-feed reminders in English and Spanish that the vaccine has been approved for kids, along with information about where it is available.

Meta notes that it has taken down a total of 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram since the beginning of the pandemic. Those numbers sit uneasily alongside the leaked internal documents known as the Facebook Papers, which made clear how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it might have rolled out campaigns to combat misinformation earlier in the pandemic, for both children and adults, and removed more false content as a result.

The U.S. Food and Drug Administration has authorized emergency use of the Pfizer-BioNTech COVID-19 vaccine for children 5 through 11 years of age. The authorization was based on the FDA’s thorough and transparent evaluation of the data, which included input from independent advisory committee experts who overwhelmingly voted in favour of making the vaccine available to children in this age group.