This policy goes well beyond COVID-19.
The internet's biggest anti-vaxxers are losing a major platform where they spread their falsehoods about vaccines.
On Wednesday, YouTube announced a major update to its medical misinformation policy that will see the Google-owned platform ban all types of dangerous anti-vaccination content.
As a result of the policy update, YouTube has banned prominent anti-vaxxers, such as Joseph Mercola and Robert F. Kennedy Jr., from its service. Mercola and Kennedy are part of a group that misinformation researchers have dubbed the "Disinformation Dozen."
Earlier this year, the Center for Countering Digital Hate and Anti-Vax Watch released a report detailing how just 12 prominent anti-vaccine influencers are responsible for around 65 percent of anti-vaccine content shared on major social media platforms like Facebook and Twitter.
The Disinformation Dozen gained widespread attention over the summer after President Joe Biden referenced the report when criticizing Facebook while talking to reporters.
YouTube's latest policy update expands on the company's existing COVID-19 misinformation rules.
"We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," said YouTube in its statement.
YouTube's new vaccine misinformation policies prohibit "content that falsely alleges that approved vaccines are dangerous and cause chronic health effects." The platform will also no longer allow false claims about vaccines related to disease transmission or contraction and misinformation related to "substances contained in vaccines."
The new policy specifically calls out some long-standing falsehoods within the anti-vaxxer movement, such as claims that vaccines are used to track or monitor individuals, or that they cause autism, cancer, or infertility.
The video platform says it worked with "local and international health organizations and experts" to create these new rules surrounding vaccines.
YouTube has struggled with how to handle COVID-19 content since the earliest days of the pandemic. In March 2020, prior to the lockdowns in the U.S., YouTube demonetized all content about the novel coronavirus. Creators who were making videos about COVID-19 during that time could not make money from them via YouTube's Partner Program.
Just weeks later, YouTube reversed that decision and allowed creators to monetize COVID-19 content.
The evolution of YouTube's current medical misinformation policies can be traced back to the end of last year. In an effort to curb misinformation from spreading in the months before the vaccines became available, YouTube banned COVID-19 vaccine misinformation in October 2020.
YouTube says it has removed more than 130,000 videos found in violation of its COVID-19 vaccine misinformation policies since last year.