YouTube bans all anti-vaccine misinformation, removes ‘Disinformation Dozen’ channels
This policy goes well beyond COVID-19: the internet’s biggest anti-vaxxers are losing a major platform for spreading falsehoods about vaccines.
On Wednesday, YouTube announced a major update to its medical misinformation policy that will see the Google-owned platform ban all types of dangerous anti-vaccination content.
YouTube announced a total ban Wednesday on vaccine misinformation and the termination of the accounts of several prominent anti-vaccine influencers, including Joseph Mercola and Robert F. Kennedy Jr., citing “the need to remove egregious harmful content.”
The new policy was crafted as the company began to see false claims about Covid-19 vaccines “spill over into misinformation about vaccines in general,” according to a company blog post.
“We’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company wrote.
YouTube already had a policy against Covid vaccine misinformation, but the new ban against broader vaccine misinformation includes content that falsely claims approved vaccines are dangerous or ineffective, including the false belief that vaccines cause autism or cancer.
Anti-vaccine creators have flourished on YouTube for over a decade, moving to the Google-owned platform after traditional media stopped promoting their messaging. Anti-vaccine content was so ubiquitous that vaccine advocacy organizations were forced off the platform years ago.
“It has been incredibly frustrating to try and share good, science-based information about vaccines on YouTube, only to have the algorithms then suggest anti-vaccine content to our viewers,” said Erica DeWald, communications director of Vaccinate Your Family, the nation’s largest nonprofit group dedicated to advocating for vaccines. “We’re hopeful this is a positive step toward ensuring people have access to real information about vaccines and will signal other social media companies to follow suit.”
Earlier this year, the Center for Countering Digital Hate and Anti-Vax Watch released a report detailing how just 12 prominent anti-vaxxer influencers are responsible for around 65 percent of “anti-vaccine content” on major social media platforms like Facebook and Twitter.
The Bottom Line
YouTube’s new vaccine misinformation policies prohibit “content that falsely alleges that approved vaccines are dangerous and cause chronic health effects.” The platform will also no longer allow false claims about vaccines related to disease transmission or contraction and misinformation related to “substances contained in vaccines.”
The new policy specifically names long-standing falsehoods of the anti-vaccine movement, including claims that vaccines are used to track or monitor individuals and that they cause autism, cancer, or infertility.
The video platform says it worked with “local and international health organizations and experts” to create these new rules surrounding vaccines.
YouTube has struggled with how to handle COVID-19 content since the earliest days of the pandemic. In March 2020, prior to the lockdowns in the U.S., YouTube demonetized all content about the novel coronavirus, meaning creators making videos about COVID-19 could not earn money from them through YouTube’s Partner Program.
Just weeks later, YouTube reversed that decision and allowed creators to monetize COVID-19 content.