Madison Kwiecinski
News Editor
The popular streaming platform YouTube has recently updated its vaccine misinformation policy, expanding it to cover a variety of content that discourages people from getting medically safe vaccines.
“Vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness,” the company wrote in a statement regarding the policy released on their blog. “Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.”
YouTube’s previous community guidelines already banned medical misinformation, and they have now been expanded to additionally cover all “currently administered” vaccines that have been confirmed safe by health authorities.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said.
Under the new policy, YouTube will begin removing large numbers of accounts that violate the updated community guidelines. The site now bans videos that claim vaccines aren’t safe or effective, or that they cause other health issues such as cancer and infertility. This has been an issue for decades, with anti-vaxxer parents consistently repeating unsubstantiated claims that certain modern-day vaccines cause side effects such as developmental issues, autism, infertility, and a wide variety of other disorders that have no medical connection to vaccines.
The company is also banning accounts that make false claims about vaccine ingredients, or that claim these medically beneficial vaccinations are used as a method of “tracking” the people who receive them.
This policy is not meant to target an individual’s right to free speech; rather, it reflects the principle that free speech is protected until it endangers others. For example, you are allowed to say you dislike an airport, but you still cannot falsely yell “fire” in one without reasonable repercussions for the harm caused.
Following that principle, users are still permitted to share their personal experiences with vaccines, or to talk about how a vaccine has personally affected them. This allowance is conditional on the account not regularly promoting “vaccine hesitancy” and otherwise following the updated community guidelines.
The company, which is owned by Google, has warned that the widespread removal of these misinformation videos may take some time, but the process has already begun.
YouTube has already removed some of the pages known for sharing anti-vaccination sentiments such as those belonging to prominent vaccination opponents like Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and Robert F. Kennedy Jr.’s Children’s Health Defense organization, reported CNBC.
Many conservative pages spreading anti-vaccine misinformation are still active, as the removal process has only just begun. These videos still attract millions of viewers, which is something people should keep in mind when searching for reliable medical information.