YouTube announced on Wednesday that it will begin banning accounts that promote anti-vaccine misinformation from its platform.

The video platform, which is owned by Google, previously had a similar ban on misinformation about Covid-19, but the new measure expands the policy to cover anyone who spreads false information about any approved vaccine, not just those related to the current pandemic.

Prominent channels run by Dr. Joseph Mercola and Robert F. Kennedy Jr. are among the biggest removals so far. Dr. Mercola, a 67-year-old physician from Florida, amassed more than 2 million followers while spreading false information about vaccines. Kennedy, son of the assassinated senator Robert F. Kennedy, had previously been banned from Instagram for his anti-vaccine posts.

In a blog post titled “Managing harmful vaccine content on YouTube,” the video platform stated that it would be banning “content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines.”

The ban also covers channels that claim vaccines cause autism, cancer, or infertility, or that they contain substances that can track recipients' locations, all false claims that have been widely debunked by the medical community for years.

The company said that it “consulted with local and international health organizations and experts in developing these policies,” and wanted to make sure that viewers searching the platform for answers would receive accurate information.

“Today’s policy update is an important step to address vaccine and health misinformation on our platform,” the blog post said, “and we’ll continue to invest across the board in the policies and products that bring high quality information to our viewers and the entire YouTube community.”

A spokeswoman told The New York Times that other prominent anti-vaccination activists such as Erin Elizabeth and Sherri Tenpenny would also be banned from the platform.

Facebook and Instagram have previously attempted to remove similar misinformation from their platforms, but both remain hotbeds of it. For people spreading the false claim that ivermectin, an antiparasitic drug commonly used to deworm horses, can prevent the virus, Facebook is still the most popular online destination for discussion.

Twitter, another social network, has suspended politicians such as Georgia Representative Marjorie Taylor Greene for misinformation in the past, but often allows users to return after temporary suspensions under its “five strike” rule. As of Wednesday morning, Dr. Mercola’s and Robert F. Kennedy Jr.’s accounts remained active on both Facebook and Twitter, according to The Washington Post.

Dr. Mercola, part of a group dubbed the “disinformation dozen” by critics, was the lead promoter of misinformation about vaccines online, according to a study by the Center for Countering Digital Hate. The nonprofit group found that just a dozen influencers accounted for roughly 65% of the anti-vaccine messaging on social media.

Just this past July, the White House cited the study in a battle with Facebook as the administration argued for stricter rules about what can and cannot be published on the social networking site.

Over the past year, YouTube says it has removed more than 130,000 videos that spread false information about Covid-19, but the company acknowledged that some videos exist in a gray area. Videos discussing vaccine skepticism do not always promote false claims, the site said, calling these “borderline videos.”

According to the company, the plan is to leave these videos up but keep them out of search results, surfacing videos from verified health professionals and organizations at the top of the page instead of user-created content.

The process is expected to take some time as YouTube scours its platform for anti-vaccination content. NBC News reported that most of the accounts with large followings have been taken down, but some users with smaller follower counts remain active.