Aside from anti-COVID-19 misinformation, the video-sharing giant will also bar erroneous claims about flu and measles jabs.
YouTube will block all anti-vaccine content, moving beyond its ban on false information about COVID-19 vaccines to include material that contains misinformation about other approved vaccines, the social media giant has announced.
YouTube said on Wednesday the expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO (World Health Organization).”
The new policy will also see false claims about routine immunisations for diseases such as measles, hepatitis B and influenza removed from YouTube.
That includes cases where vloggers on the platform claim that approved vaccines do not work, or wrongly link them to chronic health effects.
The online video company is also banning channels associated with several prominent anti-vaccine activists including Robert F Kennedy Jr and Joseph Mercola, a YouTube spokesperson said.
In an emailed statement, Mercola’s website said: “We are united across the world, we will not live in fear, we will stand together and restore our freedoms.”
Kennedy, who is a member of the prominent US political family, said in a statement: “There is no instance in history when censorship and secrecy has advanced either democracy or public health.”
RT accused of violation
YouTube said it had removed more than 130,000 videos since last year for violating its COVID-19 vaccine policies.
On Tuesday, the video platform told German media that it had blocked the German-language channels of Russia’s state broadcaster RT for violating its COVID-19 misinformation guidelines.
YouTube said it had issued a warning to RT before shutting the two channels down, but the move prompted a threat from Moscow to block the video site.
YouTube is not the only social media giant grappling with how to handle the spread of COVID-19 conspiracy theories and medical misinformation more broadly.
Facebook this month launched a renewed effort to tackle violence and conspiracy groups, beginning by taking down a German network spreading COVID-19 misinformation.
YouTube said content that “falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them” will be taken down.
“As with any significant update, it will take time for our systems to fully ramp up enforcement,” YouTube added.