YouTube Aims to Stop Spread of ‘Cancer Misinformation’ With Updated Policy

YouTube logo on display during LeWeb Paris 2012 on Dec. 4, 2012. (Eric Piermont/AFP via Getty Images)
Mary Gillis

YouTube on Tuesday announced a major update to its medical misinformation policies, including a commitment to “removing cancer misinformation.”

The video-sharing and social media platform is slated to begin a mass takedown of videos promoting cancer treatments “proven to be harmful or ineffective” or content that “discourages viewers from seeking professional medical treatment,” according to an official blog post. Examples include content claiming foods like garlic cure cancer or videos that push viewers to opt for vitamin C over radiation therapy—neither of which has scientific merit, the post said.

The enforcement strategy builds upon policies and lessons learned as the media giant continues to tackle medical content saturated with inaccuracies around topics like COVID-19, vaccines, and reproductive health.

YouTube’s framework outlines a plan to protect viewers, creators, and partners by streamlining its medical misinformation guidelines into three categories: Prevention, Treatment, and Denial. Content that contradicts or disputes information from prominent health authorities or agencies like the World Health Organization (WHO) will be removed. As of now, first-time guideline offenders will receive a warning, and the violating content will be taken down. Three violations within 90 days will result in channel termination. However, this could change, as the policy has not yet been implemented.

YouTube is a massive source of information, and the users who upload medical content do not necessarily have medical backgrounds or experience; as such, some of that content may not be fully supported by evidence. A 2021 paper analyzed the content and reliability of 40 pediatric cancer clinical trial videos and found that more than half were “misleading with serious shortcomings.” A similar study published in 2022 discovered that 98 percent of YouTube videos on prostate cancer contained moderate to high levels of misinformation related to screening recommendations.

At the same time, YouTube and other social media platforms have faced criticism for labeling content as “misinformation.” The Epoch Times previously reported on a lawsuit filed by Democratic presidential candidate Robert F. Kennedy Jr. against YouTube and its parent company, Google. The suit claims Mr. Kennedy’s First Amendment rights were violated by an intentional “censorship campaign” designed to silence his views on vaccines. Meta has also been heavily criticized for a lack of transparency and consistency in deciding which COVID-19-related content to remove from its platform.

The YouTube blog authors wrote: “Looking ahead, we want to make sure there is a robust framework to build upon when the need for new medical misinformation policies arises. We’ll continue to monitor local and global health authority guidance to make sure our policies adapt. We want our approach to be clear and transparent, so that content creators understand where the policy lines are, and viewers know they can trust the health information they find on YouTube.”

Mary Elizabeth Gillis is a health reporter and cardiopulmonary specialist with over a decade of experience. After graduating with her doctorate in applied physiology, she earned a master of science degree in journalism from Columbia University.