By Staff Writer Meg Sullivan
YouTube’s algorithm has been recommending videos of YouTubers promoting disproven cures for cancer. The BBC investigated YouTube in ten languages and found about eighty videos spreading misinformation about ways to cure cancer. Strangely, YouTube tends to run ads from big-name brands before these fake cancer cure videos.
YouTube’s algorithm is designed to deliver the most personalized content to each user. YouTube analyzes viewers’ behavior based on what videos they watch so it can recommend and promote similar videos. However, this system is not perfect, for it does not know which videos contain credible information and which do not.
Some YouTubers have claimed they have a “cure” for cancer using substances such as turmeric, baking soda, fennel seeds, and vegetables. There have also been claims of juice diets, extreme fasting, and donkey’s milk as “cures” for cancer. Professor Justin Stebbing, a cancer expert at Imperial College London, explained why these claims are considered fake.
Stebbing says there are no data or clinical trials to suggest juicing can cure cancer. In addition, Stebbing responded to the baking soda claim with, “some alternative therapies such as baking soda when consumed in certain amounts or in certain ways, I think can be harmful as well as not having any positive effects. They can actually be detrimental to health. And I think baking soda is one such example. An Italian doctor [has been] convicted of manslaughter for using it as a treatment.”
Ads from huge brands like Samsung, Grammarly, and Heinz ran before these videos, implying the YouTubers were getting paid for these clips of misinformation. The big-name brands whose ads appeared before the videos denied any connection to them and contacted Google to have their ads removed from fake cancer cure videos.
YouTube has claimed, “We have taken a number of steps to address this, including showing more authoritative content on medical issues and removing ads from videos that promote harmful claims. Our systems are not perfect, but we’re constantly making improvements, and we remain committed to progress in this space.”
I believe that since YouTube’s algorithm is a computer program, it cannot instinctively recognize what is credible and what is not. There are millions upon millions of videos on YouTube, but there should be a way to prevent these types of videos from existing. YouTube’s Community Guidelines already ban harmful content that promotes “dangerous remedies or cures: content which claims that harmful substances or treatments can have health benefits.”
What is tricky, however, is that while YouTube has reduced recommendations of content that could harmfully misinform, this change only affected a small set of videos in the United States and did not apply to languages other than English. Most of the fake cancer cure videos were in languages other than English.
When the BBC confronted the YouTubers behind the unproven cancer cure videos, many either deleted their videos or went silent. Their evasive reaction to the confrontation is telling. It is sickening that they were making money off of videos that could really harm someone.
Stebbing gave a warning: “I’m not saying that conventional medicine has all the answers, because it doesn’t. But what I am saying is that be very careful with some alternative remedies on the unfiltered internet making totally baseless claims.” It seems unreal what some people will put out into the world on the internet, but everyone should keep these words of wisdom in mind.