YouTube made money from a coronavirus vaccine conspiracy theory after claiming to have banned it
Written by Alex Kaplan
YouTube has made money from multiple videos pushing a false conspiracy theory about a potential coronavirus vaccine, despite previously claiming it had banned the conspiracy theory from the platform.
On October 14, Reuters reported that the platform had announced it was expanding its rules to “remove videos from YouTube containing misinformation about COVID-19 vaccines,” specifically “content with claims about COVID-19 vaccines that contradict consensus from local health authorities or the World Health Organization.” As an example of prohibited content, according to Reuters, the platform specifically cited the false conspiracy theory that “microchips will be implanted in people who receive the vaccine.” That conspiracy theory, often tied to Microsoft founder Bill Gates, had previously spread widely on multiple social media platforms, including YouTube -- and at least one poll suggests people have fallen for it.
Despite YouTube’s announcement, a review by Media Matters found multiple videos with at least 430,000 combined views that either directly pushed the microchip conspiracy theory or mocked rebuttals of it. (We looked on the tracking tool BuzzSumo for YouTube videos uploaded since October 15 with at least 5,000 views.) Some of the videos were monetized, meaning both YouTube and the accounts made money off of the conspiracy theory.
One video featuring ads was titled “Is The Coronavirus Vaccine, The Mark Of The Beast” and showed a clip of Gates’ wife, Melinda Gates, saying, “I know my husband is not vaccinating people and putting a microchip in their arm because that technology doesn’t even exist, and he's never uttered the words out of his mouth.” The clip was followed by a man waving his finger and shaking his head and a voiceover sarcastically saying, “Yeah, we believe you.” The voiceover in the video later said the two are “definitely not sincere and you can sense their wicked intentions,” speculating that they could be involved with creating a “form of identification” in the vaccine.
Another video featured a voiceover complaining about people microchipping their pets, saying people “automatically think that it's safe just like they automatically think that vaccinations are safe,” and suggesting that Bill Gates has a “depopulation agenda” and that people will be “chipped.” Another video from that same channel featured a clip of talk show host Stephen Colbert mocking the conspiracy theory and a voiceover saying, “Oh right, duh. That’s totally crazy but let's bring it to the public’s attention because if it was completely insane, you know, you wouldn't bring attention to it would you?” Both videos featured ads and also sold merchandise on the page (which YouTube may take a cut of).
Another video, which also had ads and sold merchandise below it, featured a man discussing the “potential that is really inside” vaccines and claiming vaccines could force people to have “a unique identifier on them,” which is like “having fingerprints.” The man mentioned Gates as one of the people possibly behind this.
In yet another video, a man discussed President-elect Joe Biden’s plan to fight the coronavirus pandemic and said, “Here's what's messed up about this whole thing. There are some camps that say you have to obey the law of the land. So they're saying you have to let them vaccinate you and possibly let them … chip you. That’s very unfortunate.” The video, which was livestreamed, had “super chats” -- where people can pay the account to have their comments featured and which YouTube takes a cut of.
Other videos uploaded since YouTube’s October announcement of the ban included some falsely claiming that Gates had a patent for microchipping people and could be involved with the vaccines, that a “chip/vaccine can monitor your sexual activity,” and that Gates and others “believe that they can ... trick people” by denying the conspiracy theory.
Thanks to YouTube’s apparent inaction over the spread of the conspiracy theory, it has reached at least one influencer: In early December, actor Letitia Wright tweeted a recently uploaded YouTube video pushing the conspiracy theory (that video has since been taken down for violating YouTube’s Terms of Service).
YouTube’s failure to crack down on the conspiracy theory despite promising to do so comes as the platform has repeatedly fallen short in enforcing its own rules and has repeatedly monetized misinformation.