A number of seemingly harmless conspiracy theory TikTok accounts appear to be peddling dangerous misinformation to their unknowing audiences. Even worse, TikTok’s recommendation algorithm appears to encourage users to follow accounts that push similar extremist misinformation.
While these accounts may look benign or silly on the surface, a deeper dive reveals a darker truth: They’re also disseminating far-right conspiracy theories.
Conspiracy TikTok, also known as “ConspiracyTok,” is a community that regularly discusses conspiracy theories. Because of the community’s massive size and the popularity of its content, discussion topics vary widely. Some accounts are dedicated to documenting alleged proof of extraterrestrial life; others solely post flat earth conspiracy theories.
Research published in 2014 by the University of Chicago found that about “half of the American public consistently endorses at least one conspiracy theory.” Conspiracy theories have been (and most likely always will be) popular, but not every conspiracy theory is built the same -- and some have the potential to cause material harm to those who believe them.
TikTok is encouraging users to follow far-right conspiracy theory accounts
Beyond the innate popularity of conspiracy theories, TikTok’s account recommendation algorithm (which is tailored to the “interests” or “connections” of an individual user) makes it easier for users to be pulled into a world of radical content. In one instance, after a user followed a seemingly harmless flat earth account, they were prompted to follow a slew of accounts pushing anti-vaxx misinformation, QAnon-related theories, COVID-19 denial, and anti-Semitic conspiracy theories. The pattern repeated as the user continued to follow TikTok’s account recommendations.
The camouflage
A user curious about conspiracy theories runs the risk of inadvertently getting sucked into a far more malicious rabbit hole, which is why seemingly harmless conspiracy theory accounts that post far-right conspiracy narratives are uniquely dangerous.
For example, “Conscious Content” is an account with over 11,300 followers and an innocuous bio that reads, “Learn and inspire!” Some of its first videos are about Atlantis, TV show predictions, and the “amazing intelligence of mushrooms.” However, a closer examination of the profile reveals that the creator also reposts clips in support of far-right conspiracy theorist Alex Jones and believes that Jeffrey Epstein was an Israeli spy.
This is not an isolated example, and in fact seems to be a pattern among other popular conspiracy theory accounts.
Another user, “jeff.speaks.facts,” has over 157,500 followers and over 2.3 million likes. Their bio reads “Jesus is Lord and Savior,” and the account appears to push conspiracy theories about celebrities. Yet again, a slightly closer look reveals that they have pushed the wildly anti-Semitic Rothschild conspiracy theory, and that single video received over 24,800 likes.
Similarly, “we.are.the.cure,” an account with over 31,900 followers, frames itself as a spirituality account -- pushing conspiracy theories through a religious lens. However, mixed in is a video pushing the Rothschild conspiracy theory with a picture of what appears to be the devil above the family’s name. The caption of the video encourages users to tag someone who “doesn’t know this.” The account also uploaded a 12-part video series about the Illuminati and its “secret plans for the world.”
Some of the dangerous conspiracy theories circulating in ConspiracyTok
Some accounts that appear to focus on niche conspiracy theories or spiritual enlightenment are also posting dangerous COVID-19 misinformation and QAnon conspiracy theories, deceiving unsuspecting followers.
Misinformation about adrenochrome (a substance QAnon followers believe is harvested from the blood of children and then consumed by “global elites”) appears to circulate widely in conspiracy theory communities.
- “ConspiracyRebels” has a profile picture of a triangle and eye, commonly associated with the Illuminati, and many of the account’s latest posts are about ancient aliens. Yet just days earlier, the same account posted a video purporting to show “Adrenochrome” with the caption, “They Are Lying To Us.”
- “Jabarr,” another conspiracy theory account with over 17,200 followers and a bio reading “Knowledge is Power. Knowledge is truth,” also posts about adrenochrome.
- “Deep Down The Rabbit Hole,” an account that claims to focus on “health” and “spirituality” and has over 7,500 followers, posted a video about adrenochrome. The account also uses the QAnon-affiliated hashtags “#thestormisuponus,” “#deepstate,” and “#cannibalism.”
- An account with over 35,400 followers and “Conspiracy is facts” written in its bio seems like a harmless conspiracy theory account, but it frequently posts Pizzagate and QAnon conspiracy theories.
- “Infinite.energy,” an account with over 127,500 followers and over 1.6 million likes, presents itself as a spirituality account, posting videos about “creating your own reality” and “how to manifest.” A deeper dive shows that the account has promoted conspiracy theories about the New World Order and has used the hashtag “Q.”
COVID-19 misinformation widely circulating on TikTok is a documented problem, and an issue that the platform has promised to aggressively combat. Yet harmful anti-vaccination and COVID-19 misinformation routinely circulates in conspiracy theory spaces.
- “This shot will rearrange your DNA. They’ve planned this for one hundred years, it is the mark of the beast,” says “ember_inside_me1,” a conspiracy theory account with the Illuminati eye icon as their profile picture. The account has over 27,500 followers.
- One account called “TruthSeeker1111” with the bio “Truths, yoga, self inquiry” seems to be a spirituality and enlightenment account. Yet, the account is also peppered with anti-vaccination and COVID-19 denial videos.
- Another conspiracy theory account, “Opened Eyes,” claims to “aid spiritual growth” in other users and has over 15,500 followers. Many of their posts preach enlightenment, but scattered in their feed is a variety of COVID-19 misinformation. “You probably won’t be getting the vaccine…right? Educate people why…” reads overlaid text.
TikTok is failing its users
By failing to diligently moderate extremist content on its own platform, TikTok is allowing far-right misinformation to spread rapidly to an audience of young users. All of the extremist content identified in this report is supposedly prohibited by TikTok, yet it remains widely circulated.