QAnon groups are taking advantage of Facebook's inconsistent ban enforcement to coordinate jumps to other platforms
Written by Olivia Little
Research contributions from Alex Kaplan & Rhea Bhatnagar
Yesterday, Facebook announced that it is removing QAnon-affiliated groups and pages from the platform, an action that is long overdue. However, almost 24 hours later, many QAnon groups remain accessible, giving followers of the conspiracy theory a space to actively discuss tactics for evading the digital purge and allowing group administrators to organize their members to jump ship to other, less-moderated platforms.
Followers of the QAnon conspiracy theory allege that President Donald Trump is secretly fighting members of the “deep state,” a supposed cabal of satanic high-ranking officials who they claim are secretly operating global pedophile rings. QAnon followers use a variety of “Q drops” -- clues posted to online message boards from the central figure in the conspiracy theory, known as “Q” -- to substantiate their beliefs. QAnon supporters have been linked to numerous threats and acts of violence in recent years, and an FBI field office listed the conspiracy theory as a potential domestic terrorism threat in May 2019.
By announcing the ban without thoroughly removing QAnon accounts, Facebook gifted these groups critical extra time to rehome their tens of thousands of followers on platforms such as Parler and MeWe, giving the conspiracy theory movement another chance at life online -- a reality that even QAnon accounts themselves are acknowledging. (See examples here and here.)
The story is much the same on Instagram, which is owned by Facebook, with QAnon-promoting accounts and their followers discussing plans for content and community migration. This should not be framed as a positive PR moment for Facebook, but rather as a lesson for other social media platforms about the consequences of failing to comprehensively remove QAnon content before it can spread.
This follows the pattern set by Reddit, which finally banned the pro-Trump subreddit “r/The_Donald” in June but had previously implemented gradually harsher restrictions, giving moderators time to organize a jump to a backup forum, TheDonald.win. That site has since become a new home for other banned or incendiary subreddits filled with toxic content. Before its ban, r/The_Donald had been the source of frequent violent discussions and platform violations.
As a result of Facebook’s gradual enforcement rollout, Media Matters has identified active QAnon Facebook groups discussing plans to jump from Facebook to alternative platforms like Parler, as well as other fringe social sites. By warning of upcoming restrictions but not universally pulling the plug on QAnon groups, Facebook has enabled members to organize their migration to other platforms where they can continue spreading dangerous misinformation and conspiracy theories, and where -- as on TheDonald.win -- they may face even less moderation.
Facebook has enabled the exponential growth of QAnon by allowing its followers to exist and recruit on the site. Congratulating Facebook for purging the conspiracy theory from its platform at this point is like thanking someone for belatedly cleaning up a mess they allowed to fester for years. The damage is done.