Facebook promised in August to remove content associated with “movements and organizations tied to violence,” including the QAnon conspiracy theory, which an FBI intelligence bulletin labeled a domestic terrorist threat in 2019. Even though Facebook promised to remove pages and groups featuring “discussions of potential violence,” Media Matters has identified more than 40 death threats calling for George Soros’ execution across more than a dozen public QAnon groups on Facebook.
Facebook’s QAnon enforcement problem is not new; the company has consistently struggled to hold violent QAnon groups accountable. Its initial policy update in August included limited information about previous enforcement actions: “As a result of some of the actions we’ve already taken, we’ve removed over 790 groups, 100 Pages and 1,500 ads tied to QAnon from Facebook.” At first glance, the numbers look promising, but Facebook failed to include critical information about how large or active those groups were before removal. That omission may have inflated the company’s apparent enforcement record, making it impossible to assess how much impact the removals actually had. Perhaps Facebook’s promise to actively remove violent QAnon groups was just a bid for positive PR, but the company’s inaction has created a dire social media landscape filled with growing threats of violence.
Members of QAnon Facebook groups are using the platform to post graphic calls for the execution of liberal philanthropist and billionaire George Soros. Their reasoning differs from group to group and post to post, but they generally claim that he is funding human trafficking and riots in American cities; neither allegation is true. Most of the threats directed at Soros were found in the comment sections of posts sharing these baseless far-right conspiracy theories.
Almost every popular post related to Soros that Media Matters identified had at least one blatant death threat in the comments, including suggestions to shoot him or hang him.
The posts were collected between August 28 and September 28 from 13 large, public QAnon Facebook groups via CrowdTangle, a social media analytics tool owned by Facebook. The groups were identified as affiliated with the QAnon conspiracy theory by their names or descriptions. While the groups vary, the example groups included below all either reference “Q” explicitly in their names or use the slogan “WWG1WGA” or “Q” in their descriptions. The 13 identified groups have more than 99,000 total members.
The research approach used to identify these violent groups was not exhaustive and required very little specific searching, suggesting that Facebook could be doing much more to actively identify and respond to basic patterns of QAnon supporters threatening violence on its platform. The longer Facebook takes to curb the dangerous spread of QAnon, the more these groups will continue to grow.
It is time for Facebook to start enforcing its own policies and remove QAnon groups engaging in a clear pattern of inciting violence. Below are examples of such activity.
Through The Looking Glass
Q Patriots 2020 🇺🇸