Twitter just cracked down on QAnon for causing harm. Facebook has been making money off the conspiracy theory in ads.
Facebook has run at least 144 ads promoting QAnon since 2019
Written by Alex Kaplan & Kayla Gogarty
Twitter has announced a major crackdown on accounts promoting the extremist QAnon conspiracy theory. The move comes as Facebook has allowed, and profited from, at least 144 ads promoting the conspiracy theory in some form over the past year and a half.
On July 21, Twitter announced that it would suspend thousands of accounts connected to QAnon, which has been tied to multiple acts of violence. According to NBC News, Twitter said it will also “stop recommending accounts and content related to QAnon, including material in email and follow recommendations, and it will take steps to limit circulation of content in features like trends and search.” NBC also reported that Twitter is now classifying “QAnon material and behavior as coordinated harmful activity,” and the network said an FBI memo warning that the conspiracy theory’s supporters presented a potential domestic terrorist threat factored into Twitter’s decision.
Since Twitter’s announcement, a CNN reporter and The New York Times have reported that Facebook is planning to crack down on QAnon in some fashion as well. Yet a Media Matters review of Facebook’s ad library found that the platform has repeatedly featured and approved ads promoting QAnon on Facebook and Instagram since at least the beginning of 2019, meaning Facebook has made money off the conspiracy theory. Using the Dewey Square Adwatch toolset to analyze Facebook ad data, Media Matters found at least 144 ads promoting “QAnon” or its de facto slogan, “wwg1wga” (short for “where we go one, we go all”), that ran on Facebook and/or Instagram since the beginning of 2019, with at least 41 of those ads running in the last 30 days alone. These 144 ads could have yielded up to $14,356 for Facebook.
Two ads approved by the platform in February 2019, for example, came from an entity called “Concerned Citizens Of America” and featured multiple QAnon hashtags, such as “#wwg1wga” and “#Q.” One of the ads even featured an image of a “Q.” Another ad, in March 2019, promoted a YouTube show that included the phrase “#Qanon The punisher,” and another, approved just this week, promoted a podcast that pushes QAnon.
Facebook has also repeatedly approved ads promoting QAnon merchandise. One ad in February 2019 from “Say What You Will Apparel” promoted a QAnon shirt, while other ads in April 2019 promoted QAnon hats described as “Official QAnon Campaign Rally Gear,” along with a “WWG1WGA T SHIRT.” In June 2019, a page called “Red Pill 45” promoted a “Q” shirt, and this past June the platform allowed ads promoting QAnon clothing and a QAnon mug to run for days. One approved ad from last October even promoted a “Q-Pizza Box.”
In September 2019, the platform also approved ads from “Q! Alerts” — a QAnon app developer whose app has since been banned from Google Play for “distributing misleading or harmful information” — selling “Q+” shirts (“Q+” is what QAnon supporters call President Donald Trump).
Additionally, the platform has approved multiple ads from political candidates featuring QAnon content. One candidate, Dan Belcher, who ultimately dropped out of his Oklahoma congressional race, ran multiple ads featuring the QAnon slogan, including video of him saying it. Other congressional candidates with QAnon-related Facebook ads have included a former candidate in California and current candidates in Florida and Hawaii.
Facebook has also allowed QAnon-related ads from candidates for other offices: It approved ads with QAnon hashtags for candidates running for the state legislatures in Hawaii and New York just this week, as well as an ad for a candidate for governor of Washington earlier this month (though the latter was eventually taken down).
Facebook has also allowed ads featuring other extremist content, such as ads promoting the “boogaloo” movement and white nationalism.