Facebook just announced that it is removing a notable cross-section of extremists from both Facebook and Instagram: neo-Nazi sympathizer Milo Yiannopoulos, anti-Muslim bigot Laura Loomer, far-right YouTuber Paul Joseph Watson, conspiracy theorist Alex Jones (again), and white supremacist Paul Nehlen, a failed Republican congressional candidate. The company is also removing Nation of Islam Minister Louis Farrakhan for his record of anti-Semitic rhetoric. The move is a step in the right direction, opening doors to making Facebook's platforms safer and inspiring some optimism that the tech company might be capable of taking responsibility for the ways those platforms have empowered extremists. But it is clear that there is more to do.
A long record of hate
The newly banned figures owed their influence to the massive reach they were allowed to cultivate through Facebook and Instagram, using their accounts to post content that dehumanized entire communities, promoted hateful conspiracy theories, and radicalized audiences -- all while they profited from directing people to their own websites.
After being banned from most other social media platforms, including YouTube, Twitter, and Facebook itself, Jones found a safe haven on Instagram, where he continued to post Infowars content that featured hate speech, promoted conspiracy theories, and amplified other extremists.
Similarly, Laura Loomer used her private Instagram account to post content that violated the platform’s hate speech and bullying policies, consistently spewing dehumanizing anti-Muslim rhetoric.
For his part, Yiannopoulos was banned from Twitter in 2016 for leading a racist harassment campaign against actress Leslie Jones, but the former Breitbart editor went on to use Instagram and Facebook to spread hateful anti-Muslim rhetoric and mock people of color.
Watson, who had long been affiliated with Jones' Infowars outlet, used Facebook and Instagram to push anti-Muslim content, passing off his hateful rhetoric as thinly veiled irony and regularly maligning Islam as “incompatible with western society.”
White supremacist Nehlen -- who has publicly stated that a “race war” needs to be “kick[ed] off” in the U.S. -- had already lost his Instagram account after posting anti-Semitic memes. But he still had an active Facebook page with tens of thousands of followers, which he used to share anti-Semitic dog whistles and screenshots of the neo-Nazi website The Daily Stormer, as well as to profit from running ads on the platform.
What comes next
Facebook's decision to take definitive action against some of the most glaring examples of toxicity on its platforms is welcome, if long overdue -- especially considering the tech company's record of struggling to enforce policies that effectively curb the reach and influence of extremists. The company's recent attempt to ban white supremacist content from its platforms proved insufficient: Its lack of specificity allowed extremists to continue posting racist content as long as they weren't too explicit.
However, there are still a number of achievable measures Facebook could take to make users safer and to convince the public of the company's resolve to fight extremism. Shireen Mitchell, who founded Stop Online Violence Against Women and the nonprofit Digital Sisters to promote diversity in the tech industry, has explained how Facebook's moderation policies have been weaponized to harass women of color -- especially those advocating for social change. Speaking to Media Matters, Mitchell said Facebook has banned people of color and activists like herself over posts that mention white people in the context of racism and white supremacy. Her experience is consistent with a Media Matters analysis of Facebook pages showing that white supremacist content is often treated as equivalent to content from groups that actually fight oppression, such as the Black Lives Matter movement -- seemingly treating white people as a protected group while ignoring the historical context of structural racism.
Some achievable measures that could help curb extremism while protecting users who experience oppression include:
- Commit to enforcing standards against coded white nationalism by more effectively pairing automated and human reviews to better identify violating content. Increasing the number of people tasked with platform monitoring and staffing those positions with culturally competent individuals would help identify white supremacists' use of the coded extremist rhetoric and insidious false equivalences that artificial intelligence seems to be missing. Doing so would also help curb the uncritical amplification of dangerous content such as video clips of violent hate crimes or the manifestos of their perpetrators.
- Proactively limit the visibility of content when its traffic is being directed from known toxic sources like the anonymous message boards 8chan and 4chan. As reported by NBC's Ben Collins, platforms are already able to identify traffic coming from toxic sources. In light of recent crimes in which perpetrators have gone on anonymous message boards to link to their Facebook accounts and broadcast mass shootings as extremist propaganda, the platform should more actively limit the visibility and spread of content that begins receiving high influxes of traffic from extremist sites.
- Extend anti-discrimination policies currently applied to ads to include event pages and groups. Event pages and private groups are useful organizing tools that extremists exploit to organize and mobilize; existing anti-discrimination policies should apply to content in these spaces as well.
- Reassess the fact-checking partnership with Tucker Carlson's Daily Caller, which has ties to white supremacists and anti-Semites. The Daily Caller has a long history of publishing white supremacists, anti-Semites, and bigots; just yesterday, it was revealed that The Daily Caller had fired the managing editor of the affiliated Daily Caller News Foundation (DCNF) over his connections to white supremacists. DCNF provides significant funding to the Daily Caller's fact-checking operation, Check Your Fact. Daily Caller founder Carlson constantly echoes white nationalist talking points on his Fox News show. And yet Facebook has teamed up with Check Your Fact as a fact-checker.
- Pay attention to the cross-platform influence of highly followed users. White nationalists often use platforms like Instagram to sanitize their images with lifestyle content while spreading extremist propaganda on other platforms. As Data & Society research affiliate Becca Lewis told Media Matters, influential extremists on Instagram “will simply mimic fashion, beauty, or fitness influencers, but will espouse white supremacist propaganda elsewhere. In those cases, Instagram acts as a kind of honeypot.” Lewis suggested Facebook emulate Medium's cross-platform moderation approach, in which users who violate Medium's content policies on other platforms get banned on Medium.
- Increase transparency in metrics for third-party auditors. Experts have warned about the risks of Facebook's most recent privacy initiatives, which limit Application Programming Interface (API) access for researchers (that is, access to the tools that allow individuals unaffiliated with Facebook to build software that uses Facebook data), hide Instagram metrics, and prioritize groups on Facebook (which would allow propaganda and extremism to propagate unchecked). As BuzzFeed's Jane Lytvynenko pointed out, the move makes it harder for researchers and experts to audit content and metrics on the platforms. While it might save the tech company some bad press, it hinders outside researchers in their efforts to identify and scrutinize security concerns.
“Overall, FB frames these tools as privacy improvements but it’s not halting the vacuuming of data. Privacy doesn’t amount to platform security. The changes are going to further hinder research into FB. Fixes suggested by lawmakers and academics, ultimately, weren’t embraced.” -- Jane Lytvynenko (@JaneLytv), May 1, 2019