Angelo Carusone on AM Joy: In order to combat white supremacy, tech companies must embrace a holistic approach to enforcing terms of service

From the May 5 edition of MSNBC's AM Joy:

SHIREEN MITCHELL (FOUNDER OF STOP ONLINE VIOLENCE AGAINST WOMEN): We've watched both Facebook and Twitter say out loud that they won't ban hate speech or use their algorithms to stop hate speech. And let's be clear: We're talking about hate speech here, not conservative bias. They won't ban hate speech because the algorithms might sweep up politicians -- and why is that a problem? Because politicians are peddling hate speech. We should stop that, we should acknowledge that, and not call this conservative bias, because that's not what this is. Both of these platforms have done that, and both understand that this is part of the process. And let's go back: Facebook had originally said that white nationalism was fine on its platform. Now it says that's not fine, and that part of the conversation is being missed, because they allowed it in the first place, they accepted it, and they participated in it. That's the part of the conversation we're still avoiding. This includes my own case: I was removed from YouTube in 2009 for speaking up about race discrimination and diversity issues at the tech companies, and yet Alex Jones got to say whatever he wanted. Only around the time they took Alex Jones off did they decide to give me my account back. Until the platforms accept all of that, we have a problem in the fundamental frame of the tech companies in general.

JOY REID (HOST): Yeah. So Angelo, as an activist here -- Media Matters monitors this and actively weighs in on it -- what can the tech companies do about that?

ANGELO CARUSONE (PRESIDENT OF MEDIA MATTERS): I think we should go back to 2016 and look at what lessons were learned and how. In May of 2016, Mark Zuckerberg met with all those conservatives who were complaining about censorship, and then he made a massive overhaul to the trending topics section based on no data whatsoever. Immediately after that, there was a threefold increase in the reach of fake news -- the real fake news that's on Facebook -- and it finally exceeded the reach of actual news outlets. So they got worked. What we're really talking about today with these bannings is not just the banning of these individuals; I'd like to take a step back and look at the rules.

What are the rules? And to Shireen's point, they did. First, we had to get the rules about what even counts as white nationalism changed. That took advocacy; it took civil rights organizations and activists speaking up and getting Facebook to change its rules and say, oh yeah, we're going to include white nationalism and white supremacy and white separatism. Then separately -- and this is where it comes full circle -- there's the enforcement side. The one thing about this action that makes me feel a little bit better is that instead of the basic keyword analysis they used in the past, they took a more holistic view: What are these figures' activities and actions, and are they actually contributing to harm to individuals? Taking that broader view allows for a more robust conversation. It eliminates some of the free speech and censorship concerns about suppressing ideas, while it also strengthens enforcement against people who are trying to game the system -- people whose activities are both abusive and designed to suppress the engagement and participation of all kinds of communities, in particular women and people of color.

The full segment is available here.

Previously:

Facebook just removed six extremists from its platforms. Here's what should happen next.

Facebook said it was banning Infowars content from its platforms -- but several associated pages are still up

NBC's Ben Collins explains Twitter's failure to combat white supremacist content