On CBSN, Media Matters' Angelo Carusone explains how Facebook is the “engine” of the fake news ecosystem
Media Matters named Facebook's Mark Zuckerberg the 2017 Misinformer of the Year because Facebook helps “spread misinformation simply because it's performing well”
Written by Media Matters Staff
From the January 3 edition of CBS News' CBSN:
VLADIMIR DUTHIERS (CO-HOST): So, Angelo, explain the study and explain why Facebook is the most influential fake news source.
ANGELO CARUSONE (MEDIA MATTERS PRESIDENT): So, what the study did was connect two pieces of data together. They surveyed thousands of consumers of information and media -- on their patterns, on their habits, on their trust, whether or not they like fact checkers, trust fact checkers. And then they also had an enormous amount of data about how these users were engaging with information online, and what the study ultimately found were two critical takeaways.
One was that 25 percent of Americans consumed fake news in the last six weeks of the election. So one out of every four Americans was consuming fake news. And, on top of that, 30 percent of all of the fake news consumption in that period of time was driven by Facebook. So that was the first thing that it found. And the second thing it found was that the consumers of fake news disproportionately don't even like fact checkers. When they were asked about fact-checking broadly, they had very hostile and negative sentiment about it. And that means that Facebook's primary solution for the fake news problem, which is to serve these people fact-checks, is not going to work.
ANNE-MARIE GREEN (CO-HOST): Yeah, so that was going to be my follow-up question, because I thought that was really interesting: the study indicates that a significant amount of the population would prefer to get their information from fake news outlets, fake news websites, rather than the actual fact checkers, whose sole job is to make sure that they're providing accurate information. So if they don't believe the fact checkers and they prefer to believe the fake news sites, then how do you solve this problem?
CARUSONE: I think knowing that Facebook is actually a primary source of it is helpful in solving the problem. You can't force those people to change, and you're not going to be able to educate them about the importance of fact-checking. But what was really critical when you dig into this data, and other data that Media Matters has compiled, is that the engine of the fake news ecosystem is actually Facebook -- it's the traffic that Facebook provides. And what we found in our studies is that the overwhelming majority of these fake news websites aren't even ideological. They're doing it for clicks that they then convert into revenue; they're doing it for traffic. So if you take away, or reduce, their ability to get enormous amounts of traffic, they actually atrophy and go away. They stop producing this kind of content. And that points to one critical thing: Facebook right now decides what content to give you based on only one major metric, which is whether or not you're engaging with it. The more you like it, the more Facebook will show it to more people.
And instead, what Facebook should do is treat its news feed the way that Google treats search. It's not about punishing content or censoring content, but there should be an authority component. A brand-new website that just spun up and has no reputation should certainly not be treated the same as CBS News when the algorithm is deciding whether to serve it to more and more people just because it's getting a lot of clicks. That would go almost all the way toward solving Facebook's fake news problem, and we know that because even this study shows it: less than 1 percent of fake news was served up by Google in the same period of time that this study looked at, and that's because Google includes an authority component.
[...]
The solution, actually, is to come up with a policy prescription that is content-neutral and geographically neutral -- it doesn't matter who the bad actor is -- but that prevents them from exploiting the ecosystem. And that exploit right now is largely centered on the fact that Facebook wants to serve you content that you engage with, so it doesn't care where it comes from or who it's coming from. In fact, Facebook's own algorithm turbocharges content, meaning if a bunch of people like something really fast, Facebook says, “Wow, this is interesting, I'm going to send it to even more and more people, I'm going to actually help spread it.” That's why we named Facebook Misinformer of the Year: because it actually helps spread misinformation simply because it's performing well. Google doesn't do that; no good search engine does that. They have other problems, but they're certainly not helping spread intentional misinformation, whether it's from a foreign actor or just some random kid, simply because it's performing well.
And so the solution is to simply say authority matters: we're not going to turbocharge content unless it comes from a reputable source, and you determine a reputable source the same way that search engines do -- how long it's been around, whether other reputable sources have referenced it or linked to it. There are ways to solve this that don't require censorship and that don't require us to guess where the next threat is going to come from. And Facebook's refusal to actually engage on this has been a big part of the problem.
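To make the distinction Carusone draws concrete, here is a minimal sketch in Python of engagement-only ranking versus authority-weighted ranking. Everything in it is illustrative: the fields, the weights, the thresholds, and the domains are assumptions for the sake of the example, not Facebook's or Google's actual ranking signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source_domain: str
    engagements: int          # likes, shares, clicks in the scoring window
    domain_age_years: float   # how long the source has been around
    reputable_inlinks: int    # references/links from established sources

def authority(post: Post) -> float:
    """Crude authority score in [0, 1]: older domains that reputable
    sources link to rank higher; a brand-new site with no reputation
    scores near zero. Weights and caps are arbitrary placeholders."""
    age_score = min(post.domain_age_years / 10.0, 1.0)
    link_score = min(post.reputable_inlinks / 100.0, 1.0)
    return 0.5 * age_score + 0.5 * link_score

def engagement_only_rank(post: Post) -> float:
    # Engagement-only ranking as described: performance is all that matters,
    # regardless of where the content comes from.
    return float(post.engagements)

def authority_weighted_rank(post: Post) -> float:
    # The proposed fix: viral content from a no-reputation source is not
    # "turbocharged" the way the same engagement numbers from an
    # established outlet would be.
    return post.engagements * authority(post)

# Hypothetical examples: a hoax from a just-spun-up site and a story from
# an established outlet, with identical engagement.
viral_hoax = Post("justspunup.example", engagements=50_000,
                  domain_age_years=0.1, reputable_inlinks=0)
news_story = Post("cbsnews.com", engagements=50_000,
                  domain_age_years=20, reputable_inlinks=500)

for p in (viral_hoax, news_story):
    print(p.source_domain,
          engagement_only_rank(p),
          round(authority_weighted_rank(p), 1))
```

Under engagement-only ranking both posts score identically (50,000); with the authority weight, the no-reputation site's score collapses (250 versus 50,000), which is the atrophy effect the transcript describes when fake news sites lose their traffic.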
Previously:
Misinformer of the Year: Facebook CEO Mark Zuckerberg
New study: Facebook “the most important mechanism facilitating” fake news