Political scientist Brendan Nyhan points to a Boston Globe essay by Joe Keohane (based largely on research conducted by Nyhan) about the stickiness of misperceptions and the challenge this poses for those who think it is important for people not to be wrong.
Keohane notes that “Americans lack even a basic understanding of how their country works.” Or, as Princeton's Larry Bartels put it in 1996: “the political ignorance of the American voter is one of the best documented data in political science.” That, I think, is quite clearly true, and is an indictment of the news media as much as (or more than) it is a criticism of American voters.
Keohane also points out that studies have found that “misinformed people often have some of the strongest political opinions.” Again, probably not surprising.
The really troubling part, though, is that several studies have concluded that presenting people with the facts may not do much to convince them. Keohane summarizes Nyhan's findings:
Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
Keohane explains:
What's going on? How can we have things so wrong, and be so sure that we're right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn't. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge.
As Keohane explains, Nyhan suggests one solution to this problem:
Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on 'Meet the Press' and you get hammered for saying something misleading,” he says, “you'd think twice before you go and do it again.” Unfortunately, this shame-based solution may be as implausible as it is sensible.
I am a huge fan of increasing the “reputational costs” of peddling misinformation -- of not only shaming, but shunning, too.
But, as Keohane suggests, that isn't sufficient, both because there are plenty of people who are incapable of being shamed, and because neither journalists nor politicians demonstrate much interest in shaming their peers.
Keohane's essay, I think, reinforces something I've been arguing for years: the importance of repetition. It isn't enough for news organizations to occasionally correct false statements; they must do so every time they quote, paraphrase, or refer to a false statement. And it isn't enough to occasionally give readers and viewers basic information about public policy debates -- it must be done over and over again. Such an approach would, I think (hope?), have two benefits: it could make it more likely that voters internalize the truth before misinformation takes hold, and the repetition could break through the barriers presented by preconceived notions -- it seems likely that it's harder to dismiss something you hear a dozen times than something you hear once.
And that, by the way, is why I keep coming back to this point...
UPDATE: Also, the way in which false claims are debunked is important...
UPDATE 2: See also: Assessing the media's health care coverage