Earlier this week, I noted that a New York Times article about the prospects for health care reform misleadingly reported that the Democrats are trying to “advance the bill despite the loss of their 60-vote majority in the Senate” -- wording that suggests the Democrats lost their Senate majority entirely, when in fact they have simply lost the supermajority needed to break a filibuster. Sure, if you happen to know the significance of 60 votes in the Senate, you might realize the Times was just being lazy in its wording -- but how many people know that?
Now comes the latest Pew Research Center News IQ Quiz, which finds that only 26 percent of Americans know that 60 votes are required to break a filibuster in the Senate. Let's assume -- just for the hell of it -- that New York Times readers are three times as likely as the general public to know this. That seems wildly generous to me -- but even so, three times 26 percent is only 78 percent, which still leaves more than a fifth of Times readers unaware of the significance of 60 Senate votes. Many of those readers likely interpreted “loss of their 60-vote majority in the Senate” to mean “loss of their majority” -- or were, at the very least, confused by the Times' language.
I know I'm obsessing over what may seem a minor point. But misleading and confusing reporting like this happens all the time, and it is easily avoidable. It underscores that news organizations would serve their readers and viewers better if they made a top priority of clearly informing them about things that matter -- and if they approached their reporting with a few basic questions in mind: What do people know? What do they need to know?
It's hard to read a typical news report about health care reform or proposals to stimulate the economy and conclude that the people who produced it were primarily trying to clearly inform readers about things they need to know. That may be the starkest disconnect between what journalists should do and what they actually do.