John Stossel attacked the methodology of a Department of Education study demonstrating nearly identical levels of academic achievement among public and private elementary school students, claiming that "[t]he researchers tortured the data" by using regression analysis -- a universally used statistical tool that even Stossel admitted is “valid.”
Statistician Stossel: Researchers in school study “tortured the data” by using standard, universally accepted method of analysis
Written by Simon Maloy
In his July 26 nationally syndicated column, ABC 20/20 co-anchor John Stossel attacked the methodology of a recently released Department of Education study demonstrating nearly identical levels of academic achievement among public and private elementary school students. Stossel, an advocate of school voucher programs, claimed that "[t]he researchers tortured the data" by conducting regression analyses. Regression analysis is a universally used statistical technique that isolates the factor being studied -- in this case, achievement by public versus private school students -- by statistically controlling for the effects of other factors. Stossel acknowledged that "[m]aybe it's unfair to call that 'torturing the data,' " noting: “Such regression analysis is a valid statistical tool.” But he then resumed his attacks, claiming that regression analysis is “prone to researcher bias” and constitutes "[s]tatistical hocus-pocus."
Stossel never explained how he came by his belief that regression is “prone to researcher bias” -- in fact, there is nothing “biased” about controlling for factors like race, income, or parents' education when examining student performance. Indeed, if one's goal is to compare student achievement in private schools to that in public schools, one has no choice but to control for other factors that also influence student performance in order to isolate the effect of the different types of schools. Controlling for the influence of other variables is not “hocus-pocus”; it is among the most basic techniques of statistical analysis, used by researchers of every ideological stripe the world over.
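To see why this kind of adjustment matters, consider a minimal simulation (using made-up data, not the study's actual numbers): suppose family income alone drives test scores, and higher-income students are simply more likely to attend private schools. A naive comparison of average scores then shows a large "private school advantage," while a regression that controls for income correctly finds no school effect at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical data: income (standardized) drives scores; higher-income
# students are more likely to end up in private schools.
income = rng.normal(0, 1, n)
private = (income + rng.normal(0, 1, n) > 0).astype(float)
score = 500 + 20 * income + rng.normal(0, 10, n)  # no true school effect

# Naive comparison: raw difference in mean scores by school type.
raw_gap = score[private == 1].mean() - score[private == 0].mean()

# Regression adjustment: score ~ intercept + private + income.
# The coefficient on `private` is the gap after controlling for income.
X = np.column_stack([np.ones(n), private, income])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_gap = beta[1]

print(f"raw gap:      {raw_gap:+.1f} points")
print(f"adjusted gap: {adjusted_gap:+.1f} points")
```

In this simulation the raw gap comes out around 20 points in favor of private schools purely because of the income difference, while the regression-adjusted gap is close to zero. That is all "controlling for other factors" means here; the same least-squares arithmetic produces the same answer no matter who runs it.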
According to the study's description on the Department of Education's National Center for Education Statistics website:
This study compares mean 2003 National Assessment of Educational Progress (NAEP) reading and mathematics scores of public and private schools in 4th and 8th grades, statistically controlling for individual student characteristics (such as gender, race/ethnicity, disability status, identification as an English language learner) and school characteristics (such as school size, location, and the composition of the student body).
Stossel also suggested that The New York Times and the “mainstream media” were “so eager to defend” the study and public schools because of “the editors' ” dislike of conservative Christian schools, the Bush administration, capitalism, and free markets.
From Stossel's July 26 column:
Then why did the new study conclude that public schools performed as well?
The researchers tortured the data.
It seems the private school kids actually scored higher on the tests, but then the researchers “dug deeper.” They “put test scores into context” by adjusting for “race, ethnicity, income and parents' educational backgrounds to make the comparisons more meaningful.”
Maybe it's unfair to call that “torturing the data.” Such regression analysis is a valid statistical tool. But it's prone to researcher bias. Statistical hocus-pocus is not the best way to compare schools. “Ideally, to ascertain the difference between the two types of schools, an experiment would be conducted in which students are assigned (by an appropriate random mechanism) to either public or private schools.” That quote, believe it or not, is from the study. But the ever-scrupulous journalists at The Times didn't find that “fit to print.”
Why are the mainstream media so eager to defend a unionized government monopoly? Maybe The Times gave the “adjusted” test data (and an earlier version of it published in January) so much play partly because of the editors' dislike of “conservative Christian” schools (which did poorly in the study) and the Bush administration (which has talked about bringing market competition to education).
But I suspect the biggest reason is that the editors just don't like capitalism and free markets.