One of the first things that I tell students in my Data Literacy and Data Visualization course is that, when they walk in the door, they should leave their ideological predilections behind. I don’t care whether they’re Sanders socialists or Rockefeller Republicans—the point of the class isn’t to learn how to support Team Red or Team Blue. Our goal, to borrow the beautifully succinct subtitle from Gapminder.org, is to achieve “a fact-based worldview.”
I’m far from alone in this pursuit. While my colleagues study politics for a living and often take clear positions on specific issues, the overwhelming majority are pretty circumspect about expressing any sort of party affiliation. To some extent, I think that’s because our worldviews don’t map very well to party platforms. For the most part, though, we realize that impartiality is essential to our ability to function as researchers and educators. That’s why, in my “How to Lie with Data Visualization” lecture, I point out the disingenuousness of both Washington Monthly’s change-in-change-in-unemployment graph and the Heritage Foundation’s “26 months of gas prices” graph. It’s important for young citizens to recognize that the truth has no political affiliation.
That’s why I was incensed after reading the Wall Street Journal’s editorial, “Scientific Fraud and Politics.” The Journal pounces on the LaCour scandal, arguing that the findings got a free pass into Science magazine in part because they “flattered the ideological sensibilities of liberals.” The editorial then generalizes from this one instance in such a breathtaking manner that “sweeping” doesn’t quite seem to do it justice:
Similar bias contaminates inquiries across the social sciences, which often seem to exist so liberals can claim that “studies show” some political assertion to be empirical. Thus they can recast stubborn political debates about philosophy and values as disputes over facts that can be resolved by science.
It’s easy to dismiss the Journal’s editorial page as rather extreme (or, frankly, irresponsible). But to do so misses the real importance of the issue. This argument will almost certainly come up again and again in the run-up to the 2016 elections. It will be a talking point for any politician whose positions are inconveniently at odds with scientific findings. To the extent that it resonates with voters, it will further degrade the role of scientific knowledge in guiding public policy. Worse, and perversely, the discrediting of science may give more weight to policy arguments that specifically run contrary to scientific findings.
Fortunately, there are two major holes in the Journal’s reasoning. The first is that, as Gary King argued, this is how science actually works. The fact that something like the LaCour-Green study can be discredited is crucial: as Karl Popper famously argued, a science is only a science if its claims can be disproved. The fact that studies can be challenged and their findings overturned should increase our confidence in the findings that survive the process.
Second, the majority of studies conducted before LaCour and Green (2014) pointed to a very different conclusion regarding the ability of canvassers to change people’s minds. As Green himself put it:
Conventional wisdom was that a canvasser might prompt you to rethink your stance on a controversial issue for a few days at most, but that once you went back into your social milieu, your opinion would snap back into accordance with your preexisting views.
If, as the Wall Street Journal editorial suggests, ideological biases are endemic in academic studies, why is it that such a large body of academic literature prior to LaCour-Green pointed to conclusions that were not flattering to those biases?