Maybe political scientist Brendan Nyhan is closing in on the truth.
Nyhan, of Dartmouth College, appeared in a New Hampshire Public Radio interview on Dec. 7, 2015.
On the bad side, Nyhan still clings to his dubious fact-checking “backfire effect”:
“My research suggests that it’s very hard to change people’s minds about these very controversial issues and political figures. And in some cases, corrective information can actually make the problem worse.”
The problem? Nyhan’s “backfire effect” cropped up when his experimental subjects had reason to mistrust the correction. It’s natural for someone who believes a statement to trust that statement all the more in the face of flawed criticism.
We think the implications for fact-checking are obvious. And it looks like Nyhan’s poised to cinch the connection:
“The job of fact checkers is to set the record straight, not to change people’s minds about who they’re going to vote for necessarily. But at the same time I think we can do a better job with fact checking of making the information in them credible to people who might otherwise be skeptical. One important way to do that is to draw on information that’s going to be credible (to) the audience we might expect to resist the fact check.”
Not coincidentally, we have always used that approach at Zebra Fact Check. We stick with sources skeptical members of our audience are likely to accept. We place a keen focus on avoiding fallacious reasoning in our arguments. We use words and phrases from the tribal vocabularies of both liberals and conservatives.
We’re confident Nyhan’s “backfire effect” correlates with fact checks that fail to follow Nyhan’s advice above. The “backfire effect” will occur with fact checks that make a weak or fatally flawed case.
But maybe Nyhan’s just reluctant to admit his earlier research was flawed:
“It really is a fundamental aspect of human psychology to want to believe something even if it may not be true. It’s hard for us to admit that we’re wrong and that our side is wrong.”