PolitiFact’s misleading ‘Truth-O-Meter’ explainer

We saw something funny on PolitiFact’s Facebook page the other day. It was a video purporting to answer questions about PolitiFact’s trademarked “Truth-O-Meter” system for rating the relative truth of political statements. We would count the explanatory video as an unmitigated disaster.

Much of PolitiFact’s video deserves critical commentary. We invite readers to watch the embedded video via YouTube before we start our critique.

Best Information Available?

Our video host, PolitiFact’s editor of audience engagement Josie Hollingsworth, answers a question about whether PolitiFact ever changes a rating by claiming “All our fact checks are thoroughly researched with the best information available at time of publication.”

If PolitiFact wanted to claim that it tries to thoroughly research all its fact checks with the best information available at the time, we could take PolitiFact’s word for it. But when PolitiFact raises the bar from trying to do those things to having succeeded in that attempt with “all” its fact checks, we cannot accept the claim at face value without proof.

In 2021, in fact, PolitiFact removed from its “database” a fact check it published in 2020. In it, PolitiFact pushed the view that the COVID-19 virus was not a result of human tampering. An editor’s note on the archived fact check explained that the view it pushed is “now more widely disputed.” Even if that’s true, it was always disputed. The dispute simply didn’t count for anything in the archived version of the fact check. PolitiFact made that clear in its concluding paragraphs:

The genetic structure of the novel coronavirus, which has been shared by thousands of scientists worldwide, rules out the possibility that it was manipulated in a lab.

PolitiFact made its assertion sound beyond dispute even though it was not.

There’s no good reason to accept Hollingsworth’s claim. There’s reason to find the claim false.

PolitiFact Makes Mistakes

Hollingsworth admits PolitiFact makes mistakes that need correcting “from time to time.”

It’s true that PolitiFact makes mistakes, so credit to Hollingsworth there for telling the truth. But “from time to time” counts as the type of vagueness that may easily mislead her audience. What does “from time to time” mean?

PolitiFact has a page it says shows all its corrections and updates. As of Sept. 28, 2021, that page shows three fact checks corrected/updated in September 2021 and three articles corrected/updated in the same month. But PolitiFact’s method of tracking its updates and corrections tends to hide the correction rate. The lists are ordered by publication date, not by the date of the correction. So, if PolitiFact in September 2021 corrects a fact check published in August (any August!), that entry on the list will appear according to its date of publication.

That update method creates a lag in the statistics. We do not know how many corrections PolitiFact will make to September 2021 material by November, for example.

Moreover, PolitiFact simply does a poor job of staying up to date on its corrections and updates. We obtained a correction on this Sept. 20, 2021 article on Sept. 22, 2021. But it does not appear on PolitiFact’s list of corrected or updated stories. There’s no guarantee it ever will.

To obtain a more representative estimate of PolitiFact’s monthly error rate, we could look at the month of September 2020 for both fact checks and articles. But PolitiFact for now offers no means of looking up its full list of corrected articles. The corrections and updates page shows only “Latest Articles,” with no link to a full list.

“Are the Ratings Sometimes Wrong?”

About 90 seconds after PolitiFact introduces this question, PolitiFact’s video explainer will admit that its “Truth-O-Meter” is a gimmick. The ratings, as we have documented over the years, are substantially if not wholly subjective (more).

How can PolitiFact’s opinion about the relative truth of a claim possibly be wrong? It’s an opinion.

Hollingsworth gamely tried to answer the question as though “Truth-O-Meter” ratings fit in the realm of objective judgment. She said PolitiFact rarely issues different ratings based on the “same set of facts.” Indeed, why do the exact same fact check more than once? But her claim simply ignores things like gender wage gap ratings that inexplicably draw from both ends of the “Truth-O-Meter” even when the justifications for the different ratings are virtually identical.

“We Publish Hundreds of New Fact Checks Every Month”

In flacking for PolitiFact’s historical archive of fact checks, Hollingsworth said: “We publish hundreds of new fact checks every month with the latest information.”

Hundreds every month? We were instantly skeptical.

We surveyed the fact checks and articles PolitiFact published in August 2021. PolitiFact published 155 works in total, 133 fact checks and 22 articles. That’s short of 200, which we consider the minimum number to legitimately count as “hundreds.”

For this article we did another survey, choosing March 2021 as the target month (plenty of days, and no shortage of political activity). We counted 186 fact checks and 40 articles, totaling 226 published works. Counting only the fact checks falls short of “hundreds,” but the total clears the bar under a looser definition that counts PolitiFact’s articles as fact checks.

Those results made us want a third sample. Our August sample was the month immediately preceding the September release of PolitiFact’s video, so we chose the full month with the next-closest proximity to the video’s publication: July 2021. For July, we counted 144 fact checks and 16 articles, totaling 160 published works.

We conclude Hollingsworth exaggerated the number of fact checks PolitiFact publishes every month.

PolitiFudged Qualifications

Hollingsworth mentioned a pair of questions PolitiFact receives at about the one-minute mark of the video. First, “What are your qualifications?” Second, “Who on staff can participate in a ratings decision?”

Hollingsworth appeared to fudge both answers. She did not give the general qualifications for PolitiFact fact checkers. Instead, she appeared to address the question of qualifications for staff who may cast a vote on the “Truth-O-Meter” rating. She effectively conflated the two questions into one:

The answer is that everyone in a ratings jury (three people–ed.) has multiple years under their belt at PolitiFact. And they’re familiar with the PolitiFact Rating System.

As Hollingsworth moved on to describe the “Truth-O-Meter” process, the video showed four PolitiFact staffers discussing (we assume) a rating.

Who’s the fifth wheel in the group of four? Three of the four were editors, presumably the voting members of the jury. The fourth was Miriam Valverde, a staff writer.

PolitiFact’s Aug. 24, 2021 video, “How PolitiFact decides ratings,” mostly explained what’s going on. That earlier video showed editor Rebecca Catalanello and Valverde discussing what rating to use on Valverde’s fact check. The same video then said the two took the issue to two other editors. The four then discussed the rating, and the three editors voted on it.

Did the reporter not participate in the decision by virtue of not having a vote? That looks like Hollingsworth’s reasoning.

State Franchise ‘Star Chambers’ a Thing of the Past?

As an added twist, PolitiFact has in the past said that state PolitiFact franchises use their own “Truth-O-Meter” jury where possible. That presents newly minted franchises with the difficulty of populating their juries with editors having multiple years of experience with PolitiFact.

What are their qualifications? They are journalists.

Who can participate in a ratings decision? All the writers and editors get to participate in the discussion. Three get to vote. Maybe the three voting members of the jury have more than two years of experience at PolitiFact. But it’s unlikely that is always the case unless PolitiFact has stopped the state operations from running their own juries. And even if PolitiFact made that change, it remains doubtful that Hollingsworth’s assurance applied before the change.

We might add that no “Truth-O-Meter” jury vote in 2007 or 2008 featured a member with more than two years of experience with PolitiFact’s rating system. PolitiFact started out in 2007.

‘A Reasonable Conclusion Using the Truth-O-Meter’

Hollingsworth continues her sales pitch for the trustworthiness of the “Truth-O-Meter” by assuring the audience that “Experienced staff using our questions about jurisprudence, interpretation, and more can come to a reasonable conclusion using the Truth-O-Meter.”

Hollingsworth’s description suggests PolitiFact may have a formal rubric for handling “Truth-O-Meter” ratings. But the description fell far short of giving any real assurance that the final decision represents anything more than a majority opinion. Indeed, Hollingsworth allowed that sometimes the jury cannot even reach a majority opinion in which two of the three agree on one rating. A rubric based firmly on objective criteria would drive a fact-driven jury inexorably and unanimously to one “Truth-O-Meter” rating.

“Reasonable conclusion” in the context of the “Truth-O-Meter” means “opinion.”

‘Truth-O-Meter’: What’s in a Name?

After her description of the voting process, Hollingsworth turned to the question “Why call it a Truth-O-Meter?”

Hollingsworth said “It is not a scientific instrument” as a lead-in to comments by Neil Brown, president of the Poynter Institute, which owns PolitiFact. Brown was the head of the St. Petersburg Times when the Times created PolitiFact in 2007.

Brown:

As I always said about the Truth-O-Meter, it’s a gimmick. But as gimmicks go, it’s a really good one.

What makes a gimmick a good gimmick? We can only guess as to Brown’s view on that, for his description ends there. But Hollingsworth stepped in to build the gimmick up as respectable social science:

Ratings are based on three people deciding on the relative truthfulness of a claim. At the end of the day, it’s a judgment call using a tried and tested system we’ve honed over the years.

Our ratings are for you, the reader, to help place political rhetoric on a spectrum of truthfulness.

It’s a social science, not a quantitative one.

A gimmick that doubles as legitimate social science seems about as plausible to us as a floor wax that doubles as a dessert topping. “Tried and tested system” communicates reliability, not gimmickry.

In fact, Hollingsworth’s description seemed to contradict PolitiFact’s past position on its “Truth-O-Meter.” In 2009, for example, PolitiFact founding editor Bill Adair said the ratings are not social science:

Our ratings are journalism, not social science, after all, and the items are chosen based on our news judgment and staffing, not randomly selected.

We think Hollingsworth’s misperception about the “Truth-O-Meter” crept into the explainer video by mistake, expressing a view not held by organization leadership.

The “Truth-O-Meter” is a subjective gimmick, not social science. The name misleads as to its true nature, and PolitiFact’s descriptions inadequately clarify that nature.

Doubling Down on Deception

From the start, PolitiFact creator Bill Adair saw the fact-checking project as a means to create an informative database allowing readers to draw a picture of candidate truthfulness. The “Truth-O-Meter” was built to assist that aspect of the project with an attractive and graphic presentation. But Adair and PolitiFact have never plausibly addressed the problem of bias.

In 2011, Eric Ostermeier stumped PolitiFact by asking how it can justify presenting its ratings with a frame that suggests Republicans lie more without using a scientific approach. Joseph E. Uscinski and Ryden W. Butler followed in 2013 with “The Epistemology of Fact-Checking,” which pointed out numerous ways fact checker bias could influence ratings.

PolitiFact’s traditional inadequate response? The ratings are not meant as social science and PolitiFact is not saying Republicans lie more.

PolitiFact’s 2021 video explainer reverses course on whether the “Truth-O-Meter” counts as social science, making the admission that it is not a scientific instrument even more hollow than it was in years past.

We remain appalled at PolitiFact’s difficulty in reporting accurately about itself.

The “Truth-O-Meter” has misled PolitiFact’s audience since 2007, and the 2021 video explainer makes things worse.

Update/correction Dec. 14, 2021: The original version of this fact check stated “We obtained a correction on this Sept. 20, 2021 article on Sept. 22, 2022.” The correction was obtained on Sept. 22, 2021. Zebra Fact Check is indebted to Riley Michel, who pointed out the typo in a Facebook comment.
