The Torture of Government Statistics

Jeffrey A. Tucker
Commentary

Ever since I started writing about fishy government economic statistics, I’ve been flooded with a fun series of letters from current and retired bean counters. They are thrilled that I’ve taken up the topic and have added various insights. The most compelling point I’ve seen—one that had not occurred to me—comes down to the innumeracy of the employees themselves. They lack the basic intuition to see where their figures just don’t make sense.

My correspondent blames technology. Back when mathematicians and students used slide rules, they had to keep their wits about them, working out the magnitude of a result from its significant digits and keeping track of decimal points in a way that stayed consistent. A number sense was always there to test the results against core rationality.
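To make the point concrete, here is a toy sketch of my own (not anything from the correspondent) of what a slide rule actually gave its user: the significant digits of a product, with the order of magnitude left for the operator to supply by mental estimate. That last step is exactly the number sense the calculator made optional.

```python
import math

def slide_rule_multiply(a: float, b: float):
    """Multiply the way a slide rule does: by adding logarithms.
    The scales return only the significant digits; placing the
    decimal point was the operator's job."""
    frac = (math.log10(a) % 1) + (math.log10(b) % 1)
    carry = int(frac)  # 1 if the cursor "ran off" the scale
    digits = 10 ** (frac - carry)  # what the slide rule shows
    # A real operator estimated this step mentally:
    # 370 x 22 is "about 400 x 20 = 8,000," so thousands.
    magnitude = math.floor(math.log10(a)) + math.floor(math.log10(b)) + carry
    return digits, magnitude

digits, mag = slide_rule_multiply(370, 22)
print(f"{digits:.3f} x 10^{mag}")  # 8.140 x 10^3, i.e. 8,140
```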

That ended once the calculator came along. The calculator did the work so that the human brain no longer had to, and that broke the intuitive skill that was necessary in the past. Having lived through this transition, I know precisely what he means. One day, people understood the logic of numbers; the next day, the skill was no longer required.

Then came the computer and all bets were off. Now people merely operate the tools without thinking, and they have no idea what to do if a tool spits out the wrong answer, assuming the operator even recognizes that this is taking place. My correspondent assumes that most data collectors in government do routine jobs now, just as those reporting the data to the government do routine jobs too.

They all operate within a system. The system itself might be widely considered to be broken, but no one has the incentive to fix it. It just keeps going as is because no one in particular is held responsible. That is why the GDP figures are not adjusted to zero out government spending, even though we long ago realized that government spending makes no net contribution to output. And it is the same with many features of the inflation index and the jobs data. Everyone knows about the undercounting and the overcounting, but no one is in charge of fixing the problem. So it never gets fixed.
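For what it is worth, the adjustment in question is simple arithmetic on the standard expenditure identity, GDP = C + I + G + (X - M). Here is a minimal sketch with made-up figures, just to show what zeroing out government spending would mean:

```python
# Hypothetical figures in trillions, chosen only for illustration.
consumption = 15.0  # C: household spending
investment = 4.0    # I: private investment
government = 5.0    # G: government spending
net_exports = -1.0  # X - M: exports minus imports

gdp = consumption + investment + government + net_exports
private_product = gdp - government  # the "zeroed out" measure

print(f"Reported GDP:    {gdp:.1f}")              # 23.0
print(f"Private product: {private_product:.1f}")  # 18.0
```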

It’s hardly a new problem. This issue has vexed government data collection for a very long time.

A quick story about a pioneering economist: his name was G. Warren Nutter, trained at the University of Chicago and later a professor at the University of Virginia. He had a sense that the Soviet Union’s economic data was suspect, so he did a deep dive. At the time, in the 1950s and ’60s, most economists predicted that Soviet GDP would soon outpace that of the United States. They concluded this from the existing data and reported growth rates, essentially laying a ruler on the trend lines to see where things would land in 5 to 10 years.
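The ruler method is nothing more than compounding two reported growth rates and finding where the lines cross. Here is a sketch with hypothetical rates, not the historical figures:

```python
def years_to_crossover(gdp_small, gdp_big, g_small, g_big):
    """Years until the smaller economy overtakes the larger,
    assuming both compound at their reported rates forever."""
    years = 0
    while gdp_small <= gdp_big:
        gdp_small *= 1 + g_small
        gdp_big *= 1 + g_big
        years += 1
    return years

# Suppose reported Soviet growth of 7% against U.S. growth of 3%,
# starting from half the U.S. level: the lines cross in 19 years.
print(years_to_crossover(gdp_small=0.5, gdp_big=1.0, g_small=0.07, g_big=0.03))
```

Garbage rates in, garbage crossover out, which was exactly Nutter’s objection.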

Nutter had grave doubts and offered revised figures that covered the entire period. He concluded that there was real and astounding growth from 1925 through the Second World War, as human and natural resources were newly deployed, and then the problems began. The economy never recovered, and cheating on data began to replace truth and honesty. The system began to generate fake numbers. His conclusion was that the United States was far ahead in economic growth and that the Soviets were headed in the wrong direction.

This was already in 1962. Most economists rejected his thinking, but he was proven correct after the end of the Cold War. Instead of a mighty industrial power, what we saw was a decrepit illusion: broken and rusted everything, a place where nothing worked, a land of deprivation, black markets, lies, and general economic ruin. The reality was even worse than anything Nutter could have imagined.

Keep in mind that Nutter was a complete outsider. The prediction that the Soviet Union would outperform the United States was in every mainstream textbook (I recall it from the first one I used!), and this was true all the way through 1988 or so, if you can believe it. For this reason, the whole of mainstream economics put down Nutter’s work, never taking it seriously and dismissing it as the work of a crank.

He has been proven correct on every point, but of course still not really given any credit.

I’ve been thinking of this often simply because I’ve wondered to what extent the United States today might be subject to similar forces. Bureaucracies do this: they generate the answer that the politicians want. And the more complex the system, the fewer the checks on a process that generates results for which no one in particular is responsible.

If it could happen there, why not here? So I pulled out Nutter’s old book and reread it. I was not disappointed. Here is a passage to share:

“Fault can be found with the economic statistics of every country. They represent, in the first place, a mere sampling of the unbounded volume of data that might be recorded. They have been collected with specific objectives in mind—more varied and far-reaching in some countries than in others—and will therefore be of varying use depending on the purposes they are made to serve. They contain, in the second place, errors introduced at different stages of observation and assemblage. These will depend on the state of statistical literacy among the collectors and suppliers of data, on the effort expended on record-keeping, and on the degree of active competition in gathering and analyzing data. They are, finally, subject to manipulation and distortion by parties with a stake in the figures, checked only to the extent that there are independent factseekers and fact-gatherers with competing interests. No government or other statistical agency can be relied upon to resist the temptation to stretch figures to its own account if it feels it can get away with it.”

That last sentence strikes me very hard. Obviously the Biden administration has had an extremely strong incentive to generate good-looking data. We’ve known for a long time that the results contradict all alternative sources. We can see grocery prices, and we know for sure that they are up more than 20 percent over four years; it is the same with housing, insurance, and health care. In many cases, the private sector is generating results that are twice as high as what is being reported.
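The compounding arithmetic behind a figure like that is worth spelling out: a cumulative 20 percent increase over four years implies an average annual rate of roughly 4.7 percent. A quick check, using the 20 percent figure only as an example:

```python
cumulative = 0.20  # prices up 20% over the whole period
years = 4

# Average annual rate implied by the cumulative increase.
annual = (1 + cumulative) ** (1 / years) - 1
print(f"{annual:.2%} per year")  # 4.66% per year
```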

We know for sure that the jobs data is not adding up. And so on it goes.

Who has the incentive to fix the data reporting? No one. Who has the incentive to tweak its collection, assembly, and distribution in ways that make it look better than it is? The party in power. We know for a fact that this went on for many decades in the Soviet Union. We know it happens in China now—if we can manage to get any data out of China at all. And we know it happens in every Latin American country plus North Korea and probably Russia right now.

Why not the United States? Of course it happens here and probably has been going on for a long time. I’m quite certain at this stage that a seriously realistic accounting of the last four years will show no recovery in real terms from March 2020 until now. But when will the revisions come? Very likely, the answer is never.
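The reason revisions would matter so much is mechanical: real growth is just nominal growth deflated by a price index, so an understated index flatters real growth one for one. A sketch with made-up numbers shows how easily a reported recovery can flip to a contraction:

```python
def real_growth(nominal, inflation):
    """Cumulative real growth implied by cumulative nominal growth
    and cumulative inflation over the same span."""
    return (1 + nominal) / (1 + inflation) - 1

nominal = 0.28  # hypothetical cumulative nominal growth over four years

print(f"{real_growth(nominal, 0.20):+.1%}")  # official-style index:  +6.7%
print(f"{real_growth(nominal, 0.30):+.1%}")  # higher private index:  -1.5%
```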

What is the old line? If you torture the data long enough, it will confess. I know from personal experience that this goes on in every science within academia every day, all in the interest of resume padding. Why would it not be going on at government agencies? Of course it goes on. With the great G. Warren Nutter as our guide, we do well to be deeply skeptical, no matter how official or how seemingly credible the source.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.
Jeffrey A. Tucker
Jeffrey A. Tucker is the founder and president of the Brownstone Institute and the author of many thousands of articles in the scholarly and popular press, as well as 10 books in five languages, most recently “Liberty or Lockdown.” He is also the editor of “The Best of Ludwig von Mises.” He writes a daily column on economics for The Epoch Times and speaks widely on the topics of economics, technology, social philosophy, and culture.