They call economics "the dismal science." We got a good example of why this week.
Three years ago, Harvard economists Ken Rogoff and Carmen Reinhart published a paper, based on solid history and rigorous statistics, showing that when a country's debt-to-GDP ratio breaches 90%, its growth plunges into negative territory. Because its authors were highly respected economists, the paper influenced policymakers around the world.
"[I]t is widely acknowledged, based on serious research, that when public debt levels rise above 90% they tend to have a negative [effect on] economic dynamism, which translates into low growth for many years," said EU Commissioner Olli Rehn in 2010.
"Economists who have studied sovereign debt tell us that letting total debt rise above 90 percent of GDP creates a drag on economic growth and intensifies the risk of a debt-fueled economic crisis," said Congressman Paul Ryan in 2011.
"It's an excellent study," said former Treasury secretary Tim Geithner two years ago.
But it wasn't an excellent study. A separate group of economists tried replicating Rogoff and Reinhart's results, and couldn't. Stumped, they asked for the actual spreadsheet used in the seminal study, and found it littered with data omissions and Excel coding errors. Rogoff and Reinhart showed economies with debt-to-GDP above 90% experience average GDP growth of negative 0.2%. Fix the math errors, and the real figure is positive 2.2%. Oops.
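The class of mistake here is worth seeing concretely. A minimal sketch, using made-up numbers rather than the actual Reinhart-Rogoff data, of how an averaging formula that silently omits rows (the equivalent of an Excel AVERAGE range dragged over too few cells) skews the result:

```python
# Hypothetical growth figures, purely for illustration --
# NOT the Reinhart-Rogoff dataset.
growth_rates = [-1.0, 3.5, 2.8, 1.9, 4.1, 2.2, -0.5]

# The correct average uses every observation.
full_average = sum(growth_rates) / len(growth_rates)

# A formula range that stops two rows short silently drops
# the last two values -- the same class of error as a
# mis-sized spreadsheet AVERAGE range.
truncated = growth_rates[:5]
truncated_average = sum(truncated) / len(truncated)

print(round(full_average, 2))       # 1.86 -- mean over all rows
print(round(truncated_average, 2))  # 2.26 -- mean over the accidental subset
```

The spreadsheet never flags the omission; both formulas compute without error, and only replication against the raw data exposes the gap.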
Economists and pundits have been floored by the discovery all week. As they should be; it was a flagrant error.
But these kinds of "now-you-know-it-now-you-don't" moments are more common in economics than people think.
Take the monthly jobs report. Almost every initial report is revised in subsequent months, often by a lot.
In September 2011, the initial report from the Bureau of Labor Statistics showed zero jobs were created that August. "Zero Job Growth Latest Bleak Sign for U.S. Economy," wrote The New York Times. "Hiring Grinds to a Halt," wrote CNNMoney. "President Zero," wrote the Republican National Committee. "THE ECONOMY ADDED ZERO, ZIP, NADA JOBS IN AUGUST."
Except that, yes, it did. Revisions later showed the economy added 132,000 jobs in August 2011, not zero. It was actually the third-best August jobs report of the previous decade. Few seemed to care about the revisions, or even notice. By then, the damage had been done.
Or take productivity. For most of the last decade, it was assumed that the American manufacturing sector became more productive, as employment shrank but output grew. "The decline in U.S. manufacturing employment is explained by rapid growth in manufacturing productivity over the past 50 years," said Columbia Business School dean Glenn Hubbard.
But maybe not. As The Washington Post pointed out, many economists now think the productivity numbers are grossly inflated, since determining whether, say, a car assembled in Ohio with Japanese parts should be counted as domestic or foreign manufacturing is a messy subject. Cost savings from outsourcing can mistakenly show up as domestic output. Adjust for that bias, and as much as half of manufacturing output growth between 1997 and 2007 melts away, according to economist Susan Houseman. As Rob Atkinson of the Information Technology and Innovation Foundation put it: "I bought into this idea for a long time that it was superior labor productivity that caused most manufacturing job losses. Then I began to dig into the numbers."
Everyone knows Japan's economy has stagnated over the last two decades. Everyone, it seems, except Tokyo-based journalist Eamonn Fingleton. Japan, Fingleton notes, calculates GDP and inflation differently from other countries. He writes:
Luckily there is a yardstick that finesses many of these problems: electricity output, which is mainly a measure of consumer affluence and industrial activity. In the 1990s, while Japan was being widely portrayed as an outright "basket case," its rate of increase in per-capita electricity output was twice that of America, and it continued to outperform into the new century.
This is true for companies, too. AIG (NYSE: AIG ) blew up in 2008 after making suicidal derivative bets. Last month I asked Hank Greenberg, AIG's former chairman and CEO (who left three years before the blowup) whether an investor could have possibly known how much risk AIG was taking, even with hindsight. "No, I don't think so," he said. "I'm not sure the [annual reports] that they filed were complete ... I was a major shareholder of AIG, the largest individual shareholder. I lost about 90% of my net worth." Investors thought AIG was a good, old-fashioned insurer suitable to hold in a retirement account. And then they learned otherwise.
The evidence is just overwhelming that we know much less than we think we do, even when we're armed with data and studies. Sometimes especially when we're armed with data and studies, because they give us a false sense of confidence. British neurologist John Hughlings Jackson once said, "It takes 50 years to get a wrong idea out of medicine, and 100 years a right one into medicine."
And this spreads far beyond economics. In his excellent book The Half-Life of Facts, Sam Arbesman writes:
Facts change all the time. Smoking has gone from doctor recommended to deadly ... We used to think that the Earth was the center of the universe, and our planet has since been demoted. I have no idea any longer whether red wine is good for me. My father, a dermatologist, told me about a multiple-choice exam he took in medical school that included the same question two years in a row. The answer choices remained exactly the same, but one year the answer was one choice and the next year it was a different one.
The takeaway here isn't a plea to ignore data, statistics, and studies. But they have to be taken for what they are: Fallible and often incomplete. People get tripped up when they take one set of data or one study and put all of their weight behind it. "I'm investing in X because this study shows Y." Or, "I'm selling everything because this economic report shows trouble ahead." At best, data guides us in a certain direction, but investors always have to have a healthy appreciation for the unknown. We talk a lot about how bad analysts are at predicting the future, but it's really worse than that. We barely know what's happening right now, or even what happened in the past.
Rogoff and Reinhart wrote a book titled This Time is Different, poking fun at what have been called "the four most dangerous words." I disagree. The four most dangerous words in the English language may be, "This study proves that ..."
Check back every Tuesday and Friday for Morgan Housel's columns on finance and economics.