In 1987, I was invited to Harvard University to debate “The Declining Middle Class” with Lester Thurow, Frank Levy, Barry Bluestone and the late John Kenneth Galbraith. Someone complained the panel seemed slightly out of balance. I explained that those in left field had tried to get a few more on their team, to make it fairer, but the other fellows chickened out. I must have said something impolitic (politically incorrect), because Harvard neglected to ask me back.
As a result, it was only with great trepidation that I recently accepted an invitation to Manhattan to debate Nomi Prins of Demos and David Leonhardt of the New York Times on the topic, “Economic Anxiety: The new normal or a result of bad policies?”
Having tried to support a family from 1974 to 1982 with the help of a 14 percent mortgage rate, I couldn’t imagine what could possibly be considered new about economic anxiety. Luckily for me, Peter Orszag, the newly appointed head of the Congressional Budget Office, wrote about it in the Boston Globe three days before my talk. “For the past three decades,” he said, “macroeconomic growth has not made American families feel sufficiently secure. Median wages have stagnated, and families now face substantial new risks. According to Yale’s Jacob Hacker, the average family had a 7 percent chance in the early 1970s of seeing its income drop by half or more. By 2002, that probability rose to nearly 17 percent.” To say “median wages have stagnated” surely suggests stagnation over at least the last five years, if not the whole three decades. But where did he get that idea? There are no government data on “median wages,” so Mr. Orszag was probably referring to estimates from the Economic Policy Institute (EPI).
Measured in 2005 dollars, the EPI estimate of the median wage fell from $13.31 in 1992 to $12.83 in 1996, before rebounding to $13.88 in 2000. The median wage then kept rising every year until it hit $14.46 in 2004. It briefly slipped to $14.29 in 2005 with the energy price spike, yet the real median wage was still 3 percent higher than in 2000. Because inflation has now dropped to 1.3 percent, all measures of real wages are up strongly for 2006.
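A quick check of that arithmetic, sketched in Python using only the EPI figures quoted above (the small helper function and its name are mine, purely for illustration):

```python
# Real median hourly wage, in 2005 dollars, per the EPI estimates cited above.
wage = {1992: 13.31, 1996: 12.83, 2000: 13.88, 2004: 14.46, 2005: 14.29}

def pct_change(start, end):
    """Percentage change from start to end."""
    return (end - start) / start * 100

print(f"1992 to 1996: {pct_change(wage[1992], wage[1996]):+.1f}%")  # about -3.6%
print(f"2000 to 2004: {pct_change(wage[2000], wage[2004]):+.1f}%")  # about +4.2%
print(f"2000 to 2005: {pct_change(wage[2000], wage[2005]):+.1f}%")  # about +3.0%
```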
Lacking any evidence of “wage stagnation” — aside from that embarrassing four-year decline during President Clinton’s first term — those yearning for something to complain about have turned to an index of “Family Income Stability” in Jacob Hacker’s book “The Great Risk Shift.” That index ends with 2002, which is when Mr. Orszag’s “past three decades” really ended. It is not a measure of downward instability, as suggested, but a measure of “transitory variance” — temporary income change of any sort. If incomes never budged, that would get a perfect score.
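To make that concrete, here is a minimal sketch in Python of what a “transitory variance” style measure does. It is only an illustration of the general concept, not Mr. Hacker’s actual index or data (his work rests on a panel survey that follows the same families over time): fit a growth path to a family’s income and score the wobble around that path. By that yardstick, a flat income, however low, and a perfectly smooth climb both earn a “perfect” zero.

```python
# Illustrative only: a simple "transitory variance" style score, i.e. the
# variance of a family's income around its own fitted growth path.
# This is a sketch of the general concept, not Mr. Hacker's actual index.

def transitory_variance(incomes):
    """Variance of income around an OLS-fitted linear trend (the 'growth path')."""
    n = len(incomes)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(incomes) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, incomes))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, incomes)]
    return sum(r * r for r in residuals) / n

# A family whose income never budges gets a "perfect" score of zero ...
print(transitory_variance([40_000] * 8))                             # 0.0
# ... and so does a family whose income climbs smoothly every year ...
print(transitory_variance([40_000 + 2_000 * t for t in range(8)]))   # 0.0
# ... while a family whose income bounces around (even one ending far
# better off) scores high. The measure records change, not hardship.
print(transitory_variance([40_000, 55_000, 48_000, 70_000,
                           52_000, 80_000, 65_000, 90_000]))
```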
This index of “fluctuations of income around its overall growth path” has become another ritualistic chant among New York Times and New Republic writers. “According to a measure of volatility constructed by Jacob S. Hacker,” wrote Daniel Gross, “income volatility rose 88 percent between 1978 and 2000.” “According to a recent series of papers by Jacob Hacker,” wrote Noam Scheiber, “while incomes have been rising, so has the degree to which those incomes fluctuate. … Between the early 1970s and the early ’90s, the index of income volatility he devised rose by a factor of 5.”
Paul Krugman chimed in with, “As Mr. Hacker and others have documented, over the past three decades the lives of ordinary Americans have become steadily less secure, and their chances of plunging from the middle class into acute poverty ever larger.”
The first nine years covered by the Hacker index, 1974 to 1982, were among the worst in U.S. economic history. Yet because they score as the most stable years on that index, the inflationary recessions of 1974–75 and 1979–81 are now celebrated as a magnificent golden age by Messrs. Hacker, Orszag, Gross, Scheiber, Krugman, etc.
During the high anxiety of the ’70s, it became customary to add together the unemployment and inflation rates to arrive at a “misery index.” Even if we leave out energy, inflation was 9 percent to 10 percent in 1974 and 1975, and the unemployment rate averaged 8½ percent in the latter year. Measured in 2004 dollars, median family income was $44,381 in 1973 and $43,913 in 1982. Yet 1974 and 1975 were the very best years of all, according to the Hacker index, with record-low instability readings of 0.16 and 0.19, respectively. Those were the good old days for young Mr. Hacker, who imagines it just doesn’t get any better than that.
Mr. Hacker’s index ranks 1979–81 nearly as favorably as 1974–75. His index remained at 0.23 in 1979 and 1980, barely rising to 0.24 in 1981. In 1979, 1980 and 1981, the nonenergy CPI rose by 10 percent to 12 percent every year, and unemployment rose every year from 7.1 percent in 1980 to 9.7 percent in 1982. What a wonderful situation that was for Mr. Hacker’s index — nearly as terrific as 1975.
Things turned worse as soon as the economy and the stock market turned up, with the index above 0.30 from 1983 to 1988. As Noam Scheiber of the New Republic inadvertently let on, family income stability doesn’t get really awful, by Mr. Hacker’s measure, until President Clinton’s first term — jumping to 0.74 in 1993 and remaining at 0.57 in 1996. The index fell back to 0.32 during the 1998–2000 stock boom, yet that too was supposedly worse than the 1990 recession. The sole Bush-era estimate is 0.48 for 2002, which came just one year after a recession and the September 11, 2001 attack — yet it was still much lower than any year from 1993 to 1996.
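Collecting the index readings quoted in this column into one place (a sketch in Python; years this column does not cite are simply left out) makes the rankings, and the closing comparison, easy to verify:

```python
# Hacker "Family Income Stability" index readings as quoted in this column.
# Higher values mean more transitory variance; uncited years are omitted.
hacker_index = {
    1974: 0.16, 1975: 0.19,              # inflationary recession years
    1979: 0.23, 1980: 0.23, 1981: 0.24,  # second inflationary recession
    1993: 0.74, 1996: 0.57,              # President Clinton's first term
    2002: 0.48,                          # sole Bush-era estimate
}

# Which cited years score "best" (most stable) on this measure?
for year, value in sorted(hacker_index.items(), key=lambda kv: kv[1]):
    print(year, value)

# The ratio behind the closing comparison of 1996 with 1974.
print(round(hacker_index[1996] / hacker_index[1974], 1))  # about 3.6, i.e. roughly 3½ times
```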
I have no idea what to conclude from a set of made-up statistics that supposedly make the economy of 1996 appear 3½ times worse than 1974, except that political scientists should not tinker with economics.