Harvard’s Jeffrey Frankel (hat tip, Mark Thoma) is the latest econ-blogger to cast an admiring gaze in the direction of nominal gross domestic product (GDP) targeting. Frankel’s post is titled “The Death of Inflation Targeting,” and the demise apparently includes the notion of “flexible targeting.” The obituary is somewhat ironic in that at least some of us believe that the U.S. central bank has recently taken a big step in the direction of institutionalizing flexible inflation targeting. Frankel, nonetheless, makes a case for nominal GDP targeting:
“One candidate to succeed IT [inflation targeting] as the preferred nominal monetary-policy anchor has lately received some enthusiastic support in the economic blogosphere: nominal GDP targeting. The idea is not new. It had been a candidate to succeed money-supply targeting in the 1980’s, since it did not share the latter’s vulnerability to so-called velocity shocks.
“Nominal GDP targeting was not adopted then, but now it is back. Its fans point out that, unlike IT, it would not cause excessive tightening in response to adverse supply shocks. Nominal GDP targeting stabilizes demand—the most that can be asked of monetary policy. An adverse supply shock is automatically divided equally between inflation and real GDP, which is pretty much what a central bank with discretion would do anyway.”
That’s certainly true, but a nominal GDP target is consistent with a stable inflation or price-level objective only if potential GDP growth is itself stable. Perhaps the argument is that plausible variations in potential GDP are not large enough or persistent enough to be of much concern. But that notion just sidesteps the core question of whether the current output gap is big or small. At least for me, uncertainty about where GDP is relative to its potential remains the key to whether policy should be more or less aggressive.
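The arithmetic behind both Frankel’s supply-shock claim and the caveat above can be written as a simple log-linear growth identity (my sketch, not notation from either post):

```latex
\underbrace{n_t}_{\substack{\text{nominal GDP} \\ \text{growth}}}
= \underbrace{\pi_t}_{\text{inflation}}
+ \underbrace{g_t}_{\substack{\text{real GDP} \\ \text{growth}}}
\qquad\Longrightarrow\qquad
\pi_t = n^{\ast} - g_t \ \text{ under a nominal GDP growth target } n^{\ast}.
```

The identity cuts both ways. Holding $n^{\ast}$ fixed, an adverse supply shock that lowers $g_t$ shows up partly as higher inflation rather than forcing all of the adjustment onto output, which is Frankel’s point. But by the same token (to use purely illustrative numbers), a 5 percent nominal target with 3 percent potential growth implies 2 percent trend inflation, while a slip in potential growth to 2 percent would push trend inflation to 3 percent even as policy hits $n^{\ast}$ exactly, which is the instability flagged above.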
In another recent blog item (also with a pointer from Mark Thoma), Simon Wren-Lewis offers the opinion that acknowledging uncertainty about the size of the output gap actually argues in favor of being “less cautious” about taking an aggressive policy course. The basic idea is familiar. It is a simple matter to raise rates should the Fed overestimate the magnitude of the output gap. But with short-term policy rates already at zero, it is not so easy to go in the opposite direction should we underestimate the gap.
No argument there. As I pointed out in a May 3 macroblog item, Atlanta Fed President Dennis Lockhart has said the same thing. But, as I argued in that post, this point of view is only half the story. Though I agree that the costs are asymmetric to the downside with respect to the FOMC’s employment and growth mandate, they look to me to be asymmetric to the upside with respect to the price stability mandate. And I view with some suspicion the claim that we know how to easily manage policy that turns out to be too aggressive after the fact.
My issues are not merely academic. In an important paper published a decade ago, Athanasios Orphanides made this assertion:
“Despite the best of intentions, the activist management of the economy during the 1960s and 1970s did not deliver the desired macroeconomic outcomes. Following a brief period of success in achieving reasonable price stability with full employment, starting with the end of 1965 and continuing through the 1970s, the small upward drift in prices that so concerned Burns several years earlier gave way to the Great Inflation. Amazingly, during much of this period, specifically from February 1970 to January 1977, Arthur Burns, who so opposed policies fostering inflation, served as Chairman of the Federal Reserve. How then is this macroeconomic policy failure to be explained? And how can such failures be avoided in the future? …
“The likely policy lapse leading to the Great Inflation … can be simply identified. It was due to the overconfidence with which policymakers believed they could ascertain in real-time the current state of the economy relative to its potential. The willingness to recognize the limitations of our knowledge and lower our stabilization objectives accordingly would be essential if we are to avert such policy disasters in the future.”
With this historical observation in hand, it seems a short leap to turn Wren-Lewis’s thought experiment on its head. Arguably, the last several years have demonstrated that nonconventional policy actions have been quite successful at short-circuiting the disinflationary spirals that pose the central downside risk when interest rates are near zero. (If you can tolerate a little math, a good exposition of both theory and evidence is provided by Roger Farmer.)
On the opposite side of the ledger, we know little about the conditions that would cause the Fed to lose credibility with respect to its commitment to its inflation goals, and very little about the triggers that would cause inflation expectations to become unanchored. Thus, I think it not difficult to construct a plausible argument about the risks of being wrong about the output gap that is the exact opposite of the Wren-Lewis conclusion.
I end up about where I did in my previous post. Flexible inflation targeting, implemented in such a way that the 2 percent long-run inflation target rate exerts an observable gravitational pull over the medium term, feels about right to me. Despite what Frankel seems to believe, I think that idea is far from dead.
This post originally appeared at macroblog and is posted with permission.