The dominance of micro-founded macroeconomic models—models derived directly from the microeconomic concepts of utility-maximizing individuals and profit-maximizing firms, and based on the Ramsey Neoclassical growth model (Ramsey 1928)—did not go unchallenged prior to the Global Financial Crisis. But the critics were treated in the time-honoured Neoclassical way: ignored and disparaged if, like me, they were not Neoclassicals themselves, or politely listened to but still effectively ignored if they were.
Pre-eminent amongst the tolerated critics was Robert Solow, a recipient of the “Nobel” Prize in Economics in 1987 for his work on a Neoclassical theory of economic growth (Solow 1956). In a series of papers (Solow 1994, 2001, 2003, 2006, 2007, 2008), Solow railed against the very idea of building macroeconomic analysis on the foundation of Ramsey’s growth model.
At a Festschrift for another economics “Nobel” recipient, Joseph Stiglitz, Solow delivered a dismissive judgment on micro-founded macroeconomics in a paper provocatively entitled “Dumb and Dumber in Macroeconomics”. Solow began with the question: “So how did macroeconomics arrive at its current state? The answer might provide a lead as to where it ought to go”. He continued:
The original impulse to look for better or more explicit micro foundations was probably reasonable… What emerged was not a good idea. The preferred model has a single representative consumer optimizing over infinite time with perfect foresight or rational expectations, in an environment that realizes the resulting plans more or less flawlessly through perfectly competitive forward-looking markets for goods and labor, and perfectly flexible prices and wages.
How could anyone expect a sensible short-to-medium-run macroeconomics to come out of that set-up? (Solow 2003. Emphasis added)
He also disparaged the assumption of equilibrium through time—which is imposed on a model that in fact has an unstable equilibrium—stating that “This choice between equilibrium and disequilibrium thinking may be a false choice”. He continued with a colourful metaphor:
If I drop a ripe watermelon from this 15th-floor window, I suppose the whole process from t₀ to the mess on the sidewalk could be described as some sort of dynamic equilibrium. But that may not be the most fruitful—sorry—way to describe the falling-watermelon phenomenon. (Solow 2003)
When the crisis hit, Solow was one of several economists invited by the US Congress’s House Committee on Science and Technology Subcommittee on Investigations and Oversight to explain what went wrong, in a hearing entitled “Building a Science of Economics for the Real World”. His testimony, as colourful as ever, highlighted a key problem for economics: that people schooled in this tradition had largely lost the capacity for critical thought about it:
every proposition must pass the smell test: does this really make sense? I do not think that the currently popular DSGE models pass the smell test. They take it for granted that the whole economy can be thought about as if it were a single, consistent person or dynasty carrying out a rationally designed, long-term plan, occasionally disturbed by unexpected shocks, but adapting to them in a rational, consistent way. I do not think that this picture passes the smell test… The advocates no doubt believe what they say, but they seem to have stopped sniffing or to have lost their sense of smell altogether. (Solow 2010. Emphasis added)
Solow’s quip that the advocates of modern Neoclassical macroeconomic modelling had “lost their sense of smell altogether” neatly characterized the debate that ensued amongst these economists in the aftermath of the Global Financial Crisis. They could not deny that the crisis had happened, but likewise they could not contemplate that their models—which had not only not seen it coming, but had predicted a bountiful economic harvest when a famine ensued—could possibly be wrong. Their dialogue resembled men—and they are almost exclusively men—without a sense of smell, trying to distinguish the aroma of a rose garden from the stink of a sewer.
My favourite “representative agent” in this journey of non-discovery is Olivier Blanchard. Blanchard was the “Class of 1941” Professor of Economics at MIT from 1994 till 2010, Chair of its Department of Economics from 1998 till 2003, Chief Economist of the IMF from September 2008 till 2015, Robert M. Solow Professor of Economics at MIT from 2010 till 2014 (which is somewhat ironic, given how vastly his opinion of DSGE models differs from Solow’s), and President of the American Economic Association in 2018. The only major mainstream economic guernsey he lacks is a “Nobel” Prize.
He began his journey in blissful ignorance of the economic crisis unfolding around him. In August 2008, Blanchard self-published an NBER working paper with the title “The State of Macro”, in which he declared that “The state of macro is good”. Starting with a portrayal of the initial conflicts between “New Classicals” and “New Keynesians”, he opined that:
there has been enormous progress and substantial convergence. For a while—too long a while—the field looked like a battlefield. Researchers split in different directions, mostly ignoring each other, or else engaging in bitter fights and controversies. Over time however, largely because facts have a way of not going away, a largely shared vision both of fluctuations and of methodology has emerged. Not everything is fine. Like all revolutions, this one has come with the destruction of some knowledge, and suffers from extremism, herding, and fashion. But none of this is deadly. The state of macro is good. (Blanchard 2008, p. 2)
To call this blind ignorance is to insult the unsighted. The crisis is regarded as having started on August 9th, 2007—precisely a year before he uploaded this paper—when BNP Paribas Investment Partners shut down redemptions from three of its investment funds that were based on the US housing market. Figure 1 also shows that the rate of economic growth peaked in 2006 Q4 (at 4.7% in the Rest of the World and 3.7% in the USA). By the time of the BNP Paribas announcement, growth in the USA had faltered to 2.3%; in the subsequent quarter (2007 Q4) it was 0.2%. By the third quarter of 2008—which includes August, when Blanchard released his paper—it was minus 2%.
Perhaps in atonement for this monumentally badly-timed and false homage to mainstream economics, Blanchard subsequently published a string of papers that tried to assess why the state of macro was, in fact, extremely bad, and to propose what might be done to fix it (Blanchard 2014, 2016a, 2016b, 2018).
His first sortie, published in the IMF’s semi-populist journal Finance and Development, bore the somewhat cartoonish title “Where Danger Lurks” (Blanchard 2014), and was accompanied by a cartoon demon, as shown in Figure 3. Nonetheless, this paper contained the most perceptive observations that Blanchard managed to make about the failure of macroeconomic theory. He focused on the assumption that economic fluctuations were linear—“so that small shocks had small effects and a shock twice as big as another had twice the effect”:
Until the 2008 global financial crisis, mainstream U.S. macroeconomics had taken an increasingly benign view of economic fluctuations in output and employment. The crisis has made it clear that this view was wrong and that there is a need for a deep reassessment…
The techniques we use affect our thinking in deep and not always conscious ways… These techniques however made sense only under a vision in which economic fluctuations were regular enough so that, by looking at the past, people and firms … could understand their nature and form expectations of the future, and simple enough so that small shocks had small effects and a shock twice as big as another had twice the effect on economic activity…
We in the field did think of the economy as roughly linear, constantly subject to different shocks, constantly fluctuating, but naturally returning to its steady state over time… Whatever caused the Great Moderation, for a quarter century the benign, linear view of fluctuations looked fine… That small shocks could sometimes have large effects and, as a result, that things could turn really bad, was not completely ignored by economists. But such an outcome was thought to be a thing of the past that would not happen again… (Blanchard 2014, p. 28)
Figure 3: Blanchard’s first, and deepest, consideration of why macroeconomic theory failed
Apart from these valid insights, the paper was more notable for its illustrations than for any intellectual revolution in its content. Blanchard’s main policy advice was that we should “Stay away from dark corners” (Blanchard 2014, p. 31), but he gave no means by which “dark corners” could be identified. He did call for research to “let a hundred flowers bloom”:
Now that we are more aware of nonlinearities and the dangers they pose, we should explore them further theoretically and empirically—and in all sorts of models. (Blanchard 2014, p. 31)
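Blanchard’s working definition of linearity (a shock twice as big has twice the effect) is precisely what systems with multiple equilibria violate, and the point can be made in a few lines of code. The sketch below is a deliberately toy one-variable system, not any actual macroeconomic model: a shock that stays within the basin of attraction of “normal times” dies away, while a shock twice as large crosses a threshold into a “dark corner” and never comes back.

```python
# Toy nonlinear system: dx/dt = x(1 - x)(x - 2).
# Stable equilibria at x = 0 ("normal times") and x = 2 (a "dark corner"),
# separated by an unstable threshold at x = 1.
from scipy.integrate import solve_ivp

def f(t, x):
    return x * (1 - x) * (x - 2)

for shock in (0.6, 1.2):              # the second shock is twice the first
    sol = solve_ivp(f, (0, 40), [shock], rtol=1e-9)
    print(f"shock of {shock}: long-run x = {sol.y[0, -1]:+.3f}")

# shock of 0.6: long-run x = +0.000   (the economy returns to normal)
# shock of 1.2: long-run x = +2.000   (twice the shock, a permanently
#                                      different state, not twice the effect)
```

Linearizing this system around x = 0 would declare the doubled shock exactly twice as harmful, and would never see the second equilibrium at all.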
But he also made the bizarre argument that if—somehow, and without any guidance from economic theory—policymakers could “maintain a healthy distance from dark corners”, then it would be OK for economic theory to march on unaltered:
But this answer skirts a harder question: How should we modify our benchmark models—the so-called dynamic stochastic general equilibrium (DSGE) models that we use, for example, at the IMF to think about alternative scenarios and to quantify the effects of policy decisions? The easy and uncontroversial part of the answer is that the DSGE models should be expanded to better recognize the role of the financial system—and this is happening. But should these models be able to describe how the economy behaves in the dark corners?
Let me offer a pragmatic answer. If macroeconomic policy and financial regulation are set in such a way as to maintain a healthy distance from dark corners, then our models that portray normal times may still be largely appropriate… Trying to create a model that integrates normal times and systemic risks may be beyond the profession’s conceptual and technical reach at this stage. (Blanchard 2014, p. 31. Emphasis added)
How on Earth could policymakers “maintain a healthy distance from dark corners” if they had no theoretical guidance as to where they were? And if they could work it out for themselves by empirical observation, then what need was there for economists in the first place?
The real dark corner from which Blanchard was retreating was the prospect that the Neoclassical paradigm was in fact fundamentally wrong about the nature of the macroeconomy.
His next paper began with sound criticisms of DSGE models for being “based on unappealing assumptions. Not just simplifying assumptions, as any model must, but assumptions profoundly at odds with what we know about consumers and firms” (Blanchard 2016a, p. 1). But by the end, he could see no alternative to the core of DSGE modelling, of deriving macroeconomics from microeconomic foundations:
The pursuit of a widely accepted analytical macroeconomic core, in which to locate discussions and extensions, may be a pipe dream, but it is a dream surely worth pursuing. If so… Starting from explicit microfoundations is clearly essential; where else to start from? Ad hoc equations will not do for that purpose. Thinking in terms of a set of distortions to a competitive economy implies a long slog from the competitive model to a reasonably plausible description of the economy. But, again, it is hard to see where else to start from. (Blanchard 2016a, p. 3. Emphasis added)
Blanchard’s final word on the need to reform economic theory was written after interactions with a number of economists, including me:
A number of economists joined the debate about the pros and cons of dynamic DSGEs, partly in response to my blog post. Among them were Narayana Kocherlakota (2016), Simon Wren-Lewis (2016), Paul Romer (2016), Steve Keen (2016), Anton Korinek (2015), Paul Krugman (2016), Noah Smith (2016), Roger Farmer (2014), and Brad Delong (2016)…
In a sign of how incapable mainstream economists are of comprehending fundamental challenges to their methodology, he followed this acknowledgment with a putative summary of agreed positions:
I believe that there is wide agreement on the following three propositions; let us not discuss them further, and move on.
i) Macroeconomics is about general equilibrium… (Blanchard 2018, p. 49. Emphasis added)
I was gobsmacked by this alleged point of agreement, and said so at the time, but to no avail. Far from agreeing that “Macroeconomics is about general equilibrium”, in the post of mine that Blanchard cited, I had argued that nonlinear, far-from-equilibrium dynamics had to be the basis of macroeconomic modelling:
Imposing linearity on a nonlinear system is a valid procedure if, and only if, the equilibrium around which the model is linearized is stable… The mathematically more valid approach is to accept that, if your model’s equilibria are unstable, then your model will display far-from-equilibrium dynamics, rather than oscillating about and converging on an equilibrium. This requires you to understand and apply techniques from complex systems analysis, which is much more sophisticated than the mathematics Neoclassical modelers use.
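The mathematical claim in that passage is easy to demonstrate. The sketch below uses the van der Pol oscillator purely as a stand-in for any nonlinear model with an unstable equilibrium (it is a physics textbook system, not an economic one): the linearized version predicts an explosive spiral to infinity, while the actual nonlinear system settles onto a bounded limit cycle, i.e. persistent far-from-equilibrium dynamics that the linearization cannot even represent.

```python
# Van der Pol oscillator:  x' = y,  y' = mu*(1 - x^2)*y - x.
# Its only equilibrium (the origin) is unstable for mu > 0. Linearizing
# there drops the x^2 term:  x' = y,  y' = mu*y - x.
import numpy as np
from scipy.integrate import solve_ivp

MU = 1.0

def nonlinear(t, s):
    x, y = s
    return [y, MU * (1 - x**2) * y - x]

def linearised(t, s):
    x, y = s
    return [y, MU * y - x]

s0 = [0.01, 0.0]                       # a tiny displacement from equilibrium
t_eval = np.linspace(0, 50, 2001)
nl = solve_ivp(nonlinear, (0, 50), s0, t_eval=t_eval, rtol=1e-9)
li = solve_ivp(linearised, (0, 50), s0, t_eval=t_eval, rtol=1e-9)

print(f"nonlinear:  max |x| = {np.abs(nl.y[0]).max():.2f}   (bounded limit cycle)")
print(f"linearised: max |x| = {np.abs(li.y[0]).max():.2e}  (explodes exponentially)")
```

The two models agree while the displacement is tiny; they part company precisely where macroeconomics matters, away from equilibrium.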
Just as Blanchard ultimately meandered back to DSGE modelling, so did Neoclassical economics: fifteen years after the crisis, DSGE models remain the dominant methodology in macroeconomic modelling. It is as if the crisis itself never occurred. All that has happened is that some modellers have calibrated their models to fit the crisis ex post, as if that were a sufficient response.
This process began very soon after the crisis, with Peter Ireland’s paper “A New Keynesian Perspective on the Great Recession” (Ireland 2011). Though he began by admitting that “the Great Recession’s extreme severity makes it tempting to argue that new theories are required to fully explain it” (Ireland 2011, p. 31), he quickly disparaged what I will shortly show is in fact the correct approach: “Attempts to explain movements in one set of endogenous variables, like GDP and employment, by direct appeal to movements in another, like asset market valuations or interest rates, sometimes make for decent journalism but rarely produce satisfactory economic insights” (Ireland 2011, p. 32). He then moved back to the bread and butter of DSGE modelling: explaining all macroeconomic phenomena as the product of “exogenous shocks” disturbing a fundamentally stable economic system.
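It is worth seeing concretely what “exogenous shocks” means in an estimated model of this kind, because the shocks are backed out as residuals: whatever gap remains between the model’s prediction and the data is, by definition, a shock. A minimal sketch, using a toy stable AR(1) “economy” and made-up growth numbers rather than Ireland’s actual ten-equation model:

```python
# A fundamentally stable toy "economy": y[t] = rho*y[t-1] + shock[t].
import numpy as np

rho = 0.9
data = np.array([3.0, 3.2, 2.9, 3.1, 2.3, 0.2, -2.0, -4.0, -1.0, 1.5])
# hypothetical quarterly "growth", ending in a crash and recovery

shocks = data[1:] - rho * data[:-1]   # back the shocks out as residuals

# Re-run the model driven by the backed-out shocks:
fitted = [data[0]]
for e in shocks:
    fitted.append(rho * fitted[-1] + e)

print(np.allclose(fitted, data))      # True: a perfect ex-post "fit"
```

Because the shocks are whatever the data require them to be, the model “explains” the crash by construction. The exercise cannot fail, whatever the data, which is exactly why it tells us nothing.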
His conclusion, after developing and numerically solving a “small-scale model” (Ireland 2011, p. 52)—one with ten equations and 14 exogenous parameters, subjected to four types of exogenous shocks (to consumer preferences, production costs, technology and monetary policy)—was that the shocks which caused the “Great Recession” were of the same kind as those behind the two relatively mild recessions that preceded it; the worst economic crisis since the Great Depression differed only in that its shocks lasted longer and grew bigger over time:
the Great Recession began in late 2007 and early 2008 with a series of adverse preference and technology shocks in roughly the same mix and of roughly the same magnitude as those that hit the United States at the onset of the previous two recessions…
The string of adverse preference and technology shocks continued, however, throughout 2008 and into 2009. Moreover, these shocks grew larger in magnitude, adding substantially not just to the length but also to the severity of the great recession. (Ireland 2011, p. 48)
Ireland concluded that “All of these results indicate that the basic New Keynesian model continues to serve as a reliable guide for business cycle analysis and monetary policy evaluation” (Ireland 2011, p. 52).
A more sensible conclusion is the one Enrico Fermi gave to Freeman Dyson, when Dyson proudly showed Fermi his numerical model of one of Fermi’s experimental results:
“There are two ways of doing calculations in theoretical physics”, he said. “One way, and this is the way I prefer, is to have a clear physical picture of the process that you are calculating. The other way is to have a precise and self-consistent mathematical formalism. You have neither.” (Dyson 2004)
When Dyson protested, Fermi asked “How many arbitrary parameters did you use for your calculations?”:
I thought for a moment about our cut-off procedures and said, “Four.” He said, “I remember my friend Johnny von Neumann used to say, with four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” With that, the conversation was over. (Dyson 2004)
With the 14 arbitrary parameters Ireland used, von Neumann could doubtless make his elephant fly while copulating. Though economics is not applied physics, we should heed Fermi’s advice: we need either “a clear physical picture of the process that you are calculating” or “a precise and self-consistent mathematical formalism”. Both can be constructed once we embrace the inherent complexity of the economic system, and abandon the Neoclassical fetishes of microfoundations, linearity, and equilibrium.
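Von Neumann’s quip is also easy to verify numerically. A minimal sketch with entirely synthetic “growth” numbers: a model with 14 free parameters (here a degree-13 polynomial, standing in for Ireland’s 14) fits 14 quarters of data, crisis included, essentially exactly, and is worthless one quarter later.

```python
# With 14 free parameters you can fit anything -- and predict nothing.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
quarters = np.arange(14.0)
growth = 3 + rng.normal(0, 0.5, 14)        # synthetic "growth" data
growth[10:] = [-1.0, -3.0, -2.0, 0.5]      # splice in a "crisis"

model = Polynomial.fit(quarters, growth, deg=13)    # 14 free parameters

in_sample = np.max(np.abs(model(quarters) - growth))
print(f"max in-sample error:       {in_sample:.1e}")     # essentially zero
print(f"'forecast' for quarter 15: {model(14.0):,.0f}")  # typically absurd
```

A perfect fit purchased entirely with free parameters carries no information about what comes next, whether the parameters trace out an elephant or a Great Recession.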