Back in November last year, Professor Olivier Blanchard discussed with me his view that there should be four types of macroeconomic models, and that “theory models” like DSGEs are just one of them. Here is the conversation:
Q: In your paper “Do DSGE Models Have a Future?”, you mentioned that one of the problems with DSGE models is that they are difficult to use for communicating with laypeople. Why can’t we have a simple, stylized DSGE model and use it to discuss with non-economists?
B: As I said in the note, you cannot have one model for everything. So if you want to have a model, say, for the Fed or the IMF, then it has to fit the data and explain the data well. Otherwise, you can’t pretend to be able to explain the world. If you want to conduct a counterfactual simulation, to see what happens if the Fed increases the interest rate by 100 bp, something the Fed normally needs to do, then the model has to have enough structure to be useful for that.
This is the first type of macro model. I think this kind of model has to be close to reality and have as clear an analytical structure as possible, so that you can work on the counterfactuals.
Then you have a second type of model, which I think DSGE models (Dynamic Stochastic General Equilibrium models) should be. They have to be more theory-based, so that researchers can use them as a common basis on which to add and discuss distortions or other mechanisms they want to explore. This kind of model doesn’t need to be as close to the data as the first class, as it is not for direct policy use.
So, I think DSGE models should be theoretically based, which means they should be micro-founded, to a large extent. This should be generally accepted as a starting point. When you want to look at the effect of some new mechanism, that is what you will have in mind.
Much more detail can be found in the original interview article. Here we focus on the first two types of macro models, as Blanchard recently elaborated on his vision for these two types in a new blog post.
In this blog I want to make one main point:
Different classes of macro models are needed for different tasks.
Let me focus on two main classes.
Theory models, aimed at clarifying theoretical issues within a general equilibrium setting. Models in this class should build on a core analytical frame and have a tight theoretical structure…The core frame should be one that is widely accepted as a starting point and that can accommodate additional distortions. In short, it should facilitate the debate among macro theorists.
Policy models, aimed at analyzing actual macroeconomic policy issues. Models in this class should fit the main characteristics of the data, including dynamics, and allow for policy analysis and counterfactuals…
As you can see, this is very similar to what Blanchard said before; what is new is that he makes it clear he doesn’t think developing one model to fit both purposes is a good idea.
It would be nice if a model did both, namely have a tight, elegant, theoretical structure and fit the data well. But this is a pipe dream. Perhaps one of the main lessons of empirical work (at least in macro, and in my experience) is how messy the evidence typically is, how difficult aggregate dynamics are to rationalize, and how unstable many relations are over time. This may not be too surprising. We know, for example, that aggregation can make aggregate relations bear little resemblance to underlying individual behavior.
I would again point readers to our interview with Blanchard, where he actually mentioned what makes it hard to develop a macro model that has a “tight, elegant, theoretical structure” and fits the data:
“[In developing DSGEs,] I think the evidence from the vast body of econometric work should be used much more. To specify the consumption function, we should do it using all kinds of information: single equation estimation, natural experiments, case studies or anything else. And then use all these pieces to characterize a consumption function which can then be put in the DSGE model.
Once this is done, we want to make sure that the DSGE replicates the VAR representation of the data. If the dynamics of the DSGE are close to those of the VAR, the job is done. If not, one has to go back to the drawing board, and see why the DSGE model is not replicating the VAR dynamics.
But this is a completely different exercise, which involves building on the work of hundreds of people who do partial equilibrium estimation. This is not done at this point, and DSGEs are largely operating within an intellectual silo, with not enough interaction with the rest of the empirical work going on in macro.”
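To make the “replicate the VAR” check concrete, here is a toy sketch of the idea (our own illustration, not Blanchard’s actual procedure, with all numbers made up): simulate data from a one-variable “model” with known persistence, then estimate the corresponding autoregression from the data by OLS and compare the estimated dynamics with the model’s. In a real exercise the model would be a full DSGE and the data representation a multivariate VAR, but the logic of the comparison is the same.

```python
# Toy illustration: does the dynamic representation estimated from the
# data come close to the dynamics the model implies?
import numpy as np

rng = np.random.default_rng(0)

# "Model": an AR(1) process with known persistence rho = 0.7
rho_model = 0.7
T = 5000
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho_model * y[t - 1] + rng.standard_normal()

# "VAR" counterpart (here just a univariate AR(1)) estimated by OLS
x, z = y[:-1], y[1:]
rho_hat = (x @ z) / (x @ x)

print(f"model rho = {rho_model}, estimated rho = {rho_hat:.3f}")
# If the estimated dynamics were far from the model's, one would
# "go back to the drawing board", as Blanchard puts it.
```

Here the estimate lands close to the model’s persistence by construction, since the data were generated by the model itself; the interesting case in practice is when real-world data reject the model’s implied dynamics.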
Put another way, part of the problem is that there is not enough research devoted to coordinating and integrating different strands of work in macroeconomics, so it is premature to aim at a “one-size-fits-all” macro model.
Blanchard’s vision of “policy models” versus “theory models” sparked some serious discussion in the online economics community.
For example, Noah Smith asked on Twitter, “What else is the point of DSGEs??”, drawing some excellent responses:
@Noahpinion 2 kinds of policy work: regime evaluation and action evaluation. Academics do former. Policymakers do latter. (1)
— NRKocherlakota (@kocherlakota009) January 14, 2017
There is no point of DSGE work @Noahpinion It has been a distortion of the macro profession resource allocations , aka a waste, from day one
— Adam Posen (@AdamPosen) January 14, 2017
In a longer response, Simon Wren-Lewis argues that Blanchard’s “policy model” is just like the “Structural Econometric Models” (SEMs) he has been advocating:
he can only mean SEMs. I prefer SEMs to policy models because SEMs describe what is in the tin: structural because they utilise lots of theory, but econometric because they try and match the data…
…The way I would estimate a SEM today (but not necessarily the only valid way) would be to start with an elaborate DSGE model. But rather than estimate this model using Bayesian methods, I would use it as a theoretical template with which to start econometric work, either on an equation by equation basis or as a set of sub-systems. Where lag structures or cross equation restrictions were clearly rejected by the data, I would change the model to more closely match the data. If some variables had strong power in explaining others but were not in the DSGE specification, but I could think of reasons for a causal relationship (i.e. why the DSGE specification was inadequate), I would include them in the model. That would become the SEM…
…SEMs are also useful for DSGE model development because their departures from DSGEs provide a whole list of potential puzzles for DSGE theorists to investigate. Maybe one day DSGE will get so good at matching the data that we no longer need SEMs, but we are a long way from that…
As we can see from the passage quoted from our interview with Blanchard, there are certain similarities between Blanchard’s and Wren-Lewis’s visions of “policy models”.
Stephen Williamson also weighed in on what he thinks Blanchard’s “policy model” actually is.
The prime example of such models is the FRB/MIT/Penn model, which reflected in part the work of Klein, Ando, and Modigliani, among others, including (I’m sure) many PhD students. There was indeed a time when a satisfactory PhD dissertation in economics could be an estimation of the consumption sector of the FRB/MIT/Penn model.
Old-fashioned large-scale macroeconometric models borrowed their basic structure from static IS/LM models. There were equations for the consumption, investment, government, and foreign sectors. There was money demand and money supply. There were prices and wages. Typically, such models included hundreds of equations, so the job of estimating and running the model was subdivided into manageable tasks, by sector…
…What happened to such models? Well, they are alive and well, and one of them lives at the Board of Governors in Washington D.C. – the FRB/US model….
…But, it’s not clear that large-scale macroeconometric models are taken that seriously these days, even in policy circles, Janet Yellen aside. While simulation results are presented in policy discussions, it’s not clear whether those results are changing any minds. Blanchard recognizes that we need different models to answer different questions, and one danger of the one-size-fits-all large-scale model is its use in applications for which it was not designed. Those who constructed FRB/US certainly did not envision the elements of modern unconventional monetary policy…
Yet, as we can see from Williamson’s comment on Blanchard’s vision below, he does not see eye to eye with Blanchard on the future development of macro models:
Blanchard seems pessimistic about the future of policy modeling. In particular, he thinks the theory modelers and the policy modelers should go their own ways. I’d say that’s bad advice. If quantitative models have any hope of being taken seriously by policymakers, this would have to come from integrating better theory in such models. Maybe the models should be small. Maybe they should be more specialized. But I don’t think setting the policy modelers loose without guidance would be a good idea.
I am in no position to judge who is “right”, but I have to point out to readers that Blanchard may actually hold a more optimistic view of the future of macro models than Williamson suggests:
Q: Is the way of building a model that you just mentioned too idealistic to be possible? Or is it just that macroeconomists are not working fast enough to make significant progress in macroeconomics?
B: I think that there is a lot of progress. If you look at, say, the NBER working papers, there are five or six macroeconomics papers coming out every week. They are all extremely interesting.
What happens is that this research has not been fed into the DSGE models. The DSGE models live in their own universe. They do not build enough on the very good work that has been happening in macroeconomics in recent years.
So it is not that we don’t have enough good research. It is that there is not enough coordination and integration of the results.
Q: Are there any other ways to speed up progress in coordinating and integrating the research? Maybe the IMF Annual Research Conference “Macroeconomics after the Great Recession” that you co-hosted this year is one of those efforts to further the coordination?
B: Yes. And there is another project by David Vines, a professor at Oxford University. He, together with other researchers, is trying to build an alternative DSGE. It will be microfounded, but it will also be based on more realistic consumption functions, investment behavior, and asset demand. It builds on work done by others. And there are other projects underway as well. I hope some of them will be successful.
This is a very important discussion for our “Where is the General Theory of the 21st Century?” project. Therefore, we will keep track of any further discussion from other top economists.
>>> Follow EconReporter on Twitter - @WITGT21