The discussion was civil, partly because Simon Wren-Lewis agrees that the fixation on "microfoundations" makes very little sense. However, since mainstream academics still insist that microfoundations are necessary to get published in mainstream journals, his flexibility is not particularly representative.
The following argument by Wren-Lewis is interesting.
Mainstream macroeconomics' addiction to microfoundations methodology has given heterodox economists an opportunity. If mainstream macro continues to shun what it calls policy models (models that use aggregate relationships justified by an eclectic mix of theory and data), then this space can be occupied by others. But to do that heterodox economists have to stop being heterodox, by which I mean defining themselves by being against almost all mainstream theory. As Jo Michell writes, "The problem with heterodox economics is that it is self-definition in terms of the other". As the scope and diversity of mainstream theory gets larger and wider, the space that can be occupied by those who reject the mainstream shrinks.

I would like to agree with this sentiment, but I differ on the value of mainstream economics.
The mainstream methodology within macroeconomics largely consists of writing down an overly complex model, waving hands about how to solve the model (linearisation, etc.), and then drawing conclusions from the model results (which are often asserted, as there is little indication that the mathematics was solved properly). The verbal description of the model results is then presented as being policy relevant.
However, the initial model is wildly unrealistic, and so critics can obviously point out that the conclusions are unjustified (for example, the model has no financial sector). The response is to create a completely different unrealistic model in another paper that handles that particular objection, and then to draw new policy conclusions -- verbally.
This sequence of verbal observations is what provides the apparent diversity of mainstream macro. The mathematical models themselves are a sequence of unrealistic messes that are not consistent with one another.
Constructive Advance
Once I finish the remaining sections of my latest book (on money), I expect that I will be writing a more technical work about some post-Keynesian approaches to the business cycle. (I cannot hope to cover all of them.) My objective is not to advance post-Keynesian theory, but rather to give my spin as an outsider on the theory. The idea is that it will be an (advanced) primer on post-Keynesian economics, shorter (and cheaper!) than the various textbooks that are available.
As I have previously discussed, I am skeptical about the ability of any model to forecast the economy in a reliable fashion. (During an expansion, growth follows a steady trend, so any methodology that extrapolates past behaviour is going to look good -- until the recession hits.) However, mathematical models still provide some useful input to our thought processes. I hope to be able to demonstrate the usefulness of methods (such as SFC models), even if we cannot hope to have a perfect crystal ball.
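To give a concrete flavour of what I mean by "useful input", here is a minimal sketch (in Python) of the simplest stock-flow consistent model, the SIM model from Godley and Lavoie's Monetary Economics. The parameter values are purely illustrative, not a calibration of anything.

```python
# Minimal sketch of the SIM model (Godley & Lavoie, "Monetary Economics").
# Parameter values are illustrative only.

ALPHA1 = 0.6   # propensity to consume out of disposable income
ALPHA2 = 0.4   # propensity to consume out of accumulated money balances
THETA = 0.2    # tax rate on income
G = 20.0       # government spending (held constant)

def simulate(periods=60):
    h = 0.0  # household money holdings (equals cumulative government deficits)
    path = []
    for t in range(periods):
        # Within-period solution of:
        #   Y = C + G,  T = THETA*Y,  YD = Y - T,  C = ALPHA1*YD + ALPHA2*H(-1)
        y = (G + ALPHA2 * h) / (1.0 - ALPHA1 * (1.0 - THETA))
        tax = THETA * y
        yd = y - tax
        c = ALPHA1 * yd + ALPHA2 * h
        # Stock-flow consistency: household saving equals the government deficit,
        # and both equal the change in the stock of money.
        assert abs((yd - c) - (G - tax)) < 1e-9
        h = h + (yd - c)
        path.append((t, y, c, h))
    return path

if __name__ == "__main__":
    for t, y, c, h in simulate():
        print(f"t={t:2d}  Y={y:7.2f}  C={c:7.2f}  H={h:7.2f}")
```

The output converges towards the steady state Y = G/THETA. The point is not the forecast; it is that every flow has a matching change in a stock, so the accounting closes in every period.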
For the purposes of business cycle analysis, the various post-Keynesian approaches I will cover will be an improvement over DSGE macro. The "big insight" from DSGE macro is that recessions are caused by inexplicable collapses in productivity. That is setting the bar very low for competing approaches.
See also:
- My resource page for SFC models/Modern Monetary Theory.
- Jo Michell also has a good discussion (link added in an update), including why the name "stock-flow consistent" models was somewhat unfortunate.
(c) Brian Romanchuk 2016
Brian,
Have you picked up the Bank of England working paper on SFC models?
I saw the abstract; I have not had time to read it yet. I should take a look later.
The two responses (links) you suggested lead to the same article.
Oops, cut and paste mishap.
Thanks.
Wren-Lewis: "To summarise, if you were to ask how this model compares to other aggregate (non-microfounded) models, the answer would probably be that it takes theory less seriously and it has a rather elaborate financial side."
I'm surprised no one seems to be commenting on this part of his post. I do not get how SFC models would be taking theory "less seriously".
It is certainly a balancing act to choose the level of abstraction for a model, but it is quite unexpected, especially for a professor, to say that a more detailed model is "less theory".
I didn't comment on it because I did not see it.
I would guess that what he means is that the details in the SFC model are too simple, in his opinion. I accept that it would be nice to have more complex mechanisms (optimisation) embedded within a model, but the reality is that we need to be able to solve the model. Writing down something really complex that has no means of solution is not really mathematics.
I think that's right: in the comment section he refers to VAR, which is mentioned in the BoE paper.
But I think that is a bit unfair: SFC models could easily incorporate more complex mechanisms, but then the model might not be solvable, as you said. DSGE agents are optimizing, but at the same time the model abstracts other critical features away for the very same reason; otherwise it would not be solvable. So it is just a modelling choice; in my opinion, the important question is which one is the more useful abstraction, not which one is less theoretical.
Theory in the mainstream means the Lucas critique and rational expectations.
Of course, heterodox people could say that the mainstream does not take theory seriously because it ignores things like the Cambridge Capital Controversy.
This is slightly off topic, but I remember reading about how mainstream models are not SFC because they have real instead of nominal accounting variables. Did I read that here, or can anyone point me to discussions of this?
Jo Michell has a follow-up article on this topic, including the stock-flow consistency of mainstream models. (As he notes, the name "Stock-Flow Consistent Model" is now viewed as a mistake, as the issue is not that all other methods get the accounting wrong.)
He wrote the same thing that I have written: most modern DSGE models start out respecting nominal accounting identities, but the step of linearisation breaks the accounting. Older "mainstream" models had bigger problems with accounting, since they were more approximate. The fact that DSGE models privilege real variables makes the accounting more difficult, but not necessarily impossible to do correctly.
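To make the "linearisation breaks the accounting" point concrete, here is a small numeric sketch (the steady-state numbers are invented for illustration): the identity Y = C + I holds exactly in levels, but its log-linearised counterpart only holds to first order, so rebuilding the aggregate from the linearised components leaves a residual away from the steady state.

```python
import math

# Illustrative steady state (made-up numbers): Y = C + I holds exactly in levels.
Y_SS, C_SS, I_SS = 100.0, 80.0, 20.0

# A state of the world some distance from the steady state.
C, I = 90.0, 15.0
Y = C + I  # exact accounting: 105.0

# Log-linearised identity: y_hat = (C_SS/Y_SS)*c_hat + (I_SS/Y_SS)*i_hat,
# where the hats are log deviations from the steady state.
c_hat = math.log(C / C_SS)
i_hat = math.log(I / I_SS)
y_hat = (C_SS / Y_SS) * c_hat + (I_SS / Y_SS) * i_hat

Y_rebuilt = Y_SS * math.exp(y_hat)
print(Y, Y_rebuilt)  # 105.0 versus roughly 103.7: the identity no longer closes exactly
```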
As I forgot to mention, I added the link to the new Jo Michell article above.
It was something different from the linearisation issue.
Googling around, I think it had to do with the fact that chain price indexes are not additive, so real variables calculated with chain indexes cannot be used as accounting variables and be stock-flow consistent.
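For what it is worth, that non-additivity is easy to demonstrate numerically. The sketch below (with invented prices and quantities for a two-sector economy) builds a chained Fisher quantity index: once relative prices move, the sum of the components' real values no longer equals the chained real aggregate.

```python
import math

# Two "sectors", each a single good: (price, quantity) by year.
# The numbers are made up purely to illustrate the point.
years = [
    {"A": (1.0, 10.0), "B": (1.0, 10.0)},
    {"A": (1.0, 12.0), "B": (2.0, 8.0)},
    {"A": (1.0, 15.0), "B": (3.0, 6.0)},
]

def fisher_quantity_link(prev, curr):
    """Fisher quantity index linking one year to the next."""
    laspeyres = (sum(prev[k][0] * curr[k][1] for k in prev) /
                 sum(p * q for p, q in prev.values()))
    paasche = (sum(curr[k][0] * curr[k][1] for k in curr) /
               sum(curr[k][0] * prev[k][1] for k in prev))
    return math.sqrt(laspeyres * paasche)

# Chained real aggregate (reference year 0).
real_agg = [sum(p * q for p, q in years[0].values())]
for prev, curr in zip(years, years[1:]):
    real_agg.append(real_agg[-1] * fisher_quantity_link(prev, curr))

# Each component is a single good, so its chained real value is just
# its quantity valued at the reference-year price.
for t, data in enumerate(years):
    comp_sum = sum(years[0][k][0] * data[k][1] for k in data)
    print(f"year {t}: sum of real components = {comp_sum:6.2f}, "
          f"chained real aggregate = {real_agg[t]:6.2f}")
```

After year 0, the two numbers diverge, so the "real" components cannot double as accounting variables in the way nominal flows can.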
Great article! As a young scholar who identifies with post-Keynesian theory, I also think that communication with mainstream macro should be greater, in order to overcome its problems and (more importantly) provide reliable alternatives. As you said, the bar is low, and we must accept the task of lifting it. Thanks.
Thanks. It has been a while since I wrote this, and I have looked a bit more at DSGE macro. I am largely mystified by the mathematics, which is rather impressive given that my training is as an applied mathematician. I managed to get some useful references from some mainstream economists on Twitter, so I am closer to figuring out what they are going on about. I hope to discuss that in an article that I will co-write with Alex Douglas some time...