Tuesday 25 January 2011

Post Walrasian Macroeconomics: Beyond the Dynamic Stochastic General Equilibrium Model



David Colander | 2006-07-17 00:00:00 | Cambridge University Press | 440 | Economics
Macroeconomics is evolving in an almost dialectic fashion. The latest evolution is the development of a new synthesis that combines insights of new classical, new Keynesian and real business cycle traditions into a dynamic, stochastic general equilibrium (DSGE) model that serves as a foundation for thinking about macro policy. That new synthesis has opened up the door to a new antithesis, which is being driven by advances in computing power and analytic techniques. This new antithesis is coalescing around developments in complexity theory, automated general to specific econometric modeling, agent-based models, and non-linear and statistical dynamical models. This book thus provides the reader with an introduction to what might be called a Post Walrasian research program that is developing as the antithesis of the Walrasian DSGE synthesis.
Reviews
David Colander is an intelligent, principled, and indefatigable proponent of alternatives to the received wisdom in economics. This book lays out a cogent critique of standard macroeconomic theory and proposes directions for empirically grounded alternatives.



It is easy to critique the three main traditional branches of macroeconomics: Walrasian, Keynesian, and New Classical. The Walrasian model is purely an equilibrium model with no known analytically based dynamics. Because macroeconomics deals centrally with disequilibrium and dynamical phenomena, it is perfectly useless for macroeconomic dynamics, and this has been well known for fifty years. The Keynesian and New Classical models are "toy" models with "representative agents" and single markets in investment and consumption. The Keynesian model makes assumptions concerning individual irrationality that it claims underlie the volatility of the market economy, and the New Classical model makes even more bizarre assumptions concerning rationality and market clearing. These models are, to my mind, so incredibly stupid that it is unbelievable that reasonable economists would place any credence in them.



In fact, most economists don't really buy the standard models, but they are more or less in agreement with the policy implications of these models. This is lamentable, because their policy implications are unreasonable as often as they are reasonable (although I must say that the New Classical policy recommendations are sometimes even stupider than the theory from which they are derived, and the Keynesian recommendations are often just social democratic crap---I tend to prefer the latter as the lesser of two really, really distasteful evils). I also think a great case can be made for the argument that many Nobel prizes awarded to New Classical ideologues are due to the right-wing zeal of the Nobel Prize Committee's august chairman. The idea that Lucas, Prescott, et al. have made "scientific discoveries" is just one big joke.



What about alternatives? Let me say first that Colander has made a big mistake in identifying the reigning wisdom in macroeconomics as "Walrasian," and his proposed alternatives as "post-Walrasian." As Alan Kirman clearly states in the Foreword to this book, not a single one of the attributions to Walras by the "post-Walrasians" is even remotely correct. Here is just one example; others are scattered throughout the volume: "Walrasians assume high-level information processing capability and a rich information set." (p. 2) In fact, agents in the Walrasian general equilibrium model are price takers who know nothing but their own personal consumption/production/ownership characteristics.



From Colander's choice of contributors, one can infer that he sees two distinct directions for what he calls post-Walrasian macroeconomics. The first is what I would call "Walrasian Dynamics," which treats the economy as a complex dynamic nonlinear system and uses techniques from complexity theory, especially agent-based modeling and analytical models with "fat tail" error distributions. This is ably represented by Leigh Tesfatsion, Robert Axtell, and Blake LeBaron. For the interested reader, my paper "The Dynamics of General Equilibrium" (Economic Journal, 2007) lies in this area and is the first to demonstrate both the quasi-stability and the excessive volatility of a dynamic version of Walrasian general equilibrium. The second direction is vector autoregression techniques that attempt to do without a specific "representative agent" model of the macro economy. I have no faith in the possibilities of such models, but the reader can judge for himself.



This book does not deliver the goods, but it is inspirational for those who are hardy enough to attempt to forge an alternative to the ghastly macro models that monopolize the field these days.
Reviews
This book is a collection of essays dealing with Keynes, uncertainty, "modern" macroeconomic modeling, and the DSGE model. The model is criticized; however, the criticism is not decisive, because the majority of the essay authors still want to work within the confines of mathematical probability theory, which requires additivity and linearity.

My review will concentrate on the introduction and on the essays written by Leijonhufvud and Colander. Many of the other essays concentrate on the "modern" form of general equilibrium theory called the "Dynamic Stochastic General Equilibrium" (DSGE) model. The DSGE model is simply a version of the rational expectations approach based on a reinterpretation of Khinchin's gas-particle model from physics, which leads to the conclusion that the economy can be correctly modeled as following a Wiener process over time, i.e., a normal probability distribution. The normal distribution is used in this reinterpreted model to describe the millions of random interactions of particles (the trading activities of millions of independent and identically distributed consumer-producers) that gain or lose energy (gain or lose by exchanging above or below the mean market price) through the gain or loss of electrons (incomes increase or decrease) in the endless collisions of the gas particles (endless market exchanges above or below the expected, mean, equilibrium market price). Heating up the gas particles (economic expansion) increases the interactions, while cooling them down (economic contraction) decreases the interactions. The law of large numbers and the central limit theorem allow one to calculate the means and standard deviations, which measure the 'uncertainties' (risk) of this process over time.



The only problem is that there is not a shred of empirical, historical, or statistical evidence to support this Ptolemaic, artificially constructed model. Benoit Mandelbrot, as well as the thousands of researchers outside the economics profession who have duplicated and replicated his work, has established over the last 50 years that the distributions are Cauchy. An examination of the data totally and completely refutes the DSGE.
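The tail-behavior point above can be illustrated with a minimal simulation (this sketch is mine, not from the book): drawing the same number of observations from a normal distribution, as a Wiener-process model assumes, and from a standard Cauchy distribution of the kind Mandelbrot argued fits the data, and counting extreme events in each.

```python
# Minimal sketch (not from the book): normal vs. Cauchy tail behavior.
import math
import random

random.seed(42)
N = 10_000

# Normal draws, as implied by a Wiener-process (Brownian) model.
normal = [random.gauss(0.0, 1.0) for _ in range(N)]

# Standard Cauchy draws via the inverse-CDF transform tan(pi*(u - 1/2)).
cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]

# Count "extreme" observations more than 10 units from the center.
normal_extremes = sum(abs(x) > 10 for x in normal)
cauchy_extremes = sum(abs(x) > 10 for x in cauchy)

print(normal_extremes, cauchy_extremes)
```

Under the normal model, a 10-sigma event is so improbable that 10,000 draws essentially never produce one; under the Cauchy, roughly 6% of draws exceed that threshold, which is the substance of the "fat tails" objection to normal-distribution macro models.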



The second problem with the essays of Leijonhufvud and Colander is that the authors are ignorant of the basic model Keynes used in chapters 20 and 21 of the GT. This model is an economic version of Keynes's conventional coefficient of risk and weight, c, which introduced nonlinearity and nonadditivity into the decision-making calculus. Keynes captures the decisive difference from the very special linear and additive risk concept used by the DSGE model through the specification of his e and e_d elasticities (see pp. 304-306 of the GT). Values of the e and e_d elasticities equal to 1 lead to the special-case DSGE result of full employment over time, while values of e and e_d less than 1 lead to the general result of involuntary unemployment.



I append Keynes's 1921 A Treatise on Probability (TP) form of the analysis below for the interested reader. It can be found in sections 6-8 of chapter 26. The technical details can be found on p. 315 and in footnote 2 on p. 315. Keynes presented a very precise analysis demonstrating that an analysis of uncertainty introduces nonadditivity and nonlinearity into the formal representation of decision making. The subjectivist, Bayesian approach regards probability as another name for the purely mathematical laws of the probability calculus, which require additivity and linearity. The subjectivist approach makes the crucial error of conflating probability theory with decision theory. Keynes realized that, due to the impact of the weight of the evidence (confidence) on decision makers, as well as the optimism or pessimism of the decision maker, decision theory would have to take into account nonlinearity and nonadditivity. The concept of expected value or utility is crucial to the Ramsey-De Finetti approach. Keynes demonstrated that expected value or expected utility can, at best, be only a special case of a much more general theory.

The Ramsey-De Finetti approach is the mathematical translation of Jeremy Bentham's Utilitarian approach. Bentham's position was that the whole cannot be anything more than the sum of the individual, atomic parts. However, this requires the assumptions of additivity and linearity. Bentham also assumed that all decision makers can calculate the odds; Keynes showed that this was not the case. Keynes's demonstration, taken from chapter 26 of his A Treatise on Probability (1921; TP), shows that any expected value (utility) approach based on the purely mathematical laws of the probability calculus is a very special case, holding only when w = 1. Bentham claimed that all individuals have the capability to calculate the odds and outcomes and to act on the expected utility (the probability times the utility of the outcome) in a rational way. This can be expressed as follows, where p is the probability of success, q is the probability of failure, and A is the outcome:



Maximize pA.



The modern version of this is to maximize pU(A), where p is a subjective probability that is additive, linear, precise, and exact, and U(A) is a Von Neumann-Morgenstern utility function. The goal is to



Maximize pU(A).


The modern name for Benthamite Utilitarianism in neoclassical economics is SEU theory (Subjective Expected Utility). Therefore, a microeconomic foundation based on utility maximization is just Benthamite Utilitarianism updated with modern mathematical probability techniques. Modern macroeconomics is all disguised SEU theory.



Keynes rejected Benthamite Utilitarianism as a very special case that would hold only under the special assumptions of the subjectivist, Bayesian model: that all probabilities are additive, linear, precise, single-number answers obeying the purely mathematical laws of the probability calculus.



Keynes specifies his conventional coefficient of risk and weight, c, model in chapter 26 of the TP, on p. 314 and in footnote 2 on p. 314, as a counterweight to the Benthamite Utilitarian approach of Ramsey.



Essentially, Keynes's generalized model is given by







c = 2pw/[(1+q)(1+w)],



where w is Keynes's weight of the evidence variable, which measures the completeness of the relevant, available evidence upon which the probabilities p and q are calculated. (Benthamite Utilitarians always assume that the value of w is 1.) w is an index defined on the unit interval between 0 and 1, p is the probability of success, and q is the probability of failure. p and q sum to 1 if they are additive; this requires that w = 1. Keynes's c coefficient can be rewritten as



c = p[1/(1+q)][2w/(1+w)].



Now multiply the above by A or U(A). One obtains



cA = p[1/(1+q)][2w/(1+w)]A, or





cU(A) = p[1/(1+q)][2w/(1+w)]U(A).



The goal is to maximize cA or cU(A). The weight 1/(1+q) deals with nonlinearity of probability preferences; the weight 2w/(1+w) deals with nonadditivity. Modern macroeconomics amounts to nothing more than the claim that c = p, i.e., that cA = pA and cU(A) = pU(A).


It is now straightforward to see that the neoclassical microfoundations of macroeconomics assume that all probabilities are additive and linear. This is nothing but a special case of Keynes's generalized decision rule to maximize cA or cU(A), as opposed to the Benthamite Utilitarian rule to maximize pA or pU(A). Economists today have only a very vague, hazy, cloudy understanding of Keynes's distinction between risk and uncertainty. It is this distinction that must be grasped before any economist can have any hope of understanding what Keynes meant in the GT.


The conclusion is very straightforward. SEU theorists use the rule to maximize pU(A); Keynes used the rule to maximize cU(A). Keynes's rule is of the same kind as the rules used by the overwhelmingly ambiguity-averse decision makers who populate the real world, in the past as well as the present. Keynes's analysis of uncertainty is clearly related to nonadditivity and nonlinearity. It is an anomaly only in a neoclassical world of linearity and additivity, a world that exists only for Benthamite Utilitarians and for the DSGE model.
