Download English ISI article No. 28664
English article title

Econometric analysis of linearized singular dynamic stochastic general equilibrium models
Article code: 28664
Year of publication: 2007
English article length: 33 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Journal of Econometrics, Volume 136, Issue 2, February 2007, Pages 595–627

English keywords

Estimation, Dynamic general equilibrium, Singularity, Calibration

English abstract

In this paper I propose an alternative to calibration of linearized singular dynamic stochastic general equilibrium models. Given an a-theoretical econometric model as a representative of the data generating process, I will construct an information measure which compares the conditional distribution of the econometric model variables with the corresponding singular conditional distribution of the theoretical model variables. The singularity problem will be solved by using convolutions of both distributions with a non-singular distribution. This information measure will then be maximized with respect to the deep parameters of the theoretical model, which links these parameters to the parameters of the econometric model and provides an alternative to calibration. This approach will be illustrated by an application to a linearized version of the stochastic growth model of King, Plosser and Rebelo.
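To make the convolution idea concrete, the rough Python sketch below builds a singular "theoretical" Gaussian covariance driven by a single shock, convolves both it and a non-singular "econometric" covariance with the same Gaussian noise, and then chooses the deep parameter that brings the two convolved distributions closest together. The covariances, the one-parameter loading vector and the Kullback-Leibler criterion are illustrative assumptions; the paper's actual information measure, the multiplicative conditional reality bound, is defined differently.

# Minimal numerical sketch, assuming zero-mean Gaussian conditional distributions
# and a Kullback-Leibler divergence as a stand-in for the paper's information
# measure (the multiplicative conditional reality bound is defined differently).
import numpy as np
from scipy.optimize import minimize_scalar

k = 3                                        # number of observed model variables
Sigma_econ = np.array([[1.0, 0.3, 0.1],
                       [0.3, 1.2, 0.2],
                       [0.1, 0.2, 0.9]])     # illustrative non-singular econometric covariance
Sigma_noise = 0.1 * np.eye(k)                # convolving with this noise removes the singularity

def theoretical_cov(theta):
    """Singular covariance of a linearized model driven by a single shock (rank one)."""
    b = np.array([1.0, theta, theta ** 2])   # illustrative loadings on the single shock
    return np.outer(b, b)

def kl_divergence(theta):
    """KL divergence between the two convolved (hence non-singular) Gaussian distributions."""
    S_econ = Sigma_econ + Sigma_noise                # convolved econometric distribution
    S_theo = theoretical_cov(theta) + Sigma_noise    # convolved theoretical distribution
    S_theo_inv = np.linalg.inv(S_theo)
    return 0.5 * (np.trace(S_theo_inv @ S_econ) - k
                  + np.log(np.linalg.det(S_theo) / np.linalg.det(S_econ)))

# Maximizing closeness of the two convolved distributions (here: minimizing KL)
# with respect to the deep parameter links it to the econometric model's parameters.
res = minimize_scalar(kl_divergence, bounds=(0.01, 2.0), method="bounded")
print("deep parameter implied by the econometric model:", res.x)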

English introduction

From the 1940s through the 1960s the development of macroeconometrics was inspired and directed by Keynesian macroeconomic theory, and vice versa, the construction and estimation of large Keynesian macroeconomic models was facilitated by econometrics, in particular simultaneous equations theory. With the rise of neoclassical dynamic stochastic general equilibrium (DSGE) macroeconomics, however, econometrics and economic theory have grown apart, to the point where most macro-theorists now consider econometrics irrelevant for what they do. Admittedly, quite a few econometricians have the same attitude towards theoretical macroeconomics.

Most economic theories, including DSGE theory, are partial theories in the sense that only a few related economic phenomena are studied. The analysis of such a "partial" theory is justified, explicitly or implicitly, by the ceteris paribus assumption (other things being equal or constant). See Bierens and Swanson (2000) and the references therein. However, when simple economic models of this type are estimated using data which are themselves generated by a much more complex real economy, it is not surprising that they often fit poorly. Thus, these models do not represent data generating processes, and are not designed to do so. The purpose of these models is to gain insight into particular related economic phenomena rather than to describe an actual economy, and to conduct numerical experiments. Consequently, most macro-theorists do not bother to estimate their models, but instead calibrate the model parameters. See Hansen and Heckman (1996) for a review of calibration, and Sims (1996) and Kydland and Prescott (1996) for opposing views on calibration.

The literature on econometric analysis of DSGE models can be divided into two rather short strands. One strand of the literature is concerned with model evaluation, i.e., the problem of how to measure the fit of these models. The other strand is concerned with finding alternatives to calibration. Watson (1993) proposes to augment the variables in the theoretical model with just enough stochastic error so that the model can match the second moments of the actual data. Measures of fit for the model, called relative mean square approximation errors, are then constructed on the basis of the variance of this stochastic error relative to the variance of the actual series. An alternative approach is to compare the empirical VAR innovation response curves with those computed on the basis of artificial data generated by the calibrated theoretical model. See for example the papers in Pagan (1994), in particular Feve and Langot (1994) and Nason and Cogley (1994). Schorfheide (2000) compares two DSGE models with a benchmark model, using a Bayesian approach. Bierens and Swanson (2000) propose a new measure of fit, called the average conditional reality bound, which compares the non-singular part of a linearized DSGE model with a corresponding marginalized econometric model. Corradi and Swanson (2006) also compare DSGE models with a benchmark model, using squared differences of their distribution functions. DeJong et al. (1996, 2000) and Geweke (1999) propose a Bayesian approach. They assume a prior distribution for the deep parameters centered around calibrated values. This is indeed a natural extension of calibration. However, there are two major limitations to the Bayesian approach.
First, one has to assume that, conditional on the parameters, the theoretical model represents the data generating process, which is too far-fetched an assumption. Second, the Bayesian approach requires the existence of the conditional density of the model variables, whereas in most of these models the model variables are driven by only a few random shocks. The latter renders the theoretical distribution involved singular. DeJong et al. (1996, 2000) circumvent the singularity problem by focusing on a subset of model variables for which the conditional distribution is non-singular. Geweke (1999) applies the Bayesian approach to a one-dimensional equity premium model. Ireland (2004) proposes to add noise to a linearized DSGE model in order to estimate the resulting hybrid model by maximum likelihood. This approach also suffers from the limitation that one has to assume that the hybrid model involved represents the data generating process. The singularity problem also prevents direct estimation of a DSGE model by GMM, because due to the singularity some moment conditions will hold exactly in each time period, so that the number of moment conditions will exceed the number of observations. Therefore, the application of GMM is only possible after (explicitly or implicitly) adding noise to the exact moment equations. See for example Ambler et al. (2003).

In this paper I will propose an alternative non-Bayesian approach to calibration of singular DSGE models, which takes into account that these models do not represent data generating processes and are singular. Given an a-theoretical econometric model as a representative of the data generating process, I will construct an information measure (called the multiplicative conditional reality bound) which compares the conditional distribution of the econometric model variables with the corresponding singular conditional distribution of the theoretical model variables, along the lines of Bierens and Swanson (2000). The singularity problem will be solved by using convolutions of both distributions with a non-singular distribution. This information criterion can be interpreted as the probability that the distribution of the convoluted econometric model is generated by the distribution of the convoluted theoretical model, conditional on the data. The information criterion involved will then be maximized with respect to the deep parameters of the theoretical model, which links these parameters to the parameters of the econometric model and provides an alternative to calibration.

This approach will be applied to a linearized version of the stochastic growth model of King et al. (1988a, 1988b), henceforth KPR. The linearization procedure is different from the one proposed by KPR, though. See King et al. (2001). I will solve the model without using linearization up to the point where the only control variable left is the consumption-output ratio. At that point I will only linearize the state variable process of the concentrated model around the deterministic steady state, and link the parameters of the linearized model to the deep parameters. KPR, on the other hand, linearize the (deterministic) Lagrange multiplier solution of their model at an earlier stage. Although it is not impossible to link the parameters of their linearized model to the deep parameters,1 it is more complicated than in my approach. Consequently, KPR (1988a) do not provide this link, except for a deterministic version of their model with fixed labor. See KPR (1988a, Footnote 17).
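The linearization step described above can be illustrated on a textbook stochastic growth model: solve for the deterministic steady state of the capital process and log-linearize that process around it, so that the coefficients of the linearized model are explicit functions of the deep parameters. The sketch below does this numerically; its functional forms, parameter values and variable names are illustrative assumptions, not the concentrated KPR model of the paper.

# Minimal sketch, assuming a textbook Cobb-Douglas growth model whose control
# has already been reduced to a fixed consumption-output ratio; this is not the
# concentrated KPR model of the paper, only an illustration of the same step.
import numpy as np

alpha, delta = 0.33, 0.025     # illustrative deep parameters (capital share, depreciation)
s = 0.25                       # illustrative saving rate = 1 - consumption-output ratio

def next_capital(k, z):
    """State transition k' = (1 - delta) * k + s * z * k**alpha (concentrated model)."""
    return (1.0 - delta) * k + s * z * k ** alpha

# Deterministic steady state: k* solves k = (1 - delta) * k + s * k**alpha with z = 1.
k_star = (s / delta) ** (1.0 / (1.0 - alpha))

# Log-linearize the state process numerically around (k*, z* = 1):
# k'_hat ~ a_k * k_hat + a_z * z_hat, where hats denote log deviations from steady state.
eps = 1e-6
a_k = (np.log(next_capital(k_star * np.exp(eps), 1.0)) - np.log(k_star)) / eps
a_z = (np.log(next_capital(k_star, np.exp(eps))) - np.log(k_star)) / eps

# The coefficients (a_k, a_z) of the linearized state process are explicit functions
# of the deep parameters (alpha, delta, s), which is the kind of link the estimation
# approach relies on.
print("steady-state capital:", k_star)
print("linearized state-process coefficients:", a_k, a_z)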
A separate appendix to this paper containing the details of some tedious derivations is downloadable from web page http://econ.la.psu.edu/~hbierens/SDSGEMAPP.PDF.
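The singularity problem emphasized in this introduction can also be seen in a few lines of simulation: with three variables driven by a single shock, two linear combinations of the variables hold exactly in every period and the model-implied covariance matrix has rank one. The loading vector and sample size in the sketch below are illustrative assumptions.

# Minimal sketch, assuming three observed variables driven by a single structural
# shock, y_t = b * eps_t, so the model-implied distribution is singular (rank one).
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
b = np.array([[1.0], [0.5], [-0.2]])     # illustrative loadings on the single shock
eps = rng.standard_normal((1, 200))      # one shock, 200 periods
Y = b @ eps                              # 3 x 200 array of model variables

# Two linear combinations of the variables hold exactly in every period, so the
# Gaussian density of y_t does not exist and exact moment conditions outnumber
# the observations; convolving with non-singular noise restores a proper density.
N = null_space(b.T)                      # basis of the 2-dimensional left null space of b
print("rank of model-implied covariance:", np.linalg.matrix_rank(np.cov(Y)))
print("largest violation of the exact restrictions:", np.abs(N.T @ Y).max())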

English conclusion

In the current literature on the econometric analysis of DSGE models it is implicitly or explicitly assumed, if necessary after adding noise to eliminate singularity, that the model represents the data generating process, and that the model parameters can then be estimated by linking the model to the data via maximum likelihood, GMM or other estimation procedures. In this paper I adopt the theorist's view that these models are misspecified as representatives of data generating processes. Instead of linking a DSGE model directly to the data, I propose to link it indirectly to the data via an econometric model which is assumed to represent the data generating process. In doing so I can estimate the deep parameters as functions of the parameters of the econometric model, without worrying about misspecification of the DSGE model. Moreover, via the delta method the estimated deep parameters inherit the asymptotic normality of the estimated parameters of the econometric model, so that the former estimates can be endowed with confidence intervals.

The estimation approach in this paper is applicable to any linearized DSGE model for which the link between the parameters of the linearized model and the deep parameters is preserved. For example, the DSGE models considered by Corradi and Swanson (2006) are linearized using iterated quadratic approximations of the value function, which preserves the link with the deep parameters, and these models can therefore be estimated by my approach. The same applies to the linearized DSGE model considered by Ireland (2004), which is derived from the model restrictions and an Euler equation. Of course, alternative linearization procedures may yield different estimates of the deep parameters.

As indicated before, by calibrating DSGE models theorists limit their ability to detect model failure. In particular, the extent to which the estimated deep parameters deviate from the usual calibrated values, as in the KPR case, provides useful information about possible model failure, and could (or should!) lead to a quest for more realistic models. This paper provides new econometric tools to assist in this endeavor.
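As a concrete illustration of the delta-method step mentioned above, the sketch below propagates the asymptotic covariance of hypothetical econometric-model parameter estimates through an assumed differentiable mapping to a deep parameter, yielding a standard error and confidence interval; the mapping g and all numerical values are illustrative assumptions, not the paper's actual link.

# Minimal sketch of the delta-method step, assuming a hypothetical differentiable
# map g from the econometric model's parameters to a single deep parameter; the
# numbers and the functional form of g are made up for illustration only.
import numpy as np

phi_hat = np.array([0.9, 0.4])           # hypothetical estimated econometric parameters
V_phi = np.array([[0.010, 0.002],
                  [0.002, 0.020]])       # hypothetical asymptotic covariance of phi_hat

def g(phi):
    """Hypothetical link from econometric parameters to a deep parameter."""
    return phi[0] / (1.0 + phi[1])

# Numerical gradient of g at phi_hat (forward differences).
eps = 1e-6
grad = np.array([(g(phi_hat + eps * e) - g(phi_hat)) / eps for e in np.eye(2)])

theta_hat = g(phi_hat)
se_theta = np.sqrt(grad @ V_phi @ grad)  # delta-method standard error
print("deep parameter estimate:", theta_hat)
print("95% confidence interval:",
      (theta_hat - 1.96 * se_theta, theta_hat + 1.96 * se_theta))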