Download English ISI Article No. 5925

Article Title
A medium-N approach to macroeconomic forecasting
Article Code: 5925
Publication Year: 2012
Length: 7-page PDF
Source

Publisher: Elsevier - Science Direct

Journal: Economic Modelling, Volume 29, Issue 4, July 2012, Pages 1099–1105

Keywords
Partial least squares, Principal component regression, Dynamic factor models, Data-rich forecasting methods, Dimension-reduction techniques
Article Preview

English Abstract

This paper considers methods for forecasting macroeconomic time series in a framework where the number of predictors, N, is too large to apply traditional regression models but not sufficiently large to resort to statistical inference based on double asymptotics. Our interest is motivated by a body of empirical research suggesting that popular data-rich prediction methods perform best when N ranges from 20 to 40. In order to accomplish our goal, we resort to partial least squares and principal component regression to consistently estimate a stable dynamic regression model with many predictors as only the number of observations, T, diverges. We show both by simulations and empirical applications that the considered methods, especially partial least squares, compare well to models that are widely used in macroeconomic forecasting.

English Introduction

Growing attention has recently been devoted to forecasting economic time series in a data-rich framework (see, inter alia, Forni et al., 2005, and Stock and Watson, 2002a). In principle, the availability of large data sets in macroeconomics provides the opportunity to use many more predictors than are conventionally included in typical small-scale time series models. However, exploiting this richer information set comes at the price of estimating a larger number of parameters, which renders the application of traditional multiple regression models numerically cumbersome or even impossible. A standard solution to this problem is to impose a factor structure on the predictors, so that principal component [PC] techniques can be applied to extract a small number of components from a large set of variables. Some key results on forecasting with many predictors through PCs are given in Stock and Watson (2002a, 2002b) and Forni et al. (2003, 2005). Recently, Gröen and Kapetanios (2008) have proposed partial least squares [PLS] as an alternative to PCs for extracting the common factors. A different methodological framework is Bayesian regression, as recently advocated by De Mol et al. (2008) and Banbura et al. (2010). In particular, these authors attempted to solve the dimensionality problem by shrinking the forecasting model parameters using ridge regression [RR]. A common feature of the above approaches is that statistical inference requires a double-asymptotics framework, i.e. both the number of observations T and the number of predictors N need to diverge to ensure consistency of the estimators. However, an interesting question is how large the predictor set must be to improve forecasting performance. At the theoretical level, the answer provided by the double-asymptotics method is clear-cut: the larger N, the smaller the mean squared forecasting error.
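As a minimal, purely illustrative sketch of the PC approach described above (not the paper's own Monte Carlo design), one can simulate hypothetical data with a factor structure, extract a few principal components from a large predictor set, and regress the target on the estimated factors in the spirit of Stock and Watson's diffusion-index regressions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Hypothetical simulated data with an approximate factor structure:
# N = 30 predictors driven by r = 2 common factors plus idiosyncratic noise.
T, N, r = 250, 30, 2
factors = rng.standard_normal((T, r))
loadings = rng.standard_normal((r, N))
X = factors @ loadings + 0.3 * rng.standard_normal((T, N))
y = factors @ np.array([1.0, -0.8]) + 0.1 * rng.standard_normal(T)

# Diffusion-index-style regression: extract r principal components from X,
# then regress the target on the estimated factors.
pcs = PCA(n_components=r).fit_transform(X)
model = LinearRegression().fit(pcs, y)
r2 = model.score(pcs, y)
print(f"In-sample R^2 of the component regression: {r2:.3f}")
```

All variable names and the data-generating design here are assumptions made for illustration; the point is only that a handful of estimated components can summarize the information in a much larger predictor set.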
However, Watson (2003) found that factor models offer no substantial predictive gain from increasing N beyond 50; Boivin and Ng (2006) showed that factors extracted from 40 carefully chosen series yield no less satisfactory results than using 147 series; Banbura et al. (2010) found that a vector autoregressive [VAR] model with 20 key macroeconomic indicators forecasts as well as a larger model of 131 variables; and Caggiano et al. (2011) documented that the best forecasts of the 7 largest European GDPs are obtained when factors are extracted from only 12 to 22 variables. The above results advocate a sort of "medium-N" approach to macroeconomic forecasting. Specifically, we aim to solve prediction problems in macroeconomics where N is considerably larger than in typical small-scale forecasting models but not sufficiently large to resort to statistical inference based on double-asymptotics methods. To accomplish this goal, we reconsider some previous results from the PLS literature in a time-series framework. In particular, we argue that, under the so-called Helland and Almoy condition (Helland, 1990, and Helland and Almoy, 1994), both principal component regression [PCR] and the PLS algorithm due to Wold (1985) provide estimates of a stable dynamic regression model that are consistent as T alone diverges. Since little is known to date about the statistical properties of PLS in finite samples, a Monte Carlo study is carried out to evaluate the forecasting performance of this method in a medium-N environment. To our knowledge, our simulation analysis is unique in that we simulate time series generated by stationary 20-dimensional VAR(2) processes that satisfy the Helland and Almoy condition. Indeed, several studies have compared PCR and PLS with other methods (see, inter alia, Almoy, 1996), but always in a static framework.
Our results suggest that dynamic regression models estimated by PCR and, especially, PLS forecast well compared to both OLS and RR. In the empirical application, we forecast four US macro time series by a rich variety of methods, using variables similar to those in the medium-dimension VAR model of Banbura et al. (2010). The empirical findings indicate that PLS outperforms the competitors. Interestingly, Lin and Tsay (2006), Gröen and Kapetanios (2008) and Eickmeier and Ng (2011) reached similar conclusions using PLS as an alternative to PCs in large-N dynamic factor models. The remainder of this paper is organized as follows. The main theoretical features of the suggested methods are detailed in Section 2. The Monte Carlo design and the simulation results are discussed in Section 3. Section 4 compares various forecasting procedures in empirical applications to US economic variables. Finally, Section 5 concludes.

English Conclusion

In this paper we have examined the forecasting performance of various models in a medium-N environment. Moreover, we have argued that, under the so-called Helland and Almoy condition (Helland, 1990, and Helland and Almoy, 1994), both PCR and PLS provide estimates of a stable dynamic regression model that are consistent as T alone diverges. Our Monte Carlo results, obtained by simulating a 20-dimensional VAR(2) process that satisfies the Helland and Almoy condition, reveal that PLS often outperforms the competitors, especially when the sample size T and the number of relevant components become larger. In the empirical application, we have forecast four US monthly time series by a variety of competing models, using variables similar to those in the medium-dimension VAR model of Banbura et al. (2010). Interestingly, PLS turned out to perform better than other, better-known forecasting methods. Moreover, we emphasize that the suggested PLS approach is computationally less demanding than the switching algorithm proposed by Gröen and Kapetanios (2008).