Handbook of Economic Forecasting part 51

Research on forecasting methods has made important progress in recent years, and these developments are brought together in the Handbook of Economic Forecasting. The handbook covers how forecasts are constructed from multivariate time-series models, dynamic factor models, nonlinear models, and combination methods. It also includes chapters on forecast evaluation, covering both point forecasts and probability forecasts, as well as chapters on survey forecasts and volatility forecasts. Areas of application covered in the handbook include economics, finance, and marketing.

4. Artificial neural networks

4.1. General considerations

In the previous section we introduced artificial neural networks (ANNs) as an example of an approximation dictionary supporting highly nonlinear approximation. In this section we consider ANNs in greater detail. Our attention is motivated not only by their flexibility and the fact that many powerful approximation methods (e.g., Fourier series, wavelets, and ridgelets) can be viewed as special cases of ANNs, but also by two further reasons. First, ANNs have become increasingly popular in economic applications. Second, despite their increasing popularity, the application of ANNs in economics and other fields has often run into serious stumbling blocks, precisely reflecting the three key challenges to the use of nonlinear methods articulated at the outset. In this section we explore some further properties of ANNs that may help in mitigating or eliminating some of these obstacles, permitting both their more successful practical application and a more informed assessment of their relative usefulness.

Artificial neural networks comprise a family of flexible functional forms posited by cognitive scientists attempting to understand the behavior of biological neural systems. Kuan and White (1994) provide a discussion of their origins and an econometric perspective. Our focus here is on the ANNs introduced above, that is, the class of single hidden layer feedforward networks, which have the functional form

    f(x, \theta) = x'\alpha + \sum_{j=1}^{q} \psi(x'\gamma_j)\,\beta_j,    (6)

where \psi is a given activation function and \theta = (\alpha', \beta', \gamma')', with \beta = (\beta_1, \ldots, \beta_q)' and \gamma = (\gamma_1', \ldots, \gamma_q')'; x'\gamma_j is called the activation of hidden unit j. Except for the case of ridgelets, ANNs generally take the \gamma_j's to be free parameters, resulting in a parameterization nonlinear in the parameters, with all the attendant computational challenges that we would like to avoid. Indeed, these difficulties have been formalized by Jones (1997) and Vu (1998), who prove that optimizing such an ANN is an NP-hard problem. It turns out, however, that ...
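To make the functional form in (6) concrete, the following is a minimal sketch in Python/NumPy of how a single hidden layer feedforward network maps an input x to an output. The function name single_layer_ann, the logistic choice for the activation function \psi, and the randomly drawn parameter values are illustrative assumptions, not part of the original text.

```python
import numpy as np

def logistic(z):
    # One common choice for the activation function psi; the text leaves psi generic.
    return 1.0 / (1.0 + np.exp(-z))

def single_layer_ann(x, alpha, beta, gamma, psi=logistic):
    """Single hidden layer feedforward network as in Eq. (6):
    f(x, theta) = x'alpha + sum_j psi(x'gamma_j) * beta_j."""
    activations = psi(gamma @ x)           # psi(x'gamma_j): the activation of hidden unit j
    return x @ alpha + activations @ beta  # direct linear term plus hidden-unit contributions

# Illustrative usage with arbitrary dimensions and parameter values.
rng = np.random.default_rng(0)
d, q = 3, 5                                # input dimension and number of hidden units
x = rng.normal(size=d)
alpha, beta = rng.normal(size=d), rng.normal(size=q)
gamma = rng.normal(size=(q, d))            # row j holds gamma_j
print(single_layer_ann(x, alpha, beta, gamma))
```

Because each \gamma_j enters only through \psi, fitting \alpha, \beta, and the \gamma_j's jointly to data is nonlinear in the parameters, which is the computational difficulty formalized by Jones (1997) and Vu (1998).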
