SAS/ETS 9.22 User's Guide 20

SAS/ETS User's Guide 20. Provides detailed reference material for using SAS/ETS software and guides you through the analysis and forecasting of features such as univariate and multivariate time series, cross-sectional time series, seasonal adjustments, multiequational nonlinear models, discrete choice models, limited dependent variable models, portfolio analysis, and generation of financial reports, with introductory and advanced examples for each procedure. You can also find complete information about two easy-to-use point-and-click applications: the Time Series Forecasting System, for automatic and interactive time series modeling and forecasting, and the Investment Analysis System, for time-value-of-money analysis of a variety of investments.

Chapter 6: Nonlinear Optimization Methods

... if the computation of the Hessian matrix is computationally expensive, one of the (dual) quasi-Newton or conjugate gradient algorithms may be more efficient.

Newton-Raphson Optimization with Line Search (NEWRAP)

The NEWRAP technique uses the gradient g^(k) and the Hessian matrix H^(k); thus, it requires that the objective function have continuous first- and second-order derivatives inside the feasible region. If second-order derivatives are computed efficiently and precisely, the NEWRAP method can perform well for medium-sized to large problems, and it does not need many function, gradient, and Hessian calls.

This algorithm uses a pure Newton step when the Hessian is positive definite and when the Newton step reduces the value of the objective function successfully. Otherwise, a combination of ridging and line search is performed to compute successful steps. If the Hessian is not positive definite, a multiple of the identity matrix is added to the Hessian matrix to make it positive definite.

In each iteration, a line search is performed along the search direction to find an approximate optimum of the objective function. The default line-search method uses quadratic interpolation and cubic extrapolation (LIS=2).

Newton-Raphson Ridge Optimization (NRRIDG)

The NRRIDG technique uses the gradient g^(k) and the Hessian matrix H^(k); thus, it requires that the objective function have continuous first- and second-order derivatives inside the feasible region.

This algorithm uses a pure Newton step when the Hessian is positive definite and when the Newton step reduces the value of the objective function successfully. If at least one of these two conditions is not satisfied, a multiple of the identity matrix is added to the Hessian matrix.

The NRRIDG method performs well for small- to medium-sized problems, and it does not require many function, gradient, and Hessian calls. However, if the computation of the Hessian matrix is computationally expensive, one of the (dual) quasi-Newton or conjugate gradient algorithms may be more efficient.
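The interplay of the pure Newton step, the ridge modification, and the line search can be illustrated outside of SAS. The following Python sketch is not part of SAS/ETS and only illustrates the ideas above under stated assumptions: it takes a pure Newton step when the Hessian is positive definite, adds a multiple of the identity matrix when it is not (as NRRIDG does), and searches along the resulting direction for a step that reduces the objective (as NEWRAP does). A simple backtracking search stands in for the default quadratic-interpolation and cubic-extrapolation line search (LIS=2); all function and variable names are illustrative.

```python
# Minimal sketch (not SAS code), assuming a smooth objective with analytic
# gradient and Hessian: pure Newton step when possible, ridging of the
# Hessian otherwise, and a backtracking step-length search.
import numpy as np

def newton_ridge_linesearch(f, grad, hess, x0, max_iter=50, tol=1e-8, ridge0=1e-4):
    x = np.asarray(x0, dtype=float)
    n = len(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:              # gradient small enough: stop
            break
        H = hess(x)
        ridge = 0.0
        while True:                              # add lambda*I until H + lambda*I is positive definite
            try:
                L = np.linalg.cholesky(H + ridge * np.eye(n))
                break
            except np.linalg.LinAlgError:
                ridge = ridge0 if ridge == 0.0 else 10.0 * ridge
        # solve (H + ridge*I) d = -g using the Cholesky factor L
        d = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        # line search: accept the full Newton step if it reduces f, otherwise shrink it
        alpha, f0 = 1.0, f(x)
        while f(x + alpha * d) >= f0 and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
    return x

# Usage: minimize the two-dimensional Rosenbrock function
if __name__ == "__main__":
    f    = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                               200.0 * (x[1] - x[0]**2)])
    hess = lambda x: np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                               [-400.0 * x[0], 200.0]])
    print(newton_ridge_linesearch(f, grad, hess, [-1.2, 1.0]))   # converges near [1, 1]
```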
