Time Series Forecasting (Part II)

Time Series Forecasting (Part II) covers stationary and nonstationary processes, the autocorrelation function, autoregressive (AR) models, moving average (MA) models, ARMA models, ARIMA models, and the estimation and checking of ARIMA models (the Box-Jenkins methodology).

Time Series Forecasting (Part II)
Duong Tuan Anh
Faculty of Computer Science and Engineering
September 2011

Outline
- Stationary and nonstationary processes
- Autocorrelation function
- Autoregressive (AR) models
- Moving average (MA) models
- ARMA models
- ARIMA models
- Estimating and checking ARIMA models (Box-Jenkins methodology)

Stochastic Processes
The time series models in this part all rest on an important assumption: that the series to be forecasted has been generated by a stochastic process. We assume that the observations X1, X2, ..., XT are drawn randomly from a probability distribution. In modeling such a process, we try to describe the characteristics of its randomness. The observed series can be regarded as one realization of a set of random variables, denoted {Xt, t ∈ T}, where T is the set of time indices.

Stationary and Nonstationary Processes
We want to know whether the underlying stochastic process that generated the time series is invariant with respect to time. If the characteristics of the stochastic process change over time, i.e., if the process is nonstationary, it will be difficult to represent the time series by a simple algebraic model. If the stochastic process is fixed in time, i.e., if it is stationary, then one can model the process via an equation with fixed coefficients that can be estimated from past data. The models described here represent stochastic processes that are assumed to be in equilibrium about a constant mean level; the probability of a given fluctuation in the process from that mean level is assumed to be the same at any point in time.
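To make the stationary/nonstationary distinction concrete, here is a small simulation sketch (illustrative only, not part of the original slides): an AR(1) process with |phi| < 1 is stationary, while setting phi = 1 yields a random walk, the classic nonstationary case whose variance grows over time.

```python
import random
import statistics

random.seed(1)

def simulate_ar1(phi, n, sigma=1.0):
    """Simulate X_t = phi * X_{t-1} + e_t with Gaussian noise e_t.
    For |phi| < 1 the process is stationary; phi = 1 gives a random walk."""
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + random.gauss(0.0, sigma))
    return x

ar1 = simulate_ar1(0.5, 2000)   # stationary: spread stays roughly constant
walk = simulate_ar1(1.0, 2000)  # nonstationary: typically drifts further and further

# Compare the dispersion of the first and second halves of each series;
# for the AR(1) they are similar, for the random walk the later half is usually larger.
print(statistics.stdev(ar1[:1000]), statistics.stdev(ar1[1000:]))
print(statistics.stdev(walk[:1000]), statistics.stdev(walk[1000:]))
```

For the stationary AR(1) above, the theoretical variance is sigma^2 / (1 - phi^2), so the sample standard deviation hovers near sqrt(1/0.75) ≈ 1.15 regardless of which segment is measured.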
Stationary Processes
Mathematically, a stochastic process is called stationary if its first and second moments are fixed and do not change over time. The first moment is the mean, E[Xt], and the second moment is the covariance between Xt and Xt+k. A covariance taken on the same random variable at different times is called the auto-covariance. The variance of the process, Var[Xt], is the auto-covariance at lag k = 0.
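The sample versions of these two moments can be computed directly from a series. The sketch below (a hypothetical illustration, not from the original slides) uses the common 1/T divisor for the sample auto-covariance; dividing gamma_k by gamma_0 gives the autocorrelation, so rho_0 is always 1 and gamma_0 is the sample variance.

```python
def sample_autocovariance(x, k):
    """Sample auto-covariance at lag k:
    gamma_k = (1/T) * sum_{t} (x_t - mean)(x_{t+k} - mean)."""
    T = len(x)
    m = sum(x) / T  # sample mean (the first moment)
    return sum((x[t] - m) * (x[t + k] - m) for t in range(T - k)) / T

def sample_autocorrelation(x, k):
    """rho_k = gamma_k / gamma_0; gamma_0 is the sample variance."""
    return sample_autocovariance(x, k) / sample_autocovariance(x, 0)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
print(sample_autocovariance(x, 0))   # variance of the series: 2.0
print(sample_autocorrelation(x, 1))  # rho_1 = 0.8 / 2.0 = 0.4
```

Plotting rho_k against k for k = 0, 1, 2, ... yields the autocorrelation function discussed in the next section.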