STAT 248: ARMA Models
Handout 5
GSI: Gido van de Ven
October 1st, 2010

1 Autoregressive Moving Average Models [ARMA(p,q)]

1.1 Introduction

Classical regression (i.e. regression with deterministic explanatory variables and/or other observed time series) is often insufficient for explaining all of the interesting dynamics of a time series. For example, the ACF of the residuals of a simple linear regression may reveal additional structure in the data that the regression did not capture. Introducing correlation as a phenomenon that may be generated through lagged linear relations leads to the autoregressive (AR) and autoregressive moving average (ARMA) models. Allowing for nonstationarity leads to the autoregressive integrated moving average (ARIMA) models.

Time series {X_t, t = 0, ±1, ±2, ...} in the ARMA(p,q) class are defined in terms of linear difference equations with constant coefficients. Let's review the concept of a "linear difference equation with constant coefficients":

1. The term "linear" means that each term of the sequence is defined as a linear function of the preceding terms.

2. The order of a linear recurrence relation is the number of preceding terms required by the definition.

3. The general form of a linear recurrence relation of order d is

   a_n = c_1 a_{n-1} + c_2 a_{n-2} + ... + c_d a_{n-d} + c.

4. If, for all i, c_i is independent of n, then the recurrence relation is said to have constant coefficients.

5. The linear recurrence, together with seed values (initial conditions) for a_0, ..., a_{d-1}, determines the sequence uniquely.

Why are ARMA(p,q) models so extremely important?

• For any autocovariance function γ(·) such that lim_{h→∞} γ(h) = 0, and for any integer k > 0, it is possible to find an ARMA process {X_t, t = 0, ±1, ±2, ...} with autocovariance function γ_X(·) such that γ_X(h) = γ(h) for h = 0, 1, ..., k.
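As a quick illustration of points 1–5 above, the following sketch iterates a general order-d linear recurrence with constant coefficients from its seed values. The function name and signature are my own for illustration; the Fibonacci numbers arise as the special case d = 2, c_1 = c_2 = 1, c = 0, with seeds a_0 = 0, a_1 = 1.

```python
def linear_recurrence(coeffs, const, seeds, n_terms):
    """Iterate a_n = c_1*a_{n-1} + ... + c_d*a_{n-d} + c.

    coeffs  -- [c_1, ..., c_d]; the order d is len(coeffs)
    const   -- the additive constant c
    seeds   -- initial conditions [a_0, ..., a_{d-1}]
    n_terms -- total number of terms to return
    """
    d = len(coeffs)
    assert len(seeds) == d, "need exactly d seed values"
    a = list(seeds)
    while len(a) < n_terms:
        # a[-i] is the term i steps back, matched with coefficient c_i
        nxt = const + sum(c * a[-i] for i, c in enumerate(coeffs, start=1))
        a.append(nxt)
    return a

# Fibonacci: order 2, c_1 = c_2 = 1, c = 0, seeds a_0 = 0, a_1 = 1
print(linear_recurrence([1, 1], 0, [0, 1], 10))
# → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

The seed values determine the whole sequence uniquely, exactly as point 5 states: changing the seeds to [2, 1] with the same coefficients produces the Lucas numbers instead.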
• The linear structure of ARMA processes leads to a very simple theory of linear prediction.

Definition [ARMA process]: The process {X_t, t = 0, ±1, ±2, ...} is said to be an ARMA(p,q) process if {X_t} is stationary and if, for every t,

   X_t - φ_1 X_{t-1} - ... - φ_p X_{t-p} = Z_t + θ_1 Z_{t-1} + ... + θ_q Z_{t-q},

where {Z_t} is white noise. These difference equations can be written symbolically in the more compact form

   φ(B) X_t = θ(B) Z_t,   t = 0, ±1, ±2, ...,

where φ and θ are the p-th and q-th degree polynomials

   φ(z) = 1 - φ_1 z - ... - φ_p z^p,
   θ(z) = 1 + θ_1 z + ... + θ_q z^q,

and B is the backward shift operator defined by B^j X_t = X_{t-j} for j = 0, ±1, ±2, .... The polynomials φ(·) and θ(·) will be referred to as the autoregressive and moving average polynomials of the difference equations.
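The defining difference equation can be iterated directly to simulate a sample path. The sketch below (stdlib only; function name and burn-in scheme are my own assumptions, not part of the handout) moves the AR terms to the right-hand side and recurses, discarding a burn-in so the arbitrary zero initial conditions wash out and the retained path is approximately stationary.

```python
import random

def simulate_arma(phi, theta, n, sigma=1.0, burn_in=200, seed=0):
    """Simulate n values of an ARMA(p,q) process from the recursion

        X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p}
              + Z_t + theta_1 Z_{t-1} + ... + theta_q Z_{t-q},

    with {Z_t} i.i.d. N(0, sigma^2) white noise.  The first `burn_in`
    values are discarded so the start-up transient dies out."""
    rng = random.Random(seed)
    p, q = len(phi), len(theta)
    x = [0.0] * max(p, 1)   # zero initial conditions for X
    z = [0.0] * max(q, 1)   # zero initial conditions for Z
    out = []
    for t in range(burn_in + n):
        zt = rng.gauss(0.0, sigma)
        xt = zt
        xt += sum(phi[i] * x[-(i + 1)] for i in range(p))     # AR part
        xt += sum(theta[j] * z[-(j + 1)] for j in range(q))   # MA part
        x.append(xt)
        z.append(zt)
        if t >= burn_in:
            out.append(xt)
    return out

# Example: ARMA(1,1) with phi_1 = 0.5, theta_1 = 0.4
path = simulate_arma([0.5], [0.4], n=500)
```

Note the sign convention: the polynomial φ(z) = 1 - φ_1 z - ... carries minus signs, so once the AR terms are moved to the right-hand side of the difference equation they enter the recursion with plus signs, as coded above.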