Notes 4: Models for Stationary Time Series: General Linear Processes

A general linear process $\{Z_t\}$ is one that can be represented as a weighted linear combination of the present and past terms of a white noise process:

$$Z_t = a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \cdots = \psi(B)\, a_t, \qquad (1)$$

where $\psi(B) = 1 + \psi_1 B + \psi_2 B^2 + \cdots$ and $B$ is the backward shift operator defined by $B Z_t = Z_{t-1}$ and $B^k Z_t = Z_{t-k}$.

If the RHS of (1) is truly an infinite series, then certain restrictions must be placed on the $\psi$'s for the RHS to be mathematically meaningful. For our purposes, it suffices to assume that $\sum_{i=1}^{\infty} \psi_i^2 < \infty$, since $\mathrm{Var}(Z_t) = \left(\sum_{i=0}^{\infty} \psi_i^2\right) \sigma_a^2$. We should note that since $\{a_t\}$ is unobservable, there is no loss of generality in assuming the coefficient of $a_t$ is 1 (we put $\psi_0 = 1$). Then

$$E(Z_t) = 0, \qquad \gamma_k = \sigma_a^2 \sum_{i=0}^{\infty} \psi_i \psi_{i+k}, \quad k \ge 0, \quad \psi_0 = 1.$$

A process with a nonzero mean $\mu$ may be obtained by adding $\mu$ to the RHS of (1). Since the mean does not affect the covariance (or correlation) structure of a process, we shall assume a zero mean until we begin fitting the models to the data.

Moving Average (MA) Process

In the case where only a finite number of the $\psi$'s are nonzero, we have what is called a moving average process. In this case, we write

$$Z_t = a_t - \theta_1 a_{t-1} - \theta_2 a_{t-2} - \cdots - \theta_q a_{t-q}.$$

We call such a series a moving average of order $q$ [notation: MA($q$)]. Here $\psi_0 = 1$, $\psi_1 = -\theta_1, \ldots, \psi_q = -\theta_q$, with $\theta_j = 0$ for all $j > q$.
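To make the $\psi$-weight formulas concrete, here is a minimal numerical sketch in Python (not part of the original notes; the $\psi$ values, series length, and seed are assumptions chosen purely for illustration). It simulates a process with a finite set of $\psi$ weights and compares the sample autocovariances with the theoretical values $\gamma_k = \sigma_a^2 \sum_i \psi_i \psi_{i+k}$.

```python
import numpy as np

rng = np.random.default_rng(0)      # assumed seed, for reproducibility
sigma_a = 1.0                       # white-noise standard deviation (assumed)
psi = np.array([1.0, -0.6, 0.3])    # psi_0 = 1, psi_1, psi_2 (assumed illustrative values)
n = 200_000                         # long series so sample moments are close to theory

# Z_t = psi_0 a_t + psi_1 a_{t-1} + psi_2 a_{t-2}
a = rng.normal(0.0, sigma_a, size=n + len(psi) - 1)
Z = np.convolve(a, psi, mode="valid")

def gamma_theory(k):
    """gamma_k = sigma_a^2 * sum_i psi_i * psi_{i+k}; zero once k exceeds the last nonzero psi."""
    if k >= len(psi):
        return 0.0
    return sigma_a**2 * float(np.dot(psi[: len(psi) - k], psi[k:]))

def gamma_sample(z, k):
    """Sample autocovariance of z at lag k."""
    zc = z - z.mean()
    return float(np.dot(zc[: len(zc) - k], zc[k:]) / len(zc))

for k in range(4):
    print(k, round(gamma_theory(k), 4), round(gamma_sample(Z, k), 4))
```

With these weights the series is the MA(2) case with $\theta_1 = 0.6$ and $\theta_2 = -0.3$, so the printed theoretical values can also be checked against the MA(2) formulas in the examples below.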
Example: MA(1). $Z_t = a_t - \theta a_{t-1}$.

$$\gamma_0 = \mathrm{Var}(Z_t) = \sigma_a^2 (1 + \theta^2), \qquad \gamma_1 = \mathrm{Cov}(Z_t, Z_{t-1}) = -\theta \sigma_a^2, \qquad \gamma_k = 0, \quad k \ge 2,$$

$$\rho_1 = \frac{-\theta}{1 + \theta^2}, \qquad \rho_k = 0, \quad k \ge 2.$$

Example: MA(2). $Z_t = a_t - \theta_1 a_{t-1} - \theta_2 a_{t-2}$.

$$\gamma_0 = \mathrm{Var}(Z_t) = \sigma_a^2 (1 + \theta_1^2 + \theta_2^2),$$

$$\gamma_1 = \mathrm{Cov}(Z_t, Z_{t-1}) = \mathrm{Cov}(a_t - \theta_1 a_{t-1} - \theta_2 a_{t-2},\; a_{t-1} - \theta_1 a_{t-2} - \theta_2 a_{t-3}) = (-\theta_1 + \theta_1 \theta_2)\, \sigma_a^2,$$

$$\gamma_2 = \mathrm{Cov}(Z_t, Z_{t-2}) = \mathrm{Cov}(a_t - \theta_1 a_{t-1} - \theta_2 a_{t-2},\; a_{t-2} - \theta_1 a_{t-3} - \theta_2 a_{t-4}) = -\theta_2 \sigma_a^2,$$

$$\gamma_k = 0, \quad k \ge 3,$$

$$\rho_1 = \frac{-\theta_1 + \theta_1 \theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho_2 = \frac{-\theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho_k = 0, \quad k \ge 3.$$

For MA($q$),

$$\rho_k = \frac{-\theta_k + \theta_1 \theta_{k+1} + \theta_2 \theta_{k+2} + \cdots + \theta_{q-k} \theta_q}{1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2}, \quad k = 1, 2, \ldots, q; \qquad \rho_k = 0, \quad k \ge q + 1.$$

Process Variance

$$\gamma_0 = (1 + \theta_1^2 + \cdots + \theta_q^2)\, \sigma_a^2.$$

Autoregressive (AR) Process

A $p$th-order autoregressive process $\{Z_t\}$ satisfies the equation

$$Z_t = \phi_1 Z_{t-1} + \phi_2 Z_{t-2} + \cdots + \phi_p Z_{t-p} + a_t.$$

Notation: AR($p$).

Example: AR(1).
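With $p = 1$ the defining equation becomes $Z_t = \phi Z_{t-1} + a_t$, with $|\phi| < 1$ required for stationarity. As a minimal simulation sketch (the value of $\phi$, the series length, and the seed are assumptions for illustration, not taken from the notes), the following Python snippet generates an AR(1) series and compares its sample autocorrelations with the standard AR(1) result $\rho_k = \phi^k$:

```python
import numpy as np

rng = np.random.default_rng(0)   # assumed seed, for reproducibility
phi = 0.7                        # assumed AR(1) coefficient, |phi| < 1 for stationarity
sigma_a = 1.0                    # white-noise standard deviation (assumed)
n = 200_000                      # long series so sample moments are close to theory
burn_in = 500                    # discard early values so the start-up transient dies out

a = rng.normal(0.0, sigma_a, size=n + burn_in)
Z = np.zeros(n + burn_in)
for t in range(1, n + burn_in):
    # Z_t = phi * Z_{t-1} + a_t
    Z[t] = phi * Z[t - 1] + a[t]
Z = Z[burn_in:]

def rho_sample(z, k):
    """Sample autocorrelation of z at lag k."""
    zc = z - z.mean()
    return float(np.dot(zc[: len(zc) - k], zc[k:]) / np.dot(zc, zc))

for k in range(1, 5):
    # standard AR(1) result: rho_k = phi**k
    print(k, round(phi**k, 4), round(rho_sample(Z, k), 4))
```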