So only the first autocorrelation coefficient is significant. Q = 5.09 and Q* = 5.26, compared with a tabulated χ²(5) = 11.1 at the 5% level, so the five coefficients are jointly insignificant.

6 Moving Average Processes

• Let ut (t = 1, 2, 3, ...) be a sequence of independently and identically distributed (iid) random variables with E(ut) = 0 and Var(ut) = σ². Then
  yt = μ + ut + θ1ut-1 + θ2ut-2 + ... + θqut-q
is a qth-order moving average model, MA(q).
• Its properties are:
  E(yt) = μ
  Var(yt) = γ0 = (1 + θ1² + θ2² + ... + θq²)σ²
  Covariances: γs = (θs + θs+1θ1 + θs+2θ2 + ... + θqθq-s)σ² for s = 1, 2, ..., q, and γs = 0 for s > q.

7 Example of an MA Problem

1. Consider the following MA(2) process:
  Xt = ut + θ1ut-1 + θ2ut-2
where ut is a zero-mean white noise process with variance σ².
(i) Calculate the mean and variance of Xt.
(ii) Derive the autocorrelation function for this process (i.e. express the autocorrelations τ1, τ2, ... as functions of the parameters θ1 and θ2).
(iii) If θ1 = -0.5 and θ2 = 0.25, sketch the ACF of Xt.

8 Solution

(i) If E(ut) = 0, then E(ut-i) = 0 ∀ i.
So E(Xt) = E(ut + θ1ut-1 + θ2ut-2) = E(ut) + θ1E(ut-1) + θ2E(ut-2) = 0.
Var(Xt) = E[Xt - E(Xt)][Xt - E(Xt)], but E(Xt) = 0, so
Var(Xt) = E[(Xt)(Xt)]
        = E[(ut + θ1ut-1 + θ2ut-2)(ut + θ1ut-1 + θ2ut-2)]
        = E[ut² + θ1²ut-1² + θ2²ut-2² + cross-products]
But E[cross-products] = 0, since Cov(ut, ut-s) = 0 for s ≠ 0.

9 Solution (cont'd)

So Var(Xt) = γ0 = E[ut² + θ1²ut-1² + θ2²ut-2²]
           = σ² + θ1²σ² + θ2²σ²
           = (1 + θ1² + θ2²)σ²

(ii) The ACF of Xt:
γ1 = E[Xt - E(Xt)][Xt-1 - E(Xt-1)]
   = E[Xt][Xt-1]
   = E[(ut + θ1ut-1 + θ2ut-2)(ut-1 + θ1ut-2 + θ2ut-3)]
   = E[θ1ut-1² + θ1θ2ut-2²]
   = θ1σ² + θ1θ2σ²
   = (θ1 + θ1θ2)σ²

10 Solution (cont'd)

γ2 = E[Xt - E(Xt)][Xt-2 - E(Xt-2)]
   = E[Xt][Xt-2]
   = E[(ut + θ1ut-1 + θ2ut-2)(ut-2 + θ1ut-3 + θ2ut-4)]
   = E[θ2ut-2²]
   = θ2σ²
γ3 = E[Xt - E(Xt)][Xt-3 - E(Xt-3)]
   = E[Xt][Xt-3]
   = E[(ut + θ1ut-1 + θ2ut-2)(ut-3 + θ1ut-4 + θ2ut-5)]
   = 0
So γs = 0 for s > 2.

11 Solution (cont'd)

We have the autocovariances; now calculate the autocorrelations:
  τ0 = γ0/γ0 = 1
  τ1 = γ1/γ0 = (θ1 + θ1θ2)/(1 + θ1² + θ2²)
  τ2 = γ2/γ0 = θ2/(1 + θ1² + θ2²)
  τs = 0 for s > 2.
(iii) For θ1 = -0.5 and θ2 = 0.25, substituting these into the formulae above gives τ1 = -0.476 and τ2 = 0.190.
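As a cross-check on the derivation above, the closed-form autocorrelations can be computed directly and compared with the sample ACF of a simulated path. This is an illustrative Python sketch (the function name, seed, and unit-variance Gaussian noise are my own assumptions, not part of the slides):

```python
import random

def ma2_acf(theta1, theta2):
    """Theoretical ACF of X_t = u_t + theta1*u_{t-1} + theta2*u_{t-2},
    using gamma_0 = (1 + theta1^2 + theta2^2) * sigma^2."""
    denom = 1 + theta1 ** 2 + theta2 ** 2
    tau1 = (theta1 + theta1 * theta2) / denom
    tau2 = theta2 / denom
    return tau1, tau2

tau1, tau2 = ma2_acf(-0.5, 0.25)
print(round(tau1, 3), round(tau2, 3))  # -0.476 0.19

# Simulate a long MA(2) path (hypothetical sigma = 1) and estimate the ACF.
random.seed(1)
n = 200_000
u = [random.gauss(0.0, 1.0) for _ in range(n + 2)]
x = [u[t] - 0.5 * u[t - 1] + 0.25 * u[t - 2] for t in range(2, n + 2)]
mean = sum(x) / n
gamma0 = sum((v - mean) ** 2 for v in x) / n
sample_acf = []
for s in (1, 2, 3):
    gamma_s = sum((x[t] - mean) * (x[t - s] - mean) for t in range(s, n)) / n
    sample_acf.append(gamma_s / gamma0)
```

The sample values at lags 1 and 2 should land near -0.476 and 0.190, with the lag-3 value close to zero, matching γs = 0 for s > 2.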
12 ACF Plot

Thus the ACF plot will appear as follows: [bar plot of the ACF: τ1 = -0.476 at lag 1, τ2 = 0.190 at lag 2, zero at all higher lags]

13 Autoregressive Processes

• An autoregressive model of order p, an AR(p), can be expressed as
  yt = μ + φ1yt-1 + φ2yt-2 + ... + φpyt-p + ut
• Or using the lag operator notation: Lyt = yt-1 and L^i yt = yt-i, so
  yt = μ + φ1Lyt + φ2L²yt + ... + φpL^p yt + ut
• or φ(L)yt = μ + ut, where φ(L) = 1 - φ1L - φ2L² - ... - φpL^p.

14 The Stationary Condition for an AR Model

• The condition for stationarity of a general AR(p) model is that the roots of
  1 - φ1z - φ2z² - ... - φpz^p = 0
all lie outside the unit circle.
• A stationary AR(p) model is required for it to have an MA(∞) representation.
• Example 1: Is yt = yt-1 + ut stationary? The characteristic root is 1, so it is a unit root process (and hence non-stationary).
• Example 2: Is yt = 3yt-1 - 2.75yt-2 + 0.75yt-3 + ut stationary? The characteristic roots are 1, 2/3, and 2. Since only one of these lies outside the unit circle, the process is non-stationary.

15 Wold's Decomposition Theorem

• States that any stationary series can be decomposed into the sum of two unrelated processes, a purely deterministic part and a purely stochastic part, which will be an MA(∞).
• For the AR(p) model φ(L)yt = ut, ignoring the intercept, the Wold decomposition is
  yt = ψ(L)ut
where ψ(L) = φ(L)⁻¹ = (1 - φ1L - φ2L² - ... - φpL^p)⁻¹.

16 The Moments of an Autoregressive Process

• The moments of an autoregressive process are as follows:
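The unit-circle condition on slide 14 can be verified numerically by solving the characteristic equation. Below is a minimal Python sketch (the helper name is my own, and NumPy is assumed available); note that the roots 1, 2/3, and 2 stated for Example 2 correspond to coefficients φ1 = 3, φ2 = -2.75, φ3 = 0.75:

```python
import numpy as np

def ar_is_stationary(phis):
    """Roots of 1 - phi1*z - ... - phip*z^p; stationary iff all |z| > 1."""
    # np.roots expects coefficients ordered from the highest power of z down.
    coeffs = [-phi for phi in reversed(phis)] + [1.0]
    roots = np.roots(coeffs)
    return roots, all(abs(r) > 1.0 for r in roots)

# Example 1: y_t = y_{t-1} + u_t  -> single characteristic root z = 1.
roots1, stat1 = ar_is_stationary([1.0])
print(stat1)  # False (unit root)

# Example 2: y_t = 3y_{t-1} - 2.75y_{t-2} + 0.75y_{t-3} + u_t
roots2, stat2 = ar_is_stationary([3.0, -2.75, 0.75])
print(sorted(abs(r) for r in roots2), stat2)
```

Only the root at 2 lies outside the unit circle; the roots at 1 and 2/3 do not, so the process is non-stationary, consistent with the slide.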
This note was uploaded on 09/20/2013 for the course FINA 5170 taught by Professor Janebargers during the Summer '13 term at Greenwich School of Management.