ECON217_HW_ARMA Suggested Solutions

1. If a time series {X_t} is covariance stationary, what do we know about E(X_t) and Cov(X_t, X_{t-k}) for t = 1, ..., T and k = 0, 1, 2, ...?

Ans. E(X_t) denotes the mean of X_t. If {X_t} is covariance stationary, E(X_t) is time invariant; i.e., E(X_t) equals a constant, say μ, for all t. Cov(X_t, X_{t-k}) is the covariance between X_t and X_{t-k}. For k = 0, it is the variance of X_t (i.e., Cov(X_t, X_t)). If {X_t} is covariance stationary, Cov(X_t, X_{t-k}) is time invariant: it depends on the distance k between X_t and X_{t-k}, but not on the time t at which it is measured. The notation γ(k) is usually used for Cov(X_t, X_{t-k}); it highlights the dependence on k.

2. If {X_t} is a white noise process, what do we know about E(X_t) and Cov(X_t, X_{t-k}) for t = 1, ..., T and k = 0, 1, 2, ...?

Ans. Usually, "white noise process" refers to a zero-mean white noise process; that is, E(X_t) = 0 for all t. For k = 0, Cov(X_t, X_{t-k}) = σ², a standard notation for the variance of a random variable. For k ≠ 0, Cov(X_t, X_{t-k}) = 0. That is, a white noise series is a sequence of zero-mean, constant-variance, and uncorrelated random variables.

3. Define and compare the autocorrelation function and the partial autocorrelation function of a stationary time series.

Ans. For a time series {X_t}, the autocorrelation function ρ(k) is defined as ρ(k) = γ(k)/γ(0), where γ(k) is Cov(X_t, X_{t-k}). It can be shown that ρ(k) = ρ(−k), |ρ(k)| ≤ 1, and ρ(0) = 1. The partial autocorrelation function φ_kk is defined by the following regression equation: X_t = φ_1k X_{t-1} + ... + φ_kk X_{t-k} + e_t. Note that, depending on the true stochastic properties of X_t, e_t is not necessarily a white noise process. Both the autocorrelation function and the partial autocorrelation function measure the association between the variables in a time series. In contrast to ρ(k), φ_kk eliminates the effects of the intervening values X_{t-1} through X_{t-k+1}.
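The sample versions of ρ(k) and φ_kk can be computed directly from these definitions. Below is a minimal numpy sketch; the function names `acf`/`pacf`, the random seed, and the simulated white noise series are illustrative choices, not part of the assignment:

```python
import numpy as np

def acf(x, k):
    """Sample autocorrelation rho(k) = gamma(k) / gamma(0)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    gamma0 = np.dot(x, x) / n
    gammak = np.dot(x[k:], x[: n - k]) / n
    return gammak / gamma0

def pacf(x, k):
    """phi_kk: the last OLS coefficient from regressing X_t on its first k lags,
    matching the defining regression X_t = phi_1k X_{t-1} + ... + phi_kk X_{t-k} + e_t."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    y = x[k:]                                     # X_t, for t = k, ..., n-1
    X = np.column_stack([x[k - j : n - j] for j in range(1, k + 1)])  # lags 1..k
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]

# For white noise: rho(0) = 1, while rho(k) and phi_kk are near 0 for k >= 1.
rng = np.random.default_rng(0)
wn = rng.standard_normal(2000)
print(acf(wn, 0), acf(wn, 1), pacf(wn, 1))
```

With a sample this size, the lag-1 sample ACF and PACF of white noise should be within roughly ±2/√n of zero.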
4. Suppose Y_t follows Y_t = φY_{t-1} + ε_t, ε_t ~ WN(0, σ²).
a. State the assumption(s) on φ that will make {Y_t} stationary.
b. Assuming {Y_t} is stationary, find the autocorrelation function and the partial autocorrelation function.

Ans. Stationarity condition: |φ| < 1 (and, given the finite start below, t large). Note that Y_t = φY_{t-1} + ε_t with Y_0 = 0, so by back-substitution
Y_t = φ^t Y_0 + φ^(t-1) ε_1 + φ^(t-2) ε_2 + ... + φ ε_(t-1) + ε_t.
Hence E(Y_t) = 0 and V(Y_t) = (φ^(2(t-1)) + φ^(2(t-2)) + ... + φ² + 1)σ² = σ²(1 − φ^(2t))/(1 − φ²). If Y_t has an infinite history (i.e., we do not impose the initial condition Y_0 = 0), then V(Y_t) = σ²/(1 − φ²).
Autocovariance: γ(1) = E(Y_t Y_{t-1}), since E(Y_t) = 0; this equals E(φY²_{t-1} + Y_{t-1}ε_t) = φE(Y²_{t-1}) = φγ(0). Iterating gives γ(k) = φ^k γ(0), so the autocorrelation function is ρ(k) = φ^k. Because Y_t depends on its past only through Y_{t-1}, the partial autocorrelation function cuts off after lag 1: φ_11 = ρ(1) = φ and φ_kk = 0 for k ≥ 2.
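The AR(1) results above (γ(1) = φγ(0), hence ρ(k) = φ^k, the limiting variance σ²/(1 − φ²), and a PACF that cuts off after lag 1) can be checked by simulation. A sketch follows; the values φ = 0.7, σ = 1, the seed, the burn-in length, and the `acf`/`pacf` helpers are illustrative choices, not from the solutions:

```python
import numpy as np

phi, sigma = 0.7, 1.0
n = 20000
rng = np.random.default_rng(1)
eps = rng.normal(0.0, sigma, size=n)

# Simulate Y_t = phi * Y_{t-1} + eps_t, started at Y_0 = 0.
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]
y = y[500:]  # drop a burn-in so the Y_0 = 0 start no longer matters

def acf(x, k):
    """Sample autocorrelation rho(k) = gamma(k) / gamma(0)."""
    x = x - x.mean()
    m = len(x)
    return np.dot(x[k:], x[: m - k]) / np.dot(x, x)

def pacf(x, k):
    """phi_kk: the last OLS coefficient from regressing X_t on its first k lags."""
    x = x - x.mean()
    m = len(x)
    X = np.column_stack([x[k - j : m - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

# Expect: var near sigma^2/(1 - phi^2) ~= 1.96, acf(1) ~= 0.7, acf(2) ~= 0.49,
# pacf(1) ~= 0.7, pacf(2) ~= 0.
print(y.var(), acf(y, 1), acf(y, 2), pacf(y, 1), pacf(y, 2))
```

With ~20,000 observations the sample moments should sit within a few hundredths of the theoretical values, illustrating both ρ(k) = φ^k and the lag-1 PACF cutoff.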
 Winter '09
 Fairlie
