State Space Models (STAT 510 - Applied Time Series Analysis)

Source: http://onlinecourses.science.psu.edu/stat510/node/27 (Google cache snapshot of 7 Aug 2010). Department of Statistics, Eberly College of Science. Section 2: Time Domain Models. Submitted by gfj100 on Sun, 03/28/2010 - 15:43.

Regression in Time Series with Dependent Errors

We now turn to regression in time series: regression with dependent errors. Our basic time series model is

x_t = β z_t + y_t

where

z_t - the exogenous variable, information that is given (the independent variable in regression);
x_t - the observation (the dependent variable in regression);
y_t - assumed to be a zero-mean, stationary (Gaussian) time series.

In particular, we want to estimate β. For now, we do not really care what is going on with y_t. If the errors are not iid and we plug the data into R's lm function, it will still return a β. However, that does not mean the estimate is any good: R just gave us something, which might be wrong. Certainly things like the p-values and standard errors cannot be trusted if the errors are correlated. If the dependence is not outrageously strong, the estimate of β is probably decent; the standard errors, however, could be bad, and prediction intervals could be bad - we cannot trust anything that depends on the error distribution. Geometrically, least squares should still give a reasonable estimate of β, but that does not mean the other aspects of the fit are good. We would like to be in a situation where everything is correct, i.e., where we properly adjust for the dependence.

The formula above is the univariate case. In general we can have more than one covariate, and then the model looks more like the multivariate form

x_t = β' z_t + y_t
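The claim above - that least squares still gives a decent β when the errors are correlated, while iid-based inference fails - can be sketched numerically. The following is a minimal Python illustration, not from the original notes; the names, the AR(1) coefficient 0.7, and the sample size are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
beta_true = 2.0
phi = 0.7  # assumed AR(1) coefficient for the dependent errors

# Exogenous variable z_t and dependent errors y_t = phi * y_{t-1} + w_t
z = rng.normal(size=n)
w = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + w[t]

# Observations x_t = beta * z_t + y_t
x = beta_true * z + y

# Ordinary least squares (what R's lm would compute), one covariate,
# no intercept: beta_hat = (z'x) / (z'z)
beta_hat = (z @ x) / (z @ z)
```

The point estimate lands near β = 2 even though y_t is far from iid; what breaks down is the usual standard-error formula, which assumes independent errors.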
where z_t and β are vectors of exogenous variables and parameters:

z_t = (z_{t1}, z_{t2}, ..., z_{tq})'   and   β = (β_1, β_2, ..., β_q)'.

One thing we could do is simply model dependence on time, as above. We might also look at something such as unemployment (x_t) and GDP (z_t).

For our regression, we want y_t to be not a general stationary process but white noise. Assume for a minute that y_t is an AR(2), and also assume that you know the AR coefficients φ_1 and φ_2:

y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + w_t.

How can we convert y_t into white noise? Can we multiply our observations y_1, y_2, y_3, ... by something? What if we multiply them by φ(B) = 1 - φ_1 B - φ_2 B^2? We get the following transformation:

φ(B) y_t = y_t - φ_1 y_{t-1} - φ_2 y_{t-2},

which is equal to w_t. This is how we can 'whiten' our time series.

Now, let's go back to our regression. We want to make this transform so that the error is no longer y_t but w_t; if it is w_t, we are in a regular regression situation with iid errors. We can multiply everything by φ(B):

φ(B) x_t = β' φ(B) z_t + φ(B) y_t.

Now, φ(B) y_t is w_t, correct?
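The whitening step can be sketched in the same spirit. This is a minimal Python illustration, assuming known AR(2) coefficients φ_1 = 0.5 and φ_2 = 0.3 (all names and values are illustrative, not from the notes): applying φ(B) to both x_t and z_t turns the regression errors into the white noise w_t, after which ordinary least squares is appropriate.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3000
beta_true = 1.5
phi1, phi2 = 0.5, 0.3  # assumed known AR(2) coefficients

# AR(2) errors: y_t = phi1 * y_{t-1} + phi2 * y_{t-2} + w_t, w_t white noise
w = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + w[t]

z = rng.normal(size=n)
x = beta_true * z + y  # regression with dependent errors

def whiten(u):
    """Apply phi(B): u_t - phi1*u_{t-1} - phi2*u_{t-2}, for t = 2, ..., n-1."""
    return u[2:] - phi1 * u[1:-1] - phi2 * u[:-2]

# phi(B) x_t = beta * phi(B) z_t + w_t  -> ordinary regression, iid errors
xs, zs = whiten(x), whiten(z)
beta_hat = (zs @ xs) / (zs @ zs)
```

Here `whiten(y)` reproduces w_t exactly (for t >= 2), which is the sense in which φ(B) 'whitens' the series; regressing the whitened x on the whitened z then recovers β with trustworthy iid-based inference.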
