TimeSeriesBook.pdf

6 Additional results and more general exponential smoothing methods can be found in Abraham and Ledolter (1983) and Mertens and Rässler (2005).
7 This happens, for example, when many, perhaps thousands of, time series have to be forecast in a real-time situation.

CHAPTER 3. FORECASTING STATIONARY PROCESSES

3.4 Exercises

Exercise 3.4.1. Compute the linear least-squares forecasting function P_T X_{T+h}, T > 2, and the mean squared error v_T(h), h = 1, 2, 3, if {X_t} is given by the AR(2) process

    X_t = 1.3 X_{t-1} - 0.4 X_{t-2} + Z_t    with Z_t ~ WN(0, 2).

To which values do P_T X_{T+h} and v_T(h) converge as h goes to infinity?

Exercise 3.4.2. Compute the linear least-squares forecasting function P_T X_{T+1} and the mean squared error v_T(1), T = 0, 1, 2, 3, if {X_t} is given by the MA(1) process

    X_t = Z_t + 0.8 Z_{t-1}    with Z_t ~ WN(0, 2).

To which values do P_T X_{T+h} and v_T(h) converge as h goes to infinity?

Exercise 3.4.3. Suppose that you observe {X_t} for the two periods t = 1 and t = 3, but not for t = 2.

(i) Compute the linear least-squares forecast for X_2 if X_t = φ X_{t-1} + Z_t with |φ| < 1 and Z_t ~ WN(0, 4). Compute the mean squared error for this forecast.

(ii) Assume now that {X_t} is the MA(1) process X_t = Z_t + θ Z_{t-1} with Z_t ~ WN(0, 4). Compute the mean squared error for the forecast of X_2.
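As a numerical sketch related to Exercise 3.4.1 (not a full solution, and the conditioning values X_T = 1, X_{T-1} = 0.5 are illustrative assumptions), the multi-step forecast of a causal AR(2) can be iterated directly from the AR recursion with the noise set to zero, while the mean squared error follows from the MA(∞) weights ψ_j via v_T(h) = σ² Σ_{j=0}^{h-1} ψ_j²:

```python
import numpy as np

# AR(2) from Exercise 3.4.1: X_t = 1.3 X_{t-1} - 0.4 X_{t-2} + Z_t, Z_t ~ WN(0, 2)
phi1, phi2, sigma2 = 1.3, -0.4, 2.0

# MA(infinity) weights: psi_0 = 1, psi_1 = phi1, psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}
H = 50
psi = np.zeros(H)
psi[0], psi[1] = 1.0, phi1
for j in range(2, H):
    psi[j] = phi1 * psi[j - 1] + phi2 * psi[j - 2]

# Forecasts, conditioning on (illustrative) last observations X_T = 1, X_{T-1} = 0.5:
# the h-step forecast obeys the same difference equation with Z replaced by 0.
xT, xTm1 = 1.0, 0.5
fc = [phi1 * xT + phi2 * xTm1]          # h = 1
fc.append(phi1 * fc[0] + phi2 * xT)     # h = 2
for h in range(2, H):
    fc.append(phi1 * fc[h - 1] + phi2 * fc[h - 2])

# Mean squared errors: v_T(h) = sigma2 * sum_{j=0}^{h-1} psi_j^2
v = sigma2 * np.cumsum(psi ** 2)

print(fc[0], fc[1], fc[2])   # one-, two-, three-step forecasts: 1.1, 1.03, 0.899
print(v[0], v[1], v[2])      # v_T(1) = 2, v_T(2) = 5.38, v_T(3) ≈ 8.708
```

Since both roots of 1 - 1.3z + 0.4z² lie outside the unit circle (z = 1.25 and z = 2), the forecast decays to the process mean 0 and v_T(h) converges to Var(X_t) as h grows, which is the behavior the exercise asks about.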
3.5 The Partial Autocorrelation Function

Consider again the problem of forecasting X_{T+1} from observations X_T, X_{T-1}, ..., X_2, X_1. Denoting, as before, the best linear predictor by

    P_T X_{T+1} = a_1 X_T + a_2 X_{T-1} + ... + a_{T-1} X_2 + a_T X_1,

we can express X_{T+1} as

    X_{T+1} = P_T X_{T+1} + Z_{T+1} = a_1 X_T + a_2 X_{T-1} + ... + a_{T-1} X_2 + a_T X_1 + Z_{T+1},

where Z_{T+1} denotes the forecast error, which is uncorrelated with X_T, ..., X_1. We can now ask whether X_1 contributes to the forecast of X_{T+1} after controlling for X_T, X_{T-1}, ..., X_2 or, equivalently, whether a_T is equal to zero. Thus, a_T can be viewed as a measure of the importance of the additional information provided by X_1. It is referred to as the partial autocorrelation.

In the case of an AR(p) process, all the information useful for forecasting X_{T+1}, T > p, is incorporated in the last p observations, so that a_T = 0. In the case of an MA process, the observations X_T, ..., X_1 can be used to retrieve the unobserved Z_T, Z_{T-1}, ..., Z_{T-q+1}. As Z_t is an infinite weighted sum of past X_t's, every new observation contributes to the recovery of the Z_t's. Thus, the partial autocorrelation a_T is not zero. Taking T successively equal to 1, 2, 3, etc., we get the partial autocorrelation function (PACF).

We can, however, interpret the above equation as a regression equation. By the Frisch–Waugh–Lovell theorem (see Davidson and MacKinnon, 1993), we can obtain a_T by a two-stage procedure. In a first stage, project (regress) X_{T+1} on X_T, ..., X_2 and take the residual. Similarly, project (regress) X_1 on X_T, ..., X_2 and take the residual. The coefficient a_T is then obtained by projecting (regressing) the first residual on the second. Stationarity implies that this is nothing but the correlation coefficient between the two residuals.
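The two-stage procedure above can be checked numerically. The sketch below (the helper name pacf_fwl and the AR(1) simulation parameters φ = 0.7, n = 20000 are my own choices, not the book's) regresses both X_{T+1} and X_1 on the intermediate observations and correlates the residuals; for an AR(1) the resulting PACF should be close to φ at lag 1 and close to zero at higher lags, as the text predicts for AR(p) processes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a long AR(1): X_t = 0.7 X_{t-1} + Z_t (illustrative parameters)
phi, n = 0.7, 20000
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t]

def pacf_fwl(x, lag):
    """Partial autocorrelation at `lag` via the two-regression (FWL) procedure:
    regress x_t and x_{t-lag} on the intermediate values x_{t-1}, ..., x_{t-lag+1},
    then correlate the two residual series."""
    n = len(x)
    y_future = x[lag:]        # plays the role of X_{T+1}
    y_past = x[:n - lag]      # plays the role of X_1
    if lag == 1:
        # no intermediate observations to control for
        r1, r2 = y_future - y_future.mean(), y_past - y_past.mean()
    else:
        # regressor columns: x_{t-1}, ..., x_{t-lag+1}, plus a constant
        Z = np.column_stack([x[lag - k:n - k] for k in range(1, lag)])
        Z = np.column_stack([np.ones(len(Z)), Z])
        b1, *_ = np.linalg.lstsq(Z, y_future, rcond=None)
        b2, *_ = np.linalg.lstsq(Z, y_past, rcond=None)
        r1, r2 = y_future - Z @ b1, y_past - Z @ b2
    return np.dot(r1, r2) / np.sqrt(np.dot(r1, r1) * np.dot(r2, r2))

print([round(pacf_fwl(x, k), 3) for k in (1, 2, 3)])
# For an AR(1), the PACF is approximately (phi, 0, 0, ...)
```

The final line makes the stationarity remark concrete: instead of running the second-stage regression, it computes the correlation coefficient of the two residual series directly, which coincides with a_T under stationarity.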
