

Chapter 5. Estimation of ARMA Models

5.2 Ordinary least-squares (OLS) estimation of an AR(p) model

An alternative approach is to view the AR model as a regression model for $X_t$ with regressors $X_{t-1}, \ldots, X_{t-p}$ and error term $Z_t$:
$$X_t = \phi_1 X_{t-1} + \ldots + \phi_p X_{t-p} + Z_t, \qquad Z_t \sim \mathrm{WN}(0, \sigma^2).$$
Given observations for $X_1, \ldots, X_T$, the regression model can be written compactly in matrix algebra as follows:
$$\begin{pmatrix} X_{p+1} \\ X_{p+2} \\ \vdots \\ X_T \end{pmatrix}
=
\begin{pmatrix}
X_p & X_{p-1} & \ldots & X_1 \\
X_{p+1} & X_p & \ldots & X_2 \\
\vdots & \vdots & \ddots & \vdots \\
X_{T-1} & X_{T-2} & \ldots & X_{T-p}
\end{pmatrix}
\begin{pmatrix} \phi_1 \\ \phi_2 \\ \vdots \\ \phi_p \end{pmatrix}
+
\begin{pmatrix} Z_{p+1} \\ Z_{p+2} \\ \vdots \\ Z_T \end{pmatrix},
\qquad Y = X\Phi + Z. \tag{5.2}$$
Note that the first $p$ observations are lost and that the effective sample size is thus reduced to $T - p$. The least-squares estimator (OLS estimator) is obtained as the minimizer of the sum of squares $S(\Phi)$:
$$S(\Phi) = Z'Z = (Y - X\Phi)'(Y - X\Phi)
= \sum_{t=p+1}^{T} \left( X_t - \phi_1 X_{t-1} - \ldots - \phi_p X_{t-p} \right)^2
= \sum_{t=p+1}^{T} \left( X_t - P_{t-1} X_t \right)^2 \longrightarrow \min_{\Phi}. \tag{5.3}$$
Note that the optimization problem involves no constraints; in particular, causality is not imposed as a restriction. The solution of this minimization problem is given by the usual formula:
$$\widehat{\Phi} = (X'X)^{-1}(X'Y).$$
Although equation (5.2) closely resembles an ordinary regression model, there are some important differences. First, the usual orthogonality assumption between regressors and error term is violated: the regressors $X_{t-j}$, $j = 1, \ldots, p$, are correlated with the error terms $Z_{t-j}$, $j = 1, 2, \ldots$. In addition, there is a dependence on the starting values $X_p, \ldots, X_1$. The assumption of causality, however, ensures that these features play no role asymptotically. It can be shown that $X'X/T$ converges in probability to $\widehat{\Gamma}_p$ and $X'Y/T$ to $\widehat{\gamma}_p$. In addition, under quite general conditions, $T^{-1/2} X'Z$ is asymptotically normally distributed with mean $0$ and variance $\sigma^2 \Gamma_p$. Then, by Slutzky's Lemma C.10,
$$\sqrt{T}\left(\widehat{\Phi} - \Phi\right) = \left(\frac{X'X}{T}\right)^{-1} \frac{X'Z}{\sqrt{T}}$$
converges in distribution to $N(0, \sigma^2 \Gamma_p^{-1})$. Thus, the OLS estimator is asymptotically equivalent to the Yule-Walker estimator.

Theorem 5.2 (Asymptotic property of the least-squares estimator). Under the same conditions as in Theorem 5.1, the OLS estimator $\widehat{\Phi} = (X'X)^{-1}(X'Y)$ satisfies
$$\sqrt{T}\left(\widehat{\Phi} - \Phi\right) \xrightarrow{\;d\;} N\!\left(0, \sigma^2 \Gamma_p^{-1}\right), \qquad \operatorname{plim} s_T^2 = \sigma^2,$$
where $s_T^2 = \widehat{Z}'\widehat{Z}/T$ and $\widehat{Z}_t$ are the OLS residuals.

Proof. See Chapter 13, in particular Section 13.3, for a proof in the multivariate case. Additional details may be gathered from Brockwell and Davis (1991, Chapter 8).

Remark 5.1. In practice, $\sigma^2 \Gamma_p^{-1}$ is approximated by $s_T^2 (X'X/T)^{-1}$. Thus, for large $T$, $\widehat{\Phi}$ can be viewed as being approximately distributed as $N(\Phi, s_T^2 (X'X)^{-1})$. This result allows the application of the usual t- and F-tests.

Because the regressors $X_{t-j}$, $j = 1, \ldots, p$, are correlated with the error terms $Z_{t-j}$, $j = 1, 2, \ldots$, the Gauss-Markov theorem cannot be applied. This implies that the least-squares estimator is no longer unbiased in finite samples. It can be shown that the estimates of an AR(1) model are downward biased when the true value of $\phi$ is between zero and one. MacKinnon and
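As a concrete illustration, the estimator $\widehat{\Phi} = (X'X)^{-1}(X'Y)$, the variance estimate $s_T^2 = \widehat{Z}'\widehat{Z}/T$, and the approximate standard errors from Remark 5.1 can be computed directly with numpy. The following is a minimal sketch, not code from the text; the function name and interface are illustrative:

```python
import numpy as np

def ols_ar(x, p):
    """OLS estimation of an AR(p) model without intercept (illustrative helper).

    Regresses X_t on (X_{t-1}, ..., X_{t-p}) for t = p+1, ..., T, so the
    first p observations serve only as regressors and the effective
    sample size is T - p.  Returns the coefficient estimates,
    s_T^2 = Z'Z/T, and the approximate standard errors obtained as the
    square roots of the diagonal of s_T^2 (X'X)^{-1} (Remark 5.1).
    """
    x = np.asarray(x, dtype=float)
    T = len(x)
    y = x[p:]                                        # (X_{p+1}, ..., X_T)'
    # Row for time t holds the lagged values (X_{t-1}, ..., X_{t-p}).
    X = np.column_stack([x[p - j:T - j] for j in range(1, p + 1)])
    XtX_inv = np.linalg.inv(X.T @ X)
    phi_hat = XtX_inv @ (X.T @ y)                    # (X'X)^{-1} X'Y
    resid = y - X @ phi_hat                          # OLS residuals Z-hat
    s2 = resid @ resid / T                           # s_T^2 = Z'Z / T
    se = np.sqrt(s2 * np.diag(XtX_inv))              # from N(Phi, s_T^2 (X'X)^{-1})
    return phi_hat, s2, se
```

Dividing each estimate by its standard error gives the usual t-statistic. Note the divisor $T$ rather than $T - p$ in $s_T^2$, matching the definition in Theorem 5.2; the difference is asymptotically negligible.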
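The downward bias of the AR(1) estimate for $0 < \phi < 1$ is easy to reproduce in a small Monte Carlo experiment. The sample size, true value, and replication count below are illustrative choices, not taken from the text:

```python
import numpy as np

def ar1_bias_mc(phi=0.9, T=50, reps=2000, seed=42):
    """Monte Carlo estimate of the finite-sample bias of the OLS
    estimator in an AR(1) model without intercept (illustrative settings)."""
    rng = np.random.default_rng(seed)
    est = np.empty(reps)
    for r in range(reps):
        z = rng.standard_normal(T)
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = phi * x[t - 1] + z[t]      # simulate a causal AR(1) path
        y, ylag = x[1:], x[:-1]
        est[r] = (ylag @ y) / (ylag @ ylag)   # OLS estimate of phi
    return est.mean() - phi                    # average estimation error
```

For $0 < \phi < 1$ the returned value is negative, i.e. the OLS estimate is on average too small, and the bias shrinks as $T$ grows.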

