ps1_2010

STAT 203 PROBLEM SET 1
Due date: Jan. 21

(1) Let $e_1, \ldots, e_n$ be the residuals from the regression of $y_1, \ldots, y_n$ on $x_1, \ldots, x_n$.
    (a) Show that $\sum_{i=1}^n e_i = 0$.
    (b) One of the assumptions for simple least squares regression is the following: the errors $\epsilon_i$ are independent and identically distributed, with mean 0 and variance $\sigma^2$. Does $\sum_{i=1}^n e_i = 0$ help validate the above assumption? Why or why not?

(2) Consider least squares linear regression of $(y_1, \ldots, y_n)$ on $(x_1, \ldots, x_n)$ by the model $y_i \sim N(\beta_0 + \beta_1 x_i, \sigma^2)$, where the $y$'s are assumed to be independent.
    (a) Express the least squares estimator $\hat{\beta}$ and the residual vector $r = (y_1 - \hat{y}_1, \ldots, y_n - \hat{y}_n)$ in matrix notation as linear transformations of $y$.
    (b) For $y \sim N(\mu, \Sigma)$, $u = Ay$, $v = By$, the covariance between $u$ and $v$ is $A \Sigma B^t$. Use this property to show that $r$ is independent of $\hat{\beta}$.
    (c) Show that $\sum_{i=1}^n (y_i - \hat{y}_i)^2 \sim \sigma^2 \chi^2_{n-2}$.

(3) RABE Exercise 3.4 (Data file: Examination.txt)
(4) RABE Exercise 3.14 (Data file: Cigarette.txt)
(5) RABE Exercise 4.7 (Data file: Cigarette.txt)

RABE: Regression Analysis by Example by Chatterjee and Hadi, Ed. 4.
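As a numerical companion to problems (1a) and (2a), here is a short sketch (not part of the assignment, using simulated data of my own choosing) that builds the design matrix, computes the hat matrix $H = X(X^tX)^{-1}X^t$, and checks that the residual vector $r = (I - H)y$ is a linear transformation of $y$ whose entries sum to zero:

```python
# Hypothetical numerical check with simulated data; the coefficients
# (2.0, 3.0), sample size, and seed are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])          # design matrix [1, x]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # least squares estimator
H = X @ np.linalg.solve(X.T @ X, X.T)         # hat matrix X (X^t X)^{-1} X^t
r = (np.eye(n) - H) @ y                       # residual vector r = (I - H) y

print(np.isclose(r.sum(), 0.0))               # property asked for in (1a)
print(np.allclose(r, y - X @ beta_hat))       # same residuals either way
```

Writing $r = (I - H)y$ rather than $y - X\hat{\beta}$ makes the "linear transformation of $y$" structure explicit, which is exactly the form problem (2b) needs.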
