Problem Set 3

Problem 1
Suppose that Y = Xβ + e, where e ~ N(0_{T×1}, Σ) and Σ ≠ σ²I_T.
(1) Show that the OLS estimator is unbiased.
(2) Can we still use the t-statistics by simply assuming homoscedasticity? Why or why not?
(3) Derive the GLS estimator when Σ is known. Is the GLS estimator BLUE? Why or why not?
(4) Show that the OLS estimator is not BLUE.
(5) If the GLS estimator is not feasible, what would be an alternative for doing a t-test?
(A simulation sketch of this setup appears after Problem 3.)

Problem 2
Consider the following regression equation:
Y = X₁β₁ + X₂β₂ + X₃β₃ + e,  e ~ N(0_{T×1}, σ²I_T)
Suppose that X₂ = 3X₁ − 2X₃.

Problem 3
Consider the following DGP and regression equation:
DGP: Y = X₁β₁ + e,  e ~ N(0_{T×1}, σ²I_T)
Reg. equation: Y = X₁β₁ + X₂β₂ + e,  e ~ N(0_{T×1}, σ²I_T)
Is β̂₁ unbiased? How does adding the irrelevant variable X₂ affect the distribution of β̂₁ compared with the case in which X₂ is not included? Prove your answer. (A Monte Carlo sketch also follows below.)
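The sketch below is a minimal simulation of the setup in Problem 1, assuming for concreteness a diagonal, heteroscedastic Σ; the sample size, coefficient values, and variance pattern are illustrative assumptions rather than part of the problem. It generates data with Σ ≠ σ²I_T and computes both the OLS estimator (X'X)⁻¹X'Y and the GLS estimator (X'Σ⁻¹X)⁻¹X'Σ⁻¹Y with Σ treated as known.

```python
# Illustrative sketch for Problem 1 (assumed DGP: diagonal, heteroscedastic Sigma).
import numpy as np

rng = np.random.default_rng(0)
T = 200
beta = np.array([1.0, -0.5])                      # true coefficients (assumed)
X = np.column_stack([np.ones(T), rng.normal(size=T)])

# Non-spherical covariance: Sigma = diag(sigma_t^2), so Sigma != sigma^2 * I_T.
sigma_t = 0.5 + 2.0 * np.abs(X[:, 1])             # error s.d. depends on the regressor
e = rng.normal(scale=sigma_t)
Y = X @ beta + e

# OLS: (X'X)^{-1} X'Y  -- still unbiased, but no longer BLUE under this Sigma.
beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)

# GLS: (X'Sigma^{-1}X)^{-1} X'Sigma^{-1}Y, feasible here because Sigma is known.
Sigma_inv = np.diag(1.0 / sigma_t**2)
beta_gls = np.linalg.solve(X.T @ Sigma_inv @ X, X.T @ Sigma_inv @ Y)

print("OLS:", beta_ols, "GLS:", beta_gls)
```

Repeating this simulation many times would show both estimators centered on β, with the GLS estimates less dispersed, which is the sense in which OLS fails to be BLUE here.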
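In the same spirit, here is a hedged Monte Carlo sketch for Problem 3: the DGP contains only X₁, and β̂₁ is estimated both from the correct regression and from the longer regression that adds the irrelevant X₂. The sample size, number of replications, and the correlation built into X₂ are arbitrary illustrative choices.

```python
# Illustrative Monte Carlo for Problem 3: effect of adding an irrelevant regressor X2.
import numpy as np

rng = np.random.default_rng(1)
T, reps, beta1 = 100, 5000, 1.0

# Fixed regressors across replications; X2 is correlated with X1 but irrelevant (assumed design).
x1 = rng.normal(size=T)
x2 = 0.7 * x1 + rng.normal(scale=0.7, size=T)

b_short, b_long = [], []
for _ in range(reps):
    y = beta1 * x1 + rng.normal(size=T)           # DGP: Y = X1*beta1 + e
    X_s = x1[:, None]                             # regression without X2
    X_l = np.column_stack([x1, x2])               # regression with the irrelevant X2
    b_short.append(np.linalg.lstsq(X_s, y, rcond=None)[0][0])
    b_long.append(np.linalg.lstsq(X_l, y, rcond=None)[0][0])

# Both estimators should be centered at beta1 (unbiased), but the long regression
# shows a larger sampling variance because X2 is correlated with X1.
print("short: mean %.3f var %.5f" % (np.mean(b_short), np.var(b_short)))
print("long : mean %.3f var %.5f" % (np.mean(b_long), np.var(b_long)))
```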
Problem 4
Consider the following DGP and regression equation:
DGP: Y = X₁β₁ + X₂β₂ + e,  e ~ N(0_{T×1}, σ²I_T)
Reg. equation: Y = X₁β₁ + e,  e ~ N(0_{T×1}, σ²I_T)
Is β̂₁ unbiased? Prove your answer.

Problem 5
Sang-Hee wants to forecast the dependent variable at time T+1, y_{T+1}, by estimating the following regression equation:
Y = Xβ + e,  e ~ N(0_{T×1}, σ²I_T)
(1) What is the forecast of y_{T+1}?
(2) What is the nature of the prediction error? Explain your answer by deriving the variance of the prediction error.
(A numerical sketch appears after Problem 7.)

Problem 6
Briefly explain the following notions:
(1) Unit root process
(2) Unit root test
(3) Cointegration
(4) Stationary process
(5) Spurious regression
(A spurious-regression simulation sketch appears after Problem 7.)

Problem 7
Consider the following model:
y_i = β + e_i,  e_i ~ i.i.d. N(0, 1)
(1) Construct the log-likelihood function.
(2) Derive the maximum likelihood estimator of β.
(3) Derive Var(β̂).
(A numerical maximum likelihood sketch follows at the end.)
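For Problem 5, the following sketch computes a point forecast of y_{T+1} and checks, by simulation, the usual expression for the prediction-error variance under spherical errors, σ²(1 + x_{T+1}'(X'X)⁻¹x_{T+1}); the data, the value of x_{T+1}, and σ are all assumed for illustration.

```python
# Illustrative sketch for Problem 5: point forecast of y_{T+1} and its prediction-error
# variance, checked by simulation (all numbers below are assumed for illustration).
import numpy as np

rng = np.random.default_rng(2)
T, sigma = 50, 1.0
beta = np.array([2.0, 0.5])
X = np.column_stack([np.ones(T), rng.normal(size=T)])
x_next = np.array([1.0, 1.5])                     # regressors at time T+1 (assumed known)

XtX_inv = np.linalg.inv(X.T @ X)
# Variance of the prediction error y_{T+1} - x'beta_hat under spherical errors:
var_formula = sigma**2 * (1.0 + x_next @ XtX_inv @ x_next)

errors = []
for _ in range(20000):
    y = X @ beta + rng.normal(scale=sigma, size=T)
    beta_hat = XtX_inv @ (X.T @ y)                # OLS estimate
    y_next = x_next @ beta + rng.normal(scale=sigma)
    errors.append(y_next - x_next @ beta_hat)     # realized prediction error

print("formula variance:", var_formula, "simulated:", np.var(errors))
```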
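For Problem 6(5), a small sketch of a spurious regression: two independent random walks are generated and one is regressed on the other by OLS. The sample size and seed are arbitrary; the point is only that the conventional t-statistic and R² look deceptively strong for unrelated I(1) series.

```python
# Illustrative sketch for Problem 6: spurious regression between two independent random walks.
import numpy as np

rng = np.random.default_rng(3)
T = 500
y = np.cumsum(rng.normal(size=T))                 # unit-root (random walk) process
x = np.cumsum(rng.normal(size=T))                 # independent random walk

X = np.column_stack([np.ones(T), x])
b = np.linalg.solve(X.T @ X, X.T @ y)             # OLS of y on x
resid = y - X @ b
s2 = resid @ resid / (T - 2)
se_b1 = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# With independent I(1) series the conventional t-statistic is typically very large and
# R^2 far from zero -- the hallmark of a spurious regression.
print("slope %.3f  t-stat %.2f  R^2 %.3f" % (b[1], b[1] / se_b1, r2))
```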
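Finally, a sketch for Problem 7 that writes out the Gaussian log-likelihood of y_i = β + e_i with unit variance and maximizes it numerically; the simulated sample and the use of scipy.optimize.minimize_scalar are illustrative choices, not part of the problem.

```python
# Illustrative sketch for Problem 7: log-likelihood of y_i = beta + e_i, e_i ~ iid N(0, 1),
# and its numerical maximizer (which should coincide with the sample mean).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
n, beta_true = 200, 0.7
y = beta_true + rng.normal(size=n)                # simulated sample (assumed for illustration)

def log_likelihood(b):
    # sum_i log phi(y_i - b) with unit variance
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - b) ** 2)

res = minimize_scalar(lambda b: -log_likelihood(b))
print("numerical MLE:", res.x, "sample mean:", y.mean())   # these should agree
```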