# 08 Hypothesis Testing in Regression Models - Economics 140A

While it is algebraically simple to work with a population model with a single varying regressor, most population models have multiple varying regressors:

$$Y_t = \beta_1 + \beta_2 X_{t,2} + \cdots + \beta_k X_{t,k} + U_t.$$

The classic assumptions are virtually unaffected by the presence of multiple varying regressors. The only change is that we now assume there is no multicollinearity among the regressors. The coefficients have the interpretation of partial derivatives,

$$\beta_2 = \frac{\partial Y_t}{\partial X_{t,2}},$$

that is, the coefficient $\beta_2$ measures the effect on $Y_t$ of a one-unit change in $X_{t,2}$, holding all other regressors constant.

Estimation of the model is exactly as before (there is no simplicity gained by working in deviation-from-means form), so the OLS coefficient estimators are

$$(B_1, \ldots, B_k) = \arg\min_{\tilde B_1, \ldots, \tilde B_k} \sum_{t=1}^{n} \left( Y_t - \tilde B_1 - \tilde B_2 X_{t,2} - \cdots - \tilde B_k X_{t,k} \right)^2.$$

As we discussed earlier, the probability that $B_i = \beta_i$ is zero, so we do not rely on point estimates alone. Rather, we focus on interval estimates, which contain information both about the variance and the shape of the distribution of the estimator.

There is an interesting parallel between the model with one regressor and the model with multiple regressors. For the model with one regressor, the variance of the estimator of the coefficient on $X_{t,1}$ is

$$V(B_1) = \frac{\sigma^2}{\sum_{t=1}^{n} \left( X_{t,1} - \bar X_1 \right)^2}.$$

For the model with multiple regressors,

$$V(B_1) = \frac{\sigma^2}{S_1},$$

where $S_1$ is the sum of squared residuals after regressing $X_1$ on a constant and the other regressors. We see that the denominator of the variance for the single-regressor model is simply the sum of squared residuals from regressing $X_1$ on a constant.

## Confidence Intervals

To make clear that a confidence interval depends on the shape of the estimator's distribution, consider a confidence interval for the estimator of the regression error variance. The estimator of $\sigma^2$ is

$$S^2 = \frac{1}{n-k} \sum_{t=1}^{n} \hat U_t^2.$$
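The variance parallel described above can be checked numerically. The following is a minimal NumPy sketch (not part of the original notes; the simulated data, variable names, and coefficient values are all illustrative): it verifies that the classical variance of the OLS coefficient on one regressor equals $\sigma^2$ divided by the sum of squared residuals from regressing that regressor on a constant and the other regressors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two correlated regressors (correlated, but no exact multicollinearity)
x2 = rng.normal(size=n)
x3 = 0.5 * x2 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x2, x3])

# Simulated population model with illustrative coefficients
y = 1.0 + 2.0 * x2 - 1.0 * x3 + rng.normal(size=n)

# OLS coefficient estimators: the argmin of the sum of squared residuals
B, *_ = np.linalg.lstsq(X, y, rcond=None)

sigma2 = 1.0  # true error variance (known here because we simulated the data)

# 1) Variance of the coefficient on x2 from V(B) = sigma^2 (X'X)^{-1}
V = sigma2 * np.linalg.inv(X.T @ X)
v_direct = V[1, 1]

# 2) The same variance as sigma^2 / S, where S is the sum of squared
#    residuals from regressing x2 on a constant and the other regressor
Z = np.column_stack([np.ones(n), x3])
g, *_ = np.linalg.lstsq(Z, x2, rcond=None)
S = np.sum((x2 - Z @ g) ** 2)
v_partial = sigma2 / S

assert np.isclose(v_direct, v_partial)
```

The agreement is exact (up to floating point), since this is an algebraic identity (the partialling-out, or Frisch-Waugh-Lovell, result), not an approximation.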
Note that

$$\frac{(n-k)S^2}{\sigma^2} \sim \chi^2_{n-k}.$$

From the tabulated values of $\chi^2_{n-k}$,

$$P\left( a \le \frac{(n-k)S^2}{\sigma^2} \le c \right) = .95.$$

Step 1:

$$P\left( \frac{1}{c} \le \frac{\sigma^2}{(n-k)S^2} \le \frac{1}{a} \right) = .95.$$

Step 2:

$$P\left( \frac{(n-k)S^2}{c} \le \sigma^2 \le \frac{(n-k)S^2}{a} \right) = .95.$$

Thus, 95 percent of the random intervals of the form

$$\left( \frac{(n-k)S^2}{c}, \; \frac{(n-k)S^2}{a} \right)$$

contain $\sigma^2$.

How should one choose the critical values $a$ and $c$? Because the $\chi^2_{n-k}$ distribution is skewed, there are two ways to choose the critical values. The first way is to choose equal-tailed critical values, for which

$$P\left( \frac{(n-k)S^2}{\sigma^2} \le a \right) = P\left( c \le \frac{(n-k)S^2}{\sigma^2} \right) = .025.$$

The second way is to select the critical values to minimize the distance $c - a$.
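The equal-tailed interval above can be computed directly from the $\chi^2$ quantile function. A small sketch, assuming SciPy is available (the sample size, degrees of freedom, and the value of $S^2$ below are illustrative, not from the notes):

```python
from scipy import stats

n, k = 60, 3
df = n - k        # degrees of freedom, n - k
s2 = 1.3          # illustrative realized value of the estimator S^2

# Equal-tailed critical values: P(chi2 <= a) = P(chi2 >= c) = .025
a = stats.chi2.ppf(0.025, df)
c = stats.chi2.ppf(0.975, df)

# 95% interval for sigma^2: ( (n - k) S^2 / c , (n - k) S^2 / a )
lower = df * s2 / c
upper = df * s2 / a
print(f"95% CI for sigma^2: ({lower:.3f}, {upper:.3f})")
```

Note that the larger critical value $c$ produces the lower endpoint and the smaller value $a$ the upper endpoint, exactly as in the inversion in Steps 1 and 2 above.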

This note was uploaded on 09/04/2011 for the course ECON 140a taught by Professor Staff during the Fall '08 term at UCSB.
