
# Lecture 28 - Irrelevant and omitted variables


1) We are going to need a derivation to understand some of the work in this chapter. Suppose we have the following regression model and the derivation based upon it. Recall that the formula for the slope coefficient when we have just one independent variable is given by:

$$\hat{\beta} = \frac{\sum_t (X_t - \bar{X})(Y_t - \bar{Y})}{\sum_t (X_t - \bar{X})^2}.$$

Suppose we let $y_t = Y_t - \bar{Y}$ and $x_t = X_t - \bar{X}$; these are what we earlier called the deviations from the typical value, or just the deviations, for short. We may then rewrite the formula for $\hat{\beta}$ as:

$$\hat{\beta} = \frac{\sum_t x_t y_t}{\sum_t x_t^2}.$$

Now suppose we were to look at a regression equation where the constant term equals zero: $Y_t = \beta X_t + \varepsilon_t$. From this it follows that $\bar{Y} = \beta \bar{X} + \bar{\varepsilon}$ (just take averages in the previous equation to get this formula). If we subtract this equation from the regression equation we have:

$$Y_t - \bar{Y} = \beta (X_t - \bar{X}) + (\varepsilon_t - \bar{\varepsilon}) = \beta (X_t - \bar{X}) + \varepsilon_t^*, \quad \text{where } \varepsilon_t^* = \varepsilon_t - \bar{\varepsilon}.$$

Or: $y_t = \beta x_t + \varepsilon_t^*$. Now note that

$$\sum_t y_t = \sum_t (Y_t - \bar{Y}) = \sum_t Y_t - n\bar{Y} = n\bar{Y} - n\bar{Y} = 0.$$

Hence, $\bar{y} = 0$.
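The algebra above is easy to check numerically. Here is a minimal sketch (the data and the coefficient value are made up purely for illustration): the deviations sum to zero, and the deviation-form formula recovers the slope.

```python
import numpy as np

# Hypothetical data: any X and Y will do for checking the algebra.
rng = np.random.default_rng(0)
X = rng.normal(10.0, 2.0, size=50)
Y = 3.0 * X + rng.normal(0.0, 1.0, size=50)

# Deviations from the mean: x_t = X_t - Xbar, y_t = Y_t - Ybar.
x = X - X.mean()
y = Y - Y.mean()

# The deviations always sum to zero, so xbar = ybar = 0.
print(x.sum(), y.sum())  # both zero up to floating-point rounding

# Slope estimator in deviation form.
beta_hat = (x * y).sum() / (x ** 2).sum()
print(beta_hat)  # close to the true slope of 3
```

With 50 observations and modest noise, `beta_hat` lands near the true slope used to generate the data.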

Of course, the same holds true for $\bar{x}$. Suppose now we find the formula for the estimator in the regression with the variables in deviation form. We have the following formula:

$$\hat{\beta} = \frac{\sum_t (x_t - \bar{x})(y_t - \bar{y})}{\sum_t (x_t - \bar{x})^2} = \frac{\sum_t x_t y_t}{\sum_t x_t^2},$$

since $\bar{x} = \bar{y} = 0$. But notice that the last part of this equation is exactly the same as the formula for computing the estimated coefficient when the variables are in their original form, i.e., not in deviation form. So, there are two things to note here:

1) The formula for computing the estimated slope coefficient is the same whether the variables are in deviation form or not.

2) You should feel comfortable using either the deviation or standard form of the variables. For instance, you should be able to identify the fact that:

$$\sum_t (X_t - \bar{X})(Y_t - \bar{Y}) = \sum_t x_t y_t.$$

We are now going to look at what happens when a variable that should be in the regression equation is left out. This is a type of specification error called omitted variables. Let the true regression model be given by:

$$Y_t = \beta_0 + \beta_1 X_{1t} + \beta_2 X_{2t} + \varepsilon_t.$$

Suppose, however, variable $X_{2t}$ is omitted from the regression equation, so that we have:

$$Y_t = \beta_0 + \beta_1 X_{1t} + \varepsilon_t^*, \quad \text{where } \varepsilon_t^* = \varepsilon_t + \beta_2 X_{2t}.$$

We know that the formula for the estimator for $\beta_1$ is given by:

$$\hat{\beta}_1 = \frac{\sum_t x_{1t} y_t}{\sum_t x_{1t}^2}.$$

We also know that the deviation form of the regression model would look like the following (I am just extending what we did above for the case of one independent variable; the same derivation will work for any number of independent variables, try it for yourself):

$$y_t = \beta_1 x_{1t} + \beta_2 x_{2t} + \varepsilon_t^*.$$

Plugging this last equation into the formula for the estimator $\hat{\beta}_1$, we obtain:
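Point 1) can be verified directly: the slope computed from the variables in original form and in deviation form is the same number. A small sketch with made-up data (the intercept and slope values are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=100)
Y = 2.0 + 0.5 * X + rng.normal(size=100)

# Slope from the variables in their original form.
slope_original = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()

# Slope from the variables in deviation form (no constant needed).
x = X - X.mean()
y = Y - Y.mean()
slope_deviation = (x * y).sum() / (x ** 2).sum()

# An independent check: numpy's own least-squares line fit.
slope_polyfit = np.polyfit(X, Y, 1)[0]

print(slope_original, slope_deviation, slope_polyfit)  # identical up to rounding
```

All three computations agree, which is exactly the point: demeaning the variables absorbs the constant term without changing the slope estimate.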
$$\hat{\beta}_1 = \frac{\sum_t x_{1t}\left(\beta_1 x_{1t} + \beta_2 x_{2t} + \varepsilon_t^*\right)}{\sum_t x_{1t}^2} = \beta_1 + \beta_2 \frac{\sum_t x_{1t} x_{2t}}{\sum_t x_{1t}^2} + \frac{\sum_t x_{1t} \varepsilon_t^*}{\sum_t x_{1t}^2}.$$
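This last expression is the omitted-variable bias: unless $\sum_t x_{1t} x_{2t} = 0$, the estimator $\hat{\beta}_1$ picks up part of $\beta_2$'s effect. A simulation with made-up coefficients ($\beta_1 = 2$, $\beta_2 = 3$, and $X_2$ correlated with $X_1$) illustrates the formula:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
X1 = rng.normal(size=n)
X2 = 0.8 * X1 + rng.normal(size=n)  # X2 is correlated with X1
eps = rng.normal(size=n)

beta0, beta1, beta2 = 1.0, 2.0, 3.0
Y = beta0 + beta1 * X1 + beta2 * X2 + eps  # the true model

# Estimate the short regression that omits X2 (deviation form).
x1 = X1 - X1.mean()
y = Y - Y.mean()
beta1_hat = (x1 * y).sum() / (x1 ** 2).sum()

# The value the derivation predicts: beta1 plus the bias term.
x2 = X2 - X2.mean()
predicted = beta1 + beta2 * (x1 * x2).sum() / (x1 ** 2).sum()

print(beta1_hat, predicted)  # both close to beta1 + beta2 * 0.8 = 4.4
```

The estimate lands near 4.4 rather than the true $\beta_1 = 2$: the short regression attributes the effect of the omitted $X_2$ to $X_1$ in exact accordance with the bias term above.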
