Lecture 17 - Classical regression model

We have learned how to estimate regression equations and how to interpret the results. However, life is not as simple as this: it turns out that the method of least squares is only justified when a certain set of assumptions, known as the classical assumptions, holds true. Before we explain these assumptions, let us summarize what we have done so far.

We have one or more independent variables that we believe can explain movements in the dependent variable. In addition to the independent variables, there is a random element that also causes the dependent variable to change. The random error term is present because of model mis-specification, errors in the collection and measurement of data, and inherent randomness. From the population for the dependent variable we select a random sample, obtaining a set of values for the dependent variable, and we would like to know whether there is a relationship between the dependent and independent variables.

The regression model is given by:

$$Y_t = B_0 + B_1 X_{1t} + B_2 X_{2t} + \cdots + B_k X_{kt} + \varepsilon_t$$

Because $Y$ depends upon both the independent variables and the random error term, $Y$ is a random variable, so it has a PDF. The estimated regression line is given by:

$$\hat{Y}_t = \hat{B}_0 + \hat{B}_1 X_{1t} + \hat{B}_2 X_{2t} + \cdots + \hat{B}_k X_{kt}$$

The $\hat{B}$'s, i.e. the estimated coefficients, are also random variables and hence also have a PDF. We know they are random variables because the value they take on depends on the particular random sample selected: different random samples will yield different values for the estimators.

There is another way to see that the estimated coefficients are in fact random variables. Let us look at the formula for the slope coefficient in the case of a regression equation with one independent variable. (RECALL that the formulas for the estimated coefficients change as the number of independent variables changes, but the result we are about to show holds in all cases.)
$$\hat{B}_1 = \frac{\sum_t (X_t - \bar{X})(Y_t - \bar{Y})}{\sum_t (X_t - \bar{X})^2}$$

Let us define $x_t = X_t - \bar{X}$ and $y_t = Y_t - \bar{Y}$. These are called deviations, i.e., they are just the deviation of each variable from its typical (mean) value. Then we may rewrite the equation for the estimated coefficient as:

$$\hat{B}_1 = \frac{\sum_t x_t y_t}{\sum_t x_t^2}$$
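As a concrete illustration, the deviation form of the slope formula can be computed directly and checked against a standard least-squares routine. This is a minimal sketch using made-up sample data (the numbers are illustrative, not from the lecture):

```python
import numpy as np

# Hypothetical sample: one independent variable X and dependent variable Y.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Deviations from the sample means: x_t = X_t - X-bar, y_t = Y_t - Y-bar.
x = X - X.mean()
y = Y - Y.mean()

# Slope estimate: B1-hat = sum(x_t * y_t) / sum(x_t^2).
b1_hat = (x * y).sum() / (x ** 2).sum()

# The intercept then follows from B0-hat = Y-bar - B1-hat * X-bar.
b0_hat = Y.mean() - b1_hat * X.mean()

# Cross-check against NumPy's least-squares polynomial fit
# (polyfit returns coefficients highest degree first: slope, intercept).
b1_np, b0_np = np.polyfit(X, Y, deg=1)

print(b1_hat, b0_hat)
```

Both routes give the same slope, since `np.polyfit` with degree 1 also minimizes the sum of squared residuals.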
Now let

$$w_t = \frac{x_t}{\sum_t x_t^2}$$

Then we can write the equation for the estimator as:

$$\hat{B}_1 = \sum_t w_t y_t$$

In this form we can see that the estimated coefficient is simply a weighted average of the values of the dependent variable. Since the dependent variable is a random variable, so must be the estimated coefficients.

Assumptions: We intend to use the method of least squares to find the estimated values for the coefficients. In order to use this method, the following assumptions must hold. (We will see later how to deal with those situations in which the assumptions do not hold.)
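The weighted-average form also makes the randomness of the estimator easy to see in a simulation: holding the X values fixed and drawing new error terms for each sample produces a different slope estimate every time. A small sketch, where the "true" population relationship Y = 2 + 3X + ε is an assumption chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed values of the independent variable across all samples.
X = np.linspace(0.0, 10.0, 50)
x = X - X.mean()
w = x / (x ** 2).sum()  # weights w_t = x_t / sum(x_t^2)

def b1_hat(Y):
    """Slope estimate as a weighted average of the dependent variable.

    Note: since the weights sum to zero, sum(w_t * y_t) with deviations
    y_t equals sum(w_t * Y_t) with the raw values, so we can skip
    demeaning Y here.
    """
    return (w * Y).sum()

# Draw several random samples from the (assumed) population
# Y = 2 + 3X + eps; each sample yields a different estimate,
# which is why B1-hat is itself a random variable.
estimates = []
for _ in range(5):
    eps = rng.normal(0.0, 1.0, size=X.size)
    Y = 2.0 + 3.0 * X + eps
    estimates.append(b1_hat(Y))

print(estimates)  # five distinct values scattered around the true slope 3
```

The spread of the estimates around the true slope is exactly the sampling variability that gives the estimator its PDF.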
This note was uploaded on 03/03/2011 for the course ECO 230 taught by Professor Yongjinpark during the Spring '11 term at Conn College.
