# Lecture 2: Review of the Bivariate Regression Model


Olivier Deschenes, UCSB, Econ 140B, Winter 2011

Lecture 2: Review of the bivariate regression model

- Background
- Population regression line
- OLS estimator of the population regression line
- Assumptions of the linear regression model
- Sampling distribution of the OLS estimator
- Law of large numbers
- Central limit theorem
- Stata example

## Linear regression model

Why do we estimate regressions? Linear regression allows us to estimate population slope coefficients that quantify the association between two or more variables, and to make inferences about them (prediction, hypothesis tests, confidence intervals, etc.).

Ultimately our goal will be to estimate the causal effect on Y of a unit change in X. This is related to the concept of "ceteris paribus," which is central to economic analysis, and it hinges on the validity of the E[u|X] = 0 assumption. For now, just think of the problem of fitting a straight line to data on two variables, Y and X.
## The Population Linear Regression Model

$$Y_i = \beta_0 + \beta_1 X_i + u_i, \quad i = 1, \ldots, n$$

- $X$ is the independent variable or regressor
- $Y$ is the dependent variable
- $\beta_0$ = intercept
- $\beta_1$ = slope (recall that $\beta_1$ is $\Delta Y / \Delta X$)

The slope measures the change in $Y$ for a 1-unit change in $X$. The magnitude, sign, and significance of $\beta_1$ are important.

$u_i$ is the regression error. The regression error consists of omitted factors, or possibly measurement error in the measurement of $Y$. In general, these omitted factors are other factors that influence $Y$ other than the variable $X$.
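The lecture's computing examples use Stata; as a language-neutral sketch, the population model above can be simulated in Python (the parameter values and sample size here are illustrative assumptions, not from the lecture):

```python
import numpy as np

# Simulate the population model Y_i = beta_0 + beta_1 * X_i + u_i.
# Drawing u independently of X guarantees E[u|X] = 0.
rng = np.random.default_rng(0)

beta_0, beta_1 = 2.0, 0.5   # hypothetical population intercept and slope
n = 1000

X = rng.uniform(0, 10, size=n)   # regressor
u = rng.normal(0, 1, size=n)     # regression error, independent of X
Y = beta_0 + beta_1 * X + u      # dependent variable
```

Because $u$ is drawn independently of $X$, the sample correlation between the two should be close to zero, mirroring the E[u|X] = 0 assumption.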

In a picture: observations on $Y$ and $X$, the population regression line, and the regression error (the "error term").
## The Ordinary Least Squares Estimator

The OLS estimator is a function of the data. The OLS estimator of the population regression line seeks the line that "best summarizes the trend in the data."

## How to choose the best line?

Several criteria exist; the slide compares two possible candidate lines. OLS chooses the line that minimizes the sum of squared prediction errors (equivalently, the mean squared error). This is the origin of the term "least squares."
Let $b_0$ and $b_1$ be some estimators (or guesses) of $\beta_0$ and $\beta_1$. Given $b_0$ and $b_1$, the predicted regression line is $b_0 + b_1 X_i$. The sum of the squared prediction errors is:

$$\sum_{i=1}^{n} (Y_i - b_0 - b_1 X_i)^2$$
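This criterion is easy to compute directly. A minimal Python sketch (the data and the two candidate lines are made-up illustrations, not from the lecture) shows how candidate lines can be ranked by their sum of squared errors:

```python
import numpy as np

def sum_squared_errors(Y, X, b0, b1):
    """Sum of squared prediction errors for the candidate line b0 + b1*X."""
    residuals = Y - (b0 + b1 * X)   # prediction errors Y_i - (b0 + b1*X_i)
    return float(np.sum(residuals ** 2))

# Illustrative data (not from the lecture)
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([1.1, 1.9, 3.2, 3.8])

# Compare two candidate lines: the one with the smaller sum of
# squared errors summarizes the trend in the data better.
sse_a = sum_squared_errors(Y, X, 0.0, 1.0)
sse_b = sum_squared_errors(Y, X, 1.0, 0.5)
```

Here the first candidate line fits this data much better; OLS goes one step further and finds the $(b_0, b_1)$ pair that makes this sum as small as possible.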

The OLS estimator solves:

$$\min_{b_0, b_1} \sum_{i=1}^{n} \left[ Y_i - (b_0 + b_1 X_i) \right]^2$$

The OLS estimator minimizes the (average) squared difference between the actual values of $Y_i$ and the prediction ("predicted value") based on the estimated line. The OLS estimators are denoted $\hat{\beta}_0$ and $\hat{\beta}_1$. This minimization problem can be solved using calculus (i.e., solving the first-order conditions).
The OLS slope estimator is the ratio of the sample covariance of $Y$ and $X$ to the sample variance of $X$, and the intercept follows from the sample means:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}, \qquad \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$$
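The covariance-over-variance formula can be sketched in a few lines of Python (the data values are illustrative, not from the lecture):

```python
import numpy as np

def ols(Y, X):
    """OLS estimates: slope = sample cov(X, Y) / sample var(X);
    intercept = Ybar - slope * Xbar."""
    x_dev = X - X.mean()
    y_dev = Y - Y.mean()
    beta1_hat = np.sum(x_dev * y_dev) / np.sum(x_dev ** 2)
    beta0_hat = Y.mean() - beta1_hat * X.mean()
    return float(beta0_hat), float(beta1_hat)

# Illustrative data (not from the lecture)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 2.5, 3.5, 4.0, 5.0])

b0_hat, b1_hat = ols(Y, X)
```

NumPy's `np.polyfit(X, Y, 1)`, which returns the slope and intercept of the least-squares line, gives the same answer and can serve as a cross-check.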
