ECON 103, Lecture 14: Regression with a binary dependent variable
Maria Casanova
May 21st (version 1)
Requirements for this lecture: Chapter 11 of Stock and Watson.
0. Introduction

So far we have considered dependent variables (Y) that were continuous, such as:
- Y = wages
- Y = wealth

Sometimes we will be interested in explaining dependent variables that are discrete and, in particular, binary, such as:
- Y = indicator of whether an individual gets into college
- Y = indicator of whether an individual smokes
- Y = indicator of whether a mortgage application is accepted
0. Introduction

Example: Mortgage denial and race - the Boston Fed HMDA dataset.
- Sample of individual applications for single-family mortgages made in 1990 in the greater Boston area.
- The sample included all applications made by blacks and Hispanics and a random sample of the applications made by whites.
- It consisted of 2,380 observations, collected under the Home Mortgage Disclosure Act (HMDA).
- The set of variables included:
  - whether the mortgage application is accepted (dependent variable);
  - income, wealth, employment status, other loans, property characteristics (independent variables);
  - race of applicant (independent variable).
1. Linear Probability Model

A natural starting point for the analysis of a binary dependent variable is the linear regression model with a single regressor:

$$Y_i = \beta_0 + \beta_1 X_i + u_i$$

The linear regression model is called the linear probability model when the dependent variable is binary. This is because, when Y is binary, the population regression function corresponds to the probability that Y equals 1 (proof on the next slide).
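As a quick illustration of how such a model is estimated, the sketch below fits an LPM by OLS on simulated data. The data-generating process, the variable names, and the use of statsmodels with heteroskedasticity-robust standard errors are choices made for this example, not part of the lecture.

```python
# Minimal sketch of the linear probability model: OLS with a binary Y.
# The data are simulated and the true coefficients (0.2 and 0.5) are arbitrary.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000
x = rng.uniform(0.0, 1.0, n)            # single regressor
p = 0.2 + 0.5 * x                       # true Pr(Y = 1 | X), stays inside [0, 1]
y = rng.binomial(1, p)                  # binary dependent variable

X = sm.add_constant(x)                  # adds the intercept column for beta_0
lpm = sm.OLS(y, X).fit(cov_type="HC1")  # heteroskedasticity-robust SEs
print(lpm.params)                       # estimates should be close to 0.2 and 0.5
```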
1. Linear Probability Model

Recall that, according to the first least squares assumption, $E(u_i \mid X_i) = 0$. Therefore:

$$E(Y \mid X) = E(\beta_0 + \beta_1 X + u \mid X) = \beta_0 + \beta_1 X$$

When Y is binary,

$$E(Y) = 0 \times \Pr(Y = 0) + 1 \times \Pr(Y = 1) = \Pr(Y = 1)$$

So when Y is binary,

$$E(Y \mid X) = \Pr(Y = 1 \mid X) = \beta_0 + \beta_1 X$$
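The middle step can be checked numerically: for a 0/1 variable, the sample mean is the share of ones, which estimates Pr(Y = 1). The sketch below uses an arbitrary probability of 0.3.

```python
# Quick numerical check that, for a binary Y, E(Y) equals Pr(Y = 1).
import numpy as np

rng = np.random.default_rng(0)
y = rng.binomial(1, 0.3, size=100_000)   # binary draws with Pr(Y = 1) = 0.3

# The sample mean estimates E(Y); for a 0/1 variable it is also the fraction
# of ones, i.e. an estimate of Pr(Y = 1).
print(y.mean())                          # close to 0.3
```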
1. Linear Probability Model

When Y is binary, the population coefficient $\beta_1$ represents the change in the probability that Y is 1 associated with a one-unit change in X:

$$\beta_1 = \frac{\Delta Y}{\Delta X} = \frac{\Pr(Y = 1 \mid X = x + \Delta X) - \Pr(Y = 1 \mid X = x)}{\Delta X}$$

The predicted value $\hat{Y}$, computed using the estimated regression function

$$\hat{Y} = \hat{\beta}_0 + \hat{\beta}_1 X,$$

is the predicted probability that Y equals 1 given X.
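The arithmetic behind this interpretation is simple; the sketch below uses purely hypothetical coefficient estimates to show how predicted probabilities and their changes are computed.

```python
# Hedged sketch: interpreting an estimated LPM. The coefficient values are
# hypothetical, chosen only to illustrate the arithmetic.
b0_hat, b1_hat = 0.1, 0.5         # hypothetical estimates of beta_0 and beta_1

p_at_03 = b0_hat + b1_hat * 0.3   # predicted Pr(Y = 1 | X = 0.3), roughly 0.25
p_at_04 = b0_hat + b1_hat * 0.4   # predicted Pr(Y = 1 | X = 0.4), roughly 0.30

# The change in the predicted probability equals the slope times the change
# in X: b1_hat * 0.1 = 0.05.
print(p_at_03, p_at_04, p_at_04 - p_at_03)
```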
1. Linear Probability Model

Application to the Boston HMDA dataset. Consider the following model:

$$Y_i = \beta_0 + \beta_1 X_i + u_i,$$

where

$$Y_i = \begin{cases} 1 & \text{if the mortgage is denied} \\ 0 & \text{otherwise} \end{cases}$$

and X = P/I ratio (the ratio of loan payments to income).
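A hedged sketch of how this regression might be run is below. The file path and the column names ("deny" and "pi_ratio") are placeholders, not the actual names in any particular copy of the HMDA data.

```python
# Hedged sketch of the HMDA application: regress a denial indicator on the
# P/I ratio. File path and column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

hmda = pd.read_csv("hmda.csv")                        # placeholder path
# If denial is coded as "yes"/"no", convert it to a 0/1 indicator first.
hmda["deny"] = (hmda["deny"] == "yes").astype(int)
lpm_hmda = smf.ols("deny ~ pi_ratio", data=hmda).fit(cov_type="HC1")
print(lpm_hmda.params)   # slope: change in Pr(deny = 1) per unit of P/I ratio
```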