Binary Choice (Probit and Logit) Models

ECON 203C: System Models
TA Note 4: Version 3
Binary Choice (Probit and Logit) Models
Hisayuki Yoshimoto
Last Modified: May 7, 2008

Abstract: In section 1, we review the Bernoulli random variable, which is the basis of binary choice models. In section 2, we solve Final 2003 Question 5, the ML estimation of the Bernoulli distribution. In section 4, we discuss binary choice models and their asymptotic distributions. In section 5, we solve Final 2006 Question 1 and study the identification problem. In sections 6 and 7, we solve Comp 2003S Part III Question 2 and the Comp 2003F Part III Questions, both binary choice model problems.

1 Review of the Bernoulli Random Variable

Since binary choice models are an extension of the Bernoulli distribution, let's start this TA note by reviewing it. A Bernoulli random variable $y$ takes only the two values $0$ and $1$, with probabilities
$$
y = \begin{cases} 1 & \text{success, with probability } p \\ 0 & \text{failure, with probability } 1 - p. \end{cases}
$$
It has the (mass) density function
$$
f(y \mid p) = p^{y} (1 - p)^{1 - y}.
$$
The expectation and variance of a Bernoulli random variable $y$ are
$$
\begin{aligned}
E[y] &= p \cdot 1 + (1 - p) \cdot 0 && \text{(definition of expectation)} \\
&= p
\end{aligned}
$$
and
$$
\begin{aligned}
\mathrm{Var}[y] &= E\left[ (y - E[y])^2 \right] && \text{(definition of variance)} \\
&= p \, (1 - p)^2 + (1 - p)(0 - p)^2 && \text{(since } E[y] = p \text{)} \\
&= p (1 - p).
\end{aligned}
$$
Note that the property $E[y] = p$ becomes crucial when we extend the Bernoulli distribution to binary choice models. The following question explains how to obtain an estimate of $p$ by the maximum likelihood method.

2 Final 2003: Question 5 - ML Estimation of a Bernoulli Random Variable

Let $y_1, \ldots, y_n$ be a random sample from a Bernoulli distribution with the probability of success given by $p$.

(1) Write the likelihood function for $p$, i.e.,
$L(p \mid y_1, \ldots, y_n)$.

Answer: Since $\{y_i\}_{i=1}^{n}$ are Bernoulli, we have
$$
y_i = \begin{cases} 1 & \text{success, with probability } p \\ 0 & \text{failure, with probability } 1 - p. \end{cases}
$$
The density function of the Bernoulli distribution is
$$
f(y_i \mid p) = p^{y_i} (1 - p)^{1 - y_i}.
$$
Since $\{y_i\}_{i=1}^{n}$ is a random (independent) sample, the likelihood function is
$$
\begin{aligned}
L(p \mid y_1, \ldots, y_n) &= f(y_1, \ldots, y_n \mid p) \\
&= \prod_{i=1}^{n} f(y_i \mid p) && \text{(since the sample is random)} \\
&= \prod_{i=1}^{n} p^{y_i} (1 - p)^{1 - y_i}.
\end{aligned}
$$

(2) Provide the MLE for $p$.

Answer: The log-likelihood function is
$$
\begin{aligned}
l(p \mid y_1, \ldots, y_n) &= \ln L(p \mid y_1, \ldots, y_n) \\
&= \sum_{i=1}^{n} \ln \left[ p^{y_i} (1 - p)^{1 - y_i} \right] \\
&= \sum_{i=1}^{n} \left[ y_i \ln p + (1 - y_i) \ln (1 - p) \right].
\end{aligned}
$$
Taking the f.o.c. w.r.t. $p$,
$$
\begin{aligned}
\frac{\partial}{\partial p} \, l(p \mid y_1, \ldots, y_n) &= \frac{\partial}{\partial p} \sum_{i=1}^{n} \left[ y_i \ln p + (1 - y_i) \ln (1 - p) \right] \\
&= \sum_{i=1}^{n} \left[ y_i \frac{1}{p} - (1 - y_i) \frac{1}{1 - p} \right] \\
&= \frac{1}{p} \sum_{i=1}^{n} y_i - \frac{1}{1 - p} \sum_{i=1}^{n} (1 - y_i) = 0,
\end{aligned}
$$
and by rearranging,
$$
(1 - p) \sum_{i=1}^{n} y_i = p \sum_{i=1}^{n} (1 - y_i)
$$
$$
\sum_{i=1}^{n} y_i - \underbrace{p \sum_{i=1}^{n} y_i}_{\text{cancel out}} = pn - \underbrace{p \sum_{i=1}^{n} y_i}_{\text{cancel out}}
$$
$$
\hat{p}_{ML} = \frac{1}{n} \sum_{i=1}^{n} y_i.
$$
This result is not surprising, because we already know the property of the Bernoulli random variable, $E[y] = p$.

(3) Derive the asymptotic distribution for $p$ and provide the asymptotic covariance matrix....
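The derivation in parts (1)-(2) can be sanity-checked numerically. The sketch below (Python; the values $p = 0.3$ and $n = 2000$ are illustrative choices, not from the note) simulates a Bernoulli sample, computes the closed-form MLE $\hat{p}_{ML} = \frac{1}{n}\sum_i y_i$, and confirms that a crude grid search over the log-likelihood picks essentially the same point, and that the sample variance matches $\hat{p}(1-\hat{p})$.

```python
# Numerical check of the Bernoulli MLE derivation (sketch, not from the note).
import math
import random

random.seed(0)
p_true, n = 0.3, 2000  # illustrative values, not from the note

# Random sample y_1, ..., y_n from Bernoulli(p_true).
y = [1 if random.random() < p_true else 0 for _ in range(n)]

# Closed-form MLE from the f.o.c.: p_hat = (1/n) * sum_i y_i.
p_hat = sum(y) / n

def log_likelihood(p):
    """l(p | y_1, ..., y_n) = sum_i [ y_i ln p + (1 - y_i) ln(1 - p) ]."""
    return sum(yi * math.log(p) + (1 - yi) * math.log(1 - p) for yi in y)

# Grid search over (0, 1): since l is strictly concave in p, the best grid
# point should sit within one grid step of the analytic maximizer p_hat.
grid = [k / 200 for k in range(1, 200)]
p_grid = max(grid, key=log_likelihood)

# Sample analogue of Var[y] = p(1 - p): for 0/1 data, y_i^2 = y_i, so the
# sample variance equals p_hat * (1 - p_hat) exactly.
var_hat = sum((yi - p_hat) ** 2 for yi in y) / n

print(p_hat, p_grid, var_hat)
```

The grid search stands in for a general-purpose optimizer; it is enough here because the log-likelihood is concave with a unique interior maximum.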

This note was uploaded on 03/26/2012 for the course ECON 203C, taught by Professor Buchinsky, M., during the Spring '08 term at UCLA.
