Sufficiency

Economics 245A
The Likelihood Function and Sufficiency

In Economics 241B we introduced the likelihood function and provided an intuitive definition of the maximum likelihood (ML) estimator. We also showed the equivalence of the OLS estimator and the ML estimator for linear regression models with Gaussian errors. Today we begin a more formal discussion of ML estimation with a full treatment of the likelihood (function).

Recall that the likelihood function states how "likely" a given density is to have generated the observed sample. In full generality, we assume that $Y = \{Y_1, \ldots, Y_n\}$ is generated by a joint density function that belongs to the family of joint density functions denoted by $\mathcal{F}$. The likelihood that a given joint density, $f_Y(\cdot)$, generated the observed values $y$ is $L[f_Y(\cdot); y]$, which has analytic form $f_Y(y)$. In the applications we will consider, the functional form of all members of $\mathcal{F}$ is specified up to a finite number of parameters $\theta$. The allowable set of parameters, or parameter space, is $\Theta$. The likelihood that the joint density function with a given parameter value $\theta$ generated the observed values $y$ is
$$L(\theta; y) = f_Y(y; \theta) \quad \text{for all } \theta \in \Theta.$$
The key to keeping distinct the ideas of the likelihood and the joint density is to note that the joint density takes as its argument $y$ given a value of $\theta$, while the likelihood takes as its argument $\theta$ given a value of $y$. The log likelihood is
$$l(\theta; y) = \log f_Y(y; \theta) \quad \text{for all } \theta \in \Theta.$$
Observe that if we maximize the (log) likelihood to estimate $\theta$, and if $f_Y(y; \theta) = h(y, z) f_Z(z; \theta)$ for all $\theta \in \Theta$, then identical conclusions about $\theta$ are drawn from $y$ and $z$: the factor $h(y, z)$ does not depend on $\theta$, so it drops out of the maximization.

In what follows, we will repeatedly refer to the following example.

Example (Cauchy). Let $\{Y_1, \ldots, Y_n\}$ be a sequence of i.i.d. Cauchy random variables with location $\mu$ and scale $\sigma$. The parameter space is defined by the restrictions $-\infty < \mu < \infty$ and $0 < \sigma$. Let $\theta = (\mu, \sigma)$. The probability density function for $Y_t$ is
$$f_{Y_t}(y_t; \theta_0) = \frac{1}{\pi \sigma_0 \left\{ 1 + \left[ (y_t - \mu_0)/\sigma_0 \right]^2 \right\}},$$
where subscript 0 denotes population values.

Remark. Much is interesting about the Cauchy distribution. The density is symmetric about $\mu$. As the first moment of $Y_t$ does not exist, no integer moments exist. Hence the moment generating function does not exist, and the Cauchy is a distribution that cannot be expressed through a sequence of moments. Further, because the first moment does not exist, the sample mean is not a consistent estimator of $\mu$. The characteristic function of $Y_t$ does exist and can be used to show that the distribution of $\bar{Y}_n$ is simply the distribution of $Y_t$ for any $n$, thereby confirming the inconsistency of the sample mean. The ML estimator is a consistent estimator of $\theta$.
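To make these definitions concrete, here is a minimal sketch in Python of the Cauchy log likelihood $l(\theta; y)$ and its numerical maximization. It is an illustration added to these notes, not part of the original: the names cauchy_loglik and cauchy_mle, the use of scipy's Nelder-Mead optimizer, and the median and interquartile-range starting values are all my choices; the notes themselves do not prescribe an algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def cauchy_loglik(theta, y):
    """Log likelihood l(theta; y) for an i.i.d. Cauchy(mu, sigma) sample y."""
    mu, sigma = theta
    if sigma <= 0:
        return -np.inf  # outside the parameter space
    z = (y - mu) / sigma
    # log f(y_t; theta) = -log(pi) - log(sigma) - log(1 + z_t^2), summed over t
    return np.sum(-np.log(np.pi) - np.log(sigma) - np.log1p(z ** 2))

def cauchy_mle(y):
    """Numerically maximize l(theta; y) over theta for the observed sample y."""
    # Starting values: the sample median is consistent for mu, and half the
    # interquartile range is consistent for sigma under the Cauchy model.
    start = np.array([np.median(y),
                      0.5 * (np.quantile(y, 0.75) - np.quantile(y, 0.25))])
    result = minimize(lambda th: -cauchy_loglik(th, y), start,
                      method="Nelder-Mead")
    return result.x

# Example: with y held fixed, the likelihood is a function of theta alone.
rng = np.random.default_rng(0)
y = 5.0 + 2.0 * rng.standard_cauchy(10_000)  # mu_0 = 5, sigma_0 = 2
print(cauchy_loglik((5.0, 2.0), y))          # l(theta; y) at the truth
print(cauchy_mle(y))                         # estimates close to (5, 2)
```

Note how the argument order in cauchy_loglik mirrors the definition in the text: $\theta$ is the argument of interest, with $y$ held fixed at the observed sample.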
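The Remark's claim that $\bar{Y}_n$ has the same Cauchy distribution for every $n$ can also be seen in simulation. The sketch below, again my own illustration under the same assumed parameter values as above, contrasts the sample mean with the sample median; the median serves here as a simple consistent estimator of the location, standing in for the ML estimator discussed in the notes.

```python
import numpy as np

# Simulate Cauchy samples of increasing size with mu_0 = 5, sigma_0 = 2.
rng = np.random.default_rng(0)
mu0, sigma0 = 5.0, 2.0

for n in (100, 10_000, 1_000_000):
    y = mu0 + sigma0 * rng.standard_cauchy(n)
    # The sample mean is itself Cauchy(mu_0, sigma_0) for every n, so it
    # fluctuates without settling down; the sample median converges to mu_0.
    print(f"n = {n:>9,}: mean = {y.mean():8.3f}, median = {np.median(y):6.3f}")
```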