Examples on Maximum Likelihood Estimation and Bayesian Inference

November 13, 2009

1 Estimating a Normal Population Mean

1.1 Unknown $\mu$, Known $\sigma^2$

Let $X_i \sim N(\mu, \sigma^2)$, where $\sigma^2$ is known and $\mu$ is to be estimated from the $n$ observed samples $x_1, \ldots, x_n$. To find the maximum likelihood estimator of $\mu$, or to obtain a posterior distribution over $\mu$ via the Bayesian route, we first need to state the likelihood. Recall that the likelihood is obtained from the data-generating mechanism, which in this case is a normal distribution: each observation $x_i$ is generated by a normal distribution with mean $\mu$, the quantity we would like to learn about. Thus the likelihood of $\mu$ (equivalently, the joint density of the observed $x_i$) is the product of normal densities evaluated at each $x_i$:

$$L(\mu) = p(x_1, \ldots, x_n \mid \mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{1}{2\sigma^2} (x_i - \mu)^2 \right).$$

The maximum likelihood estimator, $\hat{\mu}$, is $\arg\max_\mu \log L(\mu)$, i.e. the value of $\mu$ that maximizes the logarithm of the likelihood function given above. Taking the log of the likelihood, we obtain

$$\log L(\mu) = \ell(\mu) = -\frac{n}{2} \log(2\pi) - \frac{n}{2} \log \sigma^2 - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2.$$

Taking the derivative with respect to $\mu$ and setting it equal to zero at the maximum $\hat{\mu}$, we obtain

$$\hat{\mu} = \sum_i x_i / n = \bar{x}.$$

If desired, confidence intervals may be obtained on ....
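The derivation above can be checked numerically: the log-likelihood $\ell(\mu)$ should attain its maximum at the sample mean $\bar{x}$. The sketch below, with an illustrative data set and a known $\sigma$ that are assumptions of this example (not taken from the notes), evaluates $\ell(\mu)$ on a grid and compares the grid maximizer with the closed-form MLE.

```python
import numpy as np

# Illustrative setup (assumed for this sketch): known sigma, simulated data.
rng = np.random.default_rng(0)
sigma = 2.0                                    # known standard deviation
x = rng.normal(loc=5.0, scale=sigma, size=50)  # observed samples x_1, ..., x_n
n = len(x)

def log_likelihood(mu):
    # l(mu) = -n/2 log(2 pi) - n/2 log sigma^2 - (1 / (2 sigma^2)) sum_i (x_i - mu)^2
    return (-n / 2 * np.log(2 * np.pi)
            - n / 2 * np.log(sigma ** 2)
            - np.sum((x - mu) ** 2) / (2 * sigma ** 2))

# Closed-form MLE from the derivation: the sample mean.
mu_hat = x.mean()

# Grid search over candidate values of mu; its maximizer should agree
# with the closed form up to the grid resolution.
grid = np.linspace(x.min(), x.max(), 10001)
mu_grid = grid[np.argmax([log_likelihood(m) for m in grid])]

print(mu_hat, mu_grid)
```

Because $\ell(\mu)$ is a concave quadratic in $\mu$, the grid maximizer and the sample mean coincide up to the grid spacing, which is a quick sanity check on the algebra.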