Section 7.6

We had a brief introduction/review of random samples (iid rvs), estimators, unbiased estimators, and consistent estimators. We now discuss a method of finding estimators which possess good properties.

1. Maximum Likelihood Estimation

This method yields estimators that have many desirable properties, both finite-sample and large-sample properties. The basic idea is to find an estimator \(\hat{\theta}(x)\) which is the most likely given the data \(X = (X_1, \ldots, X_n)\).

Example 1. Let \(X \sim B(1, \theta)\), a Bernoulli population with density

\[ P(X = x \mid \theta) = p(x \mid \theta) = \theta^{x}(1-\theta)^{1-x}, \quad x = 0, 1, \]

where \(\theta = P(X = 1)\). We want to estimate the population proportion \(\theta\) based on a random sample of size \(n\). That is, \(X_1, \ldots, X_n\) are independent and identically distributed random variables.

For \(x_i \in \{0, 1\}\), we have

\[
P[X_1 = x_1, \ldots, X_n = x_n] = P[X_1 = x_1] \cdots P[X_n = x_n]
= \theta^{x_1}(1-\theta)^{1-x_1} \cdots \theta^{x_n}(1-\theta)^{1-x_n}
= \theta^{\sum_{i=1}^{n} x_i}\,(1-\theta)^{\,n - \sum_{i=1}^{n} x_i},
\]

which is called the joint density of ...
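As a small illustration (not part of the original notes), the following Python sketch evaluates the joint density above at a few candidate values of theta; the sample xs and the grid of theta values are made up for demonstration. The computed values are largest near the sample proportion, which previews where the maximum likelihood estimate will land.

    # Minimal sketch: evaluate the joint Bernoulli density for candidate theta.
    # The sample xs and the theta grid below are hypothetical.
    def joint_density(xs, theta):
        # theta^(sum x_i) * (1 - theta)^(n - sum x_i), for x_i in {0, 1}
        s = sum(xs)   # number of ones in the sample
        n = len(xs)   # sample size
        return theta ** s * (1 - theta) ** (n - s)

    xs = [1, 0, 1, 1, 0]  # hypothetical observed sample: sum = 3, n = 5
    for theta in (0.2, 0.4, 0.6, 0.8):
        print(theta, joint_density(xs, theta))
    # The printed values peak at theta = 0.6 = 3/5, the sample proportion:
    # picking the theta that makes the observed sample most likely is the
    # maximum likelihood idea developed in this section.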