

# Stat400Lec17(Ch6.1) - STAT 400 p.m.f. or p.d.f. (Chapter 6.1)


STAT 400 (Chapter 6.1) Spring 2012

A p.m.f. or p.d.f. $f(x; \theta)$, $\theta \in \Omega$, where $\Omega$ is the parameter space.

1. Suppose $\Omega = \{1, 2, 3\}$ and the p.m.f. $f(x; \theta)$ is

   $\theta = 1$: $f(1; 1) = 0.6$, $f(2; 1) = 0.1$, $f(3; 1) = 0.1$, $f(4; 1) = 0.2$.
   $\theta = 2$: $f(1; 2) = 0.2$, $f(2; 2) = 0.3$, $f(3; 2) = 0.3$, $f(4; 2) = 0.2$.
   $\theta = 3$: $f(1; 3) = 0.3$, $f(2; 3) = 0.4$, $f(3; 3) = 0.2$, $f(4; 3) = 0.1$.

   What is the maximum likelihood estimate of $\theta$ (based on only one observation of $X$) if
   a) $X = 1$; b) $X = 2$; c) $X = 3$; d) $X = 4$?

Likelihood function:

$$L(\theta) = L(\theta; x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f(x_i; \theta) = f(x_1; \theta) \cdots f(x_n; \theta)$$

It is often easier to consider $\ln L(\theta) = \sum_{i=1}^{n} \ln f(x_i; \theta)$. The maximum likelihood estimator (MLE) $\hat{\theta}$ is the value of $\theta$ that maximizes the (log-)likelihood function.
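With a finite parameter space like the one in problem 1, maximizing the likelihood from a single observation amounts to scanning the table: pick the $\theta$ whose row assigns the observed $x$ the largest probability. A minimal sketch (the table encodes the p.m.f. above; the variable and function names are my own):

```python
# The p.m.f. table from problem 1: one row per theta in Omega = {1, 2, 3},
# mapping each possible observation x in {1, 2, 3, 4} to f(x; theta).
PMF_TABLE = {
    1: {1: 0.6, 2: 0.1, 3: 0.1, 4: 0.2},
    2: {1: 0.2, 2: 0.3, 3: 0.3, 4: 0.2},
    3: {1: 0.3, 2: 0.4, 3: 0.2, 4: 0.1},
}

def mle_single_observation(x):
    """Return a theta in Omega maximizing the likelihood f(x; theta).

    Note: the maximizer need not be unique (e.g. x = 4 gives a tie,
    f(4; 1) = f(4; 2) = 0.2); max() returns the first maximizer found.
    """
    return max(PMF_TABLE, key=lambda theta: PMF_TABLE[theta][x])

# Parts a)-d): one estimate per observed value of X.
for x in (1, 2, 3, 4):
    print(f"X = {x}: MLE of theta is {mle_single_observation(x)}")
```

The tie at $x = 4$ is worth noticing: the likelihood function can have more than one maximizer, in which case the MLE is not unique.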

1½. Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from a Poisson distribution with mean $\lambda$, $\lambda > 0$.

a) Obtain the maximum likelihood estimator of $\lambda$, $\hat{\lambda}$.

The Invariance Principle: let $\hat{\theta}$ be the maximum likelihood estimate (m.l.e.) of $\theta$. Then the MLE of any function $h(\theta)$ is $h(\hat{\theta})$.

b) Obtain the maximum likelihood estimator of $P(X = 2)$.
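For problem 1½, the standard derivation sets the derivative of $\ln L(\lambda) = -n\lambda + \left(\sum x_i\right)\ln\lambda - \sum \ln(x_i!)$ to zero, giving $\hat{\lambda} = \bar{x}$; the invariance principle then yields the MLE of $P(X = 2) = e^{-\lambda}\lambda^2/2!$ by evaluating it at $\hat{\lambda}$. A minimal sketch (the sample data is hypothetical, for illustration only):

```python
import math

def poisson_mle(sample):
    """MLE of the Poisson mean: solving d/dlam ln L(lam) = -n + sum(x)/lam = 0
    gives lam_hat = sample mean."""
    return sum(sample) / len(sample)

def mle_p_x_equals_2(sample):
    """By the invariance principle, the MLE of P(X = 2) = e^{-lam} lam^2 / 2!
    is that same function evaluated at lam_hat."""
    lam_hat = poisson_mle(sample)
    return math.exp(-lam_hat) * lam_hat ** 2 / 2

# Hypothetical sample of n = 5 Poisson counts:
sample = [2, 1, 3, 0, 2]
print(poisson_mle(sample))       # sample mean 8/5 = 1.6
print(mle_p_x_equals_2(sample))  # exp(-1.6) * 1.6**2 / 2
```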