Chapter 3

§ 3 Likelihood and Sufficiency

§ 3.1 Introduction

3.1.1 Sample data: $X = (X_1, \ldots, X_n)$.

Postulated parametric family of probability functions (i.e. either mass functions or pdf's):
$$\{\, p(\cdot \mid \theta) : \theta \in \Theta \,\}.$$

e.g. —

• $X_1, \ldots, X_n$ are iid Poisson($\theta$):
$$p(x_1, \ldots, x_n \mid \theta) = \prod_{i=1}^{n} P(X_i = x_i \mid \theta) = \prod_{i=1}^{n} \frac{e^{-\theta}\,\theta^{x_i}}{x_i!}, \qquad \theta \in (0, \infty).$$

• $X_1, \ldots, X_n$ are iid $N(\mu, \sigma^2)$:
$$p(x_1, \ldots, x_n \mid \theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{(x_i - \mu)^2}{2\sigma^2} \right\}, \qquad \theta = (\mu, \sigma) \in (-\infty, \infty) \times (0, \infty).$$

If we observe $X = x = (x_1, \ldots, x_n)$, what can we learn from $x$ about the true value of $\theta$? We wish to answer this question by means of statistical inference.

3.1.2 Raw sample data, i.e. $x$, contain information relevant to our inference about $\theta$. We want to extract such information completely yet "economically". This amounts to finding an efficient way to "summarize" the data.

Answer: the likelihood function — a mathematical device summarizing all information available in $x$ which is relevant to $\theta$. It measures the plausibility of each $\theta \in \Theta$ being the true $\theta$ that gives rise to $x$.

3.1.3 Suppose the random vector $X$ has a probability function belonging to the parametric family $\{\, p(\cdot \mid \theta) : \theta \in \Theta \,\}$.

Definition. Given that $X$ is observed (realised) to be $x$, the likelihood function of $\theta$ is defined to be
$$\ell_x(\theta) = p(x \mid \theta),$$
i.e. the probability function of $X$, evaluated at $X = x$, but considered as a function of $\theta$.

The loglikelihood function is
$$S_x(\theta) = \ln \ell_x(\theta),$$
which gives equivalent information but is more convenient to work with than $\ell_x(\theta)$.

3.1.4 Common special case — $X$ iid: $X$ is a random sample, i.e. $X = (X_1, \ldots, X_n)$ are iid with each $X_i \sim$ probability function $f(x \mid \theta)$. Then the (joint) probability function of $X$ is
$$p(x \mid \theta) = \prod_{i=1}^{n} f(x_i \mid \theta)$$
and so
$$\ell_x(\theta) = p(x \mid \theta) = \prod_{i=1}^{n} f(x_i \mid \theta),$$
where $x = (x_1, \ldots, x_n)$ is the realisation of $X$.

3.1.5 When examining a likelihood function $\ell_x(\theta)$, we are mainly interested in the relative likelihoods of different values of $\theta$; the actual magnitude of the likelihood function itself is unimportant. Without loss of information we may therefore ignore multiplicative factors in the likelihood function that do not depend on $\theta$. For example, if $(X_1, \ldots, X_n) \overset{\text{iid}}{\sim}$ Poisson($\theta$), we may take the likelihood function to be
$$\ell_x(\theta) = e^{-n\theta}\, \theta^{\sum_i x_i},$$
omitting the factor $\left( \prod_i x_i! \right)^{-1}$.

3.1.6 Example 3.1.1. There are two decks of playing cards:

• deck A — conventional, has 52 cards;
• deck B — has 13 cards of the same suit (♥).

Your friend draws two cards randomly without replacement from the same deck and gets an ace and a king of hearts. Without asking your friend, which deck do you think your friend is more likely to have drawn the cards from?
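As a supplement to §§ 3.1.3–3.1.4, here is a minimal numerical sketch (not part of the original notes) that evaluates the Poisson likelihood and loglikelihood of § 3.1.1 at a few candidate values of $\theta$. The sample x and the candidate values of theta below are illustrative assumptions, not data from the notes.

```python
import math

# Illustrative sample (assumed, not from the notes): n = 5 iid Poisson counts.
x = [2, 0, 3, 1, 2]

def poisson_likelihood(theta, data):
    """l_x(theta) = prod_i e^(-theta) * theta^x_i / x_i!  (the product form of Section 3.1.4)."""
    l = 1.0
    for xi in data:
        l *= math.exp(-theta) * theta**xi / math.factorial(xi)
    return l

def poisson_loglikelihood(theta, data):
    """S_x(theta) = ln l_x(theta) = -n*theta + (sum_i x_i)*ln(theta) - sum_i ln(x_i!)."""
    n = len(data)
    return (-n * theta
            + sum(data) * math.log(theta)
            - sum(math.lgamma(xi + 1) for xi in data))  # lgamma(k+1) = ln(k!)

# Compare the plausibility of a few candidate values of theta.
for theta in (0.5, 1.0, 1.6, 2.5):
    print(theta, poisson_likelihood(theta, x), poisson_loglikelihood(theta, x))
```

For this particular sample the likelihood peaks near the sample mean $\bar{x} = 1.6$, which previews the idea of maximum-likelihood estimation.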
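§ 3.1.5 claims that multiplicative factors free of $\theta$ can be dropped without loss of information. A minimal numerical check of this, reusing the same illustrative (assumed) sample as above: the ratio $\ell_x(\theta_1)/\ell_x(\theta_2)$ comes out identical whether or not the factor $(\prod_i x_i!)^{-1}$ is retained.

```python
import math

x = [2, 0, 3, 1, 2]          # same illustrative sample as above
n, s = len(x), sum(x)

def full_likelihood(theta):
    """Full Poisson likelihood, including the theta-free factor 1 / prod_i x_i!."""
    c = 1.0 / math.prod(math.factorial(xi) for xi in x)
    return c * math.exp(-n * theta) * theta**s

def reduced_likelihood(theta):
    """The reduced form of Section 3.1.5: l_x(theta) = e^(-n*theta) * theta^(sum_i x_i)."""
    return math.exp(-n * theta) * theta**s

t1, t2 = 1.0, 2.0
print(full_likelihood(t1) / full_likelihood(t2))        # ratio with the constant factor
print(reduced_likelihood(t1) / reduced_likelihood(t2))  # identical ratio without it
```

The constant cancels in every ratio, which is exactly why relative likelihoods — the quantity of interest — are unaffected.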

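The preview cuts off before the solution to Example 3.1.1, but the likelihood machinery above suggests one way to frame it. Below is a sketch under the assumption that the observed hand consists of two specific cards (the A♥ and K♥), so that each deck's likelihood is simply the probability of drawing exactly those two cards without replacement.

```python
from fractions import Fraction

def prob_two_specific_cards(deck_size):
    """P(two draws without replacement yield two specific cards, in either order)."""
    return Fraction(2, deck_size * (deck_size - 1))

l_A = prob_two_specific_cards(52)   # likelihood of deck A given the observed hand: 1/1326
l_B = prob_two_specific_cards(13)   # likelihood of deck B given the observed hand: 1/78
print(l_A, l_B, l_B / l_A)          # ratio l_B / l_A = 17
```

Under this reading, $\ell(\text{deck B})/\ell(\text{deck A}) = 1326/78 = 17$: the observed hand is 17 times more probable under deck B, so deck B is the more plausible source — an informal likelihood-ratio comparison of the kind § 3.1.5 describes.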
