

Example: A coin shows heads with probability $p$ each time it is flipped. A student flips the coin $n$ times for some large value of $n$ (for example, $n = 1000$ is reasonable). Heads shows on $k$ of the flips. Find the ML estimate of $p$.

Solution: Let $X$ be the number of times heads shows in $n$ flips of the coin. It has the binomial distribution with parameters $n$ and $p$. Therefore, the likelihood that $X = k$ is
$$p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}.$$
Here $n$ is known (the number of times the coin is flipped) and $k$ is known (the number of times heads shows). Therefore, the maximum likelihood estimate, $p_{ML}$, is the value of $p$ that maximizes $\binom{n}{k} p^k (1-p)^{n-k}$. Equivalently, $p_{ML}$ is the value of $p$ that maximizes $p^k(1-p)^{n-k}$. First, assume that $1 \leq k \leq n-1$. Then
$$\frac{d\left(p^k(1-p)^{n-k}\right)}{dp} = \left(\frac{k}{p} - \frac{n-k}{1-p}\right) p^k (1-p)^{n-k} = (k - np)\, p^{k-1}(1-p)^{n-k-1}.$$
This derivative is positive if $p \leq \frac{k}{n}$ and negative if $p \geq \frac{k}{n}$. Therefore, the likelihood is maximized at $p_{ML} = \frac{k}{n}$.
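The closed-form answer $p_{ML} = k/n$ can be checked numerically: a minimal sketch that simulates flips and compares $k/n$ against a brute-force grid search over the log-likelihood (the values of $n$ and the true heads probability here are illustrative, not from the text):

```python
import math
import random

def log_likelihood(p, n, k):
    """Binomial log-likelihood in p, up to the constant log C(n, k)."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

# Illustrative setup: flip a coin with an assumed true bias 1000 times.
random.seed(0)
n, true_p = 1000, 0.3
k = sum(random.random() < true_p for _ in range(n))

# Closed-form ML estimate derived above.
p_ml = k / n

# Grid search over p in (0, 1); its maximizer should agree with k/n.
grid = [i / 10000 for i in range(1, 10000)]
p_grid = max(grid, key=lambda p: log_likelihood(p, n, k))

print(p_ml, p_grid)
```

Maximizing the log-likelihood rather than the likelihood itself avoids underflow for large $n$ and, since $\log$ is increasing, yields the same maximizer.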

This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Institute of Technology.
