Lecture Stat 302 Introduction to Probability - Slides 12
AD, March 2010
Hypergeometric Random Variable

Consider a barrel or urn containing N balls, of which m are white and N - m are black. We take a simple random sample (i.e. without replacement) of size n and measure X, the number of white balls in the sample. The Hypergeometric distribution is the distribution of X under this sampling scheme, and

P(X = i) = \binom{m}{i} \binom{N-m}{n-i} \Big/ \binom{N}{n}.
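A minimal sketch (not part of the original slides) of this PMF in Python, using only the standard library; the function name hypergeom_pmf and the example numbers are illustrative assumptions.

```python
# Minimal sketch (not from the slides): the hypergeometric PMF written
# directly from the formula above, using only the Python standard library.
from math import comb

def hypergeom_pmf(i, N, m, n):
    """P(X = i): probability of drawing i white balls in a sample of size n,
    taken without replacement from N balls of which m are white."""
    return comb(m, i) * comb(N - m, n - i) / comb(N, n)

# Illustrative numbers: N = 10 balls, m = 4 white, sample of n = 3.
print(hypergeom_pmf(2, N=10, m=4, n=3))  # 0.3
```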
Example: Survey sampling

Suppose that as part of a survey, 7 houses are sampled at random from a street of 40 houses in which 5 contain families whose family income puts them below the poverty line. What is the probability that: (a) none of the 5 families are sampled? (b) 4 of them are sampled? (c) no more than 2 are sampled? (d) at least 3 are sampled?

Let X be the number of sampled families which are below the poverty line. It follows a hypergeometric distribution with N = 40, m = 5 and n = 7. So we want (a) P(X = 0), (b) P(X = 4), (c) P(X \leq 2) and (d) P(X \geq 3).
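A quick numerical check (not in the slides), reusing the PMF formula above with N = 40, m = 5, n = 7; the helper name pmf is an assumption.

```python
# Numerical check of (a)-(d), with N = 40, m = 5, n = 7 as in the example.
from math import comb

def pmf(i, N=40, m=5, n=7):
    return comb(m, i) * comb(N - m, n - i) / comb(N, n)

p_a = pmf(0)                         # (a) P(X = 0)
p_b = pmf(4)                         # (b) P(X = 4)
p_c = sum(pmf(i) for i in range(3))  # (c) P(X <= 2) = P(X=0) + P(X=1) + P(X=2)
p_d = 1 - p_c                        # (d) P(X >= 3) = 1 - P(X <= 2)
print(p_a, p_b, p_c, p_d)
```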
Mean and Variance of the Hypergeometric Distribution

Let us introduce p = m / N; then

E(X) = np,  \qquad  \mathrm{Var}(X) = np(1 - p)\left(1 - \frac{n - 1}{N - 1}\right).

Suppose that m and N - m are both very large compared to n. It then seems reasonable that sampling without replacement is not too different from sampling with replacement. It can indeed be shown that the hypergeometric distribution can be well approximated by the binomial distribution with parameters n and p = m / N.
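An illustrative comparison (not from the slides; the values of N, m and n below are arbitrary assumptions chosen so that N is much larger than n) of the exact hypergeometric probabilities against the Binomial(n, p = m/N) approximation.

```python
# Comparison of exact hypergeometric probabilities with the Binomial(n, m/N)
# approximation; N, m, n below are hypothetical, chosen so that N >> n.
from math import comb

N, m, n = 10_000, 3_000, 10   # hypothetical values; p = m / N = 0.3
p = m / N

for i in range(n + 1):
    hyper = comb(m, i) * comb(N - m, n - i) / comb(N, n)
    binom = comb(n, i) * p**i * (1 - p)**(n - i)
    print(i, round(hyper, 5), round(binom, 5))
```

With these values the two columns agree to a few decimal places, which is the sense in which the approximation is "good" when n is small relative to m and N - m.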
Example: Capture-Recapture Experiments

We are interested in estimating the population N of animals inhabiting a certain region. To achieve this, capture-recapture studies proceed as follows. First, you capture m individuals, mark them and release them back into the wild. A few days later, you capture say n animals; among these n animals some are marked and some are not. Let X be the number of marked animals among the n recaptured; then X follows a hypergeometric distribution of parameters N, m and n.

Assume you have recaptured X = x marked animals; then you can estimate N by maximizing with respect to N the probability

P(X = x) = \binom{m}{x} \binom{N - m}{n - x} \Big/ \binom{N}{n}.

This is known as the Maximum Likelihood (ML) estimate of N. One can show that the ML estimate is the largest integer value not exceeding mn / x, i.e. \hat{N} = \lfloor mn / x \rfloor.
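A small sketch (not part of the slides; the counts m, n and x below are made up) that maximizes P(X = x) over N by brute force and checks the result against the closed form \lfloor mn / x \rfloor.

```python
# Brute-force ML estimate of N, checked against floor(m*n/x); the counts
# m, n, x below are hypothetical, for illustration only.
from math import comb, floor

def likelihood(N, m, n, x):
    """P(X = x) viewed as a function of the unknown population size N."""
    if N < m or N - m < n - x:
        return 0.0
    return comb(m, x) * comb(N - m, n - x) / comb(N, n)

m, n, x = 100, 50, 7           # hypothetical capture-recapture counts
N_ml = max(range(m + n, 3000), key=lambda N: likelihood(N, m, n, x))
print(N_ml, floor(m * n / x))  # both print 714
```

The brute-force search and the closed form agree, as the slide's stated result predicts.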
Suppose that we have a lake containing N fish, where N is unknown. We capture and mark m fish, release them, and a few days later capture n fish, of which X are marked. What is the ML estimate of N if X = 35 and if X = 5? If X = 35, then the ML estimate of N is \hat{N} = \lfloor mn / 35 \rfloor = 142.