week3-notes

BAYESIAN ANALYSIS: Week 3 and 4 - Poisson Distribution
July 3rd and 17th, 2008
Notes: D. Ricard. Bayesian Group: W. Blanchard, D. Ricard, D. P. Tittensor, C. Minto, T. Davies, S. Anderson

This week we are interested in the Poisson distribution. We first look at the probability density function and discuss some of its properties. We then compute the likelihood associated with the Poisson distribution and work out why the gamma distribution is its conjugate prior. Finally, we discuss how the negative binomial distribution arises as a gamma mixture of Poisson distributions. R code for the example is included in the Appendix.

1 Poisson distribution

The probability density function for a single observation of the Poisson distribution is:

P(x \mid \lambda) = \frac{\lambda^{x} e^{-\lambda}}{x!}    (1)

Assuming that a series of observations x_1, \ldots, x_n are sampled from P(x \mid \lambda) and that each observation is independent and identically distributed (i.i.d.), the joint probability density function for x_1, \ldots, x_n is the product of the individual pdfs:

P(x_1, \ldots, x_n \mid \lambda) = \prod_{i=1}^{n} P(x_i \mid \lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} = \frac{\lambda^{\sum_{i=1}^{n} x_i} e^{-n\lambda}}{\prod_{i=1}^{n} x_i!}    (2)

The likelihood function for \lambda is proportional to the joint pdf and only needs the terms that involve \lambda (i.e. we can drop the denominator of equation 2):

L(\lambda \mid x_1, \ldots, x_n) \propto P(x_1, \ldots, x_n \mid \lambda) \propto \lambda^{\sum_{i=1}^{n} x_i} e^{-n\lambda}    (3)
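The Appendix R code is not included in this preview, so the following is only a minimal illustrative sketch of how the likelihood in equation 3 can be evaluated in R. The simulated sample x, the rate lambda.true, and the helper poisson.loglik() are assumptions made here for illustration, not part of the original notes.

## Minimal sketch (not the Appendix code): evaluate the Poisson
## log-likelihood of equation 3 on simulated data.
set.seed(42)
lambda.true <- 3.5                          # assumed rate, for illustration only
x <- rpois(n = 50, lambda = lambda.true)    # simulated i.i.d. Poisson sample

## log of equation 3: sum(x) * log(lambda) - n * lambda
## (the term involving the x_i! does not depend on lambda and is dropped)
poisson.loglik <- function(lambda, x) {
  sum(x) * log(lambda) - length(x) * lambda
}

## evaluate over a grid of candidate lambda values and plot the curve
lambda.grid <- seq(0.5, 8, by = 0.01)
ll <- sapply(lambda.grid, poisson.loglik, x = x)
plot(lambda.grid, ll, type = "l",
     xlab = expression(lambda), ylab = "log-likelihood")

The curve peaks near the sample mean of x, which anticipates the maximum likelihood estimate derived below.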
The maximum likelihood estimate of \lambda (called \hat{\lambda} hereafter) can be found by first taking the log of the likelihood:

\log(L(\lambda \mid x_1, \ldots, x_n)) = l(\lambda \mid x_1, \ldots, x_n) \propto \sum_{i=1}^{n} x_i \log(\lambda) - \lambda n    (4)

and then taking the derivative with respect to \lambda:

\frac{dl}{d\lambda} = \sum_{i=1}^{n} x_i \frac{1}{\lambda} - n    (5)

Setting the derivative equal to 0 and solving for \lambda gives:

\sum_{i=1}^{n} x_i \frac{1}{\lambda} - n = 0 \quad \Rightarrow \quad \hat{\lambda} = \frac{\sum_{i=1}^{n} x_i}{n}    (6)

To ascertain that this is indeed the maximum likelihood estimate, we also take the second-order derivative, \frac{d^2 l}{d\lambda^2} = -\sum_{i=1}^{n} x_i \frac{1}{\lambda^2}, which is negative for all \lambda > 0 (provided at least one x_i > 0). The log-likelihood is therefore concave and \hat{\lambda} is the global maximum.
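As a quick numerical check (again an illustrative sketch rather than the notes' Appendix code), equation 6 says the MLE is simply the sample mean, so maximising the log-likelihood with R's optimize() should agree with mean(x). The data, search interval, and helper function below are assumptions made for this example.

## Minimal sketch: verify numerically that the maximiser of the Poisson
## log-likelihood coincides with the sample mean (equation 6).
set.seed(42)
x <- rpois(n = 50, lambda = 3.5)    # simulated i.i.d. Poisson sample (illustrative)

## log-likelihood of equation 3, with terms constant in lambda dropped
poisson.loglik <- function(lambda, x) sum(x) * log(lambda) - length(x) * lambda

## numerical maximisation over a plausible range for lambda
fit <- optimize(poisson.loglik, interval = c(0.01, 20), x = x, maximum = TRUE)

fit$maximum    # numerical MLE of lambda
mean(x)        # closed-form MLE: lambda-hat = sum(x_i) / n

The two values agree up to the tolerance of optimize(), as expected from the concavity argument above, which guarantees a unique maximum.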