# elemprob-fall2010-page16


To get the variance of $X$, we have

$$E X^2 = \sum_{k=1}^n E Y_k^2 + \sum_{i \neq j} E Y_i Y_j.$$

Now

$$E Y_i Y_j = 1 \cdot P(Y_i Y_j = 1) + 0 \cdot P(Y_i Y_j = 0) = P(Y_i = 1, Y_j = 1) = P(Y_i = 1)\, P(Y_j = 1) = p^2,$$

using independence. Squaring $Y_1 + \cdots + Y_n$ yields $n^2$ terms, of which $n$ are of the form $Y_k^2$, so there are $n^2 - n$ terms of the form $Y_i Y_j$ with $i \neq j$. Since each $Y_k^2 = Y_k$ has expectation $p$, we get

$$\operatorname{Var} X = E X^2 - (E X)^2 = np + (n^2 - n)p^2 - (np)^2 = np(1 - p).$$

Later we will see that the variance of a sum of independent random variables is the sum of the variances, which gives $\operatorname{Var} X = np(1-p)$ immediately. Alternatively, one can compute $E(X^2) - E X = E(X(X-1))$ using binomial coefficients and derive the variance of $X$ from that.

## 6 Poisson distributions
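The closed forms $E X = np$ and $\operatorname{Var} X = np(1-p)$ can be checked numerically against the binomial pmf itself. A minimal sketch in Python (the function name `binomial_moments` and the sample values of $n$ and $p$ are illustrative, not from the notes):

```python
import math

def binomial_moments(n, p):
    """Compute E[X] and Var(X) for X ~ Binomial(n, p) directly from
    the pmf P(X = k) = C(n, k) p^k (1 - p)^(n - k)."""
    ex = ex2 = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p)**(n - k)
        ex += k * pk        # accumulate E[X]
        ex2 += k * k * pk   # accumulate E[X^2]
    return ex, ex2 - ex**2  # (mean, variance = E[X^2] - (E[X])^2)

n, p = 10, 0.3
mean, var = binomial_moments(n, p)
# mean should match n*p and var should match n*p*(1-p)
```

Summing over the pmf mirrors the derivation above: the mean lands on $np$ and the variance on $np(1-p)$ up to floating-point error.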

$X$ has a Poisson distribution with parameter $\lambda$ if

$$P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}.$$

Note $\sum_{i=0}^\infty \lambda^i / i! = e^\lambda$, so the probabilities add up to one. To compute expectations,

$$E X = \sum_{i=0}^\infty i \, e^{-\lambda} \frac{\lambda^i}{i!} = e^{-\lambda} \lambda \sum_{i=1}^\infty \frac{\lambda^{i-1}}{(i-1)!} = \lambda.$$

Similarly one can show that

$$E(X^2) - E X = E X(X-1) = \sum_{i=0}^\infty i(i-1)\, e^{-\lambda} \frac{\lambda^i}{i!} = \lambda^2 e^{-\lambda} \sum_{i=2}^\infty \frac{\lambda^{i-2}}{(i-2)!} = \lambda^2,$$

and hence $\operatorname{Var} X = E(X^2) - (E X)^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$.

This note was uploaded on 12/29/2011 for the course MATH 316 taught by Professor Ansan during the Spring '10 term at SUNY Stony Brook.
