$Y_i Y_j$ with $i \neq j$. Hence

$$\operatorname{Var} X = E(X^2) - (E X)^2 = np + (n^2 - n)p^2 - (np)^2 = np(1-p).$$

Later we will see that the variance of a sum of independent r.v.'s is the sum of the variances, so we could quickly get $\operatorname{Var} X = np(1-p)$. Alternatively, one can compute $E(X^2) - E X = E(X(X-1))$ using binomial coefficients and derive the variance of $X$ from that.

Poisson. $X$ is Poisson with parameter $\lambda$ if

$$P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}.$$

Note $\sum_{i=0}^\infty \lambda^i / i! = e^\lambda$, so the probabilities add up to one. To compute expectations,

$$E X = \sum_{i=0}^\infty i \, e^{-\lambda} \frac{\lambda^i}{i!} = e^{-\lambda} \lambda \sum_{i=1}^\infty \frac{\lambda^{i-1}}{(i-1)!} = \lambda.$$

Similarly one can show that

$$E(X^2) - E X = E(X(X-1)) = \sum_{i=0}^\infty i(i-1) \, e^{-\lambda} \frac{\lambda^i}{i!} = \lambda^2 e^{-\lambda} \sum_{i=2}^\infty \frac{\lambda^{i-2}}{(i-2)!} = \lambda^2,$$

so $\operatorname{Var} X = E(X(X-1)) + E X - (E X)^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$.
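Both derivations can be checked numerically by summing directly over the pmf. A minimal sketch (the parameters n = 10, p = 0.3, λ = 2.5 are arbitrary choices for illustration, not from the notes):

```python
from math import comb, exp, factorial

# Binomial(n, p): compute E X and E[X(X-1)] by summing over the pmf,
# then recover Var X = E[X(X-1)] + E X - (E X)^2.
n, p = 10, 0.3  # assumed example parameters
pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
EX = sum(i * q for i, q in enumerate(pmf))
EXX1 = sum(i * (i - 1) * q for i, q in enumerate(pmf))
var = EXX1 + EX - EX**2
assert abs(var - n * p * (1 - p)) < 1e-12  # matches np(1-p)

# Poisson(lam): the series is infinite, so truncate where the tail
# is negligible; E X should be lam and E[X(X-1)] should be lam^2.
lam = 2.5  # assumed example parameter
N = 100
ppmf = [exp(-lam) * lam**i / factorial(i) for i in range(N)]
EX_p = sum(i * q for i, q in enumerate(ppmf))
EXX1_p = sum(i * (i - 1) * q for i, q in enumerate(ppmf))
assert abs(EX_p - lam) < 1e-9       # E X = lambda
assert abs(EXX1_p - lam**2) < 1e-9  # E[X(X-1)] = lambda^2
```

The same `E X(X-1)` trick works for both distributions because the factorial factor $i(i-1)$ cancels the first two terms of $i!$, collapsing the sum back into the exponential (or binomial) series.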
 Spring '09
 wen