Proposition 11.2. If $X$ and $Y$ are independent, then $\mathrm{Var}(X+Y) = \mathrm{Var}\,X + \mathrm{Var}\,Y$.

Proof. Since independence gives $\mathrm{Cov}(X,Y) = 0$, we have
$$\mathrm{Var}(X+Y) = \mathrm{Var}\,X + \mathrm{Var}\,Y + 2\,\mathrm{Cov}(X,Y) = \mathrm{Var}\,X + \mathrm{Var}\,Y.$$

Since a binomial is the sum of $n$ independent Bernoullis, its variance is $np(1-p)$.

If we write $\overline{X} = \sum_{i=1}^n X_i/n$, where the $X_i$ are independent and have the same distribution ($\overline{X}$ is called the sample mean), then $E\,\overline{X} = E\,X_1$ and $\mathrm{Var}\,\overline{X} = \mathrm{Var}\,X_1/n$.
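As a quick numerical check (a minimal Python sketch added here, not part of the original notes; the values of $n$, $p$, and the trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 0.3
trials = 200_000

# Variance of a Binomial(n, p) sample: should be close to n*p*(1-p).
binom = rng.binomial(n, p, size=trials)
print(binom.var(), n * p * (1 - p))

# Variance of the sample mean of n i.i.d. Bernoulli(p)'s:
# should be close to Var(X_1)/n = p*(1-p)/n.
bern = rng.binomial(1, p, size=(trials, n))
print(bern.mean(axis=1).var(), p * (1 - p) / n)
```

Both printed pairs should agree to two or three decimal places.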
We define the conditional expectation of $X$ given $Y$ by
$$E[X \mid Y=y] = \int x f_{X \mid Y=y}(x)\,dx.$$
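For a concrete illustration (the joint density used here is an assumption chosen for this example, not taken from the notes), suppose $X$ and $Y$ have joint density $f(x,y) = x + y$ on the unit square. A short symbolic computation of $E[X \mid Y=y]$, sketched with sympy:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x + y                                # assumed joint density on [0,1]^2
f_Y = sp.integrate(f, (x, 0, 1))         # marginal of Y: y + 1/2
f_cond = f / f_Y                         # conditional density f_{X|Y=y}(x)
E_cond = sp.simplify(sp.integrate(x * f_cond, (x, 0, 1)))
print(E_cond)                            # equivalent to (3*y + 2)/(3*(2*y + 1))
```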
12. Moment generating functions.

We define the moment generating function $m_X$ by $m_X(t) = E\,e^{tX}$, provided this is finite. In the discrete case this is equal to $\sum_x e^{tx} p(x)$, and in the continuous case $\int e^{tx} f(x)\,dx$.

Let us compute the moment generating function for some of the distributions we have been working with.

1. Bernoulli: $pe^t + (1-p)$.

2. Binomial: using independence,
$$E\,e^{t\sum_i X_i} = E \prod_i e^{tX_i} = \prod_i E\,e^{tX_i} = (pe^t + (1-p))^n,$$
where the $X_i$ are independent Bernoullis.
3. Poisson:
$$E\,e^{tX} = \sum_{k=0}^\infty e^{tk}\,\frac{e^{-\lambda}\lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^\infty \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}.$$

4. Exponential:
$$E\,e^{tX} = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - t} \text{ if } t < \lambda, \text{ and } \infty \text{ if } t \geq \lambda.$$

5. $N(0,1)$:
$$\frac{1}{\sqrt{2\pi}} \int e^{tx} e^{-x^2/2}\,dx = e^{t^2/2}\,\frac{1}{\sqrt{2\pi}} \int e^{-(x-t)^2/2}\,dx = e^{t^2/2}.$$

6. $N(\mu, \sigma^2)$: Write $X = \mu + \sigma Z$ with $Z$ standard normal. Then
$$E\,e^{tX} = E\,e^{t\mu} e^{t\sigma Z} = e^{t\mu} e^{(t\sigma)^2/2} = e^{t\mu + t^2\sigma^2/2}.$$

Proposition 12.1. If $X$ and $Y$ are independent, then $m_{X+Y}(t) = m_X(t)\,m_Y(t)$.

Proof. By independence and Proposition 11.1,
$$m_{X+Y}(t) = E\,e^{tX} e^{tY} = E\,e^{tX}\,E\,e^{tY} = m_X(t)\,m_Y(t).$$
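These closed forms are easy to spot-check by simulation (another added sketch, not part of the notes; the values of $t$, $\lambda$, $\mu$, $\sigma$ are arbitrary, chosen with $t < \lambda$):

```python
import numpy as np

rng = np.random.default_rng(1)
t, lam, mu, sigma = 0.5, 2.0, 1.0, 1.5
trials = 1_000_000

# Exponential(lam): E e^{tX} should approach lam/(lam - t) for t < lam.
x = rng.exponential(1 / lam, size=trials)  # numpy takes the mean 1/lam as scale
print(np.exp(t * x).mean(), lam / (lam - t))

# N(mu, sigma^2): E e^{tX} should approach exp(mu*t + t^2*sigma^2/2).
w = rng.normal(mu, sigma, size=trials)
print(np.exp(t * w).mean(), np.exp(mu * t + t**2 * sigma**2 / 2))

# Proposition 12.1: for independent X and Y, m_{X+Y}(t) = m_X(t) m_Y(t).
y = rng.normal(0.0, 1.0, size=trials)
print(np.exp(t * (x + y)).mean(), np.exp(t * x).mean() * np.exp(t * y).mean())
```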