Chapter 7: Sums of Random Variables and Long-Term Averages
ENCS6161 -- Probability and Stochastic Processes
Concordia University

Sums of Random Variables

Let $X_1, \ldots, X_n$ be random variables and $S_n = X_1 + \cdots + X_n$. Then

$$E[S_n] = E[X_1] + \cdots + E[X_n]$$

$$\mathrm{Var}[S_n] = \mathrm{Var}[X_1 + \cdots + X_n] = E\left[\sum_{i=1}^{n}\bigl(X_i - E[X_i]\bigr)\sum_{j=1}^{n}\bigl(X_j - E[X_j]\bigr)\right] = \sum_{i=1}^{n}\mathrm{Var}[X_i] + \sum_{i=1}^{n}\sum_{\substack{j=1 \\ j \neq i}}^{n}\mathrm{Cov}(X_i, X_j)$$

If $Z = X + Y$ (the case $n = 2$), then $\mathrm{Var}[Z] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}(X, Y)$.

ENCS6161 -- p.1/14

Sums of Random Variables

Example: the sum of $n$ i.i.d. r.v.s, each with mean $\mu$ and variance $\sigma^2$:

$$E[S_n] = E[X_1] + \cdots + E[X_n] = n\mu$$
$$\mathrm{Var}[S_n] = n\,\mathrm{Var}[X_i] = n\sigma^2$$

PDF of sums of independent random variables: let $X_1, \ldots, X_n$ be independent r.v.s and $S_n = X_1 + \cdots + X_n$. Then the characteristic function of $S_n$ factors into the product of the individual characteristic functions:

$$\Phi_{S_n}(\omega) = E[e^{j\omega S_n}] = E[e^{j\omega(X_1 + \cdots + X_n)}] = \Phi_{X_1}(\omega)\cdots\Phi_{X_n}(\omega)$$

and the pdf is recovered by the inverse Fourier transform:

$$f_{S_n}(s) = \mathcal{F}^{-1}\{\Phi_{X_1}(\omega)\cdots\Phi_{X_n}(\omega)\}$$

ENCS6161 -- p.2/14

Sums of Random Variables

Example: $X_1, \ldots, X_n$ independent with $X_i \sim N(m_i, \sigma_i^2)$. What is the pdf of $S_n = X_1 + \cdots + X_n$?

For a Gaussian r.v. $X \sim N(m, \sigma^2)$,

$$\Phi_X(\omega) = e^{j\omega m - \omega^2\sigma^2/2} \quad \text{(prove it by yourself)}$$

So

$$\Phi_{S_n}(\omega) = \prod_{i=1}^{n} e^{j\omega m_i - \omega^2\sigma_i^2/2} = e^{j\omega(m_1 + \cdots + m_n) - \omega^2(\sigma_1^2 + \cdots + \sigma_n^2)/2}$$

which is the characteristic function of $N(m_1 + \cdots + m_n,\; \sigma_1^2 + \cdots + \sigma_n^2)$; hence $S_n$ is Gaussian with mean $\sum_i m_i$ and variance $\sum_i \sigma_i^2$.
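The variance-of-a-sum identity above can be checked numerically. The sketch below (a hypothetical Monte Carlo setup, not part of the course notes; the 0.5 coupling coefficient is an arbitrary choice to make $X$ and $Y$ correlated) verifies that $\mathrm{Var}[X+Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}(X,Y)$; note the identity holds exactly for sample moments, so the two sides agree to floating-point precision.

```python
import random

random.seed(0)

# Simulate correlated pairs (X, Y): Y = 0.5 X + noise, so Cov(X, Y) != 0.
N = 50_000
xs, ys = [], []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    y = 0.5 * x + random.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(y)

def mean(a):
    return sum(a) / len(a)

def var(a):
    m = mean(a)
    return sum((v - m) ** 2 for v in a) / len(a)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

# Var[Z] for Z = X + Y, versus Var[X] + Var[Y] + 2 Cov(X, Y).
zs = [x + y for x, y in zip(xs, ys)]
lhs = var(zs)
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(lhs, rhs)  # the two values agree up to floating-point rounding
```

Because the identity is algebraic, the agreement does not depend on sample size; only the closeness of `lhs` to the true variance (here $1 + 1.25 + 2\cdot 0.5 = 3.25$) improves as $N$ grows.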
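The characteristic-function factorization can also be checked empirically. The sketch below (illustrative only; the parameters $(m_i, \sigma_i)$, the evaluation point $\omega = 0.7$, and the sample size are arbitrary assumptions) simulates sums of independent Gaussians and compares the empirical characteristic function $\frac{1}{N}\sum_k e^{j\omega s_k}$ against the closed form $e^{j\omega \sum m_i - \omega^2 \sum \sigma_i^2 / 2}$ derived above.

```python
import cmath
import random

random.seed(1)

# Assumed parameters (m_i, sigma_i) for three independent Gaussian terms.
params = [(1.0, 0.5), (-0.5, 1.0), (2.0, 2.0)]
N = 50_000

# Draw N realizations of S_n = X_1 + X_2 + X_3.
samples = [sum(random.gauss(m, s) for m, s in params) for _ in range(N)]

m_tot = sum(m for m, _ in params)          # m_1 + ... + m_n
var_tot = sum(s * s for _, s in params)    # sigma_1^2 + ... + sigma_n^2

w = 0.7
# Empirical characteristic function E[e^{jwS_n}] via the sample average.
empirical = sum(cmath.exp(1j * w * x) for x in samples) / N
# Closed form: characteristic function of N(m_tot, var_tot).
closed = cmath.exp(1j * w * m_tot - w * w * var_tot / 2)
print(abs(empirical - closed))  # sampling error only, shrinks like 1/sqrt(N)
```

Repeating the comparison over a grid of $\omega$ values would trace out the full characteristic function; agreement at all $\omega$ is what pins down $S_n$ as $N(\sum m_i, \sum \sigma_i^2)$.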
This note was uploaded on 01/15/2011 for the course ECE 616 taught by Professor Khkjk during the Winter '10 term at Concordia Canada.