ISEN 609 Lecture 5

Sums of Independent Random Variables: Convolutions

Let X ~ F and Y ~ G be two independent random variables, and suppose we are interested in their sum Z = X + Y. What is the distribution of Z?

1. Compute the distribution function directly:

$$
F_Z(a) = P(Z \le a) = P(X + Y \le a)
= \iint_{x+y \le a} f(x, y)\, dx\, dy
= \iint_{x+y \le a} f(x)\, g(y)\, dx\, dy
$$
$$
= \int_{-\infty}^{\infty} \int_{-\infty}^{a-y} f(x)\, g(y)\, dx\, dy
= \int_{-\infty}^{\infty} F_X(a - y)\, g(y)\, dy
= \int_{-\infty}^{\infty} F_X(a - y)\, dG(y)
$$
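For discrete random variables the double integral becomes a sum, P(Z = z) = Σ_y P(X = z − y) P(Y = y), which is exactly a discrete convolution. A minimal sketch in Python, using two fair dice as a hypothetical example (not from the notes):

```python
import numpy as np

# pmfs of two independent fair dice, indexed so that index = outcome
# (index 0 carries probability 0)
f = np.array([0.0] + [1 / 6] * 6)  # X uniform on {1, ..., 6}
g = np.array([0.0] + [1 / 6] * 6)  # Y uniform on {1, ..., 6}

# pmf of Z = X + Y is the convolution of the two pmfs:
# P(Z = z) = sum over y of P(X = z - y) * P(Y = y)
h = np.convolve(f, g)

print(h[7])     # P(Z = 7) = 6/36
print(h.sum())  # total probability mass
```

The same one-liner `np.convolve` works for any two finitely supported pmfs, since the convolution formula does not depend on the dice example.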
Convolutions, con’t

2. Use moment generating functions:

$$
\phi_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}] = E[e^{tX}]\, E[e^{tY}] = \phi_X(t)\, \phi_Y(t)
$$

(The factorization of the expectation uses the independence of X and Y.) Which approach is easier? It depends.
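The factorization φ_Z(t) = φ_X(t) φ_Y(t) can be sanity-checked by comparing a Monte Carlo estimate of E[e^{tZ}] against the product of known mgfs. A sketch assuming X, Y are exponential with rates 1 and 2 (an arbitrary illustrative choice; the mgf of an Exponential(rate) variable is rate/(rate − t) for t < rate):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
t = 0.2  # evaluation point, inside the mgf's domain for both rates

# X ~ Exponential(rate 1), Y ~ Exponential(rate 2), independent
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=0.5, size=n)

phi_Z_mc = np.mean(np.exp(t * (x + y)))      # Monte Carlo E[e^{tZ}]
phi_product = (1 / (1 - t)) * (2 / (2 - t))  # phi_X(t) * phi_Y(t)

print(phi_Z_mc, phi_product)  # should agree to Monte Carlo accuracy
```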
Example

Suppose X and Y are independent Poisson random variables with parameters λ₁ and λ₂, respectively. What is the distribution of Z = X + Y?

Directly:

$$
F_Z(n) = \sum_{i=0}^{n} \sum_{j=0}^{n-i} \frac{e^{-\lambda_1} \lambda_1^{j}}{j!}\, \frac{e^{-\lambda_2} \lambda_2^{i}}{i!}
$$

or

$$
\phi_Z(t) = e^{\lambda_1 (e^t - 1)}\, e^{\lambda_2 (e^t - 1)} = e^{(\lambda_1 + \lambda_2)(e^t - 1)}
$$

and now? Recognize this as the mgf of a Poisson r.v. with parameter λ₁ + λ₂.
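The conclusion — that the convolution of Poisson(λ₁) and Poisson(λ₂) pmfs is the Poisson(λ₁ + λ₂) pmf — can be verified numerically. A minimal sketch with arbitrary example rates:

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) for N ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 2.0, 3.0  # hypothetical rates chosen for illustration

def conv_pmf(n):
    """P(Z = n) via direct convolution: sum over splits j + (n - j)."""
    return sum(poisson_pmf(j, lam1) * poisson_pmf(n - j, lam2)
               for j in range(n + 1))

# Compare with the Poisson(lam1 + lam2) pmf predicted by the mgf argument
for n in range(10):
    assert abs(conv_pmf(n) - poisson_pmf(n, lam1 + lam2)) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```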
5. Conditional Probability, Conditional Distributions, and Conditional Expectation

Knowing something about the outcome of an experiment (i.e., knowing that an event occurred) reduces the possible outcomes and may change the probability we assign to other events. Suppose A occurred, with P(A) > 0. Then

$$
P(B \mid A) := \frac{P(A \cap B)}{P(A)}
$$

Note that P(A | A) = 1.
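The definition can be exercised on a small finite sample space. A sketch using two fair dice as a hypothetical example, with exact arithmetic via fractions:

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered rolls of two fair dice
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if w[0] + w[1] >= 10}  # event: sum is at least 10
B = {w for w in omega if w[0] == 6}          # event: first die shows 6

def P(E):
    """Uniform probability on the finite sample space."""
    return Fraction(len(E), len(omega))

# P(B | A) = P(A ∩ B) / P(A)
p_B_given_A = P(A & B) / P(A)
print(p_B_given_A)  # → 1/2

# Sanity check from the notes: P(A | A) = 1
print(P(A & A) / P(A))  # → 1
```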
Now let’s move to random variables

Suppose X and Y are discrete, and let

$$
P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}
$$

More generally,

$$
P(X \in A \mid Y \in B) = \frac{P(X \in A,\, Y \in B)}{P(Y \in B)} = \frac{P(X^{-1}(A) \cap Y^{-1}(B))}{P(Y^{-1}(B))}
$$

Taking A = {x}, B = {y} recovers the pointwise definition.
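For discrete random variables this is just a ratio of entries in the joint pmf table. A sketch with hypothetical joint probabilities (the numbers are illustrative, not from the notes):

```python
# Joint pmf of (X, Y) as a dict mapping (x, y) -> probability
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def p_y(y):
    """Marginal P(Y = y): sum the joint pmf over x."""
    return sum(p for (_, y_), p in joint.items() if y_ == y)

def p_x_given_y(x, y):
    """P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)."""
    return joint.get((x, y), 0.0) / p_y(y)

print(p_x_given_y(0, 1))  # 0.20 / 0.60
```

For each fixed y, the conditional probabilities sum to 1 over x, as a conditional pmf must.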
Conditional Density and Distribution Functions

Now we can define a conditional mass function

$$
p_{X \mid Y}(x \mid y) = P(X = x \mid Y = y) = \frac{p(x, y)}{p_Y(y)}
$$

and a conditional density function

$$
f_{X \mid Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}
$$

and finally a conditional distribution function

$$
F_{X \mid Y}(x \mid y) = P(X \le x \mid Y = y)
$$
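In the continuous case the conditional density is the joint density renormalized by the marginal, so for each fixed y it integrates to 1 in x. A numerical sketch assuming the joint density f(x, y) = x + y on the unit square (a standard textbook example, not taken from these notes):

```python
import numpy as np

def trapezoid(vals, xs):
    """Trapezoidal-rule approximation of an integral on the grid xs."""
    return float(np.sum((vals[1:] + vals[:-1]) / 2 * np.diff(xs)))

# Hypothetical joint density f(x, y) = x + y on [0, 1] x [0, 1]
f = lambda x, y: x + y
xs = np.linspace(0.0, 1.0, 10_001)

y0 = 0.3
f_Y = trapezoid(f(xs, y0), xs)  # marginal f_Y(y0) = 1/2 + y0
f_cond = f(xs, y0) / f_Y        # conditional density f_{X|Y}(x | y0)

print(f_Y)                       # 0.5 + 0.3 = 0.8
print(trapezoid(f_cond, xs))     # conditional density integrates to 1
```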

This note was uploaded on 04/28/2011 for the course ISEN 609 taught by Professor Klutke during the Spring '08 term at Texas A&M.
