ISYE 2028 A and B, Lecture 8
Kobi Abayomi
March 25, 2009

1 Independent Random Variables

Two random variables are independent if

    p_{X,Y}(x, y) = p_X(x) p_Y(y)    (1)

or

    f_{X,Y}(x, y) = f_X(x) f_Y(y)    (2)

This is directly analogous to the general probability rules. The conditional probability mass and density functions are then just

    p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y) = p_X(x) p_Y(y) / p_Y(y) = p_X(x)

    f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)

Dependence is any violation of this condition.

1.1 Example

Let

    f_{X_1,X_2}(x_1, x_2) = (x_1 + x_2) \, 1\{0 < x_1 < 1, \; 0 < x_2 < 1\}

Are X_1 and X_2 independent? Well:

    f_1(x_1) = \int f(x_1, x_2) \, dx_2 = \int_0^1 (x_1 + x_2) \, dx_2 = (x_1 + 1/2) \, 1\{0 < x_1 < 1\}

    f_2(x_2) = \int f(x_1, x_2) \, dx_1 = \int_0^1 (x_1 + x_2) \, dx_1 = (1/2 + x_2) \, 1\{0 < x_2 < 1\}

But x_1 + x_2 \neq (x_1 + 1/2)(x_2 + 1/2). The answer is no.

1.2 Independence is factorization of the pdf

In general, for X_1, X_2 ~ f(x_1, x_2), if f(x_1, x_2) = g(x_1) h(x_2), then X_1 is independent of X_2.

Here is an informal argument (not a formal proof; but then, what is these days...?). We can always write a joint pdf as the product of a conditional and a marginal:

    f_{X_1,X_2}(x_1, x_2) = f_{X_2|X_1}(x_2|x_1) f_{X_1}(x_1)

If the functional form of f_{X_2|X_1}(x_2|x_1) does not depend on x_1, say f_{X_2|X_1}(x_2|x_1) = h(x_2), then integrating both sides over x_1,

    \int f_{X_1,X_2}(x_1, x_2) \, dx_1 = \int f_{X_2|X_1}(x_2|x_1) f_{X_1}(x_1) \, dx_1

yields

    f_{X_2}(x_2) = h(x_2) \int f_{X_1}(x_1) \, dx_1 = h(x_2)

and of course f_{X_2}(x_2) = h(x_2) = f_{X_2|X_1}(x_2|x_1).

Thus, independence of X_1 and X_2 is equivalent to the factorability of the joint distribution. The definitions of independence are straightforward. Often we can exploit the strong assumption of independence to simplify modelling and get interesting results.
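The marginal computation in Example 1.1 can be checked numerically. Below is a minimal sketch (the function names `joint` and `marginal` and the midpoint-rule integrator are ours, not from the notes): it integrates out one variable to recover each marginal and shows that the joint density does not equal the product of the marginals.

```python
# Numerical check that f(x1, x2) = x1 + x2 on the unit square is NOT
# the product of its marginals, so X1 and X2 are dependent.

def joint(x1, x2):
    """Joint density f(x1, x2) = x1 + x2 on (0,1) x (0,1)."""
    return x1 + x2

def marginal(x, n=10_000):
    """Marginal density at x: integrate the joint over the other
    variable on (0, 1) with a simple midpoint rule."""
    h = 1.0 / n
    return sum(joint(x, (k + 0.5) * h) for k in range(n)) * h

x1, x2 = 0.3, 0.7
print(marginal(x1))                 # ~ x1 + 1/2 = 0.8
print(marginal(x2))                 # ~ x2 + 1/2 = 1.2
print(joint(x1, x2))                # 1.0
print(marginal(x1) * marginal(x2))  # 0.96, which differs from 1.0
```

Since the joint density at (0.3, 0.7) is 1.0 but the product of the marginals is 0.96, the factorization fails, matching the algebraic conclusion above.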
1.3 Example

In n + m independent trials, each with probability p of success, let X be the number of successes in the first n trials and let Y be the number of successes in the final m trials. Then

    P(X = x, Y = y) = \binom{n}{x} p^x (1-p)^{n-x} \binom{m}{y} p^y (1-p)^{m-y}

and it is apparent that X and Y are independent.

What about Z = X + Y? Are Z and X independent?

    P(X = x, Z = z) = P(X = x, Y = z - x) = \binom{n}{x} p^x (1-p)^{n-x} \binom{m}{z-x} p^{z-x} (1-p)^{m-(z-x)}

This joint probability does not factor into a function of x alone times a function of z alone (the term \binom{m}{z-x} couples them), which implies that X and Z are not independent.

2 Mutual Independence

Here is an example of mutual independence. Let

    f(x, y, z) = e^{-(x+y+z)} \, 1\{0 < x, y, z < \infty\}

The cumulative distribution, then, is

    F(x, y, z) = \int_0^z \int_0^y \int_0^x e^{-(u+v+w)} \, du \, dv \, dw = (1 - e^{-x})(1 - e^{-y})(1 - e^{-z})

The joint distribution is completely factorable: X, Y and Z are mutually independent.
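The dependence of X and Z = X + Y in Example 1.3 can also be seen by simulation. This is a rough Monte Carlo sketch (the parameter choices n = m = 5, p = 0.4, and all variable names are ours, not from the notes): it estimates P(X = x, Z = z) and compares it with P(X = x) P(Z = z) at one point.

```python
import random
from collections import Counter

# X = successes in the first n trials, Y = successes in the last m trials,
# Z = X + Y. If X and Z were independent, joint frequencies would match
# products of marginal frequencies; they do not.
random.seed(0)
n, m, p, trials = 5, 5, 0.4, 200_000

xs, zs = [], []
for _ in range(trials):
    x = sum(random.random() < p for _ in range(n))  # Binomial(n, p) draw
    y = sum(random.random() < p for _ in range(m))  # Binomial(m, p) draw
    xs.append(x)
    zs.append(x + y)

joint = Counter(zip(xs, zs))
px = Counter(xs)
pz = Counter(zs)

# Compare P(X=2, Z=4) with P(X=2) P(Z=4): they differ noticeably.
pj = joint[(2, 4)] / trials
prod = (px[2] / trials) * (pz[4] / trials)
print(round(pj, 4), round(prod, 4))
```

Here the exact values are P(X=2, Z=4) = P(X=2) P(Y=2) ≈ 0.119, while P(X=2) P(Z=4) ≈ 0.087, so the simulation should show a clear gap between the two estimates.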
This note was uploaded on 11/08/2009 for the course ISYE 2028 taught by Professor Shim during the Spring '07 term at Georgia Institute of Technology.