…sets A and B we have

P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)    (A.13)

In the discrete case, (A.13) is equivalent to

P(X = x, Y = y) = P(X = x) P(Y = y)    (A.14)

for all x and y. The condition for independence is exactly the same in the continuous case: the joint density is the product of the marginal densities,

f_{X,Y}(x, y) = f_X(x) f_Y(y)    (A.15)

The notions of independence extend in a straightforward way to n random variables: the joint probability or probability density is the product of the marginals.
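As a quick numerical illustration of the factorization in (A.14), the sketch below builds a joint pmf for two discrete random variables as an outer product of marginals (a hypothetical example; the specific probabilities are made up) and then checks that the joint pmf recovered from its marginals matches:

```python
import numpy as np

# Hypothetical marginal pmfs for discrete X and Y taking values 0, 1, 2.
p_x = np.array([0.2, 0.5, 0.3])   # marginal pmf of X
p_y = np.array([0.6, 0.1, 0.3])   # marginal pmf of Y

# Under independence, P(X = x, Y = y) = P(X = x) P(Y = y), i.e. the
# joint pmf is the outer product of the marginals, as in (A.14).
joint = np.outer(p_x, p_y)

# Recover the marginals by summing out the other variable, then verify
# the factorization entry by entry.
marg_x = joint.sum(axis=1)
marg_y = joint.sum(axis=0)
print(np.allclose(joint, np.outer(marg_x, marg_y)))  # True
```

A joint pmf that does not factor this way (for example, one with mass only on the diagonal) would fail the same check, which is how dependence shows up numerically.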
Two important consequences of independence are
Theorem A.1. If X_1, …, X_n are independent, then

E(X_1 ⋯ X_n) = EX_1 ⋯ EX_n
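Theorem A.1 can be checked exactly for discrete random variables with small supports. The sketch below (a hypothetical example with made-up values and probabilities) computes E(X_1 X_2) from the joint pmf and compares it with EX_1 · EX_2:

```python
import numpy as np

# Hypothetical independent discrete random variables with small supports.
vals_x = np.array([0.0, 1.0, 2.0]); p_x = np.array([0.2, 0.5, 0.3])
vals_y = np.array([1.0, 4.0]);      p_y = np.array([0.7, 0.3])

# Under independence the joint pmf is the outer product of the marginals.
joint = np.outer(p_x, p_y)

# E(X1 X2): sum of x*y weighted by the joint pmf.
e_product = (np.outer(vals_x, vals_y) * joint).sum()
# EX1 * EX2: product of the individual expectations.
product_e = (vals_x @ p_x) * (vals_y @ p_y)

print(np.isclose(e_product, product_e))  # True
```

The agreement is exact (up to floating point) because the double sum over the product pmf factors into the product of two single sums, which is the content of the theorem.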
Theorem A.2. If X_1, …, X_n are independent and n_1 < ⋯ < n_k ≤ n, then h_1(X_1, …, X_{n_1}), h_2(X_{n_1+1}, …, X_{n_2}), …, h_k(X_{n_{k−1}+1}, …, X_{n_k}) are independent.
In words, the second result says that functions of disjoint sets of independent
random variables are independent.
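This can be verified by exhaustive enumeration in a small hypothetical example: take three independent fair coin flips, let h_1 = X_1 + X_2 depend on the first two and h_2 = X_3 on the third, and check that the joint distribution of (h_1, h_2) factors into its marginals:

```python
import itertools
import collections

# Three independent fair coin flips; each of the 8 outcomes has probability 1/8.
# h1 = X1 + X2 is a function of the first two flips, h2 = X3 of the third,
# so by Theorem A.2 the pair (h1, h2) should be independent.
pmf = collections.Counter()
for x1, x2, x3 in itertools.product([0, 1], repeat=3):
    pmf[(x1 + x2, x3)] += 1 / 8

# Marginal distributions of h1 and h2.
p_h1 = collections.Counter()
p_h2 = collections.Counter()
for (a, b), p in pmf.items():
    p_h1[a] += p
    p_h2[b] += p

# Check P(h1 = a, h2 = b) = P(h1 = a) P(h2 = b) for every pair of values.
ok = all(abs(pmf[(a, b)] - p_h1[a] * p_h2[b]) < 1e-12
         for a in p_h1 for b in p_h2)
print(ok)  # True
```

Replacing h_2 by a function that also uses X_1 or X_2 (overlapping index sets) would break the factorization, which is why the theorem requires the index blocks to be disjoint.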
Our last topic in this section is the distribution of X + Y w...
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School).