Chapter 7  Discrete Random Vectors

Thus far, our treatment of probability has focused on single random variables. It is often convenient, or required, to model stochastic phenomena using multiple random variables. In this chapter, we extend some of the concepts developed for single random variables to multiple random variables. We center our exposition of random vectors around the simplest case, pairs of random variables.

7.1 Joint Probability Mass Functions

Consider two discrete random variables X and Y associated with a single experiment. The random pair (X, Y) is characterized by the joint probability mass function of X and Y, which we denote by p_{X,Y}(\cdot, \cdot). If x is a possible value of X and y is a possible value of Y, then the probability mass of (x, y) is denoted by

p_{X,Y}(x, y) = \Pr(\{X = x\} \cap \{Y = y\}) = \Pr(X = x, Y = y).

Note the similarity between the definition of the joint PMF and (5.1). Suppose that S is a subset of X(\Omega) \times Y(\Omega). We can express the probability of S as

\Pr(S) = \Pr(\{\omega \in \Omega \mid (X(\omega), Y(\omega)) \in S\}) = \sum_{(x,y) \in S} p_{X,Y}(x, y).

[Figure 7.1: The random pair (X, Y) maps every outcome contained in the sample space to a real vector in \mathbb{R}^2.]

In particular, we have

\sum_{x \in X(\Omega)} \sum_{y \in Y(\Omega)} p_{X,Y}(x, y) = 1.

To further distinguish between the joint PMF of X and Y and the individual PMFs p_X(\cdot) and p_Y(\cdot), we occasionally refer to the latter as marginal probability mass functions. We can compute the marginal PMFs of X and Y from the joint PMF p_{X,Y}(\cdot, \cdot) using the formulas

p_X(x) = \sum_{y \in Y(\Omega)} p_{X,Y}(x, y), \qquad p_Y(y) = \sum_{x \in X(\Omega)} p_{X,Y}(x, y).

On the other hand, knowledge of the marginal distributions p_X(\cdot) and p_Y(\cdot) is not enough to obtain a complete description of the joint PMF p_{X,Y}(\cdot, \cdot). This fact is illustrated in Examples 52 and 53.

Example 52.
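The marginalization formulas above amount to summing the joint PMF over the other variable. A minimal Python sketch, assuming the joint PMF is stored as a dictionary keyed by (x, y) pairs (the storage format and function name are illustrative, not from the text):

```python
from collections import defaultdict

# A joint PMF stored as {(x, y): probability}; values here are the
# without-replacement urn probabilities used later in Example 52.
p_xy = {(1, 2): 1/6, (1, 3): 1/6,
        (2, 1): 1/6, (2, 3): 1/6,
        (3, 1): 1/6, (3, 2): 1/6}

def marginals(joint):
    """Compute p_X and p_Y by summing the joint PMF over the other variable."""
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        p_x[x] += p  # p_X(x) = sum over y of p_{X,Y}(x, y)
        p_y[y] += p  # p_Y(y) = sum over x of p_{X,Y}(x, y)
    return dict(p_x), dict(p_y)

p_x, p_y = marginals(p_xy)

# The joint PMF sums to one, as the normalization identity requires.
assert abs(sum(p_xy.values()) - 1.0) < 1e-12
```

Because every unit of probability mass in the joint PMF lands in exactly one bucket of each marginal, both marginals automatically sum to one as well.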
An urn contains three balls numbered one, two, and three. A random experiment consists of drawing two balls from the urn, without replacement. The number appearing on the first ball is a random variable, which we denote by X. Similarly, we refer to the number inscribed on the second ball as Y. The joint PMF of X and Y is specified below,

p_{X,Y}(x, y)   y = 1   y = 2   y = 3
   x = 1          0      1/6     1/6
   x = 2         1/6      0      1/6
   x = 3         1/6     1/6      0

where the diagonal entries are zero because the same ball cannot be drawn twice. We can compute the marginal PMF of X as

p_X(x) = \sum_{y \in Y(\Omega)} p_{X,Y}(x, y) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3},

where x \in \{1, 2, 3\}. Likewise, the marginal PMF of Y is seen to equal

p_Y(y) = \begin{cases} 1/3, & \text{if } y \in \{1, 2, 3\}, \\ 0, & \text{otherwise}. \end{cases}

Example 53. Again, suppose that an urn contains three balls numbered one, two, and three. This time the random experiment consists of drawing two balls from the urn with replacement. We use X and Y to denote the numbers appearing on the first and second balls, respectively. The joint PMF of X and Y now becomes

p_{X,Y}(x, y)   y = 1   y = 2   y = 3
   x = 1         1/9     1/9     1/9
   x = 2         1/9     1/9     1/9
   x = 3         1/9     1/9     1/9
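The point of the two examples, that identical marginals can arise from different joint PMFs, can be checked with a short computation. A sketch in Python using exact rational arithmetic (the variable and function names are illustrative):

```python
from fractions import Fraction as F

# Example 52: drawing without replacement (diagonal entries are zero).
without = {(x, y): F(1, 6) for x in (1, 2, 3) for y in (1, 2, 3) if x != y}
# Example 53: drawing with replacement (all nine pairs equally likely).
with_repl = {(x, y): F(1, 9) for x in (1, 2, 3) for y in (1, 2, 3)}

def marginal_x(joint):
    """p_X(x): sum the joint PMF over all values of y."""
    return {x: sum(p for (a, _), p in joint.items() if a == x)
            for x in (1, 2, 3)}

# Both experiments yield the same marginal PMF for X (uniform, 1/3 each) ...
assert marginal_x(without) == marginal_x(with_repl)
# ... yet the joint PMFs differ, so marginals alone cannot recover the joint.
assert without != with_repl
```

This is exactly the caveat stated at the end of Section 7.1: the map from joint PMFs to marginal pairs is many-to-one.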
This note was uploaded on 03/30/2010 for the course ECEN 303 taught by Professor Chamberlain during the Fall '07 term at Texas A&M.
