EE 278 Lecture Notes 6: Random Vectors


Topics: joint, marginal, and conditional CDF, PDF, and PMF; independence and conditional independence; mean and covariance matrix; mean and variance of a sum of random variables; Gaussian random vectors; MSE estimation: the vector case.

Specifying Random Vectors

Let $X_1, X_2, \ldots, X_n$ be random variables defined on the same probability space. We define a random vector (RV) as
$$X = [X_1 \; X_2 \; \cdots \; X_n]^T.$$
$X$ is completely specified by its joint cdf: for $x = (x_1, x_2, \ldots, x_n)$,
$$F_X(x) = \mathrm{P}\{X_1 \le x_1,\, X_2 \le x_2,\, \ldots,\, X_n \le x_n\}, \quad x \in \mathbb{R}^n.$$
If $X$ is continuous, i.e., $F_X(x)$ is a continuous function of $x$, then $X$ can be specified by its joint pdf:
$$f_X(x) = f_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n), \quad x \in \mathbb{R}^n.$$
If $X$ is discrete, then it can be specified by its joint pmf:
$$p_X(x) = p_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n), \quad x \in \mathcal{X}^n.$$

A marginal cdf (pdf, pmf) is the joint cdf (pdf, pmf) of a subset of $\{X_1, \ldots, X_n\}$; e.g., for $X = (X_1, X_2, X_3)$ the marginals are
$$f_{X_1}(x_1),\ f_{X_2}(x_2),\ f_{X_3}(x_3),$$
$$f_{X_1,X_2}(x_1, x_2),\ f_{X_1,X_3}(x_1, x_3),\ f_{X_2,X_3}(x_2, x_3).$$
The marginals can be obtained from the joint in the usual way. For the previous example,
$$F_{X_1}(x_1) = \lim_{x_2, x_3 \to \infty} F_X(x_1, x_2, x_3),$$
$$f_{X_1,X_2}(x_1, x_2) = \int_{-\infty}^{\infty} f_{X_1,X_2,X_3}(x_1, x_2, x_3)\, dx_3.$$

Conditional cdfs (pdfs, pmfs) can also be defined in the usual way. E.g., the conditional pdf of $X_{k+1}^n = (X_{k+1}, \ldots, X_n)$ given $X^k = (X_1, \ldots, X_k)$ is
$$f_{X_{k+1}^n \mid X^k}(x_{k+1}^n \mid x^k) = \frac{f_X(x_1, x_2, \ldots, x_n)}{f_{X^k}(x_1, x_2, \ldots, x_k)} = \frac{f_X(x)}{f_{X^k}(x^k)}.$$

Chain rule: we can write
$$f_X(x) = f_{X_1}(x_1)\, f_{X_2 \mid X_1}(x_2 \mid x_1)\, f_{X_3 \mid X_1, X_2}(x_3 \mid x_1, x_2) \cdots f_{X_n \mid X^{n-1}}(x_n \mid x^{n-1}).$$
Proof: by induction. The chain rule holds for $n = 2$ by the definition of the conditional pdf. Now suppose it is true for $n - 1$. Then
$$f_X(x) = f_{X^{n-1}}(x^{n-1})\, f_{X_n \mid X^{n-1}}(x_n \mid x^{n-1}) = f_{X_1}(x_1)\, f_{X_2 \mid X_1}(x_2 \mid x_1) \cdots f_{X_{n-1} \mid X^{n-2}}(x_{n-1} \mid x^{n-2})\, f_{X_n \mid X^{n-1}}(x_n \mid x^{n-1}),$$
which completes the proof.

Independence and Conditional Independence

Independence is defined in the usual way; e.g., $X_1, X_2, \ldots, X_n$ are independent if
$$f_X(x) = \prod_{i=1}^{n} f_{X_i}(x_i) \quad \text{for all } (x_1, \ldots, x_n).$$
Important special case, i.i.d. r.v.s: $X_1, X_2, \ldots, X_n$ are said to be independent, identically distributed (i.i.d.) if they are independent and have the same marginals. Example: if we flip a coin $n$ times independently, we generate i.i.d. $\mathrm{Bern}(p)$ r.v.s $X_1, X_2, \ldots, X_n$.

R.v.s $X_1$ and $X_3$ are said to be conditionally independent given $X_2$ if
$$f_{X_1, X_3 \mid X_2}(x_1, x_3 \mid x_2) = f_{X_1 \mid X_2}(x_1 \mid x_2)\, f_{X_3 \mid X_2}(x_3 \mid x_2).$$
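As a quick numerical illustration of these definitions (not part of the original notes), the following NumPy sketch builds a small joint pmf for a discrete random vector (X1, X2, X3), computes marginal and conditional pmfs, verifies the chain rule, tests the independence factorization, and draws an i.i.d. Bern(p) sample. The joint pmf and all variable names (p, p_1, p_12, etc.) are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint pmf of (X1, X2, X3), each taking values in {0, 1}:
# p_X(x1, x2, x3) stored as a 2x2x2 array that sums to 1.
p = rng.random((2, 2, 2))
p /= p.sum()

# Marginal pmf of (X1, X2): sum the joint over x3
# (the discrete analog of integrating out x3).
p_12 = p.sum(axis=2)

# Marginal pmf of X1: sum over x2 and x3.
p_1 = p.sum(axis=(1, 2))

# Conditional pmf of (X2, X3) given X1 = x1:
# p_{X2,X3|X1}(x2, x3 | x1) = p_X(x1, x2, x3) / p_{X1}(x1).
p_23_given_1 = p / p_1[:, None, None]

# Chain rule check:
# p_X(x) = p_{X1}(x1) p_{X2|X1}(x2|x1) p_{X3|X1,X2}(x3|x1,x2).
p_2_given_1 = p_12 / p_1[:, None]        # p_{X2|X1}
p_3_given_12 = p / p_12[:, :, None]      # p_{X3|X1,X2}
reconstructed = p_1[:, None, None] * p_2_given_1[:, :, None] * p_3_given_12
assert np.allclose(reconstructed, p)

# Independence check: X1, X2, X3 are independent iff the joint factors
# into the product of the marginals (a generic random joint pmf will not).
p_2 = p.sum(axis=(0, 2))
p_3 = p.sum(axis=(0, 1))
product_of_marginals = (p_1[:, None, None]
                        * p_2[None, :, None]
                        * p_3[None, None, :])
print("independent:", np.allclose(p, product_of_marginals))

# i.i.d. example: n independent coin flips give i.i.d. Bern(prob) samples.
n, prob = 10, 0.3
flips = rng.binomial(1, prob, size=n)
print("i.i.d. Bern(0.3) sample:", flips)
```

For continuous random vectors the same manipulations apply with sums over an axis replaced by integrals over the corresponding variable.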