# RP-HW2-sol - Kyung Hee University Department of Electronics...


Kyung Hee University, Department of Electronics and Radio Engineering
C1002900 Random Processing, Homework 2 Solutions, Spring 2010
Professor Hyundong Shin
Issued: April 14, 2010. Due: April 28, 2010. Reading: Course textbook, Chapter 7.

HW 2.1

(a) Covariance matrices are symmetric and positive semi-definite.

- A: not symmetric.
- B: not positive semi-definite; it has a negative eigenvalue.
- C: not positive semi-definite; in particular, $[0 \ 1 \ 0]\,\mathbf{C}\,[0 \ 1 \ 0]^T = -3 < 0$.
- D: not symmetric.
- E: a possible covariance matrix ($\mathbf{E} \succeq \mathbf{0}$).
- F: a possible covariance matrix ($\mathbf{F} \succeq \mathbf{0}$).
- G: a possible covariance matrix ($\mathbf{G} \succeq \mathbf{0}$).

Hence $\mathbf{E}$, $\mathbf{F}$, and $\mathbf{G}$ are the only possible covariance matrices.

(b) All are possible cross-covariance matrices; in fact, any matrix may be a cross-covariance matrix. One way to see this is as follows. Let $\mathbf{X}$ be a zero-mean random vector with covariance $\mathbf{\Lambda}_{\mathbf{X}} = \mathbf{I}$, and let $\mathbf{Y} = \mathbf{P}\mathbf{X}$. Then

$$\mathbf{\Lambda}_{\mathbf{XY}} = E[\mathbf{X}\mathbf{Y}^T] = E[\mathbf{X}\mathbf{X}^T]\,\mathbf{P}^T = \mathbf{\Lambda}_{\mathbf{X}}\,\mathbf{P}^T = \mathbf{P}^T.$$

Since $\mathbf{P}$ is arbitrary, so is $\mathbf{\Lambda}_{\mathbf{XY}}$.

(c) Of the possible covariance matrices, only $\mathbf{G}$ may be the covariance of a random vector with one component a linear combination of the other two, because $\mathbf{G}$ is the only singular matrix among $\mathbf{E}$, $\mathbf{F}$, $\mathbf{G}$. To see that such a covariance matrix must be singular, let $Z = \alpha X + \beta Y$ and consider the random vector $\mathbf{W} = [X \ Y \ Z]^T$ with covariance $\mathbf{\Lambda}_{\mathbf{W}} = \mathrm{Cov}(\mathbf{W})$. With $\mathbf{a} = [\alpha \ \beta \ {-1}]^T$, the scalar $\mathbf{a}^T\mathbf{W} = \alpha X + \beta Y - Z = 0$ is a constant, so $\mathbf{a}^T \mathbf{\Lambda}_{\mathbf{W}}\, \mathbf{a} = 0$ with $\mathbf{a} \neq \mathbf{0}$; hence $\mathbf{\Lambda}_{\mathbf{W}}$ is singular.
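The checks in (a)–(c) can be sketched numerically. The concrete matrices A–G from the problem are not reproduced in this preview, so the matrices below are hypothetical stand-ins that illustrate the cases discussed above (asymmetry, a negative eigenvalue, and a singular but valid covariance); `is_covariance` is an illustrative helper, not part of the assignment. The second half spot-checks the construction $\mathbf{Y} = \mathbf{P}\mathbf{X}$ giving $\mathbf{\Lambda}_{\mathbf{XY}} = \mathbf{P}^T$.

```python
import numpy as np

# Hypothetical stand-ins for the problem's matrices (the real A-G are not in this preview).
A = np.array([[1.0, 2.0], [0.0, 1.0]])      # not symmetric
B = np.array([[1.0, 3.0], [3.0, 1.0]])      # symmetric, eigenvalues 4 and -2
G = np.array([[2.0, 1.0, 3.0],              # Cov([X, Y, X+Y]) for Cov([X, Y]) = [[2,1],[1,1]]:
              [1.0, 1.0, 2.0],              # symmetric, PSD, and singular
              [3.0, 2.0, 5.0]])             # (row 3 = row 1 + row 2)

def is_covariance(M, tol=1e-10):
    """A valid covariance matrix is symmetric with nonnegative eigenvalues."""
    if not np.allclose(M, M.T, atol=tol):
        return False
    return np.min(np.linalg.eigvalsh(M)) >= -tol

print(is_covariance(A))                  # False: not symmetric
print(is_covariance(B))                  # False: has eigenvalue -2
print(is_covariance(G))                  # True
print(abs(np.linalg.det(G)) < 1e-10)     # True: G is singular

# Cross-covariance: with Lambda_X = I and Y = P X, E[X Y^T] = P^T for ANY matrix P.
rng = np.random.default_rng(0)
P = np.array([[1.0, -2.0, 0.5],
              [3.0,  0.0, 7.0]])             # arbitrary 2x3 matrix
X = rng.standard_normal((3, 200_000))        # zero-mean samples with Cov ~ I
Y = P @ X
L_XY = (X @ Y.T) / X.shape[1]                # sample estimate of E[X Y^T]
print(np.allclose(L_XY, P.T, atol=0.1))      # True, up to sampling error
```

Note that `G` is built exactly as in part (c): its third coordinate is the sum of the first two, so the vector $[1 \ 1 \ {-1}]^T$ is in its null space.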

(d) Statistical independence implies uncorrelatedness, so such a covariance matrix must be diagonal: only $\mathbf{F}$ qualifies. However, uncorrelated does not imply statistically independent, so a random vector with $\mathbf{F}$ as its covariance matrix need not have statistically independent components.

HW 2.2

(a) $X$ and $Y$ are independent. TRUE.

Reasoning/work to be looked at:
$$E[XY] = E\big[E[XY \mid Y]\big] = E\big[Y\,E[X \mid Y]\big] = E\big[Y\,E[X]\big] = E[X]\,E[Y].$$
So $X$ and $Y$ are uncorrelated, and since $X$ and $Y$ are jointly Gaussian, uncorrelated implies independent; hence they are independent.

(b) $E[X f(Y)] = E[X]\,E[f(Y)]$. TRUE.

Reasoning/work to be looked at:
$$E[X f(Y)] = E\big[E[X f(Y) \mid Y]\big] = E\big[f(Y)\,E[X \mid Y]\big] = E\big[f(Y)\,E[X]\big] = E[X]\,E[f(Y)].$$

(c) $E[X \mid Y] = E[X]$. TRUE.

Reasoning/work to be looked at: let $f(Y) = E[X \mid Y]$. Then
$$E[f(Y)^2] = E\big[f(Y)\,E[X \mid Y]\big] = E\big[E[X f(Y) \mid Y]\big] = E[X f(Y)],$$
and
$$E[f(Y)] = E\big[E[X \mid Y]\big] = E[X].$$
By (b), $E[X f(Y)] = E[X]\,E[f(Y)] = (E[X])^2$, so
$$\mathrm{Var}(f(Y)) = E[f(Y)^2] - (E[f(Y)])^2 = (E[X])^2 - (E[X])^2 = 0.$$
If the variance of a random variable is zero, then it is deterministic and equal to its mean. Thus
$$E[X \mid Y] = f(Y) = E[f(Y)] = E[X].$$

HW 2.3

We will use the formula $\mathrm{Var}(Y) = E[Y^2] - (E[Y])^2$ for the variance of a random variable $Y$, equivalently $E[Y^2] = \mathrm{Var}(Y) + (E[Y])^2$. Let the constant estimate be $\hat{X} = \hat{x}$ and consider the error $e = X - \hat{x}$. Then
$$E[(X - \hat{x})^2] = \mathrm{Var}(X - \hat{x}) + (E[X - \hat{x}])^2 = \mathrm{Var}(X) + (E[X] - \hat{x})^2,$$
where the last equality follows from the fact that shifting a random variable by a constant (in this case $\hat{x}$) does not change its variance. Since the first term does not depend on $\hat{x}$ and the second term is always nonnegative, this expression is minimized when $E[X] - \hat{x} = 0$, i.e., when $\hat{x} = E[X]$.
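The HW 2.2 property $E[X f(Y)] = E[X]\,E[f(Y)]$ can be spot-checked by simulation. The distribution parameters and the choice $f = \cos$ below are arbitrary illustrations, and the uncorrelated jointly Gaussian pair is generated in the simplest way, as independent Gaussians:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
# An uncorrelated jointly Gaussian pair (generated as independent components).
X = 2.0 + rng.standard_normal(n)          # mean 2, variance 1
Y = -1.0 + 3.0 * rng.standard_normal(n)   # mean -1, variance 9

f = np.cos                                # an arbitrary bounded function f
lhs = np.mean(X * f(Y))                   # sample estimate of E[X f(Y)]
rhs = np.mean(X) * np.mean(f(Y))          # sample estimate of E[X] E[f(Y)]
print(abs(lhs - rhs) < 0.02)              # True: the two sides agree up to sampling error
```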
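The conclusion of HW 2.3, that the constant estimate $\hat{x}$ minimizing $E[(X-\hat{x})^2]$ is $E[X]$ with minimum value $\mathrm{Var}(X)$, can also be verified on simulated data; the Gaussian parameters below are an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(2)
X = 5.0 + 2.0 * rng.standard_normal(500_000)   # E[X] = 5, Var(X) = 4

xhat = np.linspace(0.0, 10.0, 1001)            # candidate constant estimates
# E[(X - xhat)^2] expanded as E[X^2] - 2*xhat*E[X] + xhat^2 (vectorized over the grid)
mse = np.mean(X**2) - 2.0 * xhat * np.mean(X) + xhat**2

best = xhat[np.argmin(mse)]
print(round(best, 1))       # 5.0  -> the minimizer is (close to) the mean E[X]
print(round(mse.min(), 1))  # 4.0  -> the minimum MSE is (close to) Var(X)
```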


## This note was uploaded on 06/10/2010 for the course ELECTRONIC C1002900 taught by Professor Hyungdongshin during the Spring '10 term at Kyung Hee.
