Hilbert Spaces and Random Vectors

Consider random vectors of dimension n defined on R^n with an underlying sample space S. We would like to characterize the Hilbert space of these random vectors with finite average power. The inner product between two zero-mean random vectors X and Y that belong to this space is given by:

    <X, Y> = E{X^T Y},

where the notation X^T stands for the transpose of the vector X. The corresponding average power of the random vector X is

    <X, X> = P_ave = E{X^T X} = E{||X||^2} < ∞.

The angle between two random vectors X and Y is then given by:

    Θ_XY = cos^{-1}( |E{X^T Y}| / sqrt( E{||X||^2} E{||Y||^2} ) ).

The triangle inequality in this Hilbert space, stated for the induced norm sqrt(E{||X||^2}), is given by:

    sqrt( E{||X + Y||^2} ) ≤ sqrt( E{||X||^2} ) + sqrt( E{||Y||^2} ).

The Cauchy-Schwarz inequality on this Hilbert space of random vectors with finite average power is given by:

    E^2{X^T Y} ≤ E{||X||^2} E{||Y||^2},

where the equality holds when the vectors are linearly dependent.
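The inner product, angle, and the two inequalities above can be checked numerically by Monte Carlo. The sketch below (the distributions of X and Y are hypothetical choices made only for illustration) estimates each expectation by a sample mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw N samples of two zero-mean random vectors of dimension n.
# Hypothetical choice for illustration: X Gaussian, Y = 0.5*X + noise.
n, N = 3, 200_000
X = rng.standard_normal((N, n))
Y = 0.5 * X + rng.standard_normal((N, n))

# Sample estimates of <X, Y> = E{X^T Y} and the powers E{||X||^2}, E{||Y||^2}.
inner_xy = np.mean(np.sum(X * Y, axis=1))
p_x = np.mean(np.sum(X * X, axis=1))
p_y = np.mean(np.sum(Y * Y, axis=1))

# Angle between the two random vectors, in degrees.
theta = np.degrees(np.arccos(abs(inner_xy) / np.sqrt(p_x * p_y)))

# Cauchy-Schwarz: E^2{X^T Y} <= E{||X||^2} E{||Y||^2}.
assert inner_xy**2 <= p_x * p_y

# Triangle inequality for the norm sqrt(E{||X||^2}).
p_sum = np.mean(np.sum((X + Y) ** 2, axis=1))
assert np.sqrt(p_sum) <= np.sqrt(p_x) + np.sqrt(p_y)
```

With this construction E{X^T Y} > 0, so the estimated angle falls strictly between 0° and 90°; making the noise stronger pushes Θ_XY toward 90°, while removing it pushes Θ_XY toward 0°.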

Two random vectors X and Y are said to be statistically orthogonal if:

    Θ_XY = 90° ⟺ E{X^T Y} = 0.

In a similar fashion, two random vectors X and Y are said to be statistically collinear if:

    Θ_XY = 0 ⟺ E^2{X^T Y} = E{||X||^2} E{||Y||^2}.

When the random vectors are scalars, i.e., 1-D random variables, these results reduce back to the ones we saw with just two random variables. Any random vector in the sample space has a Karhunen-Loève (KLT) expansion of the form:

    X = Σ_{i=1}^{n} λ_i v_i,

where the λ_i are the expansion coefficients and the v_i are the orthonormal eigenvectors of the covariance matrix C_X of the random vector X.
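A minimal numerical sketch of the KLT expansion, assuming (as an illustrative choice not fixed by the text) that each expansion coefficient is obtained by projecting a realization of X onto the corresponding eigenvector of the sample covariance matrix, i.e., λ_i = v_i^T x:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical correlated zero-mean random vector X of dimension n,
# built by mixing i.i.d. Gaussians with a random matrix A.
n, N = 4, 100_000
A = rng.standard_normal((n, n))
X = rng.standard_normal((N, n)) @ A.T

# Sample covariance matrix C_X and its orthonormal eigenvectors v_i.
C = X.T @ X / N
eigvals, V = np.linalg.eigh(C)  # columns of V are the eigenvectors

# KL expansion of one realization x: x = sum_i lam_i v_i, with lam_i = v_i^T x.
x = X[0]
lam = V.T @ x          # expansion coefficients
x_rec = V @ lam        # reconstruction from the orthonormal basis
assert np.allclose(x, x_rec)

# Across realizations the KL coefficients are uncorrelated (in the sample
# sense), with variances given by the eigenvalues of C_X.
coeffs = X @ V
cov_coeffs = coeffs.T @ coeffs / N
assert np.allclose(cov_coeffs, np.diag(eigvals), atol=1e-6)
```

The second check makes the point of the KLT concrete: in the eigenvector basis of C_X the covariance of the coefficients is diagonal, which is exactly the decorrelating property that makes the expansion useful.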