Stochastic Process                                         9/29/2006
Lecture 3: Jointly Gaussian Density Function               NCTUEE

Summary

In this lecture, I will discuss:

• Jointly Gaussian Random Variables
• Joint Gaussian Density Function
• Conditional Joint Gaussian Density
• A Simple Detection Problem

Notation

We will use the following notation rules, unless otherwise noted, to represent symbols during this course.

• Boldface upper-case letters represent matrices
• Boldface lower-case letters represent vectors
• Superscripts $(\cdot)^T$ and $(\cdot)^H$ denote transpose and Hermitian (conjugate transpose), respectively
• Upper-case italic letters represent random variables

1  Joint Gaussian Density

1. (Recall) Any jointly Gaussian random vector $\mathbf{x}$ can be represented as a linear transformation of a vector of i.i.d. standard normal random variables $\mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_n)$. That is, if $\mathbf{x} \sim \mathcal{N}(\mathbf{m}_x, \mathbf{K}_x)$, we can write
   $$\mathbf{x} = \mathbf{K}_x^{1/2}\,\mathbf{z} + \mathbf{m}_x,$$
   where $\mathbf{K}_x^{1/2} = \mathbf{E}\boldsymbol{\Lambda}^{1/2}\mathbf{E}^H$, with $\mathbf{E}$ the matrix of orthonormal eigenvectors and $\boldsymbol{\Lambda}$ the diagonal matrix of eigenvalues of $\mathbf{K}_x$.

2. Let $\mathbf{z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_n)$ and let $\mathbf{U}$ be a unitary matrix. Then $\mathbf{U}\mathbf{z}$ has the same distribution as $\mathbf{z}$, denoted $\mathbf{z} \stackrel{d}{=} \mathbf{U}\mathbf{z}$. (Justification)
   a. $\mathbf{U}\mathbf{z}$ is jointly Gaussian.
   b. The mean vector of $\mathbf{U}\mathbf{z}$ is the zero vector.
   c. $\mathrm{Cov}(\mathbf{U}\mathbf{z}, \mathbf{U}\mathbf{z}) = \mathbf{U}\,\mathrm{Cov}(\mathbf{z}, \mathbf{z})\,\mathbf{U}^T = \mathbf{U}\mathbf{U}^T = \mathbf{I}$.

3. (General Expression) Let $\mathbf{x} = [X_1, X_2, \cdots, X_n]^T$ be a real jointly Gaussian random vector (normal random vector) with mean vector $\mathbf{m}_x$ and covariance matrix $\mathbf{K}_x$.
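The two facts above can be checked numerically. The sketch below (a minimal illustration, not part of the original lecture; the 3-dimensional mean vector and covariance matrix are made-up examples) builds $\mathbf{K}_x^{1/2} = \mathbf{E}\boldsymbol{\Lambda}^{1/2}\mathbf{E}^T$ by eigendecomposition, draws samples $\mathbf{x} = \mathbf{K}_x^{1/2}\mathbf{z} + \mathbf{m}_x$, and verifies both the recovered mean/covariance and the rotational invariance of $\mathbf{z}$ under a random orthogonal (real unitary) matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-dimensional example: a mean vector and a positive-definite covariance.
m_x = np.array([1.0, -2.0, 0.5])
K_x = np.array([[2.0, 0.6, 0.3],
                [0.6, 1.5, 0.2],
                [0.3, 0.2, 1.0]])

# Eigendecomposition K_x = E Lambda E^T (real symmetric, so Hermitian = transpose).
eigvals, E = np.linalg.eigh(K_x)
K_half = E @ np.diag(np.sqrt(eigvals)) @ E.T   # K_x^{1/2}, symmetric square root

# Draw z ~ N(0, I_n) and form x = K_x^{1/2} z + m_x (samples stored as columns).
n = 200_000
z = rng.standard_normal((3, n))
x = K_half @ z + m_x[:, None]

# Empirical mean and covariance should approach m_x and K_x.
print(np.round(x.mean(axis=1), 2))
print(np.round(np.cov(x), 2))

# Rotational invariance: for orthogonal U, Cov(Uz) = U I U^T = I.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
print(np.round(np.cov(U @ z), 2))                 # approximately the identity
```

The QR factorization is just a convenient way to generate a random orthogonal matrix; any unitary $\mathbf{U}$ would leave the empirical covariance of $\mathbf{U}\mathbf{z}$ near the identity.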
This note was uploaded on 11/28/2010 for the course EE 301 taught by Professor Gfung during the Winter '10 term at National Chiao Tung University.