ECE 534: Elements of Information Theory, Fall 2010
Homework 8 Solutions

Exercise 8.9 (Johnson Jonaris GadElkarim)

Gaussian mutual information. Suppose that (X, Y, Z) are jointly Gaussian and that X → Y → Z forms a Markov chain. Let X and Y have correlation coefficient ρ₁ and let Y and Z have correlation coefficient ρ₂. Find I(X; Z).

Solution. Start from

$$ I(X;Z) = h(X) + h(Z) - h(X,Z). $$

Since (X, Y, Z) are jointly Gaussian, X and Z are jointly Gaussian, so their covariance matrix is

$$ K = \begin{pmatrix} \sigma_x^2 & \rho_{xz}\sigma_x\sigma_z \\ \rho_{xz}\sigma_x\sigma_z & \sigma_z^2 \end{pmatrix}, \qquad |K| = \sigma_x^2 \sigma_z^2 \left(1 - \rho_{xz}^2\right). $$

Hence

$$ I(X;Z) = \tfrac{1}{2}\log\left(2\pi e \sigma_x^2\right) + \tfrac{1}{2}\log\left(2\pi e \sigma_z^2\right) - \tfrac{1}{2}\log\left((2\pi e)^2 |K|\right) = -\tfrac{1}{2}\log\left(1 - \rho_{xz}^2\right). $$

Now we need to compute ρ_xz. Using Markovity, p(x, z | y) = p(x | y) p(z | y), so (taking all variables zero-mean, as the expressions below assume)

$$ \rho_{xz} = \frac{E[XZ]}{\sigma_x \sigma_z} = \frac{E\left[E[XZ \mid Y]\right]}{\sigma_x \sigma_z} = \frac{E\left[E[X \mid Y]\, E[Z \mid Y]\right]}{\sigma_x \sigma_z}. $$

Since X, Y, and Z are jointly Gaussian, $E[X \mid Y] = \rho_{xy}\frac{\sigma_x}{\sigma_y} Y$, and similarly for $E[Z \mid Y]$. Substituting and using $E[Y^2] = \sigma_y^2$ gives

$$ \rho_{xz} = \rho_{xy}\,\rho_{zy} = \rho_1 \rho_2, $$

and therefore

$$ I(X;Z) = -\tfrac{1}{2}\log\left(1 - (\rho_1 \rho_2)^2\right). $$
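As a quick numerical sanity check of this result (a sketch, not part of the original solution), the following Python snippet builds the Markov chain X → Y → Z from independent Gaussian noises, so that p(x, z | y) = p(x | y) p(z | y) holds by construction, and compares the mutual information obtained from the sample correlation against the closed form. The values ρ₁ = 0.8 and ρ₂ = 0.6 are illustrative assumptions.

# Sanity check for Exercise 8.9: estimate rho_xz from samples and
# compare -0.5*log2(1 - rho_xz^2) with -0.5*log2(1 - (rho1*rho2)^2).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
rho1, rho2 = 0.8, 0.6          # hypothetical corr(X,Y) and corr(Y,Z)

# Build the Markov chain: X and Z depend on Y through independent noises.
y = rng.standard_normal(n)
x = rho1 * y + np.sqrt(1 - rho1**2) * rng.standard_normal(n)
z = rho2 * y + np.sqrt(1 - rho2**2) * rng.standard_normal(n)

rho_xz = np.corrcoef(x, z)[0, 1]
print(f"sample rho_xz = {rho_xz:.4f}, rho1*rho2 = {rho1 * rho2:.4f}")

# Gaussian mutual information in bits from each correlation value.
mi = lambda r: -0.5 * np.log2(1 - r**2)
print(f"I(X;Z) from samples: {mi(rho_xz):.4f} bits")
print(f"I(X;Z) from formula: {mi(rho1 * rho2):.4f} bits")

With this many samples the two printed mutual-information values should agree closely, reflecting ρ_xz = ρ₁ρ₂.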
Exercise 9.2 (Johnson Jonaris GadElkarim)

Two-look Gaussian channel. Consider the ordinary Gaussian channel with two correlated looks at X, that is, Y = (Y₁, Y₂), where

$$ Y_1 = X + Z_1, \qquad Y_2 = X + Z_2, $$

with a power constraint P on X, and $(Z_1, Z_2) \sim \mathcal{N}_2(0, K)$, where

$$ K = \begin{pmatrix} N & N\rho \\ N\rho & N \end{pmatrix}. $$

Find the capacity C for (a) ρ = 1, (b) ρ = 0, (c) ρ = −1.

Solution. The capacity is

$$ C = \max_{p(x):\, EX^2 \le P} I(X; Y_1, Y_2), $$

where

$$ I(X; Y_1, Y_2) = h(Y_1, Y_2) - h(Y_1, Y_2 \mid X) = h(Y_1, Y_2) - h(Z_1, Z_2), $$

$$ h(Z_1, Z_2) = \tfrac{1}{2}\log\left((2\pi e)^2 |K|\right) = \tfrac{1}{2}\log\left((2\pi e)^2 N^2 (1 - \rho^2)\right). $$

The mutual information is maximized when Y₁, Y₂ are jointly Gaussian, i.e., when X ~ N(0, P). Since the same X appears in both looks, the resulting covariance matrix is K_Y = P·J + K, with J the 2×2 all-ones matrix (not the identity):

$$ K_Y = \begin{pmatrix} P + N & P + N\rho \\ P + N\rho & P + N \end{pmatrix}, \qquad |K_Y| = (P+N)^2 - (P+N\rho)^2 = N(1-\rho)\left(2P + N(1+\rho)\right). $$

Hence

$$ C = \tfrac{1}{2}\log\frac{|K_Y|}{|K|} = \tfrac{1}{2}\log\left(1 + \frac{2P}{N(1+\rho)}\right). $$

(a) ρ = 1: the two looks are identical, and C = ½ log(1 + P/N), the same as a single look.
(b) ρ = 0: the noises are independent, and C = ½ log(1 + 2P/N).
(c) ρ = −1: Z₁ = −Z₂, so (Y₁ + Y₂)/2 = X exactly and the capacity is infinite.
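As a numerical check (a sketch, not part of the original solution), the Python snippet below computes C = ½ log₂(|K_Y|/|K|) directly from the covariance matrices and compares it with the closed form; P = 4 and N = 1 are illustrative assumptions. The endpoints ρ = ±1 are excluded from the loop because both determinants vanish there; the ρ = 1 limit is printed separately, and the capacity diverges as ρ → −1, consistent with parts (a) and (c).

# Sanity check for Exercise 9.2: capacity from determinants vs. closed form.
import numpy as np

P, N = 4.0, 1.0                # hypothetical power and noise levels

def capacity_from_dets(rho):
    # K: noise covariance; K_Y = P*J + K with J the all-ones matrix,
    # since both looks share the same input X.
    K = N * np.array([[1.0, rho], [rho, 1.0]])
    K_Y = P * np.ones((2, 2)) + K
    return 0.5 * np.log2(np.linalg.det(K_Y) / np.linalg.det(K))

closed_form = lambda rho: 0.5 * np.log2(1 + 2 * P / (N * (1 + rho)))

for rho in [0.9, 0.5, 0.0, -0.5, -0.9]:
    print(f"rho={rho:+.1f}: dets -> {capacity_from_dets(rho):.4f} bits, "
          f"closed form -> {closed_form(rho):.4f} bits")

# At rho = +1 both determinants vanish and the ratio tends to the
# single-look value 0.5*log2(1 + P/N); as rho -> -1 the capacity
# diverges because the noise can be cancelled exactly.
print(f"rho=+1 limit: {0.5 * np.log2(1 + P / N):.4f} bits")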