
# hw2sol - EE 376B Information Theory, Prof. T. Cover, Handout #7


EE 376B: Information Theory
Handout #7
Thursday, April 21, 2011
Prof. T. Cover

Solutions to Homework Set #2

1. Multiple layer waterfilling

Let $C(x) = \frac{1}{2}\log(1+x)$ denote the channel capacity of a Gaussian channel with signal-to-noise ratio $x$. Show that

$$C\left(\frac{P_1}{N}\right) + C\left(\frac{P_2}{P_1+N}\right) = C\left(\frac{P_1+P_2}{N}\right).$$

This suggests that two independent users can send information as well as if they had pooled their power.

Solution: Multiple layer waterfilling

$$
\begin{aligned}
C\left(\frac{P_1+P_2}{N}\right)
&= \frac{1}{2}\log\left(1 + \frac{P_1+P_2}{N}\right) \\
&= \frac{1}{2}\log\left(\frac{N+P_1+P_2}{N}\right) \\
&= \frac{1}{2}\log\left(\frac{N+P_1+P_2}{N+P_1}\cdot\frac{N+P_1}{N}\right) \\
&= \frac{1}{2}\log\left(\frac{N+P_1+P_2}{N+P_1}\right) + \frac{1}{2}\log\left(\frac{N+P_1}{N}\right) \\
&= C\left(\frac{P_2}{P_1+N}\right) + C\left(\frac{P_1}{N}\right).
\end{aligned}
$$

2. Parallel channels and waterfilling

Consider a pair of parallel Gaussian channels, i.e.,

$$\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix} = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix} + \begin{pmatrix} Z_1 \\ Z_2 \end{pmatrix},$$

where

$$\begin{pmatrix} Z_1 \\ Z_2 \end{pmatrix} \sim \mathcal{N}\left(0, \begin{bmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{bmatrix}\right),$$

and there is a power constraint $E(X_1^2 + X_2^2) \le P$. Assume that $\sigma_1^2 > \sigma_2^2$.
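Before turning to parts (a) and (b), the layering identity of Problem 1 can be sanity-checked numerically. This is an illustrative sketch, not part of the original solution; the values $P_1 = 3$, $P_2 = 5$, $N = 1$ are arbitrary choices (any positive values work), and `C` mirrors the capacity function defined in the problem:

```python
import math

def C(x):
    """Capacity of a Gaussian channel with signal-to-noise ratio x (natural log)."""
    return 0.5 * math.log(1 + x)

# Arbitrary illustrative signal powers and noise variance
P1, P2, N = 3.0, 5.0, 1.0

layered = C(P1 / N) + C(P2 / (P1 + N))  # two users decoded in layers
pooled = C((P1 + P2) / N)               # a single user with the pooled power
print(abs(layered - pooled))            # agrees up to floating-point error
```

The second layer sees the first user's power $P_1$ as additional noise, which is exactly why the two terms telescope in the derivation above.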

(a) At what power does the channel stop behaving like a single channel with noise variance $\sigma_2^2$ and begin behaving like a pair of channels, i.e., at what power does the worse channel become useful?

(b) What is the capacity $C(P)$ for large $P$?

Solution: Parallel channels and waterfilling

(a) By the result of Section 9.5 of Cover and Thomas, we put all of the signal power into the channel with less noise until the total power (noise + signal) in that channel equals the noise power in the other channel. After that, any additional power is split evenly between the two channels. Thus the combined channel begins to behave like a pair of parallel channels when the signal power equals the difference of the two noise powers, i.e., when

$$P = \sigma_1^2 - \sigma_2^2.$$

(b) Let $E(X_1^2) = P_1$ and $E(X_2^2) = P_2$. Therefore

$$P = P_1 + P_2. \tag{1}$$

From waterfilling we know

$$P_2 = P_1 + \sigma_1^2 - \sigma_2^2. \tag{2}$$

From equations (1) and (2) we get

$$P_1 = \frac{P - (\sigma_1^2 - \sigma_2^2)}{2}, \qquad P_2 = \frac{P + (\sigma_1^2 - \sigma_2^2)}{2}.$$

Hence

$$C(P) = \frac{1}{2}\log\left(1 + \frac{P - (\sigma_1^2 - \sigma_2^2)}{2\sigma_1^2}\right) + \frac{1}{2}\log\left(1 + \frac{P + (\sigma_1^2 - \sigma_2^2)}{2\sigma_2^2}\right).$$
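The two-channel waterfilling solution above can be sketched in code. This is a minimal sketch, not part of the original solution; `two_channel_waterfill` is a hypothetical helper combining the threshold from part (a) with the power split from part (b):

```python
import math

def two_channel_waterfill(P, var1, var2):
    """Split total power P across two Gaussian channels with noise
    variances var1 > var2, following the waterfilling solution."""
    if P <= var1 - var2:
        # Below the part (a) threshold: only the quieter channel is used
        p1, p2 = 0.0, P
    else:
        # Part (b): P1 = (P - (var1 - var2))/2, P2 = (P + (var1 - var2))/2
        p1 = (P - (var1 - var2)) / 2
        p2 = (P + (var1 - var2)) / 2
    cap = 0.5 * math.log(1 + p1 / var1) + 0.5 * math.log(1 + p2 / var2)
    return p1, p2, cap

p1, p2, cap = two_channel_waterfill(P=10.0, var1=4.0, var2=2.0)
print(p1, p2)  # 4.0 6.0: the quieter channel carries the extra sigma1^2 - sigma2^2
```

Note that $P_2 - P_1 = \sigma_1^2 - \sigma_2^2$ for every $P$ above the threshold, so both channels sit at the same water level $\nu = P_1 + \sigma_1^2 = P_2 + \sigma_2^2$.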
3. Vector channel

Consider the 3-input, 3-output Gaussian channel $Y = X + Z$, where $X, Y, Z \in \mathbb{R}^3$,

$$E\|X\|^2 = E(X_1^2 + X_2^2 + X_3^2) \le P,$$

and $Z \sim \mathcal{N}_3(0, K)$. Find the capacity for

$$K = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & \rho \\ 0 & \rho & 1 \end{bmatrix}.$$

Solution: Vector channel

We know that

$$C = \frac{1}{2}\log\left(\frac{|K_Y|}{|K|}\right),$$

where $K_Y$ is the covariance matrix of the channel output. We can calculate the eigenvalues of the $K$ matrix to be

$$\lambda_1 = 1, \quad \lambda_2 = 1 - \rho, \quad \lambda_3 = 1 + \rho.$$

Hence $|K| = \lambda_1 \lambda_2 \lambda_3 = 1 - \rho^2$. We now need to maximize $|K_Y| = |K_X + K|$. From Section 9.5 of Cover and Thomas we see that

$$\sup_{K_X} |K_X + K| = \prod_{i=1}^{3} (A_i + \lambda_i),$$

where $A_i = (\nu - \lambda_i)^+$ and $A_1 + A_2 + A_3 = P$. We first consider the case $\rho > 0$.
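The eigenvalue waterfilling that the solution sets up can be carried out numerically. A sketch under illustrative assumptions ($\rho = 0.5$, $P = 2$ are arbitrary choices, and the bisection helper `waterfill` is not from the original solution); the eigenvalues $1$, $1-\rho$, $1+\rho$ are those computed above:

```python
import math

def waterfill(lambdas, P, iters=200):
    """Find the water level nu with sum((nu - l)^+) = P by bisection,
    and return the per-eigenmode powers A_i = (nu - lambda_i)^+."""
    lo, hi = min(lambdas), max(lambdas) + P
    for _ in range(iters):
        nu = 0.5 * (lo + hi)
        if sum(max(nu - l, 0.0) for l in lambdas) > P:
            hi = nu
        else:
            lo = nu
    return [max(lo - l, 0.0) for l in lambdas]

rho, P = 0.5, 2.0                   # illustrative values; the solution treats rho > 0
lam = [1.0, 1.0 - rho, 1.0 + rho]   # eigenvalues of K from the solution
A = waterfill(lam, P)
# C = (1/2) log(|K_Y| / |K|) = (1/2) sum_i log((A_i + lambda_i) / lambda_i)
cap = 0.5 * sum(math.log((a + l) / l) for a, l in zip(A, lam))
print(sum(A), cap)
```

For $P$ large enough that every mode is active (as with these values), the water level is $\nu = (P + \sum_i \lambda_i)/3$, and the noisiest mode $1+\rho$ receives the least power.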
