solns5 - ECE 1502 Information Theory - November 30, 2007

9.3 Output power constraint. Since $Z$ is independent of $X$, we have
$$ E Y^2 = E X^2 + E Z^2 = E X^2 + \sigma^2, $$
so the output power constraint $E Y^2 \le P$ is equivalent to the input power constraint $E X^2 \le P - \sigma^2$. In the following we assume $P > \sigma^2$; otherwise the problem is uninteresting, because the output power constraint would be violated by the noise alone. Now, for a maximum expected output power $P$, the entropy of $Y$ is maximized when $Y \sim \mathcal{N}(0, P)$, which is achieved when $X \sim \mathcal{N}(0, P - \sigma^2)$. Therefore the channel is equivalent to one with the input power constraint $E X^2 \le P - \sigma^2$, and it follows that the capacity is
$$ C = \frac{1}{2} \log_2 \left( 1 + \frac{P - \sigma^2}{\sigma^2} \right) = \frac{1}{2} \log_2 \frac{P}{\sigma^2}. $$

9.5 Fading channel. We have
$$ \begin{aligned} I(X;Y \mid V) &= H(X \mid V) - H(X \mid Y, V) \\ &= H(X) - H(X \mid Y, V) \\ &\ge H(X) - H(X \mid Y) \\ &= I(X;Y), \end{aligned} $$
where the second equality follows since $X$ and $V$ are independent, and the inequality follows since conditioning reduces entropy. Therefore, as is intuitively reasonable, knowledge of the fading factor improves capacity.

9.6 Parallel channels and waterfilling. By the result of Section 10.4, we put all the signal power into the channel with less noise until the total power of noise plus signal in that channel equals the noise power in the other channel. After that, any additional power is split evenly between the two channels. Thus the combined channel begins to behave like a pair of parallel channels once the signal power equals the difference of the two noise powers, i.e., once
$$ 2P = \sigma_1^2 - \sigma_2^2. $$

9.7 Multipath Gaussian channel.

(a) The channel output is
$$ Y = 2X + Z_1 + Z_2 = 2X + Z, $$
where $Z = Z_1 + Z_2$ is the sum of two Gaussian random variables and is therefore itself Gaussian, with
$$ E Z^2 = E Z_1^2 + 2 E Z_1 Z_2 + E Z_2^2 = 2\sigma^2 (1 + \rho), $$
so $Z \sim \mathcal{N}(0, 2\sigma^2(1+\rho))$. Similarly, since $Y$ is itself the sum of two Gaussian random variables ($2X$ and $Z$), we have $Y \sim \mathcal{N}(0, 4P + 2\sigma^2(1+\rho))$. Therefore,
$$ C = \frac{1}{2} \log_2 \left( \frac{4P + 2\sigma^2(1+\rho)}{2\sigma^2(1+\rho)} \right) = \frac{1}{2} \log_2 \left( 1 + \frac{2P}{\sigma^2(1+\rho)} \right). $$

(b) When $\rho = 0$, we have $Z \sim \mathcal{N}(0, 2\sigma^2)$, so the noise power is doubled (relative to the case of a single noise source), while the source power is increased by a factor of 4 (independently of $\rho$), since two copies of $X$ are added coherently at the receiver. Therefore,
$$ C = \frac{1}{2} \log_2 \left( 1 + \frac{2P}{\sigma^2} \right). $$
When $\rho = 1$, we have $Z_1 = Z_2$, so the noise terms are also added coherently at the receiver, $Z \sim \mathcal{N}(0, 4\sigma^2)$, and the noise power is multiplied by a factor of 4 (relative to the case of a single noise source). Therefore,
$$ C = \frac{1}{2} \log_2 \left( 1 + \frac{P}{\sigma^2} \right). $$
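As a quick numerical sanity check of the 9.3 capacity expression (a minimal Python sketch; the values $P = 4$ and $\sigma^2 = 1$ are arbitrary example values, not part of the problem), the two equivalent forms of $C$ should agree:

```python
import math

# Assumed example values: output power constraint P and noise variance sigma^2,
# with P > sigma2 so that the problem is non-degenerate.
P = 4.0
sigma2 = 1.0

# Two equivalent forms of the 9.3 capacity, in bits per channel use.
c_form1 = 0.5 * math.log2(1 + (P - sigma2) / sigma2)
c_form2 = 0.5 * math.log2(P / sigma2)
assert abs(c_form1 - c_form2) < 1e-12
print(c_form1)  # 1.0 bit for P = 4, sigma2 = 1
```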
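The inequality in 9.5 can also be illustrated numerically. The binary "fading" toy model below is my own construction, not the channel of the problem: when $V = 1$ the channel is noiseless, and when $V = 0$ the output is independent of the input, so the average channel (with $V$ unknown) is a BSC with crossover probability 0.25.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# X is a uniform bit, independent of the fading state V (uniform on {0, 1}).
# With V known: the V = 1 branch carries 1 bit, the V = 0 branch carries 0 bits.
i_given_v = 0.5 * 1.0 + 0.5 * 0.0  # I(X;Y|V) = 0.5 bits

# With V unknown: P(Y = X) = 0.5*1 + 0.5*0.5 = 0.75, i.e., a BSC(0.25).
i_without_v = 1.0 - h2(0.25)  # I(X;Y) ~= 0.189 bits

assert i_given_v >= i_without_v  # matches I(X;Y|V) >= I(X;Y)
print(i_given_v, i_without_v)
```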
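For 9.6, a small waterfilling sketch makes the threshold visible. The helper `waterfill_two` and the noise powers $\sigma_1^2 = 3$, $\sigma_2^2 = 1$ are assumed example choices (so the threshold is $2P = \sigma_1^2 - \sigma_2^2 = 2$):

```python
def waterfill_two(total_power, n1, n2):
    """Water-filling over two parallel Gaussian channels with noise powers n1 >= n2.

    Returns (p1, p2), the signal power assigned to each channel."""
    # Below the threshold, all power goes into the quieter channel.
    if total_power <= n1 - n2:
        return 0.0, total_power
    # Above it, the water level nu satisfies (nu - n1) + (nu - n2) = total_power.
    nu = (total_power + n1 + n2) / 2.0
    return nu - n1, nu - n2

n1, n2 = 3.0, 1.0
print(waterfill_two(1.5, n1, n2))  # below the threshold: (0.0, 1.5), single channel
print(waterfill_two(2.0, n1, n2))  # exactly at 2P = n1 - n2: (0.0, 2.0)
print(waterfill_two(4.0, n1, n2))  # above: (1.0, 3.0), extra power split evenly
```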
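Finally, a check that the general 9.7(a) formula reduces to the two special cases of part (b). The function name `capacity_multipath` and the values $P = 2$, $\sigma^2 = 1$ are mine, chosen only for illustration:

```python
import math

def capacity_multipath(P, sigma2, rho):
    """Capacity of Y = 2X + Z1 + Z2 with E Z1 Z2 = rho * sigma2, per 9.7(a)."""
    return 0.5 * math.log2(1 + 2 * P / (sigma2 * (1 + rho)))

P, sigma2 = 2.0, 1.0  # assumed example values

# rho = 0: independent noise paths -> C = (1/2) log2(1 + 2P / sigma^2)
assert math.isclose(capacity_multipath(P, sigma2, 0.0),
                    0.5 * math.log2(1 + 2 * P / sigma2))

# rho = 1: fully correlated noise -> C = (1/2) log2(1 + P / sigma^2)
assert math.isclose(capacity_multipath(P, sigma2, 1.0),
                    0.5 * math.log2(1 + P / sigma2))

print(capacity_multipath(P, sigma2, 0.0), capacity_multipath(P, sigma2, 1.0))
```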