midterm2_2009_solutions - ECE 534 Information Theory
ECE 534 Information Theory - Midterm 2
Nov. 4, 2009, 3:30-4:45 in LH103.

You will be given the full class time: 75 minutes. Use it wisely! Many of the problems have short answers; try to find shortcuts. You may bring and use two 8.5x11 double-sided crib sheets. No other notes or books are permitted. No calculators are permitted. Talking, passing notes, copying, and all other forms of cheating are forbidden. Make sure you explain your answers in a way that illustrates your understanding of the problem. Ideas are important, not just the calculation. Partial marks will be given. Write all answers directly on this exam.

Your name:
Your UIN:
Your signature:

The exam has 4 questions, for a total of 65 points.

Question:   1   2   3   4   Total
Points:    18  17  12  18    65
Score:

ECE534 Fall 2009 Midterm 2                                Name:

1. A sum channel. Let X = Y = {A, B, C, D} be the input and output alphabets of a discrete memoryless channel with transition probability matrix p(y|x), for 0 ≤ ε, δ ≤ 1, given by

       p(y|x) = [ 1-ε    ε     0     0  ]
                [  ε    1-ε    0     0  ]
                [  0     0    1-δ    δ  ]
                [  0     0     δ    1-δ ]

   Notice that this channel with 4 inputs and outputs looks like the sum or union of two parallel sub-channels with transition probability matrices

       p1(y|x) = [ 1-ε   ε  ]      p2(y|x) = [ 1-δ   δ  ]
                 [  ε   1-ε ]                [  δ   1-δ ]

   with alphabets X1 = Y1 = {A, B} and X2 = Y2 = {C, D} respectively.

   (a) (2 points) Draw the transition probability diagram of this channel.

   Solution: (Diagram: two parallel binary symmetric channels — inputs A and B go to outputs A and B with crossover probability ε, inputs C and D go to outputs C and D with crossover probability δ, and there are no transitions between the two pairs.)

   (b) (3 points) Find the capacity of this channel if ε = δ = 1/2.

   Solution: If ε = δ = 1/2 we have a symmetric channel, whose capacity we know is achieved by a uniform input distribution and is

       C = log2 |Y| - H(a row of the transition probability matrix)
         = log2(4) - H(1/2, 1/2, 0, 0)
         = 2 - 1 = 1 (bit per channel use).

   (c) (5 points) Let p(x) be the probability mass function on X and let p(A) + p(B) = α, p(C) + p(D) = 1 - α.
   Show that the mutual information between the input X and the output Y may be expressed as

       I(X; Y) = H(α) + α I(X; Y | X ∈ {A, B}) + (1-α) I(X; Y | X ∈ {C, D}).

   Points earned:        out of a possible 10 points

   Solution: Let Θ be a random variable with the following probability mass function:

       Θ = 1 if X ∈ {A, B},  so p(Θ = 1) = α;
       Θ = 0 if X ∈ {C, D},  so p(Θ = 0) = 1 - α.

   Since the channel never takes an input in one pair to an output in the other, Θ is a function of Y as well as of X, and hence I(X; Y) = I(X; Y, Θ). We can then express the mutual information between X and Y as

       I(X; Y) = I(X; Θ) + I(X; Y | Θ)
               = H(Θ) - H(Θ | X) + I(X; Y | Θ)
               = H(Θ) - 0 + p(Θ = 1) I(X; Y | Θ = 1) + p(Θ = 0) I(X; Y | Θ = 0)
               = H(α) + α I(X; Y | X ∈ {A, B}) + (1-α) I(X; Y | X ∈ {C, D}).

   (d) (2 points) Let C1 and C2 be the capacities of the subchannels described by p1(y|x) and p2(y|x)....
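The capacity claimed in part (b) is easy to check numerically. The sketch below builds the 4x4 sum-channel matrix with ε = δ = 1/2 and evaluates I(X; Y) = H(Y) - H(Y|X) for a uniform input; the helper names `H` and `mutual_information` are my own, not from the exam.

```python
from math import log2

def H(p):
    """Shannon entropy in bits of a pmf; zero entries contribute nothing."""
    return -sum(q * log2(q) for q in p if q > 0)

def mutual_information(px, P):
    """I(X;Y) = H(Y) - H(Y|X) for input pmf px and channel matrix P (row i = p(y | x_i))."""
    py = [sum(px[i] * P[i][j] for i in range(len(px))) for j in range(len(P[0]))]
    return H(py) - sum(px[i] * H(P[i]) for i in range(len(px)))

eps = delta = 0.5
P = [[1 - eps, eps,     0,         0],
     [eps,     1 - eps, 0,         0],
     [0,       0,       1 - delta, delta],
     [0,       0,       delta,     1 - delta]]

# A uniform input achieves capacity for a symmetric channel.
print(mutual_information([0.25] * 4, P))  # 1.0
```

Here H(Y) = 2 bits (the output is uniform over four symbols) and H(Y|X) = H(1/2, 1/2, 0, 0) = 1 bit, matching the hand computation.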
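The decomposition proved in part (c) can likewise be spot-checked numerically. In the sketch below the parameter values (ε, δ, α and the within-pair input splits `a`, `c`) are arbitrary illustrative choices, not from the exam; the identity should hold for any of them, because H(Y) splits by the grouping property of entropy and H(Y|X) = α H(ε) + (1-α) H(δ).

```python
from math import log2

def H(p):
    """Shannon entropy in bits of a pmf; zero entries contribute nothing."""
    return -sum(q * log2(q) for q in p if q > 0)

def I(px, P):
    """I(X;Y) for input pmf px and channel matrix P (row i = p(y | x_i))."""
    py = [sum(px[i] * P[i][j] for i in range(len(px))) for j in range(len(P[0]))]
    return H(py) - sum(px[i] * H(P[i]) for i in range(len(px)))

# Illustrative parameter values (assumed for this check).
eps, delta, alpha, a, c = 0.1, 0.3, 0.6, 0.7, 0.4

P = [[1 - eps, eps,     0,         0],
     [eps,     1 - eps, 0,         0],
     [0,       0,       1 - delta, delta],
     [0,       0,       delta,     1 - delta]]
# Input pmf: probability alpha on {A,B} split (a, 1-a), 1-alpha on {C,D} split (c, 1-c).
px = [alpha * a, alpha * (1 - a), (1 - alpha) * c, (1 - alpha) * (1 - c)]

lhs = I(px, P)                               # I(X;Y) on the full channel

P1 = [[1 - eps, eps], [eps, 1 - eps]]        # sub-channel on {A,B}
P2 = [[1 - delta, delta], [delta, 1 - delta]]  # sub-channel on {C,D}
rhs = H([alpha, 1 - alpha]) + alpha * I([a, 1 - a], P1) + (1 - alpha) * I([c, 1 - c], P2)

print(abs(lhs - rhs) < 1e-12)  # True
```

Changing any of the five parameters leaves the two sides equal, which is exactly the statement I(X; Y) = H(α) + α I(X; Y | X ∈ {A, B}) + (1-α) I(X; Y | X ∈ {C, D}).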