# ECE 534 Information Theory - Midterm 2 Solutions (2009)

ECE 534 Information Theory - Midterm 2
Nov. 4, 2009, 3:30-4:45 in LH103.

- You will be given the full class time: 75 minutes. Use it wisely! Many of the problems have short answers; try to find shortcuts.
- You may bring and use two 8.5x11" double-sided crib sheets.
- No other notes or books are permitted.
- No calculators are permitted.
- Talking, passing notes, copying (and all other forms of cheating) are forbidden.
- Make sure you explain your answers in a way that illustrates your understanding of the problem. Ideas are important, not just the calculation.
- Partial marks will be given.
- Write all answers directly on this exam.

Your name:
Your UIN:
Your signature:

The exam has 4 questions, for a total of 65 points.

| Question: | 1  | 2  | 3  | 4  | Total |
|-----------|----|----|----|----|-------|
| Points:   | 18 | 17 | 12 | 18 | 65    |
| Score:    |    |    |    |    |       |

1. A sum channel. Let X = Y = {A, B, C, D} be the input and output alphabets of a discrete memoryless channel with transition probability matrix p(y|x), for 0 ≤ ε, δ ≤ 1, given by

        p(y|x) = | 1-ε   ε     0     0   |
                 | ε     1-ε   0     0   |
                 | 0     0     1-δ   δ   |
                 | 0     0     δ     1-δ |

Notice that this channel with 4 inputs and outputs looks like the sum or "union" of two parallel sub-channels with transition probability matrices

        p1(y|x) = | 1-ε   ε   |        p2(y|x) = | 1-δ   δ   |
                  | ε     1-ε |                  | δ     1-δ |

with alphabets X1 = Y1 = {A, B} and X2 = Y2 = {C, D} respectively.

(a) (2 points) Draw the transition probability diagram of this channel.

Solution: [Transition diagram: inputs A and B connect only to outputs A and B, staying with probability 1-ε and crossing with probability ε; inputs C and D connect only to outputs C and D, staying with probability 1-δ and crossing with probability δ. The two sub-channels do not mix.]

(b) (3 points) Find the capacity of this channel if ε = δ = 1/2.

Solution: If ε = δ = 1/2 we have a symmetric channel, whose capacity we know is achieved by a uniform input distribution and equals

        C = log2 |Y| - H(a row of the transition probability matrix)
          = log2(4) - H(1/2, 1/2, 0, 0)
          = 2 - 1
          = 1 (bit per channel use)

(c) (5 points) Let p(x) be the probability mass function on X and let p(A) + p(B) = α, p(C) + p(D) = 1 - α.
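The capacity in part (b) can be sanity-checked numerically. The sketch below (not part of the original exam; the helper name `mutual_information` is my own, and NumPy is assumed) computes I(X; Y) directly from the joint distribution for the uniform input and ε = δ = 1/2:

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X;Y) in bits for input pmf p_x and channel matrix P, rows = p(y|x)."""
    p_xy = p_x[:, None] * P              # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)               # output marginal p(y)
    mask = p_xy > 0                      # skip zero-probability terms
    indep = (p_x[:, None] * p_y[None, :])[mask]
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / indep)))

eps = delta = 0.5
P = np.array([[1 - eps, eps,     0,         0        ],
              [eps,     1 - eps, 0,         0        ],
              [0,       0,       1 - delta, delta    ],
              [0,       0,       delta,     1 - delta]])
p_x = np.full(4, 0.25)                   # uniform input achieves capacity here

print(mutual_information(p_x, P))       # -> 1.0 (bit per channel use)
```

Each of the 8 nonzero joint entries contributes 0.125 · log2(2) = 0.125 bits, matching the closed-form answer of 1 bit.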
Show that the mutual information between the input X and the output Y may be expressed as

        I(X; Y) = H(α) + α I(X; Y | X ∈ {A, B}) + (1-α) I(X; Y | X ∈ {C, D}).

Points earned: ____ out of a possible 10 points

Solution: Let θ be a random variable with the following probability mass function:

        θ = 1 if X ∈ {A, B}, so p(θ = 1) = α
        θ = 0 if X ∈ {C, D}, so p(θ = 0) = 1 - α

Because the two sub-channels share no output symbols, θ is a deterministic function of Y as well as of X, so I(X; Y) = I(X; Y, θ). We can then express the mutual information between X and Y as

        I(X; Y) = I(X; θ) + I(X; Y | θ)
                = H(θ) - H(θ | X) + I(X; Y | θ)
                = H(α) - 0 + p(θ = 1) I(X; Y | θ = 1) + p(θ = 0) I(X; Y | θ = 0)
                = H(α) + α I(X; Y | X ∈ {A, B}) + (1-α) I(X; Y | X ∈ {C, D})

where H(θ | X) = 0 since θ is determined by X.

(d) (2 points) Let C1 and C2 be the capacities of the subchannels described by p1(y|x) and p2(y|x)....
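The decomposition in part (c) holds for any input distribution, so it can be verified numerically. The sketch below (my own illustration, not part of the exam; the helper names `mi` and `H2` are assumptions, NumPy assumed) evaluates both sides for one illustrative choice of ε, δ, α, with the input uniform within each sub-alphabet:

```python
import numpy as np

def mi(p_x, P):
    """I(X;Y) in bits for input pmf p_x and channel matrix P, rows = p(y|x)."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask])))

def H2(a):
    """Binary entropy H(a) in bits."""
    return 0.0 if a in (0.0, 1.0) else -a * np.log2(a) - (1 - a) * np.log2(1 - a)

# Illustrative parameters; the identity holds for any valid eps, delta, alpha.
eps, delta, alpha = 0.1, 0.3, 0.6

P = np.array([[1 - eps, eps,     0,         0        ],
              [eps,     1 - eps, 0,         0        ],
              [0,       0,       1 - delta, delta    ],
              [0,       0,       delta,     1 - delta]])
P1, P2 = P[:2, :2], P[2:, 2:]            # the two binary sub-channels

# Input places mass alpha on {A, B}, uniform within each sub-alphabet.
p_x = np.array([alpha / 2, alpha / 2, (1 - alpha) / 2, (1 - alpha) / 2])

lhs = mi(p_x, P)
# Conditioned on theta, the input is uniform on the selected sub-alphabet.
rhs = H2(alpha) + alpha * mi(np.array([0.5, 0.5]), P1) \
               + (1 - alpha) * mi(np.array([0.5, 0.5]), P2)

print(abs(lhs - rhs) < 1e-9)             # -> True
```

Changing `eps`, `delta`, or `alpha` leaves the two sides equal, as the derivation via θ guarantees.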