Soln05 - ECE 342 Communication Theory Fall 2005 Solutions...


ECE 342 Communication Theory
Fall 2005, Solutions to Homework 5
Prof. Tiffany J Li
www: http://www.eecs.lehigh.edu/~jingli/teach
email: [email protected]

Problem 6.1

    H(X) = - Σ_{i=1}^{6} p_i log2 p_i
         = -(0.1 log2 0.1 + 0.2 log2 0.2 + 0.3 log2 0.3 + 0.05 log2 0.05 + 0.15 log2 0.15 + 0.2 log2 0.2)
         = 2.4087 bits/symbol

If the source symbols are equiprobable, then p_i = 1/6 and

    H_u(X) = - Σ_{i=1}^{6} p_i log2 p_i = - log2 (1/6) = log2 6 = 2.5850 bits/symbol

As observed, the entropy of this source is less than that of a uniformly distributed source.

Problem 6.6

The entropy of the source is

    H(X) = - Σ_{i=1}^{6} p_i log2 p_i = 2.4087 bits/symbol

The sampling rate is

    f_s = 2000 + 2 · 6000 = 14000 Hz

so 14000 samples are taken per second. Hence, the entropy of the source in bits per second is

    H(X) = 2.4087 (bits/symbol) × 14000 (symbols/sec) = 33721.8 bits/second

Problem 6.9

The marginal probabilities are given by

    p(X = 0) = Σ_k p(X = 0, Y = k) = p(X = 0, Y = 0) + p(X = 0, Y = 1) = 2/3
    p(X = 1) = Σ_k p(X = 1, Y = k) = p(X = 1, Y = 1) = 1/3
    p(Y = 0) = Σ_k p(X = k, Y = 0) = p(X = 0, Y = 0) = 1/3
    p(Y = 1) ...
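The entropy and rate calculations above can be checked numerically. The sketch below uses the symbol probabilities from Problem 6.1 and the sampling rate from Problem 6.6; the joint pmf used for Problem 6.9 is an assumption inferred from the marginals shown (p(X=0,Y=0) = p(X=0,Y=1) = p(X=1,Y=1) = 1/3), not stated verbatim in the text:

```python
import math

# Problem 6.1: source symbol probabilities (they sum to 1)
p = [0.1, 0.2, 0.3, 0.05, 0.15, 0.2]

# Entropy H(X) = -sum_i p_i log2 p_i
H = -sum(pi * math.log2(pi) for pi in p)
print(f"H(X)   = {H:.4f} bits/symbol")      # ~2.4087

# Equiprobable source: H_u(X) = log2 6
H_u = math.log2(6)
print(f"H_u(X) = {H_u:.4f} bits/symbol")    # ~2.5850

# Problem 6.6: sampling rate and resulting information rate
fs = 2000 + 2 * 6000                        # 14000 samples/second
rate = H * fs
print(f"rate   = {rate:.1f} bits/second")   # ~33721.8

# Problem 6.9: joint pmf (assumed from the marginals in the solution)
joint = {(0, 0): 1 / 3, (0, 1): 1 / 3, (1, 1): 1 / 3}
pX0 = sum(v for (x, y), v in joint.items() if x == 0)   # p(X=0) = 2/3
pY0 = sum(v for (x, y), v in joint.items() if y == 0)   # p(Y=0) = 1/3
print(f"p(X=0) = {pX0:.4f}, p(Y=0) = {pY0:.4f}")
```

Rounding in the intermediate value explains the small gap between H × f_s computed at full precision (33721.7...) and the 2.4087 × 14000 = 33721.8 quoted in the solution.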