... of this Huffman code?

(b) Let Y = (X_1, X_2) be a vector random variable, where X_1 and X_2 are independent realizations of the source X. What is the entropy of Y? Design a Huffman code for Y = (X_1, X_2). (Hint: Y has 9 possible outcomes of the form (x_i, x_j), each with probability p(x_i)p(x_j).) What is the expected length of this new Huffman code?

(c) Is it useful to code over increasingly long vectors of independent random variables?

6. Assume that in a binary digital communication system, the signal component out of the correlator receiver is a_i(T) = +1 or -1 V with equal probability. If the Gaussian noise at the correlator output has unit variance, find the probability of bit error.

Optional Problems:

1. If X and Y are independent and identically distributed with mean μ and variance σ^2, find E[(X - Y)^2].
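For the optional problem, the expectation can be expanded directly; the only step beyond algebra is using independence to factor the cross term. A sketch of the derivation:

```latex
\begin{align*}
E\left[(X - Y)^2\right]
  &= E[X^2] - 2\,E[XY] + E[Y^2] \\
  &= (\sigma^2 + \mu^2) - 2\,E[X]\,E[Y] + (\sigma^2 + \mu^2)
     && \text{(independence: } E[XY] = E[X]E[Y]\text{)} \\
  &= 2\sigma^2 + 2\mu^2 - 2\mu^2 \\
  &= 2\sigma^2.
\end{align*}
```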
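Part (b) can be explored numerically. The sketch below builds a Huffman code with a binary heap and compares the expected length per symbol of a single-symbol code against a code over pairs. The source probabilities are not given in this excerpt, so p = {a: 0.8, b: 0.1, c: 0.1} is an assumed example distribution chosen only to illustrate the effect; the symbol names are placeholders.

```python
import heapq
from itertools import product

# Hypothetical 3-symbol source; the actual p(x_i) appear earlier in
# the problem set and are NOT reproduced in this excerpt.
p = {"a": 0.8, "b": 0.1, "c": 0.1}

def huffman_lengths(probs):
    """Return {symbol: codeword length} for a Huffman code."""
    # Heap entries: (probability, unique tiebreaker, {symbol: depth so far}).
    heap = [(q, i, {s: 0}) for i, (s, q) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        q1, _, g1 = heapq.heappop(heap)   # two least-probable subtrees
        q2, _, g2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**g1, **g2}.items()}
        heapq.heappush(heap, (q1 + q2, counter, merged))
        counter += 1
    return heap[0][2]

# Expected length (bits/symbol) of the single-symbol Huffman code.
L1 = sum(p[s] * d for s, d in huffman_lengths(p).items())

# Pair code: 9 outcomes (x_i, x_j), each with probability p(x_i)p(x_j).
p2 = {si + sj: p[si] * p[sj] for si, sj in product(p, repeat=2)}
L2 = sum(p2[s] * d for s, d in huffman_lengths(p2).items())

print(f"bits/symbol, single: {L1:.3f}; pairs: {L2 / 2:.3f}")
```

For a skewed distribution like this one, coding over pairs brings the rate per symbol closer to the entropy H(X), which is the point of part (c).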
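Problem 6 can be checked with a short numerical sketch. For equally likely antipodal levels ±1 V and unit-variance Gaussian noise, the ML threshold is 0 V and an error occurs when the noise carries the sample across it, giving P_b = Q(|a_i(T)|/σ) = Q(1):

```python
from math import erfc, sqrt

# Gaussian tail function Q(x) = P(N(0,1) > x), via the complementary
# error function: Q(x) = 0.5 * erfc(x / sqrt(2)).
def Q(x):
    return 0.5 * erfc(x / sqrt(2))

# Signal ±1 V, noise variance 1 => argument a/sigma = 1.
Pb = Q(1.0)
print(Pb)  # approximately 0.1587
```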
This note was uploaded on 11/13/2011 for the course ECEN 455 taught by Professor Staff during the Spring '08 term at Texas A&M.