ex3-soln - Channel Capacity, Sampling Theory and Image, Video & Audio Compression

Channel Capacity, Sampling Theory and Image, Video & Audio Compression exercises

All numbered exercises are by Cover and Thomas except where noted otherwise.

March 11, 2003

Exercise 8.2: Maximum likelihood decoding. A source produces independent, equally probable symbols from an alphabet {a, b} at a rate of one symbol every 3 seconds. These symbols are transmitted over a binary symmetric channel which is used once each second, by encoding the source symbol a as 000 and the source symbol b as 111. If, in the corresponding 3-second interval of the channel output, any of the sequences 000, 001, 010, 100 is received, then the output is decoded as a; otherwise the output is decoded as b. Let $\epsilon < \frac{1}{2}$ be the channel crossover probability.

(a) For each possible received 3-bit sequence in the interval corresponding to a given source letter, find the probability that a came out of the source given that received sequence.

(b) Using part (a), show that the above decoding rule minimizes the probability of an incorrect decision.

(c) Find the probability of an incorrect decision.

(d) The source is slowed down to produce one letter every 2n + 1 seconds, a being encoded by 2n + 1 0s and b being encoded by 2n + 1 1s. What decision rule minimizes the probability of error at the decoder? What is the probability of error as $n \to \infty$? What is the transmission rate as $n \to \infty$?

Solution:

(a) We know that the following are true:
$$p(y = 000 \mid x = 000) = (1-\epsilon)^3$$
$$p(y = 100 \mid x = 000) = p(y = 010 \mid x = 000) = p(y = 001 \mid x = 000) = \epsilon(1-\epsilon)^2$$
$$p(y = 110 \mid x = 000) = p(y = 101 \mid x = 000) = p(y = 011 \mid x = 000) = \epsilon^2(1-\epsilon)$$
$$p(y = 111 \mid x = 000) = \epsilon^3$$
but we are asked for the conditional probabilities in the other direction, i.e. $p(x \mid y)$ rather than $p(y \mid x)$:
$$p(x = 000 \mid y = 000) = \frac{p(x = 000)\, p(y = 000 \mid x = 000)}{p(y = 000)}$$
Now:
$$p(y = 000) = \sum_x p(x)\, p(y = 000 \mid x) = p(x = 000)\, p(y = 000 \mid x = 000) + p(x = 111)\, p(y = 000 \mid x = 111) = \tfrac{1}{2}(1-\epsilon)^3 + \tfrac{1}{2}\epsilon^3$$
So:
$$p(x = 000 \mid y = 000) = \frac{\tfrac{1}{2}(1-\epsilon)^3}{\tfrac{1}{2}(1-\epsilon)^3 + \tfrac{1}{2}\epsilon^3} = \frac{(1-\epsilon)^3}{(1-\epsilon)^3 + \epsilon^3}$$
Likewise:
$$p(x = 000 \mid y = 001) = \frac{\epsilon(1-\epsilon)^2}{\epsilon(1-\epsilon)^2 + \epsilon^2(1-\epsilon)} = 1-\epsilon$$
$$p(x = 000 \mid y = 011) = \frac{\epsilon^2(1-\epsilon)}{\epsilon^2(1-\epsilon) + \epsilon(1-\epsilon)^2} = \epsilon$$
$$p(x = 000 \mid y = 111) = \frac{\epsilon^3}{(1-\epsilon)^3 + \epsilon^3}$$
By symmetry, $p(x = 000 \mid y = 001) = p(x = 000 \mid y = 010) = p(x = 000 \mid y = 100)$ and $p(x = 000 \mid y = 011) = p(x = 000 \mid y = 101) = p(x = 000 \mid y = 110)$.

(b) The system is symmetric: of the eight possible received sequences, four must be allocated to a and four to b. The sequences 000, 001, 010, and 100 give the four highest posterior probabilities of a (remember $\epsilon < \frac{1}{2}$), so these are the four which should be allocated to a to minimize the probability of an incorrect decision. ...
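The posterior calculations in part (a) and the decoding rule of part (b) can be checked numerically. The sketch below (not part of the original solution) assumes an illustrative value ε = 0.1: it computes $p(x = 000 \mid y)$ for all eight outputs, applies MAP decoding, and compares the resulting error probability (a wrong decision occurs exactly when 2 or 3 of the 3 bits flip, i.e. $3\epsilon^2(1-\epsilon) + \epsilon^3$, which is part (c)) against a Monte Carlo estimate.

```python
from itertools import product
import random

eps = 0.1  # assumed crossover probability, epsilon < 1/2 (illustrative value)

def p_y_given_x(y, x, eps):
    """Likelihood for 3 independent uses of a BSC: each bit flips with prob eps."""
    flips = sum(yb != xb for yb, xb in zip(y, x))
    return eps ** flips * (1 - eps) ** (len(y) - flips)

# Posterior p(x = 000 | y) for every 3-bit output, codewords 000/111 equiprobable.
for bits in product('01', repeat=3):
    y = ''.join(bits)
    num = 0.5 * p_y_given_x(y, '000', eps)
    den = num + 0.5 * p_y_given_x(y, '111', eps)
    post = num / den
    decoded = 'a' if post > 0.5 else 'b'  # MAP rule; matches majority vote here
    print(f"p(x=000 | y={y}) = {post:.4f}  -> decode {decoded}")

# Exact error probability of the rule: 2 or 3 of the 3 channel bits flipped.
p_err = 3 * eps**2 * (1 - eps) + eps**3
print(f"analytic P(error) = {p_err:.6f}")

# Monte Carlo check of the same quantity.
random.seed(0)
trials = 200_000
errors = sum(sum(random.random() < eps for _ in range(3)) >= 2
             for _ in range(trials))
print(f"simulated P(error) ~ {errors / trials:.6f}")
```

For ε = 0.1 the posteriors reproduce the closed forms above (e.g. $p(x=000 \mid y=001) = 1 - \epsilon = 0.9$), and the sequences 000, 001, 010, 100 are exactly those with posterior above one half.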
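The solutions to parts (c) and (d) are cut off in this preview. As a numerical sketch only, and assuming the decision rule of part (b) generalizes by symmetry to majority vote over 2n + 1 received bits, the snippet below tabulates the exact error probability of the (2n + 1)-fold repetition code over a BSC with an assumed ε = 0.1: an error occurs iff at least n + 1 of the 2n + 1 uses flip.

```python
from math import comb

eps = 0.1  # assumed crossover probability, epsilon < 1/2

def p_error(n, eps):
    """Exact majority-decoding error probability for a (2n+1)-fold repetition code:
    the probability that n+1 or more of the 2n+1 independent channel uses flip."""
    N = 2 * n + 1
    return sum(comb(N, k) * eps**k * (1 - eps)**(N - k)
               for k in range(n + 1, N + 1))

for n in [1, 2, 5, 10, 20]:
    rate = 1 / (2 * n + 1)  # source letters per channel use
    print(f"n={n:2d}  rate={rate:.4f}  P(error)={p_error(n, eps):.3e}")
```

The table illustrates the trade-off the exercise asks about: the error probability falls toward 0 as n grows, but the transmission rate 1/(2n + 1) also goes to 0.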
This note was uploaded on 10/27/2010 for the course ECE 221 taught by Professor Sd during the Spring '10 term at Huston-Tillotson.
