ECE563 Fall 2004 Take-home Final

Date Assigned: 17 November 2004.
Date Due: 1 December 2004, in class.

Instructions: I expect you to work on the problems by yourself. You may refer to any textbook (or the technical literature in general), but not confer with any person.

1. Short Questions

(a) How many bits are needed to specify a selection of k objects from n objects? (n and k are assumed to be known, and the selection of k objects is unordered.)

(b) Either prove that the following code is uniquely decodable or give an ambiguous concatenated sequence of codewords:

    c0 = 101
    c1 = 0011
    c2 = 1001
    c3 = 1110
    c4 = 00001
    c5 = 11001
    c6 = 11100
    c7 = 010100

(c) Consider the memoryless AWGN channel y = x + z, where z is a zero-mean Gaussian random variable with variance σ². The transmit signal x has an average power constraint of P. With no other constraints on the input, the capacity of the channel is

    C = (1/2) log2(1 + P/σ²).

Now suppose x is restricted to the binary alphabet {−√P, +√P}.

    i. Find an expression for the capacity of this restricted channel and denote it by Ĉ (you may not be able to find a closed-form expression for Ĉ, but you should be able to identify the optimal input distribution exactly).

    ii. When is Ĉ close to C? For small signal-to-noise ratios (defined as the ratio P/σ²) or large ones? Commentary: This justifies the usual engineering practice of using simple binary modulation on the AWGN channel in a certain SNR regime.

2. The frequency p_n of the n-th most frequent word in English is roughly approximated by

    p_n ≈ 0.1/n   for 1 ≤ n ≤ 12367,
    p_n ≈ 0       for n > 12367.

(This remarkable 1/n law is known as Zipf's law, and applies to the word frequencies of many languages [4].) If we assume that English is generated by picking words at random according to this distribution, what is the entropy of English (per word)? You might need a computer to help you arrive at the answer.
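The sum in Problem 2 can indeed be evaluated with a short program. A minimal sketch in Python, taking the normalization constant 0.1 and the cutoff 12367 directly from the problem statement (note that the cutoff makes the probabilities sum to almost exactly 1):

```python
import math

CUTOFF = 12367  # Zipf's-law cutoff given in the problem statement

# Word probabilities p_n = 0.1 / n for 1 <= n <= 12367, and 0 otherwise.
p = [0.1 / n for n in range(1, CUTOFF + 1)]

# Sanity check: the distribution should sum to (approximately) 1.
total = sum(p)

# Entropy in bits per word: H = -sum_n p_n log2 p_n.
entropy_bits = -sum(pn * math.log2(pn) for pn in p)

print(f"sum of p_n  = {total:.4f}")
print(f"entropy     = {entropy_bits:.2f} bits per word")
```

The direct sum is cheap here (about 12 000 terms), so no clever approximation is needed; the answer comes out a little under 10 bits per word.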
3. The Mathematical Games column of Scientific American featured the following puzzle in 1975.

The poisoned glass. ‘Mathematicians are curious birds’, the police commissioner said to his wife. ‘You see, we had all those partly filled glasses lined up in rows on a table in the hotel kitchen. Only one contained poison, and we wanted to know which one before searching that glass for fingerprints. Our lab could test the liquid in each glass, but the tests take time and money, so we wanted to make as few of them as possible by simultaneously testing mixtures of small samples from groups of glasses. The university sent over a mathematics professor to help us. He counted the glasses, smiled and said:

‘ “Pick any glass you want, Commissioner. We’ll test it first.”

‘ “But won’t that waste a test?” I asked....
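The puzzle is an instance of identifying one item among n with yes/no group tests: each test of a mixed sample answers one binary question, so ⌈log2 n⌉ tests are necessary in the worst case, and repeatedly testing half of the remaining glasses achieves it. The excerpt above is truncated and does not state the glass count, but the professor's remark makes sense whenever n is one more than a power of two, as the sketch below illustrates (the choice n = 2^k + 1 is an assumption for illustration, not taken from the puzzle text):

```python
import math

def min_tests(n: int) -> int:
    """Worst-case number of yes/no group tests needed to single out
    one poisoned glass among n: the information-theoretic bound
    ceil(log2 n), achieved by halving the remaining glasses."""
    return math.ceil(math.log2(n)) if n > 1 else 0

# When n = 2**k + 1, first testing one glass by itself wastes nothing:
#  - if that glass is poisoned, we are done after 1 test;
#  - otherwise 2**k glasses remain, needing exactly k more tests,
#    for a worst-case total of k + 1 = ceil(log2(2**k + 1)).
for k in range(1, 8):
    n = 2**k + 1
    assert min_tests(n) == k + 1          # optimal for all n glasses
    assert 1 + min_tests(n - 1) == k + 1  # one solo test, then halving
```

So for such n, the "wasted" solo test leaves a power of two, and the worst-case totals coincide.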

This note was uploaded on 02/05/2012 for the course EE308 (EE), taught by Professor B. K. Dey during the Spring '09 term at IIT Bombay.
