EE 376A Prof. T. Weissman
Information Theory Friday, Feb 19, 2010
Homework Set #6 (Due: 5pm Friday, Feb. 26, 2010) 1. Channel capacity with cost constraint. There are applications in which some channel input symbols are more costly than others. Letting
EE 376B/Stat 376B Information Theory Prof. T. Cover
Handout #7 Thursday, April 20, 2005
Solutions to Homework Set #1 1. Monotonicity of entropy per element. For a stationary stochastic process X1, X2, . . . , Xn, show that H(X1, X2, . . . , Xn)/n ≤ H(X1, X2, . . . , Xn−1)/(n − 1).
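The usual argument, sketched here since the excerpt cuts off before the proof (the chain-rule and stationarity steps below are the standard ones, not taken from the handout itself): for each i ≤ n,

```latex
\begin{align*}
H(X_n \mid X_1,\dots,X_{n-1})
  &\le H(X_n \mid X_{n-i+1},\dots,X_{n-1}) && \text{(conditioning reduces entropy)} \\
  &=   H(X_i \mid X_1,\dots,X_{i-1})       && \text{(stationarity)}
\end{align*}
% Averaging over i = 1, ..., n and applying the chain rule:
\begin{align*}
H(X_n \mid X_1,\dots,X_{n-1})
  \;\le\; \frac{1}{n}\sum_{i=1}^{n} H(X_i \mid X_1,\dots,X_{i-1})
  \;=\; \frac{H(X_1,\dots,X_n)}{n}
\end{align*}
% Hence, by the chain rule once more,
\begin{align*}
H(X_1,\dots,X_n)
  \;\le\; H(X_1,\dots,X_{n-1}) + \frac{H(X_1,\dots,X_n)}{n}
\quad\Longrightarrow\quad
\frac{H(X_1,\dots,X_n)}{n} \;\le\; \frac{H(X_1,\dots,X_{n-1})}{n-1}.
\end{align*}
```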
EE 376A Prof. T. Weissman
Information Theory Thursday, Feb. 4th, 2010
Solution, Homework Set #3 1. Venn diagrams. Consider the following quantity: I(X; Y; Z) = I(X; Y) − I(X; Y | Z). This quantity is symmetric in X, Y, and Z, despite the preceding
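A quick numerical illustration of why this quantity needs care: unlike ordinary mutual information, it can be negative. The sketch below (not part of the handout; the setup X, Y i.i.d. Bernoulli(1/2) with Z = X ⊕ Y is the standard counterexample) gives I(X; Y; Z) = 0 − 1 = −1 bit:

```python
import math
from itertools import product

def H(pmf):
    """Shannon entropy in bits of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint pmf {tuple: prob} onto the coordinates in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

# X, Y i.i.d. Bernoulli(1/2), Z = X xor Y
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

HX, HY, HZ = (H(marginal(joint, (i,))) for i in range(3))
HXY = H(marginal(joint, (0, 1)))
HXZ = H(marginal(joint, (0, 2)))
HYZ = H(marginal(joint, (1, 2)))
HXYZ = H(joint)

I_XY = HX + HY - HXY                  # = 0: X and Y are independent
I_XY_given_Z = HXZ + HYZ - HZ - HXYZ  # = 1: given Z, X determines Y
print(I_XY - I_XY_given_Z)            # I(X;Y;Z) = -1.0 bit
```

Here I(X; Y) = 0 because X and Y are independent, while I(X; Y | Z) = 1 because conditioning on Z couples them.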
Problem Set 4
MATH 778C, Spring 2009, Cooper Expiration: Thursday April 30
You are awarded up to 25 points per problem, 5 points for submitting solutions in LaTeX, and 5 points per solution that is used for the answer key. All answers must be fully rigorous
Tsachy Weissman
Information Theory EE376A Course reader
Winter 2010
Springer
Preface
These notes form an outline of the core of the material I plan to cover in the course. Most of the theorems, lemmas, and auxiliary results are stated without their proofs
Learning Guide and Examples: Information Theory and Coding
Prerequisite courses: Continuous Mathematics, Probability, Discrete Mathematics Overview and Historical Origins: Foundations and Uncertainty. Why the movements and transformations of information,
Massachusetts Institute of Technology 6.042J/18.062J, Fall 02: Mathematics for Computer Science Professor Albert Meyer and Dr. Radhika Nagpal
Course Notes 8 October 21
revised October 24, 2002, 874 minutes
Basic Counting, Pigeonholing, Permutations
1 Counting
Mu Alpha Theta Calculus Review
1
Introduction
This is a very brief review of AP Calculus for the purposes of doing well at the Mu Alpha Theta State competition. It is by no means a way to learn calculus, and it does not go over basic facts that the reader
S-72.2410 Information Theory Haanpää & Linjaaho
Homework 1, solutions 2009
Deadline: November 9th, 16:00. The box for returning exercises is in the E-wing, 2nd floor corridor. 1. Let p(x, y) be given by
         Y = 0   Y = 1
X = 0     1/3     1/6
X = 1     1/2      0

Find
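The requested entropies and mutual information can be checked numerically. A sketch (not the course's posted solution; it assumes the four masses are placed as p(0,0)=1/3, p(0,1)=1/6, p(1,0)=1/2, p(1,1)=0, which is how the extracted table reads):

```python
import math

# Assumed joint pmf p(x, y); the four masses 1/3, 1/6, 1/2, 0 sum to 1.
p = {(0, 0): 1/3, (0, 1): 1/6, (1, 0): 1/2, (1, 1): 0.0}

def H(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

px = [p[(0, 0)] + p[(0, 1)], p[(1, 0)] + p[(1, 1)]]   # marginal of X
py = [p[(0, 0)] + p[(1, 0)], p[(0, 1)] + p[(1, 1)]]   # marginal of Y

HX, HY, HXY = H(px), H(py), H(p.values())
H_X_given_Y = HXY - HY        # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = HX + HY - HXY          # mutual information I(X;Y)
print(HX, HY, HXY, I_XY)
```

With this placement the X-marginal is uniform, so H(X) = 1 bit exactly, and I(X; Y) comes out strictly positive since the table does not factor.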
S-72.2410 Information Theory Haanpää & Linjaaho
Homework 2, solutions 2009
Deadline: November 16th, 16:00. The box for returning exercises is in the E-wing, 2nd floor corridor. 1. Let X and Y be two independent integer-valued random variables
S-72.2410 Information Theory Haanpää & Linjaaho
Homework 3, solutions 2009
Deadline: November 23rd, 16:00. The box for returning exercises is in the E-wing, 2nd floor corridor. 1. Apply the two Lempel-Ziv universal coding methods, LZ77
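For reference, the LZ78 half of such an exercise can be sketched as follows (an illustrative implementation, not the course solution; the trailing-partial-phrase corner case is sidestepped by choosing an input that parses exactly):

```python
def lz78_parse(s):
    """LZ78 incremental parsing: split s into phrases, each a previously
    seen phrase extended by one new symbol. Emits (phrase_index, symbol)."""
    index = {"": 0}      # phrase -> dictionary index (0 = empty phrase)
    output = []
    w = ""
    for c in s:
        if w + c in index:
            w += c                        # keep extending a known phrase
        else:
            output.append((index[w], c))  # emit pointer + innovation symbol
            index[w + c] = len(index)
            w = ""
    # assumes the input ends exactly at a phrase boundary (w == "" here)
    return output

def lz78_decode(pairs):
    """Invert the parse: rebuild each phrase from its pointer + symbol."""
    phrases = [""]
    for i, c in pairs:
        phrases.append(phrases[i] + c)
    return "".join(phrases[1:])

pairs = lz78_parse("1011010100010")
print(pairs)   # phrases: 1, 0, 11, 01, 010, 00, 10
```

Each emitted pair points at an earlier phrase and appends one new bit, which is exactly the dictionary-growth behavior the homework asks you to trace by hand.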
S-72.2410 Information Theory Haanpää & Linjaaho
Homework 5, solutions 2009
Deadline: December 7th, 16:00. The box for returning exercises is in the E-wing, 2nd floor corridor. 1. Let Y1 and Y2 be conditionally independent and conditionally identically distributed given X
Statistical Methods for Engineering. Topic 7: Moments of Random Variables. Group B
Area of Statistics and Operations Research. Licesio J. Rodríguez-Aragón. March 2010
Contents
EE 376A Prof. T. Weissman
Information Theory Thursday, January 21, 2010
Homework Set #2 (Due: Thursday, January 28, 2010) 1. Prove that (a) Data processing decreases entropy: If Y = f(X) then H(Y) ≤ H(X). [Hint: expand H(f(X), X) in two different ways.]
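Part (a) is easy to sanity-check numerically before proving it: a deterministic map merges probability mass, which can only lower entropy. A small sketch (the distribution and the map f are chosen arbitrarily for illustration, not taken from the problem set):

```python
import math
from collections import defaultdict

def H(pmf):
    """Entropy in bits of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

pX = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}   # H(X) = 1.75 bits
f = lambda x: x % 2                          # an arbitrary many-to-one map

pY = defaultdict(float)                      # push-forward distribution of Y = f(X)
for x, prob in pX.items():
    pY[f(x)] += prob

print(H(pX), H(pY))   # H(f(X)) <= H(X)
```

Merging the masses of x = 0, 2 and of x = 1, 3 leaves a coarser distribution, so the printed second value is strictly smaller here.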
Channel Capacity, Sampling Theory and Image, Video & Audio Compression exercises
all numbered exercises are by Cover and Thomas except where noted otherwise March 11, 2003
Exercise 8.2: Maximum likelihood decoding. A source produces independent, equally probable
I. PROBLEM 1
II. PROBLEM 2
III. PROBLEM 3
IV. PROBLEM 4
V. PROBLEM 5
VI. PROBLEM 6
A. Part a. The capacity of this channel is zero (equation (1) lost in extraction).
B. Part b. [Equations (2)–(4) lost in extraction.]
C. Part c. [Equations (5)–(9) and the surrounding derivation lost in extraction.]
EE 376A Prof. T. Weissman
Information Theory Feb 24, 2010
Homework Set #5 Solution 1. The Z-channel. The Z-channel has binary input and output alphabets and transition probabilities p(y | x) given by the following matrix:

    Q = ( 1    0
          1/2  1/2 ),    x, y ∈ {0, 1}
Find
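The capacity of this channel can be approximated by a brute-force search over input distributions. A sketch (not the posted solution; for this Z-channel with crossover 1/2 the known closed form is C = log2(5/4) ≈ 0.322 bits, which the search should reproduce):

```python
import math

# Transition matrix Q[x][y] of the Z-channel from the problem:
# input 0 -> output 0 always; input 1 -> 0 or 1 with probability 1/2 each.
Q = [[1.0, 0.0],
     [0.5, 0.5]]

def h(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def mutual_information(a):
    """I(X;Y) when P(X=1) = a, computed as H(Y) - H(Y|X)."""
    p_y1 = a * Q[1][1]                 # P(Y=1)
    return h(p_y1) - a * h(Q[1][1])    # H(Y) - sum_x p(x) H(Y | X=x)

# grid search over the input distribution
C = max(mutual_information(a / 10000) for a in range(10001))
print(C)   # ~ 0.3219, i.e. log2(5/4)
```

The maximizing input is not uniform (it is P(X=1) = 2/5), which is the usual point of this exercise.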
EE 376A Information Theory Prof. T. Cover
Handout #28 Tuesday, March 3, 2009
Solutions to Homework Set #7 1. Postprocessing the output. One is given a communication channel with transition probabilities p(y | x) and channel capacity C = max I(X; Y). A he
ECE 534: Elements of Information Theory, Fall 2010 Homework 7 Solutions all by Kenneth Palacio Baus October 24, 2010 1. Problem 7.23. Binary multiplier channel
(a) Consider the channel Y = XZ, where X and Z are independent binary random variables that take
EE 376A/Stat 376A Information Theory Prof. T. Cover Solutions prepared by William Wu
Handout #10 Tuesday, January 27, 2009
Homework Set #2 Solutions 1. Entropy and pairwise independence. Let X, Y, Z be three binary Bernoulli(1/2) random variables that are pairwise independent
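The canonical construction behind this problem can be verified by enumeration. A sketch (assuming the standard setup X, Y i.i.d. Bernoulli(1/2) with Z = X ⊕ Y, the textbook example of pairwise but not mutual independence; the helper P is mine):

```python
from itertools import product

# X, Y i.i.d. Bernoulli(1/2), Z = X xor Y: pairwise independent triple.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def P(**cond):
    """Probability of the event {x=..., y=..., z=...} for the given coordinates."""
    return sum(p for (x, y, z), p in joint.items()
               if all({'x': x, 'y': y, 'z': z}[k] == v for k, v in cond.items()))

# every pair of variables is independent ...
for a, b in product((0, 1), repeat=2):
    assert abs(P(x=a, y=b) - P(x=a) * P(y=b)) < 1e-12
    assert abs(P(x=a, z=b) - P(x=a) * P(z=b)) < 1e-12
    assert abs(P(y=a, z=b) - P(y=a) * P(z=b)) < 1e-12

# ... but the triple is not mutually independent:
print(P(x=0, y=0, z=1), P(x=0) * P(y=0) * P(z=1))   # 0.0 vs 0.125
```

Any two of the variables look like independent fair coins, yet the third is a deterministic function of the other two.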
EE 376A Information Theory Prof. T. Cover
Handout #30 Thursday, March 12, 2009
Solutions to Homework Set #8
1. Source and channel. We wish to encode a Bernoulli() process V1, V2, . . . for transmission over a binary symmetric channel with error probability
Entropy and Information Theory
July 16, 2009
Entropy and Information Theory
Robert M. Gray Information Systems Laboratory Electrical Engineering Department Stanford University
Springer-Verlag New York
© 1990 by Springer-Verlag. Revised 2000, 2007, 2
S-72.2410 Information Theory Haanpää & Linjaaho
Homework 4, solutions 2009
Deadline: November 30th, 16:00. The box for returning exercises is in the E-wing, 2nd floor corridor. 1. Consider a channel with additive noise,

    Y = X + Z    (Z independent of X),

X = {0, 1, 2, 3}, w
EE 376A/Stat 376A Prof. T. Weissman
Information Theory Friday, March 17, 2006
Solutions to Practice Final Problems
These problems are sampled from a couple of the actual finals from previous years. 1. (20 points) Errors and erasures. Consider a binary symmetr
University of Illinois at Chicago Department of Electrical and Computer Engineering
ECE 534: Information Theory
Fall 2009 Midterm 1  Solutions
NAME:
This exam has 4 questions, each of which is worth 15 points. You will be given the full class time: 75 minutes