332:542
Information Theory and Coding March 24, 2005
Examination 1
This is an 80 minute exam. You may have an additional 100 minutes to answer the following questions in the notebooks provided. The exam is closed book. Make sure that you have included your

Chapter 8
Channel Capacity
1. Preprocessing the output. One is given a communication channel with transition prob-
abilities p(y | x) and channel capacity C = max_{p(x)} I(X; Y). A helpful statistician
preprocesses the output by forming Ỹ = g(Y). He claims that this will strictly improve
the
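Though the fragment cuts off here, the statistician's claim can be sanity-checked numerically: by the data-processing inequality, deterministic preprocessing Ỹ = g(Y) can never increase I(X; Y). A small Python sketch (the ternary-output channel below is an illustrative choice, not from the exam):

```python
from math import log2

def mutual_info(joint):
    """I(X;Y) in bits from a joint distribution joint[x][y] = p(x, y)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

# Illustrative channel: uniform input p(x) = 1/2, transition rows p(y|x).
trans = [[0.7, 0.2, 0.1],
         [0.1, 0.2, 0.7]]
joint = [[0.5 * p for p in row] for row in trans]

# The statistician's preprocessing g merges output symbols 1 and 2.
merged = [[row[0], row[1] + row[2]] for row in joint]

# Data-processing inequality: I(X; g(Y)) <= I(X; Y).
print(mutual_info(joint), mutual_info(merged))
```

Merging output symbols here strictly loses information, confirming that preprocessing cannot strictly improve capacity.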

332:542
Information Theory and Coding March 8, 2002 SOLUTION
Examination 1
This is an 80 minute exam. You may have an additional 100 minutes to answer the following five questions in the notebooks provided. You are permitted 2 double-sided sheets of notes. M

Take Home Exam 2 EECE 7252/8252
April 11, 2013
This exam is an open book, open note exam. You may NOT receive help
from another person either by direct contact or via electronic means (phone,
text, email, chat, newsgroup, etc.). Each test response is to

function [IXY,HX,HY]=Entropies(X,Y,nbins)
% [IXY,HX,HY]=Entropies(X,Y,nbins)
% This function computes the entropy of images X and Y and the mutual
% information between them. X and Y must be of the same size. nbins is the
% number of bins in the 1D histograms
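The MATLAB header above can be sketched in Python as follows. The return names mirror the MATLAB signature (IXY, HX, HY); the uniform-width binning is an assumption, since the MATLAB function body is not shown.

```python
from math import log2

def entropies(X, Y, nbins):
    """Entropy of X and Y and their mutual information, in bits,
    estimated from 1D and 2D histograms (mirrors the MATLAB header)."""
    assert len(X) == len(Y), 'X and Y must be of the same size'
    xlo, xhi, ylo, yhi = min(X), max(X), min(Y), max(Y)

    def bin_index(v, lo, hi):
        # Map v into one of nbins equal-width bins on [lo, hi].
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * nbins), nbins - 1)

    n = len(X)
    joint = {}
    for x, y in zip(X, Y):
        key = (bin_index(x, xlo, xhi), bin_index(y, ylo, yhi))
        joint[key] = joint.get(key, 0.0) + 1.0 / n

    px, py = {}, {}
    for (i, j), p in joint.items():
        px[i] = px.get(i, 0.0) + p
        py[j] = py.get(j, 0.0) + p

    HX = -sum(p * log2(p) for p in px.values())
    HY = -sum(p * log2(p) for p in py.values())
    HXY = -sum(p * log2(p) for p in joint.values())
    return HX + HY - HXY, HX, HY
```

For identical inputs the mutual information equals the entropy, I(X;X) = H(X), which gives a quick correctness check.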

% This MATLAB script is written by Md. Iftekhar Tanveer (mtanveer@memphis.edu)
% Any unauthorized copy or modification is prohibited
%
%
clc;clear;
% * Parameter Space *
% files must be located in current folder
FileNames = {'lena-std','NileBend'};
for

function Img=gnoise(Sz,Scale)
% Img=gnoise(Sz,Scale)
% Computes a zero mean, unit variance, correlated Gaussian noise image.
% Sz - size of image in pixels (either scalar or 2D vector)
% Scale - related to correlation length of image. Typical values 1-1
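Since the MATLAB body is not shown, here is a Python sketch of what such a generator might look like: white Gaussian noise is smoothed to introduce spatial correlation, then renormalized to zero mean and unit variance. The box filter and wraparound boundary handling are assumptions.

```python
import random
from math import sqrt

def gnoise(sz, scale, seed=0):
    """Zero-mean, unit-variance, spatially correlated Gaussian noise
    image. Correlation is introduced by averaging white noise over a
    (2*scale+1)-wide neighborhood (wraparound at the edges)."""
    rng = random.Random(seed)
    img = [[rng.gauss(0, 1) for _ in range(sz)] for _ in range(sz)]
    # Smooth: box-average each pixel over its neighborhood.
    sm = [[0.0] * sz for _ in range(sz)]
    for i in range(sz):
        for j in range(sz):
            vals = [img[(i + di) % sz][(j + dj) % sz]
                    for di in range(-scale, scale + 1)
                    for dj in range(-scale, scale + 1)]
            sm[i][j] = sum(vals) / len(vals)
    # Renormalize to zero mean, unit variance.
    flat = [v for row in sm for v in row]
    mu = sum(flat) / len(flat)
    var = sum((v - mu) ** 2 for v in flat) / len(flat)
    sd = sqrt(var)
    return [[(v - mu) / sd for v in row] for row in sm]
```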

Application of Information Theory on Determining
Interesting Area in Images
Md. Iftekhar Tanveer
Department of Electrical and Computer Engineering
The University of Memphis
Memphis, TN
ABSTRACT
Computation of Visual Salience has important applications in

An Additive Exponential Noise Channel with a
Transmission Deadline
YiLin Tsai, Student Member, IEEE, Christopher Rose, Fellow, IEEE, Ruochen Song, Student Member, IEEE
and I. Saira Mian
Abstract: We derive the maximum mutual information for
an additive exponential noise channel

An Additive Exponential Noise Channel with a
Transmission Deadline
YiLin Tsai1
Christopher Rose1
Ruochen Song1
I. Saira Mian2
1Rutgers University, WINLAB
2Lawrence Berkeley National Labs
International Symposium on Information Theory
August 2011, St. Petersburg

Chapter 10
The Gaussian Channel
1. A mutual information game. Consider the following channel:
Y = X + Z.
Throughout this problem we shall constrain the signal power
EX = 0, EX^2 = P, (10.1)
and the noise power
EZ = 0, EZ^2 = N, (10.2)
and assume that X and Z are independent. The channe
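For orientation, the standard resolution of this game under the power constraints above: Gaussian noise is the worst-case noise and a Gaussian input is the best input, and the two orders of optimization agree at the saddle point

```latex
\max_{p(x)} \min_{p(z)} I(X; X+Z)
  \;=\; \min_{p(z)} \max_{p(x)} I(X; X+Z)
  \;=\; \frac{1}{2}\log\left(1 + \frac{P}{N}\right),
```

attained by X ~ N(0, P) and Z ~ N(0, N).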

Chapter 9
Differential Entropy
1. Differential entropy. Evaluate the differential entropy h(X) = -∫ f ln f for the fol-
lowing:
(a) The exponential density, f(x) = λe^{-λx}, x ≥ 0.
(b) The Laplace density, f(x) = (λ/2) e^{-λ|x|}.
(c) The sum of X1 and X2, w
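For part (a) the closed form is h(X) = 1 - ln λ nats, which a quick numerical midpoint-rule check in Python confirms (λ = 2 chosen arbitrarily):

```python
from math import log, exp

def h_exponential(lam, upper=20.0, n=100000):
    """Midpoint-rule estimate of h(X) = -integral of f ln f (nats)
    for the exponential density f(x) = lam * exp(-lam * x), x >= 0."""
    dx = upper / n
    total = 0.0
    for k in range(n):
        x = (k + 0.5) * dx
        f = lam * exp(-lam * x)
        total -= f * log(f) * dx
    return total

# Closed form for comparison: h(X) = 1 - ln(lam) nats.
print(h_exponential(2.0), 1 - log(2.0))
```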

Project Proposal: EECE 8252
Application of Information Theory on Determining Interesting Area
in Images
Md. Iftekhar Tanveer
The
advent
of
smartphones incorporated with camera, smart sensors and computational capabilities has
opened up new opportunities f

16:332:542
Information Theory and Coding May 5, 2005
Final Examination
This is a 180 minute exam. Please answer the following questions in the notebooks provided. This is a closed book test. Make sure that you have included your name, personal 4 digit code

16:332:542
Information Theory and Coding May 8, 2002 SOLUTION
Final Examination
This is a 180 minute exam. Please answer the following four questions in the notebooks provided. You are permitted to look at the Cover & Thomas text but not other materials.

11/15/14
Application of Information Theory on
Determining
Interesting Area in Images
Md. Iftekhar Tanveer
Outline
Intro to Visual Salience
A method of calculating salience map
Our Definition of Salience
Why our version is more generalized

Final Exam EECE 7252/8252
May 1, 2013
This exam is an open book, open note exam. You may NOT receive help
from another person either by direct contact or via electronic means (phone,
text, email, chat, newsgroup, etc.). Each test response is to be the re

Take Home Exam 1 EECE 7252/8252
February 14, 2013
This exam is an open book, open note exam. You may NOT receive help
from another person either by direct contact or via electronic means (phone,
text, email, chat, newsgroup, etc.). Each test response is

EC 421 STATISTICAL
COMMUNICATION THEORY
Instructor: Dr. Heba A. Shaban
Lecture # 6
GAUSSIAN PROCESS
1. Gaussian pdf (probability density function)
2. Bi-variate pdf (jointly Gaussian)
Correlation coefficient: -1 < ρ < 1
3. N-variate pdf (jointly Gaussian)
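The constraint -1 < ρ < 1 on the correlation coefficient can be made concrete with the standard construction of a jointly Gaussian pair, sketched here in Python (the function name is illustrative, not from the lecture):

```python
import random
from math import sqrt

def correlated_gaussian_pair(rho, n, seed=0):
    """Draw n samples of a jointly Gaussian pair (X, Y) with zero means,
    unit variances, and correlation coefficient rho, via the standard
    construction Y = rho*X + sqrt(1 - rho^2)*W with W independent of X."""
    assert -1 < rho < 1
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0, 1)
        w = rng.gauss(0, 1)
        xs.append(x)
        ys.append(rho * x + sqrt(1 - rho ** 2) * w)
    return xs, ys
```

The sample correlation of a large draw should be close to the requested ρ.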

COMMUNICATION NETWORK.
NOISE CHARACTERISTICS OF A
CHANNEL
Communication Network
Consider a source of communication with
a given alphabet. The source is linked to
the receiver via a channel.
The system may be described by a joint
probability matrix:
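The joint probability matrix determines all the entropic quantities of the source-channel-receiver system. A minimal Python sketch (the matrix values are illustrative, not from the notes):

```python
from math import log2

# Hypothetical joint probability matrix P[x][y] = p(x, y) for a
# 2-symbol source observed through a noisy channel.
P = [[0.40, 0.10],
     [0.05, 0.45]]

px = [sum(row) for row in P]        # source marginal p(x)
py = [sum(col) for col in zip(*P)]  # receiver marginal p(y)

# Conditional entropy H(Y|X) = -sum p(x,y) log2 p(y|x): the noise the
# channel adds between source and receiver.
HY_given_X = -sum(p * log2(p / px[i])
                  for i, row in enumerate(P)
                  for p in row if p > 0)
print(px, py, HY_given_X)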

Basic Concepts of
Information Theory
Entropy for Two-dimensional Discrete Finite
Probability Schemes.
Conditional Entropy.
Communication Network.
Noise Characteristics of a Communication
Channel.
Entropy. Basic Properties
Continuity: if the probabiliti

Mutual Information for Image
Registration and Feature Selection
M. Farmer
CSE-902
Problem Definitions
Image Registration:
Define a transform T that will map one image onto
another image of the same object such that some image
quality criterion is maximized.

Exercise Problems: Information Theory and Coding
Prerequisite courses: Mathematical Methods for CS; Probability
Overview and Historical Origins: Foundations and Uncertainty. Why the movements and
transformations of information, just like those of a fluid, a