Continuous and Discrete Time Fourier Series
S R M Prasanna, EEE, IIT Guwahati
September 5, 2014
Fourier Series
1 Overview
Introduction
Response of LTI to Complex Exponentials
Continuous time Fourier series (CTFS)
Dirichlet's Conditions
Properties of CTFS
Differential and Difference Equation Representation of LTI Systems
S R M Prasanna, EEE, IIT Guwahati
August 29, 2014
DE Repn of LTI System
1 Overview
What is DE Representation of LTI System?
Why DE Representation of LTI System?
DE Repn of CT LTI System
LTI System and its Properties
S R M Prasanna, EEE, IIT Guwahati
August 28, 2014
System
1 Overview
Why LTI?
Essential signal and its property
Essential system properties
Convolution integral
Convolution sum
Properties of LTI systems
Summary
Elementary Signals and their Characteristics
S R M Prasanna, EEE, IIT Guwahati
August 7, 2014
Signals
1 Overview
Even and odd decomposition
Energy and power calculations
Periodicity
Time shift and Phase change
Cosine and exponential
Delta and step
Laplace Transform
S R M Prasanna, EEE, IIT Guwahati
November 19, 2014
Laplace Transform
1 Overview
Introduction
Laplace transform
LT and CTFT
ROC
Properties of Laplace transform
Inverse LT
Summary
S R M Prasanna, EEE, IIT Guwahati  [email protected]
Continuous and Discrete Time Fourier Transform
S R M Prasanna, EEE, IIT Guwahati
October 23, 2014
Fourier Transform
1 Overview
Introduction
Continuous time Fourier transform (CTFT)
Properties of CTFT
Periodic signals using CTFT
Discrete time Fourier transform (DTFT)
Sampling Process
S R M Prasanna, EEE, IIT Guwahati
October 23, 2014
Sampling
1 Overview
Introduction to sampling
Sampling in time domain
Aliasing in time domain
Sampling theorem
Sampling in frequency domain
Aliasing in frequency domain
Signal reconstruction
z Transform
S R M Prasanna, EEE, IIT Guwahati
November 14, 2014
z Transform
1 Overview
Introduction
z transform
ZT and DTFT
ROC
Properties of z transform
Inverse ZT
Summary
Frequency Domain Analysis of LTI Systems
S R M Prasanna, EEE, IIT Guwahati
November 19, 2014
FDA of LTI Systems
1 Overview
Introduction
FDA using Fourier repn
FDA using LT and ZT
FDA of LTI using DE repn
FDA of LTI using DE repn + initial conds
Use of
System and Basic Properties
S R M Prasanna, EEE, IIT Guwahati
August 13, 2014
System
1 Overview
Definition and Interconnection
Memory property
Invertibility
Causality
Stability
Time Invariance
Linearity
Summary
Signals: Definition, Classification and Operations
S R M Prasanna, EEE, IIT Guwahati
August 1, 2014
Signals
1 Course Overview
Definition
Examples
Classification
Operations
Summary
LTI System and its Properties
S R M Prasanna, EEE, IIT Guwahati
August 16, 2014
System
1 Overview
Why LTI?
Essential signal and its property
Essential system properties
Convolution integral
Convolution sum
Properties of LTI systems
Summary
EE220 Signals, Systems and Networks
S R M Prasanna, EEE, IIT Guwahati
August 1, 2014
EE220 SSN
1 Course Overview
About EE 220
Topics & Syllabus
Textbooks & References
Additional Material
Course Evaluation
Programming assignments
Significance of Course
Non-parametric Techniques
Parzen Windows
Nearest Neighbor classifier

Generative classifiers
Generative models assume the data come from a probability density function. Parametric learning assumes we know the form of the underlying density.
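As a minimal illustration of the parametric route (data and numbers made up here): assume the density has a Gaussian form and estimate its two parameters, mean and variance, by maximum likelihood.

```python
# Parametric learning sketch: assume a Gaussian density and fit its
# parameters (mean, variance) by maximum likelihood from samples.
# The data below are illustrative only.

def fit_gaussian(samples):
    """ML estimates: sample mean and (biased) sample variance."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

data = [4.8, 5.1, 5.0, 4.9, 5.2]
mu, var = fit_gaussian(data)
```

Once the two parameters are estimated, the full density is known, which is exactly what the parametric assumption buys.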
Bayesian Nets

Introduction
Many times, the only knowledge we have about a distribution is which variables are or are not dependent. Such dependencies can be represented efficiently using a Bayesian Belief Network (or Belief Net or Bayesian Net).
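A minimal sketch of the idea, using a hypothetical chain A → B → C (all probability values invented): the joint factorizes into small conditional tables, so we never store the full joint explicitly.

```python
# Bayesian-net sketch for a chain A -> B -> C:
# P(a, b, c) = P(a) * P(b|a) * P(c|b).
# All tables below are made-up numbers for illustration.

P_a = {True: 0.3, False: 0.7}
P_b_given_a = {True: {True: 0.9, False: 0.1},    # P(b | a)
               False: {True: 0.2, False: 0.8}}
P_c_given_b = {True: {True: 0.5, False: 0.5},    # P(c | b)
               False: {True: 0.1, False: 0.9}}

def joint(a, b, c):
    # The factored joint, built from the local conditional tables.
    return P_a[a] * P_b_given_a[a][b] * P_c_given_b[b][c]

# Sanity check: summing the factored joint over all assignments gives 1.
total = sum(joint(a, b, c)
            for a in (True, False)
            for b in (True, False)
            for c in (True, False))
```

For three binary variables this saves little, but for many variables the factored tables grow linearly where the full joint grows exponentially.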
Ratnajit Bhattacharjee
IIT Guwahati
Antennas: Definition of some basic parameters
Frequency bands
Antenna modeling: mathematical formulations
Common mobile handset antennas
Design issues
Antenna performance evaluation
SAR issues
Adaptive antennas
Introduction to Support Vector Machines (SVM)

Introduction
SVMs provide a learning technique for
Pattern Recognition
Regression Estimation
The solution provided by SVM is
Theoretically elegant
Computationally efficient
Very effective in many large practical problems
Assignment
Submission date: 09-02-2016
Q#1: When a bird perches on a high-voltage dc power line and then flies away, it does so carrying a net charge. For the purpose of measuring this net charge Q carried by the bird, the apparatus used is shown in the a
Fisher's Discriminant

Fisher's Discriminant for the two-category case

Sample mean for class $i$:
$$m_i = \frac{1}{n_i} \sum_{x \in D_i} x, \qquad i = 1, 2$$

After the projection $y = w^T x$, the sample mean of the projected points for class $i$:
$$\tilde{m}_i = \frac{1}{n_i} \sum_{y \in Y_i} y = \frac{1}{n_i} \sum_{x \in D_i} w^T x = w^T m_i$$

Sample scatter / covariance
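The projected-mean identity above can be checked numerically. A minimal sketch with made-up 2-D samples and a made-up direction w: the mean of the projections equals the projection of the mean.

```python
# Check that w^T m_i equals the mean of the projected samples.
# Toy class-1 data and direction w; all numbers are illustrative.

def mean(vectors):
    n = len(vectors)
    return [sum(v[k] for v in vectors) / n for k in range(len(vectors[0]))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

D1 = [[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]   # class-1 samples
w = [0.6, 0.8]                               # projection direction

m1 = mean(D1)                                # sample mean m_1
proj_of_mean = dot(w, m1)                    # w^T m_1
mean_of_proj = sum(dot(w, x) for x in D1) / len(D1)
```

This linearity is what lets Fisher's criterion be written purely in terms of w and the class means and scatter matrices.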
Linear Discriminant Functions

Empirical Risk Minimization
Every classifier / regressor performs what is called 'empirical risk minimization'. Learning pertains to coming up with an architecture that can minimize a risk / loss function defined on the training data.
K Means algorithm

K Means for image segmentation
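The algorithm named above can be sketched in a few lines: alternate an assignment step (each point to its nearest mean) and an update step (each mean to the average of its cluster) until the means stop moving. Shown here on scalar pixel intensities, the image-segmentation use; all values are made up.

```python
# Minimal k-means sketch on 1-D pixel intensities (illustrative data).

def kmeans(data, means, max_iter=100):
    for _ in range(max_iter):
        # Assignment step: each point goes to its nearest mean.
        clusters = [[] for _ in means]
        for x in data:
            i = min(range(len(means)), key=lambda j: abs(x - means[j]))
            clusters[i].append(x)
        # Update step: each mean becomes the average of its cluster.
        new_means = [sum(c) / len(c) if c else m
                     for c, m in zip(clusters, means)]
        if new_means == means:      # converged: means stopped moving
            break
        means = new_means
    return means

intensities = [12, 15, 14, 200, 205, 198, 90, 95]   # toy grayscale pixels
levels = kmeans(intensities, [0.0, 128.0, 255.0])
```

For segmentation, each pixel is then replaced by (or labeled with) its cluster mean, quantizing the image into k intensity levels.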
Principal Component Analysis

Review of eigenvalues and eigenvectors
Orthogonal Matrix
Principal Component Analysis: decorrelation property
PCA is also called the Karhunen-Loève (KL) Transform (in image processing). The transformed coefficients after p
Practice Problem Set 2
1. Consider the application of the 3-means clustering algorithm to the one-dimensional data set D (of size 6), D = {0, 1, 5, 8, 14, 16}. The algorithm is initialized with the three cluster means: μ1 = 2, μ2 = 6 and μ3 = 9.
(i) What ar
Pattern Recognition and Machine Learning: Assignments
You are expected to upload a soft copy of the code and a report highlighting your inferences on the moodle page.
Plagiarism of any kind will result in a failing grade in the course. This also includ
Parameter estimation for regression

Regression: flashback

ML estimation for the regression problem
With $t$ the target vector and $\Phi$ the design matrix (so $y = \Phi w$):
$$E_D(w) = \frac{1}{2} \sum_{n=1}^{N} (t_n - y_n)^2 = \frac{1}{2} (t - \Phi w)^T (t - \Phi w)$$
$$= \frac{1}{2} \left[ (t - \Phi w)^T t - (t - \Phi w)^T \Phi w \right]$$
$$= \frac{1}{2} \left( t^T t - w^T \Phi^T t - t^T \Phi w + w^T \Phi^T \Phi w \right)$$
Using $w^T \Phi^T t = t^T \Phi w$,
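Setting the gradient of the expanded $E_D(w)$ to zero gives the normal equations $\Phi^T \Phi\, w = \Phi^T t$. A tiny pure-Python check for the line model $y_n = w_0 + w_1 x_n$, with data invented so the true answer is $w = (1, 2)$:

```python
# Solve the normal equations Phi^T Phi w = Phi^T t for Phi = [1, x].
# Data are made up so that t = 1 + 2x exactly.

xs = [0.0, 1.0, 2.0, 3.0]
ts = [1.0 + 2.0 * x for x in xs]
N = len(xs)

# Entries of Phi^T Phi and Phi^T t for the design matrix [1, x_n].
Sx = sum(xs)
Sxx = sum(x * x for x in xs)
St = sum(ts)
Sxt = sum(x * t for x, t in zip(xs, ts))

# Solve the resulting 2x2 linear system by Cramer's rule.
det = N * Sxx - Sx * Sx
w0 = (St * Sxx - Sxt * Sx) / det
w1 = (N * Sxt - Sx * St) / det
```

Because the targets lie exactly on a line, the least-squares minimizer recovers the generating coefficients and $E_D(w)$ is driven to zero.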
Gaussian Mixture Models

Mixture Models
$$p(x \mid \theta) = \sum_{k=1}^{K} p(x, c_k \mid \theta) = \sum_{k=1}^{K} p(x \mid c_k, \theta)\, p(c_k) = \sum_{k=1}^{K} \pi_k\, p(x \mid c_k, \theta_k)$$
[Figure: densities of two Gaussian components and of the resulting mixture model p(x)]
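The mixture density above can be evaluated directly once the component parameters are chosen; a minimal sketch with two made-up Gaussian components, checking that the weighted sum is still a valid density:

```python
# Evaluate p(x) = sum_k pi_k N(x | mu_k, sigma_k^2) for K = 2.
# Component weights, means, and sigmas are illustrative.
import math

def gauss(x, mu, sigma):
    # Univariate Gaussian density N(x | mu, sigma^2).
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

weights = [0.4, 0.6]          # pi_k, must sum to 1
means = [0.0, 5.0]            # mu_k
sigmas = [1.0, 2.0]           # sigma_k

def mixture_pdf(x):
    return sum(p * gauss(x, m, s) for p, m, s in zip(weights, means, sigmas))

# Coarse Riemann sum over [-20, 20]: the mixture integrates to ~1.
dx = 0.01
area = sum(mixture_pdf(-20 + i * dx) * dx for i in range(4000))
```

Because the weights $\pi_k$ sum to one and each component integrates to one, the mixture is itself a probability density, which is what the figure illustrates.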
Principal Component Analysis

Dimensionality reduction technique to alleviate the curse of dimensionality.
Concepts from projection in linear algebra.
We assume that the training samples have zero mean.
Define addition
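Under the zero-mean assumption stated above, the principal directions are the eigenvectors of the sample covariance matrix. A small sketch on made-up 2-D data, finding the dominant eigenvector by simple power iteration:

```python
# PCA sketch: top eigenvector of the sample covariance C = X^T X / n,
# found by power iteration. Toy 2-D data, all values illustrative.

X = [[2.0, 1.9], [-1.0, -1.1], [3.0, 3.2], [-4.0, -4.0]]
n = len(X)
mean = [sum(row[k] for row in X) / n for k in range(2)]
X = [[row[k] - mean[k] for k in range(2)] for row in X]  # enforce zero mean

# Sample covariance matrix (2x2).
C = [[sum(row[i] * row[j] for row in X) / n for j in range(2)]
     for i in range(2)]

# Power iteration converges to the dominant eigenvector of C.
v = [1.0, 0.0]
for _ in range(200):
    w = [C[0][0] * v[0] + C[0][1] * v[1],
         C[1][0] * v[0] + C[1][1] * v[1]]
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
    v = [w[0] / norm, w[1] / norm]

# Variance captured along v equals the top eigenvalue (v^T C v).
projected = [row[0] * v[0] + row[1] * v[1] for row in X]
top_eig = sum(p * p for p in projected) / n
```

Projecting onto the leading eigenvectors keeps the directions of largest variance, which is the dimensionality-reduction step PCA performs.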