I. Introduction
Information, Signals and Systems
Signal processing is primarily concerned with signals and with systems that operate on signals to extract useful information. In this course our concept of a signal …
The z-transform
We introduced the z-transform before as

$$H(z) = \sum_{k=-\infty}^{\infty} h[k] z^{-k}$$

where z is a complex number. When H(z) exists (the sum converges), it can be interpreted as the response of an LSI system …
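As a quick numerical check (the signal here is an assumed example, not one from the notes), we can verify the definition above for the causal signal h[k] = aᵏ, k ≥ 0, whose z-transform converges to 1/(1 − a z⁻¹) whenever |z| > |a|:

```python
import numpy as np

# Sketch (assumed example): evaluate the z-transform sum
#   H(z) = sum_k h[k] z^{-k}   for h[k] = a^k, k >= 0.
# For |z| > |a| the geometric series converges to 1 / (1 - a z^{-1}).

a = 0.5
z = 1.2 * np.exp(1j * 0.3)           # a test point with |z| > |a|
k = np.arange(200)                   # truncate the (convergent) infinite sum
H_sum = np.sum(a**k * z**(-k))
H_closed = 1.0 / (1.0 - a / z)

print(np.allclose(H_sum, H_closed))  # partial sum matches the closed form
```

Evaluating at a point with |z| ≤ |a| would show the partial sums failing to settle, which is exactly the ROC issue revisited later in these notes.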
Fourier Representations
Throughout the course we have been alluding to various Fourier representations.
We first recall the appropriate transforms:
Continuous-Time Fourier Series (CTFS) — x(t): continuous-time, finite/periodic …
Inner Product Spaces
Where normed vector spaces incorporate the concept of length into a vector
space, inner product spaces incorporate the concept of angle.
Definition 1. Let V be a vector space over K …
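To make the "concept of angle" concrete (a minimal sketch, assuming V = R³ with the standard dot product), the inner product defines the angle between two vectors via cos θ = ⟨x, y⟩ / (‖x‖‖y‖):

```python
import numpy as np

# Sketch (assumption: V = R^3 with the standard dot product).
# The inner product induces an angle between vectors:
#   cos(theta) = <x, y> / (||x|| * ||y||)

x = np.array([1.0, 0.0, 0.0])
y = np.array([1.0, 1.0, 0.0])
cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.arccos(cos_theta)
print(np.isclose(theta, np.pi / 4))  # x and y meet at 45 degrees
```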
Vector Spaces
Metric spaces impose no requirements on the structure of the set M . We
will now consider more structured M , beginning by generalizing the familiar
concept of a vector.
Definition 1. Let …
II. Signal Representations in Vector Spaces
We will view signals as elements of certain mathematical spaces. The spaces
have a common structure, so it will be useful to think of them in the abstract.
Linear Operators
Definition 1. A transformation (mapping) L : X → Y from a vector space X to a vector space Y (with the same scalar field K) is a linear transformation if:
1. L(αx) = αL(x) for all x ∈ X, α ∈ K
2. L(x₁ + x₂) = L(x₁) + L(x₂) for all x₁, x₂ ∈ X
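The two linearity conditions are easy to check numerically for the prototypical linear map L(x) = Ax (the matrix and vectors below are assumed for illustration):

```python
import numpy as np

# Sketch (assumed example): any matrix A defines L(x) = A x.
# We verify the two linearity conditions numerically.

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))      # L : R^4 -> R^3
x1 = rng.standard_normal(4)
x2 = rng.standard_normal(4)
alpha = 2.5

L = lambda x: A @ x
homogeneous = np.allclose(L(alpha * x1), alpha * L(x1))   # condition 1
additive = np.allclose(L(x1 + x2), L(x1) + L(x2))         # condition 2
print(homogeneous and additive)
```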
Examples:
1. Suppose V = {piecewise constant functions on [0, 1/4), [1/4, 1/2), [1/2, 3/4), [3/4, 1]}. An example of such a function is illustrated below.

[Figure: a piecewise constant function f(t) on [0, 1], with breakpoints at t = 1/4, 1/2, 3/4.]

Consider t …
Orthobasis Expansions
Suppose that $\{v_j\}_{j=1}^{N}$ is a finite-dimensional orthobasis. In this case we have

$$x = \sum_{j=1}^{N} \langle x, v_j \rangle v_j.$$

But what if x ∈ span({v_j}) = V already? Then we simply have $x = \sum_{j=1}^{N} \langle x, v_j \rangle v_j$ …
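The expansion above can be verified numerically (a sketch, with an orthobasis manufactured via QR factorization rather than taken from the notes): when x lies in the span of the orthobasis, synthesizing from the coefficients ⟨x, vⱼ⟩ recovers x exactly.

```python
import numpy as np

# Sketch (assumed construction): build an orthobasis for R^N via QR,
# then verify the expansion  x = sum_j <x, v_j> v_j.

rng = np.random.default_rng(1)
N = 8
V, _ = np.linalg.qr(rng.standard_normal((N, N)))  # columns = orthobasis
x = rng.standard_normal(N)

coeffs = V.T @ x                  # analysis:  <x, v_j> for each j
x_rebuilt = V @ coeffs            # synthesis: sum_j <x, v_j> v_j
print(np.allclose(x, x_rebuilt))  # exact recovery since x is in span(V)
```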
III. Representation and Analysis of Systems
Linear systems
In this course we will focus much of our attention on linear systems. When our input and output signals are vectors, then the system is a linear …
Hilbert Spaces in Signal Processing
What makes Hilbert spaces so useful in signal processing? In modern signal
processing, we often represent a signal as a point in high-dimensional space.
Hilbert spaces …
Approximation in ℓp norms
So far, our approximation problem has been posed in an inner product space, and we have thus measured our approximation error using norms that are induced by an inner product.
Poles and zeros
Suppose that X(z) is a rational function, i.e.,

$$X(z) = \frac{P(z)}{Q(z)}$$

where P(z) and Q(z) are both polynomials in z. The roots of P(z) and Q(z) are very important.
Definition 1. …
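The roots of P(z) are the zeros of X(z) and the roots of Q(z) are its poles, and both are easy to compute from the polynomial coefficients (the example coefficients below are assumptions for illustration):

```python
import numpy as np

# Sketch (assumed example): zeros = roots of the numerator P(z),
# poles = roots of the denominator Q(z).

P = [1.0, -1.0]          # P(z) = z - 1            -> zero at z = 1
Q = [1.0, -1.5, 0.5]     # Q(z) = z^2 - 1.5 z + 0.5 = (z - 1)(z - 0.5)
zeros = np.roots(P)
poles = np.sort(np.roots(Q))
print(np.allclose(zeros, [1.0]) and np.allclose(poles, [0.5, 1.0]))
```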
Discrete-time systems
We begin with the simplest of discrete-time systems, where X = Cᴺ and Y = Cᴹ. In this case a linear operator is just an M × N matrix. We can generalize this concept by letting M and N …
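For a concrete instance of "linear operator as matrix" (a sketch with an assumed FIR impulse response), convolution with h is multiplication by a Toeplitz matrix whose columns are shifted copies of h:

```python
import numpy as np

# Sketch (assumed example): an LSI system with impulse response h maps
# x in R^N to y in R^M, M = N + len(h) - 1, via a Toeplitz matrix H.

h = np.array([1.0, 0.5, 0.25])
N = 5
M = N + len(h) - 1
H = np.zeros((M, N))
for j in range(N):                 # column j is h shifted down by j
    H[j:j + len(h), j] = h

x = np.arange(1.0, N + 1.0)
print(np.allclose(H @ x, np.convolve(h, x)))  # matrix form == convolution
```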
Stability, causality, and the z-transform
In going from

$$\sum_{k=0}^{N} a_k y[n-k] = \sum_{k=0}^{M} b_k x[n-k]$$

to

$$H(z) = \frac{Y(z)}{X(z)}$$

we did not specify an ROC. If we factor H(z), we can plot the poles and zeros …
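One standard consequence of the pole plot, sketched below with assumed coefficients: if we take the causal ROC (outside the outermost pole), the system is BIBO stable exactly when every pole lies strictly inside the unit circle.

```python
import numpy as np

# Sketch (assumed coefficients): for the causal system
#   y[n] - 0.9 y[n-1] = x[n],
# stability <=> all poles of H(z) satisfy |pole| < 1.

a = [1.0, -0.9]                       # denominator A(z) coefficients
poles = np.roots(a)                   # single pole at z = 0.9
stable = bool(np.all(np.abs(poles) < 1.0))
print(stable)                         # causal system is BIBO stable
```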
The DTFT as an eigenbasis
We saw Parseval/Plancherel in the context of orthonormal basis expansions. This raises the question: do F and F⁻¹ just take signals and compute their representation in another …
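The "eigenbasis" view can be checked directly (a sketch, with an assumed FIR impulse response): filtering the complex exponential e^{jωn} only scales it by the DTFT value H(e^{jω}), once the transient has passed.

```python
import numpy as np

# Sketch (assumed FIR example): complex exponentials are eigenfunctions
# of LSI systems -- convolving e^{j w n} with h scales it by H(e^{j w}).

h = np.array([1.0, -0.5, 0.25])
w = 0.7
n = np.arange(64)
x = np.exp(1j * w * n)

Hw = np.sum(h * np.exp(-1j * w * np.arange(len(h))))   # DTFT of h at w
y = np.convolve(h, x)                                  # full convolution

# After the transient (first len(h)-1 samples), y[n] = H(e^{jw}) x[n].
print(np.allclose(y[len(h) - 1:len(x)], Hw * x[len(h) - 1:]))
```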