Massachusetts Institute of Technology Department of Mechanical Engineering 2.160 Identification, Estimation, and Learning
Spring 2006
Problem Set No. 7
Out: April 24, 2006. Due: May 3, 2006.
Problem 1: Answer the following questions about power spectral density…
Problem Set No. 6
Out: April 12, 2006. Due: April 19, 2006.
Problem 3: An important step in training a radial-basis-function…
Problem Set No. 5
Out: March 15, 2006.
Problem 1: Consider a stable linear system, as shown below. Variable u(t) is the input…
Problem Set No. 4
Out: March 8, 2006. Due: March 15, 2006.
Problem 1: Consider a transfer operator function: H(q) = 1 − 1.1q…
Problem Set No. 3
Out: March 1, 2006. Due: March 8, 2006.
For the first two problems below, hand calculation is recommended…
Problem Set No. 2
Out: February 22, 2006. Due: March 1, 2006.
Problem 1: A stationary random process X(t) has a mean value of…
Problem Set No. 1
Out: February 13, 2006. Due: February 22, 2006.
Problem 1: Consider the following example of least squares…
Department of Mechanical Engineering Massachusetts Institute of Technology
2.160 Identification, Estimation, and Learning Mid-Term Examination
April 3, 2006, 1:00-3:00 pm. Closed book. Two sheets of notes are allowed. Show how you arrived at your answer. Pro…
2.160 System Identification, Estimation, and Learning
Lecture Notes No. 21
May 8, 2006
16. Information Theory of System Identification
16.1 Overview
Maximum Likelihood Estimate (MLE):
θ̂_ML(Z^N) = arg max_θ log L(θ)    (1)
Likelihood function:
L(θ) = f(…    (2)
Lecture Notes No. 20
May 3, 2006
15. Maximum Likelihood
15.1 Principle
Consider an unknown stochastic process.
Observed data from the unknown stochastic process:
y^N = (y1, y2, …, yN)
Assume that each observed…
Lecture Notes No. 18
April 26, 2006
13. Asymptotic Distribution of Parameter Estimates
13.1 Overview
If convergence is guaranteed, then θ̂_N → θ*. But how quickly does the estimate θ̂_N approach the limit…
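A Monte Carlo sketch of the question being asked (the sample-mean estimator and all numbers are assumptions for illustration): the spread of θ̂_N shrinks like 1/√N, so the scaled error √N(θ̂_N − θ*) keeps a roughly constant standard deviation.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_star = 1.5          # true parameter (illustrative)
sigma = 2.0               # observation noise standard deviation

def estimate(N):
    """Sample-mean estimate theta_hat_N from N noisy observations."""
    return rng.normal(theta_star, sigma, size=N).mean()

# The spread of theta_hat_N shrinks like 1/sqrt(N):
scaled = {}
for N in (100, 400, 1600):
    est = np.array([estimate(N) for _ in range(2000)])
    # std of sqrt(N) * (theta_hat_N - theta*) stays near sigma = 2.0
    scaled[N] = float(np.sqrt(N) * est.std())
    print(N, round(scaled[N], 2))
```

This is exactly the asymptotic-normality picture: √N(θ̂_N − θ*) converges in distribution to a zero-mean Gaussian whose covariance quantifies "how quickly" the estimate approaches its limit.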
Lecture Notes No. 17
April 24, 2006
12. Informative Experiments
12.1 Persistence of Excitation
Informative data sets are closely related to "persistence of excitation", an important concept used in ad…
Lecture Notes No. 16
April 19, 2006
11. Informative Data Sets and Consistency
11.1 Informative Data Sets
Predictor: ŷ(t|t−1) = H⁻¹(q)G(q)u(t) + [1 − H⁻¹(q)]y(t)
ŷ(t|t−1) = [Wu(…
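The one-step-ahead predictor can be made concrete for an assumed first-order ARX model (this example structure and its coefficients are illustrative, not from the notes): with G = B/A and H = 1/A, the predictor reduces to ŷ(t|t−1) = B(q)u(t) + [1 − A(q)]y(t).

```python
import numpy as np

# Assumed ARX example:  A(q) = 1 + a1 q^-1,  B(q) = b1 q^-1
# so  y(t) = -a1 y(t-1) + b1 u(t-1) + e(t)
a1, b1 = -0.7, 0.5

rng = np.random.default_rng(0)
N = 200
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a1 * y[t - 1] + b1 * u[t - 1] + e[t]

# Predictor y_hat(t|t-1) = B(q)u(t) + [1 - A(q)]y(t) = b1 u(t-1) - a1 y(t-1)
y_hat = np.zeros(N)
for t in range(1, N):
    y_hat[t] = b1 * u[t - 1] - a1 * y[t - 1]

# With the exact model, the prediction error is exactly the noise e(t)
assert np.allclose(y[1:] - y_hat[1:], e[1:])
```

The residual y(t) − ŷ(t|t−1) equals the innovation e(t) when the model is exact, which is the property consistency arguments build on.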
Lecture Notes No. 15
April 12, 2006
Part 3: System Identification
Perspective of System Identification Theory
[Figure: the true process S with input u(t), disturbance e(t), and output y(t)]
Experiment Design
Data Set Z^N = {u(t), y(t)}…
Model Set M
Lecture Notes No. 14
April 10, 2006
8.4 The Error Back Propagation Algorithm
The Multi-Layer Perceptron is a universal approximator that can approximate an arbitrary (measurable) function to…
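A minimal error back propagation sketch for a one-hidden-layer network (the toy task, layer sizes, learning rate, and iteration count are all illustrative assumptions): the output error is propagated backward through each layer to form the gradient, followed by a gradient-descent update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)

# One hidden layer of 20 tanh units with a linear output (assumed sizes)
W1 = 0.5 * rng.standard_normal((1, 20))
b1 = np.zeros(20)
W2 = 0.5 * rng.standard_normal((20, 1))
b2 = np.zeros(1)
lr = 0.05

for _ in range(5000):
    # Forward pass
    Hid = np.tanh(X @ W1 + b1)        # hidden-layer activations
    Y_hat = Hid @ W2 + b2             # network output
    err = Y_hat - Y                   # output-layer error

    # Backward pass: propagate the error back layer by layer
    dW2 = Hid.T @ err / len(X)
    db2 = err.mean(axis=0)
    dHid = (err @ W2.T) * (1.0 - Hid ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dHid / len(X)
    db1 = dHid.mean(axis=0)

    # Gradient-descent parameter updates
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(mse)  # small after training
```

The chain-rule structure (`dHid` computed from `err` through `W2`, then `dW1` from `dHid`) is the "back propagation" of the section title.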
Lecture Notes No. 13
March 22, 2006
8. Neural Networks
8.1 Physiological Background
Neurophysiology: the human brain has approximately 14 billion neurons, of about 50 different kinds: uniform… massive…
Lecture Notes No. 12
March 20, 2006
7. Nonlinear Models
7.1 Nonlinear Black-Box Models
The predictor of a linear system:
ŷ(t|θ) = H⁻¹(q,θ)G(q,θ)u(t) + [1 − H⁻¹(q,θ)]y(t)
ŷ(t) = θᵀφ(t) or ŷ(t) = θᵀφ(t…
Lecture Notes No. 11
March 15, 2006
6.5 Time-Series Data Compression
[Figure: Finite Impulse Response (FIR) model block diagram — input u(t), coefficients b1, b2, b3, output y(t)]
Consider a FIR model: y(t) = b1·u(t−1) + b2·u(t−2) + … + bm·u(t−m). Th…
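The FIR output above is a finite convolution of past inputs with the coefficients b1…bm. A sketch (coefficient values and input are assumed for illustration):

```python
import numpy as np

# Illustrative FIR coefficients b1..b3 (values assumed for the example)
b = np.array([0.5, 0.3, 0.2])   # y(t) = 0.5u(t-1) + 0.3u(t-2) + 0.2u(t-3)
m = len(b)

rng = np.random.default_rng(0)
u = rng.standard_normal(100)

# Direct evaluation of y(t) = b1 u(t-1) + ... + bm u(t-m)
y = np.array([sum(b[k] * u[t - k - 1] for k in range(m))
              for t in range(m, len(u))])

# The same output as a convolution of u with the coefficient sequence
y_conv = np.convolve(u, b, mode="full")[m - 1 : len(u) - 1]

assert np.allclose(y, y_conv)
```

Because the output depends on only the last m inputs, the model is always stable, which is what makes FIR structures convenient for data compression.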
Lecture Notes No. 10
March 13, 2006
6. Model Structure of Linear Time-Invariant Systems
6.1 Model Structure
In representing a dynamical system, the first step is to find an appropriate structure of the…
Lecture Notes No. 9
March 8, 2006
Part 2: Representation and Learning
We now move on to the second part of the course, Representation and Learning. You will learn various forms of system representation,…
Lecture Notes No. 8
March 6, 2006
4.9 Extended Kalman Filter
In many practical problems, the process dynamics are nonlinear.
[Figure: Extended Kalman Filter block diagram — process dynamics driven by input u and process noise w, measurement noise v, output y; the Kalman gain and covariance update act on a linearized model…]
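A scalar EKF sketch of the linearize-then-filter idea (the nonlinear model, Jacobians, and noise levels below are assumptions for illustration, not the notes' example): at each step the filter re-linearizes the dynamics f and the measurement h about the current estimate.

```python
import numpy as np

# Assumed scalar nonlinear model:
#   x[t+1] = 0.8 x[t] + 0.1 sin(x[t]) + w[t],   y[t] = arctan(x[t]) + v[t]
f  = lambda x: 0.8 * x + 0.1 * np.sin(x)
F  = lambda x: 0.8 + 0.1 * np.cos(x)     # df/dx: linearized dynamics
h  = lambda x: np.arctan(x)
Hj = lambda x: 1.0 / (1.0 + x ** 2)      # dh/dx: linearized measurement
Q, R = 0.01, 0.04                        # process / measurement noise variances

rng = np.random.default_rng(0)
T = 300
x_true = np.zeros(T)
x_true[0] = 2.0
for t in range(T - 1):
    x_true[t + 1] = f(x_true[t]) + rng.normal(0.0, np.sqrt(Q))
y = h(x_true) + rng.normal(0.0, np.sqrt(R), size=T)

# EKF: re-linearize f and h about the current estimate at every step
x_hat, P = 0.0, 1.0
est = np.zeros(T)
for t in range(T):
    # Measurement update (a posteriori)
    Ht = Hj(x_hat)
    K = P * Ht / (Ht * P * Ht + R)       # Kalman gain
    x_hat = x_hat + K * (y[t] - h(x_hat))
    P = (1.0 - K * Ht) * P
    est[t] = x_hat
    # Time update (a priori prediction)
    Ft = F(x_hat)
    x_hat = f(x_hat)
    P = Ft * P * Ft + Q

rmse = float(np.sqrt(np.mean((est[50:] - x_true[50:]) ** 2)))
print(rmse)  # small steady-state tracking error
```

Unlike the linear filter, the Jacobians F and H here depend on the estimate itself, so the covariance P (and hence the gain K) is only an approximation.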
Lecture Notes No. 7
March 1, 2006
4.7. Continuous Kalman Filter
Converting the Discrete Filter to a Continuous Filter
Continuous process: ẋ = Fx + Gw(t)    (49)
Measurement assumptions: y = Hx + v(t…    (50)
Lecture Notes No. 6
February 24, 2006
4.5.1 The Kalman Gain
Consider the error of the a posteriori estimate x̂(t|t):
e_t = x_t − x̂(t|t)
x̂(t|t) = x̂(t|t−1) + K_t (y_t − H_t x̂(t|t−1))
       = x̂(t|t−1) + K_t (H_t x_t + v_t − H_t x̂(t|t−1))
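A minimal discrete Kalman filter sketch built around that a posteriori update (the constant-velocity model, noise covariances, and horizon are illustrative assumptions):

```python
import numpy as np

# Assumed constant-velocity model:  x[t+1] = A x[t] + w[t],  y[t] = H x[t] + v[t]
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])    # state: [position, velocity]
H = np.array([[1.0, 0.0]])               # only the position is measured
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.25]])                   # measurement noise covariance

rng = np.random.default_rng(0)
T = 200
x = np.array([0.0, 1.0])
x_true = np.zeros((T, 2))
y = np.zeros(T)
for t in range(T):
    x_true[t] = x
    y[t] = (H @ x)[0] + rng.normal(0.0, 0.5)
    x = A @ x + rng.multivariate_normal([0.0, 0.0], Q)

# Kalman filter: x_hat(t|t) = x_hat(t|t-1) + K_t (y_t - H x_hat(t|t-1))
x_hat = np.zeros(2)
P = np.eye(2)
est = np.zeros((T, 2))
for t in range(T):
    # Kalman gain and a posteriori (measurement) update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # (2, 1) gain
    x_hat = x_hat + (K * (y[t] - H @ x_hat)).ravel()
    P = (np.eye(2) - K @ H) @ P
    est[t] = x_hat
    # A priori (time) update for the next step
    x_hat = A @ x_hat
    P = A @ P @ A.T + Q

rmse = float(np.sqrt(np.mean((est[:, 0] - x_true[:, 0]) ** 2)))
print(rmse)  # below the 0.5 measurement noise standard deviation
```

The gain K trades off the prior covariance P against the measurement noise R, exactly the quantity whose optimal value the derivation above is after.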
Lecture Notes No. 5
February 22, 2006
4. Kalman Filtering
4.1 State Estimation Using Observers
In discrete-time form, a linear, time-varying, deterministic dynamical system is represented by x_{t+1} = A_t…
2.160 Identification, Estimation, and Learning
Lecture Notes No. 4
February 17, 2006
3. Random Variables and Random Processes
Deterministic system: input → output.
In reality, the observed output is noisy and does not fit the model perfectly. In the determini…
Lecture Notes No. 3
February 15, 2006
2.3 Physical Meaning of Matrix P
The Recursive Least Squares (RLS) algorithm updates the parameter vector θ̂(t−1) based on new data φᵀ(t), y(t) in such a way that the…
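The standard RLS recursion makes the role of P concrete (the 2-parameter system and noise level below are assumptions for illustration): P shrinks as data accumulate, and the gain K weights each new observation accordingly.

```python
import numpy as np

# Recursive least squares: for new data (phi(t), y(t)),
#   K(t)     = P(t-1) phi / (1 + phi^T P(t-1) phi)
#   theta(t) = theta(t-1) + K(t) (y - phi^T theta(t-1))
#   P(t)     = P(t-1) - K(t) phi^T P(t-1)
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])     # illustrative true parameters

theta = np.zeros(2)
P = 1000.0 * np.eye(2)                 # large initial P: low confidence
for t in range(500):
    phi = rng.standard_normal(2)                  # regressor
    y = phi @ theta_true + 0.1 * rng.standard_normal()
    Pphi = P @ phi
    K = Pphi / (1.0 + phi @ Pphi)                 # gain vector
    theta = theta + K * (y - phi @ theta)         # parameter update
    P = P - np.outer(K, Pphi)                     # covariance update

print(theta)  # close to [2.0, -1.0]
```

Initializing P large expresses low confidence in θ̂(0), so early data move the estimate strongly; as P contracts, updates become conservative.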
Lecture Notes No. 2
February 13, 2006
2. Parameter Estimation for Deterministic Systems
2.1 Least Squares Estimation
[Figure: a deterministic system with parameter θ, inputs u1, u2, …, um, and output y, represented by a linearly parameterized model fit to input-output data]
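For a linearly parameterized model y = φᵀθ, the batch least-squares estimate is θ̂ = (ΦᵀΦ)⁻¹Φᵀy. A sketch (the parameter values, regressors, and noise level are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = np.array([1.5, -0.8, 0.3])   # illustrative parameters

# Linearly parameterized model: y = Phi theta + noise
Phi = rng.standard_normal((100, 3))       # each row is a regressor phi^T
y = Phi @ theta_true + 0.05 * rng.standard_normal(100)

# Least-squares estimate  theta_hat = (Phi^T Phi)^{-1} Phi^T y,
# computed by solving the normal equations rather than inverting
theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

print(theta_hat)  # close to theta_true
```

Solving the normal equations (or using a QR-based solver) is preferred over forming the explicit inverse for numerical reasons.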
Lecture Notes No. 1
February 8, 2006
Mathematical models of real-world systems are often too difficult to build based on first principles alone.
Figure by MIT OCW.
System Identification; "L…
2.160 Identification, Estimation, and Learning
3-0-9, H-Level Graduate Credit
Prerequisite: 2.151 or a similar subject
Reference Books
- Lennart Ljung, System Identification: Theory for the User, 2nd ed., Prentice-Hall, 1999
- Graham Goodwin and …
Department of Mechanical Engineering Massachusetts Institute of Technology
2.160 Identification, Estimation, and Learning End-of-Term Examination
May 17, 2006, 1:00-3:00 pm (12:30-2:30 pm). Closed book. Two sheets of notes are allowed. Show how you arrived a…