STAT 420 H18
In many cases, at each time t, several related quantities are observed and, therefore,
we want to study these quantities simultaneously by grouping them to form a vector.
By doing so we have a vector or multivariate process.
We consider first
Lecture 14: State-Space Models and the Kalman Filter
Michael Levine
Purdue University
April 21, 2014
Michael Levine (Purdue University), Time Series, April 21, 2014
State-Space Models
Observation = Signal + Noise.
In statistical language, this is equiva
Lecture 13: Linear Systems
Linear Systems in the Time Domain
Walid Sharabati
Purdue University
November 25, 2013
Lecture 12: Bivariate Processes
Cross-Covariance and Cross-Correlation
Michael Levine
Purdue University
April 21, 2014
Consider (Xt, Yt), where Xt and Yt are the maximum and minimum daily temperatures.
Lecture 11: Spectral Analysis
Methods For Estimating The Spectrum
Michael Levine
Purdue University
March 12, 2014
Fourier Analysis
The approximation of a function by a sum of sine and cosine terms
Lecture 9: Stationary Processes in the Frequency
Domain
Michael Levine
Purdue University
February 24, 2014
Introduction
The autocovariance and autocorrelation functions describe the
e
Lecture 8: Forecasting
Michael Levine
Purdue University
February 24, 2014
Types of forecasts I
Subjective: e.g., an expert opinion
Univariate: the forecast h steps ahead for the process
Lecture 5: Time Series Probability Models
Michael Levine [1]
Purdue University
February 3, 2014
[1] Thanks to Walid Sharabati and Bo Li
Stationarity of AR(p)
Xt = φ1 Xt−1 + … + φp Xt−p + Zt.
Lecture 6: Fitting Time Series Models In The
Time Domain
Michael Levine [1]
Purdue University
February 18, 2014
[1] These notes owe a lot to Prof. Walid Sharabati and Prof. Bo Li
Estimatin
STAT 420 H16
Models for Changing Variance.
The model is called homoscedastic if the variance is constant, and heteroscedastic when it is not.
Example 1
Annual numbers of lynx trapped in the Mackenzie River district of Canada.
Example 2
Dow Jones Industrial Average.
STAT 420 H18
HW 7 (due Mar. 24)
Problem 1
Give 1-step and 2-step volatility forecasts for the ARCH(1) model with α0 = 1.5, α1 = 0.9, in terms of the known values of Xt² and σt².
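As a sanity check on this kind of recursion, here is a minimal Python sketch, assuming the standard ARCH(1) form σt² = α0 + α1 Xt−1². The parameter values match Problem 1, but the observed value of Xt² is an illustration, not part of the handout.

```python
# Minimal sketch of k-step ARCH(1) volatility forecasts.
# 1-step:  sigma2_{t+1} = a0 + a1 * X_t^2   (X_t^2 is observed).
# k-step, k >= 2:  sigma2_{t+k} = a0 + a1 * sigma2_{t+k-1},
# since the conditional expectation of X^2 equals sigma^2.

def arch1_forecasts(a0, a1, x_t_sq, steps):
    """Return the 1-, 2-, ..., steps-ahead volatility forecasts."""
    out = []
    sigma2 = a0 + a1 * x_t_sq        # 1-step forecast from the observed X_t^2
    out.append(sigma2)
    for _ in range(steps - 1):       # replace X^2 by its own forecast
        sigma2 = a0 + a1 * sigma2
        out.append(sigma2)
    return out

# a0, a1 from Problem 1; x_t_sq = 2.0 is a made-up illustration
f1, f2 = arch1_forecasts(1.5, 0.9, x_t_sq=2.0, steps=2)
```

With these numbers, f1 = 1.5 + 0.9 * 2.0 = 3.3 and f2 = 1.5 + 0.9 * 3.3 = 4.47.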
Problem 2
Give a 3-step volatility forecast for an ARCH(2) model with α0 = 1, α1
STAT 420 H15
Estimation of the Spectral Density Function
The second-order properties of a TS are completely described by its ACVF, γh, or equivalently, under mild conditions (a sufficient one is Σ_{h=−∞}^{∞} |γh| < ∞), by its Fourier transform

f(ω) = (1/2π) Σ_{h=−∞}^{∞} γh e^{−iωh},

where γh = γ−h.
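A quick numerical illustration of this transform, using the sample ACVF in place of γh. This is only a sketch in pure Python, not the estimator the handout goes on to develop.

```python
import math

def sample_acvf(x, h):
    """Sample autocovariance at lag h >= 0 (divisor n, as is conventional)."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t + h] - m) for t in range(n - h)) / n

def spectral_estimate(x, w):
    """(1/2pi) * sum over |h| < n of gamma_hat(h) e^{-iwh}; real by symmetry."""
    n = len(x)
    s = sample_acvf(x, 0) + 2 * sum(sample_acvf(x, h) * math.cos(w * h)
                                    for h in range(1, n))
    return s / (2 * math.pi)

# Averaging the estimate over the n Fourier frequencies recovers gamma_hat(0)
x = [1.0, 2.0, 0.0, 3.0, 1.0, 4.0, 2.0, 5.0]
n = len(x)
avg = sum(spectral_estimate(x, 2 * math.pi * j / n) for j in range(n)) / n
```

The last check works because the complex exponentials sum to zero at every nonzero lag, so (2π/n) Σj f̂(ωj) = γ̂0.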
STAT 420 H12
HW5 (due Feb. 25): Exercises 5.1, 5.2 from the textbook and
Problem
Consider the general ARIMA(1, 2, 1) model.
(a) Convert the model to the equivalent ARMA(3, 1) form.
(b) Find the forecasts X̂t+h, h ≥ 1, for the model.
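Part (a) amounts to expanding the AR operator (1 − φB)(1 − B)², which is just polynomial multiplication. A hedged sketch, where the value of φ is illustrative and not from the homework:

```python
def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists in powers of B."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

phi = 0.5  # illustrative AR(1) coefficient
# (1 - phi*B) * (1 - B)^2 -> the degree-3 AR polynomial of the ARMA(3,1) form
ar_poly = poly_mul([1.0, -phi], poly_mul([1.0, -1.0], [1.0, -1.0]))
```

For φ = 0.5 this gives 1 − 2.5B + 2B² − 0.5B³, i.e. the ARMA(3, 1) representation has AR coefficients (2.5, −2, 0.5).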
A 90% CI implies th
STAT 420 H14
HW6 (due Mar. 10):
Exercises 6.1 and 6.2 from the textbook.
The spectral density function (or simply the spectrum) of a stationary TS,

f(ω) = (1/2π) Σ_{h=−∞}^{∞} γh e^{−iωh} = (1/2π) [ γ0 + 2 Σ_{h=1}^{∞} γh cos(ωh) ],   (1)

is the counterpart of the ACVF, γh, in the frequency domain.
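The two forms in (1) agree because γh = γ−h makes the imaginary parts cancel. A quick check for an MA(1) process, whose ACVF vanishes beyond lag 1 (the θ and σ² values are illustrative):

```python
import cmath
import math

theta, sigma2 = 0.2, 1.0  # illustrative MA(1) parameters
gamma = {0: (1 + theta ** 2) * sigma2, 1: theta * sigma2, -1: theta * sigma2}

def f_complex(w):
    """Left-hand form of (1): complex-exponential sum over all nonzero lags."""
    return sum(g * cmath.exp(-1j * w * h)
               for h, g in gamma.items()).real / (2 * math.pi)

def f_cosine(w):
    """Right-hand form of (1): gamma_0 plus twice the cosine terms."""
    return (gamma[0] + 2 * gamma[1] * math.cos(w)) / (2 * math.pi)

max_gap = max(abs(f_complex(w) - f_cosine(w))
              for w in (k * math.pi / 10 for k in range(11)))
```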
STAT 420 H11
{Xt} is called an autoregressive integrated moving average (ARIMA) process of order (p, d, q), denoted {Xt} ~ ARIMA(p, d, q), where d ≥ 1 is an integer, if its d-order difference Yt = (1 − B)^d Xt is a causal ARMA(p, q) process. It is easy t
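The operator (1 − B)^d is just d rounds of first differencing. A minimal sketch showing that d = 2 reduces a quadratic trend to a constant:

```python
def difference(x, d=1):
    """Apply (1 - B)^d to a list: d successive first differences."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

quad = [t * t for t in range(6)]  # 0, 1, 4, 9, 16, 25: quadratic trend
y = difference(quad, d=2)         # second differencing leaves a constant
```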
STAT 420 H10
The prediction problem: from the observed values of a time series at past points, X1, …, Xt, predict the value it will assume at some specific future time point, Xt+h. In forecasting Xt+h, t is called the forecast origin. We refer to X
STAT 420 H13
Spectral Analysis
1. A function that satisfies the equation g(x) = g(x + kp) for all x and k = 0, ±1, ±2, …, is called periodic with period p, where p is the smallest number such that the equation holds for all x. Virtually any periodic function g(x)
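For instance, sin(x) satisfies the defining equation with p = 2π; a quick numerical check over several values of x and k:

```python
import math

p = 2 * math.pi  # the (smallest) period of sin
periodic = all(
    abs(math.sin(x) - math.sin(x + k * p)) < 1e-9
    for x in (0.3 * t for t in range(20))
    for k in (-2, -1, 0, 1, 2)
)
```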
Lecture 1: Introduction to Time Series Analysis
Michael Levine [1]
Purdue University
January 15, 2014
[1] These notes owe a lot to Prof. Walid Sharabati and Prof. Bo Li
The Importance of Forecasting
Lecture 3: Some Time Series Models
Michael Levine [1]
Purdue University
January 20, 2014
[1] These notes owe a lot to Prof. Walid Sharabati and Prof. Bo Li
Stochastic (Random) Process
For each t, Xt is a random variable.
Lecture 4: Basic Time Series Models
Michael Levine
Purdue University
January 27, 2014
For the stationary stochastic process X(t) or Xt we have:
1. ρτ = γτ / γ0, so that ρ0 = 1.
2. Covariance is symmetric, γτ = γ−τ, where γτ = cov(Xt, Xt+τ).
Problem 9.1
Problem 9.2 (except the task described in the last sentence)
Problem 9.4 (again, except the task described in the last sentence)
Often, the periodicities in the sunspot series are investigated by fitting an autoregressive spectrum of sufficiently high order.
Problem 6.2
Problem 6.4
Consider the same data that we generated in class as a sum of three
harmonic components. Now, change the sample size to 128 instead of
100, then generate and plot the same series (that is, the three original
harmonic series and
Problem 5.2
A monthly simple net return of a security is the relative change in its price over one time period; in other words, Rt = (Pt − Pt−1)/Pt−1. Suppose that the simple return of a monthly bond index follows the MA(1) model

Rt = at + 0.2 at−1

with at ~ N(0
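For an MA(1) model like the one above, the autocorrelation function cuts off after lag 1. A sketch of the implied values (the innovation variance is illustrative and cancels from ρ1):

```python
theta = 0.2          # MA(1) coefficient from the model above
sigma2 = 1.0         # illustrative innovation variance (cancels in rho1)
gamma0 = (1 + theta ** 2) * sigma2   # variance of R_t
gamma1 = theta * sigma2              # lag-1 autocovariance
rho1 = gamma1 / gamma0               # lag-1 autocorrelation
# rho_h = 0 for every h >= 2: the MA(1) ACF cuts off after lag 1
```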
1. All three of these processes are stationary and invertible. For the first one, it is enough to note that the coefficient is less than one in absolute value. For the second one, you need to check that the roots of the characteristic equation 1 − 1.3z + 0.4z² = 0 lie outside the unit circle.
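A sketch of that check via the quadratic formula; the equation is the one quoted above, and the process is stationary when both roots exceed one in modulus:

```python
import math

# characteristic equation 1 - 1.3 z + 0.4 z^2 = 0
a, b, c = 0.4, -1.3, 1.0
disc = b * b - 4 * a * c                   # discriminant: 1.69 - 1.6 = 0.09
z1 = (-b + math.sqrt(disc)) / (2 * a)      # larger root
z2 = (-b - math.sqrt(disc)) / (2 * a)      # smaller root
stationary = abs(z1) > 1 and abs(z2) > 1   # both outside the unit circle
```

The roots come out as z1 = 2 and z2 = 1.25, both outside the unit circle.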
Problem 5.2
Note that the first part was pretty much done in class. Note that the forecast error is X̂N(h) − XN+h = φ^h XN − XN+h = φ^h XN − φ XN+h−1 − ZN+h = … = −Σ_{k=0}^{h−1} φ^k ZN+h−k; taking the variance of the above expression and using a property of a geometric series
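The geometric-series step referred to here gives Var of the forecast error as σ² Σ_{k=0}^{h−1} φ^{2k} = σ²(1 − φ^{2h})/(1 − φ²). A sketch checking the two expressions agree, with illustrative values of φ, σ², and h (not from the problem):

```python
phi, sigma2, h = 0.6, 1.0, 4     # illustrative values, not from the problem

# direct sum of the variances of the h innovation terms
direct = sigma2 * sum(phi ** (2 * k) for k in range(h))

# closed form from the finite geometric series
closed = sigma2 * (1 - phi ** (2 * h)) / (1 - phi ** 2)
```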
Problem 3.9
For the AR(2) model given by Xt = 0.9 Xt−2 + Zt, find the roots of the characteristic polynomial, and then plot its ACF.
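A hedged sketch of both tasks: the characteristic polynomial here is 1 − 0.9z², solved by hand, and the ACF follows the Yule-Walker recursion ρh = 0.9 ρh−2 with ρ0 = 1 and ρ1 = 0 (a numeric table rather than the requested plot):

```python
import math

# roots of 1 - 0.9 z^2 = 0: z = +-1/sqrt(0.9), both outside the unit circle
root = 1 / math.sqrt(0.9)

# Yule-Walker: rho_1 = 0.9 * rho_1 forces rho_1 = 0; then rho_h = 0.9 * rho_{h-2}
rho = [1.0, 0.0]
for h in range(2, 11):
    rho.append(0.9 * rho[h - 2])
```

So the ACF is zero at odd lags and decays geometrically (0.9, 0.81, …) at even lags.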
Suppose that the daily log return of a security follows the model

rt = 0.01 + 0.2 rt−2 + Zt

where Zt is Gaussian white noise
1. The cutoff lag for the MA(1) process is 1. The AR(1) ACF does not have a clearly defined cutoff lag; instead, it decays to zero fairly quickly.
3.1 The easiest way is to use the formula for the MA(2) autocorrelation function that we obtained earlier in class; if
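The MA(2) autocorrelation formula referred to there is ρ1 = (θ1 + θ1θ2)/(1 + θ1² + θ2²), ρ2 = θ2/(1 + θ1² + θ2²), and ρh = 0 for h ≥ 3. A sketch with illustrative coefficients, not the ones from the exercise:

```python
t1, t2 = 0.5, 0.4                 # illustrative MA(2) coefficients
den = 1 + t1 ** 2 + t2 ** 2       # gamma_0 up to the innovation variance
rho1 = (t1 + t1 * t2) / den       # lag-1 autocorrelation
rho2 = t2 / den                   # lag-2 autocorrelation; rho_h = 0 for h >= 3
```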
Appendix R
R Supplement
R.1 First Things First
The website for the text is http://www.stat.pitt.edu/stoffer/tsa3/. If you do not already have R, point your browser to the Comprehensive R Archive Network (CRAN), http://cran.r-project.org/, and download and install R.