STAT 626: Outline of Lecture 10
(Partial) Correlogram: ACF and PACF of ARMA Models (§3.4)
1. Review: One-Sided MA(∞) or Causal Process: is a time series involving only the past and present values of a white noise (shocks, inputs):

   x_t = \sum_{j=0}^{\infty} \psi_j w_{t-j},

   with absolutely summable coefficients. Its autocovariance function is given by

   \gamma(h) = \sigma_w^2 \sum_{j=0}^{\infty} \psi_{j+h} \psi_j.
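Since the coefficients are absolutely summable, the infinite sum for \gamma(h) can be approximated well by truncation. A minimal Python sketch; the AR(1)-type weights \psi_j = \phi^j used below are a hypothetical example, not from the notes:

```python
# Truncated autocovariance of a one-sided MA(infinity) process
# x_t = sum_{j>=0} psi_j w_{t-j}:  gamma(h) = sigma_w^2 * sum_j psi_{j+h} psi_j.

def linear_process_acov(psi, h, sigma2_w=1.0):
    """Approximate gamma(h) using only the first len(psi) weights."""
    return sigma2_w * sum(psi[j + h] * psi[j] for j in range(len(psi) - h))

# Hypothetical example: causal AR(1)-style weights psi_j = phi**j.
phi = 0.5
psi = [phi**j for j in range(200)]      # truncate the infinite series
gamma0 = linear_process_acov(psi, 0)    # close to 1/(1 - phi**2)
gamma1 = linear_process_acov(psi, 1)    # close to phi/(1 - phi**2)
```

With 200 retained weights the truncation error is negligible here, since the tail of the geometric series decays rapidly.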
2. Autoregressive and Moving Average (ARMA(p, q)) Models:

   x_t = \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p} + w_t + \theta_1 w_{t-1} + \cdots + \theta_q w_{t-q},

   OR \phi(B) x_t = \theta(B) w_t, where \phi(z), \theta(z) are the AR and MA polynomials, respectively.

   Focus on ARMA(1,1) Models: x_t = \phi x_{t-1} + w_t + \theta w_{t-1}.
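The ARMA(1,1) recursion can be simulated directly from its defining equation. A minimal Python sketch, with hypothetical parameter values \phi = .9, \theta = .5 (matching the numerical example used later) and Gaussian white noise:

```python
import random

def simulate_arma11(phi, theta, n, seed=1):
    """Simulate x_t = phi*x_{t-1} + w_t + theta*w_{t-1} with N(0,1) white noise."""
    rng = random.Random(seed)
    x_prev, w_prev = 0.0, 0.0   # start the recursion from zero
    xs = []
    for _ in range(n):
        w = rng.gauss(0.0, 1.0)
        x = phi * x_prev + w + theta * w_prev
        xs.append(x)
        x_prev, w_prev = x, w
    return xs

series = simulate_arma11(0.9, 0.5, 500)
```

Starting the recursion at zero induces a transient; in practice one would discard an initial burn-in segment before using the simulated series.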
3. Causal Solution: when x_t can be written as a one-sided MA or linear process, i.e. in terms of the past and present values of the WN: w_t, w_{t-1}, \ldots. This is important for computing the ACF of various ARMA models.
4. Invertible ARMA: w_t can be written in terms of the past and present values of the series, i.e. x_t, x_{t-1}, \ldots, OR

   w_t = x_t + \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots = \sum_{j=0}^{\infty} \pi_j x_{t-j}.

   This is important for parameter estimation and computing predictors.
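For the ARMA(1,1) model the \pi-weights follow from the polynomial division \pi(z) = \phi(z)/\theta(z) = (1 - \phi z)/(1 + \theta z), which gives the recursion \pi_1 = -(\phi + \theta) and \pi_j = -\theta \pi_{j-1} for j \ge 2. A minimal Python sketch of that recursion (the parameter values are the same hypothetical \phi = .9, \theta = .5 pair):

```python
def arma11_pi_weights(phi, theta, n):
    """First n pi-weights of an invertible ARMA(1,1):
    pi_0 = 1, pi_1 = -(phi + theta), pi_j = -theta * pi_{j-1} for j >= 2."""
    pi = [1.0, -(phi + theta)]
    while len(pi) < n:
        pi.append(-theta * pi[-1])
    return pi[:n]

pi = arma11_pi_weights(0.9, 0.5, 5)
# Equivalent closed form: pi_j = -(phi + theta) * (-theta)**(j - 1) for j >= 1.
```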
5. What is the partial correlation between X and Y adjusted for the effect of Z?
6. What is the partial autocorrelation function (PACF) of a stationary time series?

   The correlation coefficient between x_t and x_{t+h} after removing the linear effects of the intervening variables \{x_{t+1}, \ldots, x_{t+h-1}\} is called the lag-h partial autocorrelation of a stationary time series and is denoted by \phi_{hh}, h = 1, 2, \ldots.

   The sequence or the function \phi_{hh}, h = 1, 2, \ldots, is called the partial autocorrelation function (PACF) of the time series.

   The plot of \phi_{hh} vs. h = 1, 2, \ldots is called the partial correlogram of the time series.
What is the shape of the partial correlogram of an AR(1)? An AR(p)?
What is the shape of the partial correlogram of an MA(q)?
What is the shape of the partial correlogram of an ARMA(p, q)?
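The \phi_{hh} can be computed from a given ACF by the Durbin–Levinson recursion. A minimal Python sketch, illustrated on the AR(1) ACF \rho(h) = \phi^h, where the PACF should equal \phi at lag 1 and cut off to zero thereafter (the value \phi = 0.6 is a hypothetical choice):

```python
def pacf_from_acf(rho, max_lag):
    """Durbin-Levinson recursion. rho is a list with rho[0] = 1;
    returns [phi_11, phi_22, ..., phi_{max_lag,max_lag}]."""
    pacf, phi_prev = [], []
    for h in range(1, max_lag + 1):
        if h == 1:
            phi_hh = rho[1]
            phi_cur = [phi_hh]
        else:
            # phi_hh = (rho(h) - sum phi_{h-1,k} rho(h-k)) / (1 - sum phi_{h-1,k} rho(k))
            num = rho[h] - sum(phi_prev[j] * rho[h - 1 - j] for j in range(h - 1))
            den = 1 - sum(phi_prev[j] * rho[j + 1] for j in range(h - 1))
            phi_hh = num / den
            # update the remaining coefficients: phi_{h,k} = phi_{h-1,k} - phi_hh*phi_{h-1,h-k}
            phi_cur = [phi_prev[j] - phi_hh * phi_prev[h - 2 - j]
                       for j in range(h - 1)] + [phi_hh]
        pacf.append(phi_hh)
        phi_prev = phi_cur
    return pacf

phi = 0.6
rho = [phi**h for h in range(6)]    # theoretical AR(1) ACF
pacf = pacf_from_acf(rho, 5)        # phi_11 = 0.6, rest essentially 0
```

This cut-off after lag p is exactly what makes the partial correlogram the natural diagnostic for identifying the order of an AR model.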
3 ARIMA Models
with initial conditions

\psi_j - \sum_{k=1}^{j} \phi_k \psi_{j-k} = \theta_j, \quad 0 \le j < \max(p, q + 1). \qquad (3.41)
The general solution depends on the roots of the AR polynomial \phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, as seen from (3.40). The specific solution will, of course, depend on the initial conditions.
Consider the ARMA process given in (3.27), x_t = .9 x_{t-1} + .5 w_{t-1} + w_t.
Because \max(p, q + 1) = 2, using (3.41), we have \psi_0 = 1 and \psi_1 = .9 + .5 = 1.4. By (3.40), for j = 2, 3, \ldots, the \psi-weights satisfy \psi_j - .9 \psi_{j-1} = 0. The general solution is \psi_j = c\,.9^j. To find the specific solution, use the initial condition \psi_1 = 1.4, so 1.4 = .9c or c = 1.4/.9. Finally, \psi_j = 1.4(.9)^{j-1}, for j \ge 1, as we saw in Example 3.7.
To view, for example, the first 50 \psi-weights in R, use:

ARMAtoMA(ar=.9, ma=.5, 50)        # for a list
plot(ARMAtoMA(ar=.9, ma=.5, 50))  # for a graph
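The same weights can be reproduced outside R from the recursion derived above. A minimal Python sketch of that computation (R's ARMAtoMA is the authoritative version; the closed form \psi_j = 1.4(.9)^{j-1} comes from the text):

```python
def arma11_psi_weights(phi, theta, n):
    """First n psi-weights of a causal ARMA(1,1): psi_0 = 1,
    psi_1 = phi + theta, and psi_j = phi * psi_{j-1} for j >= 2."""
    psi = [1.0, phi + theta]
    while len(psi) < n:
        psi.append(phi * psi[-1])
    return psi[:n]

psi = arma11_psi_weights(0.9, 0.5, 50)
# Should match the closed form psi_j = 1.4 * 0.9**(j - 1) for j >= 1.
```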
3.4 Autocorrelation and Partial Autocorrelation
We begin by exhibiting the ACF of an MA(q) process, x_t = \theta(B) w_t, where \theta(B) = 1 + \theta_1 B + \cdots + \theta_q B^q. Because x_t is a finite linear combination of white noise terms, the process is stationary with mean

E(x_t) = \sum_{j=0}^{q} \theta_j E(w_{t-j}) = 0,

where we have written \theta_0 = 1, and with autocovariance function
\gamma(h) = \operatorname{cov}(x_{t+h}, x_t) = \operatorname{cov}\!\Big( \sum_{j=0}^{q} \theta_j w_{t+h-j}, \; \sum_{k=0}^{q} \theta_k w_{t-k} \Big) =
\begin{cases}
\sigma_w^2 \sum_{j=0}^{q-h} \theta_j \theta_{j+h}, & 0 \le h \le q, \\
0, & h > q.
\end{cases}
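This autocovariance formula translates directly into code. A minimal Python sketch; the MA(2) coefficients \theta_1 = .5, \theta_2 = -.3 are a hypothetical example, chosen only to show the cut-off of \gamma(h) beyond lag q:

```python
def ma_acov(theta, h, sigma2_w=1.0):
    """gamma(h) of an MA(q): sigma_w^2 * sum_{j=0}^{q-h} theta_j theta_{j+h}
    for 0 <= h <= q, and 0 for h > q, with theta_0 = 1."""
    coeffs = [1.0] + list(theta)    # prepend theta_0 = 1
    q = len(theta)
    if h > q:
        return 0.0
    return sigma2_w * sum(coeffs[j] * coeffs[j + h] for j in range(q - h + 1))

theta = [0.5, -0.3]                             # hypothetical MA(2)
gammas = [ma_acov(theta, h) for h in range(4)]  # gamma(3) is exactly 0
```

The exact zero at every lag beyond q is what gives the MA(q) correlogram its characteristic cut-off shape.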