EC3062 ECONOMETRICS
IDENTIFICATION OF ARMA MODELS
A stationary stochastic process can be characterised, equivalently, by its
autocovariance function or its partial autocovariance function.
It can also be characterised by its spectral density function, which is
the Fourier transform of the autocovariances {γ_τ ; τ = 0, ±1, ±2, . . .}:

f(ω) = Σ_{τ=−∞}^{∞} γ_τ cos(ωτ) = γ_0 + 2 Σ_{τ=1}^{∞} γ_τ cos(ωτ).

Here, ω ∈ [0, π] is an angular velocity, or frequency value, in radians per
period.
The empirical counterpart of the spectral density function is the
periodogram I(ω_j), which may be defined as

I(ω_j) = 2 Σ_{τ=1−T}^{T−1} c_τ cos(ω_j τ) = 2{c_0 + 2 Σ_{τ=1}^{T−1} c_τ cos(ω_j τ)},

where ω_j = 2πj/T; j = 0, 1, . . . , [T/2] are the Fourier frequencies and
{c_τ ; τ = 0, ±1, . . . , ±(T − 1)}, with c_τ = T^{−1} Σ_{t=τ}^{T−1} (y_t − ȳ)(y_{t−τ} − ȳ),
are the empirical autocovariances.
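This definition can be sketched in a few lines of NumPy. The helper names empirical_autocovariances and periodogram are illustrative, and the agreement with the squared modulus of the discrete Fourier transform of the demeaned data serves as a numerical check:

```python
import numpy as np

def empirical_autocovariances(y):
    """c_tau = T^{-1} sum_{t=tau}^{T-1} (y_t - ybar)(y_{t-tau} - ybar)."""
    T = len(y)
    d = y - y.mean()
    return np.array([np.sum(d[tau:] * d[:T - tau]) / T for tau in range(T)])

def periodogram(y):
    """I(w_j) = 2 sum_{tau=1-T}^{T-1} c_tau cos(w_j tau), evaluated at the
    Fourier frequencies w_j = 2*pi*j/T for j = 0, 1, ..., [T/2]."""
    T = len(y)
    c = empirical_autocovariances(y)
    taus = np.arange(1, T)
    freqs = 2 * np.pi * np.arange(T // 2 + 1) / T
    # Since c_{-tau} = c_tau, the two-sided sum collapses to c_0 plus
    # twice the one-sided sum.
    I = np.array([2 * (c[0] + 2 * np.sum(c[1:] * np.cos(w * taus))) for w in freqs])
    return freqs, I
```

Expanding the double sum shows that I(ω_j) also equals (2/T)|Σ_t (y_t − ȳ)e^{−iω_j t}|², which is what the check below exploits.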
The Periodogram and the Autocovariances
We need to show that this definition of the periodogram is equivalent to
the previous definition, which was based on the following frequency decomposition of the sample variance:
(1/T) Σ_{t=0}^{T−1} (y_t − ȳ)² = (1/2) Σ_{j=0}^{[T/2]} (α_j² + β_j²),

where

α_j = (2/T) Σ_t y_t cos(ω_j t) = (2/T) Σ_t (y_t − ȳ) cos(ω_j t),

β_j = (2/T) Σ_t y_t sin(ω_j t) = (2/T) Σ_t (y_t − ȳ) sin(ω_j t).

Substituting these into the term T(α_j² + β_j²)/2 gives the periodogram

I(ω_j) = (2/T){ [Σ_{t=0}^{T−1} cos(ω_j t)(y_t − ȳ)]² + [Σ_{t=0}^{T−1} sin(ω_j t)(y_t − ȳ)]² }.
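The decomposition of the sample variance can be verified numerically. The sketch below uses an odd sample size so that no Fourier frequency falls on ω = π, where the coefficients would need a special weighting that the formula above does not cover:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 63                                 # odd, so no frequency falls on omega = pi
y = rng.standard_normal(T)
d = y - y.mean()
t = np.arange(T)

total = 0.0
for j in range(T // 2 + 1):            # Fourier frequencies w_j = 2*pi*j/T
    w = 2 * np.pi * j / T
    a = (2 / T) * np.sum(d * np.cos(w * t))
    b = (2 / T) * np.sum(d * np.sin(w * t))
    total += (a ** 2 + b ** 2) / 2     # contribution of frequency w_j

# The frequency contributions sum to the sample variance.
assert np.isclose(total, np.sum(d ** 2) / T)
```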
The quadratic terms may be expanded to give

I(ω_j) = (2/T){ Σ_t Σ_s cos(ω_j t) cos(ω_j s)(y_t − ȳ)(y_s − ȳ)
         + Σ_t Σ_s sin(ω_j t) sin(ω_j s)(y_t − ȳ)(y_s − ȳ) }.

Since cos(A) cos(B) + sin(A) sin(B) = cos(A − B), this can be written as

I(ω_j) = (2/T) Σ_t Σ_s cos(ω_j [t − s])(y_t − ȳ)(y_s − ȳ).

On defining τ = t − s and writing c_τ = T^{−1} Σ_t (y_t − ȳ)(y_{t−τ} − ȳ), we can
reduce the latter expression to

I(ω_j) = 2 Σ_{τ=1−T}^{T−1} cos(ω_j τ) c_τ,

which is a Fourier transform of the empirical autocovariances.
Figure 1. The spectral density function of an MA(2) process
y(t) = (1 + 1.250L + 0.800L²)ε(t).

Figure 2. The graph of a periodogram calculated from 160 observations
on a simulated series generated by an MA(2) process
y(t) = (1 + 1.250L + 0.800L²)ε(t).

Figure 3. The spectral density function of an AR(2) process
(1 − 0.273L + 0.810L²)y(t) = ε(t).

Figure 4. The graph of a periodogram calculated from 160 observations
on a simulated series generated by an AR(2) process
(1 − 0.273L + 0.810L²)y(t) = ε(t).

Figure 5. The spectral density function of an ARMA(2, 1) process
(1 − 0.273L + 0.810L²)y(t) = (1 + 0.900L)ε(t).

Figure 6. The graph of a periodogram calculated from 160 observations
on a simulated series generated by an ARMA(2, 1) process
(1 − 0.273L + 0.810L²)y(t) = (1 + 0.900L)ε(t).
The Methodology of Box and Jenkins
Box and Jenkins proposed to use the autocorrelation and partial autocorrelation functions for identifying the orders of ARMA models. They paid
little attention to the periodogram.
Autocorrelation function (ACF). Given a sample y_0, y_1, . . . , y_{T−1} of
T observations, the sample autocorrelation function {r_τ} is the sequence

r_τ = c_τ /c_0 ,  τ = 0, 1, . . . ,

where c_τ = T^{−1} Σ_t (y_t − ȳ)(y_{t−τ} − ȳ) is the empirical autocovariance at lag
τ and c_0 is the sample variance.
As the lag increases, the number of observations comprised in the
empirical autocovariances diminishes.
Partial autocorrelation function (PACF). The sample partial autocorrelation function {p_τ} gives the correlation between the two sets of
residuals obtained from regressing the elements y_t and y_{t−τ} on the set
of intervening values y_{t−1}, y_{t−2}, . . . , y_{t−τ+1}. The partial autocorrelation
measures the dependence between y_t and y_{t−τ} after the effect of the intervening values has been removed.
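This regression description translates directly into code. The sketch below is didactic: the helper name sample_pacf is illustrative, and practical implementations would use the Durbin-Levinson recursion instead of repeated least-squares fits:

```python
import numpy as np

def sample_pacf(y, maxlag):
    """p_tau: correlation between the residuals of y_t and y_{t-tau} after
    regressing each on the intervening values y_{t-1}, ..., y_{t-tau+1}."""
    y = np.asarray(y, dtype=float)
    pacf = [1.0]
    for tau in range(1, maxlag + 1):
        # Rows are t = tau, ..., T-1; columns are y_t, y_{t-1}, ..., y_{t-tau}.
        Y = np.column_stack([y[tau - k: len(y) - k] for k in range(tau + 1)])
        yt, ylag = Y[:, 0], Y[:, -1]
        if tau == 1:
            r1, r2 = yt - yt.mean(), ylag - ylag.mean()   # no intervening values
        else:
            X = np.column_stack([np.ones(len(Y)), Y[:, 1:-1]])
            r1 = yt - X @ np.linalg.lstsq(X, yt, rcond=None)[0]
            r2 = ylag - X @ np.linalg.lstsq(X, ylag, rcond=None)[0]
        pacf.append(np.corrcoef(r1, r2)[0, 1])
    return np.array(pacf)
```

For an AR(1) process the partial autocorrelation at lag 1 should be close to the autoregressive parameter, and the higher-order values close to zero.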
Reduction to Stationarity.
The ﬁrst step is to examine the plot of the data to judge whether or not
the process is stationary. A trend can be removed by ﬁtting a parametric
curve or a spline function to create a stationary sequence of residuals to
which an ARMA model can be applied.
Box and Jenkins believed that many empirical series can be modelled
by taking a sufficient number of differences to make them stationary. Thus,
the process might be modelled by the ARIMA(p, d, q) equation

α(L)∇^d y(t) = µ(L)ε(t),

where ∇^d = (I − L)^d is the dth power of the difference operator.

Then, z(t) = ∇^d y(t) will be described by a stationary ARMA(p, q)
model. The inverse operator ∇^{−1} is the summing or integrating operator,
which is why the model is described as an autoregressive integrated moving average.

Figure 7. The plot of 197 concentration readings from a chemical
process taken at 2-hour intervals.

Figure 8. The autocorrelation function of the concentration readings
from a chemical process.

Figure 9. The autocorrelation function of the differences of the concentration readings from the chemical process.
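As a small illustration of the difference operator (taking NumPy's diff in place of ∇, with cumulative summation as the inverse, integrating operator):

```python
import numpy as np

y = np.array([1.0, 3.0, 6.0, 10.0, 15.0])   # a series with a quadratic-like trend
z = np.diff(y, n=2)                          # z(t) = nabla^2 y(t) = (I - L)^2 y(t)
# Differencing twice removes the quadratic-like trend, leaving a constant
# sequence to which an ARMA model could be applied.
```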
When stationarity has been achieved, the autocorrelation sequence of the
resulting series should converge rapidly to zero as the value of the lag
increases. (See Figure 9.)

The characteristics of pure autoregressive and pure moving-average
processes are easily spotted. Those of a mixed autoregressive moving-average model are not so easily unravelled.
Moving-average processes. The theoretical autocorrelation function
{ρ_τ} of an MA(q) process has ρ_τ = 0 for all τ > q. The partial autocorrelation function {π_τ} is liable to decay towards zero gradually.

To determine whether the parent autocorrelations are zero after lag
q, we may use a result of Bartlett [1946], which shows that, for a sample
of size T, the standard deviation of r_τ is approximately

(4)    (1/√T){1 + 2(r_1² + r_2² + · · · + r_q²)}^{1/2}  for τ > q.

A measure of the scale of the autocorrelations is provided by the limits
of ±1.96/√T, which are the approximate 95% confidence bounds for the
autocorrelations of a white-noise sequence. These bounds are represented
by the dashed horizontal lines on the accompanying graphs.
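Bartlett's formula is easily scripted. Here bartlett_se is a hypothetical helper name, and r is the sequence of sample autocorrelations with r[0] = 1:

```python
import numpy as np

def bartlett_se(r, q, T):
    """Approximate standard deviation of r_tau for tau > q, after Bartlett [1946]:
    sqrt{1 + 2(r_1^2 + ... + r_q^2)} / sqrt(T)."""
    return np.sqrt(1.0 + 2.0 * np.sum(np.asarray(r[1:q + 1]) ** 2)) / np.sqrt(T)

# With q = 0 the formula reduces to 1/sqrt(T), the white-noise case that
# underlies the +/-1.96/sqrt(T) confidence bounds.
```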
Figure 10. The graph of 120 observations on a simulated series
generated by the MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t).

Figure 11. The theoretical autocorrelation function (ACF) of the
MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.

Figure 12. The theoretical partial autocorrelation function (PACF)
of the MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Autoregressive processes. The theoretical autocorrelation function
{ρ_τ} of an AR(p) process obeys a homogeneous difference equation based
upon the autoregressive operator α(L) = 1 + α_1 L + · · · + α_p L^p:

(5)    ρ_τ = −(α_1 ρ_{τ−1} + · · · + α_p ρ_{τ−p})  for all τ ≥ p.

The autocorrelation sequence will be a mixture of damped exponential
and sinusoidal functions. If the sequence is of a sinusoidal nature, then
the presence of complex roots in the operator α(L) is indicated.

The partial autocorrelation function {π_τ} serves most clearly to identify a pure AR process. An AR(p) process has π_τ = 0 for all τ > p.

The significance of the values of the empirical partial autocorrelations
is judged by the fact that, for a pth-order process, their standard deviations
for all lags greater than p are approximated by 1/√T. The bounds of
±1.96/√T are plotted on the graph of the partial autocorrelation function.
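The difference equation (5) makes the theoretical ACF of an AR model easy to generate. The sketch below treats the AR(2) case; the starting value ρ_1 = −α_1/(1 + α_2) comes from the Yule-Walker equations, a step not shown in the notes:

```python
import numpy as np

def ar2_theoretical_acf(a1, a2, maxlag):
    """Theoretical ACF of (1 + a1*L + a2*L^2) y(t) = eps(t), generated by the
    recursion rho_tau = -(a1*rho_{tau-1} + a2*rho_{tau-2}) for tau >= 2."""
    rho = np.empty(maxlag + 1)
    rho[0] = 1.0
    rho[1] = -a1 / (1.0 + a2)            # from the Yule-Walker equations
    for tau in range(2, maxlag + 1):
        rho[tau] = -(a1 * rho[tau - 1] + a2 * rho[tau - 2])
    return rho

# The AR(2) process of Figures 13-15: (1 - 1.69L + 0.81L^2) y(t) = eps(t).
rho = ar2_theoretical_acf(-1.69, 0.81, 25)
```

For this process the roots of α(z) = 0 are complex, so the generated sequence decays sinusoidally, as the text leads one to expect.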
Figure 13. The graph of 120 observations on a simulated series
generated by the AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t).

Figure 14. The theoretical autocorrelation function (ACF) of the
AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.

Figure 15. The theoretical partial autocorrelation function (PACF)
of the AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Mixed processes. Neither the theoretical autocorrelation function nor
the partial autocorrelation function of an ARMA(p, q) process has an abrupt cutoff.
The autocovariances of an ARMA(p, q) process satisfy the same difference
equation as that of a pure AR model for all values of τ > max(p, q).

A rational transfer function is more effective in approximating an
arbitrary impulse response than is an AR or an MA transfer function.

The sum of any two mutually independent AR processes gives rise to
an ARMA process. Let y(t) and z(t) be AR processes of orders p and r
respectively, described by α(L)y(t) = ε(t) and ρ(L)z(t) = η(t), wherein
ε(t) and η(t) are mutually independent white-noise processes. Then their
sum will be

(6)    y(t) + z(t) = ε(t)/α(L) + η(t)/ρ(L)
                   = {ρ(L)ε(t) + α(L)η(t)}/{α(L)ρ(L)} = µ(L)ζ(t)/{α(L)ρ(L)},

where µ(L)ζ(t) = ρ(L)ε(t) + α(L)η(t) constitutes a moving-average process
of order max(p, r).
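This claim can be checked on the autocovariances directly. For two independent AR(1) processes with parameters f1 and f2, the autocovariance of the sum is A·f1^τ + B·f2^τ, and it obeys the second-order recursion implied by the product of the two autoregressive operators from lag 2 onwards, which is the signature of an ARMA(2, 1) process. The parameter values below are arbitrary choices for illustration:

```python
# Sum of two independent AR(1) processes (1 - f1*L)y = eps, (1 - f2*L)z = eta.
f1, f2 = 0.9, -0.5
s_eps, s_eta = 1.0, 2.0            # disturbance variances (arbitrary)
A = s_eps / (1 - f1 ** 2)          # gamma_y(tau) = A * f1**tau for tau >= 0
B = s_eta / (1 - f2 ** 2)          # gamma_z(tau) = B * f2**tau for tau >= 0

# Autocovariances of x = y + z, by independence.
gamma = [A * f1 ** tau + B * f2 ** tau for tau in range(6)]

# gamma obeys the AR(2) recursion of (1 - f1*L)(1 - f2*L) for tau >= 2.
for tau in range(2, 6):
    assert abs(gamma[tau] - ((f1 + f2) * gamma[tau - 1] - f1 * f2 * gamma[tau - 2])) < 1e-12
```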
Figure 16. The graph of 120 observations on a simulated series generated by
the ARMA(2, 2) process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t).

Figure 17. The theoretical autocorrelation function (ACF) of the ARMA(2, 2)
process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid
bars) together with its empirical counterpart, calculated from a simulated series
of 120 observations.

Figure 18. The theoretical partial autocorrelation function (PACF) of the
ARMA(2, 2) process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t)
(the solid bars) together with its empirical counterpart, calculated from a simulated series of 120 observations.
FORECASTING WITH ARMA MODELS
The Coeﬃcients of the MovingAverage Expansion
The ARMA model α(L)y(t) = µ(L)ε(t) can be cast in the form
y(t) = {µ(L)/α(L)}ε(t) = ψ(L)ε(t), where

ψ(L) = ψ_0 + ψ_1 L + ψ_2 L² + · · ·

is obtained from the expansion of the rational function.

The method of finding the coefficients of the series expansion can be
illustrated by the second-order case:

(µ_0 + µ_1 z)/(α_0 + α_1 z + α_2 z²) = ψ_0 + ψ_1 z + ψ_2 z² + · · · .

We rewrite this equation as

µ_0 + µ_1 z = (α_0 + α_1 z + α_2 z²)(ψ_0 + ψ_1 z + ψ_2 z² + · · ·).
The following table assists us in multiplying together the two polynomials:

            ψ_0          ψ_1 z         ψ_2 z²        · · ·
α_0         α_0 ψ_0      α_0 ψ_1 z     α_0 ψ_2 z²    · · ·
α_1 z       α_1 ψ_0 z    α_1 ψ_1 z²    α_1 ψ_2 z³    · · ·
α_2 z²      α_2 ψ_0 z²   α_2 ψ_1 z³    α_2 ψ_2 z⁴    · · ·

Performing the multiplication on the RHS of the equation, and equating
the coefficients of the same powers of z on the two sides, we find that

µ_0 = α_0 ψ_0 ,                               ψ_0 = µ_0 /α_0 ,
µ_1 = α_0 ψ_1 + α_1 ψ_0 ,                     ψ_1 = (µ_1 − α_1 ψ_0 )/α_0 ,
0 = α_0 ψ_2 + α_1 ψ_1 + α_2 ψ_0 ,             ψ_2 = −(α_1 ψ_1 + α_2 ψ_0 )/α_0 ,
⋮                                             ⋮
0 = α_0 ψ_n + α_1 ψ_{n−1} + α_2 ψ_{n−2} ,     ψ_n = −(α_1 ψ_{n−1} + α_2 ψ_{n−2} )/α_0 .
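The recursion in the right-hand column translates into a short routine. The name psi_coefficients is illustrative, with alpha = [α_0, α_1, . . .] and mu = [µ_0, µ_1, . . .]:

```python
def psi_coefficients(mu, alpha, n):
    """Coefficients psi_0, ..., psi_n of psi(z) = mu(z)/alpha(z), found by
    equating coefficients of z^k in mu(z) = alpha(z) * psi(z):
        psi_k = (mu_k - alpha_1*psi_{k-1} - alpha_2*psi_{k-2} - ...) / alpha_0,
    where mu_k = 0 for k beyond the order of mu(z)."""
    psi = []
    for k in range(n + 1):
        m = mu[k] if k < len(mu) else 0.0
        s = sum(alpha[i] * psi[k - i] for i in range(1, min(k, len(alpha) - 1) + 1))
        psi.append((m - s) / alpha[0])
    return psi

# Example: mu(z) = 1 and alpha(z) = 1 - 0.5z give psi_k = 0.5**k.
```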
The optimal (minimum mean-square error) forecast of y_{t+h} is the conditional expectation of y_{t+h} given the information set I_t comprising the
values of {ε_t, ε_{t−1}, ε_{t−2}, . . .} or, equally, the values of {y_t, y_{t−1}, y_{t−2}, . . .}.
On taking expectations of y(t) and ε(t) conditional on I_t, we find that

(21)    E(y_{t+k} |I_t) = ŷ_{t+k}  if k > 0,
        E(y_{t−j} |I_t) = y_{t−j}  if j ≥ 0,
        E(ε_{t+k} |I_t) = 0        if k > 0,
        E(ε_{t−j} |I_t) = ε_{t−j} = y_{t−j} − ŷ_{t−j}  if j ≥ 0.

In this notation, the forecast h periods ahead is

(22)    E(y_{t+h} |I_t) = Σ_{k=1}^{h} ψ_{h−k} E(ε_{t+k} |I_t) + Σ_{j=0}^{∞} ψ_{h+j} E(ε_{t−j} |I_t)
                        = Σ_{j=0}^{∞} ψ_{h+j} ε_{t−j} .
In practice, the forecasts are generated recursively via the equation

(23)    y(t) = −{α_1 y(t − 1) + α_2 y(t − 2) + · · · + α_p y(t − p)}
              + µ_0 ε(t) + µ_1 ε(t − 1) + · · · + µ_q ε(t − q).

By taking the conditional expectation of this equation, we get

(24)    ŷ_{t+h} = −{α_1 ŷ_{t+h−1} + · · · + α_p ŷ_{t+h−p}}
                 + µ_h ε_t + · · · + µ_q ε_{t+h−q}     when 0 < h ≤ p, q,

(25)    ŷ_{t+h} = −{α_1 ŷ_{t+h−1} + · · · + α_p ŷ_{t+h−p}}   if q < h ≤ p,

(26)    ŷ_{t+h} = −{α_1 ŷ_{t+h−1} + · · · + α_p ŷ_{t+h−p}}
                 + µ_h ε_t + · · · + µ_q ε_{t+h−q}     if p < h ≤ q,

and

(27)    ŷ_{t+h} = −{α_1 ŷ_{t+h−1} + · · · + α_p ŷ_{t+h−p}}   when p, q < h.

Here, it is understood that ŷ_{t+k} = y_{t+k} when k ≤ 0.

Equation (27) shows that, when h > p, q, the forecasting function becomes
a pth-order homogeneous difference equation in y. The p values of ŷ(t)
from t = r = max(p, q) to t = r − p + 1 serve as the starting values for the
equation.
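Equations (23)-(27) can be condensed into a single recursion by adopting the convention ŷ_{t+k} = y_{t+k} for k ≤ 0 and replacing future disturbances by their zero expectations. The sketch below assumes that the disturbances up to time t are already available; in practice they would themselves be computed recursively from the data:

```python
def arma_forecasts(y, eps, alpha, mu, H):
    """Forecasts yhat_{t+1}, ..., yhat_{t+H} for the ARMA model
    alpha(L) y(t) = mu(L) eps(t), with alpha = [1, a1, ..., ap] and
    mu = [m0, m1, ..., mq]; y and eps hold the values up to time t."""
    p, q = len(alpha) - 1, len(mu) - 1
    yext = list(y)                    # yhat_{t+k} = y_{t+k} for k <= 0
    t = len(y) - 1
    for h in range(1, H + 1):
        ar = -sum(alpha[i] * yext[t + h - i] for i in range(1, p + 1))
        ma = sum(mu[j] * eps[t + h - j] for j in range(h, q + 1))  # future eps -> 0
        yext.append(ar + ma)
    return yext[len(y):]
```

For h > q the moving-average sum is empty, and the forecasts follow the pth-order homogeneous difference equation of (27).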
The behaviour of the forecast function beyond the reach of the starting values is determined by the roots of the autoregressive operator α(L).

If all of the roots of α(z) = 0 are less than unity, then ŷ_{t+h} will
converge to zero as h increases.

If one of the roots is unity, then the forecast function will converge
to a non-zero constant.

If there are two unit roots, then the forecast function will converge to a
linear trend.

In general, if d of the roots are unity, then the general solution will
comprise a polynomial in t of order d − 1.
31 EC3062 ECONOMETRICS
The forecasts can be updated easily once the coeﬃcients in the expansion
of ψ (L) = µ(L)/α(L) have been obtained. Consider
(28) yt+ht+1 = {ψh−1 εt+1 + ψh εt + ψh+1 εt−1 + · · ·} and
ˆ
yt+ht = {ψh εt + ψh+1 εt−1 + ψh+2 εt−2 + · · ·}.
ˆ The ﬁrst of these is the forecast for h − 1 periods ahead made at time t + 1
whilst the second is the forecast for h periods ahead made at time t. It
can be seen that
(29) ˆ
yt+ht+1 = yt+ht + ψh−1 εt+1 ,
ˆ ˆ
where εt+1 = yt+1 − yt+1 is the current disturbance at time t + 1. The
later is also the prediction error of the onestepahead forecast made at
time t. 32 ...
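The updating formula can be checked numerically with the ψ-weights of a simple model. Here an AR(1) with parameter 0.5 is assumed, so that ψ_k = 0.5^k:

```python
import numpy as np

rng = np.random.default_rng(2)
eps = rng.standard_normal(500)          # a white-noise history
psi = 0.5 ** np.arange(600)             # psi-weights of (1 - 0.5L) y(t) = eps(t)
t, h = 400, 3

def forecast(at, steps):
    # yhat_{at+steps|at} = sum_{j>=0} psi_{steps+j} * eps_{at-j}   (eq. 28)
    return sum(psi[steps + j] * eps[at - j] for j in range(at + 1))

lhs = forecast(t + 1, h - 1)                     # forecast made at time t+1
rhs = forecast(t, h) + psi[h - 1] * eps[t + 1]   # updated forecast (eq. 29)
assert abs(lhs - rhs) < 1e-9
```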
This note was uploaded on 03/02/2012 for the course EC 3062, taught by Professor D.S.G. Pollock during the Spring 2012 term at Queen Mary, University of London.