Eco 403: Time series analysis and prediction
Lecture 18 VAR model, Granger Causality
[email protected]
1. Summary of Lecture 17
1) Trends and how to remove them
a. Deterministic trend, e.g. a deterministic linear trend: y_t = y_0 + a_1·t + A(L)·ε_t, where ε_t is white noise and A(L) is a polynomial in the lag operator L
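As a sketch of how a deterministic linear trend can be removed in practice (the coefficients, sample length, seed, and the use of NumPy's `polyfit` are my own illustration, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
t = np.arange(T)

# Simulated series with a deterministic linear trend: y_t = y0 + a1*t + eps_t
y0, a1 = 2.0, 0.5
y = y0 + a1 * t + rng.normal(0, 1, T)

# Estimate the trend by OLS and subtract it; the residual is trend-free
a1_hat, y0_hat = np.polyfit(t, y, 1)
detrended = y - (y0_hat + a1_hat * t)
```

The OLS residuals have mean zero by construction, and the slope estimate is close to the true a_1 = 0.5.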

Lecture 9 Forecast function and forecast intervals
1. Forecast function
1). Eg. 1. At year T, apple salesman S orders apples from the apple planter based on the estimated forecast of the demand for next year
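A minimal sketch of a forecast function and forecast interval for a zero-mean AR(1) model (the parameter values a = 0.7, σ = 1, y_T = 2 are hypothetical, not from the example):

```python
import numpy as np

# Zero-mean AR(1): y_t = a*y_{t-1} + eps_t, eps_t ~ N(0, sigma^2)
a, sigma, y_T = 0.7, 1.0, 2.0   # hypothetical values

def forecast(h):
    """h-step-ahead forecast function: E[y_{T+h} | y_T] = a^h * y_T."""
    return a**h * y_T

def forecast_var(h):
    """Forecast-error variance: sigma^2 * (1 + a^2 + ... + a^(2(h-1)))."""
    return sigma**2 * sum(a**(2 * i) for i in range(h))

def interval(h):
    """Approximate 95% forecast interval around the point forecast."""
    half = 1.96 * np.sqrt(forecast_var(h))
    return forecast(h) - half, forecast(h) + half
```

Note how the forecast-error variance grows with the horizon h, so the intervals widen as we forecast further ahead.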

Lecture 7 Sample ACF and Sample PACF, Box-Jenkins Model Selection
1. Summary of Lecture 6:
1) The lag-s ACF: ρ_s = Corr(y_t, y_{t−s}) = Cov(y_t, y_{t−s}) / var(y_t) = γ_s / γ_0 ; ρ_0 = 1
a. ACF for AR
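The sample ACF corresponding to 1) can be computed directly. The AR(1) simulation below (coefficient 0.8, seed, and sample length are my own choices) illustrates that the sample values track the theoretical ACF ρ_s = 0.8^s:

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample ACF: r_s = sum_t (y_t - ybar)(y_{t-s} - ybar) / sum_t (y_t - ybar)^2."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    denom = np.sum(d ** 2)
    return np.array([np.sum(d[s:] * d[:len(d) - s]) / denom
                     for s in range(max_lag + 1)])

# AR(1) with coefficient 0.8: theoretical ACF is rho_s = 0.8**s
rng = np.random.default_rng(1)
y = np.zeros(5000)
for t in range(1, 5000):
    y[t] = 0.8 * y[t - 1] + rng.normal()

r = sample_acf(y, 3)   # r[1] is close to 0.8, r[2] to 0.64, r[3] to 0.512
```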

Lecture 5 White noise, ARMA models, stationarity
1. Summary of Lecture 3:
1) Solve stochastic DE by general solution method:
a. When the forcing process is constructed from random variables, e.g. {ε_t}

Lecture 6 ACF and PACF
1. Summary of Lecture 5:
1) White noise:
Example 2 in Lecture 3: y_t = 0.3·y_{t−1} + 0.4·y_{t−2} + ε_t, where ε_t is i.i.N (independent, identically distributed, normal); this restriction is too strict,
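To illustrate why the i.i.N restriction is stricter than needed, the sketch below simulates Example 2 twice: once with i.i.N shocks and once with merely uncorrelated uniform shocks (both mean 0, variance 1). The seeds and sample length are my own choices; both runs produce a stationary series with the same second moments:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 10000

def simulate_ar2(shocks):
    """Example 2: y_t = 0.3*y_{t-1} + 0.4*y_{t-2} + eps_t."""
    y = np.zeros(len(shocks))
    for t in range(2, len(shocks)):
        y[t] = 0.3 * y[t - 1] + 0.4 * y[t - 2] + shocks[t]
    return y

# i.i.N shocks versus uncorrelated uniform shocks (both mean 0, variance 1):
# weak white noise is enough for the AR(2) dynamics to be stationary
y_norm = simulate_ar2(rng.normal(0.0, 1.0, T))
y_unif = simulate_ar2(rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), T))
```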

Lecture 3 DE with stochastic forcing process; stability condition
1. Summary of Lecture 2:
1) Difference equation (DE): a DE expresses the current value of a variable, e.g. y_t, as a function of its own lags

ECO 403: Time series analysis and prediction
Yushu Li
Department of Business and Management Science, NHH
[email protected]
Syllabus
1. 20 lectures, including 5 computer labs.
Lecture notes: handwritten notes + selected literature.
2. Time and place:
Wednesday

Lecture 2 Difference equations (DE) and how to solve a DE
1. Example 1, a stochastic DE (first order): y_t = a·y_{t−1} + ε_t,
y_t: production of apples (100 kg) from farm Q in year t.
As Q wants to plant more
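A short sketch (the parameter values are my own) that iterates the first-order stochastic DE forward and checks it against the iterative solution y_t = a^t·y_0 + Σ_{i=0}^{t−1} a^i·ε_{t−i}:

```python
import numpy as np

rng = np.random.default_rng(3)
a, y0, T = 0.5, 10.0, 50   # hypothetical coefficient and initial condition

# Forward iteration of y_t = a*y_{t-1} + eps_t from y_0
eps = rng.normal(0, 1, T + 1)
y = np.empty(T + 1)
y[0] = y0
for t in range(1, T + 1):
    y[t] = a * y[t - 1] + eps[t]

# Iterative (backward-substitution) solution evaluated at t = T
y_closed = a**T * y0 + sum(a**i * eps[T - i] for i in range(T))
```

The two values agree to floating-point precision, confirming the iterative solution formula.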

Lecture 10 Forecast Evaluation and Comparison
1. Summary of Lecture 9
At time T, based on the realizations y_1, …, y_T, the Box-Jenkins method selects a suitable ARMA(p,q) model; then we can use this model an
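Forecast evaluation typically compares competing models on their out-of-sample forecast errors. A minimal sketch with hypothetical error series from two models (the numbers are illustrative only):

```python
import numpy as np

def mse(errors):
    """Mean squared forecast error."""
    return np.mean(np.square(errors))

def mae(errors):
    """Mean absolute forecast error."""
    return np.mean(np.abs(errors))

# Hypothetical one-step-ahead forecast errors from two competing models
e_model_a = np.array([0.5, -1.0, 0.3, 0.8, -0.2])
e_model_b = np.array([1.5, -0.1, 0.2, 0.1, -0.3])
```

Note that the two criteria can disagree: here model A has the lower MSE (model B's single large error is penalized quadratically), while model B has the lower MAE.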

Lecture 13 Introduction to ARCH, GARCH process
1. The stylized fact
1) For the ARMA(p,q) model
y_t = a_0 + Σ_{i=1}^{p} a_i·y_{t−i} + Σ_{i=0}^{q} b_i·ε_{t−i}, where the ε_t are white noise with E(ε_t) = 0, var(ε_t) = σ², and realizations ε_t
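The stylized fact motivating ARCH is that a constant-variance white-noise error cannot produce volatility clustering. The sketch below simulates an ARCH(1) process (the coefficients a0 = 0.2, a1 = 0.5, seed, and length are my own): the levels are serially uncorrelated, but the squares are not.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 20000
a0, a1 = 0.2, 0.5   # ARCH(1): eps_t = v_t * sqrt(a0 + a1*eps_{t-1}^2), v_t ~ N(0,1)

eps = np.zeros(T)
for t in range(1, T):
    h = a0 + a1 * eps[t - 1] ** 2     # conditional variance
    eps[t] = rng.normal() * np.sqrt(h)

def lag1_corr(x):
    """Sample lag-1 autocorrelation."""
    d = x - x.mean()
    return np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)

# eps_t behaves like weak white noise, but eps_t^2 is autocorrelated:
# this is the volatility-clustering signature that ARCH captures
```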

Computer Lab 5 Unit Root, Granger Causality, Cointegration
1. Unit root test
A unit root test checks whether the series is covariance stationary or contains a stochastic trend (unit root)
1) The ACF of a series with a unit root decays slowly
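A sketch illustrating 1): the sample ACF of a random walk stays high even at long lags, while for a stationary AR(1) it dies out quickly (seed, length, and coefficient are my own choices):

```python
import numpy as np

rng = np.random.default_rng(5)
T = 2000

def acf(y, lag):
    """Sample autocorrelation at a given positive lag."""
    d = y - y.mean()
    return np.sum(d[lag:] * d[:-lag]) / np.sum(d ** 2)

eps = rng.normal(0, 1, T)
random_walk = np.cumsum(eps)        # unit root: y_t = y_{t-1} + eps_t

ar1 = np.zeros(T)                   # stationary: y_t = 0.5*y_{t-1} + eps_t
for t in range(1, T):
    ar1[t] = 0.5 * ar1[t - 1] + eps[t]
```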

Summary for Course Eco403
1. Lecture 1-4: Introduction and Difference Equations (DE)
1) Important contents:
a. Solve a stochastic DE by the iterative or the general solution method (Lecture 3, sections 1, 2, a., b.)
b. What does convergence mean

Estimation of ARMA Models
Eric Zivot
April 6, 2005
1 Maximum Likelihood Estimation of ARMA Models
For iid data with marginal pdf f(y_t; θ), the joint density function for a sample y = (y_1, . . . , y_T) is simply the product of the marginal densities for
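Equivalently, the log-likelihood of iid data is the sum of the marginal log-densities. A minimal sketch for a normal marginal (the sample values are hypothetical):

```python
import math

def normal_logpdf(y, mu, sigma2):
    """log f(y; mu, sigma^2) for a normal marginal density."""
    return -0.5 * (math.log(2 * math.pi * sigma2) + (y - mu) ** 2 / sigma2)

def loglik(sample, mu, sigma2):
    """iid data: joint log-density = sum of the marginal log-densities."""
    return sum(normal_logpdf(y, mu, sigma2) for y in sample)

sample = [0.2, -0.5, 1.1, 0.4]
# For iid normal data with known variance, the MLE of mu is the sample mean
mu_hat = sum(sample) / len(sample)
```

Moving mu away from the sample mean in either direction lowers the log-likelihood, consistent with the sample mean being the MLE.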

Lecture 19 Cointegration and error correction models (ECM)
1. Summary of Lecture 18
1) VAR model: it can investigate the feedback dynamics among several time series variables. E.g. a VAR(1) with two
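A sketch of a bivariate VAR(1) (the coefficient matrix and shock distribution are my own illustration): each variable depends on its own lag and the other variable's lag, and the system is stable when all eigenvalues of the coefficient matrix lie inside the unit circle.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 300

# Hypothetical VAR(1) with two variables: x_t = A @ x_{t-1} + eps_t
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])   # eigenvalues 0.6 and 0.3: inside the unit circle

x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(0, 1, 2)

stable = bool(np.all(np.abs(np.linalg.eigvals(A)) < 1))
```

The off-diagonal entries of A are what carry the feedback between the two series; setting one of them to zero is exactly the restriction tested for Granger causality.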

Lecture 17 Models with trend, Unit root test
1. Deterministic and stochastic trends
The Box-Jenkins method applies to stationary time series. Now let us first look at some time series which may not be stationary

Lecture 15 Volatility models: diagnostics, construction, forecasting
1. Summary of Lecture 14
1) Testing for ARCH and GARCH effects
Weak white noise: E(ε_t) = 0, Var(ε_t) = σ², Cov(ε_t, ε_{t−j}) = 0 for j ≠ 0, but co

Lecture 14 Test for ARCH, property of GARCH(1,1)
1. Summary for Lecture 13
1) If a time series shows volatility clustering (high volatility followed by high volatility, low volatility followed by low volatility)

Lecture 11 Seasonality, structural break and MLE
1. Seasonality
1) Def: A stochastic process is a seasonal (or periodic) time series with periodicity s
if X_t and X_{t+ks} have the same distribution
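A sketch of a series with periodicity s and of seasonal differencing (the periodicity s = 12, amplitude, and noise level are my own choices): subtracting X_{t−s} from X_t removes the periodic component and leaves only noise.

```python
import numpy as np

rng = np.random.default_rng(7)
s, n_cycles = 12, 200
T = s * n_cycles

# Deterministic seasonal pattern of period s, plus noise
pattern = np.sin(2 * np.pi * np.arange(s) / s)
x = 5.0 * np.tile(pattern, n_cycles) + rng.normal(0, 1, T)

# Seasonal differencing x_t - x_{t-s} cancels the periodic component exactly
d = x[s:] - x[:-s]
```

The variance of the differenced series is far below that of the original, because only the (doubled) noise variance remains.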

Lecture 8 Computer Lab 2 Box-Jenkins method
1. Summary of Lecture 7
1). Test H_0: ρ_s = 0 and H_0: φ_kk = 0 to identify q and p.
2). Principle of parsimony: the model should fit the data well (low sum of squared residuals
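The tests of H_0: ρ_s = 0 in 1) are commonly done against the approximate 5% band ±1.96/√T. A minimal sketch (the sample ACF values and T are hypothetical):

```python
import numpy as np

def significant_lags(acf_vals, T):
    """Approximate 5% test of H0: rho_s = 0 using the +/-1.96/sqrt(T) band.

    acf_vals[0] is r_0 = 1 and is skipped; returns the lags whose sample
    ACF falls outside the band.
    """
    band = 1.96 / np.sqrt(T)
    return [s for s, r in enumerate(acf_vals[1:], start=1) if abs(r) > band]

# Hypothetical sample ACF values r_0..r_4 from a series of length T = 100
acf_vals = [1.0, 0.62, 0.35, 0.12, 0.05]
```

Here the band is 1.96/√100 ≈ 0.196, so only lags 1 and 2 are significant.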

1. The cobweb model can be used to determine the price of e.g. wheat.
A stochastic version of it, as formulated in Enders, can be written:
d_t = a − γ·p_t            (1)
s_t = b + β·p*_t + ε_t     (2)
s_t = d_t                  (3)
where
d_t = demand for wheat in period t
s_t = supply of wheat in period t
p_t = market price of wheat in period t
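Under naive expectations p*_t = p_{t−1} (the usual textbook closure), market clearing turns the system into a first-order stochastic difference equation for the price, stable when |β/γ| < 1. The coefficient values and noise scale below are my own illustration, not from the exercise:

```python
import numpy as np

rng = np.random.default_rng(8)
a, b, gamma, beta = 10.0, 2.0, 1.0, 0.5   # hypothetical coefficients
T = 100

# Setting d_t = s_t with p*_t = p_{t-1} gives
# p_t = (a - b - beta*p_{t-1} - eps_t) / gamma
p = np.zeros(T)
p[0] = 3.0
eps = rng.normal(0, 0.1, T)
for t in range(1, T):
    p[t] = (a - b - beta * p[t - 1] - eps[t]) / gamma

# Deterministic long-run price: p* = (a - b) / (gamma + beta)
p_star = (a - b) / (gamma + beta)
```

With β/γ = 0.5 the price oscillates around p* with shrinking amplitude, the classic converging cobweb.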

1. A financial analyst is using a 2×4 moving average as the dependent variable in a regression model. He notes that the moving average is autocorrelated, although economic theory tells him that the original variable should be approximately white noise. He bec
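The analyst's observation can be reproduced directly: a 2×4 moving average of white noise is strongly autocorrelated by construction, because adjacent averages share most of their terms. A sketch (seed and sample length are my own choices):

```python
import numpy as np

rng = np.random.default_rng(9)
T = 20000
wn = rng.normal(0, 1, T)   # white noise: no autocorrelation

# 2x4 moving average = average of two adjacent 4-term moving averages,
# i.e. a weighted average with weights (1/8, 1/4, 1/4, 1/4, 1/8)
weights = np.array([1, 2, 2, 2, 1]) / 8.0
ma = np.convolve(wn, weights, mode="valid")

def lag1_corr(x):
    """Sample lag-1 autocorrelation."""
    d = x - x.mean()
    return np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)
```

The original series shows essentially zero lag-1 autocorrelation, while the smoothed series shows a lag-1 autocorrelation near 0.86, purely an artifact of the overlapping averaging window.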

Solution to Homework 5
1) As the regression includes drift but no time trend, we choose the test statistic (r̂ − 0)/sd(r̂) = 0.014. For sample size T = 250, the 5% critical value is −2.88; since 0.014 > −2.88, we do not reject the null hypothesis, t

Solution to part a.
The stationarity condition can be found in lecture note 6, page 2, Stationarity Condition for ARMA(p,q).
(1) In the autoregressive part we have order 2 with coefficients 0.3 and 0.4, and the characteristic equation is as below:
λ² − 0.3·λ − 0.4 = 0,
(λ − 0.8)(λ + 0.5) = 0
The characteristic roots are λ₁ = 0.8 and λ₂ = −0.5
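The factorization can be double-checked numerically, along with the stationarity condition that all characteristic roots lie inside the unit circle (a small NumPy sketch):

```python
import numpy as np

# Characteristic equation of the AR part: lambda^2 - 0.3*lambda - 0.4 = 0
roots = np.roots([1.0, -0.3, -0.4])

# Stationarity requires all characteristic roots inside the unit circle
stationary = bool(np.all(np.abs(roots) < 1.0))
```

The computed roots are 0.8 and −0.5, both of modulus less than one, so the AR part is stationary.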