STAT 758, Spring 2012
Key solution for Home Work 4
Prepared by Tracy Backes
MA(q)
Below we assume that Zt ~ WN(0, σ²).
4.1 Consider the MA(1) process Xt = aZt + bZt−1. Find the white noise Wt such that
the process Xt is represented as Xt = Wt + θWt−1 with Wt ~ W
Introduction to Time Series Analysis. Lecture 11.
Peter Bartlett
1. Review: Time series modelling and forecasting
2. Parameter estimation
3. Maximum likelihood estimator
4. Yule-Walker estimation
5. Yule-Walker estimation: example
Review (Lecture 1): Ti
Homework 2 solutions
Joe Neeman
September 22, 2010
1. (a) We compute three cases: since the Wt are uncorrelated, we can ignore
any cross-terms of the form E[Ws Wt] when s ≠ t. Then
γ(0) = E[Wt²] + (25/4)E[Wt−1²] + (9/4)E[Wt−2²] = (19/2)σ²,
γ(1) = (5/2)E[Wt²] + (15/4)E[Wt−1²] = (25/4)σ²,
γ(2) = (3/2)E[Wt²]
Introduction to Time Series Analysis. Lecture 5.
Peter Bartlett
www.stat.berkeley.edu/bartlett/courses/153-fall2010
Last lecture:
1. ACF, sample ACF
2. Properties of the sample ACF
3. Convergence in mean square
Introduction to Time Series Analysis. Lect
Homework 1 solutions
Joe Neeman
September 10, 2010
1. To check that {Xt} is white noise, we need to compute its means and
covariances. For the means, EXt = E[Wt(1 − Wt−1)Zt] = (EWt)(1 − EWt−1)(EZt) = 0.
For the covariances,
γ(s, t) = E[Ws(1 − Ws−1)Zs · Wt(
Introduction to Time Series Analysis. Lecture 1.
Peter Bartlett
1. Organizational issues.
2. Objectives of time series analysis. Examples.
3. Overview of the course.
4. Time series models.
5. Time series modelling: Chasing stationarity.
Organizational I
STAT 758, Spring 2012
Key solution for Home Work 3
Prepared by Tracy Backes
ACF, iid sequence, white noise, random walk
3.1 Give two examples (specify distributions) of each:
a) iid sequence:
(a.1) Xt iid ~ N(0, 1), t ∈ Z,
(a.2) Xt iid ~ Uniform([0, 1]), t ∈ Z
b
Stat 153: Homework 3
Due Mon 11/2
October 23, 2015
1. [5pts] Consider a dataset of size n generated according to the zero-mean AR(1) model
with parameter φ and Gaussian noise. The Yule-Walker estimate of φ is approximately
N(φ, (1 − φ²)/n). In this problem, we check this
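As a sketch of that check (the parameter values φ = 0.6, n = 500, the N(0,1) noise, and the number of replications are illustrative choices, not the assignment's):

```python
# Simulate the zero-mean AR(1) model X_t = phi * X_{t-1} + W_t many times and
# compare the spread of the Yule-Walker estimate with the asymptotic variance
# (1 - phi^2) / n.
import numpy as np

def simulate_ar1(phi, n, rng):
    """Generate n observations from a zero-mean AR(1) with N(0,1) noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def yule_walker_ar1(x):
    """Yule-Walker estimate of phi: lag-1 sample autocovariance over lag-0."""
    x = x - x.mean()
    gamma0 = np.dot(x, x) / len(x)
    gamma1 = np.dot(x[:-1], x[1:]) / len(x)
    return gamma1 / gamma0

rng = np.random.default_rng(0)
phi, n, reps = 0.6, 500, 2000
estimates = [yule_walker_ar1(simulate_ar1(phi, n, rng)) for _ in range(reps)]
print(np.var(estimates), (1 - phi**2) / n)  # both should be close
```

The empirical variance of the 2000 estimates should land near (1 − 0.6²)/500 = 0.00128.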
Statistical Models of Time Series
ARIMA Models
Important Examples
Statistical Measurements
Stationarity
Estimation of Correlation
Sample ACF
Prediction with ACF
More Properties of ACF
Review: Stationarity and Correlation
A time series {Xt} has mean function
Review: Sample Autocovariance
The sample mean is
μ̂ = (1/n) Σ_{i=1}^{n} x_i.
The
Review: Sample Mean
The sample mean is
μ̂ = (1/n) Σ_{i=1}^{n} x_i.
We saw that
Logistics
Introduction to Time Series
Statistical Models of Time Series
Stat 153: Introduction to Time Series
Joan Bruna
Department of Statistics
UC, Berkeley
September 7, 2015
ARIMA Models
Forecasting
Partial Autocorrelation Function
Convergence of Linear Processes
Autoregressive Processes
AR(p) Models
ARMA processes
Review: ARMA Processes
Definition
An ARMA(p,q) process {Xt} is a stationary process that satisfies
Xt − φ1 Xt−1
Last Lecture Review
A (stochastic) Time Series is a collection {Xt
ARIMA Models
Forecasting
Partial Autocorrelation Function
Innovations Representation
Parameter Estimation
Review: Forecasting Operator
X̂t+1 = Tt X = Σ_{k=1}^{t} φt,k Xt+1−k, with Rt the t × t matrix
Rt = [ RX(0)    RX(1)   ⋯  RX(t−1) ;
       RX(1)    RX(0)   ⋯  RX(t−2) ;
         ⋮                   ⋮     ;
       RX(t−1)  RX(t−2) ⋯  RX(0)  ]
Review: AR(1) Process
Xt = φ Xt−1 + Wt.
It has a unique, well-defined, stationary solution when |φ| < 1.
It has many non-stationary solutions, even for |φ| < 1.
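The idea behind the unique stationary solution can be sketched numerically (illustrative choice φ = 0.5, Gaussian noise; not from the slides): when |φ| < 1 the recursion contracts, so runs started from different initial values become indistinguishable, which is why the initial condition washes out of the stationary solution.

```python
# Run X_t = phi*X_{t-1} + W_t with the SAME noise from two different starting
# values; the gap between the runs shrinks like phi**n.
import numpy as np

rng = np.random.default_rng(1)
phi = 0.5
w = rng.standard_normal(1000)

def run(x0):
    """Iterate the AR(1) recursion over the fixed noise sequence w."""
    x = x0
    for wt in w:
        x = phi * x + wt
    return x

print(abs(run(0.0) - run(100.0)))  # essentially 0: the start is forgotten
```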
Stat 153: Homework 2
Due Mon 10/5
September 27, 2015
1. [5pts] (Problem 1.29)
(a) Suppose Xt is a weakly stationary process with zero mean and
Σh |RX(h)| < ∞. Show that if Σh RX(h) = 0 then
√n X̄ →p 0,
where X̄ is the sample mean.
(b) Give an example of a proc
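A quick numerical illustration of part (a), using a process I chose for convenience (not the assignment's): for Xt = Wt − Wt−1 we have RX(0) = 2σ², RX(±1) = −σ², and all other lags vanish, so Σh RX(h) = 0, and √n X̄ should be small.

```python
# For X_t = W_t - W_{t-1}, the sample mean telescopes to (W_n - W_0)/n,
# so sqrt(n) * X_bar = (W_n - W_0)/sqrt(n), which shrinks as n grows.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
w = rng.standard_normal(n + 1)
x = w[1:] - w[:-1]                 # X_t = W_t - W_{t-1}
root_n_mean = np.sqrt(n) * x.mean()
print(root_n_mean)                 # small, on the order of 1/sqrt(n)
```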
Stat 153: Homework 1
Due Fri 9/18
September 9, 2015
1. [5pts] (Problem 1.8) Consider the model
Xt = δ + Xt−1 + Wt,
for t = 1, 2, . . . with X0 = 0, and where Wt is a white noise with variance σ².
(a) Plot samples of Xt by simulation, by picking δ = 0 and σ = 0.
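A minimal simulation sketch for this model (the parameter values below are illustrative, not the assignment's):

```python
# Random walk with drift: X_t = delta + X_{t-1} + W_t with X_0 = 0, which
# unrolls to X_t = delta*t + W_1 + ... + W_t, i.e. a cumulative sum.
import numpy as np

def random_walk_with_drift(delta, sigma, n, seed=0):
    """Simulate X_1, ..., X_n from the model with X_0 = 0."""
    rng = np.random.default_rng(seed)
    w = sigma * rng.standard_normal(n)
    return np.cumsum(delta + w)

x = random_walk_with_drift(delta=0.0, sigma=1.0, n=200)
print(x[:5])
```

With σ = 0 the path is the deterministic trend δt, which makes the drift term easy to see against simulated paths.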
STAT 758, Fall 2014
Home Work 1
Due date: Sep. 10
Differencing, backshift operator
All notations are from lectures.
1.1 Show that the difference operators ∇ and ∇12 are commutative, that is, ∇∇12 = ∇12∇.
1.2 Show that the difference operator ∇7 eliminates a linear trend
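Problem 1.1 can be checked numerically on a sample path (a sketch, not a proof; the array length is arbitrary): applying ∇ = 1 − B and ∇12 = 1 − B¹² in either order gives the same series.

```python
# Differencing at lag L is x_t - x_{t-L}; composing two lags in either order
# yields identical values (only the bookkeeping of dropped endpoints differs).
import numpy as np

def diff_lag(x, lag):
    """Apply (1 - B^lag): returns x_t - x_{t-lag}."""
    return x[lag:] - x[:-lag]

rng = np.random.default_rng(3)
x = rng.standard_normal(100)
lhs = diff_lag(diff_lag(x, 1), 12)   # nabla_12 (nabla x)
rhs = diff_lag(diff_lag(x, 12), 1)   # nabla (nabla_12 x)
print(np.allclose(lhs, rhs))  # True
```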
STAT 758, Spring 2012
Key solution for Home Work 2
Prepared by Tracy Backes
ACF, stationarity
2.1 Let {Xt} be a sequence of uncorrelated random variables, each with mean 0 and
variance σ². For each of the following processes, find its representation in t
STAT 758, Fall 2014
Home Work 2 (due Sep. 24)
ACF, stationarity
2.1 Let {Xt} be a sequence of uncorrelated random variables, each with mean 0 and
variance σ². For each of the following processes, find its representation in terms
of lagged X-values (i.e.
STAT 758, Spring 2012
Solution key for Home Work 5
Prepared by Tracy Backes
MA(q), invertibility
Below we assume that Zt ~ WN(0, σ²), B is a backshift operator.
5.1 Find the operator inverse to
a) 1 + 2B:
1/(1 + 2B) = Σ_{i=0}^{∞} (−2)^i B^i
b) 1 − 0.3B:
1/(1 − 0.3B) = Σ_{i=0}^{∞} 0.3^i
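The geometric-series inverse in case b) can be verified numerically (a sketch with an arbitrary truncation level N, not part of the key): multiplying 1 − 0.3B by the truncated series Σ_{i=0}^{N} 0.3^i B^i leaves 1 plus a remainder of size 0.3^{N+1}.

```python
# Represent operator polynomials by coefficient arrays in increasing powers of
# B; operator composition is then polynomial multiplication (convolution).
import numpy as np

N = 30
inverse = 0.3 ** np.arange(N + 1)            # Sum 0.3^i B^i, truncated at N
product = np.convolve([1.0, -0.3], inverse)  # (1 - 0.3B) * truncated inverse
print(product[0], np.abs(product[1:]).max()) # ~1 and a ~0.3**(N+1) remainder
```

The series for a) converges formally but not as an operator on past values, which is exactly the invertibility issue the homework set is after.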
STAT 758, Fall 2014
Home Work 3 (due Sep 24)
ACF, iid sequence, white noise, random walk
3.1 Give two examples (specify distributions) of each: a) iid sequence, b) white noise,
c) random walk.
3.2 Give an example of a weakly but not strictly stationary stochas
STAT 758, Fall 2014
Home Work 5
MA(q) processes, invertibility
Below we assume that Zt ~ WN(0, σ²), B is a backshift operator.
Problem 1
Find the operator inverse to
a) 1 + 2B
b) 1 − 0.3B
c) 2 + 0.6B
Problem 2
Examine invertibility for the following processes:
STAT 758, Fall 2014
Home Work 4
MA(q)
Below we assume that Zt ~ WN(0, σ²).
4.1 Consider the MA(1) process Xt = aZt + bZt−1. Find the white noise Wt such that
the process Xt is represented as Xt = Wt + θWt−1 with Wt ~ WN(0, σW²).
4.2 Find the acvf and acf for MA(1),
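A numerical sketch of the rescaling idea behind 4.1 (assuming a ≠ 0; the values of a, b, σ below are illustrative): with Wt = aZt, the process aZt + bZt−1 equals Wt + (b/a)Wt−1, and Wt is again white noise, with variance a²σ².

```python
# Check the identity a*Z_t + b*Z_{t-1} == W_t + (b/a)*W_{t-1} with W_t = a*Z_t
# on a simulated path.
import numpy as np

rng = np.random.default_rng(4)
a, b, sigma = 2.0, 0.5, 1.0
z = sigma * rng.standard_normal(1000)

x = a * z[1:] + b * z[:-1]        # X_t = a Z_t + b Z_{t-1}
w = a * z                         # candidate white noise W_t = a Z_t
x_alt = w[1:] + (b / a) * w[:-1]  # W_t + (b/a) W_{t-1}
print(np.allclose(x, x_alt))      # True
```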
STAT 758, Fall 2014
Home Work 7
Conditional expectation, Second-order forecasting
Problem 1
The rvs Y and X are related as
Y = 10 + 20X + ε, ε ~ N(0, 4²).
a) Find the conditional distribution of Y given X = x.
b) Find the conditional expectation of Y given
STAT 758, Fall 2014
Home Work 6
SARIMA
We assume below that Zt is a white noise with mean 0 and variance σZ².
Problem 1
For the model (1 − B)(1 − 0.2B)Xt = (1 − 0.5B)Zt:
a) Classify the model as an ARIMA(p, d, q) process (i.e. find p, d, q).
b) Determine whet
We assume below that Zt ~ WN(0, σ²), B is a backshift operator.
6.1 For the model (1 − B)(1 − 0.2B)Xt = (1 − 0.5B)Zt:
a) Classify the model as an ARIMA(p, d, q) process (i.e. find p, d, q). ARIMA(1,1,1)
b) Determine whether the process is stationary, causal, inv
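One way to sanity-check the classification numerically (a sketch, not part of the key): inspect the roots of the AR and MA polynomials. A root on the unit circle signals the d = 1 differencing factor; roots strictly outside correspond to the stationary/causal AR part and the invertible MA part.

```python
# AR polynomial: (1 - B)(1 - 0.2B) = 1 - 1.2B + 0.2B^2; MA: 1 - 0.5B.
# np.roots takes coefficients highest degree first.
import numpy as np

ar_roots = np.roots([0.2, -1.2, 1.0])   # roots of 0.2 z^2 - 1.2 z + 1
ma_roots = np.roots([-0.5, 1.0])        # root of -0.5 z + 1
print(sorted(np.abs(ar_roots)), np.abs(ma_roots))  # [1.0, 5.0] and [2.0]
```

The unit root (|z| = 1) is the (1 − B) factor, the root at 5 gives the stationary AR(1) part, and the MA root at 2 lies outside the unit circle, so the model is invertible, consistent with the ARIMA(1,1,1) answer above.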
We assume below that Zt ~ WN(0, σ²), B is a backshift operator.
1. Construct 1, 2, and 3-step forecasts for the AR(2) process Xt = φ1Xt−1 + φ2Xt−2 + Zt and calculate the
forecast errors. Find the values of φ1, φ2 that minimize the forecast errors. Discuss.
Statistics 758, Fall 2014
University of Nevada Reno
Homework 7 Solutions
Problem 1: The rvs Y and X are related as Y = 10 + 20X + ε, ε ~ N(0, 4²).
a) Find the conditional distribution of Y given X = x:
Y | X = x ~ N(10 + 20x, 4²)
b) Find the conditional expectation of Y g
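A Monte Carlo sketch of part a) (the conditioning value x = 1.5 and sample size are arbitrary choices): fixing X = x and drawing ε should reproduce mean 10 + 20x and standard deviation 4.

```python
# With X fixed at x, Y = 10 + 20x + eps is Gaussian with mean 10 + 20x, sd 4.
import numpy as np

rng = np.random.default_rng(5)
x = 1.5
eps = 4.0 * rng.standard_normal(100_000)
y = 10 + 20 * x + eps
print(y.mean(), y.std())  # close to 40 and 4
```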
STAT 153: Homework 1
Due: September 9, 2016 at the beginning of the lab section
Problem 1
Suppose
Yt = μ0 + Σ_{i=1}^{k} [Ai cos(2π fi t) + Bi sin(2π fi t)],
where μ0, f1, f2, …, fk are constants and A1, A2, …, Ak, B1, B2, …, Bk are independent random var
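A simulation sketch of what the problem is driving at (taking the constant level to be zero, one harmonic k = 1, and unit variance for A and B; all illustrative): if Ai and Bi have mean 0 and variance σi², the process is weakly stationary with autocovariance γ(h) = Σi σi² cos(2π fi h), independent of t.

```python
# Estimate E[Y_t Y_{t+h}] by Monte Carlo over draws of (A, B) at a fixed t,
# and compare with sigma^2 * cos(2*pi*f*h).
import numpy as np

rng = np.random.default_rng(6)
f, sigma = 0.1, 1.0
reps, t, h = 200_000, 7, 3
A = sigma * rng.standard_normal(reps)
B = sigma * rng.standard_normal(reps)
y_t  = A * np.cos(2 * np.pi * f * t) + B * np.sin(2 * np.pi * f * t)
y_th = A * np.cos(2 * np.pi * f * (t + h)) + B * np.sin(2 * np.pi * f * (t + h))
print(np.mean(y_t * y_th), sigma**2 * np.cos(2 * np.pi * f * h))
```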
STAT 153: Homework 2
Due: September 16, 2016 at the beginning of the lab section
Problem 1
The data file wages contains monthly values of the average hourly wages (in dollars) for workers in the U.S. apparel
and textile products industry for July 1981