Consistency and asymptotic normality of estimators
In the previous chapter we considered estimators of several different parameters. The hope
is that as the sample size increases the estimator should get closer to the parameter of interest.
[Figure 1: Plot of the sunspot data]
Solutions 6 (STAT673)
(5.1) In this exercise we analyze the Sunspot data found on the course website.
Solutions 3 (STAT673)
(i) Show that the function c(u) = exp(-a|u|), where a > 0, is a positive semi-definite function.

The Fourier transform is

f(ω) = ∫_{-∞}^{0} exp(au) exp(-iωu) du + ∫_{0}^{∞} exp(-au) exp(-iωu) du = 1/(a - iω) + 1/(a + iω) = 2a/(a² + ω²),

which is clearly positive.
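This positivity can also be checked numerically. The course's computing is done in R; the following is a minimal Python sketch, with a = 2 an arbitrary illustrative choice, comparing a Riemann-sum approximation of the Fourier transform against the closed form 2a/(a² + ω²):

```python
import numpy as np

a = 2.0  # arbitrary illustrative value; any a > 0 works

def f_closed(omega):
    # Closed form of the Fourier transform of exp(-a|u|): 2a / (a^2 + omega^2)
    return 2 * a / (a**2 + omega**2)

# Riemann-sum approximation of the integral exp(-a|u|) exp(-i*omega*u) du
u = np.linspace(-40, 40, 160001)
du = u[1] - u[0]
for omega in (0.0, 0.5, 1.0, 3.0):
    approx = (np.exp(-a * np.abs(u)) * np.exp(-1j * omega * u)).sum().real * du
    assert abs(approx - f_closed(omega)) < 1e-3  # numerics agree with closed form
    assert f_closed(omega) > 0                   # density is strictly positive
```

Strict positivity of the Fourier transform is what makes c(u) a valid (positive semi-definite) covariance function.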
Solutions 4 (STAT673)
(3.1) Recall the AR(2) models considered in Exercise (2.4). Now we want to derive their autocovariance functions.

(i) (a) Obtain the ACF corresponding to

Xt = φ1 Xt-1 + φ2 Xt-2 + εt,

where {εt} are iid random variables with mean zero and variance one.
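For any concrete choice of coefficients, the ACF of an AR(2) model can be generated from the Yule-Walker difference equations. A Python sketch with hypothetical coefficients φ1 = 0.5, φ2 = -0.3 (not necessarily those of Exercise (2.4)):

```python
import numpy as np

def ar2_acf(phi1, phi2, nlags):
    # Yule-Walker: rho(1) = phi1 / (1 - phi2); rho(k) = phi1*rho(k-1) + phi2*rho(k-2)
    rho = np.empty(nlags + 1)
    rho[0] = 1.0
    rho[1] = phi1 / (1 - phi2)
    for k in range(2, nlags + 1):
        rho[k] = phi1 * rho[k - 1] + phi2 * rho[k - 2]
    return rho

# Hypothetical coefficients; they must lie in the stationarity region
rho = ar2_acf(0.5, -0.3, 10)
for k in range(2, 11):
    # the ACF satisfies the defining difference equation at every lag
    assert abs(rho[k] - (0.5 * rho[k - 1] - 0.3 * rho[k - 2])) < 1e-12
```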
Solutions 8 (STAT673)
(8.1) (a) Simulate an AR(2) process and run the above code using the sample sizes:

[Plot: sqrt(n) * Re(dftcov1[1:30]) against index]

(i) n = 64 (however use k <- kernel("daniell", 3)).

A plot is given in Figure 1.
Solutions 2 (STAT673)
(1.5) State, with explanation, which of the following time series are second order stationary,
which are strictly stationary and which are both.

(i) {εt} are iid random variables with mean zero and variance one.

{εt} is strictly stationary, since the joint distribution of any collection of the variables is unchanged by a time shift; as the mean and variance exist, it is also second order stationary.
Solutions 1 (STAT673)
(i) Import the yearly temperature data (file global mean temp.txt) into R and fit
a linear trend to the data (use the R command lsfit).
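The exercise uses R's lsfit; the least-squares trend fit can be sketched equivalently in Python. Since global mean temp.txt is not reproduced here, the sketch uses synthetic data with a known trend (the slope 0.007 and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1880, 2010)                              # hypothetical yearly index
y = 0.007 * (t - 1880) + rng.normal(0.0, 0.1, t.size)  # synthetic "anomalies"

slope, intercept = np.polyfit(t, y, 1)                 # least squares: y = intercept + slope*t
residuals = y - (intercept + slope * t)
assert abs(slope - 0.007) < 0.002                      # recovers the synthetic trend
```

With an intercept in the model, the least-squares residuals average to (numerically) zero, which is the starting point for examining their correlation in part (ii).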
(ii) Suppose the errors in the model are correlated. Under this correlated-errors assumption,
Solutions 7 (STAT673)
(6.1) Under the assumption that {Xt} are iid random variables, show that ĉn(1) is asymptotically normal.

Hint: Let m = n/(b + 1) and partition the sum Σ_{t=1}^{n-1} Xt Xt+1 as follows:

Σ_{t=1}^{n-1} Xt Xt+1 = (X1 X2 + ... + Xb Xb+1) + Xb+1 Xb+2 + (Xb+2 Xb+3 + ...) + ...,

i.e. into m blocks of length b separated by single bridging terms.
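The asymptotic normality being shown here can be checked by simulation: for iid variance-one data, sqrt(n)·ĉn(1) should be approximately N(0, 1). A Python sketch (sample size, replication count and seed are arbitrary choices; the course's computing is in R):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 500, 4000
stats = np.empty(reps)
for i in range(reps):
    x = rng.normal(size=n)                 # iid, mean zero, variance one
    c1 = np.dot(x[:-1], x[1:]) / n         # sample autocovariance at lag one
    stats[i] = np.sqrt(n) * c1

# For iid variance-one data, sqrt(n) * c_n(1) is approximately N(0, 1)
assert abs(stats.mean()) < 0.1
assert abs(stats.var() - 1.0) < 0.15
```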
(a) With a sample proportion of p̂ = 810/2500 = 0.324, the estimate of the standard error is given
by s.e. = sqrt(p̂(1 - p̂)/n) = sqrt(0.324 × 0.676/2500) = 0.00936. The
95% margin of error is (z.025)(s.e.) = (1.96)(0.00936) = 0.0183456, so that the
95% confidence interval for p is 0.324 ± 0.0183 = (0.306, 0.342).
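The arithmetic can be reproduced in a few lines of Python (the implied count 810 follows from 0.324 × 2500; 1.96 is the usual z.025 value):

```python
from math import sqrt

p_hat = 810 / 2500                     # sample proportion 0.324
se = sqrt(p_hat * (1 - p_hat) / 2500)  # estimated standard error
margin = 1.96 * se                     # 95% margin of error
ci = (p_hat - margin, p_hat + margin)
print(round(se, 5), round(margin, 4), tuple(round(v, 3) for v in ci))
# prints: 0.00936 0.0183 (0.306, 0.342)
```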
Time Series Analysis (Final Exam) 2 hours (Statistics Majors)
(1) Suppose that {Xt} is a second order stationary time series with autocovariance function
{c(r)}, with Σr |r c(r)| < ∞. Directly verify that the DFT of the covariances
Estimation for Linear models
The Gaussian likelihood.
Some idea of what a cumulant is.
To derive the sample autocovariance of a time series, and show that this is a positive semi-definite sequence.
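The positive semi-definiteness of the sample autocovariance (with divisor n) can be illustrated numerically. A Python sketch on simulated white noise (sample size, maximum lag and seed are arbitrary choices):

```python
import numpy as np

def sample_acov(x, maxlag):
    # Divisor n (not n - r) makes the autocovariance sequence positive semi-definite
    n = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[:n - r], xc[r:]) / n for r in range(maxlag + 1)])

rng = np.random.default_rng(2)
x = rng.normal(size=200)
c = sample_acov(x, 20)

# The Toeplitz matrix built from the sample autocovariances is psd
T = np.array([[c[abs(i - j)] for j in range(21)] for i in range(21)])
eigmin = float(np.linalg.eigvalsh(T).min())
assert eigmin > -1e-10  # no eigenvalue below zero (up to numerical error)
```

The same check fails in general if the divisor n − r is used instead of n, which is why the divisor-n convention is standard.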
STAT 673 Homework 1
(1) Consider the MA(2) process
Xt = εt + θ1 εt-1 + θ2 εt-2,

where {εt} are iid random variables with mean zero and variance one. Derive the
autocovariance function of {Xt}.
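A derivation for the parameterization Xt = εt + θ1 εt-1 + θ2 εt-2 gives c(0) = 1 + θ1² + θ2², c(1) = θ1 + θ1θ2, c(2) = θ2 and c(r) = 0 for |r| > 2; these values can be sanity-checked by simulation. A Python sketch with hypothetical values θ1 = 0.6, θ2 = 0.3:

```python
import numpy as np

# Hypothetical MA(2): Xt = e_t + th1*e_{t-1} + th2*e_{t-2}, iid variance-one noise
th1, th2 = 0.6, 0.3
rng = np.random.default_rng(3)
n = 200_000
eps = rng.normal(size=n + 2)
x = eps[2:] + th1 * eps[1:-1] + th2 * eps[:-2]

def acov(x, r):
    # Sample autocovariance at lag r (mean known to be zero here)
    m = len(x)
    return float(np.dot(x[:m - r], x[r:]) / m)

theory = {0: 1 + th1**2 + th2**2, 1: th1 + th1 * th2, 2: th2, 3: 0.0}
for r, c_r in theory.items():
    assert abs(acov(x, r) - c_r) < 0.03  # Monte Carlo agreement with the formulas
```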
(2) Derive conditions for the AR(2) model Xt = φ1 Xt-1 + φ2 Xt-2 + εt to have a stationary solution.
A time series is a series of observations xt, each observed at time t. Typically the observations
can be over an entire interval, randomly sampled on an interval, or at fixed time points. Different
types of time sampling require different methods of analysis.
Linear time series
Familiarity with linear models.
Solve polynomial equations.
Be familiar with complex numbers.
Understand under what conditions the sequences have well-defined limits, with particular application to infinite sums.
The best linear predictor.
Some idea of what a basis of a vector space is.
Understand that prediction using a long past can be difficult because a large matrix
has to be inverted; thus alternative, recursive methods are used.
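A standard recursive alternative of this kind is the Levinson-Durbin algorithm, which computes the best linear predictor coefficients in O(p²) operations instead of inverting a p × p Toeplitz matrix. A Python sketch (the recursion is standard; the AR(1) autocovariance used as a check, with φ = 0.7, is a hypothetical example):

```python
import numpy as np

def levinson_durbin(c, p):
    """Best linear predictor coefficients of order p from the autocovariances
    c = [c(0), ..., c(p)], computed recursively without matrix inversion."""
    phi = np.zeros(p + 1)
    v = c[0]                                   # prediction error variance
    for k in range(1, p + 1):
        acc = c[k] - np.dot(phi[1:k], c[1:k][::-1])
        refl = acc / v                         # reflection (partial autocorrelation) coeff.
        new = phi.copy()
        new[k] = refl
        new[1:k] = phi[1:k] - refl * phi[1:k][::-1]
        phi = new
        v *= (1 - refl ** 2)                   # error variance shrinks at each order
    return phi[1:], v

# Sanity check on a hypothetical AR(1) with phi = 0.7: c(r) = 0.7^r / (1 - 0.7^2)
c = np.array([0.7 ** r / (1 - 0.7 ** 2) for r in range(4)])
coeffs, v = levinson_durbin(c, 3)
assert abs(coeffs[0] - 0.7) < 1e-10           # recovers the AR(1) coefficient
assert max(abs(coeffs[1:])) < 1e-10           # higher-order coefficients vanish
```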
The Gaussian likelihood.
The approximation of a Toeplitz matrix by a circulant matrix (covered in previous chapters).
The DFTs are close to uncorrelated but have a frequency dependent variance (under suitable conditions on the time series).
Knowledge of complex numbers.
Have some idea of what the covariance of a complex random variable is (we do define it).
Some idea of a Fourier transform.
Know the definition of the spectral density.
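The point above about the DFTs (close to uncorrelated, frequency dependent variance) can be illustrated in the simplest case of white noise, where the variance is flat across frequencies. A Python sketch (sample size, frequencies and replication count are arbitrary choices):

```python
import numpy as np

# DFT ordinates J(omega_k) = n^(-1/2) * sum_t X_t exp(-i*t*omega_k) for white noise
rng = np.random.default_rng(4)
n, reps = 128, 3000
J = np.array([np.fft.fft(rng.normal(size=n)) / np.sqrt(n) for _ in range(reps)])

k1, k2 = 5, 20                             # two distinct Fourier frequencies
var_k1 = float(np.mean(np.abs(J[:, k1]) ** 2))
cross = np.mean(J[:, k1] * np.conj(J[:, k2]))

assert abs(var_k1 - 1.0) < 0.1   # variance is flat (constant) for white noise
assert abs(cross) < 0.1          # ordinates at distinct frequencies nearly uncorrelated
```

For a general second order stationary series the variance of J(ω_k) is no longer flat but is proportional to the spectral density at ω_k, which is the connection this chapter develops.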