STAT 626 Exam 1
Summer, 1996 Dr. Newton
Name:
1. (10 points) If X is the result of applying a (2K + 1)-point moving average smoother to a white noise process
having variance σ², what is the variance of X(t)? What happens to this variance as K gets large?
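As a numerical cross-check (a minimal numpy sketch, not the course software), the variance of the smoothed series follows directly from the smoother weights, since Var of a weighted sum of uncorrelated variables is σ² times the sum of squared weights:

```python
import numpy as np

sigma2 = 1.0  # white-noise variance (the sigma^2 in the problem)

# Variance of the smoother output: sigma^2 * (sum of squared weights)
for K in [1, 5, 25]:
    w = np.full(2 * K + 1, 1.0 / (2 * K + 1))  # equal weights of the moving average
    var_out = sigma2 * np.sum(w ** 2)          # equals sigma^2 / (2K + 1)
    print(K, var_out)
```

The printed variance is σ²/(2K + 1), which shrinks toward 0 as K grows.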
2. (10
STAT 626: Outline Lecture 20
ARCH-GARCH Models (5.4)
1. Taking Care of Time-Varying Variances: σ_t²
2. Time Series Decomposition: x_t = μ_t + σ_t ε_t ,
   Var(σ_t ε_t) = σ_t² .
3. How to Model Time-Varying Variances?
Recall that the Squared Residuals y_t² are Reasonable Estima
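The recursion behind these outlines can be sketched numerically. The code below (a numpy illustration with made-up parameter values, not the course software) simulates an ARCH(1) process, x_t = σ_t ε_t with σ_t² = a0 + a1 x_{t−1}²:

```python
import numpy as np

rng = np.random.default_rng(0)
a0, a1 = 0.2, 0.5                     # ARCH(1) parameters (illustrative values)
n = 100_000
x = np.zeros(n)
sig2 = np.full(n, a0 / (1 - a1))      # conditional variances; start at stationary level
for t in range(1, n):
    sig2[t] = a0 + a1 * x[t - 1] ** 2          # time-varying conditional variance
    x[t] = np.sqrt(sig2[t]) * rng.standard_normal()

# Unconditional variance should be near a0 / (1 - a1) = 0.4
print(x.var())
```

The series is white noise to second order, but its squared values are autocorrelated, which is why squared residuals carry the information about σ_t².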
STAT 626: Outline of Lecture 14
The ARIMA (p, d, q) Model Building Process (3.8)
1. Plot the Data, Transform to Stationarity if Necessary,
Select the Differencing Order d.
2. Model Formulation: Use the ACF and PACF to Select p, q:
ARIMA(p, d, q).
3. Model
STAT 626: Outline of Lecture 10
(Partial) Correlogram: ACF and PACF of ARMA Models (3.4)
1. Review:
One-Sided MA(∞) or Causal Process: Is a time series involving only the past
and present values of a white noise (shocks, inputs):
x_t = Σ_{j=0}^∞ ψ_j w_{t−j}
with absolutely summable coefficients.
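For a concrete instance (a numpy sketch; the AR(1) example with ψ_j = φ^j is supplied here for illustration), the causal AR(1) has weights ψ_j = φ^j, and the linear-process form reproduces its autocovariance γ(h) = σ² φ^h / (1 − φ²):

```python
import numpy as np

phi, sigma2 = 0.6, 1.0
psi = phi ** np.arange(200)      # psi_j = phi^j, truncated MA(inf) weights

def gamma(h):
    """gamma(h) = sigma^2 * sum_j psi_j * psi_{j+h} for a linear process."""
    return sigma2 * np.sum(psi[: len(psi) - h] * psi[h:])

for h in range(4):
    print(h, gamma(h), sigma2 * phi ** h / (1 - phi ** 2))
```

The two printed columns agree up to the (negligible) truncation of the infinite sum.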
STAT 626: Outline of Lecture 15
Multiplicative Seasonal ARIMA (SARIMA) Models (3.9)
1. Plot the Data
2. Induce Stationarity by Seasonal Differencing or Other Means
3. Model Formulation: Use the ACF and PACF to Select p, q, P, Q
4. Model Estimation: Find t
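Step 2 can be illustrated with a toy monthly series (a numpy sketch; the series is fabricated): a lag-12 difference removes a fixed seasonal pattern, and one further first difference removes a linear trend.

```python
import numpy as np

t = np.arange(120)                                   # 10 years of monthly data
season = np.tile(np.sin(2 * np.pi * np.arange(12) / 12) * 5, 10)
x = season + 0.3 * t                                 # seasonal pattern + linear trend

d12 = x[12:] - x[:-12]    # seasonal difference: constant 0.3 * 12 = 3.6
d = np.diff(d12)          # first difference on top: identically 0
print(d12[:3], d[:3])
```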
STAT 626: Outline of Lecture 1
1. A Quick Review of the Syllabus
2. Time Series Data:
Anything measured regularly over time is a time series.
3. Examples of Time Series Data
4. Goals of Time Series Analysis
Forecasting
Description of the dynamics of the data
STAT 626: Outline of Lecture 9
Review of ARMA Models; Difference Equations (3.2, 3.3)
1. Evaluation of Project Presentations:
Your group's presentation/project is evaluated based on the level of interest/questions/enthusiasm
it generates;
Local students are
STAT 626: Outline of Lecture 12
Forecasting Based on the Infinite Past (3.5)
1. Forecasting a One-sided MA(∞) Time Series; Infinite Past
2. Example 3.22: Long-Range Forecasts
3. Prediction Error Variance (P_{n+1}^n) and Forecast Interval:
   x_{n+1}^n ± 1.96 √P_{n+1}^n
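The long-range behavior in Example 3.22 can be sketched for an AR(1) (a numpy illustration with made-up values): the m-step forecast decays to the mean, and the prediction error variance grows to γ(0), so long-range intervals stop shrinking.

```python
import numpy as np

phi, sigma2, mu, xn = 0.8, 1.0, 10.0, 13.0
gamma0 = sigma2 / (1 - phi ** 2)          # stationary variance

for m in [1, 5, 50]:
    fc = mu + phi ** m * (xn - mu)                        # m-step-ahead forecast
    P = sigma2 * (1 - phi ** (2 * m)) / (1 - phi ** 2)    # prediction error variance
    lo_, hi_ = fc - 1.96 * np.sqrt(P), fc + 1.96 * np.sqrt(P)
    print(m, fc, P, (lo_, hi_))
```

By m = 50 the forecast is essentially μ and P is essentially γ(0).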
STAT 626: Review: Multivariate Time Series Analysis (5.8)
1. Problem 3.42: Normal equations, Invertibility, etc.
2. How to Model Several TS Simultaneously?
3. VMA(1) Models: X_t = W_t + Θ W_{t−1} = (I + Θ B) W_t ,
   W_t ~ WN(0, Σ)
   ACF, PACF, Invertibility, ...?
4. VAR(1) Mode
STAT 626: Outline of Lecture 19
Spectral Density Func., DFT, Periodogram (4.3, 4.4)
1. There are two major approaches to time series analysis, both of which aim to reduce the data to WN or make them uncorrelated:
Time-Domain Approach: Relies on the c
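The frequency-domain side can be sketched with numpy's FFT (an illustration, not the course software): the periodogram of a pure sinusoid peaks at its frequency.

```python
import numpy as np

n = 256
t = np.arange(n)
x = np.cos(2 * np.pi * 10 * t / n)        # sinusoid at Fourier frequency 10/n

# Periodogram: squared modulus of the DFT, scaled by 1/n
per = np.abs(np.fft.fft(x)) ** 2 / n
k = np.argmax(per[1 : n // 2]) + 1        # index of the dominant frequency
print(k)                                   # -> 10
```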
STAT 626: Outline of Lecture 16
Focus On Forecasting (3.5)
1. Treating TS Data as Independent Observations: x̄ IS the Predictor
2. Forecasting An AR(1) Model: The Past is Discounted by Correlation
3. Reading Assignment: Examples 3.36, Random Walk, and 3.37
STAT 626: Outline of Lecture 2
Time Series Statistical Models (1.3)
1. White Noise: The Building Blocks
2. Autoregression: The Birth of Modern Time Series Analysis
3. Random Walks: The Engine of Financial Engineering
4. Signal + Noise: For Other Engineeri
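These building blocks can be generated in a few lines (a numpy sketch with arbitrary parameter values):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
w = rng.standard_normal(n)                  # 1. white noise
ar = np.zeros(n)                            # 2. autoregression: x_t = 0.9 x_{t-1} + w_t
for t_ in range(1, n):
    ar[t_] = 0.9 * ar[t_ - 1] + w[t_]
rw = np.cumsum(w)                           # 3. random walk: partial sums of the noise
sig = 2 * np.cos(2 * np.pi * np.arange(n) / 50) + w   # 4. signal + noise
print(w.std(), rw[-1])
```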
STAT 626: Outline of Lecture 8
Causal and Invertible ARMA Models (3.2)
1. Review of Regression & Stationary TS
Linear Processes: A time series defined by
x_t = μ + Σ_{j=−∞}^∞ ψ_j w_{t−j}
with absolutely summable coefficients is stationary with the autocovariance functio
STAT 626: Outline of Lecture 5
Transformations to Stationarity (2.3)
1. Forms of Nonstationarity
   x_t = μ_t + σ_t y_t ,
   where {y_t} is stationary.
2. Differencing (Fixes Mean-Nonstationarity)
3. Log and Power (Box-Cox) Transforms (Fixes Variance-Nonstat.)
4. R
STAT 626: Outline of Lectures 16 and 17
ARMA Model Estimation (3.6)
1. Method of Moments (1900): Yule-Walker Estimators (1927, 1933)
Example: For a causal AR(1), x_t = φ x_{t−1} + w_t , there are two parameters (φ, σ_w²) to be
estimated. They satisfy
γ(1) = φ γ(0),
σ_w² = γ(0)(1 − φ²)
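Solving these two moment equations is immediate (a numpy sketch; the autocovariances below are the exact ones implied by φ = 0.5, σ_w² = 1, so the estimators recover those values):

```python
import numpy as np

# Autocovariances of a causal AR(1) with phi = 0.5, sigma_w^2 = 1:
g0 = 1 / (1 - 0.25)          # gamma(0) = sigma_w^2 / (1 - phi^2) = 4/3
g1 = 0.5 * g0                # gamma(1) = phi * gamma(0)

phi_hat = g1 / g0            # from gamma(1) = phi * gamma(0)
sig2_hat = g0 * (1 - phi_hat ** 2)
print(phi_hat, sig2_hat)     # approximately 0.5 and 1.0
```

In practice g0 and g1 are replaced by the sample autocovariances, giving the Yule-Walker estimators.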
STAT 626: Outline of Lecture 4
Estimation of the Mean, Correlation and ACF (1.6)
1. Review of Stationary TS and Computing ACF
2. The Sample Mean
3. Sample Autocovariance Function
4. Distribution of the Sample Autocorrelation Function (ACF)
5. The Sample C
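Items 2-4 can be made concrete with a short function (a numpy sketch) implementing the sample autocovariance γ̂(h) = n⁻¹ Σ (x_{t+h} − x̄)(x_t − x̄) and the sample ACF:

```python
import numpy as np

def sample_acov(x, h):
    """Sample autocovariance at lag h (divisor n, as in the textbook)."""
    n, xbar = len(x), np.mean(x)
    return np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n

def sample_acf(x, h):
    return sample_acov(x, h) / sample_acov(x, 0)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
acf = [round(float(sample_acf(x, h)), 3) for h in range(3)]
print(acf)    # -> [1.0, 0.4, -0.1]
```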
STAT 626: Outline of Lecture 6
Regression in the Time Series Contexts (2.2)
1. Review of Stationarity, Preview of TS Models
2. A Quick Review of Multiple Regression
x = Zβ + w,
LSE of β:
β̂ = (Z′Z)⁻¹ Z′x.
3. Tests, CIs and Variable Selection
4. AIC (Akai
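The normal-equations formula for the LSE can be checked against numpy's least-squares solver (an illustration with fabricated regression data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
Z = np.column_stack([np.ones(n), np.arange(n)])   # intercept + linear-trend regressors
beta = np.array([1.0, 0.05])
x = Z @ beta + rng.standard_normal(n)             # x = Z beta + w

b = np.linalg.solve(Z.T @ Z, Z.T @ x)             # b = (Z'Z)^{-1} Z'x
b_ls, *_ = np.linalg.lstsq(Z, x, rcond=None)      # numpy's built-in LS solution
print(b, b_ls)
```

Both routes give the same coefficient vector; `solve` on the normal equations is shown only to mirror the formula above.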
STAT 626: Outline of Lecture 11
Forecasting (3.5)
1. Best Linear Prediction for Stationary Processes
2. Forecasting an AR(2) Model: Finite Past
3. Durbin-Levinson Algorithm: PACF
4. Prediction Error Variance (P_{n+1}^n) and Forecast Interval:
   x_{n+1}^n ± 1.96 √P_{n+1}^n
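The Durbin-Levinson recursion in item 3 can be sketched directly from the ACF (a numpy illustration; the AR(1) check at the end uses the standard fact that its PACF is φ at lag 1 and 0 afterward):

```python
import numpy as np

def pacf_from_acf(rho, max_lag):
    """Durbin-Levinson: PACF phi_{hh} from autocorrelations rho[0..max_lag]."""
    phi = np.zeros((max_lag + 1, max_lag + 1))
    phi[1, 1] = rho[1]
    pacf = [float(rho[1])]
    for h in range(2, max_lag + 1):
        num = rho[h] - sum(phi[h - 1, j] * rho[h - j] for j in range(1, h))
        den = 1 - sum(phi[h - 1, j] * rho[j] for j in range(1, h))
        phi[h, h] = num / den
        for j in range(1, h):                       # update intermediate coefficients
            phi[h, j] = phi[h - 1, j] - phi[h, h] * phi[h - 1, h - j]
        pacf.append(float(phi[h, h]))
    return pacf

rho = 0.5 ** np.arange(6)          # ACF of an AR(1) with phi = 0.5
print(pacf_from_acf(rho, 3))       # -> [0.5, 0.0, 0.0]
```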
STAT 626: Taking Care of Residual Correlations (5.6)
1. Regression with Time Series Errors
2. Regression with Autocorrelated Errors (5.6)
3. Multivariate ARMAX Models (5.8)
4. Spectral Density Estimation (Chap. 4)
Review of ARCH/GARCH Models
1. Time Ser
STAT 626 Exam 2
Summer, 1996 Dr. Newton
Name:
1. (12 points) What are ρ(1) and ρ(2) for an AR(2) process having coefficients φ1 and φ2? (Hint:
Use the Yule-Walker Equations for v = 1 and v = 2, expressed in terms of ρ's, to find two equations
in two unknowns.)
2. (
STAT 626 Exam 1
Summer, 1997 Dr. Newton
Name:
1. (10 points) Derive the value of x(n + 2) if I apply first differencing to a data set x(t) = a + bt, t = 1, . . . , n,
and then do undifferencing as implemented by the EXTEND command.
2. (10 points) Solve the differe
STAT 626 Exam 2
Summer, 1997 Dr. Newton
Name:
1. (10 points) Derive a formula for θ and σ² for an MA(1) process in terms of R(0) and R(1).
2. (10 points) If w(t) ~ WN(σ²), φ is a number in (−1, 1), X(0) is a N(0, σ²/(1 − φ²)) random
variable independent of the w(t)'s, and for
HW 1 Solutions
(Points - 1.2, 1.4, 1.6, 1.7 each carry 20 points. Q.II Parts 1 and 2 each carry 10 points. Q.III Parts (a),(b),(c) carry
4,3,3 points respectively)
Question II:
1). Σ_{i=1}^n (x_i − x̄) = (Σ_{i=1}^n x_i) − n x̄ = n x̄ − n x̄ = 0.
Also, Σ_{i=1}^n (x_i − x̄)(y_i − ȳ) =
STAT 626, Summer 2016: Homeworks
NOTE: (i) The bonus problems get deeper into the concepts and will encourage you to study
the relevant sections of the textbook more systematically.
(ii) To get the bonus points for a problem, a complete solution should be
STAT 626 Exam 1
Summer, 1999 Dr. Newton
Name:
1. (10 points) What are R̂(v) for v = 0, . . . , 4 if x(t) = t for t = 1, . . . , 5?
2. (10 points) If I have a random walk with X(0) = 0, what is Corr(X(s), X(t)) for s < t?
3. (10 points) Consider the differe
STAT 626 Exam 2
Summer, 2004 Dr. Newton
Name:
1. (20 points) If X ~ MA(2) with σ² = 1,
a. What is ρ(1) if θ1 = 0?
b. Find the formula for the largest that ρ(1) can be as a function of θ1 .
c. What are the equivalent number of uncorrelated observations if θ1 = 1 a