Introduction to Time Series Analysis. Lecture 20.
1. Review: The periodogram
2. Asymptotics of the periodogram
3. Nonparametric spectral estimation

Review: The periodogram
The periodogram is defined as

$$ I(\nu) = |X(\nu)|^2 = \frac{1}{n} \left| \sum_{t=1}^{n} e^{-2\pi i \nu t} x_t \right|^2 = X_c^2(\nu) + X_s^2(\nu), $$

where

$$ X_c(\nu) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \cos(2\pi \nu t)\, x_t, \qquad X_s(\nu) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \sin(2\pi \nu t)\, x_t. $$

This is the same as computing $\hat f(\nu)$ from the sample autocovariance (for $\bar x = 0$).
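As a quick illustration (our own sketch, not part of the lecture), the definitions above can be computed directly at the Fourier frequencies $\nu_j = j/n$ and checked against the equivalent FFT formula:

```python
import numpy as np

# Minimal sketch: compute the periodogram at the Fourier frequencies
# nu_j = j/n from the cosine and sine transforms defined above, then
# check it against the equivalent FFT expression |FFT(x)|^2 / n.
def periodogram(x):
    n = len(x)
    t = np.arange(1, n + 1)
    freqs = np.arange(n) / n                       # Fourier frequencies j/n
    Xc = np.array([np.cos(2 * np.pi * nu * t) @ x for nu in freqs]) / np.sqrt(n)
    Xs = np.array([np.sin(2 * np.pi * nu * t) @ x for nu in freqs]) / np.sqrt(n)
    return Xc**2 + Xs**2

rng = np.random.default_rng(0)
x = rng.standard_normal(128)
I = periodogram(x)
I_fft = np.abs(np.fft.fft(x)) ** 2 / len(x)        # same quantity via the FFT
print(np.max(np.abs(I - I_fft)))                   # numerically ~ 0
```

The direct sums cost $O(n^2)$; in practice one always uses the FFT form.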
Asymptotic properties of the periodogram

We want to understand the asymptotic behavior of the periodogram $I(\nu)$ at a particular frequency $\nu$ as $n$ increases. We'll see that its expectation converges to $f(\nu)$.

We'll start with a simple example: suppose that $X_1, \ldots, X_n$ are i.i.d. $N(0, \sigma^2)$ (Gaussian white noise). From the definitions

$$ X_c(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \cos(2\pi \nu_j t)\, x_t, \qquad X_s(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \sin(2\pi \nu_j t)\, x_t, $$

we have that $X_c(\nu_j)$ and $X_s(\nu_j)$ are normal, with $E X_c(\nu_j) = E X_s(\nu_j) = 0$.
Also,

$$ \operatorname{Var}(X_c(\nu_j)) = \frac{\sigma^2}{n} \sum_{t=1}^{n} \cos^2(2\pi \nu_j t) = \frac{\sigma^2}{2n} \sum_{t=1}^{n} \left( \cos(4\pi \nu_j t) + 1 \right) = \frac{\sigma^2}{2}. $$

Similarly, $\operatorname{Var}(X_s(\nu_j)) = \sigma^2/2$.
Also,

$$ \operatorname{Cov}(X_c(\nu_j), X_s(\nu_j)) = \frac{\sigma^2}{n} \sum_{t=1}^{n} \cos(2\pi \nu_j t) \sin(2\pi \nu_j t) = \frac{\sigma^2}{2n} \sum_{t=1}^{n} \sin(4\pi \nu_j t) = 0, $$

and, for any $j \neq k$,

$$ \operatorname{Cov}(X_c(\nu_j), X_c(\nu_k)) = 0, \qquad \operatorname{Cov}(X_s(\nu_j), X_s(\nu_k)) = 0, \qquad \operatorname{Cov}(X_c(\nu_j), X_s(\nu_k)) = 0. $$
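These moment calculations are easy to verify numerically. The following simulation (ours, illustrative; the frequency indices 5 and 9 are arbitrary choices) checks the variances and covariances for Gaussian white noise:

```python
import numpy as np

# Numerical check of the moments above for Gaussian white noise at Fourier
# frequencies: Var(Xc) = Var(Xs) = sigma^2/2, and the cosine/sine transforms
# are uncorrelated both within and across frequencies.
rng = np.random.default_rng(7)
n, reps = 64, 50000
t = np.arange(1, n + 1)
nu_j, nu_k = 5 / n, 9 / n                # two distinct Fourier frequencies
X = rng.standard_normal((reps, n))       # sigma^2 = 1

Xc_j = X @ np.cos(2 * np.pi * nu_j * t) / np.sqrt(n)
Xs_j = X @ np.sin(2 * np.pi * nu_j * t) / np.sqrt(n)
Xc_k = X @ np.cos(2 * np.pi * nu_k * t) / np.sqrt(n)

print(Xc_j.var(), Xs_j.var())                      # both ~ sigma^2/2 = 0.5
print(np.mean(Xc_j * Xs_j), np.mean(Xc_j * Xc_k))  # both ~ 0
```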
To summarize: if $X_1, \ldots, X_n$ are i.i.d. $N(0, \sigma^2)$ (Gaussian white noise; $f(\nu) = \sigma^2$), then the $X_c(\nu_j)$ and $X_s(\nu_j)$ are all i.i.d. $N(0, \sigma^2/2)$. Thus,

$$ \frac{2}{\sigma^2} I(\nu_j) = \frac{2}{\sigma^2} \left( X_c^2(\nu_j) + X_s^2(\nu_j) \right) \sim \chi_2^2. $$

So for the case of Gaussian white noise, the periodogram has a chi-squared distribution that depends on the variance $\sigma^2$ (which, in this case, is the spectral density).
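A short simulation (our own check) makes this concrete: $(\sigma^2/2)\chi_2^2$ is an exponential distribution with mean $\sigma^2$, so with $\sigma^2 = 1$ the periodogram ordinate should have mean 1 and variance 1.

```python
import numpy as np

# Simulation check: for Gaussian white noise with sigma^2 = 1, the
# periodogram at a fixed Fourier frequency nu_j is (1/2) chi^2_2,
# i.e. exponential with mean 1 (and hence variance 1).
rng = np.random.default_rng(1)
n, reps = 64, 20000
j = 5                                   # any index with 0 < j < n/2
X = rng.standard_normal((reps, n))
I_j = np.abs(np.fft.fft(X, axis=1)[:, j]) ** 2 / n
print(I_j.mean(), I_j.var())            # both close to 1
```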
Under more general conditions (e.g., normal $\{X_t\}$, or a linear process $\{X_t\}$ with rapidly decaying ACF), the $X_c(\nu_j)$, $X_s(\nu_j)$ are all asymptotically independent and $N(0, f(\nu_j)/2)$.

Consider a frequency $\nu$. For a given value of $n$, let $\hat\nu(n)$ be the closest Fourier frequency (that is, $\hat\nu(n) = j/n$ for the value of $j$ that minimizes $|\nu - j/n|$). As $n$ increases, $\hat\nu(n) \to \nu$, and (under the same conditions that ensure the asymptotic normality and independence of the sine/cosine transforms) $f(\hat\nu(n)) \to f(\nu)$. In that case, we have

$$ \frac{2}{f(\nu)} I(\hat\nu(n)) = \frac{2}{f(\nu)} \left( X_c^2(\hat\nu(n)) + X_s^2(\hat\nu(n)) \right) \stackrel{d}{\to} \chi_2^2. $$
Thus,

$$ E\, I(\hat\nu(n)) = \frac{f(\nu)}{2} \, E\left[ \frac{2}{f(\nu)} \left( X_c^2(\hat\nu(n)) + X_s^2(\hat\nu(n)) \right) \right] \to \frac{f(\nu)}{2} \, E\left( Z_1^2 + Z_2^2 \right) = f(\nu), $$

where $Z_1, Z_2$ are independent $N(0,1)$. Thus, the periodogram is asymptotically unbiased.
Since we know its asymptotic distribution (chi-squared), we can compute approximate confidence intervals:

$$ \Pr\left( \frac{2\, I(\hat\nu(n))}{f(\nu)} > \chi_2^2(\alpha) \right) \approx \alpha, $$

where the cdf of a $\chi_2^2$ at $\chi_2^2(\alpha)$ is $1 - \alpha$. Thus,

$$ \Pr\left( \frac{2\, I(\hat\nu(n))}{\chi_2^2(\alpha/2)} \leq f(\nu) \leq \frac{2\, I(\hat\nu(n))}{\chi_2^2(1 - \alpha/2)} \right) \approx 1 - \alpha. $$

Asymptotic properties of the periodogram: Consistency
Unfortunately, $\operatorname{Var}(I(\hat\nu(n))) \to f(\nu)^2 \operatorname{Var}(Z_1^2 + Z_2^2)/4 = f(\nu)^2$, where $Z_1, Z_2$ are i.i.d. $N(0,1)$; that is, the variance approaches a constant. Thus, $I(\hat\nu(n))$ is not a consistent estimator of $f(\nu)$. In particular, if $f(\nu) > 0$, then for any $\epsilon > 0$, as $n$ increases,

$$ \Pr\left( \left| I(\hat\nu(n)) - f(\nu) \right| > \epsilon \right) $$

approaches a positive constant.
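The inconsistency is easy to see in simulation (our own illustration, white noise with $f(\nu) = 1$): the variance of the periodogram ordinate stays near $f(\nu)^2 = 1$ no matter how large $n$ gets.

```python
import numpy as np

# Illustration of inconsistency: Var(I(nu_hat(n))) does not shrink as n
# grows; it stays near f(nu)^2 = 1 for white noise with unit variance.
rng = np.random.default_rng(2)
reps = 5000
variances = []
for n in (64, 256, 1024):
    X = rng.standard_normal((reps, n))
    I_j = np.abs(np.fft.fft(X, axis=1)[:, n // 8]) ** 2 / n
    variances.append(I_j.var())
print(variances)                        # roughly [1, 1, 1]: no decrease with n
```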
This means that the approximate confidence intervals we obtain are typically wide. The source of the difficulty is that, as $n$ increases, we have additional data (the $n$ values of $x_t$), but we use it to estimate additional independent random variables (the $n$ independent values of $X_c(\nu_j)$, $X_s(\nu_j)$).

How can we reduce the variance? The typical approach is to average independent observations. In this case, we can take an average of "nearby" values of the periodogram, and hope that the spectral density at the frequency of interest and at those nearby frequencies will be close.

Nonparametric spectral estimation
Define a band of frequencies

$$ \left[ \nu_k - \frac{L}{2n},\; \nu_k + \frac{L}{2n} \right] $$

of bandwidth $L/n$, and suppose that $f(\nu)$ is approximately constant in this frequency band. Consider the following smoothed spectral estimator (assume $L$ is odd):

$$ \hat f(\nu_k) = \frac{1}{L} \sum_{l=-(L-1)/2}^{(L-1)/2} I(\nu_k - l/n) = \frac{1}{L} \sum_{l=-(L-1)/2}^{(L-1)/2} \left( X_c^2(\nu_k - l/n) + X_s^2(\nu_k - l/n) \right). $$
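A sketch of this estimator (a simple moving average of the periodogram over $L$ neighbouring Fourier frequencies; the function name and the choice $L = 31$ are ours):

```python
import numpy as np

# Smoothed spectral estimator: average the raw periodogram over a band of
# L neighbouring Fourier frequencies (indices taken modulo n), L odd.
def smoothed_spectral_estimate(x, L):
    n = len(x)
    I = np.abs(np.fft.fft(x)) ** 2 / n              # raw periodogram
    offsets = np.arange(-(L - 1) // 2, (L - 1) // 2 + 1)
    return np.array([I[(k + offsets) % n].mean() for k in range(n)])

rng = np.random.default_rng(3)
x = rng.standard_normal(512)                        # white noise: f(nu) = 1
f_hat = smoothed_spectral_estimate(x, L=31)
I_raw = np.abs(np.fft.fft(x)) ** 2 / len(x)
print(I_raw.std(), f_hat.std())                     # smoothing shrinks the spread
```

For white noise the smoothed estimate hovers near the true constant $f(\nu) = 1$, while the raw periodogram fluctuates wildly.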
For a suitable time series (e.g., Gaussian, or a linear process with sufficiently rapidly decreasing autocovariance), we know that, for large $n$, all of the $X_c(\nu_k - l/n)$ and $X_s(\nu_k - l/n)$ are approximately independent and normal, with mean zero and variance $f(\nu_k - l/n)/2$. From the assumption that $f(\nu)$ is approximately constant across all of these frequencies, we have that, asymptotically,

$$ \hat f(\nu_k) \sim f(\nu_k) \, \frac{\chi_{2L}^2}{2L}. $$
Thus,

$$ E \hat f(\hat\nu(n)) \approx \frac{f(\nu)}{2L} \, E \sum_{i=1}^{2L} Z_i^2 = f(\nu), $$

$$ \operatorname{Var}\left( \hat f(\hat\nu(n)) \right) \approx \frac{f^2(\nu)}{4L^2} \operatorname{Var} \sum_{i=1}^{2L} Z_i^2 = \frac{f^2(\nu)}{2L} \operatorname{Var}(Z_1^2) = \frac{f^2(\nu)}{L}, $$

where the $Z_i$ are i.i.d. $N(0,1)$ (using $\operatorname{Var}(Z_1^2) = 2$).

Nonparametric spectral estimation: confidence intervals
From the asymptotic distribution, we can define approximate confidence intervals as before:

$$ \Pr\left( \frac{2L \hat f(\hat\nu(n))}{\chi_{2L}^2(\alpha/2)} \leq f(\nu) \leq \frac{2L \hat f(\hat\nu(n))}{\chi_{2L}^2(1 - \alpha/2)} \right) \approx 1 - \alpha. $$

For large $L$, these will be considerably tighter than for the unsmoothed periodogram. (But we need to be sure $f$ does not vary much over the bandwidth $L/n$.)
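The interval is easy to compute. In this sketch (ours) the $\chi_{2L}^2$ quantiles are approximated by simulation so that only numpy is needed; `scipy.stats.chi2.ppf` would give them exactly, and `f_hat` is a hypothetical smoothed estimate at some frequency:

```python
import numpy as np

# Approximate 95% confidence interval for f(nu) from the chi^2_{2L}
# distribution of the smoothed estimator. Quantiles are simulated here
# to stay numpy-only (scipy.stats.chi2.ppf gives them exactly).
rng = np.random.default_rng(4)
L, alpha = 31, 0.05
draws = rng.chisquare(2 * L, 200000)
q_lo, q_hi = np.quantile(draws, [alpha / 2, 1 - alpha / 2])

f_hat = 1.1                              # hypothetical smoothed estimate at nu
lower = 2 * L * f_hat / q_hi
upper = 2 * L * f_hat / q_lo
print(lower, upper)                      # approximate 95% interval for f(nu)
```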
Nonparametric spectral estimation

Notice the bias-variance trade-off: for bandwidth $B = L/n$, we have $\operatorname{Var} \hat f(\nu_k) \approx c/(Bn)$ for some constant $c$, so we want a bigger bandwidth $B$ to ensure low variance (bandwidth stability). But the larger the bandwidth, the more questionable the assumption that $f(\nu)$ is approximately constant in the band $[\nu - B/2, \nu + B/2]$. For a larger value of $B$, our estimate $\hat f(\nu)$ will be a smoother function of $\nu$: we have thus introduced more bias (lower resolution).
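The bias side of the trade-off can be seen with a peaked spectrum. In this illustration (our own example, not from the lecture) an AR(1) series with $\phi = 0.9$ has $f(0) = \sigma^2/(1-\phi)^2 = 100$, and a very wide band smears that peak downwards even as it stabilises the estimate:

```python
import numpy as np

# Bias-variance illustration: for an AR(1) series the spectrum is sharply
# peaked near nu = 0; a wide smoothing band (large L) averages in
# frequencies where f is much smaller, biasing the peak estimate down.
rng = np.random.default_rng(6)
n, phi = 2048, 0.9
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

I = np.abs(np.fft.fft(x)) ** 2 / n

def smooth(I, L):
    offsets = np.arange(-(L - 1) // 2, (L - 1) // 2 + 1)
    return np.array([I[(k + offsets) % len(I)].mean() for k in range(len(I))])

f0 = 1.0 / (1.0 - phi) ** 2              # true f(0) for AR(1), unit noise variance
print(f0, smooth(I, 11)[1], smooth(I, 301)[1])   # wide band flattens the peak
```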
Nonparametric spectral estimation: confidence intervals

Since the asymptotic mean and variance of $\hat f(\hat\nu(n))$ are proportional to $f(\nu)$ and $f^2(\nu)$, it is natural to consider the logarithm of the estimator. From

$$ \Pr\left( \frac{2L \hat f(\hat\nu(n))}{\chi_{2L}^2(\alpha/2)} \leq f(\nu) \leq \frac{2L \hat f(\hat\nu(n))}{\chi_{2L}^2(1 - \alpha/2)} \right) \approx 1 - \alpha, $$

we obtain

$$ \Pr\left( \log \hat f(\hat\nu(n)) + \log \frac{2L}{\chi_{2L}^2(\alpha/2)} \leq \log f(\nu) \leq \log \hat f(\hat\nu(n)) + \log \frac{2L}{\chi_{2L}^2(1 - \alpha/2)} \right) \approx 1 - \alpha. $$

The width of the confidence intervals for $f(\nu)$ varies with frequency, whereas the width of the confidence intervals for $\log f(\nu)$ is the same for all frequencies.
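The constant width on the log scale follows because both endpoints are $\log \hat f$ plus a constant, so the width is $\log(\chi_{2L}^2(\alpha/2) / \chi_{2L}^2(1-\alpha/2))$ regardless of $\hat f$. A quick check (ours; quantiles simulated so only numpy is used, and the three $\hat f$ values are hypothetical estimates at three frequencies):

```python
import numpy as np

# The log-scale interval is [log f_hat + log(2L/q_hi), log f_hat + log(2L/q_lo)],
# so its width log(q_hi/q_lo) does not depend on f_hat at all.
rng = np.random.default_rng(5)
L, alpha = 31, 0.05
q_lo, q_hi = np.quantile(rng.chisquare(2 * L, 200000), [alpha / 2, 1 - alpha / 2])

widths = []
for f_hat in (0.3, 1.0, 7.5):            # hypothetical estimates at three frequencies
    lo = np.log(f_hat) + np.log(2 * L / q_hi)
    hi = np.log(f_hat) + np.log(2 * L / q_lo)
    widths.append(hi - lo)
print(widths)                             # the same width at every frequency
```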
This note was uploaded on 05/14/2011 for the course STAT 153 taught by Professor Staff during the Fall '08 term at Berkeley.