Chapter 27

Mixing

A stochastic process is mixing if its values at widely-separated times are asymptotically independent.

Section 27.1 defines mixing, and shows that it implies ergodicity.

Section 27.2 gives some examples of mixing processes, both deterministic and non-deterministic.

Section 27.3 looks at the weak convergence of distributions produced by mixing, and the resulting decay of correlations.

Section 27.4 defines strong mixing, and the mixing coefficient which measures it. It then states, but does not prove, a central limit theorem for strongly mixing sequences. (The proof would demand first working through the central limit theorem for martingales.)

For stochastic processes, mixing means asymptotically independent: that is, the statistical dependence between $X(t_1)$ and $X(t_2)$ goes to zero as $|t_1 - t_2|$ increases. To make this precise, we need to specify how we measure the dependence between $X(t_1)$ and $X(t_2)$. The most common and natural choice (first used by Rosenblatt, 1956) is the total variation distance between their joint distribution and the product of their marginal distributions, but there are other ways of measuring such "decay of correlations".[1] Under all reasonable choices, IID processes are, naturally enough, special cases of mixing processes. This suggests that many of the properties of IID processes, such as laws of large numbers and central limit theorems, should continue to hold for mixing processes, at least if the approach to independence is sufficiently rapid. This in turn means that many statistical methods originally developed for the IID case will continue to work when the data-generating process is mixing; this is true both of parametric methods, such as linear regression and ARMA models (ARMA models themselves being mixing: Doukhan, 1995, sec. 2.4.1), and of nonparametric methods like kernel prediction (Bosq, 1998). Considerations of time will prevent us from going into the purely statistical aspects of mixing processes, but the central limit theorem at the end of this chapter will give some idea of the flavor of results in this area: much like IID results, only with the true sample size replaced by an effective sample size, with a smaller discount the faster the rate of decay of correlations.

[1] The term is common, but slightly misleading: lack of correlation, in the ordinary covariance-normalized-by-standard-deviations sense, implies independence only in special cases, like Gaussian processes. Nonetheless, see Theorem 350.

27.1 Definition and Measurement of Mixing

Definition 338 (Mixing) A dynamical system $\Omega, \mathcal{X}, \mu, T$ is mixing when, for any $A, B \in \mathcal{X}$,

    $\lim_{t \to \infty} \left| \mu(A \cap T^{-t}B) - \mu(A)\mu(T^{-t}B) \right| = 0$    (27.1)

Lemma 339 If $\mu$ is $T$-invariant, mixing is equivalent to

    $\lim_{t \to \infty} \mu(A \cap T^{-t}B) = \mu(A)\mu(B)$    (27.2)

Proof: By stationarity, $\mu(T^{-t}B) = \mu(B)$, so $\mu(A)\mu(T^{-t}B) = \mu(A)\mu(B)$. The result follows.
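To make Lemma 339 concrete, here is a short Python sketch (not part of the original notes) that simulates a stationary AR(1) process, which is known to be mixing, and checks empirically that the gap between the joint probability $\mu(A \cap T^{-t}B)$ and the product $\mu(A)\mu(B)$ shrinks as the lag $t$ grows. The choice of process, the events $A = B = \{X \le 0\}$, and all parameter values below are illustrative assumptions, not taken from the chapter.

    import numpy as np

    rng = np.random.default_rng(0)

    phi = 0.9                      # AR(1) coefficient; |phi| < 1 gives a stationary, mixing process
    n = 200_000                    # length of the simulated trajectory
    sigma = np.sqrt(1 - phi**2)    # innovation sd chosen so the stationary variance is 1

    # Simulate one long stationary trajectory X_{t+1} = phi * X_t + eps_t
    x = np.empty(n)
    x[0] = rng.normal()            # start from the stationary N(0, 1) distribution
    eps = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]

    # Indicator of the event A = B = {X <= 0}
    a = x <= 0.0

    for lag in (1, 5, 10, 20, 50, 100):
        joint = np.mean(a[:-lag] & a[lag:])          # estimate of mu(A intersect T^{-t} B)
        prod = np.mean(a[:-lag]) * np.mean(a[lag:])  # estimate of mu(A) * mu(B)
        print(f"lag {lag:4d}: |joint - product| = {abs(joint - prod):.4f}")

With phi = 0.9 the printed gap decays roughly geometrically in the lag, which is the "decay of correlations" referred to above; setting phi = 0 reduces the process to IID noise, for which the gap is zero at every lag up to sampling error.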