Chapter 27
Mixing
A stochastic process is mixing if its values at widely-separated
times are asymptotically independent.
Section 27.1 defines mixing, and shows that it implies ergodicity.
Section 27.2 gives some examples of mixing processes, both deterministic and non-deterministic.
Section 27.3 looks at the weak convergence of distributions produced by mixing, and the resulting decay of correlations.
Section 27.4 defines strong mixing, and the "mixing coefficient"
which measures it. It then states, but does not prove, a central limit
theorem for strongly mixing sequences. (The proof would demand
first working through the central limit theorem for martingales.)
For stochastic processes, "mixing" means "asymptotically independent":
that is, the statistical dependence between $X(t_1)$ and $X(t_2)$ goes to zero as
$|t_1 - t_2|$ increases. To make this precise, we need to specify how we measure the
dependence between $X(t_1)$ and $X(t_2)$. The most common and natural choice
(first used by Rosenblatt, 1956) is the total variation distance between their
joint distribution and the product of their marginal distributions, but there are
other ways of measuring such "decay of correlations"¹.
Under all reasonable
choices, IID processes are, naturally enough, special cases of mixing processes.
This suggests that many of the properties of IID processes, such as laws of
large numbers and central limit theorems, should continue to hold for mixing
processes, at least if the approach to independence is sufficiently rapid. This in
turn means that many statistical methods originally developed for the IID case
will continue to work when the data-generating process is mixing; this is true
both of parametric methods, such as linear regression, ARMA models being
mixing (Doukhan, 1995, sec. 2.4.1), and of nonparametric methods like kernel
prediction (Bosq, 1998). Considerations of time will prevent us from going into
the purely statistical aspects of mixing processes, but the central limit theorem
at the end of this chapter will give some idea of the flavor of results in this area:
much like IID results, only with the true sample size replaced by an effective
sample size, with a smaller discount the faster the rate of decay of correlations.

¹ The term is common, but slightly misleading: lack of correlation, in the ordinary
covariance-normalized-by-standard-deviations sense, implies independence only in special
cases, like Gaussian processes. Nonetheless, see Theorem 350.
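As an informal numerical illustration of this effective-sample-size idea, the following Python sketch simulates a Gaussian AR(1) process (a standard example of a mixing sequence, with correlations decaying like $\phi^k$) and compares the Monte Carlo variance of the sample mean against the IID variance formula with $n$ discounted to $n_{\mathrm{eff}} = n(1-\phi)/(1+\phi)$. The process, the parameter values, and the $n_{\mathrm{eff}}$ formula are illustrative assumptions, not part of the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi, reps = 500, 0.5, 1000

# Simulate `reps` independent stationary AR(1) paths X_t = phi*X_{t-1} + Z_t.
z = rng.normal(size=(reps, n))
x = np.empty((reps, n))
x[:, 0] = rng.normal(scale=1.0 / np.sqrt(1.0 - phi**2), size=reps)  # stationary start
for t in range(1, n):
    x[:, t] = phi * x[:, t - 1] + z[:, t]

# Summing the decaying correlations inflates Var(sample mean) by (1+phi)/(1-phi);
# equivalently, the sample size n is discounted to an effective size n_eff.
n_eff = n * (1 - phi) / (1 + phi)
marginal_var = 1.0 / (1.0 - phi**2)        # Var(X_t) at stationarity
means = x.mean(axis=1)

print(np.var(means))                       # observed variance of the sample mean
print(marginal_var / n_eff)                # IID formula with n replaced by n_eff
```

The two printed numbers should agree up to Monte Carlo error, which is the sense in which mixing sequences behave "much like IID results" at a discounted sample size.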
27.1  Definition and Measurement of Mixing
Definition 338 (Mixing) A dynamical system $\Xi, \mathcal{X}, \mu, T$ is mixing when, for
any $A, B \in \mathcal{X}$,

$$\lim_{t \to \infty} \left| \mu(A \cap T^{-t}B) - \mu(A)\,\mu(T^{-t}B) \right| = 0 \qquad (27.1)$$
Lemma 339 If $\mu$ is $T$-invariant, mixing is equivalent to

$$\lim_{t \to \infty} \mu(A \cap T^{-t}B) = \mu(A)\,\mu(B) \qquad (27.2)$$
Proof: By stationarity, $\mu(T^{-t}B) = \mu(B)$, so $\mu(A)\,\mu(T^{-t}B) = \mu(A)\,\mu(B)$. The
result follows. $\Box$
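To see Definition 338 concretely, here is a small Monte Carlo sketch (an illustrative example, not from the text) using the doubling map $T(x) = 2x \bmod 1$ on $[0,1)$ with Lebesgue measure, a standard mixing system. Since $x \in T^{-t}B$ iff $T^t x \in B$, we can estimate $\mu(A \cap T^{-t}B)$ by sampling uniform points; the intervals $A$ and $B$ below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def doubling(x, t):
    """Apply the doubling map T(x) = 2x mod 1 to x, t times."""
    for _ in range(t):
        x = (2.0 * x) % 1.0
    return x

a_lo, a_hi = 0.0, 0.3   # A = [0, 0.3),  mu(A) = 0.3
b_lo, b_hi = 0.6, 1.0   # B = [0.6, 1),  mu(B) = 0.4

x = rng.random(200_000)                 # samples from Lebesgue measure on [0, 1)
in_A = (x >= a_lo) & (x < a_hi)
for t in (0, 2, 5, 10):                 # keep t small: each step consumes one float bit
    y = doubling(x, t)                  # x is in T^{-t}B  iff  T^t x is in B
    joint = np.mean(in_A & (y >= b_lo) & (y < b_hi))
    print(t, joint)                     # approaches mu(A)*mu(B) = 0.12 as t grows
```

At $t = 0$ the estimate is exactly zero, since $A$ and $B$ are disjoint; by $t = 10$ it is already close to $\mu(A)\mu(B) = 0.12$, in accordance with Eq. 27.2 (Lebesgue measure is invariant for the doubling map).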