Chapter 12

Sequences and Limit Theorems

Some of the most astonishing results in probability are related to the properties of sequences of random variables and the convergence of empirical distributions.

12.1 Types of Convergence

The premise on which most probabilistic convergence results lie is a sequence of random variables X_1, X_2, ... and a limiting random variable X, all of which are defined on the same probability space. Recall that a random variable is a function of the outcome of a random experiment. The statement above stipulates that all of these random variables are functions of the outcome of the same experiment.

Statements that can be made about a sequence of random variables range from simple assertions to more intricate claims. For instance, the sequence may appear to approach a deterministic quantity, or to behave increasingly like a certain function. Alternatively, the CDFs of the random variables in the sequence may appear to converge to a specific function. Being able to recognize such patterns within the sequence is key to establishing convergence results. The different statements one can make about the sequence X_1, X_2, ... lead to the different types of convergence encountered in probability. Below, we briefly discuss three types of convergence.

Example 99. Suppose that X_1, X_2, ... is a sequence of independent Gaussian random variables, each with mean m and variance \sigma^2. Define the partial sums

\[ S_n = \sum_{i=1}^{n} X_i, \tag{12.1} \]

and consider the sequence

\[ S_1, \ \frac{S_2}{2}, \ \frac{S_3}{3}, \ldots \tag{12.2} \]

We know that affine transformations of Gaussian random variables remain Gaussian. Furthermore, we know that sums of jointly Gaussian random variables are also Gaussian. Thus, S_n/n possesses a Gaussian distribution with mean

\[ \mathrm{E}\!\left[ \frac{S_n}{n} \right] = \frac{\mathrm{E}[S_n]}{n} = \frac{\mathrm{E}[X_1] + \cdots + \mathrm{E}[X_n]}{n} = m \]

and variance

\[ \mathrm{Var}\!\left[ \frac{S_n}{n} \right] = \frac{\mathrm{Var}[S_n]}{n^2} = \frac{\mathrm{Var}[X_1] + \cdots + \mathrm{Var}[X_n]}{n^2} = \frac{\sigma^2}{n}. \]
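The mean and variance computed above can be checked numerically. The following is a minimal Monte Carlo sketch, not part of the text: the parameter values m = 2, sigma = 3, the helper name sample_mean, and the trial count are all arbitrary choices for illustration.

```python
import random

random.seed(0)
m, sigma = 2.0, 3.0  # hypothetical parameter choices for the sketch


def sample_mean(n, trials=10000):
    """Estimate the mean and variance of S_n / n over many independent runs."""
    means = []
    for _ in range(trials):
        # S_n is the sum of n i.i.d. Gaussian samples with mean m, std sigma.
        s = sum(random.gauss(m, sigma) for _ in range(n))
        means.append(s / n)
    avg = sum(means) / trials
    var = sum((x - avg) ** 2 for x in means) / trials
    return avg, var


for n in (1, 10, 100):
    avg, var = sample_mean(n)
    # The empirical mean stays near m, while the empirical variance
    # shrinks roughly like sigma**2 / n, matching the derivation.
    print(n, round(avg, 2), round(var, 2))
```

Running the sketch shows the empirical variance of S_n/n falling as n grows, which is the concentration the derivation predicts.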
It appears that the PDF of S_n/n concentrates around m as n approaches infinity. That is, the sequence in (12.2) seems to become increasingly predictable.

Example 100. Again, let X_1, X_2, ... be a sequence as described above, and let S_n be defined according to (12.1). This time, we wish to characterize the properties of

\[ S_1 - m, \ \frac{S_2 - 2m}{\sqrt{2}}, \ \frac{S_3 - 3m}{\sqrt{3}}, \ldots \]

From our discussion, we know that (S_n - nm)/\sqrt{n} is a Gaussian random variable. We can compute its mean and variance as follows:

\[ \mathrm{E}\!\left[ \frac{S_n - nm}{\sqrt{n}} \right] = \frac{\mathrm{E}[S_n - nm]}{\sqrt{n}} = 0 \]

\[ \mathrm{Var}\!\left[ \frac{S_n - nm}{\sqrt{n}} \right] = \frac{\mathrm{Var}[S_n - nm]}{n} = \frac{\mathrm{Var}[S_n]}{n} = \sigma^2. \]

No matter how large n is, the random variable (S_n - nm)/\sqrt{n} has a Gaussian distribution with mean zero and variance \sigma^2. Intriguingly, the distribution remains fixed throughout the sequence.

12.1.1 Convergence in Probability

The basic concept behind the definition of convergence in probability is that large deviations of a random variable from its typical behavior become less likely as the sequence progresses. Formally, a sequence...
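Example 100 can be checked with the same kind of simulation. The sketch below is again a hypothetical illustration, not the author's code; the parameters m = 2 and sigma = 3 and the helper name centered_scaled are arbitrary. It estimates the mean and variance of (S_n - n*m)/sqrt(n) for several n.

```python
import math
import random

random.seed(1)
m, sigma = 2.0, 3.0  # hypothetical parameter choices for the sketch


def centered_scaled(n, trials=10000):
    """Estimate the mean and variance of (S_n - n*m) / sqrt(n)."""
    vals = []
    for _ in range(trials):
        s = sum(random.gauss(m, sigma) for _ in range(n))
        vals.append((s - n * m) / math.sqrt(n))
    mean = sum(vals) / trials
    var = sum((v - mean) ** 2 for v in vals) / trials
    return mean, var


for n in (1, 10, 100):
    mean, var = centered_scaled(n)
    # The empirical mean stays near 0 and the empirical variance stays
    # near sigma**2 for every n, as the derivation predicts.
    print(n, round(mean, 2), round(var, 2))
```

Unlike S_n/n, whose variance shrinks with n, this centered and scaled sequence keeps the same distribution for every n, which is what the derivation asserts.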
This note was uploaded on 03/30/2010 for the course ECEN 303 taught by Professor Chamberlain during the Fall '07 term at Texas A&M.