MA2216/ST2131 Probability
Notes 11: Central Limit Theorem

Before proceeding to the central limit theorem, we first review moment generating functions and conditional expectation, and then consider a few examples for reference.

§ 1. Moment Generating Functions.

1. Recall (cf. § 5, Notes 10) that the moment generating function of a random variable $X$ is formally defined as
$$
M_X(t) = E\left[e^{tX}\right] =
\begin{cases}
\displaystyle\sum_{x} e^{tx} f_X(x), & \text{if $X$ is discrete with p.d.f. $f_X(x)$;}\\[1.5ex]
\displaystyle\int_{\mathbb{R}} e^{tx} f_X(x)\,dx, & \text{if $X$ is continuous with density $f_X(x)$.}
\end{cases}
$$
We say the moment generating function of $X$ is well defined if there exists an $\eta > 0$ such that $M_X(t) < \infty$ for all $t \in (-\eta, \eta)$. The domain of $M_X(\cdot)$ is the set of all real numbers $t$ for which $e^{tX}$ has finite expectation. A trivial observation is that $M_X(0) = 1$.

2. Why is it called the moment generating function? Because all of the moments of $X$ can be obtained by successively differentiating $M_X(t)$ and then evaluating the result at $t = 0$. To be precise, if $M_X(t)$ is well defined on an interval around the origin, then
$$
M_X^{(n)}(0) = E[X^n], \qquad n \ge 1.
$$
As a matter of fact, if $M_X(t)$ is finite on $-\eta < t < \eta$ for some positive number $\eta$ (possibly $\infty$), then we can write
$$
M_X(t) = \sum_{n=0}^{\infty} \frac{E[X^n]}{n!}\, t^n, \qquad -\eta < t < \eta,
$$
which is the Taylor series expansion of $M_X(t)$ at the origin. (A numerical check of the identity $M_X^{(n)}(0) = E[X^n]$ appears at the end of this section.)

3. Remark. Not every distribution admits a finite moment generating function. If a distribution does admit one on an interval around the origin, then it is uniquely determined by its moment generating function. This uniqueness result can be stated as follows: if two random variables have the same moment generating function, then they have the same distribution.

4. Example. Let $X$ be a r.v. such that $M_X(t)$ is finite for all $t$. We may use the same argument as in the proof of Markov's inequality to derive the following inequality:
$$
P\{X \ge x\} \le e^{-tx} M_X(t), \qquad t \ge 0. \tag{1.1}
$$
Derivation. For $t > 0$, the event $X \ge x$ is the same as $e^{tX} \ge e^{tx}$, and hence, by Markov's inequality,
$$
P\{X \ge x\} = P\{e^{tX} \ge e^{tx}\} \le E\left[e^{tX}\right] / e^{tx} = e^{-tx} M_X(t);
$$
for $t = 0$ the bound is trivial, since the right-hand side equals $1$. We are done.

Furthermore, (1.1) holds for every eligible $t$, so it follows that
$$
P\{X \ge x\} \le \min_{t \ge 0} e^{-tx} M_X(t). \tag{1.2}
$$
In general, the inequality in (1.2) is valid in the following form:
$$
P\{X \ge x\} \le \min_{t \in D} e^{-tx} M_X(t), \tag{1.3}
$$
where $D$ refers to the set of all $t \ge 0$ such that $M_X(t)$ exists. (A numerical illustration of (1.1) and (1.2) also appears at the end of this section.)

5. Let us see a simple application in the following example.

Example. Let $X$ have a gamma distribution with parameters $\alpha$ and $\lambda$. It is known that
$$
M_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, \qquad -\infty < t < \lambda.
$$
By making use of (1.3), we get
$$
P\{X \ge 2\alpha/\lambda\} \le \min_{0 \le t < \lambda} e^{-2\alpha t/\lambda} \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}.
$$
By calculus, as a function of $t \in [0, \lambda)$,
$$
\psi(t) = e^{-2\alpha t/\lambda} \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}
$$
attains its minimum $(2/e)^{\alpha}$ at $t = \lambda/2$: setting the derivative of $\log \psi(t) = -2\alpha t/\lambda + \alpha \log \lambda - \alpha \log(\lambda - t)$ to zero gives $\alpha/(\lambda - t) = 2\alpha/\lambda$, i.e. $t = \lambda/2$, and $\psi(\lambda/2) = e^{-\alpha}\, 2^{\alpha}$. Therefore, we conclude that
$$
P\{X \ge 2\alpha/\lambda\} \le (2/e)^{\alpha}.
$$
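As a numerical sanity check of the gamma example above, the sketch below (an addition, not part of the original notes; the parameter values $\alpha = 3$, $\lambda = 2$ are arbitrary choices) locates the minimizer of $\psi(t)$ on a grid, confirms the minimum value $(2/e)^{\alpha}$ at $t = \lambda/2$, and compares the bound with a Monte Carlo estimate of the tail probability.

```python
# Sanity check of the gamma tail bound P{X >= 2*alpha/lam} <= (2/e)**alpha.
# Illustrative sketch only; alpha and lam are arbitrary choices.
import numpy as np

alpha, lam = 3.0, 2.0
x = 2 * alpha / lam

def psi(t):
    # psi(t) = e^{-tx} M_X(t), with M_X(t) = (lam/(lam - t))**alpha for t < lam
    return np.exp(-t * x) * (lam / (lam - t)) ** alpha

ts = np.linspace(0.0, 0.999 * lam, 100_000)
vals = psi(ts)
i = vals.argmin()
print("minimizer t ~", ts[i], " (lam/2 =", lam / 2, ")")
print("min of psi  ~", vals[i], " ((2/e)^alpha =", (2 / np.e) ** alpha, ")")

# Monte Carlo estimate of the tail probability, for comparison with the bound.
# NumPy's gamma sampler takes shape and *scale*, so scale = 1/lam for rate lam.
rng = np.random.default_rng(0)
samples = rng.gamma(shape=alpha, scale=1 / lam, size=1_000_000)
print("P{X >= 2a/l} ~", (samples >= x).mean(), " <=", (2 / np.e) ** alpha)
```

With these values the bound is roughly 0.40 while the estimated tail probability is about 0.06: the bound from (1.3) is not tight, but it decays geometrically in $\alpha$.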
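Inequalities (1.1) and (1.2) can be illustrated in the same spirit. The following sketch (again my addition; the threshold $x = 2$ is an arbitrary choice) takes $X$ to be standard normal, whose moment generating function $e^{t^2/2}$ is finite for all $t$, and checks on a grid that $e^{-tx} M_X(t)$ bounds the tail for every $t \ge 0$, with the minimum attained near $t = x$.

```python
# Illustration of (1.1)-(1.2) for X ~ N(0, 1), where M_X(t) = exp(t**2 / 2).
# Minimizing exp(-t*x + t**2/2) over t >= 0 gives t = x and bound exp(-x**2/2).
import numpy as np
from scipy.stats import norm

x = 2.0
ts = np.linspace(0.0, 5.0, 50_001)
bounds = np.exp(-ts * x + ts**2 / 2)  # e^{-tx} M_X(t) at each grid point t

tail = norm.sf(x)  # exact P{X >= x}, about 0.0228
print("exact tail       =", tail)
print("min over t bound =", bounds.min())         # ~ exp(-2) ~ 0.1353
print("minimizer t      ~", ts[bounds.argmin()])  # ~ x = 2.0
print("(1.1) holds for all t >= 0:", bool(np.all(tail <= bounds)))
```

Note that at $t = 0$ the bound is just $1$; the optimization over $t$ in (1.2) is what makes the bound informative.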

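Finally, the identity $M_X^{(n)}(0) = E[X^n]$ from item 2 can be verified symbolically. A minimal sketch, assuming $X \sim \text{Exponential}(\lambda)$, so that $M_X(t) = \lambda/(\lambda - t)$ for $t < \lambda$ and $E[X^n] = n!/\lambda^n$:

```python
# Symbolic check of M_X^(n)(0) = E[X^n] for the exponential distribution,
# whose MGF lam/(lam - t) is finite on t < lam. Illustrative sketch only.
import sympy as sp

t, lam = sp.symbols("t lam", positive=True)
M = lam / (lam - t)  # MGF of Exponential(lam)

for n in range(1, 6):
    nth_deriv_at_0 = sp.diff(M, t, n).subs(t, 0)  # M_X^(n)(0)
    known_moment = sp.factorial(n) / lam**n       # E[X^n] = n!/lam^n
    print(n, sp.simplify(nth_deriv_at_0 - known_moment) == 0)  # prints True
```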
