IEOR 4106: Introduction to Operations Research: Stochastic Models
Spring 2009, Professor Whitt
Topics for Discussion, Thursday, January 29

Our Friends: Transforms

This lecture focused on the moment generating function (mgf), as discussed in Section 2.6 of our Ross textbook. You are responsible for that section. We also illustrated how the moment generating function can be applied to prove the central limit theorem. What we did was an elaboration (explanation) of the argument given in our book on pages 82-83. You are NOT responsible for knowing this proof, but you should certainly understand Example 2.52 on page 82. The following notes elaborate even further, but these notes below are optional.

OPTIONAL NOTES ON TRANSFORMS

1. Different Kinds of Transforms

It is worth knowing that there are many different kinds of transforms, often with very similar structure. This is helpful, because the mgf may not differ much from a transform that you may already know, such as the generating function, Laplace transform, Fourier transform, etc. When you learn about one, you really learn about many. The following are different kinds of transforms:

(a) moment generating function (of a probability distribution or of a random variable)
(b) generating function (of a sequence)
(c) probability generating function (of a probability distribution or of a random variable)
(d) z transform
(e) characteristic function (of a probability distribution or of a random variable)
(f) Fourier transform (of a function)
(g) Laplace transform (of a function)

2. Definitions

(a) moment generating function

Given a random variable X, the moment generating function of X (really of its probability distribution) is

    psi_X(t) ≡ E[e^{tX}],

which is a function of the real variable t; see Section 2.6 of Ross. (I here use psi, whereas Ross uses phi.)
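As a concrete (optional) illustration, not taken from the lecture itself: the mgf of a fair six-sided die can be written out directly from the definition, psi(t) = (1/6) * sum of e^{tn} for n = 1, ..., 6, and its derivative at t = 0 recovers the mean E[X] = 3.5. A minimal Python sketch, approximating the derivative by a central difference:

```python
import math

def mgf_die(t):
    """mgf of a fair six-sided die: psi(t) = (1/6) * sum_{n=1}^{6} e^{t n}."""
    return sum(math.exp(t * n) for n in range(1, 7)) / 6.0

# psi(0) = 1 for every mgf, since E[e^{0 * X}] = E[1] = 1.
assert abs(mgf_die(0.0) - 1.0) < 1e-12

# psi'(0) = E[X]; approximate the derivative by a central difference.
h = 1e-5
mean_estimate = (mgf_die(h) - mgf_die(-h)) / (2 * h)
print(mean_estimate)  # close to 3.5, the mean of a fair die
```

The central difference has error of order h^2, so the estimate agrees with 3.5 to many decimal places; this is the numerical counterpart of the moment-generating property psi'(0) = E[X].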
Under regularity conditions (ensuring that the mgf is well defined, i.e., finite), there is a one-to-one correspondence between probability distributions and their mgf's. It is the probability distribution of X that is characterized by the mgf above (not the probability distribution on the underlying probability space on which X is defined). The random variable could have a continuous distribution or a discrete distribution.

Discrete case: Given a random variable X with a probability mass function (pmf)

    p_n ≡ P(X = n), n ≥ 0,

the moment generating function (mgf) of X (really of its probability distribution) is

    psi_X(t) ≡ E[e^{tX}] ≡ sum_{n=0}^{infinity} p_n e^{tn}.

The transform maps the pmf {p_n : n ≥ 0} (a function of n) into the associated function of t.

Continuous case: Given a random variable X with a probability density function (pdf) f ≡ f_X on the entire real line, the moment generating function (mgf) of X (really of its probability distribution) is

    psi(t) ≡ psi_X(t) ≡ E[e^{tX}] ≡ integral_{-infinity}^{infinity} f(x) e^{tx} dx.
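As an optional sanity check on the continuous-case formula (this example is mine, not from the lecture): for an exponential distribution with rate lambda, the integral can be evaluated in closed form, giving psi(t) = lambda / (lambda - t) for t < lambda. A short Python sketch approximates the integral by a midpoint Riemann sum and compares it to the closed form, using the arbitrary choices lambda = 2 and t = 0.5:

```python
import math

lam = 2.0   # rate of the exponential distribution (arbitrary choice)
t = 0.5     # transform argument; must satisfy t < lam for the mgf to be finite

def f(x):
    """pdf of an exponential(lam) random variable (zero for x < 0)."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Midpoint Riemann sum for psi(t) = integral of f(x) e^{t x} dx over [0, 20];
# the integrand decays like e^{-(lam - t) x}, so the tail beyond 20 is negligible.
dx = 1e-4
psi_numeric = sum(f(k * dx + dx / 2) * math.exp(t * (k * dx + dx / 2)) * dx
                  for k in range(200_000))

psi_exact = lam / (lam - t)   # known closed form for the exponential mgf
print(psi_numeric, psi_exact)  # both close to 4/3
```

The agreement illustrates that the mgf of a continuous distribution is just an ordinary integral; note that for the exponential distribution the mgf is finite only on t < lambda, one reason the regularity conditions above are needed.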