Transformations

TRANSFORMATIONS OF RANDOM VARIABLES

1. INTRODUCTION

1.1. Definition. We are often interested in the probability distributions or densities of functions of one or more random variables. Suppose we have a set of random variables X_1, X_2, X_3, ..., X_n with a known joint probability and/or density function. We may want to know the distribution of some function of these random variables, Y = Phi(X_1, X_2, X_3, ..., X_n). Realized values of Y are related to realized values of the X's as follows:

    y = Phi(x_1, x_2, x_3, ..., x_n)    (1)

A simple example is a single random variable X with the transformation

    y = Phi(x) = log(x)    (2)

1.2. Techniques for finding the distribution of a transformation of random variables.

1.2.1. Distribution function technique. We find the region in (x_1, x_2, x_3, ..., x_n) space such that Phi(x_1, x_2, ..., x_n) <= phi. We can then find the probability that Phi(x_1, x_2, ..., x_n) <= phi, i.e., P[Phi(x_1, x_2, ..., x_n) <= phi], by integrating the density function f(x_1, x_2, ..., x_n) over this region. Of course, F_Phi(phi) is just P[Phi <= phi]. Once we have F_Phi(phi), we can find the density by differentiation.

1.2.2. Method of transformations (inverse mappings). Suppose we know the density function of X. Also suppose that the function y = Phi(x) is differentiable and monotonic over the range of values for which the density f(x) > 0. This means that we can solve the equation y = Phi(x) for x as a function of y. We can then use this inverse mapping to find the density function of Y. A similar approach works when there is more than one variable X, in which case there is more than one mapping Phi.

1.2.3. Method of moment generating functions. There is a theorem (Casella [2, p. 65]) stating that if two random variables have identical moment generating functions, then they possess the same probability distribution.
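As a numerical illustration of this uniqueness property (a sketch added here, not part of the original notes), the sum of two independent N(0, 1) variables and a single N(0, sqrt(2)) variable have the same distribution, so their moment generating functions M(t) = E[e^{tX}] should agree near t = 0; the closed form for a N(0, sigma^2) variable is exp(sigma^2 t^2 / 2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sum of two independent standard normals: distributed N(0, 2).
s = rng.standard_normal(n) + rng.standard_normal(n)

# A single normal with variance 2.
z = rng.normal(0.0, np.sqrt(2.0), n)

# Empirical MGF M(t) = E[exp(t X)] on a small neighborhood of 0.
ts = np.array([-0.5, -0.25, 0.25, 0.5])
mgf_s = np.array([np.mean(np.exp(t * s)) for t in ts])
mgf_z = np.array([np.mean(np.exp(t * z)) for t in ts])

# Closed form for N(0, 2): exp(sigma^2 t^2 / 2) = exp(t^2).
mgf_exact = np.exp(ts ** 2)

print(np.max(np.abs(mgf_s - mgf_exact)))
print(np.max(np.abs(mgf_z - mgf_exact)))
```

Both empirical MGFs match the closed form to Monte Carlo accuracy, consistent with the two constructions yielding the same distribution.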
The procedure is to find the moment generating function of Phi and then compare it to known moment generating functions to see if there is a match. This is most commonly done to see whether a distribution approaches the normal distribution as the sample size goes to infinity. The theorem is presented here for completeness.

Theorem 1. Let F_X(x) and F_Y(y) be two cumulative distribution functions all of whose moments exist. Then

a: If X and Y have bounded support, then F_X(u) = F_Y(u) for all u if and only if E[X^r] = E[Y^r] for all integers r = 0, 1, 2, ....

b: If the moment generating functions exist and M_X(t) = M_Y(t) for all t in some neighborhood of 0, then F_X(u) = F_Y(u) for all u.

For further discussion, see Billingsley [1, ch. 21-22].

Date: August 9, 2004.

2. DISTRIBUTION FUNCTION TECHNIQUE

2.1. Procedure for using the distribution function technique. As stated earlier, we find the region in (x_1, x_2, x_3, ..., x_n) space such that Phi(x_1, x_2, ..., x_n) <= phi. We can then find the probability that Phi...
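The procedure above can be sketched numerically. As an illustration (an example added here, not from the original notes), take X uniform on (0, 1) and Y = Phi(X) = X^2. The region where Phi(x) <= phi is {x : x <= sqrt(phi)}, so F_Y(phi) = P(X <= sqrt(phi)) = sqrt(phi) for 0 <= phi <= 1, and differentiating gives the density f_Y(phi) = 1 / (2 sqrt(phi)):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw X ~ Uniform(0, 1) and apply the transformation Phi(x) = x^2.
x = rng.uniform(0.0, 1.0, 500_000)
y = x ** 2

# Distribution function technique: for 0 <= phi <= 1,
#   F_Y(phi) = P(X^2 <= phi) = P(X <= sqrt(phi)) = sqrt(phi),
# and the density follows by differentiation: f_Y(phi) = 1 / (2 sqrt(phi)).
phis = np.array([0.1, 0.25, 0.5, 0.9])
empirical_cdf = np.array([np.mean(y <= p) for p in phis])
exact_cdf = np.sqrt(phis)

print(np.max(np.abs(empirical_cdf - exact_cdf)))
```

The empirical distribution function of the simulated Y agrees with sqrt(phi) to Monte Carlo accuracy, which is exactly what integrating the uniform density over the region {x : x^2 <= phi} predicts.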

This note was uploaded on 01/01/2011 for the course STAT 641 taught by Professor Weining during the Spring '10 term at Bowling Green.

