5 ways to describe distributions of random variables

The CDF of an arbitrary random variable X determines the probabilities of any events of the form {X ∈ A}. Of course, the CDF of a random variable X determines P{X ≤ c} for any real number c: by definition it is just F_X(c). Similarly, P{X ∈ (a, b]} = F_X(b) − F_X(a) whenever a < b. The next proposition explains how F_X also determines probabilities of the form P{X < c} and P{X = c}.

Proposition 3.1.2 Let X be a random variable and let c be any real number. Then P{X < c} = F_X(c−) and P{X = c} = ΔF_X(c), where F_X is the CDF of X, F_X(c−) denotes the left limit lim_{y↑c} F_X(y), and ΔF_X(c) = F_X(c) − F_X(c−) is the size of the jump of F_X at c.

Proof. Fix c and let c_1, c_2, ... be a sequence with c_1 < c_2 < ... such that lim_{j→∞} c_j = c. Let G_1 be the event G_1 = {X ≤ c_1}, and for j ≥ 2 let G_j = {c_{j−1} < X ≤ c_j}. Then for any n ≥ 1, {X ≤ c_n} = G_1 ∪ G_2 ∪ ··· ∪ G_n. Also, {X < c} = G_1 ∪ G_2 ∪ ··· and the events G_1, G_2, ... are mutually exclusive. Therefore, by Axiom P.2, P{X < c} = P(G_1) + P(G_2) + ···. The sum of a series is, by definition, the limit of its partial sums, and here the nth partial sum is P(G_1) + ··· + P(G_n) = P{X ≤ c_n} = F_X(c_n). Letting n → ∞ gives P{X < c} = lim_{n→∞} F_X(c_n) = F_X(c−). The second claim follows because P{X = c} = P{X ≤ c} − P{X < c} = F_X(c) − F_X(c−) = ΔF_X(c).
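To make the proposition concrete, here is a minimal Monte Carlo sketch (not part of the original notes) using a hypothetical mixed random variable: with probability 0.3, X equals 2 exactly (an atom), and otherwise X is uniform on [0, 4]. The distribution, sample size, and NumPy usage are illustrative assumptions; the point is only that the empirical versions of F_X(c), F_X(c−), and P{X = c} line up the way Proposition 3.1.2 says they should.

```python
# Sketch: numerically illustrate P{X < c} = F_X(c-) and P{X = c} = F_X(c) - F_X(c-)
# for a hypothetical mixed distribution (atom at 2 with prob 0.3, else Uniform[0, 4]).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Draw n samples: an atom at 2.0 with probability 0.3, otherwise Uniform(0, 4).
is_atom = rng.random(n) < 0.3
X = np.where(is_atom, 2.0, rng.uniform(0.0, 4.0, size=n))

c = 2.0
F_at_c = np.mean(X <= c)   # empirical F_X(c)
F_left = np.mean(X < c)    # empirical F_X(c-), since {X < c} excludes the atom
p_atom = np.mean(X == c)   # empirical P{X = c}

print(f"F_X(c)  ~ {F_at_c:.4f}  (exact 0.65 = 0.7*0.5 + 0.3)")
print(f"F_X(c-) ~ {F_left:.4f}  (exact 0.35 = 0.7*0.5)")
print(f"P(X=c)  ~ {p_atom:.4f}  (exact 0.30 = jump F_X(c) - F_X(c-))")
```

Running this, the three estimates should be close to 0.65, 0.35, and 0.30 respectively, matching the exact values computed from the assumed distribution.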