
# density_notes - Notes on random variables, density functions, and measures

Notes on random variables, density functions, and measures

September 29, 2010

## 1 Probability density functions and probability measures

We defined a probability density function (pdf for short) as a function $f : \mathbb{R} \to [0, \infty)$ satisfying

$$\int_{\mathbb{R}} f(x)\, dx = 1.$$

(Of course, this means that $f$ must be Lebesgue-integrable, and hence measurable.) Likewise, we defined a "probability mass function", which I will also refer to as a pdf, as a function $p : \{a_1, a_2, a_3, \ldots\} \to [0, 1]$ satisfying

$$\sum_i p(a_i) = 1.$$

So, mass functions are analogues of density functions for discrete (finite, or possibly countably infinite) sets of real numbers.

What a pdf gives us is a probability measure for subsets of the real numbers. Specifically, suppose $A \in \mathcal{B}(\mathbb{R})$. Then, we can define

$$P(A) := \int_{\mathbb{R}} \mathbf{1}_A(x)\, f(x)\, dx = \int_A f(x)\, dx. \tag{1}$$

And we can do the same in the discrete case, for if $A \subseteq \{a_1, a_2, \ldots\}$, then we can define

$$P(A) := \sum_{a_i \in A} p(a_i).$$

There is a way to unify both of these types of measures: basically, let $\mu : \mathcal{B}(\mathbb{R}) \to [0, \infty)$. Then, for $A \in \mathcal{B}(\mathbb{R})$ we use the notation

$$\int_{\mathbb{R}} \mathbf{1}_A(x)\, d\mu(x), \quad \text{or, alternatively} \ldots$$
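The two constructions above can be illustrated numerically. Below is a minimal sketch (not part of the original notes) that checks the normalization condition for a specific pdf, the Exponential(1) density $f(x) = e^{-x}$ for $x \ge 0$, and computes $P(A)$ for an interval $A$ via equation (1); the pmf $p(a_i) = 2^{-i}$ on $\{1, 2, 3, \ldots\}$ illustrates the discrete case. The midpoint Riemann sum stands in for the integral, and the truncation at $x = 50$ is an assumption justified by the negligible tail.

```python
import math

def f(x):
    # Exponential(1) density: a valid pdf, since it integrates to 1 over R.
    return math.exp(-x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint Riemann sum as a numerical stand-in for the integral.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Normalization: integral of f over R (truncated at x = 50; the tail is negligible).
total = integrate(f, 0.0, 50.0)

# P(A) for A = [1, 2], via equation (1): integrate f over the set A.
p_A = integrate(f, 1.0, 2.0)

# Discrete case: pmf p(a_i) = 2^{-i} on {1, 2, 3, ...}; P(A) sums p over A.
p = lambda i: 0.5 ** i
p_B = sum(p(i) for i in [1, 3])  # A = {a_1, a_3}

print(round(total, 4))  # close to 1.0
print(round(p_A, 4))    # close to e^{-1} - e^{-2}, about 0.2325
print(round(p_B, 4))    # 0.5 + 0.125 = 0.625
```

Note that the same `integrate` call computes both the total mass and $P(A)$; only the set over which we integrate changes, which is exactly the point of equation (1).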

*This note was uploaded on 10/23/2011 for the course MATH 3225 taught by Professor Staff during the Spring '08 term at Georgia Tech.*
