Notes on random variables, density functions, and measures

September 29, 2010

1  Probability density functions and probability measures

We defined a probability density function (pdf for short) as a function $f : \mathbb{R} \to [0, \infty)$ satisfying
$$\int_{\mathbb{R}} f(x)\, dx = 1.$$
(Of course, this means that $f$ must be Lebesgue-integrable, and hence measurable.) Likewise, we defined a probability mass function, which I will also refer to as a pdf, as a function $p : \{a_1, a_2, a_3, \ldots\} \to [0, 1]$ satisfying
$$\sum_i p(a_i) = 1.$$
So, mass functions are analogues of density functions for discrete (finite, or possibly countably infinite) sets of real numbers.

What a pdf gives us is a probability measure for subsets of the real numbers. Specifically, suppose $A \in \mathcal{B}(\mathbb{R})$. Then, we can define
$$P(A) := \int_{\mathbb{R}} \mathbf{1}_A(x) f(x)\, dx = \int_A f(x)\, dx. \tag{1}$$
And we can do the same in the discrete case, for if $A \subseteq \{a_1, a_2, \ldots\}$, then we can define
$$P(A) := \sum_{a_i \in A} p(a_i).$$

There is a way to unify both of these types of measures: Basically, let $\mu : \mathcal{B}(\mathbb{R}) \to [0, \infty)$. Then, for $A \in \mathcal{B}(\mathbb{R})$ we use the notation
$$\int_{\mathbb{R}} \mathbf{1}_A(x)\, d\mu(x),$$
or, alternatively...
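As a concrete illustration (not part of the notes themselves), the two definitions of $P(A)$ can be checked numerically. The sketch below approximates $P(A) = \int_A f(x)\,dx$ for a specific density by a midpoint Riemann sum, and computes the discrete analogue $\sum_{a_i \in A} p(a_i)$ for a specific mass function; the density $f(x) = e^{-x}$ on $[0, \infty)$, the mass function $p(a_i) = 2^{-i}$, and the sets $A$ are illustrative choices, not examples from the notes.

```python
import math

def density_measure(f, a, b, n=100_000):
    """Approximate P(A) = integral of f over A = [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Continuous case: density f(x) = e^{-x} on [0, infinity) (exponential, rate 1).
f = lambda x: math.exp(-x)

# P([0, 1]) = 1 - e^{-1}, approximated numerically.
p_interval = density_measure(f, 0.0, 1.0)

# Discrete case: mass function p(a_i) = 2^{-i} on {1, 2, 3, ...},
# truncated at i = 59 (the remaining tail mass is negligible).
p = {i: 2.0 ** -i for i in range(1, 60)}

# P(A) for A = {1, 3} is the sum p(1) + p(3) = 1/2 + 1/8.
P_A = sum(p[i] for i in {1, 3})
```

The midpoint rule is used only for simplicity; any quadrature scheme (e.g. `scipy.integrate.quad`) would serve the same purpose of approximating the Lebesgue integral in equation (1) when $f$ is well-behaved.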