Statistics 851 (Fall 2013)                                October 16, 2013
Prof. Michael Kozdron

Lecture #17: Expectation of a Simple Random Variable

Recall that a simple random variable is one that takes on finitely many values.

Definition. Let $(\Omega, \mathcal{F}, P)$ be a probability space. A random variable $X : \Omega \to \mathbb{R}$ is called simple if it can be written as
\[
X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i}
\]
where $a_i \in \mathbb{R}$ and $A_i \in \mathcal{F}$ for $i = 1, 2, \ldots, n$. We define the expectation of $X$ to be
\[
E(X) = \sum_{i=1}^{n} a_i \, P\{A_i\}.
\]

Example 17.1. Consider the probability space $(\Omega, \mathcal{B}_1, P)$ where $\Omega = [0,1]$, $\mathcal{B}_1$ denotes the Borel sets of $[0,1]$, and $P$ is the uniform probability on $\Omega$. Suppose that the random variable $X : \Omega \to \mathbb{R}$ is defined by
\[
X(\omega) = \sum_{i=1}^{4} a_i \mathbf{1}_{A_i}(\omega)
\]
where $a_1 = 4$, $a_2 = 2$, $a_3 = 1$, $a_4 = -1$, and
\[
A_1 = [0, \tfrac{1}{2}), \quad A_2 = [\tfrac{1}{4}, \tfrac{3}{4}), \quad A_3 = (\tfrac{1}{2}, \tfrac{7}{8}], \quad A_4 = [\tfrac{7}{8}, 1].
\]
Show that there exist finitely many real constants $c_1, \ldots, c_n$ and disjoint sets $C_1, \ldots, C_n \in \mathcal{B}_1$ such that
\[
X = \sum_{i=1}^{n} c_i \mathbf{1}_{C_i}.
\]

Solution. We find
\[
X(\omega) =
\begin{cases}
4, & \text{if } 0 \le \omega < 1/4, \\
6, & \text{if } 1/4 \le \omega < 1/2, \\
2, & \text{if } \omega = 1/2, \\
3, & \text{if } 1/2 < \omega < 3/4, \\
1, & \text{if } 3/4 \le \omega < 7/8, \\
0, & \text{if } \omega = 7/8, \\
-1, & \text{if } 7/8 < \omega \le 1,
\end{cases}
\]
so that
\[
X = \sum_{i=1}^{7} c_i \mathbf{1}_{C_i}
\]
where $c_1 = 4$, $c_2 = 6$, $c_3 = 2$, $c_4 = 3$, $c_5 = 1$, $c_6 = 0$, $c_7 = -1$ and
\[
C_1 = [0, \tfrac{1}{4}), \quad C_2 = [\tfrac{1}{4}, \tfrac{1}{2}), \quad C_3 = \{\tfrac{1}{2}\}, \quad C_4 = (\tfrac{1}{2}, \tfrac{3}{4}), \quad C_5 = [\tfrac{3}{4}, \tfrac{7}{8}), \quad C_6 = \{\tfrac{7}{8}\}, \quad C_7 = (\tfrac{7}{8}, 1].
\]
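As a sanity check (not part of the original exercise), one can compute $E(X)$ from both representations. Under the uniform probability $P$ on $[0,1]$, the probability of each interval is its length, so
\[
E(X) = \sum_{i=1}^{4} a_i P\{A_i\} = 4 \cdot \tfrac{1}{2} + 2 \cdot \tfrac{1}{2} + 1 \cdot \tfrac{3}{8} + (-1) \cdot \tfrac{1}{8} = \tfrac{13}{4},
\]
while the disjoint representation gives
\[
\sum_{i=1}^{7} c_i P\{C_i\} = 4 \cdot \tfrac{1}{4} + 6 \cdot \tfrac{1}{4} + 2 \cdot 0 + 3 \cdot \tfrac{1}{4} + 1 \cdot \tfrac{1}{8} + 0 \cdot 0 + (-1) \cdot \tfrac{1}{8} = \tfrac{13}{4}
\]
as well. In particular, the value of $E(X)$ does not depend on which representation of the simple random variable is used.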
Proposition 17.2. If $X$ and $Y$ are simple random variables, then
\[
E(\alpha X + \beta Y) = \alpha E(X) + \beta E(Y)
\]
for every $\alpha, \beta \in \mathbb{R}$.

Proof. Suppose that $X$ and $Y$ are simple random variables with
\[
X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i} \quad \text{and} \quad Y = \sum_{j=1}^{m} b_j \mathbf{1}_{B_j}
\]
where $A_1, \ldots, A_n \in \mathcal{F}$ and $B_1, \ldots, B_m \in \mathcal{F}$ each partition $\Omega$. Since
\[
\alpha X = \alpha \sum_{i=1}^{n} a_i \mathbf{1}_{A_i} = \sum_{i=1}^{n} (\alpha a_i) \mathbf{1}_{A_i},
\]
we conclude by definition that
\[
E(\alpha X) = \sum_{i=1}^{n} (\alpha a_i) P\{A_i\} = \alpha \sum_{i=1}^{n} a_i P\{A_i\} = \alpha E(X).
\]
The proof of the theorem will be completed by showing $E(X+Y) = E(X) + E(Y)$. Notice that $\{A_i \cap B_j : 1 \le i \le n, \ 1 \le j \le m\}$ consists of pairwise disjoint events whose union is $\Omega$, and
\[
X + Y = \sum_{i=1}^{n} \sum_{j=1}^{m} (a_i + b_j) \mathbf{1}_{A_i \cap B_j}.
\]
Therefore, by definition,
\[
E(X+Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} (a_i + b_j) P\{A_i \cap B_j\}
= \sum_{i=1}^{n} \sum_{j=1}^{m} a_i P\{A_i \cap B_j\} + \sum_{i=1}^{n} \sum_{j=1}^{m} b_j P\{A_i \cap B_j\}
= \sum_{i=1}^{n} a_i P\{A_i\} + \sum_{j=1}^{m} b_j P\{B_j\},
\]
where the last equality uses the fact that the $B_j$ partition $\Omega$ so that $\sum_{j=1}^{m} P\{A_i \cap B_j\} = P\{A_i\}$, and similarly $\sum_{i=1}^{n} P\{A_i \cap B_j\} = P\{B_j\}$. Hence $E(X+Y) = E(X) + E(Y)$, and the proof is complete.

Fact. If $X$ and $Y$ are simple random variables with $X \le Y$, then $E(X) \le E(Y)$.

Exercise 17.3. Prove the previous fact.
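As a small illustration of Proposition 17.2 (not part of the original notes), take $\Omega = [0,1]$ with the uniform probability as in Example 17.1, and let $X = \mathbf{1}_{[0,1/2)}$ and $Y = \mathbf{1}_{[1/4,3/4)}$. Then
\[
X + Y = \mathbf{1}_{[0,\frac{1}{4})} + 2 \cdot \mathbf{1}_{[\frac{1}{4},\frac{1}{2})} + \mathbf{1}_{[\frac{1}{2},\frac{3}{4})},
\]
so by the definition of expectation for simple random variables,
\[
E(X+Y) = 1 \cdot \tfrac{1}{4} + 2 \cdot \tfrac{1}{4} + 1 \cdot \tfrac{1}{4} = 1 = \tfrac{1}{2} + \tfrac{1}{2} = E(X) + E(Y),
\]
in agreement with the proposition.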
Having already defined $E(X)$ for simple random variables, our goal now is to construct $E(X)$ in general. To that end, suppose that $X$ is a positive random variable; that is, $X(\omega) \ge 0$ for all $\omega \in \Omega$. (We will need to allow $X(\omega) \in [0, +\infty]$ for some consistency.)

Definition. If $X$ is a positive random variable, define the expectation of $X$ to be
\[
E(X) = \sup\{E(Y) : Y \text{ is simple and } 0 \le Y \le X\}.
\]
That is, we approximate positive random variables by simple random variables. Of course, this leads to the question of whether or not this is possible.

Fact. For every random variable $X \ge 0$, there exists a sequence $(X_n)$ of positive, simple random variables with $X_n \uparrow X$ (that is, $X_n$ increases to $X$).
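For reference, one standard way to produce such a sequence (a sketch not given in this portion of the notes) is the dyadic approximation
\[
X_n(\omega) = \sum_{k=0}^{n 2^n - 1} \frac{k}{2^n} \, \mathbf{1}_{\left\{ \frac{k}{2^n} \le X < \frac{k+1}{2^n} \right\}}(\omega) + n \, \mathbf{1}_{\{X \ge n\}}(\omega).
\]
Each $X_n$ is simple (it takes at most $n 2^n + 1$ values), satisfies $0 \le X_n \le X$, and the sequence is nondecreasing because each dyadic interval at level $n$ splits into two at level $n+1$. Moreover, if $X(\omega) < \infty$ then $0 \le X(\omega) - X_n(\omega) \le 2^{-n}$ once $n > X(\omega)$, while if $X(\omega) = +\infty$ then $X_n(\omega) = n \to \infty$; in either case $X_n(\omega) \uparrow X(\omega)$.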