The next result will lead to a number of examples.


Multiplying Y by 1_A sets the product equal to 0 on A^c and leaves the values on A unchanged. Finally, we define the conditional expectation of Y given A to be

E(Y | A) = E(Y; A) / P(A)

This is the expected value for the conditional probability defined by

P(· | A) = P(· ∩ A) / P(A)

Example 5.1. A simple but important special case arises when the random variable Y and the set A are independent, i.e., for any set B we have

P(Y ∈ B, A) = P(Y ∈ B) P(A)

Noticing that this implies that

P(Y ∈ B, A^c) = P(Y ∈ B) P(A^c)

and comparing with the definition of independence of random variables in (A.13), we see that this holds if and only if Y and 1_A are independent, so Theorem A.1 implies

E(Y; A) = E(Y 1_A) = EY · E 1_A

and we have

E(Y | A) = EY                                        (5.1)

It is easy to see from the definition that the integral over A is linear:

E(Y + Z; A) = E(Y; A) + E(Z; A)                      (5.2)

so, dividing by P(A), conditional expectation also has this property:

E(Y + Z | A) = E(Y | A) + E(Z | A)                   (5.3)

Here and in later formulas and theorems, we...
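As a concrete illustration of the definition and of (5.1), here is a small worked example (a fair-die setup chosen purely for illustration, not drawn from the excerpt above). Let Y be the outcome of one roll of a fair six-sided die, so EY = 7/2. First take A = {Y is even}, so P(A) = 1/2. Then

E(Y; A) = E(Y 1_A) = (2 + 4 + 6)/6 = 2,    E(Y | A) = E(Y; A)/P(A) = 2/(1/2) = 4

Here Y and A are not independent, and indeed E(Y | A) = 4 ≠ EY = 7/2. If instead A = {a second, independent roll is even}, then Y and 1_A are independent, so

E(Y; A) = EY · E 1_A = (7/2)(1/2) = 7/4,    E(Y | A) = (7/4)/(1/2) = 7/2 = EY,

in agreement with (5.1). The linearity properties (5.2) and (5.3) can be checked in the same direct way, for instance by taking Z = Y^2 and computing both sides.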