Discrete-time stochastic processes


... is undefined if both $\int_0^{\infty} x f_X(x)\,dx = \infty$ and $\int_{-\infty}^{0} (-x) f_X(x)\,dx = \infty$. Otherwise, the expected value is given by

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx. \tag{1.22}$$

An example of a rv whose expectation is undefined is given by the following example of a Cauchy rv.

Example 1.3.3. A Cauchy rv has the probability density $f_X(x) = \frac{1}{\pi(1+x^2)}$. Note that, both as $x \to \infty$ and as $x \to -\infty$, $x/(1+x^2)$ goes to zero as $1/x$. Thus $\int_0^{\infty} x f_X(x)\,dx$ and $\int_{-\infty}^{0} -x f_X(x)\,dx$ are both infinite and $E[X]$ is undefined. On the other hand, we see from symmetry that the Cauchy principal value of the integral in (1.22) is given by

$$\lim_{A \to \infty} \int_{-A}^{+A} \frac{x}{\pi(1+x^2)}\,dx = 0.$$

The rv $M$ in Example 1.3.3 has this same behavior, i.e., $\lim_{A \to \infty} \sum_{m=-A}^{A} m\, p_M(m) = 0$. For most uses of expectations, a rv whose expectation is undefined behaves very differently from one whose expectation is defined, and the existence of a Cauchy principal value is not of great significance.

The expected value of a rv $X$ will also be called its mean and will be denoted both by $E[X]$ and by $\overline{X}$.

We next derive an alternate expression for the expected value in (1.20) of a positive discrete rv. This new expression is given directly in terms of the distribution function. We then use this new expression as a general definition of expectation which applies to all rv's, whether discrete, continuous, or arbitrary. It contains none of the convergence questions that sometimes arise for continuous rv's with very wild densities. For a positive discrete rv $X$, we will show that (1.20) is simply the integral of the complementary distribution function, where the complementary distribution function of a rv is defined as $\Pr\{X > x\} = 1 - F_X(x)$:

$$E[X] = \int_0^{\infty} [1 - F_X(x)]\,dx. \tag{1.23}$$

[Figure 1.3 graphic omitted: it shows the decreasing staircase $1 - F_X(x)$, which drops at each $a_i$, partitioned into rectangles of area $a_i\, p_X(a_i)$.]

Figure 1.3: For this example, $X$ takes on four possible values, $0 \le a_1 < a_2 < a_3 < a_4$. Thus $1 - F_X(x) = \Pr\{X > x\} = 1$ for $x < a_1$. For $a_1 \le x < a_2$, $\Pr\{X > x\} = 1 - p_X(a_1)$, and $\Pr\{X > x\}$ has similar drops as $x$ reaches $a_2$, $a_3$, and $a_4$. $E[X]$, from (1.20), is $\sum_i a_i\, p_X(a_i)$, which is the sum of the rectangles in the figure. This is also the area under the curve $1 - F_X(x)$, i.e., $\int_0^{\infty} [1 - F_X(x)]\,dx$. It can be seen that this argument applies to any non-negative rv, thus verifying (1.23).

This relationship is just as important for conceptual purposes as for computational purposes. To derive it for a discrete random variable, consider the sketch of the complementary distribution function, $\Pr\{X > x\} = 1 - F_X(x)$, in Figure 1.3. The figure shows that $\sum_i a_i\, p_X(a_i)$ is equal to the area under the curve $1 - F_X(x)$ from $x = 0$ to $\infty$. For an arbitrary non-negative rv, we can visualize quantizing the distribution function, using the above argument, and then passing to the limit of arbitrarily fine quantizing. Since there are no mathematical subtleties in integrating an arbitrary non-negative decreasing function, this is not only useful in calculating expectations, but is also the fundamental way to define expectation. Note that it is possible for $\overline{X}$ to be infinite, but this simply arises when the integral in (1.23) is infinite. One can also derive (1.23) from (1.22) by integration by parts, but it is the more fundamental graphical argument that shows that the complementary distribution function can always be integrated with either a finite or infinite value.
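The rectangle argument behind (1.23) is easy to check numerically. The following is a minimal sketch, not part of the original notes; the support points $a_i$ and pmf values $p_X(a_i)$ are made-up numbers chosen only for illustration. It computes $E[X]$ both from (1.20) and as the integral of the complementary distribution function:

```python
# Illustrative sketch only (not from the notes): check the tail-integral
# identity (1.23), E[X] = integral of Pr{X > x} over [0, infinity), for a
# small non-negative discrete rv.  The support points a_i and pmf values
# p_X(a_i) below are made-up numbers chosen just for the demonstration.
import numpy as np

a = np.array([1.0, 2.0, 4.0, 7.0])   # support: 0 <= a1 < a2 < a3 < a4
p = np.array([0.4, 0.3, 0.2, 0.1])   # pmf values, summing to 1

# E[X] directly from (1.20): sum_i a_i * p_X(a_i)
mean_direct = float(np.sum(a * p))

# 1 - F_X(x) is a decreasing step function, so its integral is a finite
# sum of rectangles: width (a_i - a_{i-1}), height Pr{X > a_{i-1}}.
edges = np.concatenate(([0.0], a))                   # 0, a1, a2, a3, a4
tails = 1.0 - np.cumsum(np.concatenate(([0.0], p)))  # Pr{X > x} on each interval
mean_from_ccdf = float(np.sum(np.diff(edges) * tails[:-1]))

print(mean_direct, mean_from_ccdf)   # both equal 2.5 for these numbers
```

Because $1 - F_X(x)$ is a decreasing step function for a discrete rv, the integral reduces to exactly the finite sum of rectangles shown in Figure 1.3.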
For an arbitrary (rather than nonnegative) rv, the mean, as illustrated in Figure 1.4 and analyzed in Exercise 1.7, generalizes to

$$E[X] = -\int_{-\infty}^{0} F_X(x)\,dx + \int_{0}^{\infty} [1 - F_X(x)]\,dx. \tag{1.24}$$

[Figure 1.4 graphic omitted: it shows $F_X(x)$ for $x < 0$, with value $F_X(0)$ at the origin, and $1 - F_X(x)$ for $x \ge 0$, with value $1 - F_X(0)$ at the origin.]

Figure 1.4: The figure shows the complementary distribution function, $1 - F_X(x)$, of a rv $X$ for $x$ between $0$ and $\infty$, and shows the distribution function, $F_X(x)$, for $x$ between $-\infty$ and $0$. The expectation of $X$, $E[X]$, is the area under the curve on the right less the area under the curve on the left. If the area under each curve is finite, then $E[X]$ is finite. If one area is finite and the other infinite, then $E[X]$ is $\pm\infty$, and if both are infinite then $E[X]$ is undefined.

Definition 1.6. The expectation of a rv $X$ for which at least one of the two terms in (1.24) is finite is given by (1.24). If both these terms are infinite, the expectation is undefined.

Given this as the definition of expectation, one can use the arguments above to see that the formulas for discrete and continuous expectations in (1.20) and (1.22) respectively follow from this definition.

1.3.7 Random variables as functions of other random variables

Random variables (rv's) are often defined in terms of each other. For example, if $g$ is a function from $\mathbb{R}$ to $\mathbb{R}$ and $X$ is a rv, then $Y = g(X)$ is the random variable that maps each sample point $\omega$ into the composite function $g(X(\omega))$. As indicated in Exercise 1.15, one can find the expected value of $Y$ (if it is defined) in either of the following...
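A similar numerical check can be run on the two-sided formula (1.24). This is again an illustrative sketch rather than part of the original notes; the choice $X \sim \text{Uniform}[-1, 2]$ (for which $E[X] = 1/2$) is an assumption made purely for the example:

```python
# Illustrative sketch only (not from the notes): numerically check the
# two-sided formula (1.24) for a rv taking both signs.  The choice
# X ~ Uniform[-1, 2] (so E[X] = 1/2) is an assumption made for the example.
import numpy as np

def F(x):
    """CDF of X ~ Uniform[-1, 2]."""
    return np.clip((x + 1.0) / 3.0, 0.0, 1.0)

def integrate(f, a, b, n=200_000):
    """Simple midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    x = a + h * (np.arange(n) + 0.5)
    return float(np.sum(f(x)) * h)

# F_X(x) = 0 below -1 and 1 - F_X(x) = 0 above 2, so finite limits suffice.
left = integrate(F, -1.0, 0.0)                     # area under F_X on (-inf, 0]
right = integrate(lambda x: 1.0 - F(x), 0.0, 2.0)  # area under 1 - F_X on [0, inf)

print(-left + right)   # approximately 0.5 = E[X]
```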