Chapter 6: Expected Value and Variance

6.1 Expected Value of Discrete Random Variables

When a large collection of numbers is assembled, as in a census, we are usually interested not in the individual numbers, but rather in certain descriptive quantities such as the average or the median. In general, the same is true for the probability distribution of a numerically-valued random variable. In this and in the next section, we shall discuss two such descriptive quantities: the expected value and the variance. Both of these quantities apply only to numerically-valued random variables, and so we assume, in these sections, that all random variables have numerical values. To give some intuitive justification for our definition, we consider the following game.

Average Value

A die is rolled. If an odd number turns up, we win an amount equal to this number; if an even number turns up, we lose an amount equal to this number. For example, if a two turns up we lose 2, and if a three comes up we win 3. We want to decide if this is a reasonable game to play. We first try simulation. The program Die carries out this simulation. The program prints the frequency and the relative frequency with which each outcome occurs. It also calculates the average winnings. We have run the program twice. The results are shown in Table 6.1.

| Winning | Frequency (n = 100) | Relative Frequency (n = 100) | Frequency (n = 10000) | Relative Frequency (n = 10000) |
|---------|---------------------|------------------------------|-----------------------|--------------------------------|
| 1       | 17                  | .17                          | 1681                  | .1681                          |
| -2      | 17                  | .17                          | 1678                  | .1678                          |
| 3       | 16                  | .16                          | 1626                  | .1626                          |
| -4      | 18                  | .18                          | 1696                  | .1696                          |
| 5       | 16                  | .16                          | 1686                  | .1686                          |
| -6      | 16                  | .16                          | 1633                  | .1633                          |

Table 6.1: Frequencies for dice game.

In the first run we played the game 100 times. In this run our average gain is $-.57$. It looks as if the game is unfavorable, and we wonder how unfavorable it really is. To get a better idea, we played the game 10,000 times. In this case our average gain is $-.4949$. We note that the relative frequency of each of the six possible outcomes is quite close to the probability $1/6$ for this outcome. This corresponds to our frequency interpretation of probability. It also suggests that for very large numbers of plays, our average gain should be

$$1 \cdot \frac{1}{6} - 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} - 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} - 6 \cdot \frac{1}{6} = \frac{9}{6} - \frac{12}{6} = -\frac{3}{6} = -.5\,.$$

This agrees quite well with our average gain for 10,000 plays. We note that the value we have chosen for the average gain is obtained by taking the possible outcomes, multiplying by the probability, and adding the results. This suggests the following definition for the expected outcome of an experiment.

Expected Value

Definition 6.1 Let $X$ be a numerically-valued discrete random variable with sample space $\Omega$ and distribution function $m(x)$. The expected value $E(X)$ is defined by

$$E(X) = \sum_{x \in \Omega} x\, m(x)\,,$$

provided this sum converges absolutely. We often refer to the expected value as the mean, and denote $E(X)$ by $\mu$ for short. If the above sum does not converge absolutely, then we say that $X$ does not have an expected value.
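The program Die itself is not reproduced in this excerpt. As a minimal sketch of what it does, the following Python program plays the same game repeatedly and reports the average winnings, and also evaluates the sum $\sum_x x\, m(x)$ from Definition 6.1 exactly for this game. The function names (`die_winning`, `simulate`, `expected_value`) are illustrative choices, not names used by the text.

```python
import random
from fractions import Fraction

def die_winning(face):
    # We win the face value on an odd roll and lose it on an even roll.
    return face if face % 2 == 1 else -face

def simulate(n, seed=0):
    # Play the die game n times and return the average winnings,
    # mirroring the average reported by the text's program Die.
    rng = random.Random(seed)
    total = sum(die_winning(rng.randint(1, 6)) for _ in range(n))
    return total / n

def expected_value():
    # E(X) = sum over x of x * m(x), where m(x) = 1/6 for each face.
    # Exact arithmetic with Fraction avoids any rounding.
    return sum(Fraction(die_winning(face), 6) for face in range(1, 7))

if __name__ == "__main__":
    print("exact expected value:", expected_value())   # -1/2
    print("average over 10,000 plays:", simulate(10000))
```

As in the text's two runs, the simulated average for a large number of plays should land close to the exact expected value of $-.5$, while short runs (say 100 plays) fluctuate more.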