MIT6_042JF10_lec19

19 "mcs-ftl" — 2010/9/8 — 0:40 — page 497

Deviations

In some cases, a random variable is likely to be very close to its expected value. For example, if we flip 100 fair, mutually independent coins, it is very likely that we will get about 50 heads. In fact, we proved in Section 17.5 that the probabilities of getting fewer than 25 heads or more than 75 heads are each less than 3·10^(−7). In such cases, the mean provides a lot of information about the random variable.

In other cases, a random variable is likely to be far from its expected value. For example, suppose we flipped 100 fair coins that are glued together so that they all come out "heads" or they all come out "tails." In this case, the expected value of the number of heads is still 50, but the actual number of heads is guaranteed to be far from this value: it will be 0 or 100, each with probability 1/2.

Mathematicians have developed a variety of measures and methods to help us understand how a random variable performs in comparison to its mean. The simplest and most widely used measure is called the variance of the random variable. The variance is a single value associated with the random variable that is large for random variables that are likely to deviate significantly from the mean, and small otherwise.

19.1 Variance

19.1.1 Definition and Examples

Consider the following two gambling games:

Game A: You win $2 with probability 2/3 and lose $1 with probability 1/3.

Game B: You win $1002 with probability 2/3 and lose $2001 with probability 1/3.

Which game would you rather play? Which game is better financially? We have the same probability, 2/3, of winning each game, but that does not tell the whole story. What about the expected return for each game? Let random variables A and B be the payoffs for the two games. For example, A is 2 with probability 2/3 and −1 with probability 1/3.
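The contrast between the two coin-flipping scenarios is easy to see empirically. The following is a minimal Python sketch (not from the text; the function names and the fixed random seed are our own) that simulates both: 100 independent fair coins, and 100 "glued" coins decided by a single flip.

```python
import random

def heads_independent(n=100, trials=10000, seed=42):
    """Count heads per trial when n fair coins are flipped independently."""
    rng = random.Random(seed)
    return [sum(rng.random() < 0.5 for _ in range(n)) for _ in range(trials)]

def heads_glued(n=100, trials=10000, seed=42):
    """All n coins are glued together: one flip decides every coin."""
    rng = random.Random(seed)
    return [n if rng.random() < 0.5 else 0 for _ in range(trials)]

indep = heads_independent()
glued = heads_glued()

mean = lambda xs: sum(xs) / len(xs)
print(mean(indep), mean(glued))   # both averages are near 50
print(min(indep), max(indep))     # independent counts cluster near 50
print(sorted(set(glued)))         # glued counts are only ever 0 or 100
```

Both empirical means sit near 50, matching the identical expectations, yet the independent counts concentrate tightly around 50 while the glued counts never land anywhere near it.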
We can compute the expected payoff for each game as follows:

Ex[A] = 2 · (2/3) + (−1) · (1/3) = 1,
Ex[B] = 1002 · (2/3) + (−2001) · (1/3) = 1.

The expected payoff is the same for both games, but they are obviously very different! The stakes are a lot higher for Game B, and so it is likely to deviate much farther from its mean than is Game A. This fact is captured by the notion of variance.

Definition 19.1.1. The variance Var[R] of a random variable R is

Var[R] ::= Ex[(R − Ex[R])²].

In words, the variance of a random variable R is the expectation of the square of the amount by which R differs from its expectation. Yikes! That's a mouthful. Try saying that 10 times in a row!

Let's look at this definition more carefully. We'll start with R − Ex[R]. That's the amount by which R differs from its expectation, and it is obviously an important measure. Next, we square this value; more on why we do that in a moment. Finally, we take the expected value of the square. If R is likely to deviate far from its mean, the square is likely to be large, and so the variance is large.
This note was uploaded on 01/19/2012 for the course CS 6.042J, taught by Professor Tom Leighton and Dr. Marten van Dijk during the Fall '10 term at MIT.
