# Lecture 21 - CS 70 Discrete Mathematics for CS, Spring 2005

## CS 70 Discrete Mathematics for CS, Spring 2005 — Clancy/Wagner — Notes 21: Some Important Distributions

**Question:** A biased coin with Heads probability $p$ is tossed repeatedly until the first Head appears. What is the expected number of tosses?

As always, our first step in answering the question must be to define the sample space. A moment's thought tells us that $\Omega = \{H, TH, TTH, TTTH, \ldots\}$, i.e., $\Omega$ consists of all sequences over the alphabet $\{H, T\}$ that end with $H$ and contain no other $H$'s. This is our first example of an infinite sample space (though it is still discrete).

What is the probability of a sample point, say $\omega = TTH$? Since successive coin tosses are independent (this is implicit in the statement of the problem), we have

$$\Pr[TTH] = (1-p)\times(1-p)\times p = (1-p)^2 p.$$

And generally, for any sequence $\omega$ of length $i$, we have $\Pr[\omega] = (1-p)^{i-1}p$.

To be sure everything is consistent, we should check that the probabilities of all the sample points add up to 1. Since there is exactly one sequence of each length $i \ge 1$ in $\Omega$, we have

$$\sum_{\omega \in \Omega}\Pr[\omega] = \sum_{i=1}^{\infty}(1-p)^{i-1}p = p\sum_{i=0}^{\infty}(1-p)^{i} = p\times\frac{1}{1-(1-p)} = 1,$$

as expected. [In the second-last step here, we used the formula for summing a geometric series.]

Now let the random variable $X$ denote the number of tosses in our sequence (i.e., $X(\omega)$ is the length of $\omega$). Our goal is to compute $E(X)$. Despite the fact that $X$ counts something, there's no obvious way to write it as a sum of simple r.v.'s as we did in many examples in the last lecture. (Try it!) Instead, let's just dive in and try a direct computation. Note that the distribution of $X$ is quite simple:

$$\Pr[X = i] = (1-p)^{i-1}p \quad \text{for } i = 1, 2, 3, \ldots$$

So from the definition of expectation we have

$$E(X) = (1\times p) + (2\times(1-p)p) + (3\times(1-p)^2 p) + \cdots = p\sum_{i=1}^{\infty} i(1-p)^{i-1}.$$

This series is a blend of an arithmetic series (the $i$ part) and a geometric series (the $(1-p)^{i-1}$ part). There are several ways to sum it.
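The setup above can be checked numerically. The sketch below (function names are ours, not from the notes) simulates sample points of $\Omega$ by tossing until the first Head, and compares the empirical average against a truncated version of the series $p\sum_{i\ge 1} i(1-p)^{i-1}$:

```python
# A minimal sketch, assuming nothing beyond the standard library:
# estimate the expected number of tosses until the first Head by
# Monte Carlo simulation, and compare with a truncation of the series
#   E(X) = p * sum_{i>=1} i * (1-p)^(i-1).
import random

def tosses_until_head(p, rng):
    """Simulate one sample point of Omega: toss until the first Head."""
    count = 1
    while rng.random() >= p:   # a Tail, which occurs with probability 1 - p
        count += 1
    return count

def series_expectation(p, terms=10_000):
    """Truncated sum  p * sum_{i=1}^{terms} i * (1-p)^(i-1)."""
    return p * sum(i * (1 - p) ** (i - 1) for i in range(1, terms + 1))

p = 0.3
rng = random.Random(0)         # fixed seed so the run is reproducible
trials = 200_000
estimate = sum(tosses_until_head(p, rng) for _ in range(trials)) / trials

print(series_expectation(p))   # the truncated series
print(estimate)                # the simulated average; the two should agree
```

Both numbers come out close to $1/p$, which is where the derivation in these notes is heading.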
Here is one way, using an auxiliary trick (given in the following Theorem) that is often very useful. [Ask your TA about other ways.]

**Theorem 21.1:** Let $X$ be a random variable that takes on only non-negative integer values. Then

$$E(X) = \sum_{i=1}^{\infty}\Pr[X \ge i].$$

*Proof:* For notational convenience, let's write $p_i = \Pr[X = i]$, for $i = 0, 1, 2, \ldots$. From the definition of expectation, we have

$$\begin{aligned}
E(X) &= (0\times p_0) + (1\times p_1) + (2\times p_2) + (3\times p_3) + (4\times p_4) + \cdots \\
     &= p_1 + (p_2 + p_2) + (p_3 + p_3 + p_3) + (p_4 + p_4 + p_4 + p_4) + \cdots \\
     &= (p_1 + p_2 + p_3 + p_4 + \cdots) + (p_2 + p_3 + p_4 + \cdots) + (p_3 + p_4 + \cdots) + (p_4 + \cdots) + \cdots \\
     &= \Pr[X \ge 1] + \Pr[X \ge 2] + \Pr[X \ge 3] + \Pr[X \ge 4] + \cdots
\end{aligned}$$
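Theorem 21.1 can also be sanity-checked numerically for this coin-tossing example. The sketch below (names are ours, not from the notes) uses the fact, which follows directly from the setup, that $\Pr[X \ge i] = (1-p)^{i-1}$, since $X \ge i$ exactly when the first $i-1$ tosses are all Tails; it then compares the definition of $E(X)$ with the tail-sum formula, truncating both sums at a large index:

```python
# A minimal sketch verifying Theorem 21.1 for the geometric
# distribution: sum_i i*Pr[X = i] should equal sum_i Pr[X >= i].
# Truncation error is negligible since (1-p)^N is tiny for large N.

def pmf(i, p):
    """Pr[X = i] = (1-p)^(i-1) * p: i-1 Tails followed by a Head."""
    return (1 - p) ** (i - 1) * p

def tail(i, p):
    """Pr[X >= i] = (1-p)^(i-1): the first i-1 tosses are all Tails."""
    return (1 - p) ** (i - 1)

p, N = 0.25, 2_000
lhs = sum(i * pmf(i, p) for i in range(1, N + 1))   # definition of E(X)
rhs = sum(tail(i, p) for i in range(1, N + 1))      # Theorem 21.1

print(lhs, rhs)   # both close to 1/p = 4.0
```

The agreement of the two sums illustrates the term-rearrangement in the proof: each $p_i$ is counted exactly $i$ times on both sides.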
## This note was uploaded on 09/03/2011 for the course CS 70 taught by Professor Papadimitriou during the Fall '08 term at University of California, Berkeley.

