CS 702 Discrete Mathematics and Probability Theory
Spring 2009, Alistair Sinclair and David Tse, Note 14

Some Important Distributions

Question: A biased coin with Heads probability $p$ is tossed repeatedly until the first Head appears. What is the expected number of tosses?

As always, our first step in answering the question must be to define the sample space $\Omega$. A moment's thought tells us that $\Omega = \{H, TH, TTH, TTTH, \ldots\}$, i.e., $\Omega$ consists of all sequences over the alphabet $\{H, T\}$ that end with $H$ and contain no other $H$'s. This is our first example of an infinite sample space (though it is still discrete).

What is the probability of a sample point, say $\omega = TTH$? Since successive coin tosses are independent (this is implicit in the statement of the problem), we have $\Pr[TTH] = (1-p)\cdot(1-p)\cdot p = (1-p)^2 p$. And generally, for any sequence $\omega \in \Omega$ of length $i$, we have $\Pr[\omega] = (1-p)^{i-1}p$. To be sure everything is consistent, we should check that the probabilities of all the sample points add up to 1. Since there is exactly one sequence of each length $i \ge 1$ in $\Omega$, we have
$$\sum_{\omega \in \Omega} \Pr[\omega] \;=\; \sum_{i=1}^{\infty}(1-p)^{i-1}p \;=\; p\sum_{i=0}^{\infty}(1-p)^{i} \;=\; p \cdot \frac{1}{1-(1-p)} \;=\; 1,$$
as expected. [In the second-last step here, we used the formula for summing a geometric series.]

Now let the random variable $X$ denote the number of tosses in our sequence (i.e., $X(\omega)$ is the length of $\omega$). Our goal is to compute $E(X)$. Despite the fact that $X$ counts something, there's no obvious way to write it as a sum of simple r.v.'s as we did in many examples in an earlier lecture note. (Try it!) In a later lecture, we will give a slick way to do this calculation. For now, let's just dive in and try a direct computation of $E(X)$. Note that the distribution of $X$ is quite simple: $\Pr[X = i] = (1-p)^{i-1}p$ for $i = 1, 2, 3, \ldots$. So from the definition of expectation we have
$$E(X) \;=\; (1\cdot p) + \bigl(2\cdot(1-p)p\bigr) + \bigl(3\cdot(1-p)^2 p\bigr) + \cdots \;=\; p\sum_{i=1}^{\infty} i(1-p)^{i-1}.$$
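As a sanity check on this setup (not part of the original note), we can verify numerically that the probabilities $(1-p)^{i-1}p$ sum to 1, and compare the (truncated) series for $E(X)$ against a Monte Carlo simulation of the coin-tossing experiment. The value $p = 0.3$ and the truncation point are arbitrary choices for illustration.

```python
import random

def geometric_pmf(i, p):
    """Pr[X = i]: i - 1 Tails followed by the first Head on toss i."""
    return (1 - p) ** (i - 1) * p

def sample_tosses(p, rng):
    """Toss a p-biased coin until the first Head; return the number of tosses."""
    tosses = 1
    while rng.random() >= p:  # each toss is Heads with probability p
        tosses += 1
    return tosses

p = 0.3

# The probabilities of all sample points should add up to 1.
# (We truncate the infinite series; the remaining tail is negligible.)
total = sum(geometric_pmf(i, p) for i in range(1, 1000))
print(f"sum of Pr[X = i]      ~ {total:.6f}")

# Direct (truncated) computation of E(X) = p * sum_i i (1-p)^(i-1).
expectation = sum(i * geometric_pmf(i, p) for i in range(1, 1000))
print(f"series value of E(X)  ~ {expectation:.4f}")

# Monte Carlo estimate of E(X) for comparison.
rng = random.Random(0)
n = 100_000
estimate = sum(sample_tosses(p, rng) for _ in range(n)) / n
print(f"simulated mean of X   ~ {estimate:.4f}")
```

The series value and the simulated mean should agree closely, which gives some confidence in the distribution before we sum the series exactly.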
This series is a blend of an arithmetic series (the $i$ part) and a geometric series (the $(1-p)^{i-1}$ part). There are several ways to sum it. Here is one way, using an auxiliary trick (given in the following Theorem) that is often very useful. [Ask your TA about other ways.]

Theorem 14.1: Let $X$ be a random variable that takes on only non-negative integer values. Then
$$E(X) = \sum_{i=1}^{\infty}\Pr[X \ge i].$$

Proof: For notational convenience, let's write $p_i = \Pr[X = i]$, for $i = 0, 1, 2, \ldots$. From the definition of expectation, we have
$$\begin{aligned}
E(X) &= (0\cdot p_0) + (1\cdot p_1) + (2\cdot p_2) + (3\cdot p_3) + (4\cdot p_4) + \cdots \\
     &= p_1 + (p_2 + p_2) + (p_3 + p_3 + p_3) + (p_4 + p_4 + p_4 + p_4) + \cdots \\
     &= (p_1 + p_2 + p_3 + p_4 + \cdots) + (p_2 + p_3 + p_4 + \cdots) + (p_3 + p_4 + \cdots) + (p_4 + \cdots) + \cdots \\
     &= \Pr[X \ge 1] + \Pr[X \ge 2] + \Pr[X \ge 3] + \Pr[X \ge 4] + \cdots
\end{aligned}$$
(In the third line we have regrouped the terms into columns; the rearrangement is valid because all the terms are non-negative.) $\Box$
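The theorem can be sanity-checked numerically for our geometric $X$ (this check is not part of the original note). Here $\Pr[X \ge i] = (1-p)^{i-1}$, since $X \ge i$ exactly when the first $i-1$ tosses all come up Tails, so the tail sums form a pure geometric series. The sketch below compares the expectation computed from the definition with the expectation computed from the tail sums, truncating both series at the same (arbitrarily chosen) point.

```python
def pmf(i, p):
    """Pr[X = i] for the geometric distribution."""
    return (1 - p) ** (i - 1) * p

def tail_prob(i, p):
    """Pr[X >= i]: the first i - 1 tosses are all Tails."""
    return (1 - p) ** (i - 1)

p = 0.3
N = 2000  # truncation point; the neglected tails are vanishingly small here

# E(X) from the definition: sum_i i * Pr[X = i].
by_definition = sum(i * pmf(i, p) for i in range(1, N))

# E(X) via Theorem 14.1: sum_i Pr[X >= i].
by_tail_sums = sum(tail_prob(i, p) for i in range(1, N))

print(f"by definition: {by_definition:.6f}")
print(f"by tail sums:  {by_tail_sums:.6f}")
print(f"1/p:           {1 / p:.6f}")
```

Both computations agree, and both match $1/p$: the tail sums $\sum_{i\ge 1}(1-p)^{i-1}$ form a geometric series summing to $1/p$, which is exactly the payoff the theorem delivers for this problem.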