CS 70-2 Discrete Mathematics and Probability Theory
Spring 2009  Alistair Sinclair, David Tse  Note 14

Some Important Distributions

Question: A biased coin with Heads probability $p$ is tossed repeatedly until the first Head appears. What is the expected number of tosses?

As always, our first step in answering the question must be to define the sample space $\Omega$. A moment's thought tells us that

$$\Omega = \{H, TH, TTH, TTTH, \ldots\},$$

i.e., $\Omega$ consists of all sequences over the alphabet $\{H, T\}$ that end with $H$ and contain no other $H$'s. This is our first example of an infinite sample space (though it is still discrete).

What is the probability of a sample point, say $\omega = TTH$? Since successive coin tosses are independent (this is implicit in the statement of the problem), we have

$$\Pr[TTH] = (1-p) \times (1-p) \times p = (1-p)^2 p.$$

And generally, for any sequence $\omega \in \Omega$ of length $i$, we have $\Pr[\omega] = (1-p)^{i-1} p$. To be sure everything is consistent, we should check that the probabilities of all the sample points add up to 1. Since there is exactly one sequence of each length $i \ge 1$ in $\Omega$, we have

$$\sum_{\omega \in \Omega} \Pr[\omega] = \sum_{i=1}^{\infty} (1-p)^{i-1} p = p \sum_{i=0}^{\infty} (1-p)^i = p \times \frac{1}{1-(1-p)} = 1,$$

as expected. [In the second-last step here, we used the formula for summing a geometric series.]

Now let the random variable $X$ denote the number of tosses in our sequence (i.e., $X(\omega)$ is the length of $\omega$). Our goal is to compute $E(X)$. Despite the fact that $X$ counts something, there's no obvious way to write it as a sum of simple r.v.'s as we did in many examples in an earlier lecture note. (Try it!) In a later lecture, we will give a slick way to do this calculation. For now, let's just dive in and try a direct computation of $E(X)$. Note that the distribution of $X$ is quite simple:

$$\Pr[X = i] = (1-p)^{i-1} p \qquad \text{for } i = 1, 2, 3, \ldots$$

So from the definition of expectation we have

$$E(X) = (1 \times p) + (2 \times (1-p)p) + (3 \times (1-p)^2 p) + \cdots = p \sum_{i=1}^{\infty} i (1-p)^{i-1}.$$

This series is a blend of an arithmetic series (the $i$ part) and a geometric series (the $(1-p)^{i-1}$ part). There are several ways to sum it. Here is one way, using an auxiliary trick (given in the following Theorem) that is often very useful. [Ask your TA about other ways.]

Theorem 14.1: Let $X$ be a random variable that takes on only non-negative integer values. Then

$$E(X) = \sum_{i=1}^{\infty} \Pr[X \ge i].$$

Proof: For notational convenience, let's write $p_i = \Pr[X = i]$, for $i = 0, 1, 2, \ldots$. From the definition of expectation, we have

$$\begin{aligned}
E(X) &= (0 \times p_0) + (1 \times p_1) + (2 \times p_2) + (3 \times p_3) + (4 \times p_4) + \cdots \\
&= p_1 + (p_2 + p_2) + (p_3 + p_3 + p_3) + (p_4 + p_4 + p_4 + p_4) + \cdots \\
&= (p_1 + p_2 + p_3 + p_4 + \cdots) + (p_2 + p_3 + p_4 + \cdots) + (p_3 + p_4 + \cdots) + (p_4 + \cdots) + \cdots \\
&= \Pr[X \ge 1] + \Pr[X \ge 2] + \Pr[X \ge 3] + \Pr[X \ge 4] + \cdots \\
&= \sum_{i=1}^{\infty} \Pr[X \ge i],
\end{aligned}$$

which is exactly the claimed identity.
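To make the setup concrete, here is a minimal Python sketch (not part of the original note; the value $p = 0.3$, the trial count, and the truncation length are arbitrary illustration choices). It simulates tossing the biased coin until the first Head to estimate $E(X)$ empirically, and numerically confirms that the sample-point probabilities $(1-p)^{i-1} p$ sum to 1.

```python
import random

def tosses_until_first_head(p: float) -> int:
    """Toss a coin with Heads probability p until the first Head; return the count."""
    count = 1
    while random.random() >= p:  # Tails with probability 1 - p
        count += 1
    return count

p = 0.3
trials = 100_000
estimate = sum(tosses_until_first_head(p) for _ in range(trials)) / trials
print(f"empirical E(X) ~ {estimate:.3f}")

# Consistency check from the note: the probabilities of all sample points add up to 1.
total = sum((1 - p) ** (i - 1) * p for i in range(1, 201))
print(f"sum of Pr[omega] up to length 200 ~ {total:.6f}")  # ~ 1.0
```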
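The direct series for $E(X)$ can also be checked numerically before it is summed analytically. A quick sketch (again with the arbitrary choice $p = 0.3$ and an arbitrary cutoff): the partial sums of $p \sum_i i(1-p)^{i-1}$ settle at $1/p$, the standard closed form for the mean of this distribution, which the note's calculation is building toward.

```python
# Partial sums of E(X) = p * sum_{i >= 1} i * (1 - p)^(i - 1).
# p = 0.3 and the cutoff N = 200 are arbitrary illustration choices.
p, N = 0.3, 200
series = p * sum(i * (1 - p) ** (i - 1) for i in range(1, N + 1))
print(f"p * partial sum = {series:.6f}, 1/p = {1 / p:.6f}")  # both ~ 3.333333
```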
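Theorem 14.1 itself is easy to sanity-check numerically on the coin-toss distribution. Here $\Pr[X \ge i] = (1-p)^{i-1}$, since $X \ge i$ means the first $i-1$ tosses were all Tails; a sketch with the same arbitrary parameter and truncation compares the two sides of the identity.

```python
# Check E(X) = sum_{i >= 1} Pr[X >= i] for the coin-toss distribution,
# where Pr[X = i] = (1 - p)^(i - 1) * p and Pr[X >= i] = (1 - p)^(i - 1).
p, N = 0.3, 200  # arbitrary parameter and truncation point
lhs = sum(i * (1 - p) ** (i - 1) * p for i in range(1, N + 1))  # definition of E(X)
rhs = sum((1 - p) ** (i - 1) for i in range(1, N + 1))          # tail-sum formula
print(f"lhs = {lhs:.6f}, rhs = {rhs:.6f}")  # agree (both ~ 1/p)
```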