CS 70 Discrete Mathematics and Probability Theory
Fall 2010  Tse/Wagner  Soln 11

1. (10 pts.) Machine Failures

Two faulty machines, M1 and M2, are repeatedly run synchronously in parallel (i.e., both machines execute one run, then both execute a second run, and so on). On each run, M1 fails with probability p1 and M2 fails with probability p2, all failure events being independent. Let the random variable X1 denote the number of runs until the first failure of M1, and X2 the number of runs until the first failure of M2. Let X = min(X1, X2) denote the number of runs until the first failure of either machine. Compute the distribution of X. What is its expectation?

Answer 1: Let's compute the probability that neither machine fails on any particular run. Since failures of the two machines are independent,

    Pr[M1 doesn't fail ∧ M2 doesn't fail] = Pr[M1 doesn't fail] × Pr[M2 doesn't fail] = (1 - p1)(1 - p2).

Therefore, the probability that at least one machine fails on any particular run is

    Pr[either M1 or M2 fails (or both)] = 1 - (1 - p1)(1 - p2) = p1 + p2 - p1p2.

We repeatedly perform runs until one of the machines fails. Since failures at different runs are independent events, the number of runs until one of the machines fails has a geometric distribution with parameter p1 + p2 - p1p2:

    X ~ Geom(p1 + p2 - p1p2).

By the formula in Lecture Note 15,

    E(X) = 1 / (p1 + p2 - p1p2).

Alternatively, we could compute the probability that at least one machine fails on any particular run using inclusion-exclusion and independence:

    Pr[either M1 or M2 fails (or both)] = Pr[M1 fails] + Pr[M2 fails] - Pr[M1 fails ∧ M2 fails]
                                        = p1 + p2 - Pr[M1 fails] × Pr[M2 fails]
                                        = p1 + p2 - p1p2.

The rest is as above.

Answer 2: We have X1 ~ Geom(p1) and X2 ~ Geom(p2). Also, X1 and X2 are independent random variables.
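As a quick sanity check on Answer 1 (a sketch added here, not part of the original solution), the following Python snippet simulates the two machines run in parallel and compares the empirical mean of X against 1/(p1 + p2 - p1p2). The function names and the sample parameters p1 = 0.3, p2 = 0.2 are illustrative choices, not from the problem.

```python
import random

def runs_until_first_failure(p1, p2, rng):
    """Simulate synchronized runs of M1 and M2; return the (1-based)
    index of the first run on which at least one machine fails."""
    k = 1
    while True:
        # Each run, M1 fails with prob. p1 and M2 with prob. p2, independently.
        if rng.random() < p1 or rng.random() < p2:
            return k
        k += 1

def estimate_mean(p1, p2, trials=200_000, seed=0):
    """Monte Carlo estimate of E(X) = E[min(X1, X2)]."""
    rng = random.Random(seed)
    total = sum(runs_until_first_failure(p1, p2, rng) for _ in range(trials))
    return total / trials

p1, p2 = 0.3, 0.2
p = p1 + p2 - p1 * p2            # parameter of the min: 0.44
print(estimate_mean(p1, p2))     # should be close to 1/0.44 ≈ 2.27
```

With 200,000 trials the standard error is well under 0.01, so the estimate should land near the theoretical value 1/(p1 + p2 - p1p2).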
We also use the following definition of the minimum:

    min(x, y) = x if x ≤ y;  y if x > y.

Now, for all k ∈ {1, 2, ...}, the event min(X1, X2) = k is equivalent to

    (X1 = k) ∧ (X2 ≥ k)   or   (X2 = k) ∧ (X1 > k).

Hence,

    Pr[X = k] = Pr[min(X1, X2) = k]
              = Pr[(X1 = k) ∧ (X2 ≥ k)] + Pr[(X2 = k) ∧ (X1 > k)]
              = Pr[X1 = k] · Pr[X2 ≥ k] + Pr[X2 = k] · Pr[X1 > k]                         (since X1, X2 are independent)
              = [(1 - p1)^(k-1) p1] (1 - p2)^(k-1) + [(1 - p2)^(k-1) p2] (1 - p1)^k       (since X1, X2 are geometric)
              = ((1 - p1)(1 - p2))^(k-1) (p1 + p2(1 - p1))
              = (1 - p1 - p2 + p1p2)^(k-1) (p1 + p2 - p1p2).

But this final expression is precisely the probability that a geometric random variable with parameter p1 + p2 - p1p2 takes the value k. Hence X ~ Geom(p1 + p2 - p1p2), and

    E(X) = 1 / (p1 + p2 - p1p2).
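The algebra in Answer 2 can be verified numerically without any simulation. This short Python sketch (an addition for illustration, with hypothetical helper names and sample parameters of my choosing) evaluates the case-split expression for Pr[min(X1, X2) = k] and checks that it matches the Geom(p1 + p2 - p1p2) probability mass function term by term.

```python
def pmf_min_case_split(p1, p2, k):
    """Pr[min(X1, X2) = k] via the case split used in Answer 2:
    (X1 = k and X2 >= k)  or  (X2 = k and X1 > k)."""
    q1, q2 = 1 - p1, 1 - p2
    term1 = q1 ** (k - 1) * p1 * q2 ** (k - 1)   # Pr[X1 = k] * Pr[X2 >= k]
    term2 = q2 ** (k - 1) * p2 * q1 ** k         # Pr[X2 = k] * Pr[X1 > k]
    return term1 + term2

def pmf_geometric(p, k):
    """Pr[Y = k] for Y ~ Geom(p)."""
    return (1 - p) ** (k - 1) * p

p1, p2 = 0.3, 0.2
p = p1 + p2 - p1 * p2
# The two expressions should agree for every k.
for k in range(1, 10):
    assert abs(pmf_min_case_split(p1, p2, k) - pmf_geometric(p, k)) < 1e-12
print("case-split pmf matches Geom(p1 + p2 - p1p2)")
```

This mirrors the derivation exactly: term1 and term2 are the two disjoint events, and their sum collapses to the geometric pmf with parameter p1 + p2 - p1p2.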
This note was uploaded on 12/08/2010 for the course CS 70 taught by Professor Papadimitrou during the Fall '08 term at Berkeley.
