Probability and Statistics with Reliability, Queuing and Computer Science Applications
Second edition, by K.S. Trivedi
Publisher: John Wiley & Sons
Chapter 5: Conditional Distribution and Expectation
Dept. of Electrical & Computer Engineering, Duke University
Email: kst@ee.duke.edu  URL: www.ee.duke.edu/~kst
Copyright © 2005 by K.S. Trivedi

Slide 2: Dependence among Random Variables
- So far we have assumed that random variables are mutually independent.
- Dependence arises quite commonly in practice.
- We start by studying two random variables that are dependent.
- This will lead us to a family of random variables (a stochastic process) in the next chapter.
- Our primary tool for dealing with dependence is conditioning and the theorem of total probability, with many of its variants.

Slide 3: Four Cases
- We begin by considering four cases that arise with two random variables X and Y:
  - X and Y both discrete (case 1)
  - X and Y both continuous (case 2)
  - X discrete and Y continuous (case 3)
  - Y discrete and X continuous (case 4)
- We will be conditioning on X.

Slide 4: Conditional pmf (Case 1)
- Conditional probability: P(Y = y | X = x) = P(Y = y, X = x) / P(X = x), provided P(X = x) ≠ 0.
- The above works if X is a discrete rv.
- For discrete rv's X and Y, the conditional pmf is p_{Y|X}(y | x) = p(x, y) / p_X(x).
- The above relationship also implies p(x, y) = p_{Y|X}(y | x) p_X(x).
- Hence we have another version of the Theorem of Total Probability (TTP): p_Y(y) = Σ_x p_{Y|X}(y | x) p_X(x).

Slide 5: Independence, Conditional Distribution
- X and Y are independent iff p_{Y|X}(y | x) = p_Y(y) for all x with p_X(x) ≠ 0.
- Conditional distribution function: using the conditional pmf we get F_{Y|X}(y | x) = Σ_{t ≤ y} p_{Y|X}(t | x).

Slide 6: Case 1: X and Y both discrete rv's — Splitting a Poisson Stream
- A Poisson job stream with rate λ (n jobs) is split by a Bernoulli trial: with probability p the next job goes to server A; with probability 1 − p it goes to server B.
- Given n incoming jobs, the number k sent to server A has a binomial pmf: P(Y = k | X = n) = C(n, k) p^k (1 − p)^(n−k).
- Hence, using the theorem of total pmfs, the stream of jobs arriving at server A is itself Poisson, with rate λp.
Slide 7: Another Example of Case 1
- Software Reliability Growth Models (SRGMs).
- Failure data is collected during testing.
- Calibrate a reliability growth model using the failure data; this model is then used for prediction.
- Many SRGMs exist: JM, NHPP, HGRGM (chapters 2 & 3), Littlewood-Verrall, etc.

Slide 8: Randomness in Using Software
- P: I → O, where I is the input space, O is the output space, and P is the program.
- Assume a subset of I is the error space E.
- The sequence of applied inputs, starting from some initial input, takes a random amount of time to reach E.

Slide 9: A Program with a Single Fault
- Let F(t) be the distribution of the time to reach the error space E starting from the initial applied input.
- F(t) is then the distribution of the time to find this bug during testing.
- After t time units of testing, p = F(t) is the probability of having found this bug, and 1 − p = 1 − F(t) is the probability of not having found it.
- Mean number of faults found by time t (mean value function): m(t) = p = F(t).
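The single-fault model on slide 9 can be sketched by picking a concrete form for F(t). The choice below, an exponential distribution with a hypothetical rate of 0.5 faults per time unit, is only an illustrative assumption; the slide itself leaves F(t) general:

```python
import math

def prob_bug_found(t, rate=0.5):
    """F(t) under an assumed exponential time-to-failure distribution:
    the probability that the single bug has been found after t time
    units of testing. This also equals the mean value function m(t),
    since the program contains exactly one fault."""
    return 1.0 - math.exp(-rate * t)

# After 4 time units with the hypothetical rate 0.5 per unit,
# the probability of NOT yet finding the bug is 1 - F(4) = exp(-2).
p = prob_bug_found(4.0)
```

Any other distribution function could be substituted for the exponential without changing the structure: m(t) = F(t) holds for every choice of F in the single-fault case.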
Course offered: Spring '10; Instructor: Mohammad Abdolahi Azgomi, Ph.D.
