06 Probability Distributions

CS-350: Fundamentals of Computing Systems. Lecture Notes © Azer Bestavros. All rights reserved. Reproduction or copying (electronic or otherwise) is expressly forbidden except for students enrolled in CS-350.

Probability Distributions as Modeling Tools

Recall that a probability distribution characterizes a random variable by enabling us to calculate the probability that the random variable assumes a particular value or range of values. Probability distributions are thus good "modeling" tools: they let us abstract away much of the detail underlying the process responsible for instantiating the value of the random variable. For example, rather than modeling the various processes that determine whether a memory access will hit or miss in the cache, we may simply abstract those processes with a simple probability distribution (e.g., the probability that an access is served from level Li of a cache hierarchy).

We are now ready to learn about a number of probability distributions that we encounter frequently when modeling computing systems or when evaluating their performance.

Important Probability Distributions

The Uniform Distribution for continuous random variables

A continuous random variable is said to follow a uniform distribution if it has an equal probability of assuming any value in a range between a lower bound (say a) and an upper bound (say b). Thus, for a uniformly distributed random variable x, the density function f(x) is given by:

    f(x) = c,   where a ≤ x ≤ b

For this to be a density function, it must satisfy the condition

    ∫_{-∞}^{+∞} f(x) dx = 1

which leads to

    c = 1 / (b - a)

It is easy to show that for a uniformly distributed random variable x between a and b, the expected value (i.e., the mean) is (a + b)/2 and the variance is (b - a)^2 / 12.
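As a quick check on these formulas, the short Python sketch below (not part of the original notes) draws samples uniformly from [a, b] and compares the sample mean and variance against (a + b)/2 and (b - a)^2/12. The bounds a = 2 and b = 10 and the sample size are arbitrary illustrative values.

    # Minimal sketch: empirically checking the mean and variance of a uniform
    # random variable on [a, b]. The values of a, b, and n are illustrative only.
    import random

    a, b = 2.0, 10.0
    n = 100_000
    samples = [random.uniform(a, b) for _ in range(n)]

    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n

    print(f"sample mean     = {mean:.3f}   vs. (a+b)/2     = {(a + b) / 2:.3f}")
    print(f"sample variance = {variance:.3f}   vs. (b-a)^2/12 = {(b - a) ** 2 / 12:.3f}")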
Bernoulli Random Variable

Consider an experiment that has two outcomes (e.g., heads or tails, success or failure, packet lost or packet received, true or false, 0 or 1, etc.). Assume that one of these outcomes is deemed a success (e.g., packet received) and the other is deemed a failure. The outcome of such an experiment is clearly a random variable; call it x. Let x = 0 denote failure and x = 1 denote success, and let p denote the probability of failure. The random variable x is called a Bernoulli random variable, with the simple probability mass function (PMF):

    f(x) = p        for x = 0
    f(x) = 1 - p    for x = 1
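To make the PMF concrete, here is a minimal Python sketch (not from the original notes) that simulates a Bernoulli random variable with failure probability p and checks the empirical frequencies of x = 0 and x = 1. The value p = 0.3 is an arbitrary illustrative choice.

    # Minimal sketch: simulating a Bernoulli random variable where x = 0 (failure)
    # occurs with probability p and x = 1 (success) with probability 1 - p.
    import random

    p = 0.3          # probability of failure, as defined in the notes
    n = 100_000
    outcomes = [0 if random.random() < p else 1 for _ in range(n)]

    print(f"empirical P(x = 0) = {outcomes.count(0) / n:.3f}   (expected p     = {p})")
    print(f"empirical P(x = 1) = {outcomes.count(1) / n:.3f}   (expected 1 - p = {1 - p:.1f})")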
The Geometric Distribution for discrete random variables

Consider a "Bernoulli trial" experiment that has two outcomes (e.g., heads or tails, success or failure, packet lost or packet received, true or false, 0 or 1, etc.). Assume that we conduct a certain number of these Bernoulli trials and that each trial is independent. Let n …
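The text breaks off above mid-definition. Purely as a hedged illustration of the setup it introduces (repeated independent Bernoulli trials), the sketch below simulates the number of trials up to and including the first success, which is the quantity a geometric random variable counts under one common convention. The success probability q = 0.25 and the helper name geometric_draw are assumptions made for illustration, not taken from the notes.

    # Hedged sketch: drawing a geometric random variable by running independent
    # Bernoulli trials until the first success. The success probability q is an
    # arbitrary illustrative value (the notes' own definition is cut off above).
    import random

    def geometric_draw(q: float) -> int:
        """Number of Bernoulli trials up to and including the first success."""
        trials = 1
        while random.random() >= q:   # each loop iteration is one independent trial
            trials += 1
        return trials

    q = 0.25
    draws = [geometric_draw(q) for _ in range(100_000)]
    print(f"empirical mean = {sum(draws) / len(draws):.2f}   (1/q = {1 / q:.2f})")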