10 Some Models of Random Processes

10.1 Introduction
10.2 The Bernoulli Process
10.3 Random Walk
10.4 The Gaussian Process
10.5 Poisson Process
10.6 Markov Processes
10.7 Discrete-Time Markov Chains
10.8 Continuous-Time Markov Chains
10.9 Gambler's Ruin as a Markov Chain
10.10 Chapter Summary
10.11 Problems

10.1 Introduction

In this chapter we consider some well-known random processes: the Bernoulli process, the random walk, the Gaussian process, the Poisson process, and the Markov process.

10.2 The Bernoulli Process

Consider a sequence of independent Bernoulli trials, such as coin tosses, where on each trial the probability of success is p and the probability of failure is 1 − p. Let X_i be the random variable that denotes the outcome of the i-th trial, with X_i = 1 if a success occurs and X_i = 0 if a failure occurs. Then the PMF of X_i is given by

$$
p_X(x) = \begin{cases} p & x = 1 \\ 1 - p & x = 0 \end{cases}
$$

The Bernoulli random variable is used when an experiment has only two outcomes: on/off, yes/no, success/failure, working/broken, hit/miss, early/late, heads/tails, and so on. The sequence of random variables $\{X_i, i = 1, 2, \ldots\}$ resulting from, say, tossing the same coin many times is called a Bernoulli process. In such a process we may be interested in the number of successes in a given number of trials, the number of trials until the first success, or the number of trials until the k-th success.

Let the random variable Y_n be defined as follows:

$$
Y_n = \sum_{i=1}^{n} X_i
$$

Then Y_n denotes the number of successes in n Bernoulli trials, which we know from Section 4.3 of Chapter 4 to be a binomial random variable.
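The definitions above can be illustrated with a short simulation; this is a minimal sketch (not from the text), with the function name and parameter values chosen only for illustration. It generates Bernoulli trials and checks that the average of the success count Y_n over many runs is close to the binomial mean np.

```python
import random

def bernoulli_process(n, p, rng):
    """Simulate n independent Bernoulli(p) trials; X_i = 1 on success, 0 on failure."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
p, n, runs = 0.3, 100, 10_000

# Y_n = sum of the X_i = number of successes in n trials, a Binomial(n, p) variable.
avg_Y = sum(sum(bernoulli_process(n, p, rng)) for _ in range(runs)) / runs

# E[Y_n] = n * p = 30, so the simulated average should land near 30.
print(avg_Y)
```

With these (arbitrary) parameters the printed average should sit within a few tenths of np = 30, consistent with Y_n being binomial.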
That is, the PMF of Y_n is given by

$$
p_{Y_n}(k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \ldots, n
$$

Let L_1 be the random variable that denotes the arrival time of the first success; that is, L_1 is the number of trials up to and including the trial in which the first success occurs. From Chapter 4 we know that L_1 is a geometrically distributed random variable with parameter p; that is, the PMF of L_1 is given by

$$
p_{L_1}(l) = p(1-p)^{l-1}, \qquad l = 1, 2, \ldots
$$

Also, as stated in Chapter 4, L_1 is a random variable that has no memory. That is, if we have observed a fixed number n of Bernoulli trials and they are all failures, the number K of additional trials until the first success has the PMF

$$
p_{K \mid L_1 > n}(k \mid L_1 > n) = P[K = k \mid L_1 > n] = P[L_1 - n = k \mid L_1 > n] = p(1-p)^{k-1} = p_{L_1}(k)
$$

Finally, the number of trials up to and including that in which the k-th success occurs is known to be a k-th-order Pascal random variable X_k, whose PMF is given by

$$
p_{X_k}(n) = \binom{n-1}{k-1} p^k (1-p)^{n-k}, \qquad k = 1, 2, \ldots;\; n = k, k+1, \ldots
$$

10.3 Random Walk

Example 10.1 Consider a sequence of independent tosses of a coin with proba...
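The memoryless property can also be checked empirically. The sketch below (an illustration, not from the text; parameter values are arbitrary) samples the first-success time L_1 many times and compares the conditional probability P[L_1 − n = 1 | L_1 > n] against the unconditional P[L_1 = 1] = p; memorylessness says the two should agree.

```python
import random

def first_success_time(p, rng):
    """Number of trials up to and including the first success (geometric with parameter p)."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

rng = random.Random(7)
p, n, runs = 0.25, 3, 200_000
samples = [first_success_time(p, rng) for _ in range(runs)]

# Among runs where the first n trials were all failures (L_1 > n), the
# remaining wait L_1 - n should be distributed like a fresh L_1.
tail = [s - n for s in samples if s > n]
cond = sum(1 for k in tail if k == 1) / len(tail)      # estimates P[L_1 - n = 1 | L_1 > n]
uncond = sum(1 for s in samples if s == 1) / runs      # estimates P[L_1 = 1] = p

print(cond, uncond)
```

Both estimates should be close to p = 0.25, matching the PMF identity derived above.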
This note was uploaded on 01/05/2010 for the course STAT 350 taught by Professor Carlton during the Fall '07 term at Cal Poly.