EE 5375/7375 Random Processes
October 7, 2003
Homework #5 Solutions

Problem 1. Textbook problem 6.2

A discrete-time random process X_n is defined as follows. A fair coin is tossed. If the outcome is heads, X_n = 1 for all n; if the outcome is tails, X_n = -1 for all n. (a) Sketch some sample paths of the process. (b) Find the PMF of X_n. (c) Find the joint PMF of X_n and X_{n+k}. (d) Find the mean and autocovariance functions of X_n.

(a) If the outcome is heads, then X_n = 1, 1, 1, ... If the outcome is tails, then X_n = -1, -1, -1, ...

(b) Since heads and tails are equally likely, the process is equally likely to be +1 or -1:

  P(X_n = 1) = P(X_n = -1) = 1/2

(c) In either case (heads or tails), X_n and X_{n+k} have the same value:

  P(X_n = 1, X_{n+k} = 1) = P(X_n = -1, X_{n+k} = -1) = 1/2

In neither case do X_n and X_{n+k} take different values:

  P(X_n = 1, X_{n+k} = -1) = P(X_n = -1, X_{n+k} = 1) = 0

(d) Since X_n is +1 or -1 with equal probability, the mean is E[X_n] = (1)(1/2) + (-1)(1/2) = 0. The autocovariance function is

  C(n, n+k) = E[X_n X_{n+k}] - E[X_n] E[X_{n+k}]
            = E[X_n X_{n+k}]
            = (1)(1) P(X_n = 1, X_{n+k} = 1) + (-1)(-1) P(X_n = -1, X_{n+k} = -1)
            = 1/2 + 1/2
            = 1

Problem 2. Textbook problem 6.21

Let S_n denote a binomial counting process. (a) Show that P(S_n = j, S_m = i) ≠ P(S_n = j) P(S_m = i). (b) Find P(S_n = j | S_m = i), where n > m. (c) Show that P(S_n = j | S_m = i, S_l = k) = P(S_n = j | S_m = i), where n > m > l.

(a) We know that S_n has independent increments that follow a binomial distribution. That is, for n > m, the increment S_n - S_m is the sum of n - m Bernoulli trials, and so has the same distribution as S_{n-m}.
Hence,

  P(S_n = j, S_m = i) = P(S_n - S_m = j - i, S_m = i)
                      = P(S_{n-m} = j - i) P(S_m = i)
                      ≠ P(S_n = j) P(S_m = i)

The last step follows because S_{n-m} is the sum of n - m Bernoulli trials, whereas S_n is the sum of n Bernoulli trials.

(b) Given S_m = i, the event S_n = j is the same as the increment S_n - S_m = j - i. This increment is the sum of n - m Bernoulli trials, so it is binomial:

  P(S_n = j | S_m = i) = P(S_n - S_m = j - i)
                       = C(n-m, j-i) p^{j-i} (1-p)^{(n-m)-(j-i)}

(c) We use the property of independent increments:

  P(S_n = j | S_m = i, S_l = k)
    = P(S_n = j, S_m = i, S_l = k) / P(S_m = i, S_l = k)
    = P(S_n - S_m = j - i, S_m - S_l = i - k, S_l = k) / P(S_m - S_l = i - k, S_l = k)
    = [P(S_n - S_m = j - i) P(S_m - S_l = i - k) P(S_l = k)] / [P(S_m - S_l = i - k) P(S_l = k)]
    = P(S_n - S_m = j - i)
    = P(S_n = j | S_m = i)

Problem 3. Textbook problem 6.23

Consider the following moving-average processes:

  Y_n = (1/2)(X_n + X_{n-1}),  X_0 = 0
  Z_n = (2/3) X_n + (1/3) X_{n-1},  X_0 = 0

(a) Flip a coin 10 times to obtain a realization of a Bernoulli random process X_n. Find the resulting realizations of Y_n and Z_n. (b) Repeat part (a) with X_n given by the random step process introduced in...
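Part (a) of Problem 3 asks for a realization obtained by flipping a coin 10 times. The "flips" can just as well be simulated; the sketch below is one way to do it, using Python's standard library. The helper name moving_average and the fixed seed are illustrative choices, not part of the textbook problem, and the convention X_0 = 0 is applied by treating the sample before the first flip as 0.

```python
import random

random.seed(0)  # fix the "coin" so the realization is reproducible

# Ten fair coin flips: heads -> 1, tails -> 0 (a Bernoulli process X_n).
X = [random.randint(0, 1) for _ in range(10)]

def moving_average(X, a, b):
    """Return the realization M_n = a*X_n + b*X_{n-1}, with X_0 = 0."""
    prev = 0          # value before the first flip (the X_0 = 0 convention)
    out = []
    for x in X:
        out.append(a * x + b * prev)
        prev = x
    return out

Y = moving_average(X, 1/2, 1/2)   # Y_n = (X_n + X_{n-1}) / 2
Z = moving_average(X, 2/3, 1/3)   # Z_n = (2/3) X_n + (1/3) X_{n-1}

print("X:", X)
print("Y:", Y)
print("Z:", Z)
```

Since X_n takes values in {0, 1}, every Y_n lies in {0, 1/2, 1} and every Z_n in {0, 1/3, 2/3, 1}; both averages smooth the flips by blending each sample with its predecessor.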
This note was uploaded on 11/29/2009 for the course EE 131A taught by Professor Lorenzelli during the Fall '08 term at UCLA.