# ECE 6010 Lecture 10 – Markov Processes


## Basic concepts

A Markov process $\{X_t\}$ is one such that

$$P(X_{t_{k+1}} = x_{k+1} \mid X_{t_k} = x_k, X_{t_{k-1}} = x_{k-1}, \ldots, X_{t_1} = x_1) = P(X_{t_{k+1}} = x_{k+1} \mid X_{t_k} = x_k)$$

(for a discrete random process), or

$$f(x_{t_{k+1}} \mid X_{t_k} = x_k, \ldots, X_{t_1} = x_1) = f(x_{t_{k+1}} \mid X_{t_k} = x_k)$$

(for a continuous random process). The most recent observation determines the state of the process; once the state is known, prior observations have no bearing on the outcome.

**Example 1** Let the $X_i$ be i.i.d. and let $S_n = X_1 + \cdots + X_n = S_{n-1} + X_n$. Then

$$P[S_{n+1} = s_{n+1} \mid S_n = s_n, \ldots, S_1 = s_1] = P[X_{n+1} = s_{n+1} - s_n] = P[S_{n+1} = s_{n+1} \mid S_n = s_n].$$

**Example 2** Let $N(t)$ be a Poisson process. Then

$$P(N(t_{k+1}) = n_{k+1} \mid N(t_k) = n_k, \ldots, N(t_1) = n_1) = P[\,n_{k+1} - n_k \text{ events in } t_{k+1} - t_k\,] = P[N(t_{k+1}) = n_{k+1} \mid N(t_k) = n_k].$$

Let $\{X_t\}$ be a Markov random process. The joint probability has the following factorization:

$$P(X_{t_3} = x_3, X_{t_2} = x_2, X_{t_1} = x_1) = P(X_{t_3} = x_3 \mid X_{t_2} = x_2)\, P(X_{t_2} = x_2 \mid X_{t_1} = x_1)\, P(X_{t_1} = x_1).$$

(Why?)

## Discrete-time Markov chains

**Definition 1** An integer-valued Markov random process is called a **Markov chain**.

Let the time-index set be the set of integers, and let $p_j(0) = P[X_0 = j]$ be the initial probabilities. (Note that $\sum_j p_j(0) = 1$.) We can write the factorization as

$$P[X_n = i_n, \ldots, X_0 = i_0] = P[X_n = i_n \mid X_{n-1} = i_{n-1}] \cdots P[X_1 = i_1 \mid X_0 = i_0]\, P[X_0 = i_0].$$

If the probability $P[X_{n+1} = j \mid X_n = i]$ does not change with $n$, the process $X_n$ is said to have **homogeneous transition probabilities**. We will assume that this is the case and write

$$p_{ij} = P[X_{n+1} = j \mid X_n = i].$$

Note that $\sum_j P[X_{n+1} = j \mid X_n = i] = 1$; that is, $\sum_j p_{ij} = 1$. We can represent these transition probabilities in matrix form:

$$P = \begin{bmatrix} p_{00} & p_{01} & p_{02} & \cdots \\ p_{10} & p_{11} & p_{12} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}$$
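The factorization above also tells us how to simulate a chain: draw $X_0$ from the initial probabilities $p(0)$, then draw each $X_{n+1}$ from row $X_n$ of the transition matrix. A minimal sketch in Python (the 3-state matrix `P`, the initial distribution, and the seed are illustrative, not from the lecture):

```python
import random

# Illustrative 3-state stochastic matrix: row i holds p_ij = P[X_{n+1}=j | X_n=i].
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

def sample_path(P, p0, n, rng):
    """Sample X_0, ..., X_n: X_0 ~ p(0), then X_{k+1} drawn from row X_k of P,
    exactly as in the factorization of the joint probability."""
    x = rng.choices(range(len(p0)), weights=p0)[0]       # X_0 from initial probs
    path = [x]
    for _ in range(n):
        x = rng.choices(range(len(P)), weights=P[x])[0]  # X_{k+1} ~ row X_k of P
        path.append(x)
    return path

# Sanity check: each row of P must sum to 1 (stochastic matrix).
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

rng = random.Random(0)
path = sample_path(P, [1.0, 0.0, 0.0], 10, rng)
```

Note that only the current state `x` is carried between iterations; the Markov property means the simulator never needs the earlier history.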

ECE 6010: Lecture 10 – Markov Processes 2 The rows of P sum to 1 . This is called a stochastic matrix . Frequently discrete-time Markov chains are modeled with state diagrams. Example 3 Two light bulbs are held in reserve. After a day, the probability that we need a light bulb is p . Let Y n be the number of new light bulbs at the end of day n . P = 1 0 0 p 1 - p 0 0 p 1 - p Draw the diagram. 2 Now let us look further ahead. Let p ij ( n ) = P [ X n + k = j | x k = i ] If the r.p. is homogeneous, then p ij ( n ) = P [ X k = j | X i = i ] . Let us develop a formula for the case that n = 2 . P [ X 2 = j, X 1 = l | x 0 = i ] = P [ X 2 = j, X 1 = l, X 0 = i ] P [ X 0 = i ] = P [ X 2 = j | X 1 = l ] P [ X 1 = l | X 0 = i ] P [ X 0 = i ] P [ X 0 = i ] = p il (1) p lj (1) = p il p lj Now marginalize: P [ X 2 = j | X 0 = i ] = X l P [ X 2 = j, X 1 = l | X 0 = i ] = X l p il P lj . Let P (2) be the matrix of two-step transition probabilities. Then we have P (2) = P (1) P (1) = P 2 . In general (by induction) we have P ( n ) = P n . Let p ( n ) = P ( X n = 0) P ( X n = 1) . . . (or whatever the outcomes are). Then p j ( n ) = P ( X n = j ) = X i P ( X n = j | X n - 1 = i ) P ( X n - 1 = i ) = p ij p i ( n - 1) . Stacking these up, we obtain the equation p ( n ) = p ( n - 1) P.
