Lect5

... chain. Since it consists of L discrete-valued random variables, each of which can take on one of M values, there are M^L possible states. Scalar and vector Markov chains can be represented with a state diagram, which illustrates how states and their associated probabilities transition through time.

Example 2: Consider the vector Markov chain of Example 1, and let L = 3 and M = 2 (i.e. binary). Denote the M = 2 possible values as {0, 1}, and denote their probabilities as P(0) and P(1) respectively. There are M^L = 8 possible values of the state S_n. These are

    {S_m ; m = 0, 1, 2, ..., 7} = {(000), (001), (010), (011), (100), (101), (110), (111)} .    (8)

The state diagram for this example is shown in Figure 1(a). Arrows represent transitions from a state at any time n to the next state at time n+1. Note that because of the delay-line structure of the vector, only certain states can follow any given state. In the state diagram, each transition is labeled with the probability of that transition to the next state, given the present state. A trellis diagram of this same Markov chain is shown in Figure 1(b). For this illustration it is assumed that the initial state is S_0.

[Figure 1: (a) state diagram of the L = 3, M = 2 vector Markov chain, with transitions labeled P(0) and P(1); (b) trellis diagram starting from state S_0 at n = 0, with states (000) through (111).]
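As an illustration of the delay-line structure described above, the following is a minimal Python sketch (not part of the original notes) that enumerates the M^L = 8 states of Example 2 and builds the allowed transitions. The symbol probabilities P, the i.i.d. assumption on the input symbols, the function name successors, and the convention that the new symbol is shifted in on the right are all assumptions made for this example.

    from itertools import product

    M, L = 2, 3
    P = {0: 0.5, 1: 0.5}   # assumed symbol probabilities P(0), P(1)

    # Enumerate the M**L = 8 states as L-tuples: (0,0,0), (0,0,1), ..., (1,1,1).
    states = list(product(range(M), repeat=L))

    def successors(state):
        """Because of the delay-line (shift-register) structure, the next
        state is formed by dropping the oldest symbol and shifting in a new
        one, so only M of the M**L states can follow any given state."""
        return {new: state[1:] + (new,) for new in range(M)}

    # Transition probabilities Pr{S_{n+1} = s' | S_n = s}: the probability of
    # each branch is the probability of the new symbol that is shifted in.
    trans = {}
    for s in states:
        for new_symbol, s_next in successors(s).items():
            trans[(s, s_next)] = P[new_symbol]

    # Each state has exactly M outgoing branches, and each row of the
    # (sparse) transition matrix sums to one, matching the state/trellis
    # diagram of Figure 1.
    for s in states:
        row = {t: trans[(s, t)] for t in states if (s, t) in trans}
        assert abs(sum(row.values()) - 1.0) < 1e-12
        print(s, "->", row)

Printing the rows makes the trellis structure visible: for example, state (0, 1, 1) can only be followed by (1, 1, 0) or (1, 1, 1), with probabilities P(0) and P(1) respectively.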