
# 17.4.3 - Transient, Recurrent, and Closed Sets of States in a Markov Chain


**17.4.3** Consider the following transition matrix:

$$
P = \begin{pmatrix}
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 1 & 0 \\
\frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{2} & 0 & 0 \\
1 & 0 & 0 & 0 & 0 & 0 \\
0 & \frac{1}{3} & 0 & 0 & 0 & \frac{2}{3}
\end{pmatrix}
$$

**(a)** Which states are transient?

A state i is transient if there exists a state j that is reachable from i while i is not reachable from j. Draw a graphical representation of the transition matrix. By inspection, only State 4 is transient: it can reach every other state, but no other state can reach it.
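The reachability argument in part (a) can be checked mechanically. A minimal sketch (plain Python, no external libraries) encodes P, takes the transitive closure of the one-step "can reach" relation, and flags a state as transient exactly when it can reach some state that cannot reach it back:

```python
# Transition matrix from the problem; row i holds the transition
# probabilities out of state i+1 (states are numbered 1-6 in the text).
P = [
    [0,     0,     1, 0,     0, 0    ],
    [0,     0,     0, 0,     0, 1    ],
    [0,     0,     0, 0,     1, 0    ],
    [1 / 4, 1 / 4, 0, 1 / 2, 0, 0    ],
    [1,     0,     0, 0,     0, 0    ],
    [0,     1 / 3, 0, 0,     0, 2 / 3],
]
n = len(P)

# reach[i][j]: state j is reachable from state i in one or more steps.
# Start from the one-step edges and take the transitive closure
# (Floyd-Warshall over booleans).
reach = [[P[i][j] > 0 for j in range(n)] for i in range(n)]
for k in range(n):
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

# State i is transient if some j is reachable from i but i is not
# reachable from j; every other state is recurrent.
transient = {i + 1 for i in range(n)
             if any(reach[i][j] and not reach[j][i] for j in range(n))}
recurrent = set(range(1, n + 1)) - transient
print(transient, recurrent)  # {4} {1, 2, 3, 5, 6}
```

The same loop also yields the recurrent states of part (b) as the complement of the transient set.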

**(b)** Which states are recurrent?

A state that is not transient is recurrent. Since State 4 is the only transient state, States 1, 2, 3, 5, and 6 must be recurrent.

**(c)** Identify all closed sets of states.

A set of states S in a Markov chain is a closed set if no state outside of S is reachable from any state in S. Based on the graphical representation, the following sets of states constitute closed sets: {1, 3, 5} and {2, 6}.

**(d)** Is this chain ergodic?

A chain is ergodic if all of its states are recurrent, aperiodic, and communicate with one another. Since State 4 is transient, not all states in the chain are recurrent, so the chain is not ergodic. (The states also fail to communicate: no state in {1, 3, 5} can reach a state in {2, 6}, or vice versa.)
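Parts (c) and (d) can be verified the same way. The self-contained sketch below (it recomputes reachability rather than reusing the earlier snippet) groups states into communicating classes and keeps those that no transition leaves; since ergodicity requires, among other things, a single communicating class, more than one class already rules it out:

```python
P = [
    [0,     0,     1, 0,     0, 0    ],
    [0,     0,     0, 0,     0, 1    ],
    [0,     0,     0, 0,     1, 0    ],
    [1 / 4, 1 / 4, 0, 1 / 2, 0, 0    ],
    [1,     0,     0, 0,     0, 0    ],
    [0,     1 / 3, 0, 0,     0, 2 / 3],
]
n = len(P)

def reachable(i):
    """States reachable from i in one or more steps (depth-first search)."""
    seen, stack = set(), [i]
    while stack:
        k = stack.pop()
        for j in range(n):
            if P[k][j] > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

R = [reachable(i) for i in range(n)]

# Communicating classes: i and j share a class when each is reachable
# from the other (a state always communicates with itself).
classes = []
for i in range(n):
    cls = frozenset({i} | {j for j in range(n) if j in R[i] and i in R[j]})
    if cls not in classes:
        classes.append(cls)

# A class is a closed set when no transition leads out of it; these are
# exactly the recurrent classes ({4} is a class but is not closed).
closed = [sorted(s + 1 for s in c) for c in classes
          if all(R[i] <= c for i in c)]
print(closed)  # [[1, 3, 5], [2, 6]]

# With three communicating classes the chain is not irreducible,
# so it cannot be ergodic.
print(len(classes) == 1)  # False
```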