Tutorial 11.
11/24/2008
Application of Eigenvalues and Eigenvectors – Part I

Markov Chain

Definition:
- A Markov process (Markov chain) is a process having the Markov property, which means that, given the present state, future states are independent of the past states.
- Assume there are n states. Define the probability of going from state i to state j in k time steps as
  p_ij^(k) = Pr(X_{m+k} = j | X_m = i).
- And the single-step transition probability as
  p_ij = Pr(X_{m+1} = j | X_m = i).
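The single-step probabilities p_ij form a transition matrix P, and the k-step probabilities are the entries of the matrix power P^k. A minimal sketch (the 2-state matrix below is an assumed example, not from the tutorial):

```python
# Sketch: k-step transition probabilities of a small Markov chain
# via repeated matrix multiplication, since P^k[i][j] is the
# probability of going from state i to state j in k steps.

def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][t] * b[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

def k_step(p, k):
    """Return the k-step transition matrix P^k (k >= 1)."""
    result = p
    for _ in range(k - 1):
        result = mat_mul(result, p)
    return result

# Assumed single-step transition matrix: row i lists Pr(i -> j).
P = [[0.9, 0.1],
     [0.5, 0.5]]

P3 = k_step(P, 3)
print(P3[0][1])    # probability of going from state 0 to state 1 in 3 steps
print(sum(P3[0]))  # each row of P^k still sums to 1
```

Note that every row of P^k remains a probability distribution, which is a quick sanity check when computing multi-step transitions.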