16
Markov Processes
MULTIPLE CHOICE
1. In Markov analysis, we are concerned with the probability that the
   a. state is part of a system.
   b. system is in a particular state at a given time.
   c. time has reached a steady state.
   d. transition will occur.
   ANSWER: b
   TOPIC: Introduction

2. For a situation with weekly dining at either an Italian or Mexican restaurant,
   a. the weekly visit is the trial and the restaurant is the state.
   b. the weekly visit is the state and the restaurant is the trial.
   c. the weekly visit is the trend and the restaurant is the transition.
   d. the weekly visit is the transition and the restaurant is the trend.
   ANSWER: a
   TOPIC: Market share analysis

3. A transition probability describes
   a. the probability of a success in repeated, independent trials.
   b. the probability a system in a particular state now will be in a specific state next period.
   c. the probability of reaching an absorbing state.
   d. None of the alternatives is correct.
   ANSWER: b
   TOPIC: Introduction

4. The probability of going from state 1 in period 2 to state 4 in period 3 is
   a. p₁₂
   b. p₂₃
   c. p₁₄
   d. p₄₃
   ANSWER: c
   TOPIC: Market share analysis

5. The probability that a system is in a particular state after a large number of periods is
   a. independent of the beginning state of the system.
   b. dependent on the beginning state of the system.
   c. equal to one half.
   d. the same for every ending system.
   ANSWER: a
   TOPIC: Market share analysis
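A quick numerical check of answer (a) to question 5: iterating the transition equation from two different starting states yields the same long-run distribution. The two-state matrix below is a made-up example for the restaurant setting of question 2, not data from the text.

```python
# Hypothetical transition matrix P for the weekly restaurant choice:
# rows are the current restaurant (Italian, Mexican), columns the next.
P = [[0.6, 0.4],
     [0.3, 0.7]]

def step(pi, P):
    """One transition: pi(n+1) = pi(n) * P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def run(pi, P, periods=50):
    """Apply the transition equation for many periods."""
    for _ in range(periods):
        pi = step(pi, P)
    return pi

# Two different beginning states converge to the same distribution.
start_italian = run([1.0, 0.0], P)
start_mexican = run([0.0, 1.0], P)
print(start_italian)  # ≈ [0.4286, 0.5714]
print(start_mexican)  # same limit, regardless of the beginning state
```

After enough periods both runs agree to many decimal places, which is exactly the independence from the beginning state that question 5 asks about.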
This edition is intended for use outside of the U.S. only, with content that may be different from the U.S. Edition. This may not be resold, copied, or distributed without the prior consent of the publisher.
6. At steady state
   a. π₁(n+1) > π₁(n)
   b. π₁ = π₂
   c. π₁ + π₂ > 1
   d. π₁(n+1) = π₁
   ANSWER: d
   TOPIC: Market share analysis
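The steady-state condition in answer (d) to question 6 can also be solved directly rather than by iteration: setting π = πP together with π₁ + π₂ = 1 gives a closed form for a two-state chain. The probabilities p12 and p21 below are hypothetical placeholders, not values from the text.

```python
def steady_state_2x2(p12, p21):
    """Steady-state probabilities of a two-state Markov chain, where
    p12 = P(leave state 1) and p21 = P(leave state 2).
    Solving pi = pi * P with pi1 + pi2 = 1 gives pi1 = p21 / (p12 + p21)."""
    pi1 = p21 / (p12 + p21)
    return [pi1, 1.0 - pi1]

# Example with made-up transition probabilities:
pi = steady_state_2x2(0.4, 0.3)
print(pi)  # [0.428..., 0.571...]

# One more transition leaves the distribution unchanged: pi(n+1) = pi(n).
pi_next = [pi[0] * 0.6 + pi[1] * 0.3,
           pi[0] * 0.4 + pi[1] * 0.7]
print(pi_next)  # same as pi
```

The last two prints agreeing is the defining property of steady state that the question tests.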
7. Analysis of a Markov process
   a. describes future behavior of the system.
   b. optimizes the system.
   c. leads to higher order decision making.
   d. All of the alternatives are true.
   ANSWER: a
   TOPIC: Introduction

8. If the probability of making a transition from a state is 0, then that state is called a(n)
   a. steady state.
   b. final state.
   c. origin state.
   d. absorbing state.
   ANSWER: d
   TOPIC: Absorbing state

9. Absorbing state probabilities are the same as
   a. steady state probabilities.
   b. transition probabilities.
   c. fundamental probabilities.
   d. None of the alternatives is true.
   ANSWER: d
   TOPIC: Fundamental matrix

10. The probability of reaching an absorbing state is given by the
    a. R matrix.
    b. NR matrix.
    c. Q matrix.
    d. (I − Q)⁻¹ matrix.
    ANSWER: b
    TOPIC: Fundamental matrix
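A sketch of the calculation behind question 10: partitioning the transition matrix into Q (transient-to-transient) and R (transient-to-absorbing), the fundamental matrix is N = (I − Q)⁻¹, and NR gives the probabilities of ending in each absorbing state. The Q and R entries below are invented numbers for a chain with two transient and two absorbing states, not data from the text.

```python
# Hypothetical partition of a transition matrix: two transient states,
# two absorbing states. Each full row of [Q | R] sums to 1.
Q = [[0.4, 0.3],
     [0.2, 0.5]]
R = [[0.2, 0.1],
     [0.1, 0.2]]

# Form I - Q for the 2x2 case.
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]

# Invert the 2x2 matrix (I - Q) to get the fundamental matrix N.
det = a * d - b * c
N = [[ d / det, -b / det],
     [-c / det,  a / det]]

# NR: row i gives the absorption probabilities starting from transient state i.
NR = [[sum(N[i][k] * R[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(NR)  # each row sums to 1: absorption is certain eventually
```

This also illustrates question 9: the rows of NR depend on the starting transient state, so absorbing state probabilities are not steady-state, transition, or "fundamental" probabilities.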
TRUE/FALSE
1. Markov processes use historical probabilities.
   ANSWER: True
   TOPIC: Market share analysis
2. All entries in a matrix of transition probabilities sum to 1.
   ANSWER: False