(a) X(0) = 0;
(b) {X(t); t ≥ 0} has stationary and independent increments; and
(c) for every t > 0, X(t) is normally distributed with mean 0 and variance σ²t.
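These three properties translate directly into a simulation. A minimal sketch in plain Python (no external libraries) that builds a discretized path from independent Gaussian increments, so X(0) = 0 and each increment over a step dt is N(0, σ²dt); the grid size and step below are illustrative choices:

```python
import random

def brownian_path(n_steps, dt, sigma=1.0):
    """Simulate X(t) on a grid: X(0) = 0, with stationary, independent
    Gaussian increments, so X(t) ~ N(0, sigma^2 * t)."""
    x = 0.0
    path = [x]
    for _ in range(n_steps):
        # each increment is N(0, sigma^2 * dt), independent of the past
        x += random.gauss(0.0, sigma * dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(n_steps=1000, dt=0.001)  # a path on [0, 1]
```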
7. Let P{X = x_j} = Cb_j, j ≥ 1, where the b_j are specified, but C is unknown. Suppose it was of interest to estimate

θ = E[h(X)] = Σ_{j=1}^∞ h(x_j) P{X = x_j},

where h is some function which is computationally difficult to evaluate. What simulation algorithm would be used in this case? The Hastings–Metropolis algorithm.
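A sketch of why Hastings–Metropolis sidesteps the unknown constant C: the acceptance probability depends only on the ratio b(y)/b(x), in which C cancels. The weights b_j = 1/j⁴ and the function h(x) = x below are hypothetical stand-ins for illustration, not part of the original problem:

```python
import random

def hastings_metropolis(b, h, n_iter=200_000, x0=1):
    """Estimate theta = E[h(X)] when P{X = j} = C * b(j) with C unknown.
    Uses a symmetric random-walk proposal on the positive integers, so the
    acceptance probability min(1, b(y)/b(x)) never involves C."""
    x = x0
    total = 0.0
    for _ in range(n_iter):
        y = x + random.choice([-1, 1])
        if y >= 1 and random.random() < min(1.0, b(y) / b(x)):
            x = y  # accept the proposed state
        total += h(x)  # average h over the chain, accepted or not
    return total / n_iter

# hypothetical weights and target function, chosen only for illustration
est = hastings_metropolis(lambda j: j ** -4, lambda x: x)
```

With these weights the true value is ζ(3)/ζ(4) ≈ 1.11, which the long-run average approaches.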
8. For any state i, let f_i denote the probability that, starting in state i, the process will reenter state i. If f_i < 1, the state is transient.
9. Let B(t) represent a Brownian motion process. Then the first time B(t) hits the value a is called the hitting time of a.
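A hedged simulation sketch of a hitting time: although standard Brownian motion hits any level a > 0 with probability 1, the mean hitting time is infinite, so a discretized simulation needs a cutoff t_max (the values below are illustrative):

```python
import random

def hitting_time(a, dt=0.001, t_max=100.0):
    """Return the first (discretized) time a standard Brownian path reaches
    level a > 0, or None if the level is not reached by t_max."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += random.gauss(0.0, dt ** 0.5)  # Brownian increment over dt
        t += dt
        if x >= a:
            return t
    return None

t_hit = hitting_time(0.5)
```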
10. A stochastic process is called a Gaussian process if X(t_1), ..., X(t_n) has a multivariate normal distribution for all n, t_1, ..., t_n.
11. A Markov chain having initial probabilities α_j equal to the limiting probabilities π_j for all states j = 1, 2, ..., n in the state space is said to be stationary.
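One way to see this in code: approximate the limiting probabilities π by power iteration and use them as the initial distribution α, which makes the chain stationary from time 0. The two-state transition matrix below is a hypothetical example:

```python
def limiting_probabilities(P, n_iter=500):
    """Approximate the limiting probabilities pi of an irreducible,
    aperiodic chain by repeatedly multiplying a row vector by P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        # one step: pi <- pi * P
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# hypothetical two-state transition matrix
P = [[0.7, 0.3],
     [0.4, 0.6]]
alpha = limiting_probabilities(P)  # start the chain with alpha -> stationary
```

For this matrix the limiting probabilities are (4/7, 3/7).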
12. A Markov chain is
irreducible
if there is only one class; that is, if all states communicate.
13. Let {X(t); t ≥ 0} be a Brownian motion process. The process for values 0 ≤ t ≤ 1, conditional on X(1) = 0 (that is, the process {X(t), 0 ≤ t ≤ 1 | X(1) = 0}), is called a Brownian bridge.
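The bridge can be constructed from an ordinary Brownian path: if B(t) is standard Brownian motion, then Z(t) = B(t) − t·B(1) has the law of a Brownian bridge on [0, 1]. A minimal sketch (the grid size is an illustrative choice):

```python
import random

def brownian_bridge(n_steps=1000):
    """Build a Brownian bridge on [0, 1]: simulate a standard Brownian path
    B, then return Z(t) = B(t) - t * B(1), which is tied down at both ends."""
    dt = 1.0 / n_steps
    b, path = 0.0, [0.0]
    for _ in range(n_steps):
        b += random.gauss(0.0, dt ** 0.5)
        path.append(b)
    b1 = path[-1]  # B(1)
    return [x - (k / n_steps) * b1 for k, x in enumerate(path)]

z = brownian_bridge()  # z[0] == z[-1] == 0 by construction
```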
14. A stochastic process {X(t); t ≥ 0} is said to be stationary if for all n, s, t_1, ..., t_n, the random vectors X(t_1), ..., X(t_n) and X(t_1 + s), ..., X(t_n + s) have the same joint distribution.
15. Suppose state i is recurrent. If the expected time until the process returns to state i is finite, then the state is positive recurrent.
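This can be illustrated by simulation, since for a positive recurrent state i of an irreducible chain the mean return time equals 1/π_i. The two-state chain below is a hypothetical example; its limiting probabilities are (4/7, 3/7), so the expected return time to state 0 is 7/4:

```python
import random

def mean_return_time(P, i, n_returns=20_000):
    """Estimate E[time to return to state i] by simulating the chain and
    averaging the gaps between successive visits to i."""
    n = len(P)
    state, steps, total, visits = i, 0, 0, 0
    while visits < n_returns:
        # draw the next state from row P[state]
        r, cum = random.random(), 0.0
        for j in range(n):
            cum += P[state][j]
            if r < cum:
                state = j
                break
        steps += 1
        if state == i:  # completed one return to i
            total += steps
            steps = 0
            visits += 1
    return total / n_returns

# hypothetical chain: limiting probabilities (4/7, 3/7), so E[return to 0] = 7/4
P = [[0.7, 0.3],
     [0.4, 0.6]]
m = mean_return_time(P, 0)
```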