CHAPTER 9
9-1 (a) A coin is tossed and the process X(t) is defined by

X(t, heads) = sin πt,    X(t, tails) = 2t.

[Figure: the distribution functions F(x, 0.25) and F(x, 1).]
9-2 X(t) = e^{at}, where a is a random variable with density f_a(a). Hence

η(t) = ∫ e^{at} f_a(a) da,    R(t1, t2) = ∫ e^{a t1} e^{a t2} f_a(a) da = ∫ e^{a(t1 + t2)} f_a(a) da.

From (5-16) with x = g(a) = e^{at},
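As a numerical sanity check on the mean and autocorrelation integrals, take the process to be X(t) = e^{at} and pick a specific density for a — here, purely for illustration, a ~ uniform(0, 1), which is an assumption and not part of the problem — and compare Monte Carlo averages against the closed-form integrals:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(0.0, 1.0, size=200_000)  # illustrative choice: a ~ U(0, 1)

t1, t2 = 0.5, 1.5
# Monte Carlo estimates of eta(t1) = E[e^{a t1}] and R(t1, t2) = E[e^{a t1} e^{a t2}]
eta_mc = np.mean(np.exp(a * t1))
R_mc = np.mean(np.exp(a * t1) * np.exp(a * t2))

# Closed forms for a ~ U(0, 1): eta(t) = (e^t - 1)/t, R(t1, t2) = (e^{t1+t2} - 1)/(t1 + t2)
eta_exact = (np.exp(t1) - 1.0) / t1
R_exact = (np.exp(t1 + t2) - 1.0) / (t1 + t2)

print(eta_mc, eta_exact)
print(R_mc, R_exact)
```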
Chapter 16

16.1 Use (16-132) with r = 1. This gives

p_n = (ρ^n / n!) p_0,   0 ≤ n ≤ 1,
p_n = ρ^n p_0,          1 < n ≤ m,
p_n = 0,                n > m,

i.e., p_n = ρ^n p_0 for 0 ≤ n ≤ m. Thus

Σ_{n=0}^{m} p_n = p_0 Σ_{n=0}^{m} ρ^n = p_0 (1 - ρ^{m+1}) / (1 - ρ) = 1,

and hence

p_0 = (1 - ρ) / (1 - ρ^{m+1}),   p_n = (1 - ρ) ρ^n / (1 - ρ^{m+1}),   0 ≤ n ≤ m.

Letting ρ → 1 and using lim_{ρ→1} (1 - ρ)/(1 - ρ^{m+1}) = 1/(m + 1), we get p_n = 1/(m + 1) for 0 ≤ n ≤ m, so that Σ_{n=0}^{m} p_n = 1.
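A short sketch (the helper name `stationary_probs` is ours) that evaluates this distribution and confirms both the normalization and the ρ → 1 limit:

```python
import numpy as np

def stationary_probs(rho: float, m: int) -> np.ndarray:
    """p_n = (1 - rho) rho^n / (1 - rho^(m+1)) for 0 <= n <= m; uniform when rho = 1."""
    if abs(rho - 1.0) < 1e-12:
        return np.full(m + 1, 1.0 / (m + 1))   # the rho -> 1 limit
    n = np.arange(m + 1)
    return (1.0 - rho) * rho**n / (1.0 - rho**(m + 1))

m = 5
p = stationary_probs(0.8, m)
print(p, p.sum())                    # a valid distribution: sums to 1
print(stationary_probs(0.999, m))    # already close to the uniform value 1/(m+1)
```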
16.2 (a) Let
Chapter 15

15.1 The chain represented by

        [  0   1/2  1/2 ]
P =     [ 1/2   0   1/2 ]
        [ 1/2  1/2   0  ]

is irreducible and aperiodic. The second chain is also irreducible and aperiodic. The third chain has two aperiodic closed sets {e1, e2} and {e3, e4} and a transient state.
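These claims can be verified numerically. The sketch below assumes the first chain's transition matrix is the 3×3 doubly stochastic matrix with zero diagonal and 1/2 off the diagonal, and uses the fact that a finite chain is irreducible and aperiodic exactly when some power of P is strictly positive:

```python
import numpy as np

# Transition matrix of the first chain (zero diagonal, 1/2 elsewhere)
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

def is_primitive(P: np.ndarray, max_power: int = 50) -> bool:
    """True iff some power of P has all entries strictly positive,
    i.e. the chain is irreducible and aperiodic."""
    Q = np.eye(len(P))
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

print(is_primitive(P))                  # irreducible and aperiodic
pi = np.linalg.matrix_power(P, 60)[0]   # rows converge to the stationary distribution
print(pi)                               # P is doubly stochastic, so pi is uniform
```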
3.11 Arguing as in (3.43), we get the corresponding iteration equation

P_n = p P_{n+1} + q P_{n-1},

and proceed as in Example 3.15.

3.12 Suppose one bets on k, k = 1, 2, ..., 6. Then

p_1 = P(k appears on one die) = 3 (1/6)(5/6)^2 = 75/216,
p_2 = P(k appears on two dice) = 3 (1/6)^2 (5/6) = 15/216,
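These probabilities can be confirmed by brute-force enumeration of all 6^3 outcomes — the factor 3 (1/6)(5/6)^2 indicates three dice are thrown; the bet value k = 1 below is arbitrary by symmetry:

```python
from itertools import product
from fractions import Fraction

k = 1  # the face we bet on; by symmetry the result is the same for any k
counts = {0: 0, 1: 0, 2: 0, 3: 0}
for roll in product(range(1, 7), repeat=3):   # all 216 equally likely rolls
    counts[roll.count(k)] += 1

total = 6**3
p1 = Fraction(counts[1], total)   # k appears on exactly one die
p2 = Fraction(counts[2], total)   # k appears on exactly two dice
p3 = Fraction(counts[3], total)   # k appears on all three dice
print(p1, p2, p3)
```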
20. Extinction Probability for Queues and Martingales

(Refer to section 15.6 in the text (Branching processes) for a discussion of the extinction probability.)

20.1 Extinction Probability for Queues: A customer arrives at an empty server and immediately goes for service
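As a concrete illustration of the extinction probability itself — the smallest root of P(z) = z in [0, 1], where P(z) is the generating function of the offspring (or arrivals-per-service) distribution — here is a sketch assuming, purely for illustration, a Poisson(λ) distribution with λ = 1.5:

```python
from math import exp

lam = 1.5  # assumed mean offspring per individual; lam > 1, so extinction prob < 1

def pgf(z: float) -> float:
    """Probability generating function of a Poisson(lam) distribution."""
    return exp(lam * (z - 1.0))

# The extinction probability is the smallest root of pgf(z) = z in [0, 1];
# iterating z <- pgf(z) from z = 0 converges monotonically to it.
z = 0.0
for _ in range(200):
    z = pgf(z)
print(z)
```

The same fixed-point iteration works for any offspring distribution once its generating function is supplied.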
19. Series Representation of Stochastic Processes
Given information about a stochastic process X(t) in 0 ≤ t ≤ T, can this continuous information be represented in terms of a countable set of random variables whose relative importance decreases under some arrangement?
18. Power Spectrum
For a deterministic signal x(t), the spectrum is well defined: if X(ω) represents its Fourier transform, i.e., if

X(ω) = ∫_{-∞}^{+∞} x(t) e^{-jωt} dt,        (18-1)

then |X(ω)|^2 represents its energy spectrum. This follows from Parseval's theorem since
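The discrete-time analogue of this relation — Parseval for the DFT, Σ|x[n]|^2 = (1/N) Σ|X_k|^2 — is easy to check numerically; the test signal below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1024)   # any finite-energy signal will do
X = np.fft.fft(x)

energy_time = np.sum(np.abs(x)**2)            # energy computed in the time domain
energy_freq = np.sum(np.abs(X)**2) / len(x)   # same energy from the spectrum
print(energy_time, energy_freq)
```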
17. Long Term Trends and Hurst Phenomena
From ancient times the Nile river region has been known for its peculiar long-term behavior: long periods of dryness followed by long periods of yearly floods. Historical records that go back as far as 622
16. Mean Square Estimation
Given some information that is related to an unknown quantity of interest, the problem is to obtain a good estimate for the unknown in terms of the observed data. Suppose X_1, X_2, ..., X_n represent a sequence of random variables
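A minimal concrete instance is the linear minimum-mean-square estimate of an unknown X from a single noisy observation Y; the observation model Y = X + N and the noise level below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
x = rng.standard_normal(n)             # unknown quantity, Var(x) = 1
y = x + 0.5 * rng.standard_normal(n)   # assumed noisy observation of x

# Linear MMSE estimator: x_hat = E[x] + Cov(x, y)/Var(y) * (y - E[y])
a = np.cov(x, y)[0, 1] / np.var(y)
x_hat = x.mean() + a * (y - y.mean())

mse_est = np.mean((x - x_hat)**2)   # theoretical value 1 - 1/1.25 = 0.2
mse_raw = np.mean((x - y)**2)       # using y directly: 0.25
print(mse_est, mse_raw)
```

The estimator shrinks the observation toward the mean, which is why its mean square error beats using y directly.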
15. Poisson Processes
In Lecture 4, we introduced Poisson arrivals as the limiting behavior of binomial random variables. (Refer to the Poisson approximation of binomial random variables.) From the discussion there (see (4-6)-(4-8), Lecture 4), "k arrivals occur
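That limiting behavior is easy to see numerically: for large n and small p, the binomial pmf is close to the Poisson pmf with λ = np (the values n = 1000, p = 0.003 below are arbitrary):

```python
from math import comb, exp, factorial

n, p = 1000, 0.003   # large n, small p
lam = n * p          # Poisson parameter

def binom_pmf(k: int) -> float:
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k: int) -> float:
    return exp(-lam) * lam**k / factorial(k)

for k in range(6):
    print(k, binom_pmf(k), poisson_pmf(k))   # the two columns nearly agree
```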
14. Stochastic Processes
Introduction: Let ξ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms X(t, ξ) forms a stochastic process. The set of {ξ_k} and
13. The Weak Law and the Strong Law of Large Numbers
James Bernoulli proved the weak law of large numbers (WLLN) around 1700; it was published posthumously in 1713 in his treatise Ars Conjectandi. Poisson generalized Bernoulli's theorem around 1800, and
12. Principles of Parameter Estimation
The purpose of this lecture is to illustrate the usefulness of the various concepts introduced and studied in earlier lectures to practical problems of interest. In this context, consider the problem of estimating an
11. Conditional Density Functions and Conditional Expected Values
As we have seen in section 4, conditional probability density functions are useful for updating the information about an event based on knowledge about some other related event (refer to ex
10. Joint Moments and Joint Characteristic Functions
Following section 6, in this section we shall introduce various parameters to compactly represent the information contained in the joint p.d.f of two r.vs. Given two r.vs X and Y and a function g(x, y
9. Two Functions of Two Random Variables
In the spirit of the previous section, let us look at an immediate generalization: suppose X and Y are two random variables with joint p.d.f f_XY(x, y). Given two functions g(x, y) and h(x, y), define the new
8. One Function of Two Random Variables
Given two random variables X and Y and a function g(x, y), we form a new random variable Z as

Z = g(X, Y).        (8-1)

Given the joint p.d.f f_XY(x, y), how does one obtain f_Z(z), the p.d.f of Z? Problems of t
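Before attempting the analytical route, one can explore a specific case by simulation. A sketch for the particular choice Z = X + Y with X and Y independent and uniform on (0, 1), whose exact density is the triangle f_Z(z) = 1 - |z - 1| on [0, 2]:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 500_000)
y = rng.uniform(0.0, 1.0, 500_000)
z = x + y   # the new random variable Z = g(X, Y) with g(x, y) = x + y

# Histogram estimate of f_Z versus the exact triangular density
hist, edges = np.histogram(z, bins=40, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
exact = 1.0 - np.abs(centers - 1.0)
print(np.max(np.abs(hist - exact)))   # small: the histogram matches f_Z
```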