... queue 281
7.3 Detection, Decisions, and Hypothesis testing 284
7.4 Threshold crossing probabilities in random walks 287
7.5 Thresholds, stopping rules, and Wald's identity 291
    7.5.1 Stopping rules 292
    7.5.2 Joint distribution of N and barrier 297
    7.5.3 Proof of Wald's identity 298
7.6 Martingales and submartingales 299
    7.6.1 Simple examples of martingales 300
    7.6.2 Markov modulated random walks 301
    7.6.3 Generating functions for Markov random walks 303
    7.6.4 Scaled branching processes 304
    7.6.5 Partial isolation of past and future in martingales 304
    7.6.6 Submartingales and supermartingales 305
7.7 Stopped processes and stopping times 307
    7.7.1 Stopping times for martingales relative to a process 311
7.8 The Kolmogorov inequalities 312
    7.8.1 The strong law of large numbers 315
    7.8.2 The martingale convergence theorem 316
7.9 Summary 317
7.10 Exercises 319

Chapter 1

INTRODUCTION AND REVIEW OF PROBABILITY
1.1 Probability models

Probability theory is a central field of mathematics, widely applicable to scientific, technological, and human situations involving uncertainty. The most obvious applications are to
situations, such as games of chance, in which repeated trials of essentially the same procedure lead to differing outcomes. For example, when we flip a coin, roll a die, pick a card
from a shuffled deck, or spin a ball onto a roulette wheel, the procedure is the same from
one trial to the next, but the outcome (heads (H) or tails (T) in the case of a coin, one to
six in the case of a die, etc.) varies from one trial to another in a seemingly random fashion.
For the case of flipping a coin, the outcome of the flip could be predicted from the initial
position, velocity, and angular momentum of the coin and from the nature of the surface
on which it lands. Thus, in one sense, a coin flip is deterministic rather than random,
and the same can be said for the other examples above. When these initial conditions are
unspecified, however, as when playing these games, the outcome can again be viewed as
random in some intuitive sense.
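The idea above — an identical procedure whose outcome varies from trial to trial when the initial conditions are unspecified — can be mimicked with a short simulation. This is only an illustrative sketch (the function name, the trial count, and the use of Python's pseudo-random generator are assumptions, not part of the text); fixing the seed plays the role of specifying the initial conditions, which makes the "random" outcome reproducible:

```python
import random

def flip_coin(n_trials, seed=None):
    """Simulate n_trials independent flips of a fair coin.

    Returns a list of 'H'/'T' outcomes.  With the seed left
    unspecified, outcomes vary from run to run even though the
    procedure is identical; fixing the seed (the "initial
    conditions") makes every run come out the same.
    """
    rng = random.Random(seed)
    return [rng.choice("HT") for _ in range(n_trials)]

print(flip_coin(10, seed=2010))
```

Running with the same seed reproduces the same sequence, echoing the point that the coin flip is deterministic once the initial conditions are pinned down.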
Many scientific experiments are similar to games of chance in the sense that multiple trials
of the same procedure lead to results that vary from one trial to another. In some cases,
this variation is due to slight variations in the experimental procedure, in some it is due
to noise, and in some, such as in quantum mechanics, the randomness is generally believed
to be fundamental. Similar situations occur in many types of systems, especially those in
which noise and random delays are important. Some of these systems, rather than being
repetitions of a common basic procedure, are systems that evolve over time while still
containing a sequence of underlying similar random occurrences.
This intuitive notion of randomness, as described above, is a very special kind of uncertainty.
Rather than involving a lack of understanding, it involves a type of uncertainty that can
lead to probabilistic models with precise results. As in any scientific field, the models might
or might not correspond to reality very well, but when they do correspond to reality, there
is the sense that the situation is completely understood, while still being random. For
example, we all feel that we understand flipping a coin or rolling a die, but still accept
randomness in each outcome. The theory of probability was developed particularly to give
precise and quantitative understanding to these types of situations. The remainder of this
section introduces this relationship between the precise view of probability theory and the
intuitive view as used in applications and everyday language.
After this introduction, the following sections review probability theory as a mathematical
discipline, with a special emphasis on the laws of large numbers. In the final section of this
chapter, we use the theory and the laws of large numbers to obtain a fuller understanding
of the relationship between theory and the real world.1
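Since the chapter places special emphasis on the laws of large numbers, a small numerical illustration may help fix ideas before the formal treatment. The sketch below is an assumption-laden example (a fair coin, illustrative sample sizes, and a fixed seed for reproducibility are all choices made here, not taken from the text): it shows the relative frequency of heads settling near 1/2 as the number of trials grows.

```python
import random

def relative_frequency(n_trials, p_heads=0.5, seed=42):
    """Relative frequency of heads in n_trials Bernoulli(p_heads) trials.

    As n_trials grows, this fraction tends to concentrate near
    p_heads -- the empirical face of the laws of large numbers.
    """
    rng = random.Random(seed)
    heads = sum(rng.random() < p_heads for _ in range(n_trials))
    return heads / n_trials

# Relative frequency for increasing numbers of trials of the same procedure.
for n in (10, 1000, 100_000):
    print(n, relative_frequency(n))
```

For small n the fraction can sit well away from 1/2; for large n it is typically very close, which is exactly the bridge between the precise theory and the intuitive view of probability that this section describes.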
Probability theory, as a mathematical discipline, started to evolve in the 17th century
and was initially focused on gam...
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R. Srikant during the Spring '09 term at University of Illinois, Urbana-Champaign.