# Lecture 1: A rapid overview of probability theory


Biology 429 — Carl Bergstrom — January 2, 2008

Sources: These lecture notes draw upon material from Taylor and Karlin, *An Introduction to Stochastic Modeling*, 3rd Edition (1998); Parzen, *Stochastic Processes* (1962); Pitman, *Probability* (1993); Van Kampen, *Stochastic Processes in Physics and Chemistry* (1997); and Dill and Bromberg, *Molecular Driving Forces* (2003). In places, definitions and statements of theorems may be taken directly from these sources.

Probability theory can be developed in a completely formal, rigorous fashion grounded in set theory and measure theory; this is known as *axiomatic probability theory*. Alternatively, one can develop probability theory in a common-sense fashion, allowing our basic intuitions about chance to replace much of the formalism. I will take the latter approach throughout.

## 1 Discrete probability

### 1.1 Random variables

**Definition 1** A *random variable* is a variable that takes on its value probabilistically.

A random variable $X$ is defined by a set of possible values $x$, and a distribution of probabilities $p(x)$ over this set, such that

1. $p(x) \ge 0$ for all $x$, and
2. $\sum_x p(x) = 1$.

For example, if $X$ is the random variable representing the outcome of rolling a fair die, then $X$ has the set of possible values $\{1, 2, 3, 4, 5, 6\}$, each with probability $p(x) = 1/6$. If $X$ is drawn from distribution $F$, we write $X \sim F$.

In this die example, $X$ is a *discrete* random variable because it takes one of a set of discrete values. Random variables can also be *continuous*: they can take on any of a set of continuous values. Here we will treat discrete random variables first, and then move on to continuous random variables.

### 1.2 Events

Closely related to the notion of a random variable is the concept of an event.

**Definition 2** An *event* is the case that a random variable takes on a value within a described subset of possible values.

For example, the event that I roll a die and get an odd number can be represented as the event $X \in \{1, 3, 5\}$. Events can include single values of random variables, e.g. the event that $X = 4$; they can include all possible values, $X \in \{1, 2, \ldots, 6\}$; and they can include no possible values, $X \in \{\}$.

If $A_1, A_2, \ldots, A_n$ are mutually exclusive events with probabilities $P[A_i]$, the probability that any one of them occurs is

$$P[A_1 \text{ or } A_2 \ldots \text{ or } A_n] = \sum_i P[A_i].$$

### 1.3 Conditional probability

Conditional probability lets us talk about the chance that one event occurs given that another occurs.

**Definition 3** The conditional probability of $A$ given $B$ is

$$P[A \mid B] = \frac{P[A \text{ and } B]}{P[B]}.$$
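The fair-die example can be checked exactly with a short script. The sketch below (plain Python; the `prob` helper is illustrative, not part of the notes) computes the probability of an event by summing $p(x)$ over its mutually exclusive outcomes, and then applies the definition of conditional probability.

```python
# Exact probabilities for the fair-die example, using exact rational
# arithmetic. The `prob` helper is a hypothetical illustration.
from fractions import Fraction

# Fair die: p(x) = 1/6 for each x in {1, ..., 6}.
p = {x: Fraction(1, 6) for x in range(1, 7)}

def prob(event):
    """P[X in event]: sum of p(x) over the mutually exclusive outcomes."""
    return sum(p[x] for x in event)

odd = {1, 3, 5}
even = {2, 4, 6}

# Additivity over mutually exclusive outcomes:
# P[X odd] = 1/6 + 1/6 + 1/6 = 1/2.
print(prob(odd))                       # 1/2

# Conditional probability from Definition 3:
# P[X = 4 | X even] = P[X = 4 and X even] / P[X even] = (1/6)/(1/2) = 1/3.
print(prob({4} & even) / prob(even))   # 1/3
```

Working with `Fraction` rather than floats keeps the arithmetic exact, so the results match the hand calculation term for term.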
(We will write $P[AB]$ as shorthand for $P[A \text{ and } B]$.) Note that we can also write

$$P[AB] = P[A \mid B]\, P[B].$$

We can extend this to write down a *chain rule* for probabilities. This rule gives the probability that a series of events $A_1, A_2, \ldots, A_n$ happen in succession:

$$P[A_1 A_2 \cdots A_n] = P[A_1]\, P[A_2 \mid A_1]\, P[A_3 \mid A_1 A_2] \cdots P[A_n \mid A_1 \cdots A_{n-1}].$$
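As a concrete check on the chain rule, consider drawing two cards from a shuffled 52-card deck and asking for the probability that both are aces: $P[A_1 A_2] = P[A_1]\,P[A_2 \mid A_1] = (4/52)(3/51)$. This sketch (an illustration, not from the notes) compares that product against a brute-force count over all ordered pairs of cards.

```python
# Chain rule check: P[first two cards are both aces].
from fractions import Fraction
from itertools import permutations

# Chain rule: P[A1] * P[A2 | A1] = (4/52) * (3/51).
chain = Fraction(4, 52) * Fraction(3, 51)

# Brute force: enumerate every ordered pair of distinct cards from a
# deck of 4 aces ('A') and 48 other cards ('x'), and count the pairs
# in which both draws are aces.
deck = ['A'] * 4 + ['x'] * 48
favorable = sum(1 for c1, c2 in permutations(deck, 2)
                if c1 == 'A' and c2 == 'A')
total = 52 * 51  # ordered pairs of distinct positions

print(chain)                       # 1/221
print(Fraction(favorable, total))  # 1/221, agreeing with the chain rule
```

`itertools.permutations` treats the cards as distinct by position, so the count of favorable ordered pairs is $4 \times 3 = 12$ out of $52 \times 51$, matching the chain-rule product exactly.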
