CME 308 Spring 2010 Notes-1
George Papanicolaou
June 2, 2010

Contents

1 Sums of independent identically distributed random variables
  1.1 The weak law of large numbers
  1.2 The strong law of large numbers
2 Weak convergence
3 The central limit theorem (CLT)
4 Maximum likelihood estimation (MLE)
5 Large sample properties of MLE
6 Asymptotic efficiency of the MLE
7 Markov Chains
  7.1 Hitting times
  7.2 Invariant probabilities
  7.3 The ergodic theorem
  7.4 The central limit theorem for Markov chains
  7.5 Expected number of visits to a state and π
  7.6 Random walks and connections with differential equations
  7.7 Transience and recurrence
  7.8 Connections with classical potential theory
  7.9 Stochastic control

1 Sums of independent identically distributed random variables

We will be dealing with sequences of independent identically distributed random variables $X_1, X_2, \ldots, X_n$, where $P\{X_j \le x\} = F(x)$ is their common distribution function. We will also use the notation $F_X(x)$ when we want to identify the random variable whose distribution is $F(x)$. Independence means that the joint distribution of $X_1, X_2, \ldots, X_n$ is equal to the product of the marginals,
$$
P\{X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n\} = F(x_1, x_2, \ldots, x_n) = \prod_{j=1}^{n} F(x_j).
$$
This implies that the expectation of the product of bounded functions of the random variables equals the product of the expectations. We will be interested in the behavior of the sample or empirical mean
$$
\bar{X}_n = \frac{X_1 + X_2 + \cdots + X_n}{n},
$$
which is the simplest and most widely studied function of the random variables. We expect that it should be closely related to the theoretical mean $\mu = E\{X_j\}$. We denote by $\sigma^2$ the variance of $X_j$,
$$
\sigma^2 = \mathrm{var}(X_j) = E\{(X_j - \mu)^2\} = \int (x - \mu)^2 \, dF(x).
$$

1.1 The weak law of large numbers

The simplest large sample relation is the weak law of large numbers (WLLN), which says that $\bar{X}_n \to \mu$ in probability as $n \to \infty$. This is a consequence of the Chebyshev inequality (CI): since the $X_j$ are independent, $E\{(\bar{X}_n - \mu)^2\} = \mathrm{var}(\bar{X}_n) = \sigma^2/n$, and therefore
$$
P\{|\bar{X}_n - \mu| > \delta\} \le \frac{E\{(\bar{X}_n - \mu)^2\}}{\delta^2} = \frac{\sigma^2}{n\delta^2} \to 0 \quad \text{as } n \to \infty
$$
for all $\delta > 0$. We have used here the fact that $\bar{X}_n$ converges to $\mu$ also in mean square, $E\{(\bar{X}_n - \mu)^2\} \to 0$.
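As a concrete numerical illustration of the Chebyshev bound (this worked example is not part of the original notes; the Bernoulli distribution and the value $\delta = 0.05$ are chosen only for illustration), suppose the $X_j$ are Bernoulli with $P\{X_j = 1\} = 1/2$, so that $\mu = 1/2$ and $\sigma^2 = 1/4$. Then
$$
P\{|\bar{X}_n - 1/2| > 0.05\} \le \frac{1/4}{n\,(0.05)^2} = \frac{100}{n},
$$
so $n \ge 2000$ samples already force this probability below $0.05$. The bound is crude compared with what the CLT gives, but it decays like $1/n$ for every fixed $\delta > 0$, which is all that is needed for convergence in probability.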