Carnegie Mellon University - STAT 36-754
  • 4 Pages syllabus
    Syllabus

School: Carnegie Mellon University

    Course: Stochastic Processes

Syllabus for Advanced Probability II, Stochastic Processes 36-754. Cosma Shalizi, Spring 2006. This course is an advanced treatment of interdependent random variables and random functions, with twin emphases on extending the limit theorems of probability from…

  • 4 Pages solutions-3
    Solutions-3

Solution to Homework #3, 36-754, 25 February 2006. Exercise 10.1: I need one last revision of the definition of a Markov operator: a linear operator on L1 satisfying the following conditions. 1. If f ≥ 0 (μ-a.e.), then Kf ≥ 0 (μ-a.e.). 2. If f ≤ M (μ-a.e.), then Kf ≤ M (…
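The two conditions quoted above (positivity- and bound-preservation) are easy to see concretely in the finite-state case, where a Markov operator is just a stochastic matrix acting on functions. A minimal sketch with a matrix of our own choosing, not from the solutions:

```python
import numpy as np

# A Markov operator on a finite state space acts on functions via a
# stochastic matrix: (Kf)(i) = sum_j P[i, j] * f(j).
# (Finite-state stand-in for the L1 operator in the exercise; P is ours.)
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])
assert np.allclose(P.sum(axis=1), 1.0)  # rows are probability vectors

def K(f):
    return P @ f

f = np.array([0.0, 2.0, 5.0])   # f >= 0 and f <= M with M = 5
Kf = K(f)
print(Kf)
# Condition 1: each (Kf)(i) is a convex combination of values of f,
# so f >= 0 implies Kf >= 0.
assert (Kf >= 0).all()
# Condition 2: for the same reason, f <= M implies Kf <= M.
assert (Kf <= 5.0).all()
```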

  • 7 Pages solutions-2
    Solutions-2

Solution to Homework #2, 36-754, 7 February 2006. Exercise 5.3 (The Logistic Map as a Measure-Preserving Transformation): The logistic map with a = 4 is a measure-preserving transformation, and the measure it preserves has the density 1/(π√(x(1−x))) (on the unit interval)…
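The invariance claim above can be checked numerically: time averages along a logistic-map orbit should match averages under the arcsine density 1/(π√(x(1−x))), whose mean is 1/2 and whose CDF at 1/4 is (2/π)·arcsin(1/2) = 1/3. A rough Monte Carlo sketch of our own, not part of the solution; the re-seeding guard only protects against finite-precision escape to the fixed point 0:

```python
import random

random.seed(1)
x = random.random()
n = 200_000
below_quarter = 0
total = 0.0
for _ in range(n):
    x = 4.0 * x * (1.0 - x)      # the logistic map with a = 4
    if x <= 0.0 or x >= 1.0:     # finite-precision guard: re-inject the orbit
        x = random.random()
    below_quarter += (x < 0.25)
    total += x

mean = total / n
frac = below_quarter / n
print(mean)   # arcsine mean: 1/2
print(frac)   # arcsine CDF at 1/4: (2/pi) * arcsin(sqrt(1/4)) = 1/3
```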

  • 2 Pages solutions-1
    Solutions-1

Solution to Homework #1, 36-754, 27 January 2006. Exercise 1.1 (The product σ-field answers countable questions): Let D = ⋃_S X_S, where the union ranges over all countable subsets S of the index set T. For any event D ∈ D, whether or not a sample path x ∈ D depends…

  • 6 Pages references
    References

Bibliography. Abramowitz, Milton and Irene A. Stegun (eds.) (1964). Handbook of Mathematical Functions. Washington, D.C.: National Bureau of Standards. URL http://www.math.sfu.ca/~cbm/aands/. Algoet, Paul (1992). Universal Schemes for Prediction, Gambling and…

  • 9 Pages lecture-35
    Lecture-35

Chapter 35 Large Deviations for Stochastic Differential Equations. This last chapter revisits large deviations for stochastic differential equations in the small-noise limit, first raised in Chapter 22. Section 35.1 establishes the LDP for the Wiener process (Schilder…

  • 5 Pages lecture-34
    Lecture-34

Chapter 34 Large Deviations for Weakly Dependent Sequences: The Gärtner-Ellis Theorem. This chapter proves the Gärtner-Ellis theorem, establishing an LDP for not-too-dependent processes taking values in topological vector spaces. Most of our earlier LDP…

  • 5 Pages lecture-32
    Lecture-32

Chapter 32 Large Deviations for Markov Sequences. This chapter establishes large deviations principles for Markov sequences as natural consequences of the large deviations principles for IID sequences in Chapter 31. (LDPs for continuous-time Markov processes…

  • 10 Pages lecture-31
    Lecture-31

Chapter 31 Large Deviations for IID Sequences: The Return of Relative Entropy. Section 31.1 introduces the exponential version of the Markov inequality, which will be our major calculating device, and shows how it naturally leads to both the cumulant generating…
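The exponential Markov inequality mentioned above is worth seeing in numbers: for IID Bernoulli(1/2) variables it gives the Chernoff bound P(S_n/n ≥ a) ≤ exp(−n I(a)), with the relative-entropy rate function I(a). A small numeric check of our own, not from the lecture:

```python
import math

# Cramér rate function for IID Bernoulli(p): the Legendre transform of the
# cumulant generating function, I(a) = a*log(a/p) + (1-a)*log((1-a)/(1-p)).
def rate(a, p=0.5):
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

# Exact upper-tail probability P(S_n/n >= a) for fair coins.
def binom_tail(n, a):
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2**n

n, a = 200, 0.7
exact = binom_tail(n, a)
bound = math.exp(-n * rate(a))
print(exact, bound)
assert 0 < exact <= bound   # the Chernoff bound really is an upper bound
# The exponents agree to leading order: -(1/n) log P(S_n/n >= a) -> I(a).
print(-math.log(exact) / n, rate(a))
```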

  • 10 Pages lecture-30
    Lecture-30

Chapter 30 General Theory of Large Deviations. A family of random variables follows the large deviations principle if the probability of the variables falling into bad sets, representing large deviations from expectations, declines exponentially in some appropriate…

  • 9 Pages lecture-29
    Lecture-29

Chapter 29 Entropy Rates and Asymptotic Equipartition. Section 29.1 introduces the entropy rate, the asymptotic entropy per time-step of a stochastic process, and shows that it is well-defined; and similarly for information, divergence, etc. rates. Section 29.…
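For a stationary Markov chain the entropy rate has a closed form, h = Σ_i π_i H(P_i·): the stationary average of the per-row entropies, i.e. the asymptotic entropy per time-step. A quick illustration with a two-state chain of our own, not from the notes:

```python
import math
import numpy as np

# Entropy rate of a stationary Markov chain with transition matrix P:
# h = sum_i pi_i * H(P[i, :]), pi the stationary distribution. (P is ours.)
P = np.array([[0.75, 0.25],
              [0.40, 0.60]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def H(p):  # Shannon entropy in nats
    return -sum(q * math.log(q) for q in p if q > 0)

h = sum(pi[i] * H(P[i]) for i in range(len(pi)))
print(pi, h)
```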

  • 8 Pages lecture-28
    Lecture-28

Chapter 28 Shannon Entropy and Kullback-Leibler Divergence. Section 28.1 introduces Shannon entropy and its most basic properties, including the way it measures how close a random variable is to being uniformly distributed. Section 28.2 describes relative…
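The "distance from uniformity" reading is concrete: over k outcomes, D(p‖uniform) = log k − H(p), so entropy is maximal exactly when the divergence from the uniform distribution vanishes. A tiny sketch of our own:

```python
import math

# Shannon entropy H(p) and Kullback-Leibler divergence D(p || q), in nats.
def H(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def D(p, q):
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

uniform = [0.25] * 4
skewed  = [0.7, 0.1, 0.1, 0.1]
print(H(uniform), H(skewed))  # the uniform law has the larger entropy
# D(p || uniform) = log(4) - H(p): divergence from uniform vs. entropy deficit.
print(D(skewed, uniform), math.log(4) - H(skewed))
assert H(uniform) >= H(skewed)
assert D(skewed, uniform) >= 0
```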

  • 7 Pages lecture-27
    Lecture-27

Chapter 27 Mixing. A stochastic process is mixing if its values at widely-separated times are asymptotically independent. Section 27.1 defines mixing, and shows that it implies ergodicity. Section 27.2 gives some examples of mixing processes, both deterministic…

  • 8 Pages lecture-26
    Lecture-26

Chapter 26 Decomposition of Stationary Processes into Ergodic Components. This chapter is concerned with the decomposition of asymptotically mean-stationary processes into ergodic components. Section 26.1 shows how to write the stationary distribution as…

  • 7 Pages lecture-25
    Lecture-25

Chapter 25 Ergodicity. This lecture explains what it means for a process to be ergodic or metrically transitive, gives a few characterizations of these properties (especially for AMS processes), and deduces some consequences. The most important one is that sample…

  • 6 Pages lecture-24
    Lecture-24

Chapter 24 The Almost-Sure Ergodic Theorem. This chapter proves Birkhoff's ergodic theorem, on the almost-sure convergence of time averages to expectations, under the assumption that the dynamics are asymptotically mean stationary. This is not the usual proof…

  • 6 Pages lecture-22
    Lecture-22

Chapter 22 Large Deviations for Small-Noise Stochastic Differential Equations. This lecture is at once the end of our main consideration of diffusions and stochastic calculus, and a first taste of large deviations theory. Here we study the divergence between…

  • 9 Pages lecture-20
    Lecture-20

Chapter 20 More on Stochastic Differential Equations. Section 20.1 shows that the solutions of SDEs are diffusions, and how to find their generators. Our previous work on Feller processes and martingale problems pays off here. Some other basic properties of solutions…

  • 7 Pages lecture-18
    Lecture-18

Chapter 18 Stochastic Integrals with the Wiener Process. Section 18.1 addresses an issue which came up in the last lecture, namely the martingale characterization of the Wiener process. Section 18.2 gives a heuristic introduction to stochastic integrals…

  • 5 Pages lecture-17
    Lecture-17

Chapter 17 Diffusions and the Wiener Process. Section 17.1 introduces the ideas which will occupy us for the next few lectures, the continuous Markov processes known as diffusions, and their description in terms of stochastic calculus. Section 17.2 collects some…

  • 6 Pages lecture-16
    Lecture-16

Chapter 16 Convergence of Random Walks. This lecture examines the convergence of random walks to the Wiener process. This is very important both physically and statistically, and illustrates the utility of the theory of Feller processes. Section 16.1 finds the…
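The convergence described above can be glimpsed by simulation: rescale a ±1 random walk by √n and its value at time 1 approaches N(0, 1), the marginal of the Wiener process. A rough sketch of our own, not the chapter's Feller-process argument:

```python
import random
import statistics

# Donsker-style scaling sketch: W_n(t) = S_{floor(nt)} / sqrt(n) for a
# +/-1 random walk; for large n, W_n(1) should look like N(0, 1).
random.seed(42)
n, reps = 400, 2000
endpoints = []
for _ in range(reps):
    s = sum(random.choice((-1, 1)) for _ in range(n))  # S_n
    endpoints.append(s / n**0.5)                       # W_n(1)

print(statistics.mean(endpoints), statistics.pstdev(endpoints))
# mean near 0 and standard deviation near 1, matching the Wiener process at t = 1
```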

  • 6 Pages lecture-15
    Lecture-15

Chapter 15 Convergence of Feller Processes. This chapter looks at the convergence of sequences of Feller processes to a limiting process. Section 15.1 lays some groundwork concerning weak convergence of processes with cadlag sample paths. Section 15.2 states…

  • 8 Pages lecture-14
    Lecture-14

Chapter 14 Feller Processes. Section 14.1 fulfills the demand, made last time, for an example of a Markov process which is not strongly Markovian. Section 14.2 makes explicit the idea that the transition kernels of a Markov process induce a kernel over sample…

  • 3 Pages lecture-13
    Lecture-13

Chapter 13 The Strong Markov Property and Martingale Problems. Section 13.1 introduces the strong Markov property: independence of the past and future conditional on the state at random (optional) times. Section 13.2 describes the martingale problem for Markov…

  • 5 Pages lecture-12
    Lecture-12

Chapter 12 Generators of Markov Processes. This lecture is concerned with the infinitesimal generator of a Markov process, and the sense in which we are able to write the evolution operators of a homogeneous Markov process as exponentials of their generators…

  • 4 Pages lecture-11
    Lecture-11

Chapter 11 Markov Examples. Section 11.1 finds the transition kernels for the Wiener process, as an example of how to manipulate such things. Section 11.2 looks at the evolution of densities under the action of the logistic map; this shows how deterministic…

  • 6 Pages lecture-10
    Lecture-10

Chapter 10 Alternate Characterizations of Markov Processes. This lecture introduces two ways of characterizing Markov processes other than through their transition probabilities. Section 10.1 addresses a question raised in the last class, about when being…

  • 5 Pages lecture-09
    Lecture-09

Chapter 9 Markov Processes. This lecture begins our study of Markov processes. Section 9.1 is mainly ideological: it formally defines the Markov property for one-parameter processes, and explains why it is a natural generalization of both complete determinism…

  • 8 Pages lecture-08
    Lecture-08

Chapter 8 More on Continuity. Section 8.1 constructs separable modifications of reasonable but non-separable random functions, and explains how separability relates to non-denumerable properties like continuity. Section 8.2 constructs versions of our favorite…

  • 6 Pages lecture-07
    Lecture-07

Chapter 7 Continuity of Stochastic Processes. Section 7.1 describes the leading kinds of continuity for stochastic processes, which derive from the modes of convergence of random variables. It also defines the idea of versions of a stochastic process. Section…

  • 4 Pages lecture-05
    Lecture-05

Chapter 5 Stationary One-Parameter Processes. Section 5.1 describes the three main kinds of stationarity: strong, weak, and conditional. Section 5.2 relates stationary processes to the shift operators introduced in the last chapter, and to measure-preserving…

  • 5 Pages lecture-04
    Lecture-04

Chapter 4 One-Parameter Processes, Usually Functions of Time. Section 4.1 defines one-parameter processes, and their variations (discrete or continuous parameter, one- or two-sided parameter), including many examples. Section 4.2 shows how to represent one-parameter…

  • 5 Pages lecture-03
    Lecture-03

Chapter 3 Building Infinite Processes from Regular Conditional Probability Distributions. Section 3.1 introduces the notion of a probability kernel, which is a useful way of systematizing and extending the treatment of conditional probability distributions…

  • 6 Pages lecture-02
    Lecture-02

Chapter 2 Building Infinite Processes from Finite-Dimensional Distributions. Section 2.1 introduces the finite-dimensional distributions of a stochastic process, and shows how they determine its infinite-dimensional distribution. Section 2.2 considers the consistency…

  • 5 Pages lecture-01
    Lecture-01

Chapter 1 Basic Definitions: Indexed Collections and Random Functions. Section 1.1 introduces stochastic processes as indexed collections of random variables. Section 1.2 builds the necessary machinery to consider random functions, especially the product σ-field…
