# Topics Covered - Introductory Engineering Stochastic Processes


Introductory Engineering Stochastic Processes, ORIE 361
Instructor: Mark E. Lewis, Associate Professor
School of Operations Research and Information Engineering, Cornell University
Spring 2008

## Disclaimer

This file can be used as a study guide. Please note that as the semester progresses, some of the material may be adjusted (added to or subtracted from). This is because the class may require further clarification on some topics and less on others, depending on your strengths and weaknesses.
## Preliminaries: Course Prerequisites

1. Basic knowledge of random variables
   - Discrete random variables
   - Continuous random variables
2. Independence
3. Expectation
   - Functions of random variables (Law of the Unconscious Statistician)
   - Expectation of linear combinations of random variables
   - Variance of linear combinations of independent random variables
4. Moment generating functions (mgf's)
   - kth moments via differentiating mgf's
   - Mgf of sums of independent random variables
   - Uniqueness of mgf's (equal mgf's imply equal distributions)
5. Conditional probability and expectation
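As a quick numerical check of two of the prerequisite facts above (linearity of expectation, and the variance of a linear combination of independent random variables), the sketch below simulates a hypothetical pair of independent random variables; the distributions and coefficients are illustrative choices, not from the course:

```python
# Check E[aX + bY] = a E[X] + b E[Y] and, for independent X, Y,
# Var(aX + bY) = a^2 Var(X) + b^2 Var(Y), by Monte Carlo.
import random

random.seed(0)
N = 200_000
xs = [random.random() for _ in range(N)]        # X ~ Uniform(0, 1)
ys = [random.expovariate(2.0) for _ in range(N)]  # Y ~ Exponential(rate 2)

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

a, b = 3.0, -1.0
zs = [a * x + b * y for x, y in zip(xs, ys)]

# Theory: E[Z] = 3(1/2) - 1(1/2) = 1.0 and
# Var(Z) = 9(1/12) + 1(1/4) = 1.0.
print(mean(zs), var(zs))  # both close to 1.0
```

The independence of the two streams of draws is what justifies adding the variances; for dependent X and Y a covariance term would appear.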

## Markov Chains (DTMCs)

- Define a stochastic process: a sequence of random variables. An outcome is called a sample path; the set of possible outcomes is called the state space, say S.
- The Markov property: given the present, the future is independent of the past. When the state space is discrete,

  P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)

  for all i, j, i_{n-1}, ..., i_0 in S. You can think of this as a refinement of independence; under full independence the left-hand side would simply equal P(X_{n+1} = j).
- A discrete-time stochastic process with the Markov property is called a discrete-time Markov chain (DTMC).
- Since p_ij = P(X_{n+1} = j | X_n = i) is defined for all i, j in S, this defines a matrix P with (i, j)th element p_ij. The p_ij's are independent of n; this is called time homogeneity.
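The definitions above can be made concrete with a short simulator: the next state depends only on the current state, through the corresponding row of P. The two-state chain below is a hypothetical example, not one from the course:

```python
# Minimal DTMC simulator: sample X_{n+1} from row X_n of the
# transition matrix P, where p[i][j] = P(X_{n+1} = j | X_n = i).
import random

P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(i, rng):
    """Sample the next state from row i of P by inverting the CDF."""
    u, acc = rng.random(), 0.0
    for j, p in enumerate(P[i]):
        acc += p
        if u < acc:
            return j
    return len(P) - 1  # guard against floating-point round-off

rng = random.Random(1)
path = [0]  # a sample path, started in state 0
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)
```

Note that `step` never looks at earlier states in `path`: that is the Markov property. The same `P` is used at every step, which is time homogeneity.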
## Transient Distributions

- n-step transition probabilities: P(X_n = j | X_0 = i) = p_ij^(n).
- Chapman-Kolmogorov equations:

  p_ij^(n+m) = sum over k in S of p_ik^(n) p_kj^(m) = (P^n P^m)_ij

  (just multiply matrices).
- Initial distributions: let alpha_j = P(X_0 = j); then

  P(X_n = k) = sum over j in S of alpha_j p_jk^(n)

  (condition on the initial state).
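Both identities above reduce to matrix arithmetic, so they are easy to verify numerically. The sketch below uses a hypothetical two-state chain as a test case:

```python
# Verify Chapman-Kolmogorov (P^(n+m) = P^n P^m) and the initial-
# distribution formula P(X_n = k) = sum_j alpha_j p_jk^(n).

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(A, n):
    # Start from the identity and multiply n times (fine for small n).
    R = [[float(i == j) for j in range(len(A))] for i in range(len(A))]
    for _ in range(n):
        R = matmul(R, A)
    return R

P = [[0.9, 0.1],
     [0.5, 0.5]]

lhs = matpow(P, 5)                        # P^(2+3)
rhs = matmul(matpow(P, 2), matpow(P, 3))  # P^2 P^3
print(lhs)
print(rhs)  # equal up to floating-point round-off

alpha = [0.25, 0.75]  # P(X_0 = 0) = 0.25, P(X_0 = 1) = 0.75
Pn = matpow(P, 4)
dist = [sum(alpha[j] * Pn[j][k] for j in range(2)) for k in range(2)]
print(dist)  # distribution of X_4; entries sum to 1
```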

## Long-Run Behavior (as n → ∞)

Need to classify states. Starting in state i:

1. Transient: return to state i only a finite number of times.
2. Recurrent: return to state i an infinite number of times.

More formally, let T_i = min{ n > 0 | X_n = i }:

1. Transient: P(T_i < ∞ | X_0 = i) < 1.
2. Recurrent: P(T_i < ∞ | X_0 = i) = 1.

Several checks. Let f_i = the probability that, starting in i, we ever return to state i:

1. Transient: f_i < 1. Alternatively, the sum of p_ii^(n) over n ≥ 0 is finite.
2. Recurrent: f_i = 1. Alternatively, the sum of p_ii^(n) over n ≥ 0 is infinite.

- j is accessible from i (written i → j; this is not a limit) if p_ij^(n) > 0 for some n ≥ 0 (note the "=" in n ≥ 0 is included).
- j communicates with i (written i ↔ j) if i → j and j → i.
- Communication is an equivalence relation, so the state space can be divided into disjoint equivalence classes.
- If all states in the state space communicate, the DTMC is said to be irreducible.
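For a finite chain, accessibility is just reachability in the directed graph with an edge i → j whenever p_ij > 0, so irreducibility can be checked mechanically. The sketch below uses a hypothetical three-state chain with an absorbing state:

```python
# j is accessible from i if p_ij^(n) > 0 for some n >= 0; for a finite
# chain this is graph reachability over the positive entries of P.
from collections import deque

def accessible(P, i):
    """States reachable from i, including i itself (the n = 0 case)."""
    seen, queue = {i}, deque([i])
    while queue:
        u = queue.popleft()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def is_irreducible(P):
    """All states communicate iff every state reaches every other."""
    n = len(P)
    return all(accessible(P, i) == set(range(n)) for i in range(n))

# State 2 is absorbing: once entered, the chain never leaves it, so the
# chain is not irreducible (and states 0 and 1 are transient).
P = [[0.5, 0.4, 0.1],
     [0.3, 0.6, 0.1],
     [0.0, 0.0, 1.0]]
print(is_irreducible(P))  # False
```

The equivalence classes of the communication relation are exactly the strongly connected components of this graph; here they are {0, 1} and {2}.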
## Long-Run Behavior (2)

Theorem. If state i is recurrent (transient) and i ↔ j, then j is recurrent (transient).

