EE378 Statistical Signal Processing    Lecture 2 - 04/09/2007

Gaussian Processes, Stationarity, Ergodicity

Lecturer: Tsachy Weissman    Scribes: Boon Sim Thian, Huixian Chen, Kamakshi S

1  Random Processes (discrete time)

Throughout the course, we will mostly talk about discrete-time processes. Some equivalent terms:

• Stochastic processes
• Random signals
• Time series

Why should we mostly be concerned with discrete-time processes?

• Most of the processes we will encounter are discrete.
• Continuous-time processes can be approximated arbitrarily well by discrete-time processes.
• The mathematics involved is more tractable.

Definition 1 (Random Process). A random process is a collection of random variables {X(n)}_{n ∈ Z}. (Other notations: {X(t)}, {X(n)}, {X_t}, {X_n}.) A random process is fully specified by its finite-dimensional distributions, i.e., the distributions of (X(n_1), X(n_2), ..., X(n_k)) for all k, n_1, n_2, ..., n_k.

Theorem 2 (Kolmogorov's Theorem). Any consistent set of finite-dimensional distributions uniquely specifies a process.

What do we mean by consistent? Colloquially, the marginal of each (k+1)-dimensional distribution must equal the corresponding k-dimensional distribution. More rigorously, we first define:

Definition 3 (The Distribution Functions of a Stochastic Process {X_t, t ∈ T ⊂ R}). Let 𝒯 be the set of all vectors {t = (t_1, ..., t_n) ∈ T^n : t_1 < ... < t_n, n = 1, 2, ...}. Then the (finite-dimensional) distribution functions of {X_t, t ∈ T} are the functions F_t(·), t ∈ 𝒯, defined for t = (t_1, ..., t_n) by

    F_t(x) = P(X_{t_1} ≤ x_1, ..., X_{t_n} ≤ x_n),    x = (x_1, ..., x_n) ∈ R^n    (1)

and the Kolmogorov Theorem from [Brockwell & Davis] is:

Theorem 4 (Kolmogorov's Theorem). The probability distribution functions {F_t(·), t ∈ 𝒯} are the distribution functions of some stochastic process if and only if for any n ∈ {1, 2, ...}, t = (t_1, ..., t_n) ∈ 𝒯, and 1 ≤ i ≤ n,

    lim_{x_i → ∞} F_t(x) = F_{t(i)}(x(i))    (2)

where t(i) and x(i) are the (n−1)-component vectors obtained by deleting the i-th components of t and x, respectively.
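The consistency condition (2) can be checked numerically for a concrete Gaussian example. The sketch below is an illustration, not part of the lecture: it assumes a zero-mean Gaussian process with the (hypothetical, chosen only for this example) covariance Cov(X_s, X_t) = 0.9^|s−t|, and compares F_t(x) with x_i sent to a very large value against the lower-dimensional distribution function F_{t(i)}(x(i)) obtained by deleting the i-th component, using SciPy's multivariate normal CDF.

```python
# Numerical illustration of Kolmogorov consistency, Eq. (2), for a
# Gaussian process. The covariance Cov(X_s, X_t) = 0.9**|s - t| is an
# assumption made purely for this example (an AR(1)-type covariance).
import numpy as np
from scipy.stats import multivariate_normal

t = np.array([1, 2, 3])                       # time indices t_1 < t_2 < t_3
cov = 0.9 ** np.abs(t[:, None] - t[None, :])  # finite-dimensional covariance
mean = np.zeros(len(t))

x = np.array([0.3, -0.5, 1.1])                # evaluation point for F_t
i = 1                                         # component sent to infinity

# Left side of (2): F_t(x) with x_i replaced by a very large number,
# approximating the limit x_i -> infinity.
x_big = x.copy()
x_big[i] = 1e6
lhs = multivariate_normal(mean, cov).cdf(x_big)

# Right side of (2): F_{t(i)}(x(i)) -- delete the i-th component of t
# and x, and take the corresponding sub-mean and sub-covariance.
keep = [j for j in range(len(t)) if j != i]
rhs = multivariate_normal(mean[keep], cov[np.ix_(keep, keep)]).cdf(x[keep])

# The two values agree up to the CDF routine's numerical tolerance,
# because marginalizing a Gaussian vector just drops the corresponding
# row/column of the covariance.
print(abs(lhs - rhs))
```

For a Gaussian process this consistency is automatic: deleting the i-th component of a multivariate normal yields another multivariate normal whose mean and covariance are the corresponding sub-blocks, which is exactly what the right-hand side of (2) evaluates.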
This note was uploaded on 03/03/2011 for the course EE 378, taught by Professor T. Weissman during the Spring '07 term at Stanford.
