Lecture Notes 9
Stationary Random Processes

• Strict-Sense and Wide-Sense Stationarity
• Autocorrelation Function of a Stationary Process
• Power Spectral Density
• Response of LTI System to WSS Process Input
• Linear Estimation: the Random Process Case

Stationary Random Processes

• Stationarity refers to time invariance of some, or all, of the statistics of a random process, such as its mean, autocorrelation, and n-th-order distributions

• We define two types of stationarity: strict sense (SSS) and wide sense (WSS)

• A random process X(t) (or X_n) is said to be SSS if all of its finite-order distributions are time invariant, i.e., the joint cdfs (pdfs, pmfs) of

      X(t_1), X(t_2), \ldots, X(t_k)   and   X(t_1 + \tau), X(t_2 + \tau), \ldots, X(t_k + \tau)

  are the same for all k, all t_1, t_2, \ldots, t_k, and all time shifts \tau

• So for an SSS process, the first-order distribution is independent of t, and the second-order distribution (the distribution of any two samples X(t_1) and X(t_2)) depends only on \tau = t_2 - t_1

  To see this, note that from the definition of stationarity, for any t, the joint distribution of X(t_1) and X(t_2) is the same as the joint distribution of X(t_1 + (t - t_1)) = X(t) and X(t_2 + (t - t_1)) = X(t + (t_2 - t_1))

• Example: The random phase signal X(t) = \alpha \cos(\omega t + \Theta), where \Theta \sim U[0, 2\pi], is SSS

  ◦ We already know that the first-order pdf is

        f_{X(t)}(x) = \frac{1}{\pi \alpha \sqrt{1 - (x/\alpha)^2}}, \qquad -\alpha < x < +\alpha,

    which is independent of t, and is therefore stationary

  ◦ To find the second-order pdf, note that if we are given the value of X(t) at one point, say t_1, there are (at most) two possible sample functions:

    [Figure: two candidate sample paths passing through the value x_1 at time t_1 and taking the values x_{21} and x_{22}, respectively, at time t_2]

    The second-order pdf can thus be written as

        f_{X(t_1), X(t_2)}(x_1, x_2) = f_{X(t_1)}(x_1) \, f_{X(t_2) \mid X(t_1)}(x_2 \mid x_1)
                                     = f_{X(t_1)}(x_1) \left( \tfrac{1}{2}\,\delta(x_2 - x_{21}) + \tfrac{1}{2}\,\delta(x_2 - x_{22}) \right),

    which depends only on t_2 - t_1, and thus the second-order pdf is stationary

  ◦ Now if we know that X(t_1) = x_1 and X(t_2) = x_2, the sample path is totally determined (except when x_1 = x_2 = 0, where two paths are possible), and thus all n-th-order pdfs are stationary

• IID processes are SSS

• The random walk process is not SSS (in fact, no independent-increment process is SSS)

• The Gauss-Markov process (as we defined it) is not SSS. However, if we set X_1 to the steady-state distribution of X_n, it becomes SSS (see homework exercise)

Wide-Sense Stationary Random Processes

• A random process X(t) is said to be wide-sense stationary (WSS) if its mean and autocorrelation functions are time invariant, i.e.,

  ◦ E(X(t)) = \mu, independent of t
  ◦ R_X(t_1, t_2) is a function only of the time difference t_2 - t_1
  ◦ E[X(t ...
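As a sanity check on the random phase example and the WSS definition above, here is a minimal numerical sketch (not part of the original notes). It assumes NumPy; the amplitude \alpha = 1, frequency \omega = 2\pi, sample times, lag \tau, and number of realizations are arbitrary choices for illustration. It estimates the first-order pdf of X(t) at two different times and the autocorrelation R_X(t_1, t_1 + \tau) for several t_1 by averaging over independent realizations of \Theta.

    # Numerical check on X(t) = alpha*cos(omega*t + Theta), Theta ~ Uniform[0, 2*pi]:
    # the first-order pdf should not depend on t, and the ensemble autocorrelation
    # should depend only on the time difference tau.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha, omega = 1.0, 2 * np.pi        # arbitrary amplitude and angular frequency
    n_paths = 200_000                    # independent realizations of Theta

    theta = rng.uniform(0.0, 2 * np.pi, size=n_paths)

    def X(t):
        """Sample X(t) = alpha*cos(omega*t + Theta) across all realizations."""
        return alpha * np.cos(omega * t + theta)

    # 1) First-order pdf: at two different times, the histogram of X(t) should match
    #    f(x) = 1 / (pi * alpha * sqrt(1 - (x/alpha)^2)) on -alpha < x < alpha.
    for t in (0.3, 1.7):
        hist, edges = np.histogram(X(t), bins=50, range=(-alpha, alpha), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        f_theory = 1.0 / (np.pi * alpha * np.sqrt(1.0 - (centers / alpha) ** 2))
        interior = slice(2, -2)          # skip the outermost bins, where the pdf is nearly singular
        err = np.max(np.abs(hist[interior] - f_theory[interior]))
        print(f"t = {t}: max pdf error over interior bins = {err:.3f}")

    # 2) Autocorrelation: the ensemble estimate of R_X(t1, t1 + tau) should not
    #    depend on t1, and should equal (alpha^2 / 2) * cos(omega * tau).
    def R_hat(t1, t2):
        return np.mean(X(t1) * X(t2))

    tau = 0.1
    R_theory = 0.5 * alpha**2 * np.cos(omega * tau)
    for t1 in (0.0, 0.6, 1.3):
        print(f"t1 = {t1}: R_hat = {R_hat(t1, t1 + tau):+.4f}, theory = {R_theory:+.4f}")

Ensemble averages over independent draws of \Theta are used here (rather than a time average along one sample path) because stationarity, as defined above, is a statement about the distributions of the process across realizations; the printed estimates should agree with the stated pdf and with (\alpha^2/2)\cos(\omega\tau) up to Monte Carlo error, independently of t and t_1.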