MIT2_017JF09_ch04

4 RANDOM PROCESSES

From the essential aspects of probability we now move into the time domain, considering random signals. For this, assign to each random event $A_i$ a complete signal, instead of a single scalar: $A_i \longrightarrow x_i(t)$. The set of all the functions that are available (or the "menu") is called the ensemble of the random process. An example case is to roll a die, generating $i = [1, 2, 3, 4, 5, 6]$, and suppose $x_i(t) = t^i$. In the general case, there could be infinitely many members in the ensemble, and of course these functions could involve some other variables, for example $x_i(t, y, z)$, where $y$ and $z$ are variables not related to the random event $A_i$. Any particular $x_i(t)$ can be considered a regular, deterministic function, if the event is known. In contrast, $x(t_o)$, taken at a specific time but without specification of which event has occurred, is a random variable.

4.1 Time Averages

The theory of random processes is built on two kinds of probability calculations: those taken across time and those taken across the ensemble. For time averages to be taken, we have to consider a specific function, indexed by $i$:

$$m(x_i(t)) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_i(t) \, dt \quad \text{(mean)}$$

$$V^t(x_i(t)) = \lim_{T \to \infty} \frac{1}{T} \int_0^T \left[ x_i(t) - m(x_i(t)) \right]^2 dt \quad \text{(variance over time)}$$

$$R_i^t(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T \left[ x_i(t) - m(x_i(t)) \right] \left[ x_i(t + \tau) - m(x_i(t)) \right] dt \quad \text{(autocorrelation)}.$$

The mean and variance have new symbols, but are calculated in a way that is consistent with our prior definitions. The autocorrelation is new and plays a central role in the definition of a spectrum. Notice that it is an inner product of the function's deviation from its mean with a delayed version of the same, so that $R^t(0) = V^t$.

Consider the roll of a die, and the generation of functions $x_i(t) = a \cos(i \omega_o t)$. We have

$$m(x_i(t)) = \lim_{T \to \infty} \frac{1}{T} \int_0^T a \cos(i \omega_o t) \, dt = 0,$$

$$V^t(x_i(t)) = \lim_{T \to \infty} \frac{1}{T} \int_0^T a^2 \cos^2(i \omega_o t) \, dt = \frac{a^2}{2},$$

$$R_i^t(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T a^2 \cos(i \omega_o t) \cos(i \omega_o (t + \tau)) \, dt = \frac{a^2}{2} \cos(i \omega_o \tau).$$

In this case, the autocorrelation depends explicitly on the event index $i$, and has a peak of $a^2/2$ at $i \omega_o \tau = 2 \pi k$, where $k$ is an integer. These values of $\tau$ are separated by precisely the period of the $i$'th harmonic in the ensemble. When the delayed and undelayed functions line up, we get a positive $R^t$; when they are out of phase, we get a negative $R^t$. A numerical check of these results appears in the sketch at the end of this section.

4.2 Ensemble Averages

The other set of statistics we can compute are across the ensemble, but at a particular time...
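The time averages of Section 4.1 can be checked numerically on a single long realization. The sketch below is not part of the original notes: it samples $x_i(t) = a \cos(i \omega_o t)$ for one assumed die outcome $i$, with illustrative choices for the amplitude, frequency, lag, sample spacing, and record length, and compares the estimates against the closed-form results derived above.

```python
# Minimal numerical sketch (illustrative, not from the notes) checking the
# time-average formulas on one ensemble member x_i(t) = a*cos(i*w0*t).
# All parameter values below are assumptions made for the example.
import numpy as np

a, w0, i = 2.0, 1.0, 3           # assumed amplitude, base frequency (rad/s), die outcome
dt = 0.01                        # sample spacing (s)
t = np.arange(0.0, 2000.0, dt)   # a long record approximates the T -> infinity limit
x = a * np.cos(i * w0 * t)       # one realization of the process

m = x.mean()                     # time-average mean, should be near 0
V = ((x - m) ** 2).mean()        # time-average variance, should be near a^2/2

def autocorr(x, m, lag):
    """Estimate R_i^t(tau): average of the de-meaned signal times its delayed copy."""
    y = x - m
    return (y[:-lag] * y[lag:]).mean() if lag > 0 else (y * y).mean()

tau = 0.5                        # lag in seconds (illustrative)
lag = int(round(tau / dt))
R = autocorr(x, m, lag)

print(f"mean     ~ {m:.4f}  (theory 0)")
print(f"variance ~ {V:.4f}  (theory {a**2/2:.4f})")
print(f"R(tau)   ~ {R:.4f}  (theory {a**2/2*np.cos(i*w0*tau):.4f})")
```

With a record many periods long, the estimates converge to $0$, $a^2/2$, and $(a^2/2)\cos(i\omega_o\tau)$, and evaluating the estimator at zero lag reproduces $R^t(0) = V^t$.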
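For the ensemble statistics just introduced in Section 4.2, a complementary sketch (same assumed parameters as above) fixes an observation time $t_o$ and averages $x_i(t_o)$ over the six equally likely die outcomes, rather than over time.

```python
# Minimal sketch (illustrative assumptions) of an ensemble average: fix a time
# t0 and average x_i(t0) over the six equally likely die outcomes i = 1..6.
import numpy as np

a, w0 = 2.0, 1.0                 # assumed amplitude and base frequency
t0 = 0.7                         # the fixed observation time (illustrative)
outcomes = np.arange(1, 7)       # die outcomes i = 1..6, each with probability 1/6
samples = a * np.cos(outcomes * w0 * t0)       # x_i(t0) for every ensemble member

ens_mean = samples.mean()                      # ensemble mean at t0
ens_var = ((samples - ens_mean) ** 2).mean()   # ensemble variance at t0

print(f"ensemble mean at t0={t0}:     {ens_mean:.4f}")
print(f"ensemble variance at t0={t0}: {ens_var:.4f}")
```

The key contrast with the previous sketch is the direction of averaging: across ensemble members at one instant, rather than along time for one member.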